- 3341 Views
- 2 replies
- 0 kudos
Azure Network Settings in regard to Databricks Table Monitoring
I have set up my Unity Catalog on an Azure Data Lake which uses the company's virtual network to allow access. I have all privileges on my account, so I am able to create, alter, or delete catalogs, schemas, and tables. I can do these things either usin...
Hi Kaniz, thanks for the response and for identifying the problem. I would like some steps on how to adjust the network settings, as everything I have tried so far hasn't seemed to work.
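Not an official answer, but a common starting point when a storage firewall only allows the corporate VNet: the subnets used by Databricks compute also need to be allowed on the storage account. A minimal Azure CLI sketch, where every resource name (rg-data, mystorageacct, vnet-corp, snet-databricks-private) is a hypothetical placeholder:

```
# Allow the Databricks workspace subnet through the storage account firewall
# (all names below are placeholders for your own resources)
az storage account network-rule add \
  --resource-group rg-data \
  --account-name mystorageacct \
  --vnet-name vnet-corp \
  --subnet snet-databricks-private

# Verify which network rules are now in place
az storage account network-rule list \
  --resource-group rg-data \
  --account-name mystorageacct
```

Note that serverless-backed features such as lakehouse monitoring may additionally require allowing Databricks serverless access to the storage account; that part depends on your workspace setup.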
- 887 Views
- 1 replies
- 0 kudos
databricks-connect 14.3 spark error against 14.3 cluster with data_security_mode = NONE
I am running into an issue when trying to use a 14.3 cluster with databricks-connect 14.3. My cluster config: { "autoscale": { "min_workers": 2, "max_workers": 10 }, "cluster_name": "Developer Cluster", "spark_version": "14.3.x-scala2...
Are you running the latest version of Databricks Connect?
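For what it's worth, Databricks Connect 13.x and later is designed for Unity Catalog-enabled compute, so a cluster with data_security_mode = NONE may be rejected regardless of the client version; switching the cluster to single-user or shared access mode is worth trying. A minimal connectivity check, assuming a matching 14.3 client and placeholder host/token/cluster values:

```python
# pip install "databricks-connect==14.3.*"
from databricks.connect import DatabricksSession

# All values below are placeholders; normally these come from a config profile
spark = DatabricksSession.builder.remote(
    host="https://adb-1234567890123456.7.azuredatabricks.net",
    token="dapi-placeholder",
    cluster_id="0101-123456-abcdef",
).getOrCreate()

# Trivial query to confirm the session actually reaches the cluster
spark.range(5).show()
```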
- 2689 Views
- 2 replies
- 3 kudos
Unable to read resources - Unsupported Protocol Scheme (Terraform AWS)
Hello everyone! Over the last few weeks my company has been trying to deploy a Databricks workspace on AWS adapted to the customer's needs, using Terraform. To do this, we started from base code in Databricks' own GitHub (https://github.com/databrick...
What's the solution for this? I'm facing the same issue.
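One thing worth checking: in Go-based tools like Terraform, "unsupported protocol scheme" usually means a URL was passed without its https:// prefix, so the provider host or account URL is the first suspect. A hedged sketch of an AWS account-level provider block, with placeholder credentials:

```hcl
# Placeholder values throughout; the key detail is the explicit https:// scheme
provider "databricks" {
  alias         = "mws"
  host          = "https://accounts.cloud.databricks.com" # not "accounts.cloud.databricks.com"
  account_id    = "00000000-0000-0000-0000-000000000000"
  client_id     = var.databricks_client_id
  client_secret = var.databricks_client_secret
}
```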
- 2789 Views
- 2 replies
- 0 kudos
Databricks dashboard programmatically
Hi, how can I create a Databricks dashboard, filters, and visuals programmatically (API, Terraform, SDK, CLI...)? Thanks, Pawel
Maybe slightly late (maybe because development was late :P), but hopefully it will also help others. 1. There seems to be support added in the newest Terraform Databricks provider, 1.49.0, here. 2. Another solution would be to use the Databricks CLI (e.g. `d...
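A hedged sketch of option 1, using the databricks_dashboard resource (available from provider 1.49.0); the name, warehouse ID, and file path are placeholders, and the dashboard JSON is assumed to have been exported from the UI:

```hcl
resource "databricks_dashboard" "sales" {
  display_name = "Sales Overview"                  # placeholder name
  warehouse_id = var.sql_warehouse_id              # SQL warehouse that runs the queries
  parent_path  = "/Workspace/Shared/dashboards"
  # Dashboard definition exported from the UI as JSON (placeholder path)
  file_path    = "${path.module}/sales_overview.lvdash.json"
}
```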
- 691 Views
- 1 replies
- 2 kudos
List deployed Databricks asset bundles (DABs)?
Is there a Databricks CLI command or REST API to list all the DABs that have been deployed to a workspace?
Hi @Schofield, unfortunately I don't think there is an out-of-the-box command that will provide you this information yet. As a workaround, you can try writing some code that will extract this information from the REST API. For example, you can use /api/2.1/j...
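A rough workaround along those lines: jobs deployed by an asset bundle carry a deployment block with kind BUNDLE in the Jobs 2.1 API, so listing jobs and filtering on that field approximates a list of deployed DABs. A sketch with the Python SDK, assuming default authentication is configured and that your SDK version exposes JobDeploymentKind:

```python
# pip install databricks-sdk
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.jobs import JobDeploymentKind

w = WorkspaceClient()  # picks up host/token from the environment or config profile

for base in w.jobs.list():
    job = w.jobs.get(base.job_id)  # full settings, including any deployment block
    dep = job.settings.deployment if job.settings else None
    if dep and dep.kind == JobDeploymentKind.BUNDLE:
        print(job.job_id, job.settings.name, dep.metadata_file_path)
```

This only covers jobs; other bundle-managed resources (pipelines, etc.) would need similar per-resource queries.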
- 941 Views
- 2 replies
- 2 kudos
Resolved! Delta sharing issue after enabling predictive optimization
Some of our delta sharing tables are not working. It may be related to this, or maybe not: we enabled predictive optimization on all tables a few days ago ... not working any more, but any new tables created work fine after setting this: SET TBLPROPERTIES (de...
After some debugging, I found a very unique cause: we used a JSON string in a column comment, and it makes sense that a JSON string in a column comment breaks delta sharing. Example: column COMMENT {"key": "primary_key", "is_identity": "true"}. The erro...
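If that diagnosis holds, one workaround is to rewrite the offending comments as plain text. A hedged SQL sketch on a hypothetical table and column:

```sql
-- Hypothetical names; replaces the JSON comment with plain text
ALTER TABLE catalog.schema.customers
  ALTER COLUMN customer_id COMMENT 'Primary key; identity column';
```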
- 3286 Views
- 2 replies
- 1 kudos
Delta Lake S3 multi-cluster writes - DynamoDB
Hi there! I'm trying to figure out how the multi-writer architecture for Delta Lake tables is implemented under the hood. I understand that a DynamoDB table is used to provide mutual exclusion, but the question is: where is the table located? Is it in...
Hi, could you please help me here? How can I use this configuration in Databricks? So I will maintain my transaction logs there, and in parallel I can use the delta-rs job. spark.conf.set("spark.delta.logStore.s3a.impl", "io.delta.storage.S3Dynam...
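For reference, the documented open-source Delta multi-cluster setup looks roughly like the following; note these are normally set as cluster-level Spark configs at startup rather than via spark.conf.set from a running notebook, and the table name and region are placeholders. On Databricks itself this LogStore is generally unnecessary, since the platform provides its own S3 commit coordination:

```python
# Typically placed in the cluster's Spark config, not set at runtime
spark.conf.set("spark.delta.logStore.s3a.impl", "io.delta.storage.S3DynamoDBLogStore")
# DynamoDB table used for mutual exclusion between concurrent writers
spark.conf.set("spark.io.delta.storage.S3DynamoDBLogStore.ddb.tableName", "delta_log")  # placeholder
spark.conf.set("spark.io.delta.storage.S3DynamoDBLogStore.ddb.region", "us-east-1")     # placeholder
```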
- 1540 Views
- 1 replies
- 0 kudos
Technical Architecture - Feedback
Hello members, I have designed a technical architecture (image attached). I would like some feedback on the current design (especially from 5.1 onwards) and maybe some more ideas, or anything else I can use instead of Azure Service Bus and Cosmos DB...
In step 3 you will want to consider using Databricks Workflows for orchestration. The ADF Databricks notebook activity is not actively developed by Microsoft, and the API it uses is legacy per Databricks, so neither vendor is actively supporting t...
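To make that suggestion concrete, a minimal sketch (not from the original thread) of creating a Workflows job that runs a notebook via the Python SDK; the job name, notebook path, and cluster ID are placeholders:

```python
# pip install databricks-sdk
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()

job = w.jobs.create(
    name="nightly-ingest",  # placeholder job name
    tasks=[
        jobs.Task(
            task_key="ingest",
            notebook_task=jobs.NotebookTask(notebook_path="/Workspace/etl/ingest"),
            existing_cluster_id="0101-123456-abcdef",  # placeholder cluster
        )
    ],
)
print(f"Created job {job.job_id}")
```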
- 16395 Views
- 4 replies
- 0 kudos
Resolved! Help Needed: Errors with df.display() and df.show() in Databricks
Dear Databricks Community, I am reaching out to you for assistance with some issues I'm encountering in my Databricks environment. I'm hoping the community can provide some guidance to help me resolve these problems. 1. Error with df.display(): When I ...
Dear Databricks Community, I wanted to share some updates regarding the issues I've been encountering in my Databricks environment. After raising a ticket with Microsoft and collaborating with their team for approximately a week, we undertook several t...
- 3286 Views
- 2 replies
- 0 kudos
Asset bundle yml factorization
Hello, I have a project using asset bundles in which I have several jobs using roughly the same job definition (tags and job cluster definitions are always the same). Is there a way to put everything in common in a yml file and reuse that in each indiv...
@erigaud What might work (I actually never tried it myself so far) is this: define your complex variables in a separate yaml file (complex variables are supported since v0.222.0), import this file using include, and reference these variables accord...
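A sketch of that approach with hypothetical names, assuming bundle CLI v0.222.0+ for complex variables:

```yaml
# variables.yml (hypothetical file, pulled in via include)
variables:
  default_job_cluster:
    description: Shared job cluster definition
    type: complex
    default:
      spark_version: 14.3.x-scala2.12
      node_type_id: Standard_DS3_v2
      num_workers: 2

# databricks.yml
# include:
#   - variables.yml
#
# ...then inside each job definition, reuse the shared cluster:
# job_clusters:
#   - job_cluster_key: default
#     new_cluster: ${var.default_job_cluster}
```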
- 510 Views
- 1 replies
- 1 kudos
Capture error for databricks job
Greetings! We have created a Databricks job using a notebook. This notebook has 6 cells. Can we capture the success and failure (along with the error) and store it for monitoring and analysis? E.g. if we want to capture the below error
Hi @sukanya09, you can use the Jobs API; each run will have information about the status of each task in the job: https://docs.databricks.com/api/workspace/jobs/getrunoutput
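A sketch of pulling per-task status for a run with the Python SDK (equivalent to the REST endpoint above); the run ID is a placeholder, and where you store the results is up to you:

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

run = w.jobs.get_run(run_id=123456789)  # placeholder run ID
for task in run.tasks or []:
    # result_state is e.g. SUCCESS / FAILED; state_message carries the error text
    print(task.task_key, task.state.result_state, task.state.state_message)
```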
- 1279 Views
- 2 replies
- 0 kudos
Account Verification Code
There's a recent change in the way users in my company workspace now log in to Databricks. When logging into an instance, I get a prompt to check my email for an OTP that can be used to log in after entering the password. Email body as follows: Account ...
Hi @Retired_mod Is there a way to disable signing in with email OTPs and continue using only passwords? If so, please provide the steps. Thanks!
- 604 Views
- 2 replies
- 0 kudos
Why is the recommended default setting for the delta deleted file retention duration 7 days?
Due to frequent updates on the table, our backend storage size is growing a lot; even though we have VACUUM and OPTIMIZE scheduled, we are unable to clean up files 7 days old or less. Current settings are: delta.logRetentionDuration="interval 7 days" and deleted files ...
It's not really a recommendation per se; it's basically a default, which you simply need. And yes, it's supposed to be adapted to your specific needs. In this case: "I don't need the delta log for more than one day." If you're fine that you won't be able to rollb...
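For example, tightening both retention windows on a hypothetical table (at the cost of losing time travel and rollback beyond one day) might look like:

```sql
-- Hypothetical table; shortens history retention, then reclaims files
ALTER TABLE catalog.schema.events SET TBLPROPERTIES (
  'delta.logRetentionDuration' = 'interval 1 days',
  'delta.deletedFileRetentionDuration' = 'interval 1 days'
);

-- VACUUM still enforces a safety check below 7 days unless that check
-- is explicitly disabled on the cluster
VACUUM catalog.schema.events RETAIN 24 HOURS;
```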
- 2194 Views
- 4 replies
- 1 kudos
Resolved! Creating Azure Databricks Workspace Without NAT Gateway
Hello, recently when I create a new Databricks workspace on Azure, it automatically creates a NAT Gateway, which incurs additional cost! When creating the workspace I don't choose secure cluster connectivity, so I'm expecting not to have a NAT Gateway...
Hi @ziad, do you create the workspace with secure cluster connectivity? According to the documentation: "If you use secure cluster connectivity with the default VNet that Azure Databricks creates, Azure Databricks automatically creates a NAT gateway for o...
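For completeness, in Terraform the workspace setting that controls secure cluster connectivity is no_public_ip; a hedged sketch with placeholder names, assuming the standard azurerm provider:

```hcl
resource "azurerm_databricks_workspace" "this" {
  name                = "dbw-example"  # placeholder
  resource_group_name = azurerm_resource_group.this.name
  location            = azurerm_resource_group.this.location
  sku                 = "premium"

  custom_parameters {
    # false = cluster nodes get public IPs (secure cluster connectivity off),
    # which should avoid the managed NAT gateway in the default managed VNet
    no_public_ip = false
  }
}
```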