- 1488 Views
- 4 replies
- 0 kudos
Get a static IP for my Databricks App
Hello, I'm trying to find out how to set up a static IP for an Azure Databricks App. I tried setting up a NAT gateway to get a static IP for the workspace, but it doesn't change anything: I still can't access my OpenAI resource even when I authorize the NAT...
- 0 kudos
Hi, I’m following up here as I have the same issue. Did the solution provided in the replies help resolve this for you?
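For anyone hitting the same wall: a quick sanity check is to print the egress IP your app (or a workspace notebook) actually presents, and compare it with the NAT gateway IP you allow-listed on the OpenAI resource. A minimal sketch; the echo service URL is just an example, not a Databricks endpoint:

```python
# Print the public IP this environment egresses from, to verify whether
# traffic is really leaving through the NAT gateway's static IP.
# https://ifconfig.me is an arbitrary example echo service.
import requests

egress_ip = requests.get("https://ifconfig.me/ip", timeout=10).text.strip()
print(f"Current egress IP: {egress_ip}")
```

If the printed IP differs from the NAT gateway's, the app's traffic is not routing through the gateway, which would explain the blocked OpenAI calls.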
- 815 Views
- 1 replies
- 1 kudos
Determining spill from system tables
I'm trying to optimize machine selection (D, E, or L types on Azure) for job clusters and all-purpose compute and am struggling to identify where performance is sagging on account of disk spill. Disk spill would suggest that more memory is needed. ...
- 1 kudos
For historical diagnostics, you might need to consider setting up a custom logging mechanism that captures these metrics over time and stores them in a persistent storage solution, such as a database or a logging service. This way, you can query and ...
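A minimal sketch of that logging idea, assuming a notebook where `spark` is defined and the Spark monitoring REST API is reachable on the driver (the port lookup and target table name are assumptions, not a guaranteed recipe):

```python
# Poll the Spark monitoring REST API for per-stage spill metrics and append
# them to a table for later analysis. memoryBytesSpilled and diskBytesSpilled
# are standard fields of the /stages endpoint.
import requests

ui_port = spark.conf.get("spark.ui.port", "4040")
app_id = spark.sparkContext.applicationId
stages = requests.get(
    f"http://localhost:{ui_port}/api/v1/applications/{app_id}/stages",
    timeout=10,
).json()

rows = [(app_id, s["stageId"], s["memoryBytesSpilled"], s["diskBytesSpilled"]) for s in stages]
(
    spark.createDataFrame(rows, "app_id string, stage_id long, memory_spilled long, disk_spilled long")
    .write.mode("append")
    .saveAsTable("observability.spark_stage_spill")  # hypothetical target table
)
```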
- 3062 Views
- 15 replies
- 0 kudos
Resolved! Permissions error on cluster requirements.txt installation
Hi Databricks Community, I'm looking to resolve the following error: Library installation attempted on the driver node of cluster {My cluster ID} and failed. Please refer to the following error message to fix the library or contact Databricks support. ...
- 0 kudos
Noting here for other users: I was able to resolve the issue on a shared cluster by cloning the cluster and using the clone.
- 2294 Views
- 8 replies
- 3 kudos
PrivateLink Validation Error when trying to access the Workspace
We have a workspace that was deployed on a customer AWS architecture using the Terraform PrivateLink guide: https://registry.terraform.io/providers/databricks/databricks/latest/docs/guides/aws-private-link-workspace. The fact is, when we disable the Public Acc...
- 3 kudos
Can you share your workspace ID so I can run a validation?
- 602 Views
- 2 replies
- 0 kudos
Can't create cluster in AWS with p3 instance type
Hi, I'm trying to create a `p3.2xlarge` cluster in my workspace, but the cluster fails to instantiate, specifically with this error message: `No zone supports both the driver instance type [p3.2xlarge] and the worker instance type [p3.2xlarge]` (though I ...
- 0 kudos
Yes, sorry for the double post (I couldn't figure out how to delete this one).
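Returning to the zone error itself: since it says no single availability zone supports both driver and worker, one thing worth trying is letting Databricks auto-select the zone. A hedged sketch with the databricks-sdk; the runtime label and sizing are illustrative, not a confirmed fix:

```python
# Create a GPU cluster with zone_id="auto" so Databricks picks an AZ that
# has p3.2xlarge capacity, instead of being pinned to one zone.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import compute

w = WorkspaceClient()
cluster = w.clusters.create(
    cluster_name="p3-zone-auto-test",
    spark_version="15.4.x-gpu-ml-scala2.12",  # example GPU runtime label
    node_type_id="p3.2xlarge",
    num_workers=1,
    aws_attributes=compute.AwsAttributes(zone_id="auto"),
).result()
print(cluster.state)
```

If auto-selection still fails, the workspace's subnets may simply not cover a zone that offers p3 instances in that region.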
- 362 Views
- 1 replies
- 0 kudos
Querying on multi-node cluster on AWS does not complete
Querying in isolation mode is completely fine, but when I run the same query on the multi-node cluster it does not complete or error out. Any assistance troubleshooting this issue? Oh, and Happy New Year if you're reading this.
- 0 kudos
Hello John, Happy New Year to you. Can you please confirm what error message you received? When you say isolation mode, do you mean a single-node cluster, or do you mean a single-user cluster while the other is in shared mode?
- 1697 Views
- 5 replies
- 2 kudos
Resolved! S3 access credentials: Pandas vs Spark
Hi, I need to read Parquet files located in S3 into a Pandas dataframe. I configured an "external location" to access my S3 bucket and have df = spark.read.parquet(s3_parquet_file_path) working perfectly well. However, df = pd.read_parquet(s3_parquet_file_...
- 2 kudos
Yes, you understand correctly. The Spark library in Databricks uses the Unity Catalog credential model, which includes the use of "external locations" for managing data access. This model ensures that access control and permissions are centrally mana...
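A minimal workaround sketch while pandas lacks those credentials: if the data fits in driver memory, read through Spark (which does honor the external location) and convert. The S3 path is a placeholder:

```python
# Read via Spark, which resolves credentials through the Unity Catalog
# external location, then hand the result to pandas on the driver.
import pandas as pd

s3_parquet_file_path = "s3://my-bucket/path/to/data.parquet"  # placeholder
pdf: pd.DataFrame = spark.read.parquet(s3_parquet_file_path).toPandas()
```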
- 1446 Views
- 2 replies
- 1 kudos
Assistance Required: Integrating Databricks ODBC Connector with Azure App Service
Hi, I have successfully established an ODBC connection with Databricks to retrieve data from the Unity Catalog in a local C# application using the Simba Spark ODBC Driver, and it is working as expected. I now need to integrate this functionality into a...
- 1 kudos
Hi @nanda_, so basically what you need to do is install the Simba ODBC driver in your Azure App Service environment. Then your code should work the same way as on your local machine. One possibility is to use Windows or Linux Containers on Azure App...
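As a quick smoke test that the driver is actually installed and reachable inside the App Service container, something like the following helps (shown in Python with pyodbc rather than the thread's C#; host, HTTP path, and token are placeholders):

```python
# Open an ODBC connection through the Simba Spark ODBC Driver and run a
# trivial query. AuthMech=3 means token auth with UID "token".
import pyodbc

conn = pyodbc.connect(
    "Driver=Simba Spark ODBC Driver;"
    "Host=adb-1234567890123456.7.azuredatabricks.net;"  # placeholder host
    "Port=443;"
    "HTTPPath=/sql/1.0/warehouses/abc123;"  # placeholder HTTP path
    "SSL=1;ThriftTransport=2;AuthMech=3;"
    "UID=token;PWD=<personal-access-token>;",  # placeholder token
    autocommit=True,
)
print(conn.cursor().execute("SELECT 1").fetchone())
```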
- 3595 Views
- 1 replies
- 0 kudos
Resolved! How to add 'additionallyAllowedTenants' in Databricks config or PySpark config?
I have a multi-tenant Azure app. I am using this app's credentials to read ADLS container files from a Databricks cluster using a PySpark dataframe. I need to set the 'additionallyAllowedTenants' flag to '*' or a specific tenant_id of the multi-ten...
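For context, I'm not aware of an 'additionallyAllowedTenants' knob on the Spark/Hadoop side; the usual pattern is to point the OAuth token endpoint at the tenant that owns the storage account, which covers the cross-tenant case. A hedged sketch; all IDs are placeholders:

```python
# Standard ABFS OAuth configuration where the client endpoint names the
# storage account's tenant explicitly (rather than relying on a wildcard).
storage_account = "mystorageaccount"          # placeholder
storage_tenant_id = "<tenant-id-of-storage>"  # the specific tenant, not '*'

base = f"{storage_account}.dfs.core.windows.net"
spark.conf.set(f"fs.azure.account.auth.type.{base}", "OAuth")
spark.conf.set(
    f"fs.azure.account.oauth.provider.type.{base}",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
)
spark.conf.set(f"fs.azure.account.oauth2.client.id.{base}", "<app-client-id>")
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{base}", "<app-client-secret>")
spark.conf.set(
    f"fs.azure.account.oauth2.client.endpoint.{base}",
    f"https://login.microsoftonline.com/{storage_tenant_id}/oauth2/token",
)
```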
- 846 Views
- 2 replies
- 1 kudos
Errors on Databricks and Terraform upgrades for DABS
Hi All, I've recently upgraded my Databricks CLI to 0.235 and the Terraform provider to 1.58 locally on my machine, and my DABs deployments have broken. They worked in the past with previous versions, and now I can't even run `terraform -v`. The command Data...
- 1 kudos
Hi @SumeshD, can you run `databricks bundle debug terraform` to obtain more details on the failure? The error messages you are encountering, such as "ConflictsWith skipped for [task.0.for_each_task.0.task.0.new_cluster.0.aws_attributes task.0.for_each_...
- 704 Views
- 1 replies
- 0 kudos
Resolved! Optional JDBC Parameters in external connection
Is there any way to specify optional JDBC parameters like batchSize through the External Connections created in Unity Catalog? (specifically I'm trying to speed up data retrieval from a SQL Server database)
- 0 kudos
As per information I found internally, it seems that this option is not currently supported.
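As a workaround, the Spark JDBC source can be used directly, where read tuning options are exposed; note that `fetchsize` governs reads while `batchsize` applies to writes. A minimal sketch with placeholder URL and credentials:

```python
# Read from SQL Server with an explicit fetch size to reduce round trips.
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://myserver.database.windows.net:1433;database=mydb")  # placeholder
    .option("dbtable", "dbo.my_table")  # placeholder
    .option("user", "<user>")
    .option("password", "<password>")
    .option("fetchsize", "10000")  # rows fetched per round trip
    .load()
)
```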
- 5203 Views
- 3 replies
- 1 kudos
Can I configure Notebook Result Downloads with the Databricks CLI, API, or Terraform provider?
I'm a Databricks Admin and I'm looking for a solution to automate some workspace security settings, namely: notebook result download, SQL result download, and notebook table clipboard features. I can't find these options in the Databricks Terraform provider, Da...
- 1 kudos
Hi @Redford, with the Databricks API you can toggle the following features: notebook result download (key name: enableResultsDownloading), notebook table clipboard features (key name: enableNotebo...
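A hedged sketch of flipping one of those keys through the workspace-conf REST API (`PATCH /api/2.0/workspace-conf`); the host and token are placeholders, and only the key names quoted above are used:

```python
# Disable notebook result downloads via the workspace-conf API.
# Values are the strings "true"/"false", not booleans.
import requests

host = "https://<workspace-host>"  # placeholder
token = "<admin-pat>"              # placeholder

resp = requests.patch(
    f"{host}/api/2.0/workspace-conf",
    headers={"Authorization": f"Bearer {token}"},
    json={"enableResultsDownloading": "false"},
)
resp.raise_for_status()
```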
- 2318 Views
- 7 replies
- 0 kudos
Unknown geo-redundancy storage events (& costs) in Azure Databricks resource group
Hi All, I'm after some guidance on how to identify massive (100000%) spikes in bandwidth usage (and related costs) in the Azure Databricks provisioned/managed resource group storage account, and how to stop them. These blips are adding 30-50% to our monthly costs...
- 0 kudos
Thanks for opening a case with us, we will have a look at it.
- 1428 Views
- 6 replies
- 0 kudos
Unity Catalog metastore is created within an undesired storage account
I came to know that our Unity Catalog metastore has been created in the default storage account of our Databricks workspace, and this storage account has some system-denied access policies, so we don't have access to see the data inside. I'm w...
- 0 kudos
You will need to back up the current metastore, including the metadata, and then start recreating the catalogs, schemas, and tables on the new metastore.
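A minimal sketch of that metadata-backup step, walking the current metastore and dumping table DDL; grants, views, and managed-table data still need separate handling:

```python
# Capture CREATE TABLE statements for every table in every non-system
# catalog, so objects can be replayed against the new metastore.
ddl = []
for cat in [r.catalog for r in spark.sql("SHOW CATALOGS").collect()]:
    if cat in ("system", "samples"):  # skip built-in catalogs
        continue
    for sch in [r.databaseName for r in spark.sql(f"SHOW SCHEMAS IN {cat}").collect()]:
        for tbl in [r.tableName for r in spark.sql(f"SHOW TABLES IN {cat}.{sch}").collect()]:
            stmt = spark.sql(f"SHOW CREATE TABLE {cat}.{sch}.{tbl}").collect()[0][0]
            ddl.append(stmt + ";")

print("\n\n".join(ddl))
```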
- 1374 Views
- 10 replies
- 0 kudos
Update existing metastores in AWS Databricks
Hello Team, I am unable to update the existing metastore in my AWS Databricks. I have a new AWS account and I am trying to update my existing workspace; however, I am unable to update the S3 bucket details and network configuration (greyed out) in the...
- 0 kudos
Unfortunately, there is no way to move the state of the workspace manually, so the solution will be to recreate the workspace and migrate the data.