- 2962 Views
- 5 replies
- 2 kudos
Resolved! S3 access credentials: Pandas vs Spark
Hi, I need to read Parquet files located in S3 into a pandas DataFrame. I configured an "external location" to access my S3 bucket and have df = spark.read.parquet(s3_parquet_file_path) working perfectly well. However, df = pd.read_parquet(s3_parquet_file_...
Yes, you understand correctly. The Spark library in Databricks uses the Unity Catalog credential model, which includes the use of "external locations" for managing data access. This model ensures that access control and permissions are centrally mana...
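The reply above can be made concrete with a short sketch. Pandas does not go through the Unity Catalog external-location model; it reads S3 directly through fsspec/s3fs, so it needs its own credentials. The bucket path and secret-scope names below are hypothetical examples, not values from the thread.

```python
# Sketch: pandas bypasses Unity Catalog, so it needs explicit S3 credentials,
# passed via the fsspec/s3fs "storage_options" mapping. "key" and "secret"
# are the names s3fs understands; the values would normally come from a
# secret scope rather than being hard-coded.
storage_options = {
    "key": "AWS_ACCESS_KEY_ID_VALUE",         # e.g. dbutils.secrets.get("my-scope", "aws-key")
    "secret": "AWS_SECRET_ACCESS_KEY_VALUE",  # e.g. dbutils.secrets.get("my-scope", "aws-secret")
}

# Requires the s3fs package on the cluster:
# df = pd.read_parquet("s3://my-bucket/path/file.parquet",
#                      storage_options=storage_options)

# Alternatively, stay entirely inside the Unity Catalog credential model:
# df = spark.read.parquet("s3://my-bucket/path/").toPandas()
```

The `toPandas()` route avoids managing a second set of credentials, at the cost of pulling the whole dataset through the driver.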
- 2522 Views
- 2 replies
- 1 kudos
Assistance Required: Integrating Databricks ODBC Connector with Azure App Service
Hi, I have successfully established an ODBC connection with Databricks to retrieve data from the Unity Catalog in a local C# application using the Simba Spark ODBC Driver, and it is working as expected. I now need to integrate this functionality into a...
Hi @nanda_, so basically what you need to do is install the Simba ODBC driver in your Azure App Service environment. Then your code should work the same way as on your local machine. One possibility is to use Windows or Linux Containers on Azure App...
- 4814 Views
- 1 replies
- 0 kudos
Resolved! How to add 'additionallyAllowedTenants' in Databricks config or PySpark config?
I have a multi-tenant Azure app. I am using this app's credentials to read ADLS container files from a Databricks cluster using a PySpark DataFrame. I need to set this 'additionallyAllowedTenants' flag value to '*' or a specific tenant_id of the multi-ten...
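One hedged option for the question above: newer Azure Identity SDKs read a comma-separated list of additionally allowed tenants from the `AZURE_ADDITIONALLY_ALLOWED_TENANTS` environment variable, so setting it as a cluster-level environment variable (or on the driver, as sketched here) may achieve the same effect as the flag. This is an assumption about the SDK's environment-variable support, not a documented Databricks setting.

```python
import os

# Assumed behaviour: azure-identity credentials (e.g. DefaultAzureCredential)
# pick up extra allowed tenants from this environment variable. "*" permits
# token acquisition for any tenant; a specific tenant_id can be used instead.
os.environ["AZURE_ADDITIONALLY_ALLOWED_TENANTS"] = "*"
```

In practice this would be set in the cluster configuration (Advanced options > Environment variables) so it is present before any credential object is created.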
- 1349 Views
- 2 replies
- 1 kudos
Errors on Databricks and Terraform upgrades for DABS
Hi All, I've recently upgraded my Databricks CLI to 0.235 and the Terraform provider to 1.58 locally on my machine, and my DABs deployments have broken. They worked in the past with previous versions, and now I can't even run terraform -v. The command Data...
Hi @SumeshD, Can you run databricks bundle debug terraform to obtain more details on the failure? The error messages you are encountering, such as "ConflictsWith skipped for [task.0.for_each_task.0.task.0.new_cluster.0.aws_attributes task.0.for_each_...
- 1119 Views
- 1 replies
- 0 kudos
Resolved! Optional JDBC Parameters in external connection
Is there any way to specify optional JDBC parameters like batchSize through the External Connections created in Unity Catalog? (specifically I'm trying to speed up data retrieval from a SQL Server database)
As per information I found internally, it seems that this option is not currently supported.
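Since Unity Catalog external connections don't expose these knobs, a common workaround is a direct JDBC read from the cluster, where Spark's JDBC source accepts tuning options. Note that `fetchsize` (rows per round trip) is the option that affects read speed; `batchsize` only applies to writes. The connection details below are hypothetical.

```python
# Hypothetical connection details; in a notebook this dict would feed
# spark.read.format("jdbc").options(**jdbc_options).load()
jdbc_options = {
    "url": "jdbc:sqlserver://myserver.database.windows.net:1433;database=mydb",
    "dbtable": "dbo.big_table",
    "user": "reader",      # better: dbutils.secrets.get("my-scope", "sql-user")
    "password": "***",
    # "fetchsize" controls rows fetched per round trip on reads;
    # "batchsize" is the write-side equivalent and has no effect here.
    "fetchsize": "10000",
}
```

Adding `"numPartitions"`, `"partitionColumn"`, `"lowerBound"`, and `"upperBound"` can parallelise the read further when a suitable numeric column exists.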
- 6019 Views
- 3 replies
- 1 kudos
Can I configure Notebook Result Downloads with Databricks CLI , API or Terraform provider ?
I'm a Databricks Admin and I'm looking for a solution to automate some workspace security settings. Those are: Notebook result download, SQL result download, and Notebook table clipboard features. I can't find these options in the Databricks Terraform provider, Da...
Hi @Redford, with the Databricks API you have the capability to toggle the following features: Notebook result download (key name: enableResultsDownloading); Notebook table clipboard features (key name: enableNotebo...
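A sketch of how those keys would be toggled through the workspace configuration API. The host is a placeholder, the HTTP call is commented out so only the request shape is shown, and `enableNotebookTableClipboard` is my assumed expansion of the key name truncated in the reply above.

```python
import json

# Hypothetical workspace host; the real value comes from your deployment.
host = "https://adb-1234567890123456.7.azuredatabricks.net"
url = f"{host}/api/2.0/workspace-conf"

payload = {
    "enableResultsDownloading": "false",      # notebook result download
    "enableNotebookTableClipboard": "false",  # assumed key for table clipboard features
}
payload_json = json.dumps(payload)

# The actual call would be a PATCH with a bearer token, e.g.:
# import requests
# requests.patch(url,
#                headers={"Authorization": f"Bearer {token}"},
#                data=payload_json).raise_for_status()
```

The same endpoint with a GET and `?keys=enableResultsDownloading` would read the current value back, which is useful for verifying the automation.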
- 3383 Views
- 7 replies
- 0 kudos
Unknown geo-redundancy storage events (& costs) in Azure Databricks resource group
Hi All, I'm after some guidance on how to identify massive (100000%) spikes in bandwidth usage (and related costs) in the Azure Databricks provisioned/managed resource group storage account and stop them. These blips are adding 30-50% to our monthly costs...
Thanks for opening a case with us, we will have a look at it.
- 2181 Views
- 6 replies
- 0 kudos
Unity Catalog metastore is created within undesired storage account
I came to know that our Unity Catalog metastore has been created in the default storage account of our Databricks workspace, and this storage account has some system-denied access policies, therefore we don't have access to see the data inside. I'm w...
You will need to back up the current metastore, including the metadata, and then recreate the catalogs, schemas and tables on the new metastore.
- 2189 Views
- 10 replies
- 0 kudos
Update existing Metastores in AWS databricks
Hello Team, I am unable to update the existing metastore in my AWS Databricks. I have a new AWS account and I am trying to update my existing workspace; however, I am unable to update the S3 bucket details and network configuration (greyed out) in the...
Unfortunately there is no way to move the state of the workspace manually, so based on this the solution will be to recreate the workspace and migrate the data.
- 2603 Views
- 3 replies
- 1 kudos
How Can a Workspace Admin Grant Workspace Admin Permissions to a Group?
I want to grant Workspace Admin permissions to a group instead of individual users, but I haven’t found a way to do this. I considered assigning permissions by adding the group to the Databricks-managed 'admins' group (establishing a parent-child rel...
No problem! I will check internally if there is any feature request of this nature. You can use the "admins" group for adding admin users or SPs.
- 1623 Views
- 2 replies
- 1 kudos
Resolved! DLT TABLES schema mapping
How do we map Delta Live Tables to bronze, silver, and gold schemas? Is it possible to store the DLT tables in different schemas?
- 1304 Views
- 2 replies
- 0 kudos
Azure DevOps Repos Databricks update via pipeline not working
Hi all, I'm working with Azure DevOps and Databricks, using an app registration which has permission on Azure DevOps and inside Databricks as manager and user, and is in the admins group, so it has permission over the repos. I'm building a pipeline to update or c...
Hello @TatiMun, thanks for your question. Can we review the following: Verify remote URL: double-check that the remote Git repo URL associated with the REPO_ID in Databricks is correct and accessible. Check PAT permissions: ensure that the Personal Acc...
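For context on what the pipeline is calling: updating a repo to a given branch goes through the Repos API (`PATCH /api/2.0/repos/{repo_id}`). The repo id and branch below are hypothetical; only the request shape is built, so a 403/404 from the real call would point back at the PAT or remote-URL checks in the reply above.

```python
# Hypothetical repo id and target branch; the real values come from the
# pipeline's variables. The PATCH body for the Repos API takes either
# "branch" or "tag".
repo_id = 1234567890
branch = "main"

endpoint = f"/api/2.0/repos/{repo_id}"
body = {"branch": branch}

# import requests
# requests.patch(f"{host}{endpoint}",
#                headers={"Authorization": f"Bearer {token}"},
#                json=body).raise_for_status()
```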
- 2751 Views
- 2 replies
- 1 kudos
Security considerations for OAuth secrets when using a Service Principal to authenticate with Databricks
What are the security considerations we need to keep in mind when we want to use OAuth secrets with a Service Principal to access Azure Databricks when identity federation is disabled and the workspace is not yet onboarded onto Unity Catalog? Can we co...
Any updates on this? Also struggling with the OAuth security considerations, specifically with updating the OAuth secrets. Currently using an SP to access the Databricks workspace for DevOps purposes through the Databricks CLI. I have the SP set up to renew ...
- 2007 Views
- 4 replies
- 0 kudos
Database Error in model Couldn't initialize file system for path abfss://
Recently the following error occurs when running DBT: Database Error in model un_unternehmen_sat (models/2_un/partner/sats/un_unternehmen_sat.sql) Couldn't initialize file system for path abfss://dp-ext-fab@stcssdpextfabprd.dfs.core.windows.net/__unitys...
Hi @Th0r, here is the explanation: shallow clones in Databricks rely on references to the data files of the original table. If the original table is dropped, recreated, or altered in a way that changes its underlying files, the shallow clone's references ...
- 891 Views
- 0 replies
- 1 kudos
Databricks Asset Bundles: a new way for amazing ETL
They say having the right tools at your disposal can make all the difference when navigating complex terrain. For organizations leveraging Databricks, simplifying deployment and scaling operations is often a key challenge. Over the years, I've explor...
Labels:
- Access control (1)
- Apache spark (1)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (75)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)