Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.


Forum Posts

punamrandive32
by New Contributor II
  • 685 Views
  • 1 reply
  • 0 kudos

Exam for Databricks Certified Data Engineer Associate

My Databricks Professional Data Engineer certification exam was suspended. The exam had only run for half an hour when it showed an error about eye movement while I was reading a question. The exam was suspended on 11 July 2024 and is still showing in progress assess...

Latest Reply
gchandra
Databricks Employee
  • 0 kudos

I'm sorry to hear your exam was suspended. Please file a ticket with our support team and allow them 24-48 hours for a resolution. You should also review this documentation: Room requirements, Behavioral considerations.

johnb1
by Contributor
  • 274 Views
  • 1 reply
  • 0 kudos

Access Git folder information from notebook

In my Workspace, I have a repository with a Git folder. I would like to access the following programmatically with Python from within a notebook: the name of the repo, and the currently checked-out branch in the repo. I want to do this in two different ways: (1) Access said informa...

Latest Reply
szymon_dybczak
Contributor III
  • 0 kudos

Hi @johnb1,

You can use one of the following options to achieve what you want:
  • Databricks CLI repos commands
  • Databricks Python SDK
  • Databricks REST API calls

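A minimal sketch of the REST option from the reply above, using only the standard library. It assumes `DATABRICKS_HOST` and `DATABRICKS_TOKEN` environment variables and the `GET /api/2.0/repos` endpoint; the repo path prefix below is a placeholder.

```python
# Sketch: look up a repo's workspace path and checked-out branch via the
# Databricks Repos REST API (GET /api/2.0/repos). Host, token, and the
# path prefix are placeholders -- adjust for your workspace.
import json
import os
import urllib.request
from urllib.parse import urlencode


def repos_list_url(host: str, path_prefix: str) -> str:
    """Build the list-repos URL filtered to a workspace path prefix."""
    return f"{host.rstrip('/')}/api/2.0/repos?" + urlencode({"path_prefix": path_prefix})


def parse_repo_info(response_body: str) -> list:
    """Extract (repo path, current branch) pairs from the API response JSON."""
    repos = json.loads(response_body).get("repos", [])
    return [(r.get("path", ""), r.get("branch", "")) for r in repos]


if __name__ == "__main__" and os.environ.get("DATABRICKS_HOST"):
    url = repos_list_url(os.environ["DATABRICKS_HOST"], "/Repos/someone@example.com")
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
    )
    with urllib.request.urlopen(req) as resp:
        for path, branch in parse_repo_info(resp.read().decode()):
            print(path, branch)
```

The Databricks Python SDK's `WorkspaceClient().repos.list()` wraps the same endpoint if you prefer not to hand-roll requests.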
camilo_s
by Contributor
  • 2464 Views
  • 6 replies
  • 3 kudos

Resolved! Hard reset programmatically

Is it possible to trigger a git reset --hard programmatically? I'm running a platform service where, as part of CI/CD, repos get deployed into the Databricks workspace. Normally, our developers work with upstream repos both from their local IDEs and fr...

Latest Reply
nicole_lu_PM
Databricks Employee
  • 3 kudos

Thank you for the feedback there! We recently added more docs for SP OAuth support for DevOps. SP OAuth support for GitHub is being discussed.

5 More Replies
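On the original question: updating a repo's branch through the Repos API (`PATCH /api/2.0/repos/{repo_id}`) checks out that branch and discards uncommitted local changes, which behaves much like a `git reset --hard` against the remote. A hedged standard-library sketch; the repo ID, host, and token are placeholders:

```python
# Sketch: build the PATCH request that re-checks-out `branch` for a repo,
# discarding local modifications in the workspace copy. Send it with
# urllib.request.urlopen (or use the Databricks SDK's repos.update).
import json
import urllib.request


def hard_reset_request(host: str, token: str, repo_id: int, branch: str) -> urllib.request.Request:
    """Build the Repos API PATCH request for a branch (re)checkout."""
    return urllib.request.Request(
        url=f"{host.rstrip('/')}/api/2.0/repos/{repo_id}",
        data=json.dumps({"branch": branch}).encode(),
        method="PATCH",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
```

In a CI/CD pipeline, the equivalent CLI call is `databricks repos update <repo-id> --branch <branch>`.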
jasont41
by New Contributor II
  • 496 Views
  • 1 reply
  • 2 kudos

Resolved! Trouble with host url parameterization

I am attempting to parameterize a Databricks YAML so I can deploy it to multiple Databricks accounts via GitLab CI/CD, and have run into a snag when parameterizing the workspace host value. My variable block looks like this: variables:    databricks_ho...

Latest Reply
szymon_dybczak
Contributor III
  • 2 kudos

Hi @jasont41, your assumption is correct. You can't use a variable for the host mapping. You can find information about it in the following documentation entry: https://docs.databricks.com/en/dev-tools/bundles/settings.html#other-workspace-mappings

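Since `workspace.host` can't come from a variable, the usual pattern is one `targets` entry per workspace and selecting the target at deploy time. A databricks.yml sketch; the bundle name and host URLs are placeholders:

```yaml
# databricks.yml sketch -- hosts are placeholders, one target per workspace.
bundle:
  name: my_bundle

targets:
  dev:
    workspace:
      host: https://adb-1111111111111111.11.azuredatabricks.net
  prod:
    workspace:
      host: https://adb-2222222222222222.22.azuredatabricks.net
```

The GitLab job then picks the workspace with `databricks bundle deploy -t dev` or `-t prod` instead of substituting a host variable.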
PabloCSD
by Contributor III
  • 1215 Views
  • 3 replies
  • 3 kudos

Resolved! Use a Service Principal Token instead of Personal Access Token for Databricks Asset Bundle

How can I connect using a Service Principal token? I did this, but it is not a PAT: databricks configure Databricks host: https:// ... Personal access token: **** I also tried this, but it didn't work either: [profile] host = <workspace-url> client_id ...

Latest Reply
PabloCSD
Contributor III
  • 3 kudos

Thanks Pedro, we did it. For anyone in the future (I added fake host and service principal IDs): 1. Modify your databricks.yml so it has the service principal ID and the Databricks host: bundle: name: my_workflow # Declare to Databricks Asset Bu...

2 More Replies
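For reference, the non-PAT route the thread converges on is OAuth machine-to-machine auth: a `.databrickscfg` profile with `client_id`/`client_secret` instead of `token`. A sketch with placeholder values only:

```ini
; ~/.databrickscfg sketch -- host, client_id, and client_secret are placeholders.
; client_id is the service principal's application (client) ID.
[my-sp-profile]
host          = https://adb-1234567890123456.7.azuredatabricks.net
client_id     = 00000000-0000-0000-0000-000000000000
client_secret = <oauth-secret-created-for-the-service-principal>
```

Tools that support unified auth (CLI, SDK, Asset Bundles) can then use `--profile my-sp-profile`, or the equivalent `DATABRICKS_CLIENT_ID`/`DATABRICKS_CLIENT_SECRET` environment variables in CI.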
dspatil
by New Contributor II
  • 316 Views
  • 2 replies
  • 0 kudos

Automatic schema rendering of files in unity catalog

Hi team, can anyone please confirm whether Unity Catalog supports automatic schema rendering from CSV, JSON, PDF, and structured/unstructured files? Meaning, if I create a volume with a path/location to a folder (or S3 bucket) having such files, can Unity Cata...

Latest Reply
dspatil
New Contributor II
  • 0 kudos

OK, thanks a lot @gchandra. So, I am new to Unity Catalog and particularly interested in (and evaluating) the open-sourced version of Unity Catalog (https://www.unitycatalog.io/). I know that we can create volumes and those in turn can point to csv, json...

1 More Replies
AlbertWang
by Contributor III
  • 826 Views
  • 2 replies
  • 1 kudos

Resolved! Error getting locations Unsupported response format: STREAM [Azure Databricks - Catalog Explorer]

Hi all, I saw this error when I checked my External Locations and Storage Credentials using Catalog Explorer. The error message gives zero information for diagnosis. Do you have any idea what the reason is? Thank you.

Latest Reply
AlbertWang
Contributor III
  • 1 kudos

And it works now ... No idea what happened ~

1 More Replies
chinmay0924
by New Contributor II
  • 818 Views
  • 4 replies
  • 0 kudos

How to disable spark connect in the databricks compute?

I want to be able to access the RDD methods of a DataFrame, but it seems that this is not supported in Spark Connect. I have been trying to disable Spark Connect in the Spark config using spark.databricks.service.server.enabled false, but when I check...

Latest Reply
Bareaj
New Contributor II
  • 0 kudos

I have found that when the cluster is shared, it automatically uses that type of session, and in that case, I have not been able to disable it. I don't know if this is your situation. I have avoided some problems that I had with the previous clause.

3 More Replies
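If the root cause is the cluster's access mode rather than a Spark conf (as the reply suggests for shared clusters), one possible workaround is creating the cluster in single-user (assigned) access mode, where a classic Spark session, and with it `df.rdd`, is available. A Clusters API payload sketch with placeholder values; verify the access mode trade-offs for your workspace before relying on this:

```json
{
  "cluster_name": "single-user-no-connect",
  "spark_version": "14.3.x-scala2.12",
  "node_type_id": "Standard_DS3_v2",
  "num_workers": 1,
  "data_security_mode": "SINGLE_USER",
  "single_user_name": "someone@example.com"
}
```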
ViliamG
by New Contributor
  • 208 Views
  • 0 replies
  • 0 kudos

MLFlow Tracking versions

Hi team, we are migrating from a self-hosted MLflow Tracking server to the Databricks-hosted one. However, there are concerns about the unclear process of version changes and releases on the Tracking server side. Is there any public information av...

AlbertWang
by Contributor III
  • 1187 Views
  • 5 replies
  • 2 kudos

Resolved! How to create Storage Credential using Service Principal [Azure]

As the document indicates, an Azure Databricks access connector is a first-party Azure resource that lets you connect managed identities to an Azure Databricks account. You must have the Contributor role or higher on the access connector resource in ...

Latest Reply
AlbertWang
Contributor III
  • 2 kudos

Thank you, @szymon_dybczak. This is what I thought. After deploying the Databricks workspace, it automatically creates the Databricks-managed `Access Connector for Azure Databricks` in the Databricks managed resource group. As I understand, I should c...

4 More Replies
sdick_vg
by New Contributor
  • 1419 Views
  • 1 reply
  • 1 kudos

Cluster Upsize Issue: Storage Download Failure Slow

Hi, we're currently experiencing the following issue across our entire Databricks Workspace when starting a cluster, running a workflow, or upscaling a running cluster. The errors we receive on our AP clusters and job clusters are bel...

Latest Reply
filipniziol
Contributor III
  • 1 kudos

Hi @sdick_vg, the error is about connectivity issues when trying to reach Azure Storage. Have you maybe enabled any kind of firewall in your organization recently? Could you run, for example, code to test DNS resolution to your storage account: Have you m...

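The DNS check the reply mentions can be as simple as the following standard-library sketch, runnable from a notebook cell; the storage account hostname is a placeholder:

```python
# Sketch: quick DNS resolution check for a storage endpoint. Returns the
# first resolved IP, or None if the name does not resolve (e.g. blocked
# by a firewall/private DNS misconfiguration).
import socket
from typing import Optional


def resolve(host: str) -> Optional[str]:
    """Resolve `host` on port 443 and return the first IP address, or None."""
    try:
        return socket.getaddrinfo(host, 443)[0][4][0]
    except socket.gaierror:
        return None


# Example (placeholder account name):
# resolve("mystorageacct.dfs.core.windows.net")
```

If this returns None from the cluster but an IP from your laptop, the problem is inside the workspace's network path (NSG, firewall, or private DNS), not Azure Storage itself.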
jkdatabricks
by New Contributor
  • 580 Views
  • 1 reply
  • 0 kudos

Error: PERMISSION_DENIED: AWS IAM role does

Hello, we are trying to set up a new workspace. However, we are getting the following error: Workspace failed to launch. Error: PERMISSION_DENIED: AWS IAM role does not have READ permissions on url s3://jk-databricks-prods3/unity-catalog/742920957025975. Pl...

Latest Reply
caldempseyai
New Contributor II
  • 0 kudos

Hey! I'm experiencing this with the latest Terraform release. Try 1.51.0 if you are deploying via TF; downgrading fixed this for me.

dpc
by New Contributor III
  • 547 Views
  • 2 replies
  • 0 kudos

table deployment (DDL) from one catalog to another

Hello,
We have a development, a test, and a production environment. How do you generally deploy DDL changes? That is, alter a table in development and apply it to test, then production. E.g. table1 has column1, column2, column3; I add column4. I now want to deploy this ch...

Latest Reply
dpc
New Contributor III
  • 0 kudos

Thanks. I'll step through this solution and see if I can get it working.

1 More Replies
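One lightweight way to script the promotion described in the question is to diff the column lists between environments and emit `ALTER TABLE ... ADD COLUMN` statements. A hedged sketch; the table and column names follow the example in the post, and in a notebook you would feed the dicts from `DESCRIBE TABLE` (or `spark.catalog.listColumns`) and run the output with `spark.sql`:

```python
# Sketch: generate ALTER statements for columns present in the source
# (dev) table but missing from the target (test/prod) table. The column
# dicts map column name -> SQL type and are plain inputs here.
def missing_column_ddl(table: str, source_cols: dict, target_cols: dict) -> list:
    """Return ALTER TABLE statements for columns in source_cols not in target_cols."""
    return [
        f"ALTER TABLE {table} ADD COLUMN {name} {sql_type}"
        for name, sql_type in source_cols.items()
        if name not in target_cols
    ]
```

This only covers added columns; type changes and drops need their own handling, and tools like Asset Bundles or migration scripts under version control are the more robust long-term answer.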
weilin0323
by New Contributor III
  • 244 Views
  • 1 reply
  • 0 kudos

Testing and Issues Related to Admin Role Changes

Hello, I would like to ask a question regarding user permissions. Currently, all team members are admins. Recently, we planned to change the admin roles so that only I and another user, A, will be admins. The other members will retain general usage permis...

Latest Reply
weilin0323
New Contributor III
  • 0 kudos

Hi @Kaniz_Fatma, can you please help me delete my post, as I accidentally resubmitted it? Thank you.


Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group