- 1045 Views
- 1 replies
- 0 kudos
Access Git folder information from notebook
In my Workspace, I have a repository with a Git folder. I would like to access the following programmatically with Python from within a notebook: the name of the repo, and the currently checked-out branch in the repo. I want to do this in two different ways: (1) Access said informa...
Hi @johnb1, you can use one of the following options to achieve what you want: the Databricks CLI repos commands, the Databricks Python SDK, or Databricks REST API calls.
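For the SDK option, a minimal sketch (assumes the `databricks-sdk` package is available on the cluster and that notebook-native authentication is picked up automatically; the path prefix below is a placeholder):

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # in a notebook this usually picks up the runtime's auth automatically

# List Git folders under a known workspace path prefix (placeholder path).
for repo in w.repos.list(path_prefix="/Repos/my.user@example.com/my_repo"):
    print("repo path:      ", repo.path)    # workspace path of the Git folder
    print("remote URL:     ", repo.url)     # remote repository URL (contains the repo name)
    print("checked-out ref:", repo.branch)  # currently checked-out branch
```

The repo name can be derived from either `repo.path` or `repo.url`.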
- 4819 Views
- 6 replies
- 3 kudos
Resolved! Hard reset programmatically
Is it possible to trigger a git reset --hard programmatically? I'm running a platform service where, as part of CI/CD, repos get deployed into the Databricks workspace. Normally, our developers work with upstream repos both from their local IDEs and fr...
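One approach that is often sufficient is to re-point the Git folder at the target branch through the Repos API, which checks out that branch's latest commit; a minimal sketch with the Python SDK (the repo path and branch are placeholders, and whether local changes are discarded in your setup should be verified against the current Repos API docs):

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Locate the deployed Git folder by its workspace path (placeholder path).
repo = next(w.repos.list(path_prefix="/Repos/ci-bot@example.com/my_service"))

# Re-point the Git folder at the target branch; the Repos API checks out
# that branch's latest commit.
w.repos.update(repo_id=repo.id, branch="main")
```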
Thank you for the feedback there! We recently added more docs for SP OAuth support for DevOps. SP OAuth support for GitHub is being discussed.
- 1520 Views
- 1 replies
- 2 kudos
Resolved! Trouble with host URL parameterization
I am attempting to parameterize a Databricks YAML so I can deploy it to multiple Databricks accounts via GitLab CI/CD, and have run into a snag when parameterizing the workspace host value. My variable block looks like this: variables: databricks_ho...
Hi @jasont41, your assumption is correct. You can't use a variable for the host mapping. You can find information about it in the following documentation entry: https://docs.databricks.com/en/dev-tools/bundles/settings.html#other-workspace-mappings
- 5942 Views
- 3 replies
- 3 kudos
Resolved! Use a Service Principal Token instead of Personal Access Token for Databricks Asset Bundle
How can I connect using a Service Principal token? I did this, but it is not a PAT: databricks configure Databricks host: https:// ... Personal access token: **** I also tried this, but it didn't work either: [profile] host = <workspace-url> client_id ...
Thanks Pedro, we did it. For anyone in the future (I added fake host and service principal IDs): 1. Modify your databricks.yml so it has the service principal ID and the Databricks host: bundle: name: my_workflow # Declare to Databricks Assets Bu...
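For completeness, the same service-principal credentials also work programmatically; a minimal Python SDK sketch, assuming a Databricks OAuth secret has been created for the service principal (host, client ID, and secret below are placeholders; with an Entra ID secret you would use the azure_client_id/azure_client_secret/azure_tenant_id parameters instead):

```python
from databricks.sdk import WorkspaceClient

# OAuth machine-to-machine auth with a service principal -- no PAT involved.
# Host, client ID, and secret are placeholders.
w = WorkspaceClient(
    host="https://adb-1234567890123456.7.azuredatabricks.net",
    client_id="00000000-0000-0000-0000-000000000000",  # service principal application ID
    client_secret="<oauth-secret>",                     # ideally read from a secret store
)

print(w.current_user.me().user_name)  # prints the service principal's identity
```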
- 1012 Views
- 2 replies
- 0 kudos
Automatic schema rendering of files in Unity Catalog
Hi team, can anyone please confirm whether Unity Catalog supports automatic schema rendering from CSV, JSON, PDFs, and structured/unstructured files? Meaning, if I create a volume with a path/location to a folder (or S3 bucket) having such files, can Unity Cata...
OK, thanks a lot @gchandra. So, I am new to Unity Catalog and particularly interested in (and evaluating) the open-sourced version of Unity Catalog (https://www.unitycatalog.io/). I know that we can create volumes and those in turn can point to CSV, JSON...
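On the schema question itself: the schema comes from the engine reading the files rather than from the catalog; a minimal sketch of inferring it at read time in a Databricks notebook (the volume path is a placeholder):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already available in a Databricks notebook

# Schema is inferred when the files are read, not stored by the volume itself.
df = (
    spark.read.format("csv")
    .option("header", True)
    .option("inferSchema", True)
    .load("/Volumes/my_catalog/my_schema/my_volume/landing/")  # placeholder volume path
)
df.printSchema()
```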
- 1668 Views
- 2 replies
- 1 kudos
Resolved! Error getting locations Unsupported response format: STREAM [Azure Databricks - Catalog Explorer]
Hi all, I saw this error when I checked my External Locations and Storage Credentials and used Catalog Explorer. The error message gives zero information for diagnosis. Do you have any idea what the reason is? Thank you.
And it works now ... No idea what happened ~
- 3275 Views
- 4 replies
- 1 kudos
How to disable Spark Connect in Databricks compute?
I want to be able to access the RDD methods of a DataFrame, but it seems that this is not supported in Spark Connect. I have been trying to disable Spark Connect in the Spark config using spark.databricks.service.server.enabled false, but when I check...
I have found that when the cluster is shared, it automatically uses that type of session, and in that case, I have not been able to disable it. I don't know if this is your situation. I have avoided some problems that I had with the previous clause.
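One way to confirm which kind of session the notebook actually got is a small probe like the one below (assumes a Databricks notebook where `spark` is predefined; the helper name is just illustrative):

```python
def is_spark_connect(session) -> bool:
    # Spark Connect session classes live under the pyspark.sql.connect package.
    return type(session).__module__.startswith("pyspark.sql.connect")

if is_spark_connect(spark):
    print("Spark Connect session: DataFrame.rdd and SparkContext APIs are not available; "
          "a single-user (dedicated access mode) cluster is needed for RDD code.")
else:
    print("Classic Spark session: RDD APIs should be available.")
```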
- 4173 Views
- 5 replies
- 2 kudos
Resolved! How to create Storage Credential using Service Principal [Azure]
As the documentation indicates, an Azure Databricks access connector is a first-party Azure resource that lets you connect managed identities to an Azure Databricks account. You must have the Contributor role or higher on the access connector resource in ...
Thank you, @szymon_dybczak. This is what I thought. After deploying the Databricks workspace, it automatically creates the Databricks-managed `Access Connector for Azure Databricks` in the Databricks-managed resource group. As I understand, I should c...
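If you end up scripting the credential creation, a hedged sketch with the Python SDK (the credential name and access connector resource ID are placeholders, the class and parameter names follow the current databricks-sdk catalog module, and the metastore permissions discussed above are assumed to be in place):

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import catalog

w = WorkspaceClient()

# Credential name and access connector resource ID are placeholders.
cred = w.storage_credentials.create(
    name="adls_credential",
    azure_managed_identity=catalog.AzureManagedIdentityRequest(
        access_connector_id=(
            "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
            "Microsoft.Databricks/accessConnectors/<connector-name>"
        )
    ),
    comment="Created with the Python SDK",
)
print(cred.name)
```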
- 5237 Views
- 1 replies
- 1 kudos
Cluster Upsize Issue: Storage Download Failure Slow
Hi, we're currently experiencing the following issue across our entire Databricks workspace when either starting a cluster, running a workflow, or upscaling a running cluster. The errors we receive on our AP clusters and job clusters are bel...
Hi @sdick_vg, the error is about connectivity issues when trying to reach Azure Storage. Have you maybe enabled any kind of firewall in your organization recently? Could you run, for example, code to test DNS resolution to your storage account: Have you m...
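The example code referenced in that reply is cut off above; a stand-in sketch for that kind of DNS check (the storage account name is a placeholder):

```python
import socket

# Placeholder storage account endpoint the cluster is failing to reach.
host = "mystorageaccount.blob.core.windows.net"
try:
    addresses = sorted({info[4][0] for info in socket.getaddrinfo(host, 443)})
    print(f"{host} resolves to: {addresses}")
except socket.gaierror as exc:
    print(f"DNS resolution failed for {host}: {exc}")
```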
- 1663 Views
- 1 replies
- 0 kudos
Error: PERMISSION_DENIED: AWS IAM role does
Hello, we are trying to set up a new workspace. However, we are getting the following error: Workspace failed to launch. Error: PERMISSION_DENIED: AWS IAM role does not have READ permissions on url s3://jk-databricks-prods3/unity-catalog/742920957025975. Pl...
Hey! I'm experiencing this with the latest Terraform release. If you are deploying via TF, try 1.51.0; downgrading fixed this for me.
- 1754 Views
- 2 replies
- 0 kudos
Table deployment (DDL) from one catalog to another
Hello, we have a development, a test, and a production environment. How do you generally deploy DDL changes? That is, alter a table in development and apply it to test, then production. E.g. table1 has column1, column2, column3; I add column4. I now want to deploy this ch...
Thanks. I'll step through this solution and see if I can get it working.
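For reference, one common pattern is to keep the DDL in source control and apply the same statement to each environment's catalog in promotion order (usually one environment per CI/CD stage); a minimal sketch with placeholder catalog and schema names:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already available in a Databricks notebook

# Apply the same additive DDL to each environment's catalog in promotion order.
# Catalog and schema names are placeholders; table1/column4 follow the example above.
for catalog in ["dev", "test", "prod"]:
    spark.sql(f"ALTER TABLE {catalog}.my_schema.table1 ADD COLUMNS (column4 STRING)")
```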
- 1537 Views
- 1 replies
- 0 kudos
Testing and Issues Related to Admin Role Changes
Hello, I would like to ask a question regarding user permissions. Currently, all team members are admins. We recently planned to change the admin roles so that only I and another user, A, will be admins. The other members will retain general usage permis...
Hi @Kaniz_Fatma, can you please help me delete my post? I accidentally resubmitted it. Thank you.
- 1006 Views
- 1 replies
- 0 kudos
How to fully enable row-level concurrency on Databricks 14.1
Hey guys, I hope whoever's reading this is doing well. We're trying to enable row-level concurrency on Databricks 14.1. However, the documentation seems a bit contradictory as to whether this is possible, and whether the whole capability of row-level c...
Yes, your understanding is correct. As a best practice, use the 14.3 LTS DBR, since LTS releases are supported longer than the six months of a non-LTS release.
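On the table side, a hedged sketch of setting the related Delta table properties (the table name is a placeholder; which of these your DBR version actually requires for row-level concurrency should be cross-checked against the docs):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already available in a Databricks notebook

# Placeholder table name; enables deletion vectors and row tracking on a Delta table.
spark.sql("""
    ALTER TABLE main.my_schema.my_table
    SET TBLPROPERTIES (
        'delta.enableDeletionVectors' = 'true',
        'delta.enableRowTracking' = 'true'
    )
""")
```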
- 727 Views
- 1 replies
- 0 kudos
Unable to publish notebook
Hello, I receive an error each time I attempt to publish my notebook. The error message does not provide any additional information on why it is occurring. I need to publish this for a bootcamp assignment, so please let me know what I can do to resol...
Try it from a different browser, or try it in incognito mode.
- 1564 Views
- 2 replies
- 0 kudos
Allow non-admin users to view the driver logs from a Unity Catalog-enabled pipeline
We are trying to enable the option that allows non-admin users to read logs in the Databricks workspace, but we are not able to see an obvious way to check them. The documentation does not show, after enabling the below property, where to check the...
Do they see these links in the pipeline?