Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

daniel23
by New Contributor II
  • 2620 Views
  • 0 replies
  • 0 kudos

Delete Users that are Maintenance Readers

I am an Account Admin at Databricks (Azure), and I am trying to delete users that are being offboarded. I have managed to delete most users. However, for a couple, I get the following message (see screenshot): ABORTED: Account <account> is read-only during ...

Tito
by New Contributor II
  • 2855 Views
  • 0 replies
  • 0 kudos

VS Code Databricks Connect Cluster Configuration

I am currently setting up the VSCode extension for Databricks Connect, and it’s working fine so far. However, I have a question about cluster configurations. I want to access Unity Catalog from VSCode through the extension, and I’ve noticed that I ca...

FlukeStarbucker
by New Contributor III
  • 1504 Views
  • 4 replies
  • 1 kudos

Resolved! Unable to get S3 connection working

I can't get past the error below. I've read and reread the instructions several times at the URL below and for the life of me cannot figure out what I'm missing in my AWS setup. Any tips on how to track down my issue? https://docs.databricks.com/en/c...

Latest Reply
FlukeStarbucker
New Contributor III
  • 1 kudos

I got it working; there was a weird typo where the role ARN was duplicated. Thanks.

3 More Replies
unj1m
by New Contributor III
  • 2598 Views
  • 0 replies
  • 0 kudos

Getting "Data too long for column 'session_data'" creating a CACHE table

Hi, I'm trying to leverage CACHE TABLE to create temporary tables that are cleaned up at the end of the session. In creating one of these, I'm getting "Data too long for column 'session_data'". The query I'm using isn't referencing a session_data colu...

mx2323
by New Contributor
  • 788 Views
  • 1 reply
  • 0 kudos

Samples catalog doesn't have an information schema

We are looking to do an integration with Databricks, and I've noticed that the samples database doesn't have an INFORMATION_SCHEMA. We rely on the existence of the information_schema to help us understand what views / tables exist in each catalog. W...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

The "samples" catalog in Databricks does not have an INFORMATION_SCHEMA because it is designed primarily for demonstration and educational purposes, rather than for production use. This schema is typically included in catalogs created on Unity Catalo...

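As a workaround, an integration can read table metadata from the information_schema of any regular Unity Catalog catalog instead of samples. A minimal sketch, assuming a notebook where `spark` is available; the catalog name `my_catalog` is a placeholder:

```python
# Minimal sketch: build the metadata query an integration would run
# against a regular Unity Catalog catalog (catalog name is a placeholder).
def list_tables_sql(catalog: str) -> str:
    """Return SQL listing the views/tables in every schema of `catalog`."""
    return (
        f"SELECT table_schema, table_name, table_type "
        f"FROM {catalog}.information_schema.tables"
    )

# In a notebook: spark.sql(list_tables_sql("my_catalog")).show()
```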
NetRider
by New Contributor
  • 1687 Views
  • 1 reply
  • 0 kudos

Issue when connecting to Databricks using OAuth 2.0 token connection string

Following the instructions on Authentication settings for the Databricks ODBC Driver | Databricks on AWS I have created both Personal access token and OAuth 2.0 token connection options in an application. However, I realized that, when I use OAuth 2....

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

This will require further analysis, based on the information it seems there is no direct way to restrict this. Are you able to submit a support case so this can be analyzed? 

erigaud
by Honored Contributor
  • 2546 Views
  • 1 reply
  • 0 kudos

Resolved! Databricks connect - SQL Server - Login error with all purpose cluster

Hello everyone, I'm using the Databricks connect feature to connect to a SQL Server in the cloud. I created a foreign catalog based on the connection, but whenever I try to access the tables, I get a login error. I have tried with a serverless cluster and...

Latest Reply
erigaud
Honored Contributor
  • 0 kudos

Solved. It turns out it was a networking issue; once the subnets from Databricks were allowed by the cloud SQL Server, we managed to connect. The error message is misleading because the credentials were correct.

bean
by New Contributor II
  • 5368 Views
  • 3 replies
  • 1 kudos

Resolved! PERMISSION_DENIED: User is not an owner of Table/Schema

Hi, we have recently added a service principal for running and managing all of our jobs. The service principal has ALL PRIVILEGES on our catalogs, schemas, and tables. But we're still seeing the error message `PERMISSION_DENIED: User is not an owner of T...

Latest Reply
-werners-
Esteemed Contributor III
  • 1 kudos

I think the feedback button is the right place. At least I don't know of another way.

2 More Replies
PabloCSD
by Valued Contributor II
  • 7957 Views
  • 1 reply
  • 0 kudos

Resolved! [DATA_SOURCE_NOT_FOUND] Failed to find data source

Context: Hello, I was using a workflow for a periodic process. With my team we were using a Job Compute, but the libraries were not working (even though we had a PIP_EXTRA_INDEX_URL defined in the Environment Variables of the cluster), so we now use a ...

Latest Reply
PabloCSD
Valued Contributor II
  • 0 kudos

I installed this library in the cluster: spark_mssql_connector_2_12_1_4_0_BETA.jar. A colleague passed me this .jar file. It seems it can be obtained from here: https://github.com/microsoft/sql-spark-connector/releases. This allows the task to end succ...

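For reference, once that jar is attached to the cluster, the connector is typically used through the `com.microsoft.sqlserver.jdbc.spark` data source format. A rough sketch, not taken from the thread; the server, database, and table names below are placeholders:

```python
# Rough sketch of reading SQL Server through the sql-spark-connector
# once the jar is attached to the cluster; all names are placeholders.
def sqlserver_jdbc_url(server: str, database: str) -> str:
    """Build the JDBC URL the connector expects (default port 1433)."""
    return f"jdbc:sqlserver://{server}:1433;databaseName={database}"

# Inside a Databricks notebook (`spark` is provided by the runtime):
# df = (spark.read
#       .format("com.microsoft.sqlserver.jdbc.spark")
#       .option("url", sqlserver_jdbc_url("myserver.database.windows.net", "mydb"))
#       .option("dbtable", "dbo.my_table")
#       .option("user", "...")
#       .option("password", "...")
#       .load())
```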
punamrandive32
by New Contributor II
  • 1396 Views
  • 1 reply
  • 0 kudos

Exam for Databricks Certified Data Engineer Associate

My Databricks Professional Data Engineer certification exam got suspended. My exam went on for only half an hour; it was showing me an error for eye movement while I was reading a question. The exam was suspended on 11th of July 2024 and is still showing an in-progress assess...

Latest Reply
gchandra
Databricks Employee
  • 0 kudos

I'm sorry to hear your exam was suspended. Please file a ticket with our support team and allow them 24-48 hours for a resolution. You should also review this documentation: Room requirements, Behavioral considerations.

johnb1
by Contributor
  • 776 Views
  • 1 reply
  • 0 kudos

Access Git folder information from notebook

In my Workspace, I have a repository with a Git folder. I would like to access programmatically, with Python, from within a notebook: the name of the repo, and the currently checked out branch in the repo. I want to do this in two different ways: (1) Access said informa...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @johnb1, you can use one of the following options to achieve what you want: the Databricks CLI repos commands, the Databricks Python SDK, or the Databricks REST API.

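For example, the Python SDK option might look like the sketch below. It assumes the `databricks-sdk` package and ambient workspace authentication; the path-parsing helper is a hypothetical convenience for option (1), not an official API:

```python
# Sketch, not an official API: derive the repo name from the notebook's
# workspace path, and list branches via the Databricks Python SDK.
def repo_name_from_path(notebook_path: str) -> str:
    """Extract the repo name from a path like /Repos/<user>/<repo>/..."""
    parts = notebook_path.strip("/").split("/")
    if len(parts) >= 3 and parts[0] == "Repos":
        return parts[2]
    raise ValueError(f"not inside a /Repos Git folder: {notebook_path}")

def current_branches(path_prefix: str = "/Repos"):
    """List (path, branch) for Git folders under `path_prefix`."""
    from databricks.sdk import WorkspaceClient  # requires databricks-sdk
    w = WorkspaceClient()  # picks up ambient notebook/CLI authentication
    return [(r.path, r.branch) for r in w.repos.list(path_prefix=path_prefix)]
```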
camilo_s
by Contributor
  • 4255 Views
  • 6 replies
  • 3 kudos

Resolved! Hard reset programmatically

Is it possible to trigger a git reset --hard programmatically? I'm running a platform service where, as part of CI/CD, repos get deployed into the Databricks workspace. Normally, our developers work with upstream repos both from their local IDEs and fr...

Latest Reply
nicole_lu_PM
Databricks Employee
  • 3 kudos

Thank you for the feedback there! We recently added more docs for SP OAuth support for DevOps. SP OAuth support for GitHub is being discussed.

5 More Replies
jasont41
by New Contributor II
  • 1140 Views
  • 1 reply
  • 2 kudos

Resolved! Trouble with host url parameterization

I am attempting to parameterize a Databricks YAML so I can deploy it to multiple Databricks accounts via GitLab CICD, and have run into a snag when parameterizing the workspace host value. My variable block looks like this: variables:    databricks_ho...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Hi @jasont41, your assumption is correct. You can't use a variable for the host mapping. You can find information about it in the following documentation entry: https://docs.databricks.com/en/dev-tools/bundles/settings.html#other-workspace-mappings

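Since the host mapping can't take a variable, a common workaround is one bundle target per workspace, each with its host hardcoded. A hypothetical sketch (both host URLs are placeholders):

```yaml
# databricks.yml -- hypothetical sketch: one target per workspace,
# selected at deploy time instead of parameterizing the host.
bundle:
  name: my_bundle

targets:
  dev:
    workspace:
      host: https://adb-1111111111111111.1.azuredatabricks.net
  prod:
    workspace:
      host: https://adb-2222222222222222.2.azuredatabricks.net
```

The CI pipeline then picks the workspace with the target flag, e.g. `databricks bundle deploy -t dev`.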
PabloCSD
by Valued Contributor II
  • 4806 Views
  • 3 replies
  • 3 kudos

Resolved! Use a Service Principal Token instead of Personal Access Token for Databricks Asset Bundle

How can I connect using a Service Principal token? I did this, but it is not a PAT: databricks configure Databricks host: https:// ... Personal access token: ****  I also tried this, but it didn't work either: [profile] host = <workspace-url> client_id ...

Latest Reply
PabloCSD
Valued Contributor II
  • 3 kudos

Thanks Pedro, we did it. For anyone in the future (I added fake host and service principal IDs): 1. Modify your databricks.yml so it has the service principal ID and the Databricks host: bundle: name: my_workflow # Declare to Databricks Assets Bu...

2 More Replies
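A sketch of the shape that reply describes; the host and service principal ID below are fake placeholders, and the OAuth secret itself is supplied outside the bundle (for example via the DATABRICKS_CLIENT_ID / DATABRICKS_CLIENT_SECRET environment variables rather than a PAT):

```yaml
# databricks.yml -- sketch with a placeholder host and service principal ID
bundle:
  name: my_workflow

workspace:
  host: https://adb-1111111111111111.1.azuredatabricks.net

targets:
  prod:
    run_as:
      service_principal_name: 00000000-0000-0000-0000-000000000000
```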