Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

manuel-barreiro
by New Contributor II
  • 1760 Views
  • 5 replies
  • 0 kudos

Unable to view hive_metastore schemas although I have the same permissions as co-workers who can

Hello! I'm having trouble accessing the schemas in hive_metastore. I have the same permission level as coworkers who don't have any trouble viewing the schemas. I would really appreciate it if you could help me with this beca...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Were you able to get this issue resolved after reviewing the permission levels on your schema and tables?

4 More Replies
yevsh
by New Contributor II
  • 2404 Views
  • 4 replies
  • 0 kudos

UDF java can't access files in Unity Catalog - Operation not permitted

I am using Databricks on Azure. In PySpark I register a Java UDF: spark.udf.registerJavaFunction("foo", "com.foo.Foo", T.StringType()). Foo tries to load a file, using Files.readAllLines(), located in the Databricks Unity Catalog. stderr log: Tue J...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

To address the issue of needing to run initialization code that reads file content during the load of a UDF (User Defined Function) in Databricks, you should avoid performing file operations in the constructor due to security restrictions. Instead, y...
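The reply above is truncated; as an illustration of the lazy-initialization idea it describes, here is a Python sketch (the class name, file format, and lookup logic are hypothetical stand-ins for the Java Foo):

```python
from pathlib import Path

class LookupUdf:
    """Sketch: defer all file I/O until the first invocation, not the constructor."""

    def __init__(self, path):
        # No file access here: constructors may run in a restricted context
        # where reads from governed locations are not yet permitted.
        self._path = path
        self._lines = None

    def _ensure_loaded(self):
        # Lazily read the file once, on first use.
        if self._lines is None:
            self._lines = Path(self._path).read_text().splitlines()
        return self._lines

    def __call__(self, key):
        return "found" if key in self._ensure_loaded() else "missing"
```

The same shape applies to the Java UDF: keep the constructor free of Files.readAllLines() and guard the read behind a null check in the evaluation path.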

3 More Replies
jeremy98
by Honored Contributor
  • 5484 Views
  • 7 replies
  • 6 kudos

Migrating logic from Airflow DAGs to Databricks Workflow

Hello community, I'm planning to migrate some logic from Airflow DAGs to Databricks Workflows, but I have some doubts about how to map the logic of my current DAG code onto Workflows. There are two ...

Latest Reply
Walter_C
Databricks Employee
  • 6 kudos

You can use Databricks Asset Bundles: https://docs.databricks.com/en/dev-tools/bundles/index.html

6 More Replies
Paul92S
by New Contributor III
  • 3720 Views
  • 12 replies
  • 5 kudos

Delta sharing service Issue making requests to Unity System Access tables

Hi all, we have been having an issue as of yesterday which I believe is related to queries against system.access.table_lineage in Unity Catalog. The issue still persists today. We get the following error: AnalysisException: [RequestId= ErrorClass=B...

Latest Reply
Alberto_Umana
Databricks Employee
  • 5 kudos

Thanks team, please let me know if you need any other help!

11 More Replies
jar
by Contributor
  • 1845 Views
  • 8 replies
  • 1 kudos

Databricks single user compute cannot write to storage

I've deployed unrestricted single-user compute for each developer in our dev workspace and everything works fine except writing to storage, where the cell runs continuously but seemingly doesn't execute anything. If I switch to an unrestricted sha...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

Adding to @saurabh18cs comments, also check whether any instance profile is attached to the cluster. What is the difference between the clusters, only the access mode?

7 More Replies
Anirudh077
by New Contributor III
  • 1471 Views
  • 1 reply
  • 0 kudos

Resolved! Cannot create serverless sql warehouse, only classic and pro option available

Hey team, I am using Databricks on Azure (East US region) and I have enabled serverless compute in Settings -> Feature Enablement. When I click to create a SQL warehouse, I do not see the serverless option. Is there any setting I am missing?

Latest Reply
Anirudh077
New Contributor III
  • 0 kudos

I found the root cause of this issue. In Security and Compliance we had PCI-DSS selected, and according to the doc we cannot have that; instead we can select HIPAA.

eballinger
by Contributor
  • 2859 Views
  • 4 replies
  • 2 kudos

Resolved! DLT notebook dynamic declaration

Hi guys, we have a DLT pipeline that reads data from landing to raw (CSV files into tables) for approximately 80 tables. In our first attempt at this we declared each table separately in a Python notebook, one @dlt.table declared per cell. Then w...
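For the dynamic version of this, a loop over a table list is the usual shape; the classic pitfall is Python's late binding of loop variables. A minimal stdlib sketch (the table names are made up, and the `table` registry stands in for the real `@dlt.table` decorator so the fix is visible):

```python
TABLES = ["customers", "orders", "products"]
registry = {}

def table(name):
    """Stand-in for dlt.table: registers the decorated function under `name`."""
    def deco(fn):
        registry[name] = fn
        return fn
    return deco

def declare_all():
    # One declaration per entry, generated in a loop instead of one cell each.
    for tbl in TABLES:
        @table(name=tbl)
        def load(tbl=tbl):  # default argument pins the loop variable (late-binding fix)
            return f"SELECT * FROM landing.{tbl}"

declare_all()
```

Without the `tbl=tbl` default, every generated function would see only the last value of the loop variable.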

Latest Reply
VZLA
Databricks Employee
  • 2 kudos

Good catch and glad to hear you've identified the source of delay!

3 More Replies
Dean_Lovelace
by New Contributor III
  • 31596 Views
  • 13 replies
  • 2 kudos

How can I deploy workflow jobs to another databricks workspace?

I have created a number of workflows in the Databricks UI. I now need to deploy them to a different workspace. How can I do that? Code can be deployed via Git, but the job definitions are stored in the workspace only.

Latest Reply
Walter_C
Databricks Employee
  • 2 kudos

@itacdonev great option provided. @Dean_Lovelace, you can also select the View JSON option on the workflow and switch to the Create tab; with this JSON you can use the API https://docs.databricks.com/api/workspace/jobs/create and create the job in th...
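As a sketch of that flow, assuming a hypothetical workspace host and token, the exported job JSON can be posted to the Jobs 2.1 create endpoint with only the standard library:

```python
import json
import urllib.request

def build_create_request(host: str, token: str, job_json: str) -> urllib.request.Request:
    """Wrap an exported job definition (from 'View JSON' -> 'Create') in a
    POST to the Jobs API create endpoint of the target workspace."""
    body = json.loads(job_json)  # fail fast if the export doesn't parse
    return urllib.request.Request(
        url=f"{host}/api/2.1/jobs/create",
        data=json.dumps(body).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Sending it (placeholder values, not run here):
# req = build_create_request("https://<workspace-host>", "<token>", exported_json)
# urllib.request.urlopen(req)
```

The host and token here are placeholders for the target workspace; the response of a successful create contains the new job's ID.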

12 More Replies
RoelofvS
by New Contributor III
  • 1373 Views
  • 2 replies
  • 2 kudos

Resolved! "delta-lake" demo fails with"config" notebook not found when running "00-setup"

Hi all, the delta-lake demo used to run fine for us around October 2024. Reinstalling it now, it fails on initialisation. Using runtime version 15.4 on a trial Databricks installation, and executing dbdemos.install('delta-lake', overwrite=True, use_cur...

Latest Reply
brockb
Databricks Employee
  • 2 kudos

Hi @RoelofvS , I was able to replicate the issue and reached out to the team that maintains dbdemos, they will get this addressed. Until that is addressed, you can try manually creating that `config` notebook as follows: #Note: we do not recommend to...

1 More Replies
David_Billa
by New Contributor III
  • 1817 Views
  • 2 replies
  • 1 kudos

Resolved! Explode function to flatten the JSON

I have the DDL below. CREATE OR REPLACE TABLE test (prices ARRAY<STRUCT<Ord:STRING, Vndr:STRING, Prc:STRING>>) USING DELTA LOCATION "path". Now I want to flatten the JSON, and I've tried as below but it's throwing an error, "[UNRESOLVED.COLUMN.WITH_...

Latest Reply
hari-prasad
Valued Contributor II
  • 1 kudos

Hi @David_Billa, you can use the from_json function in Spark, which can convert a struct into individual columns. Refer to https://spark.apache.org/docs/3.4.0/api/python/reference/pyspark.sql/api/pyspark.sql.functions.from_json.html. Also, yo...
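For this DDL the flattening itself is typically done with explode, e.g. `SELECT p.Ord, p.Vndr, p.Prc FROM test LATERAL VIEW explode(prices) t AS p` in Spark SQL. As a plain-Python illustration of what explode does to an ARRAY<STRUCT<...>> column (the sample rows are made up):

```python
# Rows shaped like the table's prices ARRAY<STRUCT<Ord, Vndr, Prc>> column.
rows = [
    {"prices": [{"Ord": "1", "Vndr": "A", "Prc": "9.99"},
                {"Ord": "2", "Vndr": "B", "Prc": "4.50"}]},
    {"prices": []},  # empty arrays produce no output rows
]

def explode_prices(rows):
    """Mimic LATERAL VIEW explode: emit one flat row per array element."""
    for row in rows:
        for p in row["prices"]:
            yield (p["Ord"], p["Vndr"], p["Prc"])

flat = list(explode_prices(rows))
```

Each struct in the array becomes its own row, with the struct fields promoted to columns, which is the flattened shape the question is after.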

1 More Replies
impresent
by Databricks Partner
  • 642 Views
  • 1 reply
  • 0 kudos

Managed table storage not accessible in cloud storage

Hi all, I have created a new catalog in Unity Catalog which has a cloud location for managed tables, but when accessing the location in the Azure portal it denies me access to the files. I want to see all my data files (Parquet/JSON) in the storage acco...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Do you have full permissions on this storage location? Also please remember the best practices for Managed tables: You should not use tools outside of Databricks to manipulate files in managed tables directly. You should only interact with data files...

UM1
by New Contributor
  • 6290 Views
  • 4 replies
  • 1 kudos

drop or alter primary key constraint for a streaming table in delta live tables

I have a DLT streaming table with a primary key constraint defined through the schema definitions. I have redeployed the same DLT pipeline with the same target. When attempting to run the pipeline, I get the error, ErrorClass=RESOURCE_ALREADY_EXISTS...

Latest Reply
Sidhant07
Databricks Employee
  • 1 kudos

The `RESOURCE_ALREADY_EXISTS` error you're encountering suggests that the primary key constraint `pk_constraint_name` already exists in the target delta table. This constraint may have been created during a previous deployment of the DLT pipeline or ...

3 More Replies
manideep04d
by New Contributor
  • 4956 Views
  • 2 replies
  • 0 kudos

Creating Linked Service in ADF to link Databricks Community Edition

Using the current Community Edition I was unable to create a linked service because I am unable to generate/copy a personal access token. I tried installing ODBC but was unable to set it up properly. Any suggestions/help would be highly appreciated.

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

In Community Edition workspaces PAT tokens are not allowed; instead you might need to generate an OAuth token to authenticate: https://docs.databricks.com/en/dev-tools/auth/oauth-u2m.html. You can refer to the ODBC guide on page 19...

1 More Replies