Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Anirudh077
by New Contributor III
  • 563 Views
  • 1 reply
  • 0 kudos

Resolved! Cannot create serverless sql warehouse, only classic and pro option available

Hey team, I am using Databricks on Azure (East US region) and I have enabled serverless compute in Settings -> Feature Enablement. When I click to create a SQL warehouse, I do not see the serverless option. Is there any setting I am missing?

Latest Reply
Anirudh077
New Contributor III
  • 0 kudos

I found the root cause for this issue: in Security and Compliance we had PCI-DSS selected, and according to the documentation that is not compatible with serverless; we can select HIPAA instead.

eballinger
by Contributor
  • 1203 Views
  • 4 replies
  • 2 kudos

Resolved! DLT notebook dynamic declaration

Hi guys, we have a DLT pipeline that reads data from landing to raw (CSV files into tables) for approximately 80 tables. In our first attempt at this we declared each table separately in a Python notebook, one @dlt table declared per cell. Then w...

Latest Reply
VZLA
Databricks Employee
  • 2 kudos

Good catch and glad to hear you've identified the source of delay!

3 More Replies
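The loop-based declaration the thread is after hinges on binding the table name once per iteration. A minimal sketch of that pattern (table names are hypothetical; on Databricks the inner function would carry a @dlt.table decorator and do a real streaming read, shown here only in comments so the snippet runs anywhere):

```python
# Sketch of declaring many tables from one loop instead of one cell per table.
# On Databricks you would decorate `load` with @dlt.table(name=f"raw_{name}")
# and return spark.readStream...load(path) instead of the stand-in below.

def make_loader(name):
    # Bind `name` per call: without this factory, every generated function
    # would close over the same loop variable and read only the last table.
    def load():
        return f"/landing/{name}"  # stand-in for the streaming read
    load.__name__ = f"load_{name}"
    return load

loaders = {t: make_loader(t) for t in ["customers", "orders", "payments"]}
print(loaders["orders"]())  # -> /landing/orders
```

The factory function is the important part; a bare `for` loop that defines the decorated function inline would register 80 tables that all read the same source.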
om_bk_00
by New Contributor III
  • 828 Views
  • 4 replies
  • 1 kudos

Resolved! passing job parameters through the terminal to a job

I am having trouble overriding the job parameters that are deployed in my local workspace. E.g. I have a job that fills tables with data; the parameters given to it are random, and I would like to override them when I run it through my terminal: databricks b...

Latest Reply
VZLA
Databricks Employee
  • 1 kudos

Excellent! Thanks for sharing your working version and solution!

3 More Replies
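One route for run-time overrides like this is the Jobs API `run-now` call, whose `job_parameters` field replaces the deployed defaults for that single run. A sketch of building the payload (the job ID and parameter names here are invented for illustration):

```python
import json

# Build a Jobs API 2.1 run-now payload. `job_parameters` overrides the
# parameter defaults deployed with the job, for this run only.
def run_now_payload(job_id, overrides):
    return {"job_id": job_id, "job_parameters": overrides}

payload = json.dumps(run_now_payload(123, {"table": "orders", "mode": "full"}))
print(payload)
```

The same payload can be POSTed to `/api/2.1/jobs/run-now` with any HTTP client, or passed through the Databricks CLI/SDK equivalents.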
shubham_007
by Contributor III
  • 1553 Views
  • 5 replies
  • 2 kudos

Resolved! What are powerful data quality tools/libraries for building a data quality framework in Databricks?

Dear Community Experts, I need your advice and suggestions on developing a data quality framework. Which data quality tools or libraries are a good choice for building a data quality framework in Databricks? Please guide. R...

Latest Reply
Takuya-Omi
Valued Contributor III
  • 2 kudos

In reply to "Any short guidance on how to implement a data quality framework in Databricks?": With dbdemos, you can learn a practical architecture for data quality testing using the expectations feature of DLT. I hope this helps! (Please note that some DLT syntax mig...

4 More Replies
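The DLT expectations feature recommended above boils down to named rules that keep or drop rows. A plain-Python illustration of that keep/drop semantics (rule names and fields are invented; in a real pipeline you would express each rule with a decorator such as @dlt.expect_or_drop):

```python
# A rule set in the spirit of DLT expectations: each named rule is a predicate.
RULES = {
    "price_not_null": lambda r: r.get("price") is not None,
    "qty_positive": lambda r: r.get("qty", 0) > 0,
}

def apply_expectations(rows, rules):
    # Keep a row only if every rule passes; collect the rest for inspection,
    # mirroring the expect_or_drop-plus-quarantine pattern.
    kept, dropped = [], []
    for row in rows:
        (kept if all(rule(row) for rule in rules.values()) else dropped).append(row)
    return kept, dropped

kept, dropped = apply_expectations(
    [{"price": 9.5, "qty": 2}, {"price": None, "qty": 1}], RULES)
```

Keeping the dropped rows around, rather than silently discarding them, is what makes the rule set auditable.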
Dean_Lovelace
by New Contributor III
  • 24817 Views
  • 13 replies
  • 2 kudos

How can I deploy workflow jobs to another databricks workspace?

I have created a number of workflows in the Databricks UI. I now need to deploy them to a different workspace. How can I do that? Code can be deployed via Git, but the job definitions are stored in the workspace only.

Latest Reply
Walter_C
Databricks Employee
  • 2 kudos

@itacdonev great option provided. @Dean_Lovelace, you can also select the View JSON option on the workflow and switch to the Create view; with this code you can use the API https://docs.databricks.com/api/workspace/jobs/create and create the job in th...

12 More Replies
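When moving a job between workspaces via the API, the step that trips people up is the payload shape: `jobs/get` in the source workspace wraps the definition in a `settings` object alongside run metadata, while `jobs/create` in the target workspace expects the settings fields themselves. A sketch of that reshape (the sample response is abbreviated and hypothetical):

```python
# jobs/get returns {"job_id": ..., "created_time": ..., "settings": {...}};
# jobs/create expects only the settings object, so strip the wrapper.
def to_create_payload(get_response):
    return dict(get_response["settings"])

exported = {
    "job_id": 42,                 # workspace-specific, must not be reposted
    "created_time": 1700000000,   # run metadata, also dropped
    "settings": {"name": "nightly_etl", "tasks": []},
}
payload = to_create_payload(exported)
```

POSTing `payload` to `/api/2.1/jobs/create` in the target workspace then recreates the job with a new `job_id`.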
RoelofvS
by New Contributor III
  • 616 Views
  • 2 replies
  • 2 kudos

Resolved! "delta-lake" demo fails with "config" notebook not found when running "00-setup"

Hi all, the delta-lake demo used to run fine for us around October 2024. Reinstalling it now, it fails on initialisation. Using runtime version 15.4 on a trial Databricks installation, and executing dbdemos.install('delta-lake', overwrite=True, use_cur...

Latest Reply
brockb
Databricks Employee
  • 2 kudos

Hi @RoelofvS, I was able to replicate the issue and reached out to the team that maintains dbdemos; they will get this addressed. Until then, you can try manually creating that `config` notebook as follows: #Note: we do not recommend to...

1 More Replies
David_Billa
by New Contributor III
  • 666 Views
  • 2 replies
  • 1 kudos

Resolved! Explode function to flatten the JSON

I've the DDL as below: CREATE OR REPLACE TABLE test ( prices ARRAY<STRUCT<Ord:STRING,Vndr:STRING,Prc:STRING>> ) USING delta LOCATION "path". Now I want to flatten the JSON and I've tried as below, but it's throwing an error: "[UNRESOLVED.COLUMN.WITH_...

Latest Reply
hari-prasad
Valued Contributor II
  • 1 kudos

Hi @David_Billa, you can use the from_json function in Spark, which can convert a struct into individual columns. Refer to this link: https://spark.apache.org/docs/3.4.0/api/python/reference/pyspark.sql/api/pyspark.sql.functions.from_json.html. Also, yo...

1 More Replies
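What explode does to that `prices` array can be pictured without a cluster: each struct element of the array becomes its own row, and the struct fields then become plain columns. A plain-Python equivalent (sample values invented; the Spark SQL form in the comment is assumed from the LATERAL VIEW syntax in the Spark docs):

```python
# Plain-Python picture of flattening the DDL's prices column. In Spark SQL
# this is roughly:
#   SELECT p.Ord, p.Vndr, p.Prc FROM test LATERAL VIEW explode(prices) t AS p
rows = [
    {"prices": [{"Ord": "1", "Vndr": "A", "Prc": "9.99"},
                {"Ord": "2", "Vndr": "B", "Prc": "4.50"}]},
]
# One output row per array element; struct fields surface as columns.
flat = [dict(p) for row in rows for p in row["prices"]]
```

The unresolved-column error in the question usually means the struct fields were referenced before the array was exploded, i.e. `prices.Ord` instead of exploding first and selecting `p.Ord`.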
impresent
by New Contributor
  • 263 Views
  • 1 reply
  • 0 kudos

Managed table storage not accessible in cloud storage

Hi all, I have created a new catalog in Unity Catalog which has a cloud location for managed tables, but when accessing the location in the Azure portal I am denied access to the files. I want to see all my data files (Parquet/JSON) in the storage acco...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Do you have full permissions on this storage location? Also, please remember the best practices for managed tables: you should not use tools outside of Databricks to manipulate files in managed tables directly. You should only interact with data files...

UM1
by New Contributor
  • 2092 Views
  • 4 replies
  • 1 kudos

drop or alter primary key constraint for a streaming table in delta live tables

I have a DLT streaming table with a primary key constraint defined through the schema definitions. I have redeployed the same DLT pipeline with the same target. When attempting to run the pipeline, I get the error: ErrorClass=RESOURCE_ALREADY_EXISTS...

Latest Reply
Sidhant07
Databricks Employee
  • 1 kudos

The `RESOURCE_ALREADY_EXISTS` error you're encountering suggests that the primary key constraint `pk_constraint_name` already exists in the target delta table. This constraint may have been created during a previous deployment of the DLT pipeline or ...

3 More Replies
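If the leftover constraint from the earlier deployment is the culprit, dropping it before the pipeline recreates it is one way out. The statements below are Databricks SQL held in Python strings; the catalog/schema/table, key column, and constraint name are hypothetical stand-ins patterned on the error message:

```python
# Hypothetical names; run these via spark.sql(...) or a SQL editor.
DROP_PK = ("ALTER TABLE main.bronze.events "
           "DROP CONSTRAINT IF EXISTS pk_constraint_name")
ADD_PK = ("ALTER TABLE main.bronze.events "
          "ADD CONSTRAINT pk_constraint_name PRIMARY KEY (event_id)")
```

`IF EXISTS` makes the drop safe to rerun; the pipeline's schema definition can then re-add the constraint on the next update without colliding with the stale one.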
manideep04d
by New Contributor
  • 1531 Views
  • 2 replies
  • 0 kudos

Creating Linked Service in ADF to link Databricks Community Edition

Using the current Community Edition, I was unable to create a linked service because I could not generate/copy a personal access token. I tried installing ODBC but was unable to set it up properly. Any suggestions/help would be highly appreciated.

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

In Community Edition workspaces the PAT token is not allowed; instead you might need to generate an OAuth token to be able to authenticate: https://docs.databricks.com/en/dev-tools/auth/oauth-u2m.html. You can refer to the ODBC guide on page 19...

1 More Replies
Rjdudley
by Honored Contributor
  • 878 Views
  • 3 replies
  • 0 kudos

Resolved! Deploying Data Source API code

This might be a stupid question but there's just no mention of what to do here.  I'm looking at the blog (https://www.databricks.com/blog/simplify-data-ingestion-new-python-data-source-api) and documentation (https://learn.microsoft.com/en-us/azure/d...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

You're very welcome!

2 More Replies
Rajeshwar_Reddy
by New Contributor II
  • 1057 Views
  • 2 replies
  • 0 kudos

Resolved! ODBC connection issue Simba 64 bit driver

Hello all, I am getting the below error when trying to create an ODBC DSN (Simba 64-bit) on my local system to connect to the Databricks server using a token, with SSL (system trust store) and Thrift Transport: HTTP enabled. Thanks. [Simba][ThriftExtension] (14) Unexpected resp...

Latest Reply
Rajeshwar_Reddy
New Contributor II
  • 0 kudos

Yes, it's the same question.

1 More Replies
Phuonganh
by New Contributor II
  • 2117 Views
  • 3 replies
  • 4 kudos

Databricks SDK for Python: Errors with parameters for Statement Execution

Hi team, I'm using the Databricks SDK for Python to run SQL queries. I created a variable as below: param = [{'name': 'a', 'value': 'x'}, {'name': 'b', 'value': 'y'}] and passed it to the statement as below: _ = w.statement_execution.execute_statement( warehous...

Latest Reply
vfrcode
New Contributor II
  • 4 kudos

The following works: response = w.statement_execution.execute_statement( statement='ALTER TABLE users ALTER COLUMN :col_name SET NOT NULL', warehouse_id='<warehouseID>', parameters=[sql.StatementParameterListItem(name='col_name', value='u...

2 More Replies
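The `parameters` argument trips people up because each entry needs both `name` and `value` set, and values are passed as strings. A small helper that builds the list as plain dicts, which is the wire shape the SDK's `StatementParameterListItem` objects serialize to (parameter names here are invented):

```python
# Build the `parameters` list for the Statement Execution API as plain dicts.
# SDK equivalent: [sql.StatementParameterListItem(name=k, value=v), ...]
def sql_params(**named):
    return [{"name": k, "value": str(v)} for k, v in named.items()]

params = sql_params(a="x", b="y")
```

Each `:name` marker in the statement must have a matching entry in this list, which is why the missing quote in the question's `'value' :x'` literal broke the call before it ever reached the API.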
