Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

excavator-matt
by New Contributor III
  • 100 Views
  • 4 replies
  • 2 kudos

How do I use Databricks Lakeflow Declarative Pipelines on AWS DMS data?

Hi! I am trying to replicate an AWS RDS PostgreSQL database in Databricks. I have successfully managed to enable CDC using AWS DMS, which writes an initial load file and continuous CDC files in Parquet. I have been trying to follow the official guide Repl...

Data Engineering
AUTO CDC
AWS DMS
declarative pipelines
LakeFlow
Latest Reply
mmayorga
Databricks Employee
  • 2 kudos

Hi @excavator-matt, yes, you are correct: cloudFiles/Auto Loader handles idempotency at the file level. From the guide's perspective, the view is created from the source files in the specified location. This view captures all files and their corresp...
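The pattern the reply describes (an Auto Loader view over the DMS Parquet files feeding an AUTO CDC flow) can be sketched roughly as below. This is only a sketch: the S3 path, key, and timestamp columns are hypothetical, and DMS's default change-operation column is "Op" with I/U/D values.

```sql
-- Hypothetical source path and column names; adjust to your DMS task output.
CREATE OR REFRESH TEMPORARY STREAMING VIEW dms_changes AS
SELECT * FROM STREAM read_files(
  's3://my-bucket/dms/public/customers/',
  format => 'parquet'
);

CREATE OR REFRESH STREAMING TABLE customers;

-- AUTO CDC applies inserts/updates/deletes from the change feed to the target.
CREATE FLOW customers_cdc AS AUTO CDC INTO customers
FROM STREAM(dms_changes)
KEYS (id)
APPLY AS DELETE WHEN Op = 'D'
SEQUENCE BY ts
STORED AS SCD TYPE 1;
```

Treat the exact view/flow syntax as approximate and check it against the current Lakeflow Declarative Pipelines SQL reference for your runtime.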

3 More Replies
smoortema
by New Contributor III
  • 29 Views
  • 1 reply
  • 0 kudos

How to make a FOR loop, dynamic SQL, and variables work together

I am working on a testing notebook where the table to be tested can be given as a widget. I wanted to write it in SQL. The notebook runs the following steps in a loop that should execute 10 times: 1. Store the starting version of a Delta table in a var...

Latest Reply
mmayorga
Databricks Employee
  • 0 kudos

Hi @smoortema, thank you for reaching out! You are very close to getting “start_version”; you just need to include “INTO start_version” after the “EXECUTE IMMEDIATE” statement. Here is the updated code: BEGIN DECLARE sum INT DEFAULT 0; DECLARE start_ve...
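A minimal sketch of the pattern the reply points at, combining a loop with EXECUTE IMMEDIATE ... INTO. The :table_name widget parameter is hypothetical, and this assumes a runtime with SQL scripting support; the reply's actual full code is truncated above, so this is an illustration, not a reconstruction of it.

```sql
BEGIN
  DECLARE start_version BIGINT DEFAULT 0;
  DECLARE i INT DEFAULT 0;
  WHILE i < 10 DO
    -- Dynamic SQL: the single-row, single-column result lands in the variable.
    EXECUTE IMMEDIATE
      'SELECT max(version) FROM (DESCRIBE HISTORY ' || :table_name || ')'
      INTO start_version;
    -- ... run the test steps against the table here ...
    SET VAR i = i + 1;
  END WHILE;
END;
```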

IONA
by New Contributor III
  • 12 Views
  • 0 replies
  • 1 kudos

Dev/Pie/Prd and the same workspace

Hi all! I'm appealing to all you folks who are cleverer than I am for some advice on Databricks DevOps. I was asked by my team leader to expand our single environment into a DevOps-style dev/pie/prd system, potentially using DABs to promote code to higher e...

LonguiVic1
by New Contributor III
  • 24 Views
  • 1 reply
  • 1 kudos

Resolved! How to Find DBU Consumption and Cost for a Serverless Job?

Hello community, I'm new to using Serverless compute for my jobs and I need some help understanding how to monitor the costs. I have configured and run a job that executes a notebook using the "Serverless" compute option. The job completed successfully...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @LonguiVic1, you can use system tables to track serverless consumption. The article below even provides sample queries you can use. Also, note that there's a list_prices system table that includes list prices over time for each available SKU....
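A hedged sketch of the kind of query the reply describes, joining usage with list prices from the billing system tables. The job id is a hypothetical placeholder, and column names follow the documented system.billing schemas; verify them against your workspace before relying on the numbers.

```sql
SELECT
  u.usage_date,
  u.sku_name,
  SUM(u.usage_quantity)                     AS dbus,
  SUM(u.usage_quantity * p.pricing.default) AS est_list_cost
FROM system.billing.usage u
JOIN system.billing.list_prices p
  ON  u.sku_name = p.sku_name
  AND u.usage_start_time >= p.price_start_time
  AND (p.price_end_time IS NULL OR u.usage_start_time < p.price_end_time)
WHERE u.usage_metadata.job_id = '<your-job-id>'   -- hypothetical placeholder
  AND u.sku_name LIKE '%SERVERLESS%'
GROUP BY u.usage_date, u.sku_name
ORDER BY u.usage_date;
```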

SuMiT1
by New Contributor II
  • 38 Views
  • 3 replies
  • 0 kudos

Flattening JSON in Databricks

I have chatbot data. I read an ADLS JSON file in Databricks and stored the output in a DataFrame. In that table, two columns contain JSON data, but their data type is string: 1. content, 2. metadata. Now I have to flatten the data, but I don't know how to do tha...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @SuMiT1, here's one approach you can use: 1. First, let's create a DataFrame with sample data that matches your table: from pyspark.sql.functions import from_json, explode, col from pyspark.sql.types import * metadata_json = ''' {"BotId":"487d-b...
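The reply's PySpark code is truncated above, but the underlying idea (parse the JSON strings, then promote nested fields to top-level columns) can be shown in plain Python. The field names below are hypothetical, chosen only to mirror a content/metadata pair.

```python
import json

def flatten(obj, prefix=""):
    """Recursively flatten nested dicts into a single-level dict."""
    flat = {}
    for key, value in obj.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=f"{name}_"))
        else:
            flat[name] = value
    return flat

# One row with two string columns that actually hold JSON (hypothetical values).
row = {
    "content": '{"text": "hello", "lang": "en"}',
    "metadata": '{"BotId": "487d-b", "session": {"id": 7}}',
}

flat_row = {}
for col in ("content", "metadata"):
    flat_row.update(flatten(json.loads(row[col]), prefix=f"{col}_"))

print(flat_row)
# → {'content_text': 'hello', 'content_lang': 'en',
#    'metadata_BotId': '487d-b', 'metadata_session_id': 7}
```

In Spark the same effect comes from from_json with an explicit schema followed by selecting the nested fields as columns.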

2 More Replies
Raj_DB
by New Contributor III
  • 22 Views
  • 1 reply
  • 0 kudos

Streamlining Custom Job Notifications with a Centralized Email List

Hi everyone, I am working on setting up success/failure notifications for a large number of jobs in our Databricks environment. The manual process of configuring email notifications through the UI for each job individually is not scalable and is becoming ver...

Latest Reply
mmayorga
Databricks Employee
  • 0 kudos

Hi @Raj_DB, thank you for reaching out! You can achieve this easily by leveraging the Python SDK that is already installed on Databricks clusters, or by using the Jobs API. With the SDK, you can update each job and its corresponding “email_no...
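A sketch of the Jobs API side of this: build a partial-update payload per job that points its notifications at one central email list. The job IDs and address are hypothetical; the real call would POST each payload to /api/2.1/jobs/update (or go through the Databricks Python SDK's jobs update method).

```python
import json

CENTRAL_EMAILS = ["data-team@example.com"]  # hypothetical central list

def notification_payload(job_id, emails):
    """Partial-update payload setting success/failure notifications for one job."""
    return {
        "job_id": job_id,
        "new_settings": {
            "email_notifications": {
                "on_success": emails,
                "on_failure": emails,
            }
        },
    }

# Hypothetical job ids; in practice you would list jobs via the API/SDK first.
payloads = [notification_payload(job_id, CENTRAL_EMAILS) for job_id in (101, 102, 103)]
print(json.dumps(payloads[0], indent=2))
```

Because "update" takes partial settings, this leaves the rest of each job's configuration untouched.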

daan_dw
by New Contributor III
  • 24 Views
  • 1 reply
  • 0 kudos

Databricks asset bundles in Python: referencing variables

Hey, I am using DABs, and in my .yml files I can reference variables set in my databricks.yml like this: git_branch: ${var.branch}. I would like to do the same thing in my DABs written in Python, but I cannot find any documentation on how to do this....

Latest Reply
SP_6721
Honored Contributor
  • 0 kudos

Hi @daan_dw, to reference variables defined in your databricks.yml in Python DAB code, define your variables class and use bundle.resolve_variable: https://docs.databricks.com/aws/en/dev-tools/bundles/python/#access-bundle-variables

seefoods
by Valued Contributor
  • 13 Views
  • 0 replies
  • 0 kudos

DQX - datacontract cli

Hello guys, can someone tell me whether I can combine DQX Databricks rule checks with the Data Contract CLI? If yes, can you share your ideas? https://gpt.datacontract.com/sources/cli.datacontract.com/ Cordially,

nulltype
by New Contributor
  • 55 Views
  • 1 reply
  • 1 kudos

Online Table Migration

I am currently trying to migrate our Online Tables to synced tables with Online Feature Store since Online Tables is deprecated. When creating a new table, it worked just fine and how the docs said it would (https://docs.databricks.com/aws/en/machine...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

Migrating from deprecated Online Tables to synced tables with the Databricks Online Feature Store can be tricky due to several points of integration and timing between Unity Catalog (UC), Feature Store metadata, and the underlying online store. The m...

KristiLogos
by Contributor
  • 2209 Views
  • 1 reply
  • 0 kudos

Spark JDBC Netsuite error - SQLSyntaxErrorException: [NetSuite][OpenAccess SDK JDBC Driver][OpenAcc

I'm trying to query the Customer NetSuite tables with Spark JDBC. I've added the .jar file to the cluster and am trying to run the below: jdbc_url = "jdbc:ns://xxxx.connect.api.netsuite.com:1708;ServerDataSource=NetSuite2.com;Encrypted=1;NegotiateS...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

If you are on a Spark version that supports .option("query", ...), you can do: df = spark.read \ .format("jdbc") \ .option("url", jdbc_url) \ .option("query", "SELECT TOP 10 * FROM Customer") \ .option("user", "xxxx") \ ...

iskidet
by New Contributor
  • 113 Views
  • 2 replies
  • 1 kudos

Declarative Pipeline Failure for Autoloader

Hello folks, after moving my working serverless Auto Loader notebook to a declarative (DLT) pipeline, I'm getting an AccessDenied error. What could be causing this? Here are the DLT JSON and the error message in the DLT. I googled around and saw some hint...

Latest Reply
Advika_
Databricks Employee
  • 1 kudos

Hello @iskidet! Were you able to resolve the AccessDenied issue? If the above suggestion helped, or if you found another solution, it would be great if you could mark it as the accepted solution or share your approach with the community.

1 More Replies
timo82
by New Contributor
  • 151 Views
  • 7 replies
  • 4 kudos

Resolved! [CANNOT_OPEN_SOCKET] Can not open socket: ["tried to connect to ('127.0.0.1', 45287)

Hello, after Databricks updated the Runtime from release 15.4.24 to release 15.4.25, we are getting the following error in all jobs: [CANNOT_OPEN_SOCKET] Can not open socket: ["tried to connect to ('127.0.0.1', 45287). What can we do here? Greetings

Latest Reply
Vasireddy
Contributor II
  • 4 kudos

Hi @Hansjoerg, apologies for the confusion earlier. You are right: Bundles don't allow pinning to specific patch versions like 15.4.24. Your best option is to skip Bundles for now and use the regular Databricks Jobs setup (via the UI or Jobs API), where yo...

6 More Replies
AkhileshVB
by New Contributor
  • 1963 Views
  • 1 reply
  • 0 kudos

Syncing lakebase table to delta table

I have been exploring Lakebase and I wanted to know if there is a way to sync CDC data from Lakebase tables to Delta tables in the Lakehouse. I know the other way is possible, and that's what was shown in the demo. Can you tell me how I can sync both the ta...

Latest Reply
sarahbhord
Databricks Employee
  • 0 kudos

Hey AkhileshVB! Lakebase-to-Delta CDC sync is in Private Preview; GA/Preview dates are not firm yet. Do you have a Databricks contact or account manager? They are the right place to go if you want early involvement. Workarounds & DIY approaches: Fo...

SuMiT1
by New Contributor II
  • 75 Views
  • 5 replies
  • 2 kudos

Read files from ADLS in Databricks

I have a Unity Catalog access connector, but it's not enabled. I only have admin access, so I don't have access to the admin portal to enable it, as that needs global admin permissions. I am trying to read ADLS JSON data in Databricks by using a service pr...

Latest Reply
saurabh18cs
Honored Contributor II
  • 2 kudos

Hi @SuMiT1, once the networking issue is resolved, also make sure your service principal has at least Storage Blob Data Reader on the storage account/container.
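For the service-principal route the thread discusses, these are the ABFS OAuth Spark configs involved. The storage account, tenant id, and credential values below are hypothetical placeholders; in a real notebook the secret should come from dbutils.secrets.get rather than being hard-coded.

```python
storage_account = "mystorageacct"  # hypothetical
tenant_id = "00000000-0000-0000-0000-000000000000"  # hypothetical
suffix = f"{storage_account}.dfs.core.windows.net"

configs = {
    f"fs.azure.account.auth.type.{suffix}": "OAuth",
    f"fs.azure.account.oauth.provider.type.{suffix}":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    f"fs.azure.account.oauth2.client.id.{suffix}": "<application-client-id>",
    f"fs.azure.account.oauth2.client.secret.{suffix}": "<client-secret>",
    f"fs.azure.account.oauth2.client.endpoint.{suffix}":
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
}

# In Databricks you would apply each pair with spark.conf.set(key, value), then
# read: spark.read.json(f"abfss://<container>@{suffix}/path/to/file.json")
print(len(configs))
```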

4 More Replies
