Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Data + AI Summit 2024 - Data Engineering & Streaming

Forum Posts

sriradh
by New Contributor
  • 1375 Views
  • 0 replies
  • 0 kudos

ACID properties in delta?

How are locks maintained within a Delta Lake? For instance, let's say there are two simple tables, customer_details and orders. Let's say I am running a job that will insert an order into the orders table for, say, $100 for a specific customerId, it ...

Data Engineering
acid
delta
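
A minimal sketch of the concurrency model behind this question: Delta Lake does not take row or table locks; each table gets its ACID guarantees through optimistic concurrency control on its transaction log, and a transaction never spans two tables such as customer_details and orders. The orders schema below and the notebook's spark session are assumptions for illustration.

    from concurrent.futures import ThreadPoolExecutor

    def insert_order(customer_id, amount):
        # Each INSERT is its own single-table transaction; concurrent commits are
        # serialized against orders/_delta_log and retried on conflict.
        # Assumes an orders table with exactly (customerId, amount) columns.
        spark.sql(f"INSERT INTO orders VALUES ({customer_id}, {amount})")

    # Two writers committing at the same time, as in the scenario above.
    with ThreadPoolExecutor(max_workers=2) as pool:
        list(pool.map(lambda args: insert_order(*args), [(1, 100.0), (2, 250.0)]))

    # Both commits show up as separate table versions rather than blocking each other.
    spark.sql("DESCRIBE HISTORY orders").show(truncate=False)
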
AB_MN
by New Contributor III
  • 5489 Views
  • 4 replies
  • 1 kudos

Resolved! Read data from Azure SQL DB

I am trying to read data into a dataframe from Azure SQL DB using JDBC. Here is the code I am using: driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"  database_host = "server.database.windows.net" database_port = "1433" database_name = "dat...

Latest Reply
AB_MN
New Contributor III
  • 1 kudos

That did the trick. Thank you!

3 More Replies
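
For readers landing on this thread, a minimal sketch of the JDBC read the question describes; the host, database, table, and secret-scope names are placeholders, and spark/dbutils are the usual Databricks notebook globals.

    driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    database_host = "server.database.windows.net"   # placeholder
    database_port = "1433"
    database_name = "database"                      # placeholder
    table = "dbo.mytable"                           # placeholder

    # Pull credentials from a secret scope rather than hard-coding them.
    user = dbutils.secrets.get("my-scope", "sql-user")
    password = dbutils.secrets.get("my-scope", "sql-password")

    url = f"jdbc:sqlserver://{database_host}:{database_port};databaseName={database_name}"

    df = (
        spark.read.format("jdbc")
        .option("driver", driver)
        .option("url", url)
        .option("dbtable", table)
        .option("user", user)
        .option("password", password)
        .load()
    )
    df.show(5)
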
Hubert-Dudek
by Esteemed Contributor III
  • 1106 Views
  • 1 reply
  • 1 kudos

Foreign catalogs

With the introduction of the Unity Catalog in Databricks, many of us have become familiar with creating catalogs. However, did you know that the Unity Catalog also allows you to create foreign catalogs? You can register databases from the following s...

[Attachment: db.png]
Latest Reply
jose_gonzalez
Databricks Employee
  • 1 kudos

Thank you for sharing @Hubert-Dudek !!!

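
A minimal sketch of what the post describes, as I understand Lakehouse Federation syntax: create a connection to an external database, then expose it as a foreign catalog. The PostgreSQL host, credentials, and catalog/schema/table names are placeholders.

    # 1) A connection stores the endpoint and credentials (use a secret scope in practice).
    spark.sql("""
      CREATE CONNECTION IF NOT EXISTS pg_conn TYPE postgresql
      OPTIONS (host 'pg.example.com', port '5432', user '<user>', password '<password>')
    """)

    # 2) A foreign catalog mirrors one database of that connection inside Unity Catalog.
    spark.sql("""
      CREATE FOREIGN CATALOG IF NOT EXISTS pg_catalog
      USING CONNECTION pg_conn
      OPTIONS (database 'sales')
    """)

    # 3) Query it with three-level namespaces like any other catalog.
    spark.sql("SELECT * FROM pg_catalog.public.customers LIMIT 10").show()
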
Hubert-Dudek
by Esteemed Contributor III
  • 1002 Views
  • 1 reply
  • 3 kudos

row-level concurrency

With the introduction of Databricks Runtime 14, you can now enable row-level concurrency using these simple techniques!

[Attachment: row-level.png]
Latest Reply
jose_gonzalez
Databricks Employee
  • 3 kudos

Thank you for sharing this @Hubert-Dudek 

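
A minimal sketch of the idea, assuming Databricks Runtime 14+ and a Delta table named events: as I understand it, row-level concurrency applies to tables with deletion vectors enabled, so concurrent updates to different rows stop failing with write conflicts.

    # Enable deletion vectors on the table (a one-time property change).
    spark.sql("""
      ALTER TABLE events
      SET TBLPROPERTIES ('delta.enableDeletionVectors' = true)
    """)

    # After that, two jobs touching disjoint rows can commit concurrently, e.g.:
    spark.sql("UPDATE events SET status = 'done' WHERE id = 1")
    spark.sql("UPDATE events SET status = 'done' WHERE id = 2")
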
mike_engineer
by New Contributor
  • 951 Views
  • 0 replies
  • 0 kudos

Window functions in Change Data Feed

Hello! I am currently exploring the possibility of implementing incremental changes in our company's ETL pipeline and looking into the Change Data Feed option. There are a couple of challenges I'm uncertain about. For instance, we have a piece of logic lik...

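
A minimal sketch of the Change Data Feed starting point, with source_table and the starting version as placeholders: enable the feed on the source, then read only the changed rows, which is what any incremental window-function logic would be layered on top of.

    # Enable CDF on the source table (a one-time property change).
    spark.sql("""
      ALTER TABLE source_table
      SET TBLPROPERTIES ('delta.enableChangeDataFeed' = true)
    """)

    # Read only the rows that changed since a given version.
    changes = (
        spark.read.format("delta")
        .option("readChangeFeed", "true")
        .option("startingVersion", 5)   # placeholder version
        .table("source_table")
    )

    # _change_type distinguishes inserts, deletes, and update pre/post images.
    changes.select("_change_type", "_commit_version", "_commit_timestamp").show()
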
RYBK
by New Contributor III
  • 10589 Views
  • 2 replies
  • 1 kudos

Resolved! External location + Failure to initialize configuration for storage account

Hello, I created a storage credential and an external location. Test is OK, I'm able to browse it from the portal. I have a notebook to create a table: %sql CREATE OR REPLACE TABLE myschema.mytable (data1 string, data2 string) USING DELTA LOCATION "abf...

Latest Reply
" src="" />
This widget could not be displayed.
This widget could not be displayed.
This widget could not be displayed.
  • 1 kudos

This widget could not be displayed.
Hello,I created a storage credential and an external location. Test is ok, I'm able to browse it from the portal. I have a notebook to create a table :%sqlCREATE OR REPLACE TABLE myschema.mytable(  data1 string, data2 string)USING DELTA LOCATION "abf...

This widget could not be displayed.
  • 1 kudos
This widget could not be displayed.
1 More Reply
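
A minimal sketch of the setup this question is about; the storage path, external location name, and principal are placeholders. A common cause of "Failure to initialize configuration for storage account" is that the abfss path is not being resolved through a Unity Catalog external location (for example, the path isn't covered by one, the user lacks privileges on it, or the cluster isn't using a UC-enabled access mode), so Spark falls back to cluster-level storage configuration that isn't there.

    # The user creating the table needs a privilege on the covering external location.
    spark.sql("""
      GRANT CREATE EXTERNAL TABLE ON EXTERNAL LOCATION `my_ext_loc` TO `user@example.com`
    """)

    # The LOCATION must sit under that external location's path.
    spark.sql("""
      CREATE OR REPLACE TABLE myschema.mytable (
        data1 STRING,
        data2 STRING
      )
      USING DELTA
      LOCATION 'abfss://container@storageaccount.dfs.core.windows.net/path/mytable'
    """)
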
117074
by New Contributor III
  • 10680 Views
  • 1 reply
  • 1 kudos

[INCONSISTENT_BEHAVIOR_CROSS_VERSION.PARSE_DATETIME_BY_NEW_PARSER]

Hi all, I'm trying to join 2 views in SQL editor for some analysis. I get the following error: [INCONSISTENT_BEHAVIOR_CROSS_VERSION.PARSE_DATETIME_BY_NEW_PARSER] You may get a different result due to the upgrading to Spark >= 3.0: Fail to parse '22/12/...

Latest Reply
117074
New Contributor III
  • 1 kudos

Hi Kaniz, I found the equivalent SQL code for this, but it didn't seem to store the operation past the execution; i.e., I would run the code to configure settings, then run the troublesome code afterwards and still get the same result. The problem has b...

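
A minimal sketch of the two usual fixes for this error (the view and column names are placeholders): either switch the session back to the legacy parser, noting that SET is per-session, which matches the "didn't store past the execution" observation above, or parse the string explicitly with its real pattern.

    # Session-scoped fallback to the Spark 2.x parser.
    spark.conf.set("spark.sql.legacy.timeParserPolicy", "LEGACY")
    # SQL-editor equivalent (must run in the same session as the failing query):
    #   SET spark.sql.legacy.timeParserPolicy = LEGACY;

    # Longer-term fix: parse with the actual format of the data ('22/12/...' looks like dd/MM/yyyy).
    spark.sql("""
      SELECT to_date(date_str, 'dd/MM/yyyy') AS parsed_date
      FROM my_view
    """).show()
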
yliu
by New Contributor III
  • 13466 Views
  • 2 replies
  • 1 kudos

Z-ordering optimization with multithreading

Hi, I am wondering if multithreading will help with the performance of Z-ordering optimization on multiple Delta tables. We are periodically doing optimization on thousands of tables, and it easily takes a few days to finish the job. So we are looking...

Latest Reply
" src="" />
This widget could not be displayed.
This widget could not be displayed.
This widget could not be displayed.
  • 1 kudos

This widget could not be displayed.
Hi, I am wondering if multithreading will help with the performance for z-ordering optimization on multiple delta tables.We are periodically doing optimization on thousands of tables and it easily takes a few days to finish the job. So we are looking...

This widget could not be displayed.
  • 1 kudos
This widget could not be displayed.
1 More Reply
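
A minimal sketch of the approach being asked about, with table names and Z-order columns as placeholders: OPTIMIZE runs on the cluster, so a driver-side thread pool mainly helps by keeping several tables' commands in flight at once instead of strictly one after another.

    from concurrent.futures import ThreadPoolExecutor, as_completed

    # table -> column to Z-order by (placeholders)
    tables = {"schema.table_a": "event_date", "schema.table_b": "customer_id"}

    def optimize(table, zcol):
        spark.sql(f"OPTIMIZE {table} ZORDER BY ({zcol})")
        return table

    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = [pool.submit(optimize, t, c) for t, c in tables.items()]
        for f in as_completed(futures):
            print("optimized", f.result())
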
Eeg
by New Contributor III
  • 12862 Views
  • 4 replies
  • 5 kudos

Pyflakes errors when using %run

I am using the %run command to import shared resources for each of my processes, because it is the easiest way to import my common libraries. However, that way pyflakes can't resolve the dependencies very well, and I end up working in code with ma...

Latest Reply
btafur
Databricks Employee
  • 5 kudos

You could use something like flake8 and customize the rules in the .flake8 file or ignore specific lines with #noqa. https://flake8.pycqa.org/en/latest/user/configuration.html

3 More Replies
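
A minimal sketch of the reply's suggestion; shared_helper stands in for a function defined by the %run'd notebook, so it is undefined as far as pyflakes/flake8 can see.

    # Silence the specific undefined-name check on lines that rely on %run:
    try:
        result = shared_helper("2024-01")  # noqa: F821  (defined via %run ./common)
    except NameError:
        result = None  # outside the notebook the helper does not exist

    # Or relax it project-wide in a .flake8 file (shown here as a comment):
    #   [flake8]
    #   extend-ignore = F821
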
turagittech
by New Contributor
  • 6833 Views
  • 0 replies
  • 0 kudos

Pandas 2.x availability

Hi All, I am wondering if Pandas 2.x will be available soon, or whether it is an available option to install. I have a small job I built to manipulate some strings from a database table which technically did the job, but doesn't scale with older versions of pan...

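
A small, hedged note in code form: the pandas version is pinned by the Databricks Runtime, so checking what the runtime ships and then doing a notebook-scoped install is the usual route; the version constraint below is only an example.

    import pandas as pd

    print(pd.__version__)   # older runtimes ship pandas 1.x, newer ones 2.x

    # In a notebook cell you could then install a newer pandas for that notebook only
    # (notebook magic, not plain Python):
    #   %pip install "pandas>=2.0,<3.0"
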
melodiesd
by New Contributor
  • 4834 Views
  • 0 replies
  • 0 kudos

Parse_Syntax_Error Help

Hello all, I'm new to Databricks and can't figure out why I'm getting an error in my SQL code. Error in SQL statement: ParseException: [PARSE_SYNTAX_ERROR] Syntax error at or near 'if'. (line 1, pos 0) == SQL == if OBJECT_ID('tempdb.#InitialData') IS N...

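
A minimal sketch of why the statement fails and the usual rewrite: Spark SQL has no T-SQL control flow, OBJECT_ID, or #temp tables, so 'if' is a syntax error at position 0; the idiomatic replacement is DROP ... IF EXISTS plus a temp view. The SELECT body below is a placeholder.

    # Equivalent of "if the temp object exists, drop it":
    spark.sql("DROP VIEW IF EXISTS InitialData")

    # Equivalent of re-creating the temp object:
    spark.sql("""
      CREATE OR REPLACE TEMP VIEW InitialData AS
      SELECT 1 AS placeholder_col   -- replace with the original SELECT
    """)
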
pygreg
by New Contributor
  • 2473 Views
  • 1 reply
  • 1 kudos

Resolved! Workflows : pass parameters to a "run job" task

Hi folks! I would like to know if there is a way to pass parameters to a "run job" task. For example, let's have a Job A with: a notebook task A.1 that takes as input a parameter year-month in the format yyyymm; a "run job" task A.2 that calls a Job B. I wou...

Latest Reply
btafur
Databricks Employee
  • 1 kudos

This feature will be available soon as part of Job Parameters. Right now it is not possible to easily pass parameters to a child job.

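
Until job parameters cover this, one workaround is to skip the "run job" task and have Job A trigger Job B through the Jobs API run-now endpoint, passing the value as a notebook parameter. A minimal sketch; the workspace URL, secret scope, job id, and parameter name are placeholders.

    import requests

    host = "https://adb-1234567890123456.7.azuredatabricks.net"   # placeholder workspace URL
    token = dbutils.secrets.get("my-scope", "databricks-pat")      # assumes a secret scope
    job_b_id = 123                                                 # placeholder job id

    resp = requests.post(
        f"{host}/api/2.1/jobs/run-now",
        headers={"Authorization": f"Bearer {token}"},
        json={"job_id": job_b_id, "notebook_params": {"year_month": "202312"}},
    )
    resp.raise_for_status()
    print(resp.json())   # contains the run_id of the triggered Job B run
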
peterwishart
by New Contributor III
  • 3798 Views
  • 4 replies
  • 0 kudos

Resolved! Programmatically updating the “run_as_user_name” parameter for jobs

I am trying to write a process that will programmatically update the “run_as_user_name” parameter for all jobs in an Azure Databricks workspace, using PowerShell to interact with the Jobs API. I have been trying to do this with a test job without suc...

Latest Reply
baubleglue
New Contributor II
  • 0 kudos

The solution you've submitted is a solution for a different topic (permission to run the job; the job still runs as the user in the run_as_user_name field). Here is an example of changing "run_as_user_name". Docs: https://docs.databricks.com/api/azure/workspace/job...

3 More Replies
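
A minimal sketch of the direction the reply points at, under the assumption (worth verifying against the linked docs) that Jobs API 2.1 accepts a partial new_settings containing run_as, while run_as_user_name itself is read-only in responses. Workspace URL, token, job id, and user are placeholders; the same call can be made from PowerShell with Invoke-RestMethod.

    import requests

    host = "https://adb-1234567890123456.7.azuredatabricks.net"   # placeholder workspace URL
    token = "<personal-access-token>"                              # placeholder
    job_id = 123                                                   # placeholder job id

    resp = requests.post(
        f"{host}/api/2.1/jobs/update",
        headers={"Authorization": f"Bearer {token}"},
        json={
            "job_id": job_id,
            "new_settings": {"run_as": {"user_name": "owner@example.com"}},
        },
    )
    resp.raise_for_status()
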

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group