Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Data + AI Summit 2024 - Data Engineering & Streaming

Forum Posts

Peston
by New Contributor
  • 2130 Views
  • 0 replies
  • 0 kudos

Data+AI Summit

Having a great time at the summit and learning about the advances in AI. Excited about the ability to use English as a new programming language and the advantage it will bring to companies that adapt to the future of Databricks...

vrakesh77
by New Contributor
  • 467 Views
  • 0 replies
  • 0 kudos

DAIS 2023

Amazing! Excited to be part of my first DAIS. Incredible contributions. Looking forward to putting some of the tools and features into practice.

Ncman
by New Contributor
  • 372 Views
  • 0 replies
  • 0 kudos

Summit

Having a great time at the data summit! It is an amazing experience and is organized very well! #databricks

bikash84
by New Contributor III
  • 2677 Views
  • 1 reply
  • 1 kudos

Resolved! Need help with ConcurrentAppendException during Delta merge

I have two datasets being loaded into a common silver table. The loads are event-driven: notebooks are triggered when a file is dropped into the storage account. When the files come in at the same time, one dataset fails with a concurrent append excepti...

Latest Reply
yuvapraveen_k
New Contributor III
  • 1 kudos

Databricks has provided ACID guarantees since the inception of the Delta format. To ensure that the C (consistency) is addressed, it prevents concurrent workflows from performing updates at the same time, like other ACID-compliant SQL engines. The key differenc...

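The truncated reply points at the standard remedy: make sure concurrent MERGE operations touch provably disjoint data. A minimal sketch of that pattern, assuming (hypothetically) a silver table partitioned by a source_system column and keyed on event_id; putting the partition value into the merge condition lets Delta verify the two triggered notebooks cannot conflict:

# Hedged sketch; the table, column, and key names are assumptions.
from delta.tables import DeltaTable

def merge_into_silver(spark, updates_df, source_system):
    silver = DeltaTable.forName(spark, "silver_events")
    (silver.alias("t")
        .merge(
            updates_df.alias("s"),
            # The explicit partition predicate is what prevents
            # ConcurrentAppendException between the two concurrent writers.
            f"t.source_system = '{source_system}' AND t.event_id = s.event_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())
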
Databricks_-Dat
by New Contributor II
  • 5255 Views
  • 3 replies
  • 1 kudos

Databricks Workflows: sample script/method to deploy jobs.json to another workspace

Could someone point me in the right direction for deploying jobs from one workspace to another using a JSON file in a DevOps CI/CD pipeline? Thanks in advance.

Latest Reply
yuvapraveen_k
New Contributor III
  • 1 kudos

You're welcome. Databricks released a feature that links the workflow definition to Git automatically. Please refer to the link below: https://www.databricks.com/blog/2022/06/21/build-reliable-production-data-and-ml-pipelines-with-git...

2 More Replies
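For the deployment step itself, a hedged sketch of pushing a job definition JSON to a target workspace with the Jobs 2.1 REST API; the jobs.json file name and the DATABRICKS_HOST/DATABRICKS_TOKEN variables are assumptions your pipeline would supply:

import json
import os

import requests

# Load the exported job settings (assumed to match the /api/2.1/jobs/create schema).
with open("jobs.json") as f:
    job_spec = json.load(f)

resp = requests.post(
    f"{os.environ['DATABRICKS_HOST']}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
    json=job_spec,
)
resp.raise_for_status()
print("Created job", resp.json()["job_id"])
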
Nam
by New Contributor II
  • 1881 Views
  • 1 reply
  • 0 kudos
Latest Reply
AlanF
New Contributor II
  • 0 kudos

No real difference; both are very similar. The main distinction is that the SQL editor gives you things like serverless SQL warehouses, better visibility of data objects, and more streamlined exploration tooling, which you cannot take advantage of from a notebook. SQL in notebooks mostly run ...

Essjay
by New Contributor II
  • 1861 Views
  • 1 reply
  • 3 kudos

Resolved! Standard SKU to Premium SKU?

What's the effort involved in converting a Databricks Standard workspace to Premium? Is it one click of a button, or are there other considerations to think about?

Latest Reply
srusso
New Contributor III
  • 3 kudos

It’s pretty easy! You can use the Azure API or the CLI. To upgrade, use the Azure Databricks workspace creation API to recreate the workspace with exactly the same parameters as the Standard workspace, specifying the sku property as Premium. To use t...

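As a rough illustration of the recreate-with-the-same-parameters approach described above, here is a hedged sketch issued against the Azure Resource Manager API; the subscription, resource group, workspace name, region, managed resource group, and bearer token are all placeholders you would take from the existing Standard workspace:

import requests

sub, rg, ws = "<subscription-id>", "<resource-group>", "<workspace-name>"
url = (f"https://management.azure.com/subscriptions/{sub}/resourceGroups/{rg}"
       f"/providers/Microsoft.Databricks/workspaces/{ws}?api-version=2018-04-01")
body = {
    "location": "<region-of-existing-workspace>",
    "sku": {"name": "premium"},  # the only parameter that changes
    "properties": {
        # Reuse the managed resource group of the Standard workspace.
        "managedResourceGroupId": f"/subscriptions/{sub}/resourceGroups/<managed-rg>",
    },
}
resp = requests.put(url, json=body, headers={"Authorization": "Bearer <arm-token>"})
resp.raise_for_status()
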
msa_2j212
by New Contributor II
  • 2255 Views
  • 2 replies
  • 1 kudos

Downloading and storing a PDF file to FileStore not working

I'm trying to download a PDF file and store it in FileStore using this code in a notebook:

with open('/dbfs/FileStore/file.pdf', 'wb') as f:
    f.write(requests.get('https://url.com/file.pdf').content)

But I'm getting this error: FileNotFoundError: [...

Latest Reply
msa_2j212
New Contributor II
  • 1 kudos

This worked, thanks. 

1 More Reply
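The accepted fix is not shown in this excerpt. A common workaround for this error, in case the /dbfs FUSE mount is not writable on the cluster in question, is to download to the driver's local disk first and then copy into DBFS; the URL and paths below are placeholders:

import requests

# Write to local driver storage, which plain Python file APIs can always reach.
local_path = "/tmp/file.pdf"
with open(local_path, "wb") as f:
    f.write(requests.get("https://url.com/file.pdf").content)

# dbutils is predefined in Databricks notebooks; copy the file into DBFS.
dbutils.fs.cp(f"file:{local_path}", "dbfs:/FileStore/file.pdf")
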
berdoni
by New Contributor III
  • 795 Views
  • 0 replies
  • 0 kudos

Migrate an Azure Delta Table from Standard to Premium Storage

Hello, I'm working on a migration of an Azure Delta table (10 TB) from the Azure Standard performance tier to Azure Premium. The plan is to create a new storage account and copy the table into it. Then we will switch to the new table. The table contains r...

Data Engineering
azure
deep clone
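The "deep clone" label suggests the likely mechanics of the copy. A hedged sketch, with placeholder storage accounts and paths; re-running the same statement syncs the clone incrementally, which helps keep the 10 TB copy fresh until the final cutover:

# Copy the table into the new premium storage account with a Delta deep clone.
spark.sql("""
    CREATE OR REPLACE TABLE delta.`abfss://data@premiumaccount.dfs.core.windows.net/events`
    DEEP CLONE delta.`abfss://data@standardaccount.dfs.core.windows.net/events`
""")
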
Hubert-Dudek
by Esteemed Contributor III
  • 4084 Views
  • 3 replies
  • 25 kudos

Bamboolib with Databricks: low-code programming is now available on #databricks

Now you can prepare your Databricks code without ... coding. A low-code solution is now available on Databricks. Install and import bamboolib to start (requires a version of ...

Latest Reply
Palkers
New Contributor III
  • 25 kudos

I have tried to load a parquet file using the bamboolib menu and am getting the error below that the path does not exist. I can load the same file with no problem using Spark or pandas with the following path: citi_pdf = pd.read_parquet(f'/dbfs/mnt/orbify-sales-raw/Wide...

2 More Replies
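A minimal getting-started sketch based on the announcement above; the CSV path is a placeholder, and the truncated version requirement presumably refers to the minimum Databricks Runtime:

# %pip is a notebook magic; run it in its own cell.
%pip install bamboolib

import bamboolib as bam  # importing bamboolib activates its UI
import pandas as pd

df = pd.read_csv("/dbfs/FileStore/example.csv")
df  # displaying a pandas DataFrame now offers the interactive bamboolib widget
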
Kamalpandey
by New Contributor
  • 1781 Views
  • 2 replies
  • 0 kudos

How to connect third-party apps

Help us understand how to integrate a third-party app, such as M365, with Databricks.

Latest Reply
VRS
New Contributor II
  • 0 kudos

Try using "partner connect" API by databricks. They have manual connect and automatic connect options for this. databricks have individual partner connect api for each third application. Check if M365 supports this type of integration with databricks...

1 More Reply
whleeman
by New Contributor III
  • 1786 Views
  • 2 replies
  • 2 kudos

Resolved! Can I use SQL to create a table and set an expiration date?

Can I use SQL to create a table and set an expiration date, so that the table will be deleted automatically after the expiration date?

Latest Reply
prakash1
New Contributor II
  • 2 kudos

Right now it is not possible. You can always create a custom scheduled solution with a parameter that calls a notebook and deletes the table.

1 More Reply
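A hedged sketch of that workaround: record the expiry as a custom table property and let a scheduled notebook drop anything past due. The expires_on property name and the scratch schema are assumptions, not a built-in feature:

from datetime import date

# Tag a table with its expiry at creation time.
spark.sql("""
    CREATE TABLE IF NOT EXISTS scratch.report (id INT)
    TBLPROPERTIES ('expires_on' = '2024-12-31')
""")

# Scheduled cleanup: drop every table in the schema whose expiry has passed.
for row in spark.sql("SHOW TABLES IN scratch").collect():
    tbl = f"scratch.{row['tableName']}"
    props = {p['key']: p['value']
             for p in spark.sql(f"SHOW TBLPROPERTIES {tbl}").collect()}
    expiry = props.get("expires_on")
    if expiry and date.fromisoformat(expiry) < date.today():
        spark.sql(f"DROP TABLE {tbl}")
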
