
Welcome to the Databricks Community

Discover the latest insights, collaborate with peers, get help from experts and make meaningful connections

102,288 members
52,584 posts
Submit your feedback and win a $25 gift card!

The first 25 people to submit a completed survey response will receive a $25 gift card as an expression of our gratitude. Treat yourself on Databricks! Your feedback is crucial to us and directly influences how we innovate and improve our customer ex...

  • 2536 Views
  • 3 replies
  • 1 kudos
2 weeks ago
Meet DBRX, the New Standard for High-Quality LLMs

Get your first look at DBRX. April 25, 2024 | 8 AM PT. If you’re using off-the-shelf LLMs to build GenAI applications, you’re probably struggling with quality, privacy and governance issues. What you need is a way to cost-effectively build a custom LLM...

  • 2196 Views
  • 3 replies
  • 2 kudos
2 weeks ago
Data Warehousing in the Era of AI

AI has the power to address the data warehouse’s biggest challenges — performance, governance and usability — thanks to its deeper understanding of your data and how it’s used. This is data intelligence and it’s revolutionizing the way you query, man...

  • 2791 Views
  • 5 replies
  • 1 kudos
2 weeks ago
Meet the Community Team Virtually!

Prepare to enhance your socializing adventure! Date: April 18, 2024 | Time: 9:00 - 9:30 AM IST | Location: Virtual Event (link provided upon registration). What's in store for you? Exciting icebreaker activities, engaging discussions, networking oppo...

  • 5471 Views
  • 5 replies
  • 1 kudos
3 weeks ago

Community Activity

QPeiran
by New Contributor III
  • 239 Views
  • 3 replies
  • 0 kudos

Can a Delta table be the source of streaming/Auto Loader?

Hi, since Auto Loader only accepts "append-only" data as the source, I am wondering whether a Delta table can also be the source. Will VACUUM (deleting stale files) or _delta_log (creating nested files in a different format than Parquet) break A...

Latest Reply
artsheiko
Valued Contributor III
  • 0 kudos

Hi @QPeiran, Auto Loader is a feature for ingesting files into the Data Platform. Once your data is stored in a Delta table, you can rely on spark.readStream.table("<my_table_name>") to continuously read from the table. Take a look at ...
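As a rough illustration of the pattern described in this reply (a minimal sketch only; the table names, checkpoint path, and trigger choice below are hypothetical placeholders, not the poster's actual setup):

```python
# Minimal sketch: continuously read new rows from an existing Delta table
# and append them to another table. All names and paths are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

stream_df = spark.readStream.table("my_catalog.my_schema.events")  # Delta table as a streaming source

query = (
    stream_df.writeStream
    .option("checkpointLocation", "/Volumes/my_catalog/my_schema/checkpoints/events_copy")
    .trigger(availableNow=True)   # process whatever data is available, then stop
    .toTable("my_catalog.my_schema.events_copy")
)
query.awaitTermination()
```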

  • 0 kudos
2 More Replies
vinay076
by New Contributor II
  • 288 Views
  • 1 replies
  • 0 kudos

There is no certification number in the Databricks certificate that I received after passing the

I recently enrolled for the Databricks Data Engineer certification, took the exam, and cleared it successfully. I have received the certificate as a PDF file, along with a URL where I can see my certificate and ba...

Latest Reply
artsheiko
Valued Contributor III
  • 0 kudos

Hi, please ping @Cert-Team in https://community.databricks.com/t5/certifications/bd-p/databricks-certification-discussion

  • 0 kudos
alano
by New Contributor
  • 116 Views
  • 1 replies
  • 0 kudos

Handling large volumes of streamed transactional data using DLT

We have a data stream from Event Hub with approximately 10 million rows per day (into one table); these records are insert-only (no updates). We are trying to find a solution to aggregate / group the data based on multiple data points, and our requ...

Latest Reply
artsheiko
Valued Contributor III
  • 0 kudos

Hi, please find below a set of resources I believe are relevant for you. Success stories: you can find the success stories of companies leveraging streaming on Databricks here. Videos: Introduction to Data Streaming on the Lakehouse: Structured Stream...
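For context, one possible shape of such an aggregation as a Delta Live Tables pipeline (a sketch only, assuming an insert-only source table already ingested from Event Hub; it must run inside a DLT pipeline, and all table and column names are hypothetical placeholders):

```python
# Sketch of a DLT pipeline: an append-only bronze table feeding an aggregated table.
# Only valid inside a Delta Live Tables pipeline, where `spark` is provided.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Insert-only events streamed in from Event Hub (placeholder source).")
def raw_events():
    return spark.readStream.table("my_catalog.bronze.eventhub_events")

@dlt.table(comment="Daily totals recomputed by DLT as new data arrives.")
def daily_totals():
    return (
        dlt.read("raw_events")                      # batch read, suitable for full aggregations
        .groupBy("event_date", "event_type")
        .agg(F.count("*").alias("event_count"))
    )
```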

  • 0 kudos
leireroman
by New Contributor
  • 66 Views
  • 1 replies
  • 0 kudos

Bootstrap Timeout during job cluster start

My job was not able to start because I got this problem in the job cluster. This job is running on an Azure Databricks workspace that has been deployed for almost a year, and I have not had this error before. It is deployed in North Europe. After getting...

(screenshot attached: leireroman_0-1713160992292.png)
Latest Reply
lukasjh
Visitor
  • 0 kudos

We have the same problem randomly occurring since yesterday in two workspaces. The cluster started fine this morning at 08:00, but failed again from around 09:00 onward.

  • 0 kudos
SunilRenukaiah
by New Contributor
  • 101 Views
  • 3 replies
  • 0 kudos

Resolved! Certification Exam suspended

Hello @Cert-Team. I was about to submit the exam after reviewing the final questions, but the proctor suddenly suspended it. You can check the entire video; I haven't committed any malpractice and I cooperated with all requests to show the ent...

Latest Reply
SunilRenukaiah
New Contributor
  • 0 kudos

Thanks for the support. I passed the certification today.

  • 0 kudos
2 More Replies
chemajar
by New Contributor III
  • 374 Views
  • 3 replies
  • 1 kudos

Resolved! Rearrange tasks in databricks workflow

Hello, is there any way to rearrange tasks in a Databricks workflow? I would like the line that joins the two marked tasks not to pass behind the other tasks. Is it possible to route this line along one side? Thanks.

(screenshot attached: image.png)
Latest Reply
artsheiko
Valued Contributor III
  • 1 kudos

Hi @chemajar, take a look at Databricks Asset Bundles. They allow you to streamline the development of complex workflows using a YAML definition. If you need to change the task dependencies, you can rearrange the flow as you need; just change the ...

  • 1 kudos
2 More Replies
MrJava
by New Contributor III
  • 3279 Views
  • 9 replies
  • 9 kudos

How to know, who started a job run?

Hi there! We have different jobs/workflows configured in our Databricks workspace running on AWS and would like to know who actually started a job run. Was it started by a user or by a service principal using curl? Currently one can only see who is t...

Latest Reply
leonorgrosso
  • 9 kudos

I've just posted this idea on the Databricks Idea Portal regarding this subject. Upvote it so it may get developed! https://feedback.azure.com/d365community/idea/5d0fdbbf-eefb-ee11-a73c-0022485313bb

  • 9 kudos
8 More Replies
RahulChaubey
by New Contributor II
  • 360 Views
  • 3 replies
  • 0 kudos

Do we pay just for query run duration when using Databricks Serverless SQL?

When using Databricks Serverless SQL to run queries, do we only pay for the compute resources during the run duration of the query?

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @RahulChaubey, when using Databricks Serverless SQL, the pricing model is designed to be pay-as-you-go and is based on Databricks Units (DBUs). Let me break it down for you: Serverless SQL allows you to run SQL queries for BI reporting,...
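As a back-of-the-envelope illustration of that DBU-based billing (the rate and consumption figures below are made-up placeholders, not actual Databricks pricing):

```python
# Rough cost estimate for serverless SQL usage. All numbers are hypothetical;
# real DBU rates and DBU consumption depend on region, contract, and warehouse size.
dbu_rate_usd = 0.70            # assumed price per DBU (placeholder)
warehouse_dbus_per_hour = 12   # assumed DBU burn rate for the chosen warehouse size (placeholder)
active_hours = 1.5             # hours the warehouse was actually serving queries

estimated_cost = dbu_rate_usd * warehouse_dbus_per_hour * active_hours
print(f"Estimated cost: ${estimated_cost:.2f}")   # 0.70 * 12 * 1.5 = $12.60
```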

  • 0 kudos
2 More Replies
Alexandru
by New Contributor
  • 53 Views
  • 1 replies
  • 0 kudos

VS Code Python project for development

Hi, I'm trying to set up a local development environment using Python / VS Code / Poetry. Linting is enabled (Microsoft Pylance extension) and python.analysis.typeCheckingMode is set to strict. We are using Python files for our code (.py) whit...

Latest Reply
artsheiko
Valued Contributor III
  • 0 kudos

Hi Alexandru, take a look at the VS Code extension for Databricks: https://marketplace.visualstudio.com/items?itemName=databricks.databricks

  • 0 kudos
Ruby8376
by Valued Contributor
  • 65 Views
  • 1 replies
  • 0 kudos
Latest Reply
artsheiko
Valued Contributor III
  • 0 kudos

Hi, can you clarify what your aim is? Maybe there is no need to use the Databricks SDK at all?

  • 0 kudos
prasha123
by New Contributor
  • 162 Views
  • 3 replies
  • 0 kudos

Unity Catalog view access in Azure Storage account

Hi, I have my Unity Catalog in an Azure Storage account and I can access the table objects, but I couldn't find the views that were created on top of those tables. 1. I can access Delta tables & related views via Databricks SQL and also find the tab...

Labels: Community Discussions, Azure Storage Account, Delta views, Unity Catalog
Latest Reply
artsheiko
Valued Contributor III
  • 0 kudos

Hi, a couple of options are possible: use Databricks to run the complex SQL queries (joins, unions, etc.) and write the result to a staging Delta table, then use Dataflow to read from that staged table. Orchestrate all of this using ADF or even Databricks Workflo...
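A minimal sketch of the first option (run the heavy SQL in Databricks and persist the result to a staging Delta table); the catalog, schema, table, and column names are hypothetical placeholders:

```python
# Sketch: run a join in Databricks and persist the result to a staging Delta table
# that a downstream tool (e.g. Dataflow) can then read. All names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

staged = spark.sql("""
    SELECT o.order_id, o.amount, c.customer_name
    FROM   my_catalog.sales.orders    AS o
    JOIN   my_catalog.sales.customers AS c
      ON   o.customer_id = c.customer_id
""")

(
    staged.write
    .mode("overwrite")                 # rebuild the staging table on each run
    .saveAsTable("my_catalog.staging.orders_enriched")
)
```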

  • 0 kudos
2 More Replies
Carpender
by New Contributor II
  • 211 Views
  • 2 replies
  • 1 kudos

PowerBI Tips

Does anyone have any tips for using Power BI on top of Databricks? Any best practices you know of, or roadblocks you have run into that should be avoided? Thanks.

Latest Reply
artsheiko
Valued Contributor III
  • 1 kudos

Hey, use Partner Connect to establish a connection to Power BI. Consider using Databricks SQL Serverless warehouses for the best user experience and performance (see Intelligent Workload Management, i.e. auto-scaling and query queuing, remote result cache, ...

  • 1 kudos
1 More Replies
Databricks_info
by New Contributor II
  • 223 Views
  • 4 replies
  • 0 kudos

Concurrent Update to Delta - Throws error

Team, I get a "ConcurrentAppendException: Files were added to the root of the table by a concurrent update" when trying to update a table via jobs executed from a ForEach activity in ADF. I tried Databricks Runtime 14.x and set the delete vect...

Latest Reply
artsheiko
Valued Contributor III
  • 0 kudos

Hey, this issue happens whenever two or more jobs try to write to the same partition of a table. This exception is often thrown during concurrent DELETE, UPDATE, or MERGE operations. While the concurrent operations may be physically updating differe...
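One common mitigation, as a sketch under the assumption that the target table is partitioned by a date column (the table, view, and column names below are hypothetical placeholders), is to make the partition explicit in the MERGE condition so that concurrent jobs touch disjoint partitions:

```python
# Sketch: restrict a MERGE to one partition so concurrent jobs writing to other
# partitions do not raise ConcurrentAppendException. All names are placeholders;
# `spark` is the active SparkSession in a Databricks notebook, and `updates`
# is assumed to be a temp view holding this run's changes.
run_date = "2024-04-18"

spark.sql(f"""
    MERGE INTO my_catalog.my_schema.target AS t
    USING updates AS s
      ON  t.id = s.id
      AND t.event_date = DATE'{run_date}'   -- explicit partition predicate
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```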

  • 0 kudos
3 More Replies
Nagarathna
by New Contributor
  • 38 Views
  • 1 replies
  • 0 kudos

File not found error when trying to read a JSON file from AWS S3 using with open

I am trying to read JSON from AWS S3 using with open in a Databricks notebook on a shared cluster. Error message: No such file or directory: '/dbfs/mnt/datalake/input_json_schema.json'. On a single-instance cluster the above error does not occur.

Latest Reply
artsheiko
Valued Contributor III
  • 0 kudos

Hey, please consider using Unity Catalog with Volumes. You'll find a quickstart notebook here: https://docs.databricks.com/en/connect/unity-catalog/volumes.html#tutorial-unity-catalog-volumes-notebook Hope it helps. Best,
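A minimal sketch of the Volumes approach, assuming a volume has already been created and the file uploaded to it (the path below is a hypothetical placeholder):

```python
# Sketch: read a JSON file through a Unity Catalog volume path instead of /dbfs/mnt.
# Volume paths work with standard Python file APIs on shared-access-mode clusters.
import json

volume_path = "/Volumes/my_catalog/my_schema/my_volume/input_json_schema.json"  # placeholder path

with open(volume_path, "r") as f:
    schema = json.load(f)

print(schema)
```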

  • 0 kudos
RuchiK
by New Contributor
  • 64 Views
  • 1 replies
  • 0 kudos

Connecting to a Databricks SQL warehouse from .NET

Hi, how can I connect to a Databricks SQL warehouse from a .NET application? Kr

Latest Reply
artsheiko
Valued Contributor III
  • 0 kudos

Hey, please take a look at the Statement Execution API. Best,
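Since the Statement Execution API is a plain REST interface, the same request can be issued from .NET (for example with HttpClient). A minimal sketch in Python follows; the workspace URL, token, and warehouse ID are placeholders, and the exact request fields should be checked against the API docs:

```python
# Sketch: run a query against a SQL warehouse via the Statement Execution API.
# Host, token, and warehouse_id are placeholders; the same POST works from any HTTP client.
import requests

host = "https://<your-workspace-url>"        # placeholder
token = "<personal-access-token>"            # placeholder
warehouse_id = "<warehouse-id>"              # placeholder

resp = requests.post(
    f"{host}/api/2.0/sql/statements",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "warehouse_id": warehouse_id,
        "statement": "SELECT 1 AS answer",
        "wait_timeout": "30s",               # wait synchronously up to 30 seconds
    },
)
resp.raise_for_status()
print(resp.json().get("result"))
```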

  • 0 kudos

Latest from our Blog

Attributing Costs in Databricks Model Serving

Databricks Model Serving provides a scalable, low-latency hosting service for AI models. It supports models ranging from small custom models to best-in-class large language models (LLMs). In this blog...

2352 Views, 1 kudos

MLOps Gym - Unity Catalog Setup for MLOps

Unity Catalog (UC) is Databricks' unified governance solution for all data and AI assets on the Data Intelligence Platform. UC is central to implementing MLOps on Databricks, as it is where all your as...

2685 Views, 0 kudos

Highly selective: SQL refined beyond the WHERE

Inuktitut, the language of the Inuit, has 50 words for snow and ice. That’s - as they say - fake news, but the point made is metaphorical: When something is important to a people, their language finds...

3317 Views, 3 kudos