Databricks Learning Festival (Virtual): 15 January - 31 January 2025

Join us for the return of the Databricks Learning Festival (Virtual)! Mark your calendars from 15 January - 31 January 2025! Upskill today across data engineering, data analysis, machine learning, and generative AI. Join the thousands who have el...

  • 75208 Views
  • 171 replies
  • 49 kudos
11-26-2024
Share Your Feedback in Our Community Survey

Your opinion matters! Take a few minutes to complete our Customer Experience Survey to help us improve the Databricks Community. Your input is crucial in shaping the future of our community and ensuring it meets your needs. Take the Survey Now Why p...

  • 646 Views
  • 0 replies
  • 0 kudos
a week ago
Submit your feedback and win a $50 gift card!

Be among the first 20 people to share your experience with Databricks on G2 and receive a $50 gift card as our way of saying thanks. Why participate? Your feedback is invaluable. It helps us innovate and improve, and takes less than 10 minutes to comple...

  • 507 Views
  • 0 replies
  • 4 kudos
Tuesday
Databricks Named a Leader in the 2024 Gartner® Magic Quadrant™ for Cloud Database Management Systems

We’re thrilled to share that Databricks has once again been recognized as a Leader in the 2024 Gartner® Magic Quadrant™ for Cloud Database Management Systems. This acknowledgement underscores our commitment to innovation and our leadership in the dat...

  • 1416 Views
  • 0 replies
  • 3 kudos
3 weeks ago
Milestone: DatabricksTV Reaches 100 Videos!

We are thrilled to announce that DatabricksTV, our growing video hub, has hit a major milestone: 100 videos and counting! What is DatabricksTV? DatabricksTV is a community-driven video hub designed to help data practitioners maximize the Databricks e...

  • 1628 Views
  • 1 reply
  • 4 kudos
12-11-2024
Announcing the new Meta Llama 3.3 model on Databricks

The Meta Llama 3.3 multilingual large language model (LLM) is a pretrained and instruction-tuned generative model in 70B (text in/text out). The Llama 3.3 instruction-tuned, text-only model is optimized for multilingual dialogue use cases and outperfo...

  • 2111 Views
  • 0 replies
  • 3 kudos
12-11-2024

Community Activity

ls
by > New Contributor II
  • 16 Views
  • 1 reply
  • 0 kudos

Change spark configs in Serverless compute clusters

Howdy! I wanted to know how I can change some Spark configs in Serverless compute. I have a base.yml file and tried placing: spark_conf: - spark.driver.maxResultSize: "16g" but I still get this error: [CONFIG_NOT_AVAILABLE] Configuration spark.driv...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Spark configs are limited in Serverless; these are the supported configs you can set: https://docs.databricks.com/en/release-notes/serverless/index.html#supported-spark-configuration-parameters
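To picture why the error above appears, here is a minimal plain-Python sketch of an allow-list check (the SUPPORTED set below is illustrative only, not the real list — the authoritative allow-list is in the release notes linked above, and this is not the actual Databricks API):

```python
# Conceptual sketch: serverless compute accepts only an allow-listed set of
# session-level Spark confs and rejects everything else, which is why
# spark.driver.maxResultSize fails with CONFIG_NOT_AVAILABLE.
SUPPORTED = {"spark.sql.session.timeZone", "spark.sql.legacy.timeParserPolicy"}

def set_conf(conf_store, key, value):
    # Mimic how an allow-listed conf succeeds and any other conf raises.
    if key not in SUPPORTED:
        raise ValueError(f"[CONFIG_NOT_AVAILABLE] Configuration {key} is not available.")
    conf_store[key] = value

confs = {}
set_conf(confs, "spark.sql.session.timeZone", "UTC")    # accepted
# set_conf(confs, "spark.driver.maxResultSize", "16g")  # would raise
```

The practical takeaway: driver/executor sizing confs are not settable on serverless; only the documented session-level parameters are.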

CBL
by > New Contributor
  • 1093 Views
  • 1 reply
  • 0 kudos

Schema Evolution in Azure Databricks

Hi All - In my scenario, I am loading data from 100s of JSON files. The problem is that fields/columns are missing when a JSON file contains new fields. Full Load: while writing JSON to Delta, use the option ("mergeSchema", "true") so that we do not miss new columns. Inc...

Latest Reply
cgrant
Databricks Employee
  • 0 kudos

For these scenarios, you can use schema evolution capabilities like mergeSchema or opt to use the new VariantType to avoid requiring a schema at time of ingest.
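A rough way to picture what mergeSchema does, sketched in plain Python (column names and types below are made up for illustration; this is a stand-in for Delta's behavior, not the real implementation):

```python
# Plain-Python stand-in for Delta's mergeSchema: the table's new schema is
# the union of the existing columns and the incoming file's columns.
# Existing columns keep their types; newly seen columns are appended.
existing_schema = {"id": "bigint", "name": "string"}
incoming_schema = {"id": "bigint", "email": "string"}  # "email" is new

merged_schema = {**existing_schema, **incoming_schema}
# merged_schema now contains id, name, and email.
```

With VariantType, by contrast, the raw JSON is stored as a single semi-structured column, so no schema needs to be fixed at ingest time at all.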

TheDataEngineer
by > New Contributor
  • 3161 Views
  • 1 reply
  • 0 kudos

'replaceWhere' clause in spark.write for a partitioned table

Hi, I want to be clear about the 'replaceWhere' clause in spark.write. Here is the scenario: I would like to add a column to a few existing records. The table is already partitioned on the "PickupMonth" column. Here is an example: Without 'replaceWhere': spark.read \ .f...

Latest Reply
cgrant
Databricks Employee
  • 0 kudos

For this style of ETL, there are 2 methods. The first method, strictly for partitioned tables, is Dynamic Partition Overwrites, which requires a Spark configuration to be set and detects which partitions are to be overwritten by scanning the input...
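The difference between the two styles can be sketched in plain Python, modeling the table as a dict of partition → rows (all names and values are illustrative, not PySpark API):

```python
# Model a table partitioned on PickupMonth as {partition: rows}.
table = {"2024-01": ["trip_a"], "2024-02": ["trip_b"], "2024-03": ["trip_c"]}
incoming = {"2024-02": ["trip_b_fixed"]}

# Dynamic partition overwrite: only partitions present in the input
# (detected by scanning it) are replaced; all others are untouched.
dynamic_result = {**table, **incoming}

# replaceWhere: delete rows matching an explicit predicate, then insert the
# new rows. The predicate may also reference non-partition columns.
predicate = lambda month: month == "2024-02"
replace_where_result = {m: r for m, r in table.items() if not predicate(m)}
replace_where_result.update(incoming)
```

When the replaceWhere predicate exactly covers the rewritten partitions, as here, the two approaches produce the same table; replaceWhere is the more general tool because the predicate need not align with partition boundaries.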

mrstevegross
by > Visitor
  • 31 Views
  • 7 replies
  • 0 kudos

Resolved! Tutorial docs for running a job using serverless?

I'm exploring whether serverless (https://docs.databricks.com/en/jobs/run-serverless-jobs.html#create-a-job-using-serverless-compute) could be useful for our use case. I'd like to see an example of using serverless via the API. The docs say "To learn...

Latest Reply
mrstevegross
  • 0 kudos

Thanks!

6 More Replies
somedeveloper
by > New Contributor II
  • 93 Views
  • 2 replies
  • 1 kudos

Resolved! Accessing Application Listening to Port Through Driver Proxy URL

Good afternoon, I have an application, Evidently, for which I am starting a dashboard service that listens on an open port. I would like to access this through the driver proxy URL, but when starting the service and accessing it, I am given a 502 B...

Latest Reply
somedeveloper
New Contributor II
  • 1 kudos

The solution was to add --host 0.0.0.0 as an argument to the command. 
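For anyone hitting the same 502: a server bound to 127.0.0.1 only accepts connections arriving on the loopback interface, so the driver proxy (connecting from outside the process's loopback) cannot reach it; binding 0.0.0.0 listens on all interfaces. A quick stdlib demonstration of the bind address (this only shows the socket behavior, not Evidently itself):

```python
# Binding to "0.0.0.0" means "all interfaces": external clients such as a
# reverse proxy can connect. Binding "127.0.0.1" would accept loopback only.
import socket

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("0.0.0.0", 0))  # port 0 = let the OS pick any free port
bound_host, bound_port = srv.getsockname()
srv.close()
```

This is why dashboard tools generally need a `--host 0.0.0.0` (or equivalent) flag to be reachable behind the Databricks driver proxy.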

1 More Replies
a_user12
by > New Contributor II
  • 16 Views
  • 2 replies
  • 1 kudos

Get Notebooks of "Data Engineer Learning Plan" Course

Hi! In Databricks Academy I registered for the new "Data Engineer Learning Plan". The videos show a lot of notebooks. Where can I download them?

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

It appears that Databricks no longer offers downloadable notebooks or DBC files for self-paced courses, including the "Data Engineer Learning Plan". However, if you are interested in working on labs within a provided Databricks environment, you can ...

1 More Replies
mrstevegross
by > Visitor
  • 27 Views
  • 6 replies
  • 0 kudos

preloaded_docker_images: how do they work?

At my org, when we start a Databricks cluster, it often takes a while to become available (due to (1) instance provisioning, (2) library loading, and (3) init script execution). I'm exploring whether an instance pool could be a viable strategy for im...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Sure, I will inform the team in charge of it to review it.

5 More Replies
jabori
by > New Contributor
  • 2051 Views
  • 2 replies
  • 0 kudos

How can I pass job parameters to a dbt task?

I have a dbt task that will use dynamic parameters from the job: {"start_time": "{{job.start_time.[timestamp_ms]}}"}. My SQL is edited like this: select 1 as id union all select null as id union all select {start_time} as id. This causes the task to fail. How...

Latest Reply
MathieuDB
Databricks Employee
  • 0 kudos

Also, you need to pass the parameters using the --vars flag, like this: dbt run --vars '{"start_time": "{{job.start_time.[timestamp_ms]}}"}'. You will need to modify the 3rd dbt command in your job.
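One detail worth spelling out: values passed with --vars are consumed in model SQL through Jinja's var() function, i.e. the model should reference {{ var('start_time') }} rather than a bare {start_time} placeholder, which is not dbt syntax. A small stdlib sketch of that substitution (the regex-based render() below only mimics dbt's Jinja rendering so the idea is runnable here; the SQL and timestamp are illustrative):

```python
import re

# Model SQL as dbt expects it: the variable is read via {{ var('name') }}.
sql_model = "select 1 as id union all select {{ var('start_time') }} as id"
cli_vars = {"start_time": "1736899200000"}  # e.g. job.start_time in ms

def render(template: str, variables: dict) -> str:
    # Replace each {{ var('name') }} occurrence with its supplied value,
    # standing in for dbt's real Jinja rendering step.
    pattern = r"\{\{\s*var\(\s*'(\w+)'\s*\)\s*\}\}"
    return re.sub(pattern, lambda m: str(variables[m.group(1)]), template)

rendered = render(sql_model, cli_vars)
```

So the fix has two halves: pass --vars on the dbt command, and reference the variable in SQL with var().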

1 More Replies
Akshith_Rajesh
by > New Contributor III
  • 1718 Views
  • 1 reply
  • 0 kudos

Get the Thrift hive.metastore.uri for Databricks Unity Catalog

I am trying to connect to Unity Catalog metastore tables using Presto. Based on the Presto documentation, I need to use the below configuration to connect to Delta tables in Unity Catalog: https://prestodb.io/docs/current/connector/hive.html So from...

Latest Reply
MathieuDB
Databricks Employee
  • 0 kudos

Hello @Akshith_Rajesh, In order to connect Trino to Unity Catalog HMS, please use this configuration instead: hive.metastore.uri=https://<DATABRICKS_HOST>:443/api/2.0/unity-hms-proxy/metadata hive.metastore.http.client.bearer-token=${ENV:DATABRICKS_T...

dfrozo
by Databricks Employee
  • 1194 Views
  • 1 reply
  • 1 kudos

Enterprise-wide data governance integrations with Unity Catalog

Unity Catalog is the foundation of all data governance-related aspects of the Databricks Data and Intelligence platform. It provides a unified, centralized solution for key enterprise capabilities, including data discoverability, data lineage, auditi...

Latest Reply
IntellaNOVA
  • 1 kudos

Hi Dfrozo, Great article. I liked how you wrote and explained everything in a very simple manner. I wish to read more from you. One comment: one of the links in the blog is broken, in the sentence: "Privacera: Integrates with Databricks Unity Catalog...

neeth
by > New Contributor
  • 141 Views
  • 7 replies
  • 0 kudos

Databricks Connect error

Hello, I am new to Databricks and Scala. I created a Scala application on my local machine and tried to connect to my cluster in a Databricks workspace using Databricks Connect, as per the documentation. My cluster is using Databricks Runtime version 16.0 ...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Can you try creating another profile instead of the default one and try with it? It seems that what it is not collecting is the cluster details, but I wanted to check with a new profile.

6 More Replies
colospring
by > New Contributor
  • 903 Views
  • 2 replies
  • 0 kudos

create_feature_table returns error saying database does not exist while it does

Hi, I am new to Databricks and I am taking the training course on Databricks machine learning: https://www.databricks.com/resources/webinar/azure-databricks-free-training-series-asset4-track/thank-you. When executing the code to create a feature tabl...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

What would be the result if instead of using ' ' you use ` `? 

1 More Replies
rpshgupta
by > New Contributor III
  • 181 Views
  • 7 replies
  • 1 kudos

How to find the source code for the data engineering learning path?

Hi Everyone, I am taking the data engineering learning path on customer-academy.databricks.com. I am not able to find any source code attached to the course. Can you please help me find it so that I can try it hands-on as well? Thanks, Rupesh

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

Got it, allow me some time to look for information on this one.

6 More Replies
jeremy98
by > Contributor
  • 138 Views
  • 16 replies
  • 1 kudos

Wheel package to install in a serverless workflow

Hi guys, what is the way, through Databricks Asset Bundles, to declare a new job definition with serverless compute associated with each task of the workflow, so that inside each notebook task definition it is possible to catch the dep...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

Hi @jeremy98, I think it has to do with the serverless version being used outside the workflow versus in DABs, since the Python version changes. Please see: https://docs.databricks.com/en/release-notes/serverless/index.html - both versions have differe...

15 More Replies
BobBubble2000
by > New Contributor II
  • 3362 Views
  • 2 replies
  • 0 kudos

Who or what is System user?

I noticed in the Catalog Explorer of a Unity Catalog-integrated workspace that there is a default catalog named 'system' owned by 'System user'. Who is this system user? It is not listed in the admin dashboard of all workspace users.

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

It is a special user account created by Databricks for managing system-level operations and configurations. This user is not listed in the admin dashboard of all workspace users because it is not a regular user account but rather a system account used i...

1 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group
Featured Event

Join Us for an Exclusive Databricks Community Event in San Francisco!

Thursday, January 23, 2025

View Event

Latest from our Blog

Deep Dive - Streaming Deduplication

In this article we will cover in depth streaming deduplication using watermarking with dropDuplicates and dropDuplicatesWithinWatermark, and how they are different. This blog expects you to have a g...

  • 312 Views
  • 1 kudos

Data Engineering SQL Holiday Specials

December is the most celebrated time of year in the Data Engineering calendar as we embrace the important holiday: change freeze season.  As we come back to the office to start our new projects, I wan...

  • 2078 Views
  • 3 kudos