Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Boban12335
by New Contributor
  • 174 Views
  • 1 reply
  • 0 kudos

Unity Catalog tool function with custom parameters not being used

I have created a UC tool that takes in a few custom STRING parameters. I gave this tool to an AI agent using the Mosaic AI Agent Framework with hardcoded parameter values for testing. The issue is my AI agent hallucinates and injects its own AI-gener...

Latest Reply
Nivethan_Venkat
Contributor III
  • 0 kudos

Hi @Boban12335, can we get the UC function definition to understand your problem better? Best regards, Nivethan V

ChristianRRL
by Valued Contributor III
  • 301 Views
  • 3 replies
  • 3 kudos

Resolved! AutoLoader - Write To Console (Notebook Cell) Long Running Issue

Hi there, I am likely misunderstanding how to use AutoLoader properly while developing/testing. I am trying to write a simple AutoLoader notebook cell to *read* the contents of a path with JSON files, and *write* them to console (i.e. notebook cell) i...

Latest Reply
SP_6721
Contributor III
  • 3 kudos

Hi @ChristianRRL, it looks like spark.readStream with Auto Loader creates a continuous streaming job by default, which means it keeps running while waiting for new files. To avoid this, you can control the behaviour using trigger(availableNow=True), w...
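For anyone hitting the same thing, a minimal sketch of the batch-style trigger described above; the landing, schema, and checkpoint paths are placeholders:

```python
# Auto Loader source over a folder of JSON files (placeholder paths).
df = (spark.readStream
      .format("cloudFiles")                                # Auto Loader
      .option("cloudFiles.format", "json")                 # input file format
      .option("cloudFiles.schemaLocation", "/tmp/schema")  # schema tracking
      .load("/tmp/landing"))

# availableNow=True processes all files present and then stops,
# instead of running indefinitely while waiting for new arrivals.
(df.writeStream
   .format("console")                                      # print to the notebook cell
   .option("checkpointLocation", "/tmp/checkpoint")
   .trigger(availableNow=True)
   .start()
   .awaitTermination())
```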

2 More Replies
Lucas_N
by New Contributor II
  • 2829 Views
  • 2 replies
  • 3 kudos

Resolved! Documentation for spatial SQL public preview - Where is it?

Hi everybody, since DBR 17.1, spatial SQL functions (st_point(), st_distancesphere(), ...) are in public preview. The functionality is presented in this talk, Geospatial Insights With Databricks SQL: Techniques and Applications, or discussed here in the fo...

Latest Reply
Geospatial_Gwen
New Contributor III
  • 3 kudos

Is this what you were after? https://docs.databricks.com/aws/en/sql/language-manual/sql-ref-st-geospatial-functions
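For readers landing here later, a rough sketch of the kind of call those docs cover, using the functions named in the question; the (longitude, latitude) argument order is an assumption to verify against the reference page:

```python
# Smoke test for the ST functions in the geospatial public preview (DBR 17.1+).
spark.sql("""
  SELECT st_distancesphere(
           st_point(13.4050, 52.5200),   -- Berlin (lon, lat assumed)
           st_point(2.3522, 48.8566)     -- Paris  (lon, lat assumed)
         ) AS distance_in_meters
""").show()
```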

1 More Replies
Danish1105
by New Contributor II
  • 231 Views
  • 1 reply
  • 1 kudos

Resolved! Run_type has some null values

Just wondering — we know that the run_type column in the job run timeline usually has only three values: JOB_RUN, SUBMIT_RUN, and WORKFLOW_RUN. So why do we also see a null value there? Any reason?  

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @Danish1105, one possible explanation for the null values is the following note in the documentation: "Not populated for rows emitted before late August 2024." In the case of my workspace, this seems valid. I have only nulls wh...
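A quick way to check whether that explanation holds in your own workspace; a sketch that assumes access to the system.lakeflow.job_run_timeline system table discussed in the thread:

```python
# If the docs' note applies, the null run_type rows should all
# predate late August 2024.
spark.sql("""
  SELECT run_type,
         MIN(period_start_time) AS earliest_row,
         MAX(period_start_time) AS latest_row,
         COUNT(*)               AS row_count
  FROM system.lakeflow.job_run_timeline
  GROUP BY run_type
""").show(truncate=False)
```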

ivan7256
by New Contributor II
  • 974 Views
  • 3 replies
  • 2 kudos

PERMISSION_DENIED: Cannot access Spark Connect. when trying to run serverless databricks connect

I am not able to run a file as "run as workflow" nor "run with databricks connect" when I choose a serverless run on my paid account. However, I can perform this action in my Free Edition account. See error: pyspark.errors.exceptions.connect.SparkCon...

Latest Reply
SP_6721
Contributor III
  • 2 kudos

Hi @ivan7256, this might be because serverless compute isn't enabled for workflows in your paid workspace.
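One way to make the failure mode explicit is to request serverless compute directly; a sketch assuming a recent databricks-connect (15.1+) and a configured authentication profile:

```python
from databricks.connect import DatabricksSession

# Explicitly request serverless compute; if serverless isn't enabled
# for the workspace, session creation fails with a permission error
# like the one in the post.
spark = DatabricksSession.builder.serverless(True).getOrCreate()
spark.range(3).show()  # trivial query to confirm the session works
```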

2 More Replies
devdbk
by New Contributor II
  • 557 Views
  • 3 replies
  • 5 kudos

Databricks Free Edition Needs Transparency About Data Access

When I first discovered the Databricks Free Edition, I thought it was a generous offering for data enthusiasts, researchers, and developers who just needed a personal sandbox. No cost. Easy setup. Promises of productivity. But what caught me off guar...

Latest Reply
devdbk
New Contributor II
  • 5 kudos

Thanks again for all the perspectives shared so far. I want to re-emphasize that the Databricks Free Edition offers real value. For data enthusiasts, learners, and builders, it’s a genuinely powerful environment to get hands-on without jumping throug...

2 More Replies
florianb
by New Contributor III
  • 4522 Views
  • 3 replies
  • 8 kudos

Resolved! Rss feeds for databricks releases

Hi, are there any RSS feeds for the Databricks platform, SQL & runtime releases? We have a big tech stack, so it is sometimes hard to keep up with the ever-changing technologies. We are using RSS feeds to keep up with all of that. Can't find anything for...

Latest Reply
kerem
Contributor
  • 8 kudos

Databricks recently published an RSS feed for all their updates. As far as I can find, it is only for AWS at the moment: https://docs.databricks.com/aws/en/feed.xml
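If you'd rather poll the feed from code than from a reader, a small sketch using the third-party feedparser package (an assumption; any RSS client works):

```python
import feedparser  # third-party: pip install feedparser

feed = feedparser.parse("https://docs.databricks.com/aws/en/feed.xml")
for entry in feed.entries[:5]:  # five most recent items
    print(entry.title, "-", entry.link)
```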

2 More Replies
mano7438
by New Contributor III
  • 79172 Views
  • 7 replies
  • 7 kudos

Resolved! How to create temporary table in databricks

Hi Team, I have a requirement where I need to create a temporary table, not a temporary view. Can you tell me how to create a temporary table in Databricks?

Latest Reply
NandiniN
Databricks Employee
  • 7 kudos

I see, thanks for sharing. Can you mark the solution which worked for you, @abueno, as Accepted?
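For anyone skimming this thread for the outcome: the usual stand-in for a temp table on Databricks is a session-scoped temporary view (an assumption about which answer was accepted here); a sketch:

```python
# A temporary view is queryable like a table and disappears when the
# Spark session ends, which covers most "temporary table" needs.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])
df.createOrReplaceTempView("my_temp")
spark.sql("SELECT * FROM my_temp").show()
```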

6 More Replies
holunder
by New Contributor
  • 445 Views
  • 1 reply
  • 0 kudos

CLI: Export-dir provides LatestClone

Hi everyone, I want to download the current Databricks codebase out of a workspace and tried via databricks workspace export-dir /Sandbox/foo . Surprisingly, some of the subfolders appear twice in the export target: one with the expected name (`...

Latest Reply
SP_6721
Contributor III
  • 0 kudos

Hi @holunder, this could be because the backend stores both the original and cloned versions of folders, even if only one appears in the web UI. The Databricks CLI exports everything from the backend, not just what's visible in the UI.

Akira
by New Contributor II
  • 3455 Views
  • 5 replies
  • 1 kudos

"PutWithBucketOwnerFullControl" privilege missing for storage configuration

Hi. I've been unable to create workspaces manually for a while now. The error I get is "MALFORMED_REQUEST: Failed storage configuration validation checks: List,Put,PutWithBucketOwnerFullControl,Delete".  The storage configuration is on a bucket that ...

Latest Reply
Rahul14Gupta
New Contributor III
  • 1 kudos

I faced the same issue because I had created the bucket in the wrong region.

4 More Replies
Saubhik
by New Contributor II
  • 606 Views
  • 3 replies
  • 0 kudos

Getting [08S01/500593] Can't connect to database - [Databricks][JDBCDriver](500593) Communication

I am getting the below error when connecting to a Databricks instance using the JDBC driver. ERROR: [08S01/500593] Can't connect to database - [Databricks][JDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: HTTP Response code: 401, ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @Saubhik, you're trying to connect DBeaver to Databricks? If so, yes, it is possible. Here's a detailed guide on how to do that: DBeaver integration with Azure Databricks - Azure Databricks | Microsoft Learn. It doesn't look like a network issue. JDBC cl...
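Since a 401 points at authentication rather than networking, it's worth re-checking the JDBC URL shape; a sketch of the documented personal-access-token form, with placeholder host, HTTP path, and token:

```python
# Build the Databricks JDBC URL with PAT auth (AuthMech=3, UID must be
# the literal string "token"); paste the result into DBeaver.
host = "dbc-xxxxxxxx-xxxx.cloud.databricks.com"      # placeholder
http_path = "/sql/1.0/warehouses/xxxxxxxxxxxxxxxx"   # placeholder
pat = "dapi..."                                      # placeholder token

jdbc_url = (
    f"jdbc:databricks://{host}:443;transportMode=http;ssl=1;"
    f"httpPath={http_path};AuthMech=3;UID=token;PWD={pat}"
)
print(jdbc_url)
```

A 401 with this driver usually means an expired or mistyped token, or a UID other than the literal "token".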

2 More Replies
Rishabh_Tiwari
by Databricks Employee
  • 8431 Views
  • 12 replies
  • 35 kudos

Are you part of a Databricks user group in your city/region?

Joining a regional user group is a great way to connect with data and AI professionals near you. These groups bring together practitioners to learn, network, and share real-world experiences — all within your local context. To join a group: Select y...

Latest Reply
junaid-databrix
New Contributor III
  • 35 kudos

@Rishabh_Tiwari appreciate this initiative! I think user groups are the best way to bring together the community as well as to learn, share and grow. I would like to start a local user group in my city since there is none already. Could you please gu...

11 More Replies
prakashhinduja
by New Contributor III
  • 348 Views
  • 1 reply
  • 0 kudos

Prakash Hinduja Geneva (Swiss) fix access denied issues when using DBFS in Databricks Community?

Hello Community, I’m Prakash Hinduja, a financial strategist residing in Geneva, Switzerland (Swiss). My primary focus is on supporting Swiss businesses by crafting tailored financial strategies. These strategies attract global investments and foster...

Latest Reply
Khaja_Zaffer
Contributor
  • 0 kudos

Hello @prakashhinduja, the DBFS file limit in Community Edition is 10 GB. Are you trying to upload more than 10 GB?

zc
by New Contributor III
  • 1597 Views
  • 11 replies
  • 4 kudos

Resolved! How to create a widget in SQL with variables?

I want to create a widget in SQL and use it in R later. Below is my code:

%sql
declare or replace date1 date = "2025-01-31";
declare or replace date2 date;
set var date2 = add_months(date1, 5);

What's the correct syntax of using date2 to create a widget? I ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 4 kudos

Hi @zc, unfortunately, I think in the case of SQL widgets the default value needs to be a string literal, so the above approach won't work. Regarding your second question about accessing variables declared in SQL in an R cell, you cannot do such a thing. Here's an e...
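A workaround consistent with that limitation is to compute the date outside SQL and hand the widget a plain string; a sketch in a Python cell (dbutils is available in notebooks, and dateutil ships with the runtime):

```python
from datetime import date
from dateutil.relativedelta import relativedelta

# Reproduce SQL's add_months(date1, 5), including end-of-month clamping:
# 2025-01-31 + 5 months -> 2025-06-30.
date1 = date(2025, 1, 31)
date2 = date1 + relativedelta(months=5)

# Widgets only take string defaults, so pass the computed value as text.
dbutils.widgets.text("date2", date2.isoformat())
print(dbutils.widgets.get("date2"))
```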

10 More Replies
