Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

EricMa
by New Contributor III
  • 9420 Views
  • 20 replies
  • 4 kudos

Mounting Data IOException

Hello, I am currently taking a course from Coursera for data science using SQL. For one of our assignments we need to mount some data by running a script that has been provided to us by the class. When I run the script I receive the following error. I...

[Attachments: IOException.jpg, IOException_Classroom-Setup.jpg]
Latest Reply
raghdafaris
New Contributor II
  • 4 kudos

Hello all, we came up with a solution: download the data directly instead of mounting it. The Community Edition is limited, and we don't have access to S3 unless we create our own AWS account, load the data there, and then mount our account on dat...
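For anyone hitting the same wall, the download-instead-of-mount workaround can be sketched as below. This is a minimal sketch: the URL and local paths are hypothetical placeholders, not the actual course data.

```python
import os
import urllib.request

def dest_path(url: str, dest_dir: str = "/tmp/course_data") -> str:
    """Derive a local file path from the last segment of the download URL."""
    return os.path.join(dest_dir, url.rsplit("/", 1)[-1])

def download_dataset(url: str, dest_dir: str = "/tmp/course_data") -> str:
    """Download a file to local driver storage instead of mounting S3."""
    os.makedirs(dest_dir, exist_ok=True)
    dest = dest_path(url, dest_dir)
    urllib.request.urlretrieve(url, dest)
    return dest

# In a notebook (hypothetical URL):
# path = download_dataset("https://example.com/data/flights.csv")
# df = spark.read.csv(f"file:{path}", header=True)
```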

Shravanshibu
by New Contributor III
  • 4530 Views
  • 6 replies
  • 3 kudos

Unable to install a wheel file which is in my volume to a serverless cluster

I am trying to install a wheel file which is in my volume to a serverless cluster and getting the below error. @ken @Retired_mod Note: you may need to restart the kernel using %restart_python or dbutils.library.restartPython() to use updated packages. WARN...
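For reference, the documented pattern for installing a wheel stored in a Unity Catalog volume on serverless is a notebook-scoped `%pip install` against the volume path, followed by a Python restart. The catalog, schema, volume, and wheel names below are illustrative placeholders, not the poster's actual paths.

```python
# In a notebook cell (paths are placeholders):
# %pip install /Volumes/<catalog>/<schema>/<volume>/my_pkg-1.0-py3-none-any.whl

# Then restart Python so the updated package is picked up:
# %restart_python
```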

Latest Reply
tom-ph
New Contributor II
  • 3 kudos

Same issue here. Any solution?

Sangamswadik
by New Contributor III
  • 1978 Views
  • 4 replies
  • 1 kudos

Resolved! Agents and Inference table errors

Hi, I'm trying to deploy a RAG model from GCP Databricks. I've added an external GPT-4o endpoint and enabled the inference table in settings. But when I'm trying to deploy agents I'm still getting the "inference table not enabled" error. (I've registered the...

[Attachments: Sangamswadik_0-1740977753362.png, Sangamswadik_1-1740978040054.png]
Latest Reply
MariuszK
Valued Contributor III
  • 1 kudos

Model Serving is supported in your region, so it must be another problem or limitation.

Nik_Vanderhoof
by Contributor
  • 1265 Views
  • 2 replies
  • 0 kudos

Resolved! DatabricksWorkflowTaskGroup

Hello, I recently learned about the DatabricksWorkflowTaskGroup operator for Airflow that allows one to run multiple Notebook tasks on a shared job compute cluster from Airflow. Is a similar feature possible to run multiple non-Notebook tasks from Airf...

Get Started Discussions
Airflow
workflow
Latest Reply
Nik_Vanderhoof
Contributor
  • 0 kudos

Thank you!

anil_reddaboina
by New Contributor II
  • 1068 Views
  • 2 replies
  • 0 kudos

Databricks tasks are not skipping if running tasks using Airflow DatabricksworkflowTaskgroup

Currently we are facing a challenge with the below use case: the Airflow DAG has 4 tasks (Task1, Task2, Task3 and Task4) and the dependency is Task1 >> Task2 >> Task3 >> Task4 (all tasks are spark-jar task types). In the Airflow DAG for Task2, there is ...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @anil_reddaboina, Databricks allows you to add control flow logic to tasks based on the success, failure, or completion of their dependencies. This can be achieved using the "Run if" dependencies field: https://docs.databricks.com/aws/en/jobs/run-i...
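The "Run if" setting maps to the `run_if` field on a task in the Jobs API. A minimal sketch of a task definition using it, assuming the task names and jar details are illustrative (whether Airflow's DatabricksWorkflowTaskGroup passes this field through is a separate question):

```python
# Sketch of a Jobs API task payload; names and main class are placeholders.
task4 = {
    "task_key": "Task4",
    "depends_on": [{"task_key": "Task3"}],
    # Run this task even if upstream tasks were skipped or failed;
    # other documented values include ALL_SUCCESS (default) and NONE_FAILED.
    "run_if": "ALL_DONE",
    "spark_jar_task": {"main_class_name": "com.example.Main"},
}
```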

shkelzeen
by New Contributor II
  • 2763 Views
  • 3 replies
  • 1 kudos

Databricks JDBC driver multi query in one request.

Can I run multiple queries in one command using the Databricks JDBC driver, and would Databricks execute one query faster than running multiple queries in one script?

Latest Reply
NandiniN
Databricks Employee
  • 1 kudos

Yes, you can run multiple queries in one command using the Databricks JDBC driver. The results will be displayed in separate tables. When you run multiple queries, they are all still individual queries. Running multiple queries in a script will no...
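In other words, a script sent through the driver is still executed statement by statement. The splitting can be sketched naively as below (a real SQL tokenizer must also respect semicolons inside quoted strings, which this simple version does not):

```python
def split_statements(script: str) -> list:
    """Naively split a SQL script on semicolons into individual statements.

    Caveat: does not handle semicolons inside string literals or comments.
    """
    return [s.strip() for s in script.split(";") if s.strip()]

stmts = split_statements("SELECT 1; SELECT 2;")
# Each statement is then submitted as its own query, so batching them in
# one script does not make any individual query run faster.
```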

Arindam19
by New Contributor II
  • 895 Views
  • 3 replies
  • 0 kudos

Are row filters and column masks supported on foreign catalogs in Azure Databricks Unity Catalog?

In my solution I am planning to bring an Azure SQL Database into Azure Databricks Unity Catalog as a foreign catalog. Are table row filters and column masks supported in my scenario?

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @Arindam19, Yes. Certain operations, including filtering, can be pushed down from Databricks to SQL Server. This is managed by querying the SQL Server directly via a federated connection, allowing SQL Server to handle the filter criteria and retur...

KaustubhShah
by New Contributor
  • 706 Views
  • 1 reply
  • 0 kudos

GCP Databricks Spark Connector for Cassandra - Error: com.typesafe.config.impl.ConfigImpl.newSimple

Hello, I am using Databricks runtime 12.2 with the Spark connector com.datastax.spark:spark-cassandra-connector_2.12:3.3.0, as runtime 12.2 comes with Spark 3.3.2 and Scala 2.12. I encounter an issue with connecting to the Cassandra DB using the below co...

Latest Reply
cgrant
Databricks Employee
  • 0 kudos

Try using the assembly version of the jar with 12.2: https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector-assembly. If this doesn't work, please paste the full, original stacktrace.

mrstevegross
by Contributor III
  • 2795 Views
  • 6 replies
  • 0 kudos

Resolved! Is it possible to obtain a job's event log via the REST API?

Currently, to investigate job performance, I can look at a job's information (via the UI) to see the "Event Log" (pictured below):I'd like to obtain this information programmatically, so I can analyze it across jobs. However, the docs for the `get` c...

[Attachment: mrstevegross_0-1736967992555.png]
Latest Reply
mrstevegross
Contributor III
  • 0 kudos

I also see there is a "list cluster events" API (https://docs.databricks.com/api/workspace/clusters/events); can I get the event log this way?
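For what it's worth, that cluster-events endpoint takes a POST with a `cluster_id` and optional filters. A hedged sketch of building the request with only the standard library (the workspace URL, token, and cluster ID are placeholders, and the API version prefix may differ from the `2.0` shown here):

```python
import json
import urllib.request

def events_request(host: str, token: str, cluster_id: str,
                   limit: int = 50) -> urllib.request.Request:
    """Build a POST request for the clusters/events endpoint."""
    payload = {"cluster_id": cluster_id, "limit": limit}
    return urllib.request.Request(
        f"{host}/api/2.0/clusters/events",
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {token}"},
        method="POST",
    )

# req = events_request("https://<workspace>.cloud.databricks.com",
#                      "<token>", "<cluster-id>")
# events = json.load(urllib.request.urlopen(req))["events"]
```

Note this returns cluster lifecycle events (resizes, terminations, etc.), which overlaps with, but is not identical to, the job-run event log shown in the UI.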

crowley
by New Contributor III
  • 4706 Views
  • 2 replies
  • 1 kudos

Resolved! How are Struct type columns stored/accessed (interested in efficiency)?

Hello, I've searched around for a while and didn't find a similar question here or elsewhere, so thought I'd ask... I'm assessing the storage/access efficiency of Struct type columns in Delta tables. I want to know more about how Databricks is storing...

Latest Reply
crowley
New Contributor III
  • 1 kudos

Thank you very much for the thoughtful response. Please excuse my belated feedback, and thanks!

pardeep7
by New Contributor II
  • 1085 Views
  • 3 replies
  • 0 kudos

Databricks Clean Rooms with 3 or more collaborators

Let's say I create a clean room with 2 other collaborators, call them collaborator A and collaborator B (so 3 in total, including me), and then share some tables to the clean room. If collaborator A writes code that does a "SELECT * FROM creator.<tab...

Latest Reply
KaranamS
Contributor III
  • 0 kudos

Hi @pardeep7, as per my understanding, all participants of a clean room can only see metadata. The raw data in your tables is not directly accessed by other collaborators. Any output tables created by collaborators based on the queries/notebooks will b...

harsh_Dev
by New Contributor III
  • 1206 Views
  • 2 replies
  • 1 kudos

Resolved! Connect databricks community edition to datalake s3/adls2

Does anybody know how I can connect to AWS S3 object storage from Databricks Community Edition? Is this possible with a Community account or not?

Latest Reply
KaranamS
Contributor III
  • 1 kudos

Hi @harsh_Dev, you can read from/write to AWS S3 with Databricks Community Edition. As you will not be able to use instance profiles, you will need to configure the AWS credentials manually and access S3 using an S3 URI. Try the below code: spark._jsc.hadoop...
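A sketch of that manual-credential approach is below. The keys, bucket, and path are placeholders, and the `spark` session is assumed to already exist in a Databricks notebook:

```python
def s3a_credentials(access_key: str, secret_key: str) -> dict:
    """Hadoop configuration keys for S3A access with static credentials."""
    return {
        "fs.s3a.access.key": access_key,
        "fs.s3a.secret.key": secret_key,
    }

# In a notebook (placeholder values):
# for k, v in s3a_credentials("<ACCESS_KEY>", "<SECRET_KEY>").items():
#     spark._jsc.hadoopConfiguration().set(k, v)
# df = spark.read.csv("s3a://<bucket>/path/data.csv", header=True)
```

Keep in mind that hard-coding keys in a notebook is only reasonable for throwaway Community Edition experiments.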

AGnewbie
by New Contributor
  • 692 Views
  • 1 reply
  • 1 kudos

Required versus current compute setup

To run demo and lab notebooks, I am required to have the following Databricks runtime(s): 15.4.x-cpu-ml-scala2.12, but the compute in my setup is on the following runtime version; will that be an issue? 11.3 LTS (includes Apache Spark 3.3.0, Scala 2.1...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

Hello @AGnewbie, regarding the Databricks runtime: your compute setup is currently running version 11.3 LTS, which will indeed be an issue, as the specified version is not present in your current runtime. Hence, you need to update your runtim...

Boyeenas
by New Contributor
  • 3610 Views
  • 1 reply
  • 0 kudos

Decimal(32,6) datatype in Databricks - precision roundoff

Hello all, I need your assistance. I recently started a migration project from Synapse Analytics to Databricks. While dealing with the datatypes, I came across a situation where in the Dedicated SQL Pool the value is 0.033882, but in Databricks the value ...

Latest Reply
KaranamS
Contributor III
  • 0 kudos

Hi @Boyeenas, I believe your assumption is correct. Databricks is built on Apache Spark, and the system applies rounding automatically based on the value of the subsequent digit. In your case, if the original value had a 7th decimal digit of 5 or high...
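The difference is easy to reproduce with Python's `decimal` module. The 7-digit source value below is illustrative (we don't know the poster's actual raw value); it just shows half-up rounding versus truncation at six decimal places:

```python
from decimal import Decimal, ROUND_DOWN, ROUND_HALF_UP

# Hypothetical raw value with a 7th decimal digit of 5:
original = Decimal("0.0338825")
six_places = Decimal("0.000001")

# Half-up rounding (Spark-style) bumps the 6th digit:
print(original.quantize(six_places, rounding=ROUND_HALF_UP))  # 0.033883

# Truncation keeps the first six digits as-is:
print(original.quantize(six_places, rounding=ROUND_DOWN))     # 0.033882
```

So a 0.033882 vs 0.033883 mismatch between the two systems is consistent with one side rounding and the other truncating.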

