Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

NoviKamayana
by New Contributor
  • 3556 Views
  • 0 replies
  • 0 kudos

Database: Delta Lake or PostgreSQL

Hey all, I am searching for a non-political answer to my database questions. Please know that I am a data newbie and literally do not know anything about this topic, but I want to learn, so please be gentle. Some context: I am working for an OEM that...

Bhavishya
by New Contributor II
  • 3222 Views
  • 2 replies
  • 0 kudos

Databricks JDBC driver connection issue with Apache Solr

Hi, Databricks JDBC driver version: 2.6.34. I am facing the issue below when connecting to Databricks SQL from Apache Solr: Caused by: java.sql.SQLFeatureNotSupportedException: [Databricks][JDBC](10220) Driver does not support this optional feature. at com.databri...

Latest Reply
Bhavishya
New Contributor II
  • 0 kudos

The Databricks team recommended setting IgnoreTransactions=1 and autocommit=false in the connection string, but that didn't resolve the issue. Ultimately I had to use the Solr update API for uploading documents.
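For anyone landing here, a minimal sketch of what that connection-string tweak looks like; the workspace host, HTTP path, and token below are hypothetical placeholders, not values from this thread:

# Sketch only: host, http_path and token are placeholders.
host = "adb-1234567890123456.7.azuredatabricks.net"
http_path = "/sql/1.0/warehouses/abcdef0123456789"
token = "<personal-access-token>"

jdbc_url = (
    f"jdbc:databricks://{host}:443/default"
    f";transportMode=http;ssl=1;httpPath={http_path}"
    f";AuthMech=3;UID=token;PWD={token}"
    ";IgnoreTransactions=1"  # the setting the Databricks team suggested
)
# The thread also mentions autocommit=false; in JDBC that is usually set on the
# Connection object (conn.setAutoCommit(false)) rather than in the URL itself.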

1 More Replies
Stogpon
by New Contributor III
  • 5031 Views
  • 3 replies
  • 4 kudos

Resolved! Error not a delta table for Unity Catalog table

Is anyone able to advise why I am getting the error "not a delta table"? The table was created in Unity Catalog. I've also tried DeltaTable.forName, and also tried using 13.3 LTS and 14.3 LTS clusters. Any advice would be much appreciated.

[Attachments: Screenshot 2024-03-18 at 12.10.30 PM.png, Screenshot 2024-03-18 at 12.14.24 PM.png]
Latest Reply
addy
New Contributor III
  • 4 kudos

@Stogpon I believe if you are using DeltaTable.forPath then you have to pass the path where the table is. You can get this path from the Catalog; it is available in the Details tab of the table. Example: delta_table_path = "dbfs:/user/hive/warehouse/xyz...
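To make that concrete, here is a small sketch of both lookups; the table name and storage path are illustrative, not the ones from the screenshots, and it assumes the code runs on a Databricks cluster where Delta Lake is available:

from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()  # on Databricks this returns the active session

# Unity Catalog tables are usually easiest to load by their three-level name.
dt = DeltaTable.forName(spark, "main.default.my_table")  # hypothetical catalog.schema.table

# forPath needs the storage location shown in the table's Details tab.
dt_by_path = DeltaTable.forPath(
    spark,
    "abfss://container@storageaccount.dfs.core.windows.net/path/to/table",  # hypothetical path
)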

2 More Replies
sarvar-anvarov
by New Contributor II
  • 2397 Views
  • 4 replies
  • 3 kudos

BAD_REQUEST: ExperimentIds cannot be empty when checking ACLs in bulk

I was going through this tutorial: https://mlflow.org/docs/latest/getting-started/tracking-server-overview/index.html#method-2-start-your-own-mlflow-server. I ran the whole script, and when I try to open the experiment on the Databricks website I get t...

[Attachment: sarvaranvarov_0-1710056939049.png]
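For context, a minimal sketch of the kind of Databricks-backed tracking setup the tutorial is working toward; the experiment path is illustrative, and authentication is assumed to be already configured (DATABRICKS_HOST/DATABRICKS_TOKEN or a CLI profile):

import mlflow

# Assumes Databricks auth is configured via environment variables or ~/.databrickscfg.
mlflow.set_tracking_uri("databricks")
# Experiments logged to Databricks live under a workspace path; this one is hypothetical.
mlflow.set_experiment("/Users/someone@example.com/tracking-quickstart")

with mlflow.start_run():
    mlflow.log_param("alpha", 0.5)
    mlflow.log_metric("rmse", 0.42)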
Latest Reply
stanjs
New Contributor III
  • 3 kudos

Hi, did you resolve that? I encountered the same error.

3 More Replies
dollyb
by Contributor
  • 6831 Views
  • 1 replies
  • 0 kudos

How to detect if running in a workflow job?

Hi there, what's the best way to tell which environment my Spark session is running in? Locally I develop with databricks-connect's DatabricksSession, but that doesn't work when running a workflow job, which requires SparkSession.getOrCreate()....

Latest Reply
dollyb
Contributor
  • 0 kudos

Thanks, dbutils.notebook.getContext does indeed contain information about the job run.
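A rough sketch of that check, assuming it runs on a Databricks cluster where dbutils is available; the access path and the jobId tag are the commonly used ones, but field names are not guaranteed across runtime versions:

import json

# Pull the notebook context as JSON; inside a workflow job the tags include a jobId.
ctx = json.loads(
    dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson()
)
is_job_run = "jobId" in ctx.get("tags", {})
print("Running as a workflow job" if is_job_run else "Running interactively / via databricks-connect")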

Surajv
by New Contributor III
  • 2446 Views
  • 2 replies
  • 0 kudos

Limit the scope of workspace level access token to access only specific REST APIs of Databricks

Hi Community, is there a way to limit the scope of a workspace-level token so it can hit only certain Databricks REST APIs? In short, once we generate a workspace-level token following this doc: https://docs.databricks.com/en/dev-tools/auth/oauth-m2m....

Latest Reply
Surajv
New Contributor III
  • 0 kudos

 <Replied to previous message as response to @Retired_mod's answer> 

1 More Replies
db_allrails
by New Contributor II
  • 9404 Views
  • 2 replies
  • 1 kudos

Resolved! Configuring NCC does not show option to add private endpoints

Hi! I am following this guide: https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/serverless-private-link. However, in Step 3: Create private endpoint rules, number 6, there is no option for me to Add a private...

Latest Reply
db_allrails
New Contributor II
  • 1 kudos

@saikumar246  you were correct. It was really super easy to set up and works flawlessly! Good job dev-team!

1 More Replies
B_J_Innov
by New Contributor III
  • 3174 Views
  • 1 replies
  • 0 kudos

Make API Call to run job

Hi everyone, I want to trigger a run for a job using an API call. Here's my code:
import shlex
import subprocess
def call_curl(curl):
    args = shlex.split(curl)
    process = subprocess.Popen(args, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout...
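Since the excerpt is cut off, here is a hedged sketch of the same idea calling the Jobs API 2.1 run-now endpoint directly with the requests library instead of shelling out to curl; the host, token, and job ID are placeholders:

import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
token = "<personal-access-token>"                             # placeholder token
job_id = 123456789                                            # placeholder job ID

resp = requests.post(
    f"{host}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {token}"},
    json={"job_id": job_id},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # response includes the run_id of the triggered run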

BillGuyTheScien
by New Contributor II
  • 2681 Views
  • 1 replies
  • 0 kudos

Resolved! combining accounts

I have an AWS-based Databricks account with a few workspaces and an Azure Databricks workspace. How do I combine them into one account? I am particularly interested in setting up a single billing drop with all my Databricks costs.

Latest Reply
AlliaKhosla
Databricks Employee
  • 0 kudos

Hi @BillGuyTheScien, greetings! Currently, we do not have a feature to combine usage across multiple clouds into a single account. We do have a feature request for this, and it is being considered for the future; there is currently no ETA. You can bro...

anushajalesh28
by New Contributor II
  • 3835 Views
  • 1 replies
  • 0 kudos

Catalog issue

When I was trying to create a catalog, I got an error saying to specify the Azure storage account and storage container in the following query: CREATE CATALOG IF NOT EXISTS Databricks_Anu_Jal_27022024 MANAGED LOCATION 'abfss://<databricks-workspace-stack-anu...
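The statement in the excerpt has lost its whitespace; a sketch of the general shape, with placeholder storage values (this assumes a Unity Catalog external location and storage credential already cover the abfss path):

# Assumes a SparkSession named `spark` (available by default in a Databricks notebook).
spark.sql("""
    CREATE CATALOG IF NOT EXISTS Databricks_Anu_Jal_27022024
    MANAGED LOCATION 'abfss://<container>@<storage-account>.dfs.core.windows.net/<path>'
""")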

Get Started Discussions
Azure Databricks
Surajv
by New Contributor III
  • 4674 Views
  • 0 replies
  • 0 kudos

Run spark code in notebook by setting spark conf instead of databricks connect configure in runtime

Hi community, I wanted to understand whether there is a way to pass config values to the Spark session at runtime rather than using databricks-connect configure to run Spark code. One way I found is given here: https://stackoverflow.com/questions/63088121/config...
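One pattern that avoids the interactive databricks-connect configure step is to build the session programmatically; a sketch assuming Databricks Connect v13+ (the host, token, and cluster ID are placeholders, and the exact builder options may differ by version):

from databricks.connect import DatabricksSession

# Placeholders; in practice these would come from environment variables or a secrets store.
host = "https://adb-1234567890123456.7.azuredatabricks.net"
token = "<personal-access-token>"
cluster_id = "0123-456789-abcdefgh"

spark = (
    DatabricksSession.builder
    .remote(host=host, token=token, cluster_id=cluster_id)
    .getOrCreate()
)
print(spark.range(5).count())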

Flachboard84
by New Contributor II
  • 3859 Views
  • 4 replies
  • 1 kudos

sparkR.session

Why might this be erroring out? My understanding is that SparkR is built into Databricks.
Code:
library(SparkR, include.only=c('read.parquet', 'collect'))
sparkR.session()
Error:
Error in sparkR.session(): could not find function "sparkR.session"

Latest Reply
Flachboard84
New Contributor II
  • 1 kudos

It happens with any code, even something as simple as... x <- 2 + 2

3 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group