Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Bhavishya
by New Contributor II
  • 1714 Views
  • 3 replies
  • 0 kudos

Resolved! Databricks JDBC driver connection issue with Apache Solr

Hi, Databricks JDBC version: 2.6.34. I am facing the issue below when connecting to Databricks SQL from Apache Solr: Caused by: java.sql.SQLFeatureNotSupportedException: [Databricks][JDBC](10220) Driver does not support this optional feature. at com.databri...

Latest Reply
Bhavishya
New Contributor II
  • 0 kudos

The Databricks team recommended setting IgnoreTransactions=1 and autocommit=false in the connection string, but that didn't resolve the issue. Ultimately I had to use the Solr update API for uploading documents.
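For reference, those settings can be sketched as connection-string properties. This is a minimal sketch; the host, port, and HTTP transport values below are placeholders, not values from the thread:

```python
# Sketch: appending the recommended properties to a Databricks JDBC URL.
# The host and base URL below are placeholders.
def with_properties(base_url: str, props: dict) -> str:
    """Append key=value; JDBC properties to a semicolon-delimited URL."""
    return base_url + "".join("{0}={1};".format(k, v) for k, v in props.items())

base = "jdbc:databricks://example.cloud.databricks.com:443/default;transportMode=http;"
url = with_properties(base, {"IgnoreTransactions": "1", "AutoCommit": "0"})
```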

2 More Replies
ChristianRRL
by Contributor
  • 637 Views
  • 1 reply
  • 0 kudos

Auto-Update API Data

Not sure if this has come up before, but I'm wondering if Databricks has any kind of functionality to "watch" an API call for changes? E.g., currently I have a frequently running job that pulls data via an API call and overwrites the old data. This see...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @ChristianRRL, Databricks provides a REST API that allows you to interact with various aspects of your Databricks workspace programmatically. While there isn’t a direct built-in feature to “watch” an API call for changes, you can design a solut...
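One way to design such a solution, along the lines the reply suggests, is to poll the API on a schedule and fingerprint each response so unchanged payloads can be skipped. A minimal sketch (the payload shape and persistence of the fingerprint between runs are assumptions, not part of any Databricks feature):

```python
import hashlib
import json
from typing import Optional

def payload_fingerprint(payload) -> str:
    """Stable SHA-256 of a JSON-serializable API response body."""
    canonical = json.dumps(payload, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()

def has_changed(payload, last_fingerprint: Optional[str]) -> bool:
    """True when the response differs from the previously stored fingerprint."""
    return payload_fingerprint(payload) != last_fingerprint

# A scheduled Databricks job could persist the fingerprint between runs
# and only overwrite the stored data when has_changed(...) is True.
```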

Stogpon
by New Contributor III
  • 1417 Views
  • 4 replies
  • 4 kudos

Resolved! Error not a delta table for Unity Catalog table

Is anyone able to advise why I am getting the error "not a delta table"? The table was created in Unity Catalog. I've also tried DeltaTable.forName, and using 13.3 LTS and 14.3 LTS clusters. Any advice would be much appreciated.

Latest Reply
addy
New Contributor III
  • 4 kudos

@Stogpon I believe if you are using DeltaTable.forPath then you have to pass the path where the table is. You can get this path from the Catalog; it is available in the Details tab of the table. Example: delta_table_path = "dbfs:/user/hive/warehouse/xyz...
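The distinction the reply draws — forPath wants a storage path, forName wants a catalog table name — can be sketched with a small heuristic. The path prefixes below are common storage schemes and are my assumption, not from the thread:

```python
def is_storage_path(identifier: str) -> bool:
    """Heuristic: storage paths carry a scheme or a leading slash,
    while Unity Catalog names look like catalog.schema.table."""
    return identifier.startswith(("dbfs:/", "abfss://", "s3://", "gs://", "/"))

# On a cluster one would then pick the matching constructor:
#   DeltaTable.forPath(spark, ident)  when is_storage_path(ident)
#   DeltaTable.forName(spark, ident)  otherwise
```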

3 More Replies
Surajv
by New Contributor III
  • 397 Views
  • 1 reply
  • 0 kudos

Restrict access of a user/entity to only specific Databricks REST APIs

Hi community, Assume I generate a personal access token for an entity. Post generation, can I restrict the access of the entity to specific REST APIs? In other words, consider this example where once I generate the token and set up a bearer token b...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Surajv, You can control permissions using the Permissions API. Although Personal Access Tokens (PATs) do not directly support fine-grained API restrictions, you can achieve this by carefully configuring permissions for the entity associated with ...
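The Permissions API the reply mentions takes a JSON body listing principals and permission levels. A sketch of building such a body — the principal and level here are illustrative placeholders:

```python
import json

def permissions_body(principal: str, level: str) -> str:
    """Build a request body for the Databricks Permissions API.
    The values passed in are illustrative placeholders."""
    return json.dumps({
        "access_control_list": [
            {"user_name": principal, "permission_level": level}
        ]
    })

body = permissions_body("some-user@example.com", "CAN_VIEW")
# This body would go in a PUT to /api/2.0/permissions/<object_type>/<object_id>.
```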

Ujeen
by New Contributor
  • 428 Views
  • 1 reply
  • 0 kudos

Using com.databricks:databricks-jdbc:2.6.36 inside oracle stored proc

Hi dear Databricks community, We tried to use databricks-jdbc inside an Oracle stored procedure to load something from Hive. However, Oracle marked databricks-jdbc invalid because some classes (for example com.databricks.client.jdbc42.internal.io.netty.ut...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Ujeen , When integrating Databricks JDBC with an Oracle-stored procedure to load data from Hive, encountering issues related to missing classes can be frustrating. Let’s explore some potential solutions: Check Dependencies: Ensure that all n...

sarvar-anvarov
by New Contributor II
  • 1215 Views
  • 6 replies
  • 3 kudos

BAD_REQUEST: ExperimentIds cannot be empty when checking ACLs in bulk

I was going through this tutorial: https://mlflow.org/docs/latest/getting-started/tracking-server-overview/index.html#method-2-start-your-own-mlflow-server. I ran the whole script, and when I try to open the experiment on the Databricks website I get t...

Latest Reply
stanjs
New Contributor III
  • 3 kudos

Hi, did you resolve this? I encountered the same error.

5 More Replies
Chinu
by New Contributor III
  • 826 Views
  • 1 reply
  • 0 kudos

Tableau Desktop connection error from Mac M1

Hi, I'm getting the error below while connecting to a SQL Warehouse from Tableau Desktop. I installed the latest ODBC drivers (2.7.5), but I can confirm that the driver name is different. From the error message I see libsparkodbc_sbu.dylib, but in my lap...

Latest Reply
jiro
New Contributor II
  • 0 kudos

Hello @Chinu, It looks like Tableau Desktop by default searches for /Library/simba/spark/lib/libsparkodbc_sbu.dylib, but the file in SimbaSparkODBC-2.7.7.1016-OS gets installed as libsparkodbc_sb64-universal.dylib. I was able to work around this by...
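Assuming the workaround is to alias the installed driver to the filename Tableau expects (my reading of the truncated reply, not something the post confirms), the command can be assembled like this:

```python
import os

DRIVER_DIR = "/Library/simba/spark/lib"          # directory from the post
INSTALLED = "libsparkodbc_sb64-universal.dylib"  # file the installer created
EXPECTED = "libsparkodbc_sbu.dylib"              # file Tableau searches for

def symlink_command(directory: str, installed: str, expected: str) -> str:
    """Shell command that aliases the installed driver (run with sudo on macOS)."""
    return "ln -s {0} {1}".format(
        os.path.join(directory, installed),
        os.path.join(directory, expected),
    )

cmd = symlink_command(DRIVER_DIR, INSTALLED, EXPECTED)
```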

Arnold_Souza
by New Contributor III
  • 3644 Views
  • 5 replies
  • 3 kudos

How to move a metastore to a new Storage Account in unity catalog?

Hello, I would like to change the Metastore location in Databricks Account Console. I have one metastore created that is in an undesired container/storage account. I could see that it's not possible to edit a metastore that is already created. I coul...

Latest Reply
ac0
New Contributor III
  • 3 kudos

Bumping this thread as well.

4 More Replies
dollyb
by New Contributor III
  • 4227 Views
  • 2 replies
  • 0 kudos

Resolved! How to detect if running in a workflow job?

Hi there, what's the best way to tell which environment my Spark session is running in? Locally I develop with databricks-connect's DatabricksSession, but that doesn't work when running a workflow job, which requires SparkSession.getOrCreate()....

Latest Reply
dollyb
New Contributor III
  • 0 kudos

Thanks, dbutils.notebook.getContext does indeed contain information about the job run.
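A sketch of how such a check could be structured: the context serializes to JSON whose tags include a job id when running as a workflow job. The exact tag name is an assumption based on this thread, so verify it against your runtime:

```python
import json

def running_in_job(context_json: str) -> bool:
    """True when the serialized notebook context carries a job id tag."""
    tags = json.loads(context_json).get("tags", {})
    return "jobId" in tags

# On a cluster the context JSON would come from dbutils.notebook.getContext
# (via its JSON representation); the literals below are local stand-ins.
assert running_in_job('{"tags": {"jobId": "123"}}')
assert not running_in_job('{"tags": {}}')
```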

1 More Replies
277745
by New Contributor
  • 772 Views
  • 1 reply
  • 0 kudos

Pandas UDF max batch size not working in notebook

Hello, I am trying to set the max batch size for a pandas UDF in a Databricks notebook, but in my tests it doesn't have any effect on batch size. spark.conf.set("spark.sql.execution.arrow.enabled", "true") spark.conf.set('spark.sql.execution.arrow.maxRecordsPerBatch...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @277745, It seems you’re working with Pandas UDF in a Databricks Notebook and trying to set the maximum batch size. Let’s address your query: Setting Max Batch Size for Pandas UDF: You’ve already taken the right steps by configuring the follo...
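For intuition, maxRecordsPerBatch controls how a partition is split into Arrow batches before each pandas UDF invocation. The effect can be mimicked in pure Python — a sketch of the batching behavior, not Spark's actual implementation:

```python
def arrow_batches(rows, max_records_per_batch: int):
    """Yield rows in chunks no larger than max_records_per_batch,
    mimicking spark.sql.execution.arrow.maxRecordsPerBatch."""
    for i in range(0, len(rows), max_records_per_batch):
        yield rows[i:i + max_records_per_batch]

sizes = [len(batch) for batch in arrow_batches(list(range(10)), 4)]
# Each pandas UDF invocation would receive one such batch.
```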

Surajv
by New Contributor III
  • 730 Views
  • 3 replies
  • 0 kudos

Limit the scope of workspace level access token to access only specific REST APIs of Databricks

Hi Community, Is there a way to limit the scope of a workspace-level token to hit only certain REST APIs of Databricks? In short, once we generate a workspace-level token following this doc: https://docs.databricks.com/en/dev-tools/auth/oauth-m2m....

Latest Reply
Surajv
New Contributor III
  • 0 kudos

 <Replied to previous message as response to @Kaniz's answer> 

2 More Replies
Sardenberg
by New Contributor II
  • 1035 Views
  • 2 replies
  • 0 kudos

How to list all users from a specific group using SQL?

I want to list, using the SQL editor, all user names from a specific group. Reading the documentation, I only learned how to show groups or users using simple filters, like: SHOW GROUPS LIKE '*XPTO*'; SHOW GROUPS WITH USER `test@gmail.com`; SHOW USERS L...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Sardenberg,  To retrieve a list of users from a specific group using SQL, you can follow these steps:   Assumptions: Let’s assume you have three tables: USERS, GROUPS, and GROUP_USERS.The USERS table contains user information.The GROUPS table con...
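The join the reply describes can be illustrated in miniature. Note that USERS, GROUPS, and GROUP_USERS are hypothetical tables assumed by the reply, not Databricks built-ins, and the rows below are made-up sample data:

```python
# In-memory stand-ins for the three hypothetical tables.
users = [{"id": 1, "name": "alice"}, {"id": 2, "name": "bob"}]
groups = [{"id": 10, "name": "XPTO"}]
group_users = [{"group_id": 10, "user_id": 1}]

def users_in_group(group_name: str):
    """Equivalent of joining GROUP_USERS to USERS, filtered by GROUPS.name."""
    gid = {g["name"]: g["id"] for g in groups}.get(group_name)
    member_ids = {gu["user_id"] for gu in group_users if gu["group_id"] == gid}
    return sorted(u["name"] for u in users if u["id"] in member_ids)
```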

1 More Replies
Mal
by New Contributor
  • 1140 Views
  • 1 reply
  • 0 kudos

FileNotFoundError: [Errno 2] No such file or directory: 'ffmpeg'

import whisper
import ffmpeg
model = whisper.load_model("base")
transcription = model.transcribe("dbfs:/FileStore/Call_Center_Conversation__03.mp3")
print(transcription["text"])
FileNotFoundError: [Errno 2] No such file or directory: 'ffmpeg'
I have import...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Mal, It appears you're facing a FileNotFoundError with ffmpeg while using OpenAI Whisper. To troubleshoot, first try installing ffmpeg as a Python module with `pip install ffmpeg` in your Python environment. Next, ensure ffmpeg is accessible by r...
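Separately from installing the ffmpeg binary, note that local tools can't read dbfs:/ URIs directly; on Databricks the file is typically reachable through the /dbfs FUSE mount (an assumption worth checking in your workspace). A sketch of the path mapping:

```python
def to_local_path(dbfs_uri: str) -> str:
    """Map a dbfs:/ URI onto the /dbfs FUSE mount that local tools
    like ffmpeg and whisper can open."""
    prefix = "dbfs:/"
    if dbfs_uri.startswith(prefix):
        return "/dbfs/" + dbfs_uri[len(prefix):]
    return dbfs_uri

local = to_local_path("dbfs:/FileStore/Call_Center_Conversation__03.mp3")
# model.transcribe(local) would then receive a path ffmpeg can actually read.
```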

db_allrails
by New Contributor II
  • 3807 Views
  • 2 replies
  • 1 kudos

Resolved! Configuring NCC does not show option to add private endpoints

Hi! I am following this guide: https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/serverless-private-link. However, in Step 3: Create private endpoint rules, number 6, there is no option for me to Add a private...

Latest Reply
db_allrails
New Contributor II
  • 1 kudos

@saikumar246  you were correct. It was really super easy to set up and works flawlessly! Good job dev-team!

1 More Replies
TestuserAva
by New Contributor II
  • 2231 Views
  • 7 replies
  • 2 kudos

Getting HTML sign-in page as API response from Databricks API with status code 200

Response:<!doctype html><html><head>    <meta charset="utf-8" />    <meta http-equiv="Content-Language" content="en" />    <title>Databricks - Sign In</title>    <meta name="viewport" content="width=960" />    <link rel="icon" type="image/png" href="...

Latest Reply
SJR
New Contributor III
  • 2 kudos

Hello @Abhishek10745, It was just like you said! We have a completely private instance of Databricks, and the DevOps pipeline I was using didn't have access to the private vnet. Switching pools solved the problem. Thanks for all the help!

6 More Replies