Get Started Discussions

Forum Posts

Floody
by New Contributor II
  • 910 Views
  • 1 reply
  • 0 kudos

New draft for every post I visit

When I visit my profile page, under the drafts section I see an entry for every post I visit in the discussions. Is this normal?

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Floody, Yes, it is normal to see an entry for every post you visit in the discussions under the drafts section of your profile page. This feature allows you to easily access and continue working on drafts of posts that you have started or viewed ...

NoviKamayana
by New Contributor
  • 767 Views
  • 1 reply
  • 1 kudos

Database: Delta Lake or PostgreSQL

Hey all, I am searching for a non-political answer to my database questions. Please know that I am a data newbie and literally do not know anything about this topic, but I want to learn, so please be gentle. Some context: I am working for an OEM that...

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @NoviKamayana, Let’s dive into the world of data storage and explore the differences between Delta Lake and PostgreSQL. As a data newbie, you’re on the right track to learn more about these concepts. Data Lake vs. Delta Lake Benefits and Limit...

mohaimen_syed
by New Contributor III
  • 3055 Views
  • 5 replies
  • 1 kudos

Fuzzy Match on PySpark using UDF/Pandas UDF

I'm trying to do fuzzy matching on two dataframes by cross joining them and then using a UDF for the fuzzy matching. But using both a Python UDF and a pandas UDF, it's either very slow or I get an error. @pandas_udf("int") def core_match_processor(s1: pd.Ser...

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @mohaimen_syed, could you please share these details: your cluster configuration, and whether Apache Arrow optimization is enabled on your cluster.

4 More Replies
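
For readers hitting the same slowdown, here is a minimal sketch of the pattern discussed in this thread. It assumes the rapidfuzz library is installed on the cluster and two hypothetical DataFrames, left and right, each with a name column; none of these names come from the thread, and blocking/filtering the candidate pairs before scoring usually matters more than the UDF itself.

    import pandas as pd
    from pyspark.sql.functions import col, pandas_udf
    from rapidfuzz import fuzz

    @pandas_udf("int")
    def fuzzy_score(s1: pd.Series, s2: pd.Series) -> pd.Series:
        # Arrow hands each batch to pandas; the scorer itself runs row by row.
        return pd.Series(
            [int(fuzz.token_sort_ratio(a or "", b or "")) for a, b in zip(s1, s2)]
        )

    # Hypothetical DataFrames `left` and `right`, each exposing a `name` column.
    pairs = left.crossJoin(right.withColumnRenamed("name", "name_right"))
    matches = (
        pairs.withColumn("score", fuzzy_score(col("name"), col("name_right")))
        .filter(col("score") >= 90)
    )

As for the question in the reply, spark.conf.get("spark.sql.execution.arrow.pyspark.enabled") is one way to inspect the Arrow setting on the cluster.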
ntvdatabricks
by New Contributor II
  • 1145 Views
  • 2 replies
  • 1 kudos

Resolved! Okta and Unified login

Hey folks, has anyone put Databricks behind Okta and enabled Unified Login with workspaces that have a Unity Catalog metastore applied and some that don't? There are some workspaces we can't move over yet, and it isn't clear in the documentation if Unity Catalo...

Latest Reply
Walter_C
Valued Contributor II
  • 1 kudos

Yes, users should be able to use a single Okta application for all workspaces, regardless of whether the Unity Catalog metastore has been applied or not. The Unity Catalog is a feature that allows you to manage and secure access to your data across a...

1 More Reply
Khalil
by Contributor
  • 2971 Views
  • 5 replies
  • 7 kudos

Incremental ingestion of Snowflake data with Delta Live Table (CDC)

Hello, I have some data in Snowflake, and I want to apply CDC on it using Delta Live Tables, but I am having some issues. Here is what I am trying to do: @dlt.view() def table1(): return spark.read.format("snowflake").options(**opt...

Latest Reply
-werners-
Esteemed Contributor III
  • 7 kudos

The CDC for Delta Live Tables works fine for Delta tables, as you have noticed. However, it is not a full-blown CDC implementation. If you want to capture changes in Snowflake, you will have to implement some CDC method on Snowflake itself, and read...

4 More Replies
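
A sketch of the two-step pattern this reply points at, with made-up table and column names (order_id, op, updated_at are assumptions, not from the thread): a separate job lands the change rows produced by a Snowflake-side CDC mechanism (for example a Snowflake stream) into a Delta staging table such as bronze.orders_changes, and a DLT pipeline then applies them, since DLT's CDC support works well once the records are in Delta.

    import dlt
    from pyspark.sql.functions import col, expr

    @dlt.view(name="orders_changes_feed")
    def orders_changes_feed():
        # Streaming read of the Delta staging table that a separate job keeps
        # appending to with the change rows pulled from Snowflake.
        return spark.readStream.table("bronze.orders_changes")

    dlt.create_streaming_table("orders")

    dlt.apply_changes(
        target="orders",
        source="orders_changes_feed",
        keys=["order_id"],
        sequence_by=col("updated_at"),
        apply_as_deletes=expr("op = 'DELETE'"),
    )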
thethirtyfour
by New Contributor III
  • 1279 Views
  • 1 reply
  • 0 kudos

Resolved! Configure Databricks in VSCode through WSL

Hi, I am having a hard time configuring my Databricks workspace when working in VSCode via WSL. When following the steps to set up Databricks authentication, I receive the following error on step 5 of "Step 4: Set up Databricks authentication"...

Latest Reply
thethirtyfour
New Contributor III
  • 0 kudos

Scratch that, I found an alternative means of authenticating via this link: Authentication setup for the Databricks extension for Visual Studio Code.

Bhavishya
by New Contributor II
  • 1455 Views
  • 3 replies
  • 0 kudos

Resolved! Databricks JDBC driver connection issue with Apache Solr

Hi, Databricks JDBC version: 2.6.34. I am facing the below issue when connecting to Databricks SQL from Apache Solr: Caused by: java.sql.SQLFeatureNotSupportedException: [Databricks][JDBC](10220) Driver does not support this optional feature. at com.databri...

Latest Reply
Bhavishya
New Contributor II
  • 0 kudos

The Databricks team recommended setting IgnoreTransactions=1 and autocommit=false in the connection string, but that didn't resolve the issue. Ultimately I had to use the Solr update API for uploading documents.

2 More Replies
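
For anyone hitting the same exception: the settings mentioned above are connection-string properties of the Databricks (Simba) JDBC driver, and they sit alongside the usual host, path, and auth properties, roughly as in the placeholder string below. Exact property names and defaults vary by driver version, so verify against the driver documentation.

    jdbc:databricks://<server-hostname>:443/default;transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3;UID=token;PWD=<personal-access-token>;IgnoreTransactions=1

Autocommit can also be toggled from the client side via java.sql.Connection.setAutoCommit(false) if the URL property is not honored.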
ChristianRRL
by Contributor
  • 497 Views
  • 1 reply
  • 0 kudos

Auto-Update API Data

Not sure if this has come up before, but I'm wondering if Databricks has any kind of functionality to "watch" an API call for changes? E.g., currently I have a frequently running job that pulls data via an API call and overwrites the old data. This see...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @ChristianRRL, Databricks provides a REST API that allows you to interact with various aspects of your Databricks workspace programmatically. While there isn’t a direct built-in feature to “watch” an API call for changes, you can design a solut...

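
There is no built-in watcher, so one common shape for this is sketched below with a placeholder URL and table names (none from the thread), and it assumes a small ops.api_watermarks Delta table already exists: a scheduled job polls the endpoint, fingerprints the payload, and only rewrites the target table when the fingerprint changes.

    import hashlib
    import json
    import requests

    resp = requests.get("https://example.com/api/data", timeout=30)
    resp.raise_for_status()
    payload = resp.json()  # assumed to be a list of flat records

    fingerprint = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode("utf-8")
    ).hexdigest()

    previous = spark.sql(
        "SELECT fingerprint FROM ops.api_watermarks WHERE source = 'my_api'"
    ).collect()

    if not previous or previous[0]["fingerprint"] != fingerprint:
        # The payload actually changed: refresh the bronze copy and the watermark.
        spark.createDataFrame(payload).write.mode("overwrite").saveAsTable("bronze.api_data")
        spark.createDataFrame(
            [("my_api", fingerprint)], "source string, fingerprint string"
        ).write.mode("overwrite").saveAsTable("ops.api_watermarks")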
Stogpon
by New Contributor III
  • 1113 Views
  • 4 replies
  • 2 kudos

Resolved! Error not a delta table for Unity Catalog table

Is anyone able to advise why I am getting the error "not a delta table"? The table was created in Unity Catalog. I've also tried DeltaTable.forName, and also using 13.3 LTS and 14.3 LTS clusters. Any advice would be much appreciated.

Latest Reply
addy
New Contributor III
  • 2 kudos

@Stogpon, I believe if you are using DeltaTable.forPath then you have to pass the path where the table is. You can get this path from the Catalog; it is available in the Details tab of the table. Example: delta_table_path = "dbfs:/user/hive/warehouse/xyz...

3 More Replies
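
To make the contrast in this reply concrete (the catalog, schema, table name and storage path below are placeholders): for a Unity Catalog table, DeltaTable.forName with the three-level name is usually the simpler route, while DeltaTable.forPath needs the storage location shown in the table's Details tab.

    from delta.tables import DeltaTable

    # Lookup by name, using the three-level Unity Catalog identifier.
    dt = DeltaTable.forName(spark, "main.analytics.orders")

    # Equivalent lookup by location, using the path copied from the Details tab.
    dt_by_path = DeltaTable.forPath(
        spark, "abfss://container@account.dfs.core.windows.net/path/to/orders"
    )

    dt.history(5).show()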
Surajv
by New Contributor III
  • 321 Views
  • 1 reply
  • 0 kudos

Restrict a user/entity's access to only specific Databricks REST APIs

Hi community, assume I generate a personal access token for an entity. After generation, can I restrict the entity's access to specific REST APIs? In other words, consider this example where, once I generate the token and set up a bearer token b...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Surajv, You can control permissions using the Permissions API. Although Personal Access Tokens (PATs) do not directly support fine-grained API restrictions, you can achieve this by carefully configuring permissions for the entity associated with ...

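
To make the reply's point concrete: the PAT itself is not scoped to particular REST endpoints; what the caller can do is governed by the permissions granted to that principal on each object, which you can manage through the Permissions API. A hedged sketch with a placeholder workspace URL, job ID and group name:

    import requests

    host = "https://<workspace-url>"
    headers = {"Authorization": "Bearer <admin-personal-access-token>"}

    # Inspect current permissions on job 123.
    print(requests.get(f"{host}/api/2.0/permissions/jobs/123", headers=headers).json())

    # Grant a group only CAN_VIEW on that job; members can read it through the
    # API but cannot edit or trigger it.
    requests.put(
        f"{host}/api/2.0/permissions/jobs/123",
        headers=headers,
        json={
            "access_control_list": [
                {"group_name": "analysts", "permission_level": "CAN_VIEW"}
            ]
        },
    )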
Ujeen
by New Contributor
  • 346 Views
  • 1 reply
  • 0 kudos

Using com.databricks:databricks-jdbc:2.6.36 inside an Oracle stored proc

Hi dear Databricks community, we tried to use databricks-jdbc inside an Oracle stored procedure to load something from Hive. However, Oracle marked databricks-jdbc as invalid because some classes (for example com.databricks.client.jdbc42.internal.io.netty.ut...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Ujeen, when integrating Databricks JDBC with an Oracle stored procedure to load data from Hive, encountering issues related to missing classes can be frustrating. Let's explore some potential solutions: Check dependencies: ensure that all n...

sarvar-anvarov
by New Contributor II
  • 1019 Views
  • 6 replies
  • 3 kudos

BAD_REQUEST: ExperimentIds cannot be empty when checking ACLs in bulk

I was going through this tutorial: https://mlflow.org/docs/latest/getting-started/tracking-server-overview/index.html#method-2-start-your-own-mlflow-server. I ran the whole script, and when I try to open the experiment on the Databricks website I get t...

Latest Reply
stanjs
New Contributor III
  • 3 kudos

Hi, did you resolve that? I encountered the same error.

5 More Replies
Chinu
by New Contributor III
  • 728 Views
  • 1 reply
  • 0 kudos

Tableau Desktop connection error from Mac M1

Hi, I'm getting the below error while connecting to a SQL Warehouse from Tableau Desktop. I installed the latest ODBC drivers (2.7.5), but I can confirm that the driver name is different. From the error message I see libsparkodbc_sbu.dylib, but in my lap...

Latest Reply
jiro
New Contributor II
  • 0 kudos

Hello @Chinu, it looks like Tableau Desktop by default searches for /Library/simba/spark/lib/libsparkodbc_sbu.dylib, but the file in the SimbaSparkODBC-2.7.7.1016-OS package gets installed as libsparkodbc_sb64-universal.dylib. I was able to go around this by...

Arnold_Souza
by New Contributor III
  • 3208 Views
  • 5 replies
  • 3 kudos

How to move a metastore to a new Storage Account in unity catalog?

Hello, I would like to change the Metastore location in Databricks Account Console. I have one metastore created that is in an undesired container/storage account. I could see that it's not possible to edit a metastore that is already created. I coul...

Latest Reply
ac0
New Contributor III
  • 3 kudos

Bumping this thread as well.

4 More Replies
dollyb
by New Contributor III
  • 3880 Views
  • 2 replies
  • 0 kudos

Resolved! How to detect if running in a workflow job?

Hi there, what's the best way to differentiate which environment my Spark session is running in? Locally I develop with databricks-connect's DatabricksSession, but that doesn't work when running a workflow job, which requires SparkSession.getOrCreate()...

Latest Reply
dollyb
New Contributor III
  • 0 kudos

Thanks, dbutils.notebook.getContext does indeed contain information about the job run.

1 More Reply
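
A small Python sketch of the approach confirmed above; in Scala the same information is available on dbutils.notebook.getContext. The tag names below are the commonly observed ones and can differ across runtime versions, so treat them as assumptions.

    import json

    ctx = json.loads(
        dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson()
    )
    tags = ctx.get("tags", {})

    running_in_job = "jobId" in tags
    print(running_in_job, tags.get("jobId"), tags.get("runId"))

Locally, under databricks-connect, dbutils.notebook is not available, so wrapping this in a try/except and defaulting to "local" is one way to cover both environments.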