Hey folks, has anyone put Databricks behind Okta and enabled Unified Login with some workspaces that have a Unity Catalog metastore applied and some that don't? There are some workspaces we can't move over yet, and it isn't clear in the documentation if Unity Catalo...
Yes, users should be able to use a single Okta application for all workspaces, regardless of whether a Unity Catalog metastore has been applied. Unity Catalog is a feature that allows you to manage and secure access to your data across a...
Hello, I have some data sitting in Snowflake, and I want to apply CDC to it using Delta Live Tables, but I am running into some issues. Here is what I am trying to do:

@dlt.view()
def table1():
    return spark.read.format("snowflake").options(**opt...
CDC for Delta Live Tables works fine for Delta tables, as you have noticed. However, it is not a full-blown CDC implementation. If you want to capture changes in Snowflake, you will have to implement some CDC method on Snowflake itself, and read...
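To make the reply concrete, here is a minimal, framework-free sketch of what any CDC method on the source side must ultimately produce: a diff of two snapshots keyed by primary key. The data below is hypothetical, and in practice you would prefer an incremental mechanism such as Snowflake streams over full snapshot comparison.

```python
# Minimal snapshot-diff CDC sketch (hypothetical data; in production,
# prefer Snowflake streams or another incremental change mechanism).

def diff_snapshots(old, new):
    """Compare two snapshots keyed by primary key and emit change events."""
    events = []
    for key, row in new.items():
        if key not in old:
            events.append(("INSERT", key, row))
        elif old[key] != row:
            events.append(("UPDATE", key, row))
    for key, row in old.items():
        if key not in new:
            events.append(("DELETE", key, row))
    return events

old = {1: {"name": "a"}, 2: {"name": "b"}}
new = {1: {"name": "a"}, 2: {"name": "bb"}, 3: {"name": "c"}}
print(diff_snapshots(old, new))
# [('UPDATE', 2, {'name': 'bb'}), ('INSERT', 3, {'name': 'c'})]
```

The resulting event stream is the kind of input that DLT's `apply_changes` expects to consume from a change table.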
Hi, I am having a hard time configuring my Databricks workspace when working in VSCode via WSL. When following the steps to set up Databricks authentication, I receive the following error at Step 5 of "Step 4: Set up Databricks authentication"....
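For reference, that authentication step ultimately writes a profile into `~/.databrickscfg` in the WSL home directory; a minimal sketch of what it should look like (host and token values are placeholders):

```
[DEFAULT]
host  = https://<your-workspace>.cloud.databricks.com
token = <your-personal-access-token>
```

If the extension runs inside WSL but the config file only exists on the Windows side (or vice versa), authentication can fail even when the values are correct.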
Hi, Databricks JDBC version: 2.6.34. I am facing the issue below when connecting to Databricks SQL from Apache Solr:

Caused by: java.sql.SQLFeatureNotSupportedException: [Databricks][JDBC](10220) Driver does not support this optional feature.
at com.databri...
The Databricks team recommended setting IgnoreTransactions=1 and autocommit=false in the connection string, but that didn't resolve the issue. Ultimately I had to use the Solr update API for uploading documents.
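For anyone hitting the same error, the suggested flag goes directly into the JDBC URL. A sketch of assembling such a URL, with placeholder host, HTTP path, and token (autocommit is typically disabled on the `Connection` object itself rather than in the URL):

```python
# Assemble a Databricks JDBC URL including the IgnoreTransactions flag the
# Databricks team suggested. Hostname, HTTPPath, and token are placeholders.
base = "jdbc:databricks://<server-hostname>:443"
params = {
    "httpPath": "/sql/1.0/warehouses/<warehouse-id>",
    "AuthMech": "3",                # token-based auth: UID "token" + PAT as PWD
    "UID": "token",
    "PWD": "<personal-access-token>",
    "IgnoreTransactions": "1",      # ignore transaction-related calls
}
url = base + ";" + ";".join(f"{k}={v}" for k, v in params.items())
print(url)
```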
Not sure if this has come up before, but I'm wondering if Databricks has any kind of functionality to "watch" an API call for changes? E.g. currently I have a frequently running job that pulls data via an API call and overwrites the old data. This see...
Hi @ChristianRRL, Databricks provides a REST API that allows you to interact with various aspects of your Databricks workspace programmatically. While there isn’t a direct built-in feature to “watch” an API call for changes, you can design a solut...
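One lightweight way to design the "watch" yourself, as the reply suggests, is to hash each API response and only reprocess when the hash changes. A minimal sketch, where the simulated responses stand in for the real API call:

```python
import hashlib
import json

def fingerprint(payload):
    """Stable hash of an API response; sorted keys make dict order irrelevant."""
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

# Simulated polling loop: only process when the fingerprint changes.
last = None
changed_flags = []
for response in [{"a": 1}, {"a": 1}, {"a": 2}]:
    fp = fingerprint(response)
    if fp != last:
        changed_flags.append(True)   # new data: overwrite / process here
        last = fp
    else:
        changed_flags.append(False)  # unchanged: skip the expensive write
print(changed_flags)  # [True, False, True]
```

Persisting the last fingerprint (e.g. in a small Delta table or job parameter) lets each scheduled run skip the overwrite when nothing changed.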
Is anyone able to advise why I am getting the error "not a delta table"? The table was created in Unity Catalog. I've also tried DeltaTable.forName, and also using 13.3 LTS and 14.3 LTS clusters. Any advice would be much appreciated.
@Stogpon I believe if you are using DeltaTable.forPath then you have to pass the path where the table is. You can get this path from the Catalog; it is available in the Details tab of the table. Example:

delta_table_path = "dbfs:/user/hive/warehouse/xyz...
Hi community, assume I generate a personal access token for an entity. Post generation, can I restrict the entity's access to specific REST APIs? In other words, consider this example where, once I generate the token and set up a bearer token b...
Hi @Surajv, You can control permissions using the Permissions API. Although Personal Access Tokens (PATs) do not directly support fine-grained API restrictions, you can achieve this by carefully configuring permissions for the entity associated with ...
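As a concrete sketch of the pattern the reply describes, the Permissions API takes an `access_control_list` payload. The endpoint below is the one for token authorization, and the principal name and permission level are illustrative placeholders; confirm both against your workspace's API docs:

```python
import json

# Hypothetical example: grant one service principal CAN_USE on tokens via the
# Permissions API. The application ID is a placeholder, not a real value.
endpoint = "/api/2.0/permissions/authorization/tokens"
payload = {
    "access_control_list": [
        {
            "service_principal_name": "<application-id>",
            "permission_level": "CAN_USE",
        }
    ]
}
body = json.dumps(payload)
print(endpoint, body)
```

Note this controls who may *use* tokens, not which individual REST endpoints a token can call; per-endpoint restriction is not something a PAT itself expresses.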
Hi dear Databricks community, we tried to use databricks-jdbc inside an Oracle stored procedure to load something from Hive. However, Oracle marked databricks-jdbc invalid because some classes (for example com.databricks.client.jdbc42.internal.io.netty.ut...
Hi @Ujeen, when integrating the Databricks JDBC driver with an Oracle stored procedure to load data from Hive, missing-class errors like this can be frustrating.
Let’s explore some potential solutions:
Check Dependencies:
Ensure that all n...
I was going through this tutorial: https://mlflow.org/docs/latest/getting-started/tracking-server-overview/index.html#method-2-start-your-own-mlflow-server. I ran the whole script, and when I try to open the experiment on the Databricks website I get t...
Hi, I'm getting the error below while connecting to a SQL Warehouse from Tableau Desktop. I installed the latest ODBC drivers (2.7.5), but I can confirm that the driver name is different. From the error message I see libsparkodbc_sbu.dylib, but in my lap...
Hello @Chinu
It looks like Tableau Desktop by default searches for /Library/simba/spark/lib/libsparkodbc_sbu.dylib, but the file in the SimbaSparkODBC-2.7.7.1016-OS package gets installed as libsparkodbc_sb64-universal.dylib.
I was able to work around this by...
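Since the workaround itself is cut off above, here is one hedged guess at the usual fix for this kind of filename mismatch: symlinking the installed library to the name Tableau expects. The paths assume the default Simba install location and may differ on your machine.

```shell
# Point the filename Tableau looks for at the library the installer ships.
# Adjust DRIVER_DIR if your Simba driver is installed elsewhere (assumption).
DRIVER_DIR="/Library/simba/spark/lib"
sudo ln -s "$DRIVER_DIR/libsparkodbc_sb64-universal.dylib" \
           "$DRIVER_DIR/libsparkodbc_sbu.dylib"
```

Restart Tableau Desktop afterwards so it re-resolves the driver path.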
Hello, I would like to change the Metastore location in Databricks Account Console. I have one metastore created that is in an undesired container/storage account. I could see that it's not possible to edit a metastore that is already created. I coul...
Hi there, what's the best way to tell which environment my Spark session is running in? Locally I develop with databricks-connect's DatabricksSession, but that doesn't work when running a workflow job, which requires SparkSession.getOrCreate()....
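One pattern that can work here, under the assumption that Databricks clusters expose the DATABRICKS_RUNTIME_VERSION environment variable (they do on current runtimes), is to branch on it when building the session; `get_spark` below is a hypothetical helper, not an official API:

```python
import os

def running_on_databricks() -> bool:
    # Databricks runtimes set this variable; local machines normally don't.
    return "DATABRICKS_RUNTIME_VERSION" in os.environ

def get_spark():
    """Return a session appropriate for the current environment (sketch)."""
    if running_on_databricks():
        from pyspark.sql import SparkSession
        return SparkSession.builder.getOrCreate()
    # Local development via databricks-connect.
    from databricks.connect import DatabricksSession
    return DatabricksSession.builder.getOrCreate()

print(running_on_databricks())
```

Keeping the imports inside the branches means neither environment needs the other's package installed.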
Hello, I am trying to set the max batch size for a pandas UDF in a Databricks notebook, but in my tests it doesn't have any effect on the size:

spark.conf.set("spark.sql.execution.arrow.enabled", "true")
spark.conf.set('spark.sql.execution.arrow.maxRecordsPerBatch...
Hi @277745, It seems you’re working with Pandas UDF in a Databricks Notebook and trying to set the maximum batch size.
Let’s address your query:
Setting Max Batch Size for Pandas UDF:
You’ve already taken the right steps by configuring the follo...
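Two things worth checking, hedged since the truncated reply may already cover them: on recent Spark versions the Arrow flag is spark.sql.execution.arrow.pyspark.enabled (the name without "pyspark" is deprecated), and maxRecordsPerBatch is only an upper bound per partition, so a small partition produces one small batch regardless of the setting. The batch-count arithmetic:

```python
import math

def arrow_batches(rows_in_partition: int, max_records_per_batch: int) -> int:
    """Number of Arrow batches a partition is split into (upper-bound cap)."""
    return math.ceil(rows_in_partition / max_records_per_batch)

# A 1,000-row partition with maxRecordsPerBatch=300 yields 4 batches,
# the last one holding only 100 rows; a 100-row partition yields just 1.
print(arrow_batches(1000, 300), arrow_batches(100, 300))  # 4 1
```

So if your test DataFrame has few rows per partition, the UDF will see small batches no matter how large you set the config.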
Hi Community, is there a way to limit the scope of a workspace-level token so it can only hit certain Databricks REST APIs? In short, once we generate a workspace-level token following this doc: https://docs.databricks.com/en/dev-tools/auth/oauth-m2m....
I want to list, using the SQL editor, all user names from a specific group. Reading the documentation, I only learned how to show groups or users using simple filters, like:

SHOW GROUPS LIKE '*XPTO*';
SHOW GROUPS WITH USER `test@gmail.com`
SHOW USERS L...
Hi @Sardenberg, To retrieve a list of users from a specific group using SQL, you can follow these steps:
Assumptions:
Let’s assume you have three tables: USERS, GROUPS, and GROUP_USERS. The USERS table contains user information. The GROUPS table con...
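The join the reply is sketching can be demonstrated end to end with an in-memory database. Note that these table names are the reply's assumption, not real Databricks system tables, so map them to whatever your workspace actually exposes:

```python
import sqlite3

# Illustrative schema matching the reply's assumed USERS / GROUPS / GROUP_USERS
# tables; in a real workspace, substitute the actual tables or system views.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE users (user_id INTEGER, user_name TEXT);
    CREATE TABLE groups (group_id INTEGER, group_name TEXT);
    CREATE TABLE group_users (group_id INTEGER, user_id INTEGER);
    INSERT INTO users VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO groups VALUES (10, 'XPTO');
    INSERT INTO group_users VALUES (10, 1);
""")
rows = con.execute("""
    SELECT u.user_name
    FROM users u
    JOIN group_users gu ON gu.user_id = u.user_id
    JOIN groups g ON g.group_id = gu.group_id
    WHERE g.group_name = 'XPTO'
""").fetchall()
print(rows)  # [('alice',)]
```

The same two-join shape works in the Databricks SQL editor once you point it at real membership tables.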