Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

by Rags98 (New Contributor II)
  • 2349 Views
  • 1 reply
  • 0 kudos

Undrop a table from built-in catalogs Azure Databricks

How can I undrop a table from a built-in catalog in Azure Databricks?

Latest Reply
Lakshay (Databricks Employee)

If you are using Unity Catalog, you can simply run the UNDROP command. Reference doc: https://docs.databricks.com/en/sql/language-manual/sql-ref-syntax-ddl-undrop-table.html
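
For illustration, a minimal sketch of running this from a notebook, assuming a Unity Catalog managed table with the placeholder name main.default.my_table:

    # List recently dropped tables in the schema, then restore one by name.
    # Catalog, schema, and table names below are placeholders.
    display(spark.sql("SHOW TABLES DROPPED IN main.default"))
    spark.sql("UNDROP TABLE main.default.my_table")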

by SenthilJ (New Contributor III)
  • 3391 Views
  • 1 reply
  • 2 kudos

Unity Catalog and Data Accessibility

Hi, I have a few questions about the internals of #Unity Catalog in #Databricks. 1. I understand that we can customize the UC metastore at different levels (catalog/schema). I'm wondering where the information about the UC permission model is stored for every data ...

Data Engineering
Databricks
Unity Catalog
Latest Reply
SenthilJ (New Contributor III)

Thank you @Retired_mod, your response really helps. A quick follow-up: when Unity Catalog uses its permissions to access objects across workspaces, what kind of connection method does it use to access the data object, i.e. in this case, when User Y q...

by Simon_T (New Contributor III)
  • 2487 Views
  • 1 reply
  • 0 kudos

CURL API - Error while parsing token: io.jsonwebtoken.ExpiredJwtException: JWT expired

I am running this code: curl -X --request GET -H "Authorization: Bearer <databricks token>" "https://adb-1817728758721967.7.azuredatabricks.net/api/2.0/clusters/list" and I am getting this error: 2024-01-17T13:21:41.4245092Z </head> 2024-01-17T13:21:41.4...

Latest Reply
Debayan (Databricks Employee)

Hi, could you please renew the token and confirm?
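
As a quick way to confirm the renewed token works, a minimal sketch using Python requests; the workspace URL is taken from the post, and the token value is a placeholder:

    import requests

    host = "https://adb-1817728758721967.7.azuredatabricks.net"
    token = "<freshly-generated-databricks-pat>"  # placeholder; generate a new token in User Settings

    resp = requests.get(
        f"{host}/api/2.0/clusters/list",
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()  # an expired or malformed token typically fails here
    print(resp.json())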

by jborn (New Contributor III)
  • 7392 Views
  • 6 replies
  • 1 kudos

Resolved! Connecting Azure Databricks to Azure Gen 2 storage stuck on "Running Command..."

I recently had an Azure Databricks setup done behind a VPN. I'm trying to connect to my Azure Storage Account Gen 2. Using the following code I haven't been able to connect and keep getting stuck on reading the file. What should I be checking? #i...

Latest Reply
jborn (New Contributor III)

I ended up opening a ticket with Microsoft support about this issue and they walked us through debugging it. In the end, the route table was not attached to the subnet. Once it was attached, everything worked.

5 More Replies
by VJ3 (Contributor)
  • 4143 Views
  • 3 replies
  • 2 kudos

Best Practice to use/implement SQL Persona using Azure Databricks

Hello, I am looking for details of the security controls needed to use/implement the SQL persona using Azure Databricks.

Latest Reply
Debayan (Databricks Employee)

Hi, there are several documents covering this that can be followed; let me know if the below helps. https://learn.microsoft.com/en-us/answers/questions/1039176/whitelist-databricks-to-read-and-write-into-azure https://www.databricks.com/blog/2020/03/2...

2 More Replies
by Twilight (New Contributor III)
  • 5569 Views
  • 5 replies
  • 3 kudos

Resolved! Bug - Databricks requires extra escapes in repl string in regexp_replace (compared to Spark)

In Spark (but not Databricks), these work: regexp_replace('1234567890abc', '^(?<one>\\w)(?<two>\\w)(?<three>\\w)', '$3$2$1') and regexp_replace('1234567890abc', '^(?<one>\\w)(?<two>\\w)(?<three>\\w)', '${three}${two}${one}'). In Databricks, you have to use ...

Latest Reply
Anonymous (Not applicable)

@Stephen Wilcoxon: No, it is not a bug. Databricks uses a different flavor of regular expression syntax than Apache Spark. In particular, Databricks uses Java's regular expression syntax, whereas Apache Spark uses Scala's regular expression syntax...
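
To make the difference concrete, a small PySpark sketch using the pattern from the original post; which replacement form works depends on the runtime, as discussed above:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()
    df = spark.createDataFrame([("1234567890abc",)], ["s"])
    pattern = r"^(?<one>\w)(?<two>\w)(?<three>\w)"

    # Plain group references, reported to work in open-source Spark:
    df.select(F.regexp_replace("s", pattern, "$3$2$1").alias("swapped")).show()

    # Backslash-escaped references, the form the post reports Databricks expects:
    df.select(F.regexp_replace("s", pattern, "\\$3\\$2\\$1").alias("swapped")).show()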

4 More Replies
by ChristianRRL (Valued Contributor)
  • 2063 Views
  • 1 reply
  • 1 kudos

Resolved! DLT Bronze: Incremental File Updates

Hi there, I would like to clarify if there's a way for bronze data to be ingested from "the same" CSV file if the file has been modified (i.e. new file with new records overwriting the old file)? Currently in my setup my bronze table is a `streaming ...

Latest Reply
Lakshay (Databricks Employee)

You can use the option "cloudFiles.allowOverwrites" in DLT. This option will allow you to read the same CSV file again, but you should use it cautiously, as it can lead to duplicate data being loaded.
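
A minimal sketch of how this could look in a Python DLT pipeline; the landing path and CSV options are placeholders for your own setup:

    import dlt

    @dlt.table(name="bronze_raw")
    def bronze_raw():
        return (
            spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "csv")
            # Re-ingest a file when it is overwritten in place; may load duplicate records downstream.
            .option("cloudFiles.allowOverwrites", "true")
            .option("header", "true")
            .load("/Volumes/main/landing/csv_input")  # placeholder path
        )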

by otum (New Contributor II)
  • 2704 Views
  • 6 replies
  • 0 kudos

[Errno 2] No such file or directory

I am reading a JSON file at the below location, using the below code: file_path = "/dbfs/mnt/platform-data/temp/ComplexJSON/sample.json" # replace with the file path; f = open(file_path, "r"); print(f.read()), but it is failing for no such file...

Latest Reply
Debayan (Databricks Employee)

Hi, as Shan mentioned, could you please cat the file and see if it exists?
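
For reference, a small sketch that checks the path both through the driver's FUSE mount and through dbutils, using the path from the original post:

    import os

    posix_path = "/dbfs/mnt/platform-data/temp/ComplexJSON/sample.json"
    dbfs_path = "dbfs:/mnt/platform-data/temp/ComplexJSON/sample.json"

    # Does the FUSE-mounted path exist on the driver?
    print(os.path.exists(posix_path))

    # List the parent directory to confirm the mount and the exact file name.
    display(dbutils.fs.ls("dbfs:/mnt/platform-data/temp/ComplexJSON/"))

    # Peek at the first bytes if the file is there.
    print(dbutils.fs.head(dbfs_path, 200))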

5 More Replies
by ac0 (Contributor)
  • 5007 Views
  • 3 replies
  • 0 kudos

Resolved! Setting environment variables to use in a SQL Delta Live Table Pipeline

I'm trying to use the Global Init Scripts in Databricks to set an environment variable to use in a Delta Live Table Pipeline. I want to be able to reference a value passed in as a path versus hard coding it. Here is the code for my pipeline: CREATE ST...

Latest Reply
ac0 (Contributor)

I was able to accomplish this by creating a Cluster Policy that put in place the scripts, config settings, and environment variables I needed.
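
As an illustration of that approach, a rough sketch of a cluster policy fragment (written here as a Python dict) that pins an init script, a Spark config value, and an environment variable; all paths and names are hypothetical:

    # Hypothetical policy definition; it would be created through the UI or the Cluster Policies API.
    policy_definition = {
        "init_scripts.0.workspace.destination": {
            "type": "fixed",
            "value": "/Shared/init/set-env.sh",  # hypothetical init script path
        },
        "spark_conf.pipeline.input_path": {
            "type": "fixed",
            "value": "abfss://landing@myaccount.dfs.core.windows.net/input",  # hypothetical
        },
        "spark_env_vars.PIPELINE_INPUT_PATH": {
            "type": "fixed",
            "value": "abfss://landing@myaccount.dfs.core.windows.net/input",  # hypothetical
        },
    }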

2 More Replies
by ChrisS (New Contributor III)
  • 5078 Views
  • 7 replies
  • 8 kudos

How to get data scraped from the web into your data storage

I am learning Databricks for the first time, following a book copyrighted in 2020, so I imagine it might be a little outdated at this point. What I am trying to do is move data from an online source (in this specific case using a shell script, but ...

Latest Reply
CharlesReily (New Contributor III)

In Databricks, you can install external libraries by going to the Clusters tab, selecting your cluster, and then adding the Maven coordinates for Deequ. This represents the best b2b data enrichment services in Databricks. In your notebook or script, y...
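
Back on the thread's original question of landing web data in storage, a minimal sketch using plain Python to download a file and drop it where Spark can read it; the URL and target path are placeholders:

    import requests

    source_url = "https://example.com/data/export.csv"   # placeholder
    target_path = "/dbfs/mnt/raw/web/export.csv"         # placeholder; a /Volumes/... path also works

    resp = requests.get(source_url, timeout=60)
    resp.raise_for_status()

    with open(target_path, "wb") as f:
        f.write(resp.content)

    # Read the landed file back with Spark.
    df = spark.read.option("header", "true").csv("dbfs:/mnt/raw/web/export.csv")
    df.show(5)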

6 More Replies
by aockenden (New Contributor III)
  • 2095 Views
  • 2 replies
  • 0 kudos

Switching SAS Tokens Mid-Script With Spark Dataframes

Hey all, my team has settled on using directory-scoped SAS tokens to provision access to data in our Azure Gen2 Datalakes. However, we have encountered an issue when switching from a first SAS token (which is used to read a first parquet table in the...
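
For context, a rough sketch of the usual way a directory- or container-scoped SAS token is wired up through Spark conf; the storage account, container, and token values are placeholders:

    storage_account = "mystorageaccount"          # placeholder
    sas_token = "<directory-scoped-sas-token>"    # placeholder

    spark.conf.set(f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net", "SAS")
    spark.conf.set(
        f"fs.azure.sas.token.provider.type.{storage_account}.dfs.core.windows.net",
        "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider",
    )
    spark.conf.set(f"fs.azure.sas.fixed.token.{storage_account}.dfs.core.windows.net", sas_token)

    df = spark.read.parquet(f"abfss://mycontainer@{storage_account}.dfs.core.windows.net/path/to/first_table")

Because these are session-level settings, switching tokens mid-script amounts to calling spark.conf.set again with the second token before the next read, which is the pattern the post describes.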

Latest Reply
aockenden (New Contributor III)

Bump

1 More Reply
by pyter (New Contributor III)
  • 6180 Views
  • 5 replies
  • 2 kudos

Resolved! [13.3] Vacuum on table fails if shallow clone without write access exists

Hello everyone, we use Unity Catalog, separating our dev, test and prod data into individual catalogs. We run weekly vacuums on our prod catalog using a service principal that only has (read+write) access to this production catalog, but no access to ou...

Latest Reply
Lakshay (Databricks Employee)

Are you using Unity Catalog in single user access mode? If yes, could you try using shared access mode?

4 More Replies
by pawelzak (New Contributor III)
  • 2531 Views
  • 2 replies
  • 1 kudos

Dashboard update through API

Hi, I would like to create/update a dashboard definition based on a JSON file. How can one do it? I tried the following: databricks api post /api/2.0/preview/sql/dashboards/$dashboard_id --json @file.json But it does not update the widgets... How can...

Latest Reply
Gamlet (New Contributor II)

To programmatically create/update dashboards in Databricks using a JSON file, you can use the Databricks REST API's workspace/export and workspace/import endpoints. Generate a JSON representation of your dashboard using workspace/export, modify it as...
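
A rough sketch of the export/modify/import round trip described above, assuming the dashboard can be exported through the Workspace API at a workspace path; the host, token, path, and the SOURCE format choice are all assumptions here:

    import base64
    import requests

    host = "https://<workspace-url>"                             # placeholder
    token = "<databricks-pat>"                                   # placeholder
    dash_path = "/Workspace/Users/me@example.com/my_dashboard"   # placeholder
    headers = {"Authorization": f"Bearer {token}"}

    # Export the current definition (returned base64-encoded).
    exp = requests.get(f"{host}/api/2.0/workspace/export",
                       headers=headers,
                       params={"path": dash_path, "format": "SOURCE"})
    exp.raise_for_status()
    definition = base64.b64decode(exp.json()["content"])

    # ... modify the JSON definition here as needed ...

    # Re-import it over the original path.
    imp = requests.post(f"{host}/api/2.0/workspace/import",
                        headers=headers,
                        json={"path": dash_path,
                              "format": "SOURCE",
                              "overwrite": True,
                              "content": base64.b64encode(definition).decode()})
    imp.raise_for_status()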

1 More Reply
by israelst (New Contributor II)
  • 2783 Views
  • 3 replies
  • 1 kudos

structured streaming schema inference

I want to stream data from Kinesis using DLT. The data is in JSON format. How can I use Structured Streaming to automatically infer the schema? I know Auto Loader has this feature, but it doesn't make sense for me to use Auto Loader since my data is st...

Latest Reply
israelst (New Contributor II)

I wanted to use Databricks for this. I don't want to depend on AWS Glue. Same way I could do it with AutoLoader...

2 More Replies
