Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

skirock
by New Contributor
  • 1044 Views
  • 0 replies
  • 0 kudos

DLT live tables error while reading file from Data Lake Gen2

I am getting the following error while running a cell in Python. The same file runs fine when I upload the JSON file into Databricks and pass that path to the df.read syntax. When I use DLT for the same file in the data lake, it gives me the follo...

dbdude
by New Contributor II
  • 10265 Views
  • 2 replies
  • 2 kudos

Delete Delta Live Table Completely

I've been struggling with figuring out how to delete a managed Delta Live Table. If I run a drop command in Databricks SQL I get: [STREAMING_TABLE_OPERATION_NOT_ALLOWED.DROP_DELTA_LIVE_TABLE] The operation DROP is not allowed: The operation does not a...

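For context on the error above: a managed DLT streaming table is owned by its pipeline, so a plain DROP TABLE is rejected; the commonly suggested route is deleting the owning pipeline, which also removes the tables it manages. The sketch below only assembles the REST call against the Pipelines API DELETE endpoint (`/api/2.0/pipelines/{id}`); the host, token, and pipeline ID are hypothetical placeholders, not values from this thread.

```python
# Hedged sketch: build (not send) the REST request that deletes a DLT pipeline
# and, with it, the managed tables it owns. All values are placeholders.
def delete_pipeline_request(host: str, pipeline_id: str, token: str) -> dict:
    """Assemble a DELETE request for the Pipelines API (/api/2.0/pipelines/{id})."""
    return {
        "method": "DELETE",
        "url": f"{host}/api/2.0/pipelines/{pipeline_id}",
        "headers": {"Authorization": f"Bearer {token}"},
    }

req = delete_pipeline_request("https://adb-123.azuredatabricks.net", "1234-abcd", "<token>")
print(req["method"], req["url"])
```

Sending it with any HTTP client (or the Databricks CLI's `databricks pipelines delete`) is left to the reader, since it is destructive.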
Phani1
by Valued Contributor II
  • 452 Views
  • 0 replies
  • 0 kudos

Databricks cross-platform data access

Hi Team, we have a requirement: data is stored on S3, while our Databricks workspace is hosted on Azure. Our objective is to access the data from the S3 location. Could you kindly provide us with the most suitable approach for this scenario? e.g. exte...

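On the question itself: the usual options are a Unity Catalog external location backed by an AWS storage credential, or, without UC, setting s3a credentials on the Spark conf and reading via the s3a:// scheme. The snippet below is a minimal sketch of the conf-key approach; the bucket, keys, and path are placeholders, and the Spark calls are shown as comments because they need a live cluster.

```python
# Sketch: Hadoop s3a credential entries for cross-cloud reads (placeholders).
def s3a_conf(access_key: str, secret_key: str) -> dict:
    """Spark conf entries that let the s3a:// scheme authenticate to S3."""
    return {
        "fs.s3a.access.key": access_key,
        "fs.s3a.secret.key": secret_key,
    }

conf = s3a_conf("<AWS_ACCESS_KEY_ID>", "<AWS_SECRET_ACCESS_KEY>")
# On a cluster you would then apply it and read:
# for k, v in conf.items():
#     spark.conf.set(k, v)
# df = spark.read.json("s3a://my-bucket/path/data.json")
```

For production, a Unity Catalog external location is generally preferred over embedding keys in notebook code.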
dbph
by New Contributor II
  • 3354 Views
  • 1 reply
  • 0 kudos

Databricks asset bundles error "failed to instantiate provider"

Hi all, I'm trying to deploy with Databricks Asset Bundles. When running bundle deploy, the process fails with the following error message: failed execution pid=25092 exit_code=1 error="terraform apply: exit status 1\n\nError: failed to read schema for dat...

Latest Reply
dbph
New Contributor II
  • 0 kudos

Hi, sorry for my late answer and thank you for your help. I was on parental leave in the meantime. I solved the problem by installing the latest Databricks extension. With the new extension and updated Databricks CLI I got the same error. However: The l...

ManojReddy
by New Contributor II
  • 6661 Views
  • 4 replies
  • 0 kudos

How to deal with deleted records from the source files in DLT

Can the apply_changes feature deal with deleted records in incoming source files? By delete I mean the record is being removed (not a soft delete with a flag). If not, how can deleting records from the Bronze streaming table be automated based on the source files?

Latest Reply
2vinodhkumar
New Contributor II
  • 0 kudos

Hi Manoj, did you get a solution or design change for this problem? We have 200K files in an S3 bucket, and when there is a change in the upstream app we get a new feed; the feed name is fixed. In DLT we should have only the new records from the replaced file, but in dlt...

3 More Replies
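Worth noting on the original question: DLT's apply_changes does support hard deletes via its apply_as_deletes argument, but only if the feed carries some marker for delete events; files that simply disappear give the stream nothing to react to. As a plain-Python illustration of the upsert/delete semantics (no DLT runtime here; the "id", "value", and "op" fields are made-up names):

```python
# Pure-Python illustration of apply_changes-style CDC semantics:
# upsert by default, remove the row when the event is marked DELETE.
def apply_cdc(target: dict, events: list) -> dict:
    """Apply CDC events to a keyed target table (modelled as a dict)."""
    for ev in events:
        if ev["op"] == "DELETE":
            target.pop(ev["id"], None)      # hard delete: the row disappears
        else:
            target[ev["id"]] = ev["value"]  # insert or update by key
    return target

state = apply_cdc({}, [
    {"id": 1, "value": "a", "op": "UPSERT"},
    {"id": 2, "value": "b", "op": "UPSERT"},
    {"id": 1, "value": None, "op": "DELETE"},
])
print(state)  # {2: 'b'}
```

In real DLT the DELETE marker would be an expression passed to apply_as_deletes; without such a marker in the source, a separate reconciliation job is needed.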
hzh
by New Contributor II
  • 1269 Views
  • 1 reply
  • 0 kudos

Credential passthrough and Hive metastore table access controls are deprecated

Hello, based on the recent platform release, credential passthrough will be deprecated for runtime 15.0 and later. Our current setup involves using Databricks alongside AWS Glue and Athena, i.e. registering Delta tables in AWS Glue and running oth...

Data Engineering
aws glue
Unity Catalog
Anwitha
by New Contributor II
  • 1455 Views
  • 2 replies
  • 0 kudos

Connecting to ADLS azure storage and reading csv file from adls

Hi Friends, how are you all? I am Anwitha. I am a newbie in the Azure Databricks community. I am stuck with this error while practising with an Azure storage account. Could anyone throw light on this error and provide a solution? I tried connecting to or...

Latest Reply
jacovangelder
Honored Contributor
  • 0 kudos

Ermm, you might want to edit your post and remove that storage account access key! In combination with the storage account name (that is also in your post), everyone will be able to access it if it isn't behind a firewall! Btw, you might want to stud...

1 More Replies
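Setting aside the (important) warning above about never posting the access key: connecting usually comes down to two strings, the account-key Spark conf entry and the abfss:// URI. The sketch below only builds those strings; the account, container, and path are placeholders, and the Spark calls are comments because they need a live cluster.

```python
# Sketch: conf key and abfss:// URI for account-key access to ADLS Gen2.
# Account, container, and key below are placeholders -- never share a real key.
def adls_key_conf(account: str, key: str) -> dict:
    """Spark conf entry that authenticates the abfss:// scheme with an account key."""
    return {f"fs.azure.account.key.{account}.dfs.core.windows.net": key}

def abfss_uri(container: str, account: str, path: str) -> str:
    """Fully qualified ADLS Gen2 path for a file inside a container."""
    return f"abfss://{container}@{account}.dfs.core.windows.net/{path}"

uri = abfss_uri("raw", "mystorageacct", "input/data.csv")
# On a cluster:
# for k, v in adls_key_conf("mystorageacct", "<key>").items():
#     spark.conf.set(k, v)
# df = spark.read.option("header", "true").csv(uri)
print(uri)
```

Storing the key in a secret scope rather than notebook text avoids exactly the leak called out in the reply above.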
Bhaskar29
by New Contributor II
  • 720 Views
  • 3 replies
  • 0 kudos

Error while reading streaming source from DLT pipeline

Hi All, I am getting this error when reading the streaming source. Full load - it loads. Incremental load - I am facing this error. This is the piece of code that I am using: def gim_suppliers_ln():    logger.info("Starting __cn_gim_suppliers")    overall...

Latest Reply
rachelgreen2
New Contributor II
  • 0 kudos

thanks for the info

2 More Replies
dvmentalmadess
by Valued Contributor
  • 6673 Views
  • 10 replies
  • 2 kudos

Resolved! Data Explorer minimum permissions

What are the minimum permissions required to search and view objects in Data Explorer? For example, does a user have to have `USE [SCHEMA|CATALOG]` to search or browse in the Data Explorer? Or can anyone with workspace access browse objects and, ...

Latest Reply
bearded_data
New Contributor III
  • 2 kudos

Circling back to this.  With one of the recent releases you can now GRANT BROWSE at the catalog level!  Hopefully they will be rolling this feature out at every object level (schemas and tables specifically).

9 More Replies
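The BROWSE grant mentioned in the accepted reply is a one-line SQL statement; keeping with Python, a small helper that emits it (the catalog and group names here are placeholders) could look like:

```python
# Sketch: emit the SQL that grants catalog-level BROWSE (names are placeholders).
def grant_browse(catalog: str, principal: str) -> str:
    """Build a GRANT BROWSE statement for a catalog and principal."""
    return f"GRANT BROWSE ON CATALOG {catalog} TO `{principal}`"

stmt = grant_browse("main", "data_readers")
# On a cluster: spark.sql(stmt)
print(stmt)  # GRANT BROWSE ON CATALOG main TO `data_readers`
```

BROWSE lets users see metadata in Catalog Explorer without granting access to the data itself.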
dollyb
by Contributor
  • 2028 Views
  • 2 replies
  • 0 kudos

Resolved! Differences between Spark SQL and Databricks

Hello, I'm using a local Docker Spark 3.5 runtime to test my Databricks Connect code. However, I've come across a couple of cases where my code would work in one environment but not the other. Concrete example: I'm reading data from BigQuery via spark....

Latest Reply
daniel_sahal
Esteemed Contributor
  • 0 kudos

@dollyb That's because when you've added another dependency on Databricks, it doesn't really know which one it should use. By default it's using the built-in com.google.cloud.spark.bigquery.BigQueryRelationProvider. What you can do is provide the whole packag...

1 More Replies
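The fix described in that reply amounts to naming the relation provider class explicitly instead of the short "bigquery" alias, so Spark doesn't have to choose between two registered sources. A sketch of the shape (the table name is a placeholder, and the read itself needs a live session):

```python
# Sketch: pin the exact DataSource class when two BigQuery connectors sit on the
# classpath; the short format name "bigquery" is ambiguous in that situation.
PROVIDER = "com.google.cloud.spark.bigquery.BigQueryRelationProvider"

def bigquery_reader_args(table: str) -> dict:
    """Arguments for a spark.read call that uses the fully qualified class name."""
    return {"format": PROVIDER, "options": {"table": table}}

args = bigquery_reader_args("project.dataset.table")
# On a cluster:
# df = spark.read.format(args["format"]).options(**args["options"]).load()
```

The same disambiguation trick applies to any connector whose short name collides with a built-in source.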
thiagoawstest
by Contributor
  • 943 Views
  • 1 reply
  • 0 kudos

Azure DevOps - Entra ID - AWS Databricks

Hi, I need to integrate Azure DevOps repos with AWS Databricks, but not via a personal token. I need it via a service principal, integrated with Microsoft Entra ID. Using Azure Databricks, when I go to create the service principal, "Entra ID application ID" appears, but in ...

christian_chong
by New Contributor III
  • 1140 Views
  • 1 reply
  • 0 kudos

Resolved! unity catalog with external table and column masking

Hi everybody, I am facing an issue with Spark Structured Streaming. Here is a sample of my code:   df = spark.readStream.load(f"{bronze_table_path}") df.writeStream \ .format("delta") \ .option("checkpointLocation", f"{silver_checkpoint}") \ .option("me...

Latest Reply
christian_chong
New Contributor III
  • 0 kudos

My first message was not well formatted. I wrote:  df = spark.readStream.load(f"{bronze_table_path}") df.writeStream \ .format("delta") \ .option("checkpointLocation", f"{silver_checkpoint}") \ .option("mergeSchema", "true") \ .trigger(availabl...

philipkd
by New Contributor III
  • 2100 Views
  • 1 reply
  • 0 kudos

Cannot get past Query Data tutorial for Azure Databricks

I created a new workspace on Azure Databricks, and I can't get past this first step in the tutorial: DROP TABLE IF EXISTS diamonds; CREATE TABLE diamonds USING CSV OPTIONS (path "/databricks-datasets/Rdatasets/data-001/csv/ggplot2/diamonds.csv", hea...

Latest Reply
dollyb
Contributor
  • 0 kudos

Struggling with this as well. So using dbfs:/ with CREATE TABLE statement works on AWS, but not Azure?
