Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Phani1
by Valued Contributor II
  • 9607 Views
  • 5 replies
  • 0 kudos

Data Quality in Databricks

Hi Databricks Team, we would like to implement data quality rules in Databricks. Apart from DLT, do we have any standard approach to apply data quality rules on the bronze layer before proceeding further to the silver and gold layers?
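
One common non-DLT approach is to encode the rules as explicit filter conditions on the bronze DataFrame and quarantine the failing rows before writing to silver. A minimal PySpark sketch, assuming a Databricks notebook context and hypothetical table and column names:

from pyspark.sql import functions as F

# Hypothetical bronze table and rules
bronze_df = spark.read.table("catalog.bronze.orders")
rules = F.col("order_id").isNotNull() & (F.col("amount") >= 0)

# Split valid rows from rejects and quarantine the latter for inspection
valid_df = bronze_df.filter(rules)
rejected_df = bronze_df.filter(~rules)

valid_df.write.mode("append").saveAsTable("catalog.silver.orders")
rejected_df.write.mode("append").saveAsTable("catalog.quarantine.orders_rejected")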

Latest Reply
joarobles
New Contributor III
  • 0 kudos

Looks nice! However, I don't see Databricks support in the docs.

4 More Replies
narendra11
by New Contributor
  • 1595 Views
  • 4 replies
  • 1 kudos

Resolved! Getting "Status code: 301 Moved Permanently" error

Getting this error while running cells: Failed to upload command result to DBFS. Error message: Status code: 301 Moved Permanently, Error message: <?xml version="1.0" encoding="UTF-8"?> <Error><Code>PermanentRedirect</Code><Message>The bucket you ...

Latest Reply
stefano0929
New Contributor II
  • 1 kudos

Same problem and I don't know how to solve it. Here is an example of a cell that has always worked correctly but stopped working yesterday:
# Compute the correlation matrix
correlation_matrix = data.corr()
# Set up the matplotlib figure
plt.figure(figsize=(14, ...

3 More Replies
hayden_blair
by New Contributor III
  • 2105 Views
  • 3 replies
  • 3 kudos

Resolved! Delta Live Table automatic table removal and schema update

Hello, I made a delta live table workflow that created 3 streaming tables in unity catalog. I then removed the source code for the 3rd table from the workflow and reran. After about a week, the 3rd streaming table is no longer available in unity cata...

Latest Reply
hayden_blair
New Contributor III
  • 3 kudos

This makes sense @raphaelblg! Just to confirm my understanding, is the following statement true: If I remove the source code for a unity catalog DLT streaming table from a DLT pipeline and wait 7 days, that table will be dropped from unity catalog, an...

2 More Replies
dpc
by New Contributor III
  • 1123 Views
  • 2 replies
  • 0 kudos

Returning and reusing the identity value

Hello, I have a table that has a column defined as an identity (BIGINT GENERATED ALWAYS AS IDENTITY). I will be inserting rows into this table in parallel. How can I get the identity value and use it within a pipeline? Parallel is relevant as there will be mult...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @dpc, what you're trying to achieve does not make sense in the context of identity columns. Look at the entry below from the documentation. So the answer is: if you want to have concurrent transactions, don't use identity columns. Declaring an identity co...
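
For reference, a minimal sketch of the identity-column DDL that documentation excerpt is about, using a hypothetical table name; it is this GENERATED ALWAYS AS IDENTITY declaration that disables concurrent transactions on the table:

spark.sql("""
  CREATE TABLE IF NOT EXISTS main.default.events (
    id BIGINT GENERATED ALWAYS AS IDENTITY,
    payload STRING
  ) USING DELTA
""")

# IDs are assigned automatically on insert; they are unique but not guaranteed
# to be consecutive, and GENERATED ALWAYS means you cannot supply them yourself.
spark.sql("INSERT INTO main.default.events (payload) VALUES ('a'), ('b')")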

1 More Replies
mbaas
by New Contributor III
  • 1805 Views
  • 4 replies
  • 4 kudos

Resolved! Temporary streaming tables (CDC)

I am currently using the `apply_changes` feature. I saw that for the regular decorator `dlt.table` you can create temporary tables. I do not see an option to use this feature with `dlt.create_streaming_table(`; in the SQL version it looks like it is su...
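
For context, a minimal sketch of the pattern under discussion, with hypothetical table and column names; as the post notes, the temporary option is documented for `dlt.table`, and whether `dlt.create_streaming_table(` exposes an equivalent is exactly the question:

import dlt

dlt.create_streaming_table("customers_silver")  # CDC target

dlt.apply_changes(
    target="customers_silver",
    source="customers_bronze",   # hypothetical source streaming table
    keys=["customer_id"],
    sequence_by="updated_at",
    stored_as_scd_type=1,
)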

Latest Reply
Icassatti
New Contributor III
  • 4 kudos

Read these articles: Delta Live Tables Python language reference - Azure Databricks | Microsoft Learn; The APPLY CHANGES APIs: Simplify change data capture with Delta Live Tables - Azure Databricks | Microsoft Learn. Even if you could define it as temporary, it ...

3 More Replies
joaogilsa
by New Contributor II
  • 2644 Views
  • 3 replies
  • 1 kudos

Resolved! Delete folder using Databricks CLI

Hello, I am trying to delete a folder and its contents using the Databricks CLI, but I'm getting the following error:
databricks workspace delete /Workspace/Users/XXX/XXX --profile DEFAULT --recursive true
Error: expected to have the absolute path of the not...
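
For anyone hitting the same error, an alternative sketch using the Databricks SDK for Python, which exposes the same recursive delete (the path below is just the placeholder path from the post):

from databricks.sdk import WorkspaceClient

w = WorkspaceClient(profile="DEFAULT")  # reuses the same auth profile as the CLI

# Recursively delete the workspace folder and everything under it
w.workspace.delete("/Workspace/Users/XXX/XXX", recursive=True)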

Latest Reply
joaogilsa
New Contributor II
  • 1 kudos

Thank you for the help, @szymon_dybczak, it worked!

2 More Replies
FerArribas
by Contributor
  • 11993 Views
  • 4 replies
  • 6 kudos

Resolved! Redirect error in access to web app in Azure Databricks with private front endpoint

I have created a workspace with a private endpoint in Azure following this guide: https://learn.microsoft.com/en-us/azure/databricks/administration-guide/cloud-configurations/azure/private-link. Once I have created the private link of type browser_authent...

Latest Reply
flomader
New Contributor II
  • 6 kudos

You don't need a CNAME record. Go to your private link resource in Azure and click on Settings > DNS Configuration. Make sure you have created private link A records for all the FQDNs listed under 'Custom DNS records'. You have most likely missed one ...

3 More Replies
yvishal519
by Contributor
  • 1209 Views
  • 2 replies
  • 3 kudos

Resolved! Databricks DLT with Hive Metastore and ADLS Access Issues

We are currently working on Databricks DLT tables to transform data from bronze to silver. We are specifically instructed not to use mount paths for accessing data from ADLS Gen 2. To comply, I configured storage credentials and created an externa...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 3 kudos

Hi @yvishal519, since you're using the Hive metastore, you have no other option than mount points. Storage credentials and external locations are only supported in Unity Catalog.
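
For completeness, a minimal sketch of the mount-point route mentioned above, authenticating with a service principal via OAuth; the secret scope, secret names, container, storage account, and tenant ID are all placeholders:

configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": dbutils.secrets.get("my-scope", "sp-client-id"),
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get("my-scope", "sp-client-secret"),
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

# Mount the ADLS Gen2 container so DLT code can read it through /mnt/bronze
dbutils.fs.mount(
    source="abfss://bronze@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/bronze",
    extra_configs=configs,
)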

1 More Replies
helghe
by New Contributor II
  • 976 Views
  • 3 replies
  • 3 kudos

Unavailable system schemas

When I list the available schemas I get the following: {"schemas":[{"schema":"storage","state":"AVAILABLE"},{"schema":"operational_data","state":"UNAVAILABLE"},{"schema":"access","state":"AVAILABLE"},{"schema":"billing","state":"ENABLE_COMPLETED"},{"s...
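
A hedged sketch of how the states can be inspected and an AVAILABLE schema enabled with the Databricks SDK for Python, assuming metastore/account admin permissions; schemas reported as UNAVAILABLE are not yet offered for that metastore and cannot simply be enabled:

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
metastore_id = w.metastores.current().metastore_id

# List every system schema and its state for the current metastore
for s in w.system_schemas.list(metastore_id=metastore_id):
    print(s.schema, s.state)

# Enable one that is reported as AVAILABLE
w.system_schemas.enable(metastore_id=metastore_id, schema_name="access")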

Latest Reply
hle
New Contributor II
  • 3 kudos

I have the same issue for the compute schema. Workspace is UC enabled and I'm account admin. 

2 More Replies
Amit_Dass_Chmp
by New Contributor III
  • 564 Views
  • 1 reply
  • 0 kudos

Auto-tuning capability available for external tables?

If I am using Databricks Runtime 11.3 and above to create managed Delta tables cataloged in Unity Catalog (Databricks’ data catalog), I don’t need to worry about optimizing the underlying file sizes or configuring a target file size for my Delta tabl...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @Amit_Dass_Chmp, yep, according to the documentation. As for the second question, such capability will be available in the future. If you are using Databricks Runtime 11.3 and above to create managed Delta tables cataloged in Unity Catalog (Databricks’ dat...
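
Until then, a hedged workaround sketch for external Delta tables is to set the file-size-related table properties explicitly; the catalog, schema, table name, and target size here are placeholders:

spark.sql("""
  ALTER TABLE my_catalog.my_schema.my_external_table
  SET TBLPROPERTIES (
    'delta.targetFileSize' = '128mb',
    'delta.autoOptimize.optimizeWrite' = 'true',
    'delta.autoOptimize.autoCompact' = 'true'
  )
""")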

dpc
by New Contributor III
  • 1703 Views
  • 3 replies
  • 4 kudos

Resolved! Approach to monthly data snapshots

Hello, I'm building a data warehouse with all the usual facts and dimensions. It will flush (truncate) and rebuild on a monthly basis. Users need to not only view the data now but also view it historically, i.e. what it was at a point in time. My initial...
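
One common pattern for this (a sketch only, not necessarily the approach settled on in the thread) is to stamp each monthly load with a snapshot date and append it to a history table alongside the truncate-and-rebuild; the table names are hypothetical:

from pyspark.sql import functions as F

# Hypothetical fact table that gets rebuilt each month
fact_df = spark.read.table("dw.fact_sales_current")

snapshot_df = fact_df.withColumn("snapshot_date", F.current_date())

# Append the stamped copy so every month stays queryable later
(snapshot_df.write
    .mode("append")
    .partitionBy("snapshot_date")
    .saveAsTable("dw.fact_sales_history"))

# Point-in-time query, e.g.:
# SELECT * FROM dw.fact_sales_history WHERE snapshot_date = DATE'2024-06-01'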

Latest Reply
dpc
New Contributor III
  • 4 kudos

Great, thanks

2 More Replies
angel531
by New Contributor II
  • 1383 Views
  • 3 replies
  • 3 kudos

Resolved! Getting error while accessing DBFS from Databricks Community account and couldn't upload any files

Hi, I have enabled DBFS in my Databricks Community account and started the cluster. While accessing DBFS it's throwing an error.

Latest Reply
satyakiguha
New Contributor III
  • 3 kudos

Hi @Retired_mod, I am no longer facing this issue. Thanks to the team for fixing it!

2 More Replies
fdeba
by New Contributor
  • 1239 Views
  • 2 replies
  • 0 kudos

DatabricksSession and SparkConf

Hi, I want to initialize a Spark session using `DatabricksSession`. However, it seems it is not possible to call `.config()` and pass it a `SparkConf` instance. The following works:
# Initialize the configuration for the Spark session
confSettings = [ ("...

Latest Reply
Witold
Honored Contributor
  • 0 kudos

In almost all cases you don't need to create a new Spark session, as Databricks will do it for you automatically. If it's only about Spark configurations, there are multiple ways to set them: cluster settings, or spark.conf.set.
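
A minimal sketch of that suggestion, assuming Databricks Connect and a couple of example SQL settings; since, per the original post, passing a SparkConf via `.config()` does not appear to be supported, the options are set on the session after it exists:

from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.getOrCreate()

# Apply individual settings on the live session instead of a SparkConf object
for key, value in [("spark.sql.shuffle.partitions", "64"),
                   ("spark.sql.session.timeZone", "UTC")]:
    spark.conf.set(key, value)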

1 More Replies
mkd
by New Contributor II
  • 5680 Views
  • 3 replies
  • 3 kudos

Resolved! CSV import error

Upload Error: Error occurred when processing file tips1.csv: [object Object]. I've been trying to import a CSV file from my local machine to Databricks. The above-mentioned error couldn't be resolved. Can anyone please help me in this regard?

Latest Reply
clentin
Contributor
  • 3 kudos

@Retired_mod - this is now fixed. Thank you so much for your prompt action. Appreciate it. 

2 More Replies
kwinsor5
by New Contributor II
  • 2646 Views
  • 2 replies
  • 0 kudos

Delta Live Table autoloader's inferColumnTypes does not work

I am experimenting with DLTs/Autoloader. I have a simple, flat JSON file that I am attempting to load into a DLT (following this guide) like so:  CREATE OR REFRESH STREAMING LIVE TABLE statistics_live COMMENT "The raw statistics data" TBLPROPERTIES (...

Latest Reply
pavlos_skev
New Contributor III
  • 0 kudos

I had the same issue with a similar JSON structure as yours. Adding the option "multiLine" set to true fixed it for me.
df = (spark.readStream.format("cloudFiles")
    .option("multiLine", "true")
    .option("cloudFiles.schemaLocation", schemaLocation)
    ...
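
For readers landing here, a hedged completion of the truncated snippet above; the load path is a placeholder and schemaLocation is assumed to be defined as in the original cell:

df = (spark.readStream.format("cloudFiles")
      .option("cloudFiles.format", "json")
      .option("multiLine", "true")
      .option("cloudFiles.schemaLocation", schemaLocation)
      .option("cloudFiles.inferColumnTypes", "true")
      .load("/path/to/raw/statistics/"))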

1 More Replies
