Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

viniaperes
by New Contributor II
  • 2807 Views
  • 0 replies
  • 0 kudos

Pass Databricks's Spark session to a user defined module

Hello everyone, I have a .py file (not a notebook) where I have a class with the following constructor: class DataQualityChecker: def __init__(self, spark_session: SparkSession, df: DataFrame, quality_config_filepath: str) -> None: ...

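A minimal sketch of the pattern asked about in this post, assuming the class lives in a separate .py module; the module name, the row_count method, and the config path are illustrative, not the poster's actual code:

```python
# data_quality.py (hypothetical module name)
from pyspark.sql import SparkSession, DataFrame


class DataQualityChecker:
    """Receives the Spark session and DataFrame from the caller instead of creating its own."""

    def __init__(self, spark_session: SparkSession, df: DataFrame, quality_config_filepath: str) -> None:
        self.spark = spark_session
        self.df = df
        self.quality_config_filepath = quality_config_filepath

    def row_count(self) -> int:
        # Illustrative check that uses the injected DataFrame.
        return self.df.count()
```

From a Databricks notebook, the predefined `spark` session can simply be passed in, e.g. `checker = DataQualityChecker(spark, df, "/path/to/config.yml")`; inside a plain .py file, `SparkSession.getActiveSession()` also returns the session attached to the running cluster.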
jgen17
by New Contributor II
  • 14834 Views
  • 2 replies
  • 0 kudos

Cluster library installation fails

Hello everyone, I get a weird error when installing additional libraries in my cluster. I have a predefined Databricks cluster (Standard_L8s_v2) as a compute instance. I run pipelines on that cluster in Azure ADF. The pipeline consists of several tasks. T...

successhawk
by New Contributor II
  • 2008 Views
  • 1 reply
  • 1 kudos

How can I provide read only access to the Admin console?

As a DevSecOps engineer, I want to provide Ops support personnel READ ONLY access to the admin console in my production workspaces, so that they can easily view non-secret configurations, such as user/group memberships/entitlements and workspace sett...

Latest Reply
418971
New Contributor II
  • 1 kudos

Have you found a solution for this?

mgrave
by New Contributor II
  • 4052 Views
  • 2 replies
  • 2 kudos

Temporary table names are highlighted as syntax errors in SQL notebooks

See attached screenshot. In my SQL notebook, I declare a temporary view: CREATE OR REPLACE TEMP VIEW tmp_table AS SELECT ...; SELECT count(*) FROM tmp_table; The code editor considers tmp_table not a valid name in that latter SELECT. The reason is: Coul...

Latest Reply
Craig_
New Contributor III
  • 2 kudos

My temp views always show red as well. Maybe it is something with our specific environment? I've also noticed, when browsing the catalog from within the notebook, the temp tables are listed but an error is thrown when you try to view the columns of t...

1 More Replies
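For reference, the pattern from the post runs fine despite the editor warning; a sketch using the notebook's Spark session and a placeholder source table name:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # in a Databricks notebook, `spark` already exists

# `source_table` is a placeholder name.
spark.sql("""
    CREATE OR REPLACE TEMP VIEW tmp_table AS
    SELECT * FROM source_table
""")

# The temp view resolves at runtime even if the editor underlines it in red.
spark.sql("SELECT count(*) FROM tmp_table").show()
```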
aerofish
by New Contributor III
  • 1628 Views
  • 0 replies
  • 0 kudos

Structured streaming deduplication issue

Recently we have been using structured streaming to ingest data. We want to use a watermark to drop duplicated events, but we encountered some weird behavior and an unexpected exception. Can anyone help me explain what the expected behavior is and how should ...

Data Engineering
deduplication
streaming
watermark
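A minimal sketch of watermark-based deduplication in Structured Streaming; the paths and the `event_id`/`event_time` column names are placeholders. Note that duplicates are only dropped within the watermark delay, so events arriving later than the watermark can still slip through as duplicates:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

events = (
    spark.readStream
    .format("delta")
    .load("/path/to/bronze/events")          # placeholder source
)

deduped = (
    events
    # Keep 10 minutes of state; events older than the watermark are evicted from state.
    .withWatermark("event_time", "10 minutes")
    # Including the event-time column in the keys lets Spark clean up the dedup state.
    .dropDuplicates(["event_id", "event_time"])
)

query = (
    deduped.writeStream
    .format("delta")
    .option("checkpointLocation", "/path/to/checkpoints/dedup")  # placeholder
    .start("/path/to/silver/events")                             # placeholder target
)
```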
StephanieAlba
by Databricks Employee
  • 5564 Views
  • 2 replies
  • 0 kudos

When would you not want to use autoloader?

I am genuinely curious: why would you ever not use Autoloader? I can see it for one-off downloads, of course. When you pull data from another platform, say Salesforce, is it better to append to a table without Autoloader? There must be cases I am missing. T...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

Autoloader is pretty handy, but not open source. That is one reason, for example. Another is if you cannot guarantee lexicographically generated files, or you do not want to use streaming, or you do not land your raw data into a data lake (read fr...

1 More Replies
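For context, a minimal Auto Loader ingestion sketch; the file format, paths, and target table name are assumptions, not a recommendation for any particular source:

```python
# `spark` is the session predefined in Databricks notebooks (Auto Loader is Databricks-only).
raw = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")                        # source format (assumption)
    .option("cloudFiles.schemaLocation", "/path/to/_schemas")   # placeholder schema-tracking path
    .load("/path/to/landing")                                   # placeholder landing folder
)

(
    raw.writeStream
    .option("checkpointLocation", "/path/to/_checkpoints/raw")  # placeholder
    .trigger(availableNow=True)    # process available files and stop; omit for continuous mode
    .toTable("bronze.raw_events")  # placeholder target table
)
```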
Ruby8376
by Valued Contributor
  • 4065 Views
  • 3 replies
  • 1 kudos

Resolved! DATABRICKS TO AZ SQL??

Hi All, quick question: Is this the correct data flow pattern: Databricks -> Az SQL -> Tableau? Or does it have to go through ADLS: Databricks -> ADLS -> Az SQL -> Tableau? Also, is it better to leverage the Databricks lakehouse SQL warehouse capability as ...

Latest Reply
-werners-
Esteemed Contributor III
  • 1 kudos

I would not call it 'better' per se. A lakehouse is a more modern approach to a classic data warehouse, using flexible distributed cloud compute, cheap storage and open file formats. If you have an existing environment, which works well, that is heavi...

2 More Replies
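On the first part of the question, a common way to push curated data from Databricks straight into Azure SQL is a plain Spark JDBC write; the server, database, table, and secret-scope names below are placeholders:

```python
# `spark` and `dbutils` are predefined in Databricks notebooks.
df = spark.table("gold.sales_summary")   # placeholder curated table

jdbc_url = (
    "jdbc:sqlserver://<server>.database.windows.net:1433;"
    "database=<database>;encrypt=true"
)

(
    df.write
    .format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.sales_summary")                             # placeholder target table
    .option("user", dbutils.secrets.get("my_scope", "sql-user"))        # placeholder secret scope/keys
    .option("password", dbutils.secrets.get("my_scope", "sql-password"))
    .mode("append")
    .save()
)
```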
eimis_pacheco
by Contributor
  • 7988 Views
  • 1 reply
  • 0 kudos

What are the Delta Live Tables limitations with relation to Unity Catalog?

Hi community! I was in a Databricks webinar and one of the participants said "Delta Live Tables seems to have some limitations when used with Unity Catalog. Is the idea to get parity with Hive?" and someone answered "DLT + Unity Catalog combination h...

Michael_Appiah
by Contributor II
  • 29564 Views
  • 1 reply
  • 1 kudos

Hashing Functions in PySpark

Hashes are commonly used in SCD2 merges to determine whether data has changed by comparing the hashes of the new rows in the source with the hashes of the existing rows in the target table. PySpark offers multiple different hashing functions like: MD5...

Latest Reply
Michael_Appiah
Contributor II
  • 1 kudos

Hi @Retired_mod, thank you for your comprehensive answer. What is your opinion on the trade-off between using a hash like xxHASH64, which returns a LongType column and thus would offer good performance when there is a need to join on the hash column, v...

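For reference, the functions being compared, with a placeholder source table and business columns; xxhash64 returns a compact LongType well suited to joins, while md5/sha2 return wider hex strings:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

df = spark.table("source.customers")                   # placeholder source table
business_cols = ["customer_id", "name", "address"]      # placeholder columns compared in the SCD2 merge

df_hashed = (
    df
    .withColumn("hash_xx64", F.xxhash64(*business_cols))                        # LongType
    .withColumn("hash_md5", F.md5(F.concat_ws("||", *business_cols)))           # 32-char hex string
    .withColumn("hash_sha256", F.sha2(F.concat_ws("||", *business_cols), 256))  # 64-char hex string
)
```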
kaleighspitz
by New Contributor
  • 1818 Views
  • 0 replies
  • 0 kudos

Delta Live Tables saving as corrupt files

Hello, I am using Delta Live Tables to store data and then trying to save them to ADLS. I've specified the storage location of the Delta Live Tables in my Delta Live Tables pipeline. However, when I check the files that are saved in ADLS, they are cor...

Data Engineering
Delta Live Tables
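Worth noting: what looks "corrupt" in ADLS is often just the normal Delta layout (Parquet part files plus a _delta_log directory), which is not meant to be opened file by file. A minimal DLT sketch with placeholder names, and an assumed path layout for reading the output back as Delta:

```python
import dlt

# `spark` is available inside a Delta Live Tables pipeline.
@dlt.table(name="events_bronze", comment="Raw events ingested by the pipeline")  # placeholder name
def events_bronze():
    return spark.read.format("json").load("/path/to/raw")   # placeholder source


# Outside the pipeline, read the stored output as a Delta table rather than
# opening individual part files (path layout under the storage location is an assumption):
# spark.read.format("delta").load("abfss://<container>@<account>.dfs.core.windows.net/<storage-location>/tables/events_bronze")
```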
jfarmer
by New Contributor II
  • 8515 Views
  • 3 replies
  • 1 kudos

PermissionError / Operation not Permitted with Files-in-Repos

I've been running a notebook using files-in-repo. Previously this has worked fine. I'm unsure what's changed (I was testing integration with DCS on older runtimes, but don't think I made any persistent changes)--but now it's throwing an error (always...

Latest Reply
_carleto_
New Contributor II
  • 1 kudos

Hi @jfarmer, did you solve this issue? I'm having exactly the same challenge. Thanks!

2 More Replies
Paval
by New Contributor
  • 1947 Views
  • 0 replies
  • 0 kudos

Failed to run the job on Databricks version LTS 9.x and 10.x (AWS)

Hi Team, when we tried to change the Databricks version from 7.3 to 9.x or 10.x, we got the below error: Caused by: java.lang.RuntimeException: MetaException(message:Unable to verify existence of default database: com.amazonaws.services.glue.model....

rp16
by New Contributor II
  • 3029 Views
  • 2 replies
  • 2 kudos

How can we create streaming tables as external delta tables?

We would like to introduce DLT streaming tables to our medallion architecture, but we are unable to create the streaming tables with the concerned schemas. Streaming tables don't have an option to be stored with custom schemas. The requirement we have ...

Latest Reply
Faisal
Contributor
  • 2 kudos

If Unity Catalog is used, tables under it are managed by default.

1 More Replies
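A sketch of pinning a DLT streaming table to an explicit location via the Python API's path argument; this applies to Hive-metastore pipelines, since (as the reply above notes) Unity Catalog pipelines create managed tables by default. Names, formats, and paths are placeholders:

```python
import dlt

# `spark` is available inside a Delta Live Tables pipeline.
@dlt.table(
    name="orders_raw",                                                  # placeholder table name
    path="abfss://bronze@<account>.dfs.core.windows.net/orders_raw",    # placeholder external location
)
def orders_raw():
    return (
        spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "json")     # assumption about the source format
        .load("/path/to/landing/orders")         # placeholder landing folder
    )
```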
nikhilkumawat
by New Contributor III
  • 5312 Views
  • 2 replies
  • 1 kudos

[INTERNAL_ERROR] Cannot generate code for expression: claimsconifer.default.decrypt_colA(

A column contains encrypted data at rest. I am trying to create a SQL function which will decrypt the data if the user is part of a particular group. Below is the function: %sql CREATE OR REPLACE FUNCTION test.default.decrypt_if_valid_user(col_a ST...

Latest Reply
nikhilkumawat
New Contributor III
  • 1 kudos

Hi @Retired_mod, after removing the "TABLE" keyword from the CREATE OR REPLACE statement, this function got registered as a built-in function. Just to verify, I displayed all the functions and I can see the function --> decrypt_if_valid_user: Now I am trying t...

1 More Replies
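A hedged sketch of the kind of function being built here, expressed via spark.sql; the group name, key handling, encoding, and source table are placeholders rather than the poster's actual setup (Databricks' is_account_group_member and aes_decrypt built-ins are used):

```python
# `spark` is the session predefined in Databricks notebooks.
spark.sql("""
CREATE OR REPLACE FUNCTION test.default.decrypt_if_valid_user(col_a STRING)
RETURNS STRING
RETURN CASE
  WHEN is_account_group_member('decrypt_users')                               -- placeholder group name
    THEN CAST(aes_decrypt(unbase64(col_a), '<16/24/32-byte key>') AS STRING)  -- placeholder key
  ELSE '*** REDACTED ***'
END
""")

# Example call against a placeholder table/column:
spark.sql("SELECT test.default.decrypt_if_valid_user(encrypted_col) AS col_a FROM some_table").show()
```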
