Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Bie1234
by New Contributor III
  • 5155 Views
  • 3 replies
  • 4 kudos

Resolved! How to delete records whose column values match another table?

delete from DWH.SALES_FACT where SALES_DATE in (select SALES_DATE from STG.SALES_FACT_SRC) AND STORE_ID in (select STORE_ID from STG.SALES_FACT_SRC)
Output: Error in SQL statement: DeltaAnalysisException: Nested subquery is not supported in the...
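For readers hitting the same nested-subquery limitation: a minimal sketch of one common workaround, rewriting the deletion as a MERGE. It assumes the table names from the post and that SALES_DATE and STORE_ID together identify the rows to drop.

```python
# Sketch: delete matching rows via MERGE instead of DELETE ... IN (subquery).
# Assumes a Databricks notebook where `spark` is defined and both tables are Delta.
spark.sql("""
  MERGE INTO DWH.SALES_FACT AS t
  USING (SELECT DISTINCT SALES_DATE, STORE_ID FROM STG.SALES_FACT_SRC) AS s
    ON t.SALES_DATE = s.SALES_DATE AND t.STORE_ID = s.STORE_ID
  WHEN MATCHED THEN DELETE
""")
```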

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @pansiri panaudom, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. T...

2 More Replies
dhanu
by New Contributor
  • 1881 Views
  • 2 replies
  • 0 kudos

Fatal error: Python kernel is unresponsive

I have submitted around 90 jobs at a time to Databricks. The jobs were running continuously for 2 hours; after that I am getting the fatal error "Python kernel is unresponsive". I am using Databricks Runtime version 11.2. Cluster configuration details are given...
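If the 90 jobs are notebook runs launched in parallel from one driver, a common way to relieve the Python kernel is to cap concurrency rather than launch everything at once. A minimal sketch; the notebook path, parameter sets, worker count, and timeout are hypothetical.

```python
# Sketch: throttle parallel notebook runs so the driver's Python kernel is not
# overwhelmed. `dbutils` is provided by the Databricks notebook environment.
from concurrent.futures import ThreadPoolExecutor

notebook_path = "/Repos/project/etl_job"                # hypothetical child notebook
param_sets = [{"batch_id": str(i)} for i in range(90)]  # 90 parameter sets

def run_one(params):
    # 3600 = per-run timeout in seconds; returns the child's dbutils.notebook.exit() value
    return dbutils.notebook.run(notebook_path, 3600, params)

# Run at most 8 notebooks at a time instead of all 90 at once.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(run_one, param_sets))
```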

Latest Reply
jose_gonzalez
Databricks Employee
  • 0 kudos

Hi @Dhanaraj Jogihalli, just a friendly follow-up. Did any of the responses help you to resolve your question? If so, please mark it as best. Otherwise, please let us know if you still need help.

1 More Replies
rajalakshmi9394
by New Contributor II
  • 3919 Views
  • 3 replies
  • 4 kudos

Resolved! QUERY_RESULT_ROWS without first row as trigger in SQL Alerts of Databricks

Hi Team - In Azure Databricks SQL alerts, I was able to use QUERY_RESULT_ROWS only if I select the trigger as the first row. Is there a possibility to get the count of the number of rows and also the query result (both rows and columns to display the data a...
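One pattern that can help here, sketched below rather than offered as an official feature: add the total row count to the query itself as a window-function column, so the first row (which the alert trigger inspects) already carries the count. The table and columns are hypothetical; the same SELECT can back the alert.

```python
# Sketch: expose the total row count on every row via COUNT(*) OVER ().
# Shown through spark.sql; the SQL string itself is what the alert would run.
df = spark.sql("""
  SELECT
    COUNT(*) OVER () AS total_rows,   -- same value on every row, incl. the first
    order_id,
    error_message
  FROM failed_orders                  -- hypothetical table
""")
df.show()
```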

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Rajalakshmi Amara, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell u...

2 More Replies
Ajay-Pandey
by Esteemed Contributor III
  • 1834 Views
  • 1 replies
  • 6 kudos

Cluster policies now support limiting the max number of clusters a user can create

Cluster policies now support limiting the max number of clusters a user can create. Policy permissions allow you to set a max number of clusters per user. This determines how many clusters a user can create using that policy. If the user exceeds the ...
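For reference, a rough sketch of setting this cap through the Cluster Policies REST API. The endpoint and the max_clusters_per_user field are written as I understand the current API; treat them as assumptions and confirm against the docs for your workspace.

```python
# Sketch: create a cluster policy that limits how many clusters each user can create.
# DATABRICKS_HOST and DATABRICKS_TOKEN are assumed environment variables.
import json
import os
import requests

host = os.environ["DATABRICKS_HOST"]
token = os.environ["DATABRICKS_TOKEN"]

policy = {
    "name": "small-clusters-per-user",
    # Policy definition restricting cluster shape (illustrative rule).
    "definition": json.dumps({
        "autoscale.max_workers": {"type": "range", "maxValue": 4},
    }),
    # Per-user cluster cap described in the post (assumed field name).
    "max_clusters_per_user": 2,
}

resp = requests.post(
    f"{host}/api/2.0/policies/clusters/create",
    headers={"Authorization": f"Bearer {token}"},
    json=policy,
)
resp.raise_for_status()
print(resp.json())  # expected to return the new policy_id
```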

Latest Reply
jose_gonzalez
Databricks Employee
  • 6 kudos

Thank you for sharing

Ajay-Pandey
by Esteemed Contributor III
  • 3125 Views
  • 1 replies
  • 6 kudos

Variable explorer in Databricks

Variable explorer in Databricks: With Databricks Runtime 12.1 and above, you can directly observe current Python variables in the notebook UI. To open the variable explorer, click the variable explorer icon in the right sidebar. The variable explorer opens, showing the value and ...

Latest Reply
jose_gonzalez
Databricks Employee
  • 6 kudos

Thank you for sharing

mickniz
by Contributor
  • 2265 Views
  • 1 replies
  • 1 kudos

ErrorClass=DAC_DOES_NOT_EXIST]

While creating an external table in a Unity Catalog-enabled workspace, I am getting the below error: "Data access configuration for metastore does not exist." I can see the data access configuration is there. Can anyone let me know if I am missing anything here?
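For context, an external table in Unity Catalog normally sits on top of a storage credential and an external location that covers the table path; the DAC error itself usually points at the metastore's data access configuration rather than the DDL. A rough sketch with entirely hypothetical credential, location, catalog, and path names:

```python
# Sketch: external location plus external table in Unity Catalog.
# All names and the abfss path below are placeholders.
spark.sql("""
  CREATE EXTERNAL LOCATION IF NOT EXISTS sales_landing
  URL 'abfss://landing@mystorageacct.dfs.core.windows.net/sales'
  WITH (STORAGE CREDENTIAL my_storage_credential)
""")

spark.sql("""
  CREATE TABLE IF NOT EXISTS main.bronze.sales_raw (
    id BIGINT,
    amount DOUBLE
  )
  LOCATION 'abfss://landing@mystorageacct.dfs.core.windows.net/sales/sales_raw'
""")
```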

Latest Reply
jose_gonzalez
Databricks Employee
  • 1 kudos

Could you share the full error stack trace? How do you create the table? Please provide more details so we can help you find a solution.

chanansh
by Contributor
  • 1754 Views
  • 1 replies
  • 0 kudos

QueryExecutionListener cannot be found in pyspark

According to the documentation, you can monitor a Spark Structured Streaming job using QueryExecutionListener. However, I cannot find it. https://docs.databricks.com/structured-streaming/stream-monitoring.html#language-python
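For what it's worth, QueryExecutionListener is a JVM-side (Scala/Java) API; the Python-facing route for stream monitoring is typically StreamingQueryListener. A minimal sketch, assuming a runtime recent enough to expose it in pyspark.sql.streaming:

```python
# Sketch: log streaming progress from Python with StreamingQueryListener.
from pyspark.sql.streaming import StreamingQueryListener


class ProgressLogger(StreamingQueryListener):
    def onQueryStarted(self, event):
        print(f"Query started: {event.id}")

    def onQueryProgress(self, event):
        # event.progress carries batch duration, input/processed rates, etc.
        print(f"Rows/sec: {event.progress.processedRowsPerSecond}")

    def onQueryTerminated(self, event):
        print(f"Query terminated: {event.id}")


# Register on the active session (`spark` is predefined in Databricks notebooks).
spark.streams.addListener(ProgressLogger())
```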

Latest Reply
jose_gonzalez
Databricks Employee
  • 0 kudos

Which DBR version are you using? Also, can you share a code snippet of how you are using the QueryExecutionListener?

ackerman_chris
by New Contributor III
  • 3236 Views
  • 0 replies
  • 0 kudos

Azure Devops Git sync failed in Azure Databricks

Hello, I am currently attempting to set up a Git repo within Azure DevOps to use in my Azure Databricks workspace environment for various notebooks. I went through the process of creating a Personal Access Token (PAT) in DevOps, and have inputted the t...
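If the PAT itself is valid, one thing worth double-checking is how it is registered with the workspace. A rough sketch using what I believe is the Git Credentials REST API; the provider value and field names are assumptions to confirm against the docs.

```python
# Sketch: register an Azure DevOps PAT as the workspace Git credential.
import os
import requests

host = os.environ["DATABRICKS_HOST"]
token = os.environ["DATABRICKS_TOKEN"]

resp = requests.post(
    f"{host}/api/2.0/git-credentials",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "git_provider": "azureDevOpsServices",  # assumed provider identifier
        "git_username": "user@contoso.com",     # hypothetical DevOps username
        "personal_access_token": os.environ["ADO_PAT"],
    },
)
resp.raise_for_status()
print(resp.json())
```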

Mado
by Valued Contributor II
  • 13887 Views
  • 4 replies
  • 3 kudos

Resolved! Databricks Audit Logs, What is "dataSourceId"?

Hi, I want to access the Databricks audit logs to check user activity. I created a Databricks workspace on the Premium pricing tier. I configured audit logs to be sent to Azure diagnostic log delivery. What I got in the Log Analytics workspace: I hav...

Latest Reply
youssefmrini
Databricks Employee
  • 3 kudos

The data_source_id field specifies the id of the SQL warehouse against which this query will run. You can use the Data Sources API to see a complete list of available SQL warehouses.
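To put that into practice, a small sketch that lists data sources so a dataSourceId from the audit log can be mapped back to a warehouse; the endpoint path is the legacy SQL Data Sources API as I recall it, so confirm it against your workspace's docs.

```python
# Sketch: map data_source_id values from audit logs to SQL warehouse names.
import os
import requests

host = os.environ["DATABRICKS_HOST"]
token = os.environ["DATABRICKS_TOKEN"]

resp = requests.get(
    f"{host}/api/2.0/preview/sql/data_sources",
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()

for ds in resp.json():
    # Each entry ties a data source id to the warehouse it represents.
    print(ds.get("id"), "->", ds.get("name"), ds.get("warehouse_id"))
```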

3 More Replies
Jkb
by New Contributor II
  • 4110 Views
  • 0 replies
  • 1 kudos

Workflow triggered by CLI shown "manually" triggered

We trigger different Workflows from ADF. These workflows are shown as triggered "manually". Is this behaviour intentional? At least for users, this is confusing. ADF-triggered run: [screenshot] Databricks Workflows: [screenshot]

Twilight
by Contributor
  • 4026 Views
  • 2 replies
  • 0 kudos

How to make backreferences in regexp_replace repl string work correctly in Databricks SQL?

Both of these work in Spark SQL:
regexp_replace('1234567890abc', '^(?<one>\\w)(?<two>\\w)(?<three>\\w)', '$1')
regexp_replace('1234567890abc', '^(?<one>\\w)(?<two>\\w)(?<three>\\w)', '${one}')
However, neither works in Databricks SQL. I found that this ...

Latest Reply
User16764241763
Honored Contributor
  • 0 kudos

Hello @Stephen Wilcoxon, could you please share the expected output in Spark SQL?

1 More Replies
Ria
by New Contributor
  • 1861 Views
  • 1 replies
  • 1 kudos

py4j.security.Py4JSecurityException

Getting this error while loading data with Auto Loader. Although table access control is already disabled, I am still getting this error: "py4j.security.Py4JSecurityException: Method public org.apache.spark.sql.streaming.DataStreamReader org.apache.spark.sql...
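For reference, this is the kind of Auto Loader call that goes through the DataStreamReader method named in the error; the exception itself usually comes from method whitelisting on clusters with table access control or credential passthrough rather than from the code. Paths and table name below are hypothetical.

```python
# Sketch: minimal Auto Loader stream of the sort that invokes DataStreamReader.
df = (
    spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", "/mnt/landing/_schemas/events")
        .load("/mnt/landing/events")
)

(
    df.writeStream
        .option("checkpointLocation", "/mnt/landing/_checkpoints/events")
        .trigger(availableNow=True)
        .toTable("bronze.events")
)
```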

Latest Reply
jose_gonzalez
Databricks Employee
  • 1 kudos

Hi, are you using a High Concurrency cluster? Which DBR version are you running?

lurban
by New Contributor II
  • 2099 Views
  • 1 replies
  • 0 kudos

Delta Live Tables Development Mode Resets Cluster On Each Trigger

I believe this is an identified bug, but in the last few days, each time I trigger a test Delta Live Tables run in development mode, the associated cluster takes 5-7 minutes to spin up each time. The cluster does stay on as anticipated in the comp...
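For anyone comparing configurations, development mode is driven by the development flag in the pipeline settings, and in development mode the pipeline cluster is meant to stay warm between triggers instead of being recreated. A minimal sketch of such a settings payload; the name, notebook path, target, and cluster size are hypothetical.

```python
# Sketch: minimal Delta Live Tables pipeline settings with development mode on.
pipeline_settings = {
    "name": "my-dlt-pipeline",                     # hypothetical
    "development": True,                           # development vs. production mode
    "libraries": [{"notebook": {"path": "/Repos/project/dlt_pipeline"}}],
    "clusters": [{"label": "default", "num_workers": 2}],
    "target": "bronze",                            # hypothetical target schema
}
# This dict is the shape of what gets sent to the Pipelines API when creating
# or editing the pipeline.
```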

Latest Reply
jose_gonzalez
Databricks Employee
  • 0 kudos

Hi, can you share your cluster JSON settings? It will help us to understand the settings and VMs you are using.

manasa
by Contributor
  • 5606 Views
  • 3 replies
  • 1 kudos

Need help to insert huge data into cosmos db from azure data lake storage using databricks

I am trying to insert 6 GB of data into Cosmos DB using the OLTP connector.
Container RUs: 40000
Cluster config:
cfg = { "spark.cosmos.accountEndpoint" : cosmosdbendpoint, "spark.cosmos.accountKey" : cosmosdbmasterkey, "spark.cosmos.database" : cosmosd...
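To complete the truncated config above, a sketch of how a bulk write with the Spark 3 OLTP connector typically looks; the option names come from the azure-cosmos-spark connector, and the endpoint, key, database, and container values are placeholders.

```python
# Sketch: bulk write a DataFrame to Cosmos DB via the Spark 3 OLTP connector.
# Assumes the azure-cosmos-spark connector is installed on the cluster and
# `df` is the DataFrame read from ADLS.
cfg = {
    "spark.cosmos.accountEndpoint": "https://<account>.documents.azure.com:443/",
    "spark.cosmos.accountKey": "<master-key>",
    "spark.cosmos.database": "<database>",
    "spark.cosmos.container": "<container>",
    # Settings commonly used for large ingestion jobs.
    "spark.cosmos.write.strategy": "ItemOverwrite",
    "spark.cosmos.write.bulk.enabled": "true",
}

(
    df.write
      .format("cosmos.oltp")
      .options(**cfg)
      .mode("APPEND")
      .save()
)
```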

Latest Reply
ImAbhishekTomar
New Contributor III
  • 1 kudos

Did anyone find a solution for this? I'm also using a similar cluster and RUs, and data ingestion is taking a lot of time…

2 More Replies
