Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Ajay-Pandey
by Esteemed Contributor III
  • 1655 Views
  • 1 replies
  • 6 kudos

Cluster policies now support limiting the max number of clusters a user can create

Policy permissions allow you to set a max number of clusters per user. This determines how many clusters a user can create using that policy. If the user exceeds the ...
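For readers who want to try this, a minimal sketch of the policy payload might look like the following. The field names are taken from the Cluster Policies API (`max_clusters_per_user` alongside the JSON `definition` string); the policy name and rule values are hypothetical placeholders.

```python
import json

# Sketch of a payload for the Cluster Policies API
# (POST /api/2.0/policies/clusters/create -- check your workspace's API docs).
# "definition" is a JSON string of policy rules; "max_clusters_per_user"
# caps how many clusters one user may create from this policy.
policy = {
    "name": "limited-clusters-per-user",       # hypothetical policy name
    "max_clusters_per_user": 2,
    "definition": json.dumps({
        "spark_version": {"type": "unlimited"},
        "autotermination_minutes": {"type": "fixed", "value": 30},
    }),
}
print(policy["max_clusters_per_user"])
```

A user who already has two clusters created from this policy would then be blocked from creating a third with it.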

Latest Reply
jose_gonzalez
Databricks Employee
  • 6 kudos

Thank you for sharing

Ajay-Pandey
by Esteemed Contributor III
  • 2809 Views
  • 1 replies
  • 6 kudos

Variable explorer in Databricks

With Databricks Runtime 12.1 and above, you can directly observe current Python variables in the notebook UI. To open the variable explorer, click in the right sidebar. The variable explorer opens, showing the value and ...

Latest Reply
jose_gonzalez
Databricks Employee
  • 6 kudos

Thank you for sharing

mickniz
by Contributor
  • 2149 Views
  • 1 replies
  • 1 kudos

ErrorClass=DAC_DOES_NOT_EXIST

While creating an external table in a Unity Catalog enabled workspace, I am getting the error below: Data access configuration for metastore does not exist. I can see that the data access configuration is there. Can anyone let me know if I am missing anything here?
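For context, a minimal external-table DDL of the kind the poster describes might look like the sketch below. The catalog, schema, table, columns, and storage path are all hypothetical placeholders; it assumes an admin has already created a storage credential and an external location covering the target path.

```python
# Hypothetical external-table DDL for a Unity Catalog enabled workspace.
# All names and the abfss:// path are placeholders, not from the post.
create_stmt = """
CREATE TABLE main.sales.orders_ext (
  order_id INT,
  amount   DOUBLE
)
USING DELTA
LOCATION 'abfss://data@mystorageacct.dfs.core.windows.net/orders'
"""

# In a notebook this would be executed with: spark.sql(create_stmt)
print("LOCATION" in create_stmt)
```

If the metastore has no data access configuration (the assigned storage credential), statements that reference an external `LOCATION` like this one are a typical place for the DAC error to surface.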

Latest Reply
jose_gonzalez
Databricks Employee
  • 1 kudos

Could you share the full error stack trace? How do you create the table? Please provide more details so we can help you find a solution.

chanansh
by Contributor
  • 1623 Views
  • 1 replies
  • 0 kudos

QueryExecutionListener cannot be found in pyspark

According to the documentation, you can monitor a Spark Structured Streaming job using QueryExecutionListener. However, I cannot find it in PySpark. https://docs.databricks.com/structured-streaming/stream-monitoring.html#language-python

Latest Reply
jose_gonzalez
Databricks Employee
  • 0 kudos

Which DBR version are you using? Also, can you share a code snippet showing how you are using the QueryExecutionListener?

ackerman_chris
by New Contributor III
  • 3129 Views
  • 0 replies
  • 0 kudos

Azure Devops Git sync failed in Azure Databricks

Hello, I am currently attempting to set up a Git repo within Azure DevOps to use in my Azure Databricks workspace environment for various notebooks. I went through the process of creating a Personal Access Token (PAT) on DevOps and have inputted the t...

Mado
by Valued Contributor II
  • 12987 Views
  • 4 replies
  • 3 kudos

Resolved! Databricks Audit Logs, What is "dataSourceId"?

Hi, I want to access the Databricks audit logs to check user activity. I created a Databricks workspace on the Premium pricing tier and configured audit logs to be sent via Azure diagnostic log delivery. What I got in the Log Analytics workspace: I hav...

Latest Reply
youssefmrini
Databricks Employee
  • 3 kudos

The data_source_id field specifies the id of the SQL warehouse against which this query will run. You can use the Data Sources API to see a complete list of available SQL warehouses.
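To make the reply concrete, here is a minimal sketch of calling the legacy Data Sources API to map `data_source_id` values to SQL warehouses. The endpoint path (`/api/2.0/preview/sql/data-sources`) and the workspace host and token shown are assumptions/placeholders; confirm them against your workspace's API documentation.

```python
import json
import urllib.request


def data_sources_request(host: str, token: str) -> urllib.request.Request:
    # Assumed legacy endpoint: GET /api/2.0/preview/sql/data-sources
    return urllib.request.Request(
        f"{host}/api/2.0/preview/sql/data-sources",
        headers={"Authorization": f"Bearer {token}"},
    )


def list_data_sources(host: str, token: str) -> list:
    # Returns the list of data sources; each entry pairs a data source "id"
    # (the data_source_id seen in audit logs) with its SQL warehouse.
    with urllib.request.urlopen(data_sources_request(host, token)) as resp:
        return json.load(resp)


# Hypothetical workspace URL and token -- replace with your own.
req = data_sources_request("https://adb-123.4.azuredatabricks.net", "dapi-XXXX")
print(req.full_url)
```

Cross-referencing the returned list against the `data_source_id` in the audit log identifies which warehouse ran a given query.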

3 More Replies
Jkb
by New Contributor II
  • 3889 Views
  • 0 replies
  • 1 kudos

Workflow triggered by CLI shown "manually" triggered

We trigger different workflows from ADF. These workflows are shown as triggered "manually". Is this behaviour intentional? At least for users, this is confusing.

Twilight
by New Contributor III
  • 3659 Views
  • 2 replies
  • 0 kudos

How to make backreferences in regexp_replace repl string work correctly in Databricks SQL?

Both of these work in Spark SQL: regexp_replace('1234567890abc', '^(?<one>\\w)(?<two>\\w)(?<three>\\w)', '$1') and regexp_replace('1234567890abc', '^(?<one>\\w)(?<two>\\w)(?<three>\\w)', '${one}'). However, neither works in Databricks SQL. I found that this ...
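As a point of comparison for the intended behavior: Spark's regexp_replace uses Java-regex replacement syntax (`$1`, `${one}`), while Python's `re` module uses `\1` and `\g<name>`. This hedged sketch reproduces the poster's example in plain Python, keeping only the first of the first three characters.

```python
import re

s = "1234567890abc"
pattern = r"^(?P<one>\w)(?P<two>\w)(?P<three>\w)"

# Numbered backreference: replace the first three chars with group 1 only
out_numbered = re.sub(pattern, r"\1", s)

# Named backreference: same result via the group name
out_named = re.sub(pattern, r"\g<one>", s)

print(out_numbered)  # 14567890abc
print(out_named)     # 14567890abc
```

Whatever syntax Databricks SQL ultimately accepts, the expected output of both forms is the string with characters two and three of the match dropped.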

Latest Reply
User16764241763
Honored Contributor
  • 0 kudos

Hello @Stephen Wilcoxon​ Could you please share the expected output in Spark SQL?

1 More Reply
Ria
by New Contributor
  • 1632 Views
  • 1 replies
  • 1 kudos

py4j.security.Py4JSecurityException

I am getting this error while loading data with Auto Loader. Although table access control is already disabled, I am still getting the error: "py4j.security.Py4JSecurityException: Method public org.apache.spark.sql.streaming.DataStreamReader org.apache.spark.sql...

Latest Reply
jose_gonzalez
Databricks Employee
  • 1 kudos

Hi, are you using a High Concurrency cluster? Which DBR version are you running?

lurban
by New Contributor II
  • 1964 Views
  • 1 replies
  • 0 kudos

Delta Live Tables Development Mode Resets Cluster On Each Trigger

I believe this is a bug: in the last few days, each time I trigger a test Delta Live Tables run in development mode, the associated cluster takes 5-7 minutes to spin up. The cluster does stay on as anticipated in the comp...

Latest Reply
jose_gonzalez
Databricks Employee
  • 0 kudos

Hi, can you share your cluster JSON settings? It will help us understand the settings and VMs you are using.

manasa
by Contributor
  • 5261 Views
  • 3 replies
  • 1 kudos

Need help to insert huge data into cosmos db from azure data lake storage using databricks

I am trying to insert 6 GB of data into Cosmos DB using the OLTP connector. Container RUs: 40000. Cluster config: cfg = { "spark.cosmos.accountEndpoint" : cosmosdbendpoint, "spark.cosmos.accountKey" : cosmosdbmasterkey, "spark.cosmos.database" : cosmosd...
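The truncated config resembles the Azure Cosmos DB Spark 3 OLTP connector options. A fuller sketch with bulk writes enabled (often the main lever for large loads) might look like the following; the endpoint, key, database, and container values are hypothetical placeholders, and the option names are assumed from that connector.

```python
# Sketch of write options for the Azure Cosmos DB Spark 3 OLTP connector.
# Endpoint/key/database/container values are hypothetical placeholders.
cfg = {
    "spark.cosmos.accountEndpoint": "https://myaccount.documents.azure.com:443/",
    "spark.cosmos.accountKey": "<master-key>",
    "spark.cosmos.database": "mydb",
    "spark.cosmos.container": "mycontainer",
    # Bulk mode batches requests and is usually much faster for large loads
    "spark.cosmos.write.bulk.enabled": "true",
    "spark.cosmos.write.strategy": "ItemOverwrite",
}

# In a notebook the write would then be something like:
#   df.write.format("cosmos.oltp").options(**cfg).mode("APPEND").save()
print(len(cfg))
```

With only 40000 RUs provisioned, throughput throttling on the container side is another likely bottleneck regardless of connector settings.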

Latest Reply
ImAbhishekTomar
New Contributor III
  • 1 kudos

Did anyone find a solution for this? I'm also using a similar cluster and RUs, and data ingestion is taking a lot of time.

2 More Replies
youssefmrini
by Databricks Employee
  • 1451 Views
  • 1 replies
  • 2 kudos
Latest Reply
Sivaprasad1
Valued Contributor II
  • 2 kudos

@Youssef Mrini​: Please have a look at the link below, which lists the Databricks resource limits: https://docs.databricks.com/resources/limits.html

VictoriaM
by New Contributor II
  • 1528 Views
  • 2 replies
  • 0 kudos

@Chris Grabiel​  Do you have any experience connecting REDCAP API to Databricks you would be able to share?


Latest Reply
Chris_Grabiel
New Contributor III
  • 0 kudos

We absolutely do. We ingest to the lake via the REDCap API, and folks use it in notebooks. How can we help?

1 More Reply
JRT5933
by New Contributor III
  • 3661 Views
  • 4 replies
  • 7 kudos

Resolved! GOLD table slowed down at MERGE INTO

Howdy - I recently took a table FACT_TENDER and made it into a medallion-style table to test performance, since I suspected medallion would be quicker. Key differences: both tables use bronze data; the original has all logic in one long notebook; MERGE INTO t...

Latest Reply
JRT5933
New Contributor III
  • 7 kudos

I ended up instituting partitioning and pruning methods to boost performance, which has succeeded.

3 More Replies
