Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

sgreenuk
by New Contributor
  • 152 Views
  • 1 replies
  • 0 kudos

Orphaned __dlt_materialization schemas left behind after dropping materialized views

Hi everyone, I’m seeing several internal schemas under the __databricks_internal catalog that were auto-created when I built a few materialized views in Databricks SQL. However, after dropping the materialized views, the schemas were not automatically...

Latest Reply
nayan_wylde
Honored Contributor III
  • 0 kudos

Yes, this is expected behavior in Databricks. The __databricks_internal catalog contains system-owned schemas that support features like materialized views and Delta Live Tables (DLT). When you create materialized views, Databricks generates internal...

pranaav93
by New Contributor II
  • 87 Views
  • 1 replies
  • 1 kudos

Databricks Compute Metrics Alerts

Hi All, I'm looking for implementation ideas where I can use information from the system.compute.node_timeline table to catch memory spikes and, if above a given threshold, restart the cluster through an API call. Have any of you implemented a simil...

Latest Reply
NandiniN
Databricks Employee
  • 1 kudos

Hey @pranaav93, a very common use case for the system table system.compute.node_timeline is building alerting and remediation. Check this KB: https://kb.databricks.com/en_US/clusters/getting-node-specific-instead-of-cluster-wide-memory-usage-data-from-...

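A minimal sketch of the pattern discussed in this thread: query system.compute.node_timeline for recent peak memory usage per cluster, then restart any cluster above a threshold. The query shape, the 10-minute window, and the 90% threshold are illustrative assumptions; verify the column names against the system tables reference before relying on it.

```python
# Sketch (assumptions noted above): find clusters whose recent peak
# memory usage exceeds a threshold, using system.compute.node_timeline.
PEAK_QUERY = """
SELECT cluster_id,
       MAX(mem_used_percent) AS peak_mem_pct
FROM system.compute.node_timeline
WHERE start_time >= now() - INTERVAL 10 MINUTES
GROUP BY cluster_id
"""

def clusters_to_restart(rows, threshold_pct=90.0):
    """Given (cluster_id, peak_mem_pct) pairs, return cluster ids at or above the threshold."""
    return [cid for cid, peak in rows if peak >= threshold_pct]

# In a scheduled Databricks job you might then run:
#   rows = [(r.cluster_id, r.peak_mem_pct) for r in spark.sql(PEAK_QUERY).collect()]
#   for cid in clusters_to_restart(rows):
#       ...call the Clusters REST API to restart cid...
```

Running this on a short job schedule approximates the alert-and-remediate loop the KB article describes.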
vpacik
by New Contributor
  • 2203 Views
  • 1 replies
  • 0 kudos

Databricks-connect OpenSSL Handshake failed on WSL2

When trying to set up databricks-connect on WSL2 using a 13.3 cluster, I receive the following error regarding OpenSSL CERTIFICATE_VERIFY_FAILED. The authentication is done via the SPARK_REMOTE env variable. E0415 11:24:26.646129568 142172 ssl_transport_sec...

Latest Reply
ez
New Contributor II
  • 0 kudos

@vpacik Was it solved? I have the same issue

Hritik_Moon
by New Contributor II
  • 333 Views
  • 7 replies
  • 3 kudos

Resolved! create delta table in free edition

table_name = f"project.bronze.{file_name}"
spark.sql(
    f"""
    CREATE TABLE IF NOT EXISTS {table_name}
    USING DELTA
    """
)
what am I getting wrong?

Latest Reply
Hritik_Moon
New Contributor II
  • 3 kudos

Yes, multiline solved it. Is there any better approach to this scenario?

6 More Replies
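A small sketch of the fix the thread landed on: build the CREATE TABLE statement as a proper multi-line f-string before passing it to spark.sql. The catalog/schema and the column list here are illustrative additions, not from the original post.

```python
# Sketch of the multiline CREATE TABLE pattern from the thread.
# "project.bronze" and the columns are placeholder assumptions.
def build_create_table_sql(file_name: str) -> str:
    table_name = f"project.bronze.{file_name}"
    return f"""
        CREATE TABLE IF NOT EXISTS {table_name} (
            id BIGINT,
            value STRING
        )
        USING DELTA
    """

# On a cluster you would then execute:
#   spark.sql(build_create_table_sql("sales"))
```

Keeping the statement in a helper also makes it easy to log or unit-test the generated SQL before running it.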
B_Stam
by New Contributor II
  • 105 Views
  • 1 replies
  • 2 kudos

Resolved! Set default tblproperties for pipeline

I'd like to set tblproperties ("delta.feature.timestampNtz" = "supported") for all tables in a pipeline, instead of setting this option for every table definition. The property must be set directly on creation. I have tried it in the pipeline settings - conf...

Latest Reply
ManojkMohan
Honored Contributor
  • 2 kudos

Databricks does not allow you to set a global default for all TBLPROPERTIES. However, you can use the spark.databricks.delta.properties.defaults configuration key to set defaults for new Delta tables created in a specific session or pipeline. If you w...

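A hedged sketch of the session-default approach from the reply: new Delta tables pick up properties set under keys of the form spark.databricks.delta.properties.defaults.&lt;property&gt;. Whether a table feature like delta.feature.timestampNtz can be defaulted this way is an assumption here; check the Delta table properties reference for your DBR version.

```python
# Sketch (assumption: feature properties follow the same defaults key
# pattern as ordinary delta.* table properties).
DEFAULTS_PREFIX = "spark.databricks.delta.properties.defaults."

def default_property_key(prop: str) -> str:
    """Map a delta.<prop> table property to its session-default config key."""
    return DEFAULTS_PREFIX + prop

# In a notebook or pipeline configuration you would then set:
#   spark.conf.set(default_property_key("feature.timestampNtz"), "supported")
# and Delta tables created afterwards in that session inherit the property.
```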
sumitkumar_284
by New Contributor
  • 209 Views
  • 3 replies
  • 1 kudos

Not able to refresh Power BI dashboard from Databricks jobs

I am trying to refresh a Power BI dashboard using Databricks jobs and constantly get this error, even though I am providing the optional parameters, which include catalog and database. Also, note that I am able to refresh in the Power BI UI using both...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @sumitkumar_284, can you give us more details? Are you using Unity Catalog? Which authentication mechanism do you have? In which version of Power BI Desktop did you develop your semantic model/dashboard? Do you meet all of the below requirements? Publish...

2 More Replies
donlxz
by New Contributor III
  • 145 Views
  • 2 replies
  • 3 kudos

Resolved! Error occurs on create materialized view with spark.sql

When creating a materialized view with the spark.sql function, it returns the following error message: [MATERIALIZED_VIEW_OPERATION_NOT_ALLOWED.MV_NOT_ENABLED] The materialized view operation CREATE is not allowed: Materialized view features are not enabled for ...

Latest Reply
donlxz
New Contributor III
  • 3 kudos

Hi @szymon_dybczak, thank you for your response. You're right, it was mentioned in the documentation; I missed it when checking. I understand now that it's not possible to do this with spark.sql. Thanks for clarifying!

1 More Replies
fellipeao
by New Contributor III
  • 2061 Views
  • 9 replies
  • 3 kudos

Resolved! How to create parameters that works in Power BI Report Builder (SSRS)

Hello! I'm trying to create an item in Power BI Report Server (SSRS) connected to Databricks. I can connect normally, but I'm having trouble using a parameter that Databricks recognizes. First, I'll illustrate what I do when I connect to SQL Server and...

Latest Reply
J-Usef
New Contributor II
  • 3 kudos

@fellipeao This is the only way I found that works well with Databricks, since positional arguments (?) were a fail for me. This is the latest version of Paginated Report Builder: https://learn.microsoft.com/en-us/power-bi/paginated-reports/report-build...

8 More Replies
SuMiT1
by New Contributor III
  • 250 Views
  • 10 replies
  • 3 kudos

Workspace got disabled

Hi everyone, I was creating a database linked service in ADF, but I got an error: "unauthorized network access to workspace." After that, I went to ADB networking and changed the public setting to "enable"; previously, it was disabled. I think it is b...

Latest Reply
nayan_wylde
Honored Contributor III
  • 3 kudos

Also adding to @szymon_dybczak's reply: you can refer to this GitHub page: https://github.com/databricks/terraform-databricks-sra/tree/main/azure It has all the Bicep and Terraform templates, if you are deploying a secured workspace that has security c...

9 More Replies
vim17
by New Contributor II
  • 489 Views
  • 4 replies
  • 0 kudos

Databricks Delta MERGE fails with row filter — “Cannot find column index for attribute 'account_id'”

Problem: I’m getting the below error when performing a MERGE (or any other DML command) on a Delta table with a row filter in Databricks. Error: Cannot find column index for attribute 'account_id#48219' in: Map(transaction_id#47260 -> 5, file_path#4725...

Latest Reply
Amruth_Ashok
Databricks Employee
  • 0 kudos

Hi @vim17, I see "partitionValues_parsed#47264" in the Error trace. Is the table partitioned, by any chance? Which DBR version are you using?

3 More Replies
ext07_rvoort
by New Contributor II
  • 132 Views
  • 2 replies
  • 1 kudos

Databricks Asset Bundles: issue with python_file path in spark_python_task

Hi, I am trying to run a Python file which is stored in the src folder. However, I am getting the following error: Error: cannot update job: Invalid python file reference: src/get_git_credentials.py. Please visit the Databricks user guide for supported py...

Latest Reply
ext07_rvoort
New Contributor II
  • 1 kudos

Hi @szymon_dybczak, thanks for your reply. The issue is related to the fact that I am also providing git_source parameters in this job (even though the source parameter of my task is set to WORKSPACE). When I comment out the git_source part, specifying the pat...

1 More Replies
SahabazKhan
by New Contributor
  • 72 Views
  • 1 replies
  • 0 kudos

Unable to login to community edition

Hi All, I am not able to log in to "Databricks Community Edition". When I hit sign up, it redirects me to "Free Edition", as there’s currently no way to create or enable clusters in the Databricks Free Edition. Please suggest some way where I can u...

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @SahabazKhan! If you’re trying to create a new Community Edition account, that’s not possible; it will redirect you to create a Free Edition account instead. In the Free Edition, you’ll have access to serverless compute and other features. Want...

vamsi_simbus
by New Contributor III
  • 122 Views
  • 8 replies
  • 3 kudos

Error in Viewing the Table

Facing the below error while accessing a table with multiple row filters, but I am not able to delete the row filter using a SQL query. Please help. Failed to request /ajax-api/2.1/unity-catalog/tables/product_return_prediction_dev.bronze.customers_data?incl...

Latest Reply
vamsi_simbus
New Contributor III
  • 3 kudos

Hi @pranaav93 @szymon_dybczak, the below query worked for me: "DROP POLICY city_filter_policy ON product_return_prediction_dev.bronze.customers_data"

7 More Replies
rachelh
by New Contributor
  • 218 Views
  • 5 replies
  • 0 kudos

[INSUFFICIENT_PERMISSIONS] Insufficient privileges: User does not have permission MODIFY on any file

Just wondering if anyone could help me understand why we are hitting this error: `[INSUFFICIENT_PERMISSIONS] Insufficient privileges: User does not have permission MODIFY on any file`A job is trying to create a table with an external location (alread...

Latest Reply
saurabh18cs
Honored Contributor II
  • 0 kudos

Hi @rachelh, as I understand it, you need to look at the Azure Access Connector setup for your Unity Catalog, because serverless clusters run under an Azure Databricks-managed identity, not the service principal. Access Connector (Azure Managed Identity): Use...

4 More Replies
saicharandeepb
by New Contributor III
  • 266 Views
  • 2 replies
  • 2 kudos

Capturing Streaming Metrics in Near Real-Time Using Cluster Logs

Over the past few weeks, I’ve been exploring ways to capture streaming metrics from our data load jobs. The goal is to monitor job performance and behavior in real time, without disrupting our existing data load pipelines. Initial Exploration: Streami...

Latest Reply
Krishna_S
Databricks Employee
  • 2 kudos

Hi @saicharandeepb, good job on doing such detailed research on monitoring Structured Streaming. If you need lower latency than rolling logs permit, have you tried this: Cluster-wide listener injection: Use spark.extraListeners to register a cust...

1 More Replies
