Data Engineering

Forum Posts

niruban
by New Contributor II
  • 78 Views
  • 1 reply
  • 0 kudos

Migrate a notebook that resides in the workspace using Databricks Asset Bundles

Hello Community Folks - Has anyone implemented migration of notebooks that reside in a workspace to a production Databricks workspace using Databricks Asset Bundles? If so, can you please point me to any documentation I can refer to? Thanks!! Regards, Niruban ...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @niruban, Migrating notebooks from one Databricks workspace to another using Databricks Asset Bundles is a useful approach. Let me guide you through the process and provide relevant documentation. Databricks Asset Bundles Overview: Databricks ...
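
For anyone who wants to script the copy itself rather than (or alongside) a bundle deployment, here is a minimal sketch using the databricks-sdk Python package; the CLI profile names and notebook paths are placeholders, not values from the original question.

```python
# Hedged sketch: copy one workspace notebook from a dev workspace to a prod
# workspace with the Databricks Python SDK (profiles and paths are hypothetical).
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.workspace import ExportFormat, ImportFormat, Language

src = WorkspaceClient(profile="dev")    # source workspace (CLI profile)
dst = WorkspaceClient(profile="prod")   # target workspace (CLI profile)

exported = src.workspace.export(
    "/Users/someone@example.com/etl_notebook",
    format=ExportFormat.SOURCE,
)
dst.workspace.import_(
    "/Production/etl_notebook",
    content=exported.content,            # base64-encoded source from export
    format=ImportFormat.SOURCE,
    language=Language.PYTHON,
    overwrite=True,
)
```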

Oliver_Angelil
by Valued Contributor II
  • 103 Views
  • 1 reply
  • 0 kudos

Append-only table from non-streaming source in Delta Live Tables

I have a DLT pipeline where all tables are non-streaming (materialized views), except for the last one, which needs to be append-only and is therefore defined as a streaming table. The pipeline runs successfully on the first run. However, on the seco...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Oliver_Angelil, It appears that you’re encountering an issue with your DLT (Databricks Delta Live Tables) pipeline, specifically related to having an append-only table at the end of the pipeline. Let’s explore some potential solutions: Stream...
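
One workaround that is often suggested for this situation is sketched below, under the assumption that skipping the upstream rewrite commits is acceptable for the use case; the table and column names are made up.

```python
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Materialized view, fully recomputed on every update")
def cleaned_events():
    return spark.read.table("raw.events").where(F.col("status") == "ok")

@dlt.table(comment="Append-only streaming table fed by the materialized view")
def events_append_only():
    # skipChangeCommits tells the streaming read to ignore the non-append
    # commits produced when the upstream table is recomputed; without it the
    # second pipeline run can fail because the source is not append-only.
    return (
        spark.readStream
        .option("skipChangeCommits", "true")
        .table("LIVE.cleaned_events")
    )
```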

BerkerKozan
by New Contributor III
  • 73 Views
  • 1 reply
  • 0 kudos

Using an AAD SPN on AWS Databricks

I use AWS Databricks, which has an SSO & SCIM integration with AAD. I generated an SPN in AAD, synced it to Databricks, and want to use this SPN with AAD client secrets to use the Databricks SDK. But it doesn't work. I don't want to generate another tok...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @BerkerKozan, It sounds like you’re trying to set up provisioning to Databricks using Microsoft Entra ID (formerly known as Azure Active Directory) and encountering some issues. Let’s break down the steps and address your concerns: Provisionin...
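
For the SDK piece specifically, a minimal sketch is below. It assumes an OAuth secret created for the service principal in the Databricks account/workspace console, since (as far as I know) AWS workspaces authenticate service principals with Databricks OAuth (M2M) rather than AAD client secrets; the host, client ID, and secret values are placeholders.

```python
# Hedged sketch: machine-to-machine auth with the Databricks Python SDK on AWS.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient(
    host="https://dbc-xxxxxxxx-xxxx.cloud.databricks.com",   # placeholder URL
    client_id="<service-principal-application-id>",
    client_secret="<databricks-oauth-secret>",                # not the AAD secret
)
print(w.current_user.me().user_name)  # should print the service principal's ID
```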

sasi2
by New Contributor II
  • 246 Views
  • 1 reply
  • 0 kudos

Connecting to MuleSoft from Databricks

Hi, is there an established connectivity pipeline to access MuleSoft or Anypoint Exchange data using Databricks? I have seen many options to access Databricks data in MuleSoft, but can we read data from MuleSoft into Databricks? Please gi...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

  Hi @sasi2, Connecting MuleSoft or AnyPoint to exchange data with Databricks is possible, and there are several options you can explore. Let’s dive into some solutions: Using JDBC Driver for Databricks in Mule Applications: The CData JDBC Driver...
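
For the direction the question actually asks about (MuleSoft into Databricks), one common pattern is to call an API exposed through Anypoint from a notebook and land the response in a Delta table. The sketch below is illustrative only; the endpoint, auth header, and table name are invented.

```python
import requests

resp = requests.get(
    "https://my-org.example.com/api/orders",      # hypothetical Mule/Anypoint API
    headers={"Authorization": "Bearer <token>"},  # hypothetical credential
    timeout=60,
)
resp.raise_for_status()

# Assumes the API returns a JSON array of flat records.
df = spark.createDataFrame(resp.json())
df.write.mode("append").saveAsTable("main.bronze.mulesoft_orders")
```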

MartinH
by New Contributor II
  • 2477 Views
  • 7 replies
  • 4 kudos

Azure Data Factory and Photon

Hello, we have Databricks Python workbooks accessing Delta tables. These workbooks are scheduled/invoked by Azure Data Factory. How can I enable Photon on the linked services that are used to call Databricks? If I specify a new job cluster, there does n...

Latest Reply
CharlesReily
New Contributor III
  • 4 kudos

When you create a cluster on Databricks, you can enable Photon by selecting the "Photon" option in the cluster configuration settings. This is typically done when creating a new cluster, and you would find the option in the advanced cluster configura...
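
For the programmatic angle, Photon corresponds to the runtime_engine field (or a "-photon-" runtime version string) in the cluster spec. A hedged sketch with the Databricks Python SDK is below; the node type, worker count, and runtime version are placeholders, and whether the ADF linked-service UI exposes the same field may depend on the ADF version.

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.compute import RuntimeEngine

w = WorkspaceClient()
cluster = w.clusters.create(
    cluster_name="photon-demo",
    spark_version="13.3.x-scala2.12",     # placeholder LTS runtime
    node_type_id="i3.xlarge",
    num_workers=2,
    runtime_engine=RuntimeEngine.PHOTON,  # the Photon switch in the cluster spec
).result()
print(cluster.cluster_id)
```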

6 More Replies
subha2
by New Contributor II
  • 340 Views
  • 1 reply
  • 0 kudos

Not able to read tables in Unity Catalog in parallel

There are some tables under a schema/database in Unity Catalog. The notebook needs to read the tables in parallel using a loop and threads and execute the configured query. But the SQL statement is not getting executed via spark.sql() or spark.read.table(). It ...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @subha2, It seems you’re encountering an issue related to executing SQL statements in Spark. Let’s troubleshoot this step by step: Check the Unity Catalog Configuration: Verify that the Unity Catalog configuration is correctly set up. Ensure t...
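
A small sketch of the parallel-read pattern that usually works on a Unity Catalog-enabled cluster (the three-level table names are placeholders); if this shape still fails, the problem is more likely permissions or cluster access mode than the threading itself.

```python
from concurrent.futures import ThreadPoolExecutor

tables = ["main.sales.orders", "main.sales.customers", "main.sales.items"]

def row_count(name):
    # spark.read.table / spark.sql can be called from worker threads; each
    # call here just counts the rows of one Unity Catalog table.
    return name, spark.read.table(name).count()

with ThreadPoolExecutor(max_workers=4) as pool:
    for name, n in pool.map(row_count, tables):
        print(f"{name}: {n} rows")
```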

DBX-2024
by New Contributor
  • 79 Views
  • 1 reply
  • 0 kudos

Job cluster's CPU utilization goes higher than 100% a few times during the workload run

I have a data engineering pipeline workload that runs on Databricks. The job cluster has the following configuration: worker i3.4xlarge with 122 GB memory and 16 cores; driver i3.4xlarge with 122 GB memory and 16 cores; min workers 4 and max workers 8. We noticed...

Data Engineering
Databricks
Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @DBX-2024, Let’s break down your questions: High CPU Utilization Spikes: Are They Problematic? High CPU utilization spikes can be problematic depending on the context. Here are some considerations: Normal Behavior: It’s common for CPU utilizat...

smukhi
by New Contributor
  • 82 Views
  • 1 reply
  • 0 kudos

Encountering Error UNITY_CREDENTIAL_SCOPE_MISSING_SCOPE

As of this morning we started receiving the following error message on a Databricks job with a single PySpark notebook task. The job has not had any code changes in 2 months. The cluster configuration has also not changed. The last successful run of ...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @smukhi, The error message you’re encountering, specifically the “Py4JJavaError” with the “Missing Credential Scope” issue, can be quite puzzling. Let’s explore some potential solutions and ideas to troubleshoot this problem: Check Cluster Con...

Skr7
by New Contributor
  • 7 Views
  • 0 replies
  • 0 kudos

Databricks Asset Bundles

Hi, I'm implementing Databricks Asset Bundles. My scripts are in GitHub, and my /resource folder has all the .yml files for my Databricks workflows, which point to the main branch: git_source: git_url: https://github.com/xxxx git_provider: ...

Data Engineering
Databricks
gabe123
by New Contributor
  • 237 Views
  • 1 reply
  • 0 kudos

Strange error with custom module in Delta Live Tables pipeline

The chunk of code in question: sys.path.append( spark.conf.get("util_path", "/Workspace/Repos/Production/loch-ness/utils/") ) from broker_utils import extract_day_with_suffix, proper_case_address_udf, proper_case_last_name_first_udf, proper_case_ud...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @gabe123 , It seems like you’re encountering a ModuleNotFoundError when trying to import the broker_utils module in your Python code. Let’s troubleshoot this issue step by step: Check Module Location: First, ensure that the broker_utils.py fil...
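
A small sketch of the import pattern being discussed, using the path and names from the post (whether that path actually exists on the pipeline's cluster is the thing to verify):

```python
import sys

# Fall back to the repo path when the pipeline configuration does not set util_path.
util_path = spark.conf.get("util_path", "/Workspace/Repos/Production/loch-ness/utils/")
if util_path not in sys.path:
    sys.path.append(util_path)

# Import only after sys.path has been extended.
from broker_utils import extract_day_with_suffix, proper_case_address_udf  # noqa: E402
```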

lieber_augustin
by New Contributor
  • 102 Views
  • 1 reply
  • 0 kudos

Reading from one Postgres table results in several Scan JDBCRelation operations

Hello, I am working on a Spark job where I'm reading several tables from PostgreSQL into DataFrames as follows: df = (spark.read .format("postgresql") .option("query", query) .option("host", database_host) .option("port...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @lieber_augustin, Optimizing the performance of your PostgreSQL queries involves several considerations. Let’s address both the potential optimizations and the reason behind multiple Scan JDBCRelation operations. Database Design: Properly des...
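
On the "several Scan JDBCRelation" point: each Spark action (or each branch of the query plan) that touches a JDBC DataFrame re-runs the source query, so the same table can be scanned multiple times. A hedged sketch of one mitigation, caching after the first read (the connection options are placeholders):

```python
df = (
    spark.read.format("postgresql")
    .option("query", "SELECT id, amount FROM sales")  # hypothetical query
    .option("host", "db-host.example.com")
    .option("port", "5432")
    .option("database", "analytics")
    .option("user", "reader")
    .option("password", "...")
    .load()
    .cache()
)

df.count()                             # one JDBC scan materializes the cache
df.groupBy("id").sum("amount").show()  # served from the cached data
```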

Husky
by New Contributor III
  • 1203 Views
  • 4 replies
  • 1 kudos

Resolved! Upload file from local file system to Unity Catalog Volume (via databricks-connect)

Context: IDE: IntelliJ 2023.3.2; Library: databricks-connect 13.3; Python: 3.10. Description: I develop notebooks and Python scripts locally in the IDE and connect to the Spark cluster via databricks-connect for a better developer experience. I download a...

Latest Reply
lathaniel
New Contributor III
  • 1 kudos

Late to the discussion, but I too was looking for a way to do this _programmatically_, as opposed to the UI. The solution I landed on was using the Python SDK (though you could assuredly do this using an API request instead if you're not in Python): w ...
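
For completeness, a minimal sketch of that SDK call (the catalog, schema, volume, and file names are placeholders):

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
with open("/tmp/report.csv", "rb") as f:
    # Writes the local file into a Unity Catalog volume path.
    w.files.upload("/Volumes/main/default/raw/report.csv", f, overwrite=True)
```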

3 More Replies
jainshasha
by New Contributor
  • 83 Views
  • 4 replies
  • 0 kudos

Job Cluster in Databricks workflow

Hi, I have configured 20 different workflows in Databricks. All of them are configured with job clusters with different names. All 20 workflows are scheduled to run at the same time. But even after configuring a different job cluster in each of them, they run sequentially w...

Latest Reply
jainshasha
New Contributor
  • 0 kudos

Hi @Kaniz, attaching screenshots of 5 of the workflows that are scheduled at the same time.

3 More Replies
dbdude
by New Contributor II
  • 4513 Views
  • 4 replies
  • 0 kudos

AWS Secrets Works In One Cluster But Not Another

Why can I use boto3 to go to Secrets Manager and retrieve a secret with a personal cluster, but get an error with a shared cluster? NoCredentialsError: Unable to locate credentials

Latest Reply
Husky
New Contributor III
  • 0 kudos

Hey @dbdude, I am facing the same error. Did you find a solution to access the AWS credentials on a shared cluster? This article describes a way of storing credentials in a Unity Catalog volume to be fetched by the shared cluster: https://medium.com/@amluci...
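
A hedged sketch of the usual workaround: on shared access mode clusters the instance-profile credentials are generally not exposed to user code, so pass keys to boto3 explicitly, for example from a Databricks secret scope (the scope and key names below are invented):

```python
import boto3

session = boto3.Session(
    aws_access_key_id=dbutils.secrets.get("aws", "access_key_id"),
    aws_secret_access_key=dbutils.secrets.get("aws", "secret_access_key"),
    region_name="us-east-1",
)
secret = session.client("secretsmanager").get_secret_value(SecretId="my-app/secret")
print(secret["Name"])  # the secret payload itself is in secret["SecretString"]
```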

3 More Replies
madhumitha
by Visitor
  • 51 Views
  • 2 replies
  • 0 kudos

Connect Power BI Desktop semantic model output to Databricks

Hello, I am trying to connect the Power BI semantic model output (basically the data that has already been pre-processed) to Databricks. Does anybody know how to do this? I would like it to be an automated process, so I would like to know any way to p...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @madhumitha, Connecting Power BI semantic model output to Databricks can be done in a few steps. Here are a couple of options: Databricks Power Query Connector: The new Databricks connector is natively integrated into Power BI. You can configu...

1 More Replies