Started getting this error while running all of my scripts. All the scripts were running fine before. I tried detaching and also restarting, but nothing seems to work. Internal error. Attach your notebook to a different cluster or restart the current cluste...
Hi @210573 (Customer), I got the same error and tried restarting and creating a new cluster, but that did not work. What I did to fix the issue: instead of putting the code in a function, break it out to run line by line. I just want to see where the ...
Hello all, I am following the Databricks documentation on unit testing found here: Run tests with pytest for the Databricks extension for Visual Studio Code - Azure Databricks | Microsoft Learn. However, when taking it a step further I get an ImportEr...
Hello,
Import errors happen often with pytest. To debug this error you can add this to your "test_myfunction_test.py":

import sys

# print every directory the interpreter searches when resolving imports
print(sys.path)

sys.path is a built-in variable within the sys module. I...
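Once you can see what is on `sys.path`, a common fix for this kind of import error is to prepend the directory that holds the code under test. A minimal sketch for a `conftest.py`, assuming a hypothetical layout where a `src/` folder next to the tests holds the module being imported (adjust the path to your project):

```python
import os
import sys

# Hypothetical layout:
#   project/
#     src/myfunctions.py        <- code under test
#     tests/conftest.py         <- this file
# Anchor on this file's location when available, else the working directory.
HERE = (os.path.dirname(os.path.abspath(__file__))
        if "__file__" in globals() else os.getcwd())
SRC_DIR = os.path.normpath(os.path.join(HERE, "..", "src"))

# Prepend so "import myfunctions" resolves from inside the tests.
if SRC_DIR not in sys.path:
    sys.path.insert(0, SRC_DIR)
```

Putting this in `conftest.py` rather than each test file means pytest runs it once before collecting any tests in that directory.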
Is there a way to make databricks-connector wait for cluster to be running?Details:databricks-connector==13.1.0 and the python minor version of cluster and environment are both 3.10If the cluster is not running this will start it, but any commands af...
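One generic way to handle this while the cluster starts is a polling loop around a readiness probe. This is a sketch, not a databricks-connect API: `is_ready` is a stand-in for whatever probe you use, e.g. a trivial Spark command wrapped in try/except that returns False while the cluster is still starting.

```python
import time

def wait_until(is_ready, timeout_s=600, poll_s=10):
    """Poll is_ready() until it returns True or timeout_s elapses."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if is_ready():
            return True
        time.sleep(poll_s)
    raise TimeoutError(f"not ready after {timeout_s}s")

# With databricks-connect you might probe readiness like this (hypothetical):
#   def probe():
#       try:
#           spark.range(1).count()
#           return True
#       except Exception:      # cluster still starting
#           return False
#   wait_until(probe)
```

Cluster start can take several minutes, so keep the timeout generous and the poll interval coarse to avoid hammering the workspace API.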
Hi @AFox, I want to express my gratitude for your effort in selecting the most suitable solution. It's great to hear that your query has been successfully resolved. Thank you for your contribution.
We have 3 workspaces: 1 on an old version in one AWS account, and 2 on the latest version in another. We are on the PAYG full edition, not using SSO. Our admins (existing DBX users in the `admins` group) can invite new users via the Admin Console from the 1 old and 1 new wo...
We are facing the same issue, and we are on Azure. @AndyAtINX, do you mean that if a user exists in one workspace as abc@gmail.com, we should add the user to workspace2 as abc@gmail.com, not ABC@GMAIL.COM? If that is the case, we tried it and it is not working for us.
I am performing some tests with Delta tables. For each test, I write a Delta table to Azure Blob Storage and then manually delete it. After deleting the table and running my code again, I get this error: AnalysisException: [PATH_NOT_FOUN...
Good afternoon, I'm currently going through Module 4 of the Data Engineering Associate pathway, specifically lesson 4.1 - DLT UI Walkthrough. We are instructed to set the Cluster Policy to 'DBAcademy DLT' when configuring the pipeline. However, th...
I am trying to learn more about the VACUUM operation and came across two properties: delta.deletedFileRetentionDuration and delta.logRetentionDuration. So, let's say I have a Delta table where a few records/files have been deleted. The delta.deletedFileRetent...
I'm trying this code but getting the following error:

testDF = (eventsDF
    .groupBy("user_id")
    .pivot("event_name")
    .count("event_name"))

TypeError: _api() takes 1 positional argument but 2 were given

Please guide how to fix...
If anyone has example code for building a CDC live streaming pipeline over files generated by AWS DMS using import dlt, I'd love to see it. I'm currently able to see the parquet file starting with Load on the first full load to S3 and the cdc parquet file after ...
Hi @rt-slowth,
Certainly! Let's explore how to create a Change Data Capture (CDC) live streaming pipeline using Delta Live Tables and AWS Database Migration Service (DMS).
Delta Live Tables and AWS DMS:
Delta Live Tables builds on Delta Lake, an open-source storage ...
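On the DMS side of the question: by default, DMS writes full-load files with names like `LOAD00000001.parquet` and subsequent CDC files with timestamped names, which matches the "file starting with Load" observation above. Before wiring both into a `dlt` pipeline, a small helper to tell them apart can be handy (a sketch; the naming convention is the DMS default and can be customized in your task settings):

```python
import os
import re

# Default DMS S3 target naming (may differ if customized):
#   full load:  LOAD00000001.parquet, LOAD00000002.parquet, ...
#   CDC:        timestamped names such as 20231101-123456789012.parquet
_FULL_LOAD_RE = re.compile(r"^LOAD\d+\.parquet$", re.IGNORECASE)

def is_full_load_file(path: str) -> bool:
    """True if the file looks like a DMS full-load file, False for CDC files."""
    return bool(_FULL_LOAD_RE.match(os.path.basename(path)))
```

In the pipeline itself you would typically ingest both kinds of files with Auto Loader and merge the CDC rows using the DMS `Op` column (I/U/D) via `dlt.apply_changes`; that part only runs on Databricks, so it is omitted here.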
Hello, I am trying to generate a DLT but need to use a UDF table function in the process. This is what I have so far; everything works (without the CREATE OR REFRESH LIVE TABLE wrapper):

```sql
CREATE OR REPLACE FUNCTION silver.portal.get_workflows_from_...
```
Hi @alexiswl, I want to express my gratitude for your effort in selecting the most suitable solution. It's great to hear that your query has been successfully resolved. Thank you for your contribution.
Hi! What is the maximum number of tables that it is possible to create in Unity Catalog? Is there any difference between managed and external tables? If so, what is the limit for external tables? Thanks, Jonathan.
Hi @JonLaRose, I want to express my gratitude for your effort in selecting the most suitable solution. It's great to hear that your query has been successfully resolved. Thank you for your contribution.
What is the best practice for automatically refreshing a service principal PAT in Power BI for a connection to a Databricks dataset? Ideally, when the PAT is updated it will automatically be stored in Azure Key Vault; is there a way that Power BI can pick it...
Hi @js54123875, Certainly! Refreshing a Power BI dataset with a service principal and managing PATs can be achieved through a combination of best practices.
Let’s explore some approaches:
Service Principal and Azure Key Vault:
Create a Service ...
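A common rotation pattern (a sketch with hypothetical names, not a Databricks or Power BI API): a scheduled job mints a fresh service-principal token before the old one expires and writes it to Key Vault (e.g. with `SecretClient.set_secret` from `azure-keyvault-secrets`), and Power BI reads the current secret at refresh time. The decision logic is just date math, kept testable here by injecting the minting and storing steps as callables:

```python
from datetime import datetime, timedelta

def rotate_pat(expires_at, now, mint_token, store_secret,
               margin=timedelta(days=7)):
    """Rotate the PAT when it is within `margin` of expiring.

    mint_token() -> (token, new_expiry): creates a fresh token
        (e.g. via the Databricks token API for the service principal).
    store_secret(token): persists it, e.g. Key Vault SecretClient.set_secret.
    Returns the expiry date now in force.
    """
    if expires_at - now > margin:
        return expires_at          # still fresh, nothing to do
    token, new_expiry = mint_token()
    store_secret(token)
    return new_expiry
```

Running this on a schedule tighter than `margin` (e.g. daily with a 7-day margin) ensures Power BI never picks up an expired secret.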
Hi @rt-slowth, To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This will also help other community members who may have similar ques...
Hi! I use Databricks in Azure and I find it inconvenient not knowing the last-modified user and modification time. How can I trace the history of modification times and user details? Also, would it be possible to deploy the workflows into higher environments? Thanks!
Sorry! I added another issue at the end without mentioning that it was a new one I encountered. I had trouble changing the owner of a workflow I created, and I ended up seeking help from another user with admin privileges to change the...