Data Engineering

Forum Posts

Gustavo_Az
by Contributor
  • 5559 Views
  • 3 replies
  • 3 kudos

Resolved! Error creating external location in Unity Catalog

Hello. When I try to create an external location I get this error: Failed to access cloud storage: [AbfsRestOperationException] HTTP Error -1 CustomTokenProvider getAccessToken threw com.databricks.api.base.DatabricksServiceException : INTERNAL_ERROR: Un...

Latest Reply
Gustavo_Az
Contributor
  • 3 kudos

I think I must have had something misconfigured. The way I solved the problem was to re-create the workspace and start from scratch; it was a small one for testing purposes.

2 More Replies
glebex
by New Contributor II
  • 5548 Views
  • 8 replies
  • 8 kudos

Resolved! Accessing workspace files within cluster init script

Greetings all! I am currently facing an issue while accessing workspace files from the init script. As explained in the documentation, it is possible to place an init script inside workspace files (link). This works perfectly fine and the init script i...

Latest Reply
jacob_hill_prof
New Contributor II
  • 8 kudos

@Gleb Smolnik​ You might also want to try cloning a github repo in your init script and then storing dependencies like requirements.txt files and other init scripts there. By doing this you can pull a whole slew of init scripts to be utilized in your...

7 More Replies
akshay_patni228
by New Contributor II
  • 3687 Views
  • 2 replies
  • 3 kudos

Missing Credential Scope - Unable to call Databricks (Scala) notebook from ADF

Hi Team, I am using a job cluster while setting up the Linked Service in ADF to call the Databricks Notebook activity. Cluster Detail - Policy: Unrestricted; Access Mode: Single user; Unity Catalog: Enabled; Databricks runtime: 12.2 LTS (includes Apache Spark 3.3.2...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Akshay Patni​ We haven't heard from you since the last response from @Debayan Mukherjee​. Kindly share the information with us, and in return, we will provide you with the necessary solution. Thanks and Regards

1 More Replies
ros
by New Contributor III
  • 757 Views
  • 2 replies
  • 2 kudos

merge vs MERGE INTO

From the 10.4 LTS version we have low-shuffle merge, so merge is faster. But what about the MERGE INTO function that we run in a SQL notebook of Databricks? Is there any performance difference when we use the Databricks PySpark ".merge" function vs Databricks...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Roshan RC​ Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers you...

1 More Replies
signo
by New Contributor II
  • 1272 Views
  • 3 replies
  • 2 kudos

Delta lake schema enforcement allows datatype mismatch on write using MERGE-operation [python]

Databricks Runtime: 12.2 LTS, Spark: 3.3.2, Delta Lake: 2.2.0. A target table with schema ([c1: integer, c2: integer]) allows us to write into the target table using data with schema ([c1: integer, c2: double]). I expected it to throw an exception (same a...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Sigrun Nordli​ Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers...

2 More Replies
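The risk the poster above describes is silent narrowing: if the engine implicitly casts the incoming double to the integer target column instead of raising an error, the fractional part is simply lost. A plain-Python sketch of that narrowing (illustrative only, not the actual Spark cast implementation):

```python
# Illustrative only: what an implicit double -> integer cast discards.
value = 3.99          # a double arriving in the source data
stored = int(value)   # narrowing to an integer column truncates it
print(stored)         # -> 3
```

This is why one might expect schema enforcement to reject the write rather than cast quietly.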
Gilg
by Contributor II
  • 672 Views
  • 0 replies
  • 0 kudos

Databricks Runtime 12.1 spins up VMs on Ubuntu 18.04 LTS

Hi Team, Our cluster is currently on DBR 12.1 but it spins up VMs with Ubuntu 18.04 LTS. 18.04 will be EOL soon. According to this https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/12.1 the OS version should be 20.04 and now a bit...

yalei
by New Contributor
  • 4880 Views
  • 1 replies
  • 0 kudos

leaflet does not work in notebook (R language)

I saw this notebook: htmlwidgets-azure - Databricks (microsoft.com). However, it is not reproducible. I got a lot of errors: "there is no package called 'R.utils'" (this is easy to fix, just install the package "R.utils"); "can not be unloaded" (this is not ...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @yalei du​, Thank you for contacting us regarding the issues you're experiencing with the Leaflet and Dygraphs packages in your R notebook. I understand you have successfully installed the required packages, but the visualizations are not in the n...

Hitesh_goswami
by New Contributor
  • 648 Views
  • 1 replies
  • 0 kudos

Upgrading IPython version without changing LTS version

I am using a specific PyDeequ function called ColumnProfilerRunner which is only supported with Spark 3.0.1, so I must use 7.3 LTS. Currently, I am trying to install the "great_expectations" library on Python, which requires IPython version==7.16.3, an...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Hitesh Goswami​: please check if the below helps! To upgrade the IPython version on a Databricks 7.3 LTS cluster, you can follow these steps: Create a new library installation command using the Databricks CLI by running the following command in your l...

Anjum
by New Contributor II
  • 2868 Views
  • 6 replies
  • 1 kudos

PGP encryption and decryption using gnupg

Hi, We are using the python-gnupg==0.4.8 package for encryption and decryption, and this was working as expected when we were using Databricks runtime 9.1 LTS, but when we upgraded our runtime to 12.1 it stopped working with the error "gnupghome should be a d...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Anjum Aara​ Hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we...

5 More Replies
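The "gnupghome should be a directory" error above suggests the path handed to python-gnupg no longer exists (or was never created) on the newer runtime. A minimal sketch of one plausible fix, assuming the missing directory is the cause: create the directory explicitly before initializing GPG.

```python
import os
import tempfile

# Hypothetical path; newer runtimes may not pre-create whatever you
# pass as gnupghome, so create it explicitly (0700 is what GnuPG expects).
gnupg_home = os.path.join(tempfile.gettempdir(), "gnupg_keys")
os.makedirs(gnupg_home, mode=0o700, exist_ok=True)
assert os.path.isdir(gnupg_home)

# With python-gnupg installed (%pip install python-gnupg), you would then:
# import gnupg
# gpg = gnupg.GPG(gnupghome=gnupg_home)
```

This is a sketch of the directory precondition only; the actual root cause on DBR 12.1 may differ.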
MetaRossiVinli
by Contributor
  • 2688 Views
  • 1 replies
  • 1 kudos

Resolved! Find root path to Repo for .py file import

I want to import a Python function stored in the following file path: `<repo>/lib/lib_helpers.py`. I want to import the function from any file in my repo, for instance from these: `<repo>/notebooks/etl/bronze/dlt_bronze_elt`, `<repo>/workers/job_worker`. It ...

Latest Reply
MetaRossiVinli
Contributor
  • 1 kudos

Ok, I figured it out. If you just make it a Python package by adding an empty `__init__.py`, Databricks will load it on start. Then, you can just import it.

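The accepted answer can be illustrated outside Databricks with plain Python: a directory becomes an importable package once it contains an `__init__.py` and its parent is on `sys.path`. The layout and helper below are hypothetical stand-ins for the poster's `<repo>/lib/lib_helpers.py`.

```python
import os
import sys
import tempfile

# Build a throwaway layout mirroring <repo>/lib/lib_helpers.py
repo_root = tempfile.mkdtemp()
lib_dir = os.path.join(repo_root, "lib")
os.makedirs(lib_dir)

# The empty __init__.py is what turns lib/ into a package
open(os.path.join(lib_dir, "__init__.py"), "w").close()
with open(os.path.join(lib_dir, "lib_helpers.py"), "w") as f:
    f.write("def helper():\n    return 'ok'\n")

# With the repo root on sys.path, any file can import the package
sys.path.insert(0, repo_root)
from lib.lib_helpers import helper
print(helper())  # -> ok
```

In a Databricks Repo the repo root is typically already on the path for notebooks, which is why adding `__init__.py` alone was enough for the poster.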
Hubert-Dudek
by Esteemed Contributor III
  • 588 Views
  • 1 replies
  • 5 kudos

Exciting news for #azure users! The #databricks runtime 12.2 has been officially released as a long-term support (LTS) version, providing a stable and...

Exciting news for #azure users! The #databricks runtime 12.2 has been officially released as a long-term support (LTS) version, providing a stable and reliable platform for users to build and deploy their applications. As part of this release, the en...

Latest Reply
Kaniz
Community Manager
  • 5 kudos

Hi @Hubert Dudek​, We express appreciation for the informative content you've contributed to our community. Your posts have sparked engaging discussions and proven invaluable resources for our members. You've truly made a difference in our community, an...

Hubert-Dudek
by Esteemed Contributor III
  • 822 Views
  • 1 replies
  • 6 kudos

Exciting news for #azure users! The #databricks runtime 12.2 has been officially released as a long-term support (LTS) version, providing a stable and...

Exciting news for #azure users! The #databricks runtime 12.2 has been officially released as a long-term support (LTS) version, providing a stable and reliable platform for users to build and deploy their applications. As part of this release, the en...

Latest Reply
jose_gonzalez
Moderator
  • 6 kudos

Thank you for sharing @Hubert Dudek​ !!!

Hubert-Dudek
by Esteemed Contributor III
  • 824 Views
  • 1 replies
  • 7 kudos

Starting from #databricks 12.2 LTS, the explode function can be used in the FROM statement to manipulate data in new and powerful ways. This function ...

Starting from #databricks 12.2 LTS, the explode function can be used in the FROM statement to manipulate data in new and powerful ways. This function takes an array column as input and returns a new row for each element in the array, offering new pos...

Latest Reply
jose_gonzalez
Moderator
  • 7 kudos

Thank you for sharing @Hubert Dudek​ 

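The row-per-element behavior that explode-in-FROM provides can be sketched in plain Python (a sketch of the semantics only, not the Spark API; the table and column names are made up):

```python
# Each array element becomes its own row, paired with the other columns -
# the same shape of result that explode(tags) in the FROM clause produces.
rows = [
    {"id": 1, "tags": ["a", "b"]},
    {"id": 2, "tags": ["c"]},
]

# Roughly the spirit of: SELECT id, tag FROM t, LATERAL explode(tags) AS x(tag)
exploded = [{"id": r["id"], "tag": t} for r in rows for t in r["tags"]]
print(exploded)
# -> [{'id': 1, 'tag': 'a'}, {'id': 1, 'tag': 'b'}, {'id': 2, 'tag': 'c'}]
```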
Hubert-Dudek
by Esteemed Contributor III
  • 528 Views
  • 1 replies
  • 5 kudos

Starting from #databricks runtime 12.2 LTS, implicit lateral column aliasing is now supported. This feature enables you to reuse an expression defined...

Starting from #databricks runtime 12.2 LTS, implicit lateral column aliasing is now supported. This feature enables you to reuse an expression defined earlier in the same SELECT list, thus avoiding repetition of the same calculation.For instance, in ...

Latest Reply
Anonymous
Not applicable
  • 5 kudos

Thanks for sharing this with the Databricks community.

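The benefit of lateral column aliasing is the same as naming an intermediate value once in ordinary code. A plain-Python analogy (the column and table names in the SQL comments are made up for illustration):

```python
# SQL on 12.2 LTS+, roughly:
#   SELECT salary * 0.1 AS bonus, bonus * 2 AS double_bonus FROM emp
# Before this feature the expression had to be repeated:
#   SELECT salary * 0.1 AS bonus, salary * 0.1 * 2 AS double_bonus FROM emp

salary = 1000
bonus = salary * 0.1        # defined once...
double_bonus = bonus * 2    # ...and reused, instead of recomputing it
print(bonus, double_bonus)  # -> 100.0 200.0
```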
Hubert-Dudek
by Esteemed Contributor III
  • 654 Views
  • 1 replies
  • 7 kudos

Starting from databricks 12.2 LTS, the explode function can be used in the FROM statement to manipulate data in new and powerful ways. This function t...

Starting from databricks 12.2 LTS, the explode function can be used in the FROM statement to manipulate data in new and powerful ways. This function takes an array column as input and returns a new row for each element in the array, offering new poss...

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 7 kudos

It's very useful for SQL developers.
