Data Engineering

Forum Posts

by jfarmer (New Contributor II)
  • 3256 Views
  • 3 replies
  • 1 kudos

PermissionError / Operation not Permitted with Files-in-Repos

I've been running a notebook using Files-in-Repos. Previously this worked fine. I'm unsure what's changed (I was testing integration with DCS on older runtimes, but I don't think I made any persistent changes), but now it's throwing an error (always...

Latest Reply
_carleto_
New Contributor II
  • 1 kudos

Hi @jfarmer, did you solve this issue? I'm having exactly the same challenge. Thanks!

2 More Replies
by parthsalvi (Contributor)
  • 1413 Views
  • 1 reply
  • 2 kudos

Amazon SES : boto3 credentials not found. DBR 11.2 Shared mode

We're trying to send email with Amazon SES using boto3.client in Python. We've added SES full access to the cluster's IAM role. We were able to send email in "No isolation shared" mode in DBR 11.2 using ses = boto3.client('ses', region_name='us-****-2') s...
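For context, a minimal sketch of the kind of SES call described above, assuming the cluster's instance-profile credentials are visible to boto3; the region and addresses are placeholders, not taken from the post:

    import boto3

    # Placeholder region and addresses -- substitute your own values.
    ses = boto3.client("ses", region_name="us-east-2")
    response = ses.send_email(
        Source="sender@example.com",
        Destination={"ToAddresses": ["recipient@example.com"]},
        Message={
            "Subject": {"Data": "Test from Databricks"},
            "Body": {"Text": {"Data": "Sent via boto3 and Amazon SES"}},
        },
    )
    print(response["MessageId"])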

Latest Reply
JameDavi_51481
New Contributor III
  • 2 kudos

This appears to be an intentional design choice to prevent users from using the credentials of the host machine to carry out arbitrary AWS API calls. I really wish there were a workaround or setting to disable this behavior, because we put a lot of wor...

by Hubert-Dudek (Esteemed Contributor III)
  • 2115 Views
  • 3 replies
  • 25 kudos

Bamboolib with Databricks: low-code programming is now available on #databricks

Bamboolib with Databricks: low-code programming is now available on #databricks. Now you can prepare your Databricks code without ... coding. A low-code solution is now available on Databricks. Install and import bamboolib to start (requires a version of ...
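A minimal getting-started sketch, assuming a DBR version that supports bamboolib; the parquet path is a hypothetical placeholder, not from the post:

    # Run "%pip install bamboolib" in its own notebook cell first.
    import bamboolib as bam  # importing enables the bamboolib UI for DataFrames in the notebook
    import pandas as pd

    df = pd.read_parquet("/dbfs/mnt/<your-mount>/your_file.parquet")  # hypothetical path
    df  # displaying a DataFrame in a notebook cell now offers the bamboolib widget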

Latest Reply
Palkers
New Contributor II
  • 25 kudos

I have tried to load a parquet file using the bamboolib menu and I'm getting the below error that the path does not exist. I can load the same file without any problem using Spark or pandas with the following path: citi_pdf = pd.read_parquet(f'/dbfs/mnt/orbify-sales-raw/Wide...

2 More Replies
by parthsalvi (Contributor)
  • 2209 Views
  • 2 replies
  • 1 kudos

getContext() in dbutils.notebook not working in DBR 11.2 and 10.4 LTS Shared Mode (it works in No Isolation mode in DBR 11.2)

We are trying to fetch the notebook context in our job logging workflow: current_context = dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson(). We were able to access this in DBR 10.4 custom mode, but in DBR 10.4 & 11.2 (Shared Mode) w...
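The call from the post, laid out for readability; it assumes a Databricks notebook where dbutils is available (on Unity Catalog shared-access clusters this API is not whitelisted, which is what triggers the error):

    current_context = (
        dbutils.notebook.entry_point
        .getDbutils()
        .notebook()
        .getContext()
        .toJson()
    )
    print(current_context)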

Latest Reply
Tjomme
New Contributor III
  • 1 kudos

See also: https://community.databricks.com/s/question/0D58Y00009t95NHSAY/unity-catalog-shared-access-mode-dbutilsnotebookentrypointgetcontext-not-whitelisted

1 More Reply
by Gilg (Contributor II)
  • 611 Views
  • 0 replies
  • 0 kudos

Databricks Runtime 12.1 spins VM in Ubuntu 18.04 LTS

Hi Team, our cluster is currently on DBR 12.1 but it spins up VMs with Ubuntu 18.04 LTS, which will be EOL soon. According to https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/12.1 the OS version should be 20.04, and now a bit...

by ivanychev (Contributor)
  • 2908 Views
  • 7 replies
  • 5 kudos

DBR 12.2: DeltaOptimizedWriter: Resolved attribute(s) missing from in operator

After upgrading from DBR 11.3 LTS to DBR 12.2 LTS we started to observe the following error in our "read from parquet and write to Delta" logic. AnalysisException: Resolved attribute(s) group_id#72,display_name#73,parent_id#74,path#75,path_li...
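For context, a minimal sketch of that kind of logic, assuming a Databricks notebook where spark is defined; the paths are placeholders and only the first few columns named in the error are used for illustration:

    df = (
        spark.read.parquet("/mnt/source/groups/")             # hypothetical source path
        .select("group_id", "display_name", "parent_id")      # columns named in the error
    )
    (
        df.write.format("delta")
        .mode("overwrite")
        .save("/mnt/target/groups_delta/")                    # hypothetical target path
    )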

Latest Reply
Valtor
New Contributor II
  • 5 kudos

I can confirm that this issue is resolved for us as well in the latest 12.2 release.

6 More Replies
by HamidHamid_Mora (New Contributor II)
  • 1212 Views
  • 2 replies
  • 2 kudos

Ganglia is unavailable on DBR 13.0

We created a library in Databricks to ingest Ganglia metrics for all jobs into our Delta tables. However, endpoint 8652 is no longer available on DBR 13.0. Is there any other endpoint available? We need to log all metrics for all executed jobs, not on...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 2 kudos

Ganglia is only supported on Databricks Runtime versions 12 and below. From Databricks Runtime 13, Ganglia is replaced by a new Databricks metrics system offering more features and integrations. To export metrics to external services, you can use Dat...

1 More Reply
by gud4eve (New Contributor III)
  • 1263 Views
  • 1 reply
  • 0 kudos

Resolved! Scala app getting NullPointerException while migrating from DBR 7.3 to 9.1 (and above)

We are migrating our Scala jobs from AWS EMR (6.2.1, Spark 3.0.1) to the Lakehouse, and a few of our jobs are failing with NullPointerException. We tried Databricks Runtime 7.3 LTS and it works fine there, because it has the same Spark version 3.0...

Latest Reply
gud4eve
New Contributor III
  • 0 kudos

In one of my code statements, I changed a Scala Boolean to java.lang.Boolean and it is working fine now. Maybe in newer Spark versions, null in a Scala Boolean isn't supported.

by kll (New Contributor III)
  • 1129 Views
  • 2 replies
  • 0 kudos

Unable to render widget to display map within Jupyter notebook output cell

I am attempting to render a map within a Jupyter notebook and keep bumping into the output limit. Below is my code: import pydeck as pdk import pandas as pd COLOR_BREWER_BLUE_SCALE = [ [240, 249, 232], [204, 235, 197], [168, 221, 181], ...
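A minimal sketch of the kind of pydeck heatmap described, with made-up data and coordinates; writing the deck to a standalone HTML file is one way to side-step notebook output-size limits:

    import pandas as pd
    import pydeck as pdk

    COLOR_BREWER_BLUE_SCALE = [
        [240, 249, 232],
        [204, 235, 197],
        [168, 221, 181],
    ]

    # Made-up sample points for illustration only.
    df = pd.DataFrame({"lng": [-122.40, -122.41], "lat": [37.77, 37.78], "weight": [1, 2]})

    layer = pdk.Layer(
        "HeatmapLayer",
        data=df,
        get_position="[lng, lat]",
        get_weight="weight",
        color_range=COLOR_BREWER_BLUE_SCALE,
    )
    deck = pdk.Deck(
        layers=[layer],
        initial_view_state=pdk.ViewState(longitude=-122.40, latitude=37.77, zoom=11),
    )

    # Render to a standalone HTML file instead of the notebook output cell.
    deck.to_html("heatmap.html")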

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Keval Shah, thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedback w...

1 More Reply
by JordiDekker (New Contributor III)
  • 1397 Views
  • 5 replies
  • 6 kudos

StreamCorruptedException, databricks-connect 9.1

Last week, around the 21st of March, we started having issues with databricks-connect (DBR 9.1 LTS). "databricks-connect test" works, but the following code snippet: from pyspark.sql import SparkSession  spark = SparkSession.builder.getOrCreate() s...
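A minimal databricks-connect check in the spirit of the truncated snippet; the trivial query is an illustrative placeholder, not taken from the post:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    spark.range(10).show()  # in the reported setup this fails with StreamCorruptedException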

Latest Reply
Anonymous
Not applicable
  • 6 kudos

Hi @Jordi Dekker, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers ...

4 More Replies
by matthewe97 (New Contributor)
  • 1549 Views
  • 3 replies
  • 2 kudos

Resolved! Are window functions more performant than self joins?

I have a table with data for each month end and want to know the LEAD and LAG data points either side of each month. For example: SELECT month_date, LEAD(month_date) OVER (PARTITION BY id ORDER BY month_date) next_month_date, LAG(month_date) OVER (PA...
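A sketch of the same LEAD/LAG pattern in PySpark, assuming a Databricks notebook where spark is defined; the column names follow the post, while the table name is a hypothetical placeholder:

    from pyspark.sql import Window, functions as F

    w = Window.partitionBy("id").orderBy("month_date")
    df = spark.table("month_end_data")  # hypothetical table name
    result = df.select(
        "month_date",
        F.lead("month_date").over(w).alias("next_month_date"),
        F.lag("month_date").over(w).alias("prev_month_date"),
    )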

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Matthew Elsham, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answer...

2 More Replies
by ABVectr (New Contributor III)
  • 1582 Views
  • 6 replies
  • 1 kudos

Resolved! Maven Package install failing on DBR 11.3 LTS

Hi Databricks Community, I ran into the following issue when setting up a new cluster with the latest LTS Databricks Runtime (11.3). When trying to install the package with the coordinates com.microsoft.azure.kusto:kusto-spark_3.0_2.12:3.1.4 from Mave...
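For reference, one way to request the same Maven coordinates on a cluster is the Libraries API; this is a hedged sketch, and the workspace URL, token, and cluster ID are placeholders, not taken from the post:

    import requests

    resp = requests.post(
        "https://<workspace-url>/api/2.0/libraries/install",          # placeholder workspace URL
        headers={"Authorization": "Bearer <personal-access-token>"},  # placeholder token
        json={
            "cluster_id": "<cluster-id>",                             # placeholder cluster ID
            "libraries": [
                {"maven": {"coordinates": "com.microsoft.azure.kusto:kusto-spark_3.0_2.12:3.1.4"}}
            ],
        },
    )
    resp.raise_for_status()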

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Andrei Bondarenko, hope all is well! Just wanted to check in whether you were able to resolve your issue, and if so, would you be happy to share the solution or mark an answer as best? Otherwise please let us know if you need more help. We'd love to hear from you...

5 More Replies
by SS0201 (New Contributor II)
  • 683 Views
  • 1 reply
  • 0 kudos

Unable to connect to Azure Cosmos DB Cassandra API table using Azure databricks job

Getting the below error: Query [id = , runId = ] terminated with exception: Failed to open native connection to Cassandra at {<name>.cassandra.cosmosdb.azure.com:10350} :: Method com/microsoft/azure/cosmosdb/cassandra/CosmosDbConnectionFactory$.createSess...
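For context, a hedged sketch of the Spark configuration typically used with the Cassandra connector and the Cosmos DB connection factory named in the error; the host and credentials are placeholders, and these settings are often applied in the cluster Spark config rather than at runtime:

    spark.conf.set("spark.cassandra.connection.host", "<name>.cassandra.cosmosdb.azure.com")
    spark.conf.set("spark.cassandra.connection.port", "10350")
    spark.conf.set("spark.cassandra.connection.ssl.enabled", "true")
    spark.conf.set("spark.cassandra.auth.username", "<cosmos-account-name>")  # placeholder
    spark.conf.set("spark.cassandra.auth.password", "<cosmos-account-key>")   # placeholder
    spark.conf.set(
        "spark.cassandra.connection.factory",
        "com.microsoft.azure.cosmosdb.cassandra.CosmosDbConnectionFactory",
    )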

Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi, the error looks like there is an issue connecting to csms-ws-ddicsg-dev-001.cassandra.cosmosdb.azure.com:10350. Could you please re-verify this in the networking config? Also, it would be helpful if you raise an Azure case simultaneously to check the n...

by Databricks_-Dat (New Contributor II)
  • 1384 Views
  • 2 replies
  • 4 kudos

What is the supported MSSQL connector for Databricks Runtime 11.3 LTS (Scala 2.12, Spark 3.3.0)?

We were using the MSSQL connector com.microsoft.azure:spark-mssql-connector_2.12_3.0:1.0.0-alpha with 10.3 LTS DBR. We need to upgrade to a higher DBR version to make use of new functions like unpivot/melt in the notebooks. com.microsoft.azure:spark...
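A hedged sketch of how that connector family is typically invoked from PySpark once a compatible version is installed on the cluster; the server, database, table, and credentials are placeholders:

    df = (
        spark.read.format("com.microsoft.sqlserver.jdbc.spark")
        .option("url", "jdbc:sqlserver://<server>.database.windows.net:1433;databaseName=<db>")
        .option("dbtable", "dbo.<table>")      # placeholder table
        .option("user", "<user>")              # placeholder credentials
        .option("password", "<password>")
        .load()
    )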

Latest Reply
ranged_coop
Valued Contributor II
  • 4 kudos

Is the Spark 3.3 series even supported by the connector yet? As per the github link (https://github.com/microsoft/sql-spark-connector#current-releases), assuming this is the library you are trying to use, the latest Spark 2.4.x compatible connector...

1 More Reply
by yousry (New Contributor II)
  • 1350 Views
  • 2 replies
  • 2 kudos

Resolved! What is the best way to find deltalake version on OSS and Databricks at runtime?

To identify certain Delta Lake features available on a given installation, it is important to have a robust way to identify the Delta Lake version. For OSS, I found that the below Scala snippet will do the job: import io.delta; println(io.delta.VERSION). Not...
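For OSS installs that use the delta-spark PyPI package, a Python counterpart could look like the sketch below (assuming the package was installed with pip; on Databricks the bundled Delta Lake version is documented in the DBR release notes instead):

    from importlib.metadata import version  # Python 3.8+

    print(version("delta-spark"))  # version of the pip-installed delta-spark package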

Latest Reply
shan_chandra
Honored Contributor III
  • 2 kudos

@Yousry Mohamed, could you please check the DBR runtime release notes for the Delta Lake API compatibility matrix section (DBR version vs. compatible Delta Lake version) for the mapping. Reference: https://docs.databricks.com/release-notes/runtime/r...

1 More Reply