Data Engineering

Forum Posts

nk76
by New Contributor III
  • 3737 Views
  • 11 replies
  • 5 kudos

Resolved! Custom library import fails randomly with error: not found: value it

Hello, I have an issue with the import of a custom library in Azure Databricks. Roughly 95% of the time it works fine, but sometimes it fails. I searched the internet and this community with no luck so far. It is a Scala library in a Scala notebook,...

Latest Reply
Naskar
New Contributor II
  • 5 kudos

I also encountered the same error. While importing a file I get the error: "Import failed with error: Could not deserialize: Exceeded 16777216 bytes (current = 16778609)"

10 More Replies
rrussell25
by New Contributor
  • 686 Views
  • 1 reply
  • 0 kudos

Read arguments in a Scala notebook invoked by a job.

In a Scala notebook, how do I read input arguments (e.g. those provided by a job that runs a Scala notebook)? In Python, dbutils.notebook.entry_point.getCurrentBindings() works. What about Scala?

Latest Reply
UmaMahesh1
Honored Contributor III
  • 0 kudos

Hi @Robert Russell, you can use dbutils.notebook.getContext.currentRunId in Scala notebooks. Other methods are also available, like dbutils.notebook.getContext.jobGroup, dbutils.notebook.getContext.rootRunId, dbutils.notebook.getContext.tags, etc. You ...

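A minimal sketch of both approaches in a Scala notebook; the parameter name "run_date" is hypothetical, and notebook job parameters are assumed to be delivered as widgets:

    // Job parameters passed to a notebook task arrive as widgets.
    val runDate = dbutils.widgets.get("run_date") // hypothetical parameter name

    // Run metadata via the notebook context, as in the reply above:
    val ctx = dbutils.notebook.getContext
    println(ctx.currentRunId)
    println(ctx.jobGroup)
    println(ctx.tags)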
archanarddy
by New Contributor
  • 677 Views
  • 0 replies
  • 0 kudos

Metastore is down

I am trying to run a Scala notebook, but my job just spins and says "Metastore is down". Can someone help me? Thanks in advance.

齐木木
by New Contributor III
  • 714 Views
  • 1 reply
  • 3 kudos

Resolved! The case class reports an error when running in the notebook

As shown in the figure, the case class and the JSON string are converted through fasterxml.jackson, but an unexpected error occurred while the code was running. I think this problem may be related to how the notebook loads classes. Because...

[attachment: image.png]
Latest Reply
齐木木
New Contributor III
  • 3 kudos

code:

    var str = "{\"app_type\":\"installed-app\"}"
    import com.fasterxml.jackson.databind.ObjectMapper
    import com.fasterxml.jackson.module.scala.DefaultScalaModule
    val mapper = new ObjectMapper()
    mapper.registerModule(DefaultScalaModule)
    ...

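For context, a self-contained sketch of the round trip the reply's code performs (the case class name AppInfo is illustrative; classes defined in notebook cells are compiled inside a wrapper, which may be what the original error relates to):

    import com.fasterxml.jackson.databind.ObjectMapper
    import com.fasterxml.jackson.module.scala.DefaultScalaModule

    case class AppInfo(app_type: String) // illustrative case class

    val mapper = new ObjectMapper()
    mapper.registerModule(DefaultScalaModule)

    // Serialize a value, then parse the JSON string back into the case class.
    val json = mapper.writeValueAsString(AppInfo("installed-app"))
    val parsed = mapper.readValue(json, classOf[AppInfo])
    println(parsed.app_type) // installed-app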
sai_731566
by New Contributor II
  • 1303 Views
  • 2 replies
  • 0 kudos

How to pass parameters/arguments to a shell script from Scala in Databricks.

I was running a shell script in Databricks using the %sh magic command. I have a requirement where I need to pass parameters/arguments to the script. Is there any way we can get this done with Scala as the base language?

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @sai m, can you please check this community thread over here?

1 More Replies
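Since a %sh cell cannot see Scala variables, one hedged sketch is to invoke the script from Scala with scala.sys.process instead; the script path and argument value are hypothetical:

    import scala.sys.process._

    val runDate = "2023-01-31" // hypothetical argument
    // Run the script with arguments built in Scala.
    val exitCode = Seq("bash", "/dbfs/scripts/my_script.sh", runDate).!
    println(s"script exited with $exitCode")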
sriwin
by New Contributor
  • 1869 Views
  • 2 replies
  • 0 kudos

Resolved! Create a gpg file and save it to AWS S3 storage in Scala

Hi - Could you please help me with how I can create a Scala notebook to perform the tasks below:
  • Encrypt a text file using gpg
  • Upload the file to Amazon S3 storage
  • Verify the file exists in Amazon S3
  • Decrypt the encrypted file to verify no issues
Apprec...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @sriwin p, there is encrypt/decrypt file test case code from a PR to the https://github.com/sbt/sbt-pgp repository. It provides an example of PGP file encryption/decryption:

    package com.jsuereth.pgp
    import org.specs2.mutable._
    im...

1 More Replies
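A hedged sketch of the four steps using the gpg CLI and dbutils.fs; it assumes gpg and the relevant keys are available on the driver, and all paths, bucket names, and the recipient ID are illustrative:

    import scala.sys.process._

    val src = "/dbfs/tmp/report.txt" // hypothetical input file
    val enc = s"$src.gpg"

    // 1. Encrypt with a recipient's public key.
    Seq("gpg", "--batch", "--yes", "--recipient", "ops@example.com",
        "--output", enc, "--encrypt", src).!

    // 2. Upload to S3.
    dbutils.fs.cp(s"file:$enc", "s3a://my-bucket/reports/report.txt.gpg")

    // 3. Verify the object exists.
    val exists = dbutils.fs.ls("s3a://my-bucket/reports/")
      .exists(_.name == "report.txt.gpg")
    println(s"uploaded: $exists")

    // 4. Decrypt to confirm the round trip (requires the private key).
    Seq("gpg", "--batch", "--yes", "--output", s"$src.dec", "--decrypt", enc).!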
tarente
by New Contributor III
  • 1532 Views
  • 6 replies
  • 5 kudos

Resolved! How to implement the where not exists pattern in Scala?

I have a dataframe with the following columns: Key1, Key2, Y_N_Col, Col1, Col2. For the key tuple (Key1, Key2), I have rows with Y_N_Col = "Y" and Y_N_Col = "N". I need a new dataframe with all rows with Y_N_Col = "Y" (regardless of the key tuple), plus all Y_N_...

Latest Reply
-werners-
Esteemed Contributor III
  • 5 kudos

I'd use a left-anti join. So create a df with all the Y rows, then create a df with all the N rows and do a left_anti join (on Key1 and Key2) against the Y df, then a union of those two.

5 More Replies
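A minimal sketch of the suggested left-anti join plus union, where df stands in for the original dataframe:

    import org.apache.spark.sql.functions.col

    val yRows = df.filter(col("Y_N_Col") === "Y")
    val nRows = df.filter(col("Y_N_Col") === "N")

    // Keep only N rows whose (Key1, Key2) never appears among the Y rows,
    // then add back all Y rows.
    val nWithoutY = nRows.join(yRows, Seq("Key1", "Key2"), "left_anti")
    val result = yRows.union(nWithoutY)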
User15787040559
by New Contributor III
  • 1435 Views
  • 2 replies
  • 1 kudos

How can I get Databricks notebooks to stop cutting off the explain plans?

(Since Spark 3.0) Dataset.queryExecution.debug.toFile will dump the full plan to a file, without concatenating the output as a fully materialized Java string in memory.

Latest Reply
dazfuller
Contributor III
  • 1 kudos

Notebooks really aren't the best method of viewing large files. Two methods you could employ are:
  • Save the file to DBFS and then use the Databricks CLI to download the file
  • Use the web terminal
In the web terminal option you can do something like "cat my_lar...

1 More Replies
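A short sketch of the toFile call from the post, with an illustrative dataframe and output path:

    // Spark 3.0+: dump the full plan to a file instead of rendering it in the notebook.
    val df = spark.range(1000).selectExpr("id", "id * 2 AS doubled")
    df.queryExecution.debug.toFile("/dbfs/tmp/full_plan.txt")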
tarente
by New Contributor III
  • 699 Views
  • 2 replies
  • 3 kudos

Resolved! How to create a csv using a Scala notebook that has " in some columns?

In a project we use Azure Databricks to create csv files to be loaded into ThoughtSpot. Below is a sample of the code I use to write the file:

    val fileRepartition = 1
    val fileFormat = "csv"
    val fileSaveMode = "overwrite"
    var fileOptions = Map (
    ...

Latest Reply
tarente
New Contributor III
  • 3 kudos

Hi Shan, thanks for the link. I now know more options for creating different csv files. I have not yet solved the problem, but that is related to the destination application (ThoughtSpot) not being able to load the data in the csv file correctly. Rega...

1 More Replies
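A hedged sketch of writing a csv whose columns contain " characters, using standard Spark CSV options; df and the output path are illustrative:

    val fileOptions = Map(
      "header"   -> "true",
      "quote"    -> "\"",  // wrap fields that contain the delimiter or quotes
      "escape"   -> "\"",  // escape embedded quotes by doubling them ("")
      "quoteAll" -> "true"
    )

    df.repartition(1)
      .write
      .format("csv")
      .mode("overwrite")
      .options(fileOptions)
      .save("/mnt/out/thoughtspot_export") // hypothetical path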
Zircoz
by New Contributor II
  • 10307 Views
  • 2 replies
  • 6 kudos

Resolved! Can we access variables created in Python in Scala code or notebooks?

If I have a dict created in Python in a Scala notebook (using the magic word, of course):

    %python
    d1 = {1: "a", 2: "b", 3: "c"}

Can I access this d1 in Scala? I tried the following and it returns d1 not found:

    %scala
    println(d1)

Latest Reply
cpm1
New Contributor II
  • 6 kudos

Martin is correct. We can only access external files and objects. In most of our cases, we just use temporary views to pass data between R & Python. https://docs.databricks.com/notebooks/notebooks-use.html#mix-languages

1 More Replies
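A minimal sketch of the temp-view workaround the reply points to; the view name d1_view is illustrative:

    // In a Python cell:
    //   %python
    //   df = spark.createDataFrame([(1, "a"), (2, "b"), (3, "c")], ["k", "v"])
    //   df.createOrReplaceTempView("d1_view")

    // Then in a Scala cell, rebuild the dict as a Map:
    val d1 = spark.table("d1_view")
      .collect()
      .map(r => r.getLong(0) -> r.getString(1))
      .toMap
    println(d1) // Map(1 -> a, 2 -> b, 3 -> c)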