by
nk76
• New Contributor III
- 8537 Views
- 7 replies
- 5 kudos
Hello, I have an issue with the import of a custom library in Azure Databricks. Roughly 95% of the time it works fine, but sometimes it fails. I searched the internet and this community with no luck so far. It is a Scala library in a Scala notebook, ...
Latest Reply
I also encountered the same error. While importing a file, I get the error: "Import failed with error: Could not deserialize: Exceeded 16777216 bytes (current = 16778609)"
6 More Replies
- 1401 Views
- 1 replies
- 0 kudos
In a Scala notebook, how do I read input arguments (e.g. those provided by a job that runs a Scala notebook)? In Python, dbutils.notebook.entry_point.getCurrentBindings() works. How about for Scala?
Latest Reply
Hi @Robert Russell You can use dbutils.notebook.getContext.currentRunId in Scala notebooks. Other methods are also available, like dbutils.notebook.getContext.jobGroup, dbutils.notebook.getContext.rootRunId, dbutils.notebook.getContext.tags, etc. You ...
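A minimal sketch of those calls in a Databricks Scala notebook cell (the fields are Options or maps and are only populated in the matching context, e.g. when the notebook runs as a job):
// Inspect the notebook context from Scala
val ctx = dbutils.notebook.getContext
println(ctx.currentRunId) // run id when executed as part of a job
println(ctx.rootRunId)    // root run id of the enclosing job run
println(ctx.jobGroup)     // Spark job group, if any
// tags carries run metadata as key-value pairs
ctx.tags.foreach { case (k, v) => println(s"$k -> $v") }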
- 1231 Views
- 0 replies
- 0 kudos
I am trying to run a Scala notebook, but my job just spins and says the Metastore is down. Can someone help me? Thanks in advance.
by
齐木木
• New Contributor III
- 1956 Views
- 1 replies
- 3 kudos
Latest Reply
Code:
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

val str = "{\"app_type\":\"installed-app\"}"
val mapper = new ObjectMapper()
mapper.registerModule(DefaultScalaModule)
...
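A self-contained version of that pattern might look like the following sketch (the Map target type and the final lines are assumptions, since the reply is truncated):
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

val str = "{\"app_type\":\"installed-app\"}"
val mapper = new ObjectMapper()
mapper.registerModule(DefaultScalaModule)

// Deserialize the JSON string into a Scala Map (type erasure applies to classOf,
// but the values here are plain strings)
val parsed = mapper.readValue(str, classOf[Map[String, String]])
println(parsed("app_type")) // prints: installed-app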
- 2433 Views
- 1 replies
- 0 kudos
I was running a shell script in Databricks using the %sh magic command. I have a requirement where I need to pass parameters/arguments to the script. Is there any way we can get this done with Scala as the base language?
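One hedged approach, since a %sh cell cannot see Scala variables directly: invoke the script from Scala with sys.process and pass the arguments explicitly (the script path and argument values here are hypothetical):
import sys.process._

// Hypothetical script path and arguments; adjust to your workspace
val scriptPath = "/dbfs/scripts/my_script.sh"
val arg1 = "2023-01-01"
val arg2 = "prod"

// Run the script with arguments; .!! captures stdout and throws on a nonzero exit code
val output = Seq("bash", scriptPath, arg1, arg2).!!
println(output)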
- 3004 Views
- 1 replies
- 0 kudos
Hi - Could you please help me with how I can create a Scala notebook to perform the below tasks?
- Encrypt a text file using gpg
- Upload the file to Amazon S3 storage
- Verify the file exists in Amazon S3
- Decrypt the encrypted file to verify no issues
Apprec...
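A rough sketch of one possible approach, assuming gpg and the AWS CLI are installed on the cluster; the bucket, file paths, and passphrase are hypothetical, and symmetric encryption is used for simplicity:
import sys.process._

// Hypothetical paths and bucket; replace with real values
val localFile  = "/dbfs/tmp/data.txt"
val encrypted  = localFile + ".gpg"
val bucketPath = "s3://my-bucket/encrypted/data.txt.gpg"

// 1. Encrypt with gpg (symmetric; gpg 2.1+ needs loopback pinentry for --passphrase)
Seq("gpg", "--batch", "--yes", "--pinentry-mode", "loopback",
    "--passphrase", "secret", "-c", localFile).!!

// 2. Upload to S3 with the AWS CLI
Seq("aws", "s3", "cp", encrypted, bucketPath).!!

// 3. Verify the object exists
println(Seq("aws", "s3", "ls", bucketPath).!!)

// 4. Decrypt to confirm a clean round trip
Seq("gpg", "--batch", "--yes", "--pinentry-mode", "loopback",
    "--passphrase", "secret", "-o", localFile + ".decrypted", "-d", encrypted).!!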
Latest Reply
Hello! My name is Piper and I'm a community moderator for Databricks. Thanks for your question. Let's give it a bit more time to see what our members have to say. If not, we'll circle back around.
- 3969 Views
- 6 replies
- 5 kudos
I have a dataframe with the following columns: Key1, Key2, Y_N_Col, Col1, Col2. For the key tuple (Key1, Key2), I have rows with Y_N_Col = "Y" and Y_N_Col = "N". I need a new dataframe with all rows with Y_N_Col = "Y" (regardless of the key tuple), plus all Y_N_...
Latest Reply
I'd use a left-anti join. So create a df with all the Y rows, then create a df with all the N rows, and do a left_anti join (on Key1 and Key2) of the N df against the Y df. Then take a union of those two.
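A minimal sketch of that reply in Spark Scala (column names come from the question; df is the hypothetical source dataframe):
import org.apache.spark.sql.functions.col

// Split the source dataframe by the flag column
val yDf = df.filter(col("Y_N_Col") === "Y")
val nDf = df.filter(col("Y_N_Col") === "N")

// Keep only N rows whose (Key1, Key2) never appears among the Y rows
val nOnly = nDf.join(yDf, Seq("Key1", "Key2"), "left_anti")

// All Y rows, plus N rows for keys that have no Y counterpart
val result = yDf.union(nOnly)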
5 More Replies
- 2378 Views
- 1 replies
- 1 kudos
(Since Spark 3.0) Dataset.queryExecution.debug.toFile will dump the full plan to a file, without concatenating the output as a fully materialized Java string in memory.
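For example, in a Databricks Scala notebook (the output path is hypothetical; paths under /dbfs/ map to DBFS from the driver):
// Write the full query plan to a file instead of materializing it in memory
val df = spark.range(1000).selectExpr("id", "id * 2 AS doubled")
df.queryExecution.debug.toFile("/dbfs/tmp/query_plan.txt")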
Latest Reply
Notebooks really aren't the best method of viewing large files. Two methods you could employ are:
- Save the file to DBFS and then use the Databricks CLI to download the file
- Use the web terminal
In the web terminal option you can do something like "cat my_lar...
- 1666 Views
- 2 replies
- 3 kudos
In a project we use Azure Databricks to create CSV files to be loaded into ThoughtSpot. Below is a sample of the code I use to write the file:
val fileRepartition = 1
val fileFormat = "csv"
val fileSaveMode = "overwrite"
var fileOptions = Map (
...
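The truncated snippet might continue along these lines (the option values, df, and the output path are assumptions for illustration):
// Common CSV options; the exact set in the original post is truncated
val fileRepartition = 1
val fileFormat = "csv"
val fileSaveMode = "overwrite"
val fileOptions = Map(
  "header" -> "true",
  "delimiter" -> ","
)

// Reduce to a single output file and write it out
df.repartition(fileRepartition)
  .write
  .format(fileFormat)
  .mode(fileSaveMode)
  .options(fileOptions)
  .save("/mnt/output/thoughtspot_export")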
Latest Reply
Hi Shan, thanks for the link. I now know more options for creating different CSV files. I have not yet solved the problem, but that is related to the destination application (ThoughtSpot) not being able to load the data in the CSV file correctly. Rega...
1 More Replies
by
Zircoz
• New Contributor II
- 13897 Views
- 2 replies
- 6 kudos
If I have a dict created in Python in a Scala notebook (using the magic command, of course):
%python
d1 = {1: "a", 2: "b", 3: "c"}
Can I access this d1 in Scala? I tried the following and it returns d1 not found:
%scala
println(d1)
Latest Reply
Martin is correct. We can only access external files and objects. In most of our cases, we just use temporary views to pass data between R & Python. https://docs.databricks.com/notebooks/notebooks-use.html#mix-languages
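A minimal sketch of that temp-view pattern for this thread's Python-to-Scala case (cells shown with their magic commands; the view name is hypothetical):
%python
# Publish the Python data as a temp view visible to other languages
d1 = {1: "a", 2: "b", 3: "c"}
df = spark.createDataFrame(list(d1.items()), ["key", "value"])
df.createOrReplaceTempView("d1_view")

%scala
// Read the same data back from the shared temp view
val d1Df = spark.table("d1_view")
d1Df.show()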
1 More Replies
by
saqib
• New Contributor II
- 14332 Views
- 5 replies
- 2 kudos
Do Databricks Scala Notebooks support any sort of markup/markdown?
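For reference, Databricks notebooks render Markdown in cells that begin with the %md magic command, regardless of the base language; a minimal example:
%md
# Results summary
Some **bold** text, a bullet list, and a [link](https://docs.databricks.com).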