Data Engineering

Forum Posts

by zyang (Contributor)
  • 4837 Views
  • 13 replies
  • 13 kudos

Option "delta.columnMapping.mode","name" introduces unexpected result

Hi, I am trying to write and create a Delta table with "delta.columnMapping.mode" set to "name", partitioned by date. But I found that when I enable this option, the partition folder names are no longer the dates; they are random two-letter strings. A...
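For context, a minimal sketch of the kind of write being described (the DataFrame contents and output path are hypothetical, not from the post):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    df = spark.createDataFrame(
        [("2023-01-01", 1), ("2023-01-02", 2)], ["date", "value"]
    )

    # Create the table with column mapping by name, partitioned by date.
    (df.write.format("delta")
        .option("delta.columnMapping.mode", "name")
        .partitionBy("date")
        .save("/mnt/tables/events"))  # hypothetical path

With column mapping enabled, Delta tracks columns by internal IDs rather than by their display names, so the physical directory names stop encoding the partition values; the mapping lives in the transaction log, and queries filtering on the date partition still prune correctly.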

Latest Reply
CkoockieMonster (New Contributor II)

Hello, I'm a bit late to the party, but I'll put that here for posterity: there's a way to rename your weird two-letter folders and still have your table working, but it violates the good-practice guidelines suggested by Databricks, and I don't thi...

  • 13 kudos
12 More Replies
by auser85 (New Contributor III)
  • 2260 Views
  • 1 reply
  • 1 kudos

How to incorporate these GC options into my Databricks cluster? (spark.executor.extraJavaOptions)

I want to try incorporating these options into my Databricks cluster: spark.driver.extraJavaOptions -XX:+UseG1GC -XX:+G1SummarizeConcMark spark.executor.extraJavaOptions -XX:+UseG1GC -XX:+G1SummarizeConcMark If I put them under Compute -> Cluster -> Co...
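For reference, cluster-wide JVM flags normally go one property per line in the cluster's Spark config text box (Compute > cluster > Advanced options > Spark), space-separated:

    spark.driver.extraJavaOptions -XX:+UseG1GC
    spark.executor.extraJavaOptions -XX:+UseG1GC

Note that -XX:+G1SummarizeConcMark is a diagnostic flag that newer JDKs may reject outright, which would prevent the JVM (and therefore the cluster) from starting; it is left out of the sketch above.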

Latest Reply
Aviral-Bhardwaj (Esteemed Contributor III)

Hey @Andrew Fogarty​, I think this is only for the spark-submit command, not for the cluster UI. Please have a look at this doc: http://progexc.blogspot.com/2014/12/spark-configuration-mess-solved.html spark.executor.extraJavaOptions: A string of extra JVM...

  • 1 kudos
by Chris_Shehu (Valued Contributor III)
  • 666 Views
  • 1 reply
  • 2 kudos

What are the options for extracting data from Delta Lake for a vendor?

Our vendor is looking to use Microsoft API Manager to retrieve data from a variety of sources. Is it possible to extract records from Delta Lake by using an API? What I've tried: I reviewed the available Databricks APIs; it looks like most of them ...

Latest Reply
Chris_Shehu (Valued Contributor III)

Another possibility is to stand up a cluster and have a notebook run Flask to create an API interface. I'm still looking into options, but it seems like there should be a baked-in solution besides the JDBC connector. I'm not ...
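A minimal sketch of that Flask-in-a-notebook idea (the table name and port are hypothetical; assumes a Databricks notebook, where `spark` is already defined):

    from flask import Flask, jsonify

    app = Flask(__name__)

    @app.route("/records")
    def records():
        # Bound the result size so the JSON response stays small.
        rows = spark.sql("SELECT * FROM sales LIMIT 100").collect()
        return jsonify([row.asDict() for row in rows])

    # Binds on the driver node; callers need network access to the driver.
    app.run(host="0.0.0.0", port=8080)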

  • 2 kudos
by Tripalink (New Contributor III)
  • 2766 Views
  • 2 replies
  • 0 kudos

Using Selenium Chrome Driver in Databricks: runs the first time but fails after that

I have a notebook that uses a Selenium Web Driver for Chrome and it works the first time I run the notebook. If I run the notebook again, it will not work and gives the error message: WebDriverException: Message: unknown error: unable to discover op...
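One common cause of this works-once pattern (an assumption, not confirmed in the thread) is a Chrome/ChromeDriver process left running from the first execution; tearing the driver down in a finally block is a standard precaution:

    from selenium import webdriver
    from selenium.webdriver.chrome.options import Options

    options = Options()
    options.add_argument("--headless")
    options.add_argument("--no-sandbox")
    options.add_argument("--disable-dev-shm-usage")

    driver = webdriver.Chrome(options=options)
    try:
        driver.get("https://example.com")  # hypothetical target page
        print(driver.title)
    finally:
        driver.quit()  # release the browser so a rerun starts clean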

Latest Reply
Hubert-Dudek (Esteemed Contributor III)

Hi, @Dagart Allison​. I've created a new version of the Selenium-with-Databricks manual. Please look here: https://community.databricks.com/s/feed/0D58Y00009SWgVuSAL

  • 0 kudos
1 More Replies
by talha (New Contributor III)
  • 2089 Views
  • 5 replies
  • 0 kudos

spark-submit Error "Unrecognized option: --executor-memory 3G" although --executor-memory is available in Options.

Executed a spark-submit job through the Databricks CLI with the following job configuration: { "job_id": 123, "creator_user_name": "******", "run_as_user_name": "******", "run_as_owner": true, "settings": { "name": "44aa-8447-c123aad310", ...

Latest Reply
talha (New Contributor III)

Not really sure if Spark was running in local mode. But I used the alternate property spark.executor.memory, passed it as --conf, and now it works.
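The quoted error suggests the flag and its value may have been passed as a single list element ("--executor-memory 3G"), which spark-submit cannot parse. For posterity, a sketch of the --conf form that worked (class name and jar path are placeholders, not from the thread):

    {
      "settings": {
        "spark_submit_task": {
          "parameters": [
            "--class", "com.example.Main",
            "--conf", "spark.executor.memory=3g",
            "dbfs:/FileStore/jars/app.jar"
          ]
        }
      }
    }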

  • 0 kudos
4 More Replies