Data Engineering

Forum Posts

qwerty1
by Contributor
  • 2573 Views
  • 5 replies
  • 14 kudos

Resolved! When will databricks runtime be released for Scala 2.13?

I see that Spark fully supports Scala 2.13, so I wonder why there is no Databricks runtime with Scala 2.13 yet. Are there any plans to make this available? It would be super useful.

Latest Reply
source2sea
Contributor

I see DB runtime 14 is out, but it is still on 2.12. When does Databricks plan to support 2.13 or 3? Thank you.

4 More Replies
darkraisisi
by New Contributor
  • 484 Views
  • 0 replies
  • 0 kudos

Is there a way to manually update the CUDA required file in the DB runtime? There are some rather annoying bugs still in TF 2.11 that have been fixed ...

Is there a way to manually update the CUDA required file in the DB runtime? There are some rather annoying bugs still in TF 2.11 that have been fixed in TF 2.12. Sadly the latest DB runtime 13.1 (beta) only supports the older TF 2.11 even though 2.12 was ...

johnb1
by New Contributor III
  • 1405 Views
  • 3 replies
  • 0 kudos

Cluster Configuration for ML Model Training

Hi! I am training a Random Forest (pyspark.ml.classification.RandomForestClassifier) on Databricks with 1,000,000 training examples and 25 features. I employ a cluster with one driver (16 GB Memory, 4 Cores), 2-6 workers (32-96 GB Memory, 8-24 Cores),...
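
For context, a minimal PySpark sketch of the training setup described above; the table name, column names, and hyperparameters are assumptions for illustration, not taken from the post:

  from pyspark.sql import SparkSession
  from pyspark.ml.feature import VectorAssembler
  from pyspark.ml.classification import RandomForestClassifier

  spark = SparkSession.builder.getOrCreate()

  # Hypothetical source table with 25 numeric feature columns f0..f24 and a
  # binary "label" column (names are assumptions, not from the post).
  raw_df = spark.table("training_examples")
  feature_cols = [f"f{i}" for i in range(25)]

  assembler = VectorAssembler(inputCols=feature_cols, outputCol="features")
  train_df = assembler.transform(raw_df).select("features", "label")

  rf = RandomForestClassifier(
      featuresCol="features",
      labelCol="label",
      numTrees=100,  # illustrative hyperparameters only
      maxDepth=10,
  )
  model = rf.fit(train_df)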

Latest Reply
Anonymous
Not applicable

Hi @John B, hope everything is going great. Just wanted to check in to see if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we can...

2 More Replies
isaac_gritz
by Valued Contributor II
  • 1259 Views
  • 4 replies
  • 8 kudos

Databricks Runtime Support

How long are Databricks runtimes supported for? How often are they updated? You can learn more about the Databricks runtime support lifecycle here (AWS | Azure | GCP). Long Term Support (LTS) runtimes are released every 6 months and supported for 2 yea...

Latest Reply
Ajay-Pandey
Esteemed Contributor III

Thanks for the update.

3 More Replies
NSRBX
by Contributor
  • 1797 Views
  • 8 replies
  • 19 kudos

Databricks-connect not available on Databricks Runtime > 10.4

Hello Databricks Team, Databricks Connect doesn't work on Databricks Runtime 11.3. Databricks recommends that we use dbx by Databricks Labs instead of databricks-connect. Databricks plans no new feature development for Databricks Connect at this time. D...

Latest Reply
xiangzhu
Contributor

Thanks @Landan George, do you have any ETA for the public preview?

7 More Replies
Yaswanth
by New Contributor III
  • 9959 Views
  • 5 replies
  • 18 kudos

Resolved! How can the Delta table protocol version be downgraded from a higher version to a lower one, i.e. the table properties minReader from 2 to 1 and maxWriter from 5 to 3?

Is there a possibility to downgrade the Delta table protocol versions minReader from 2 to 1 and maxWriter from 5 to 3? I have set the TBL properties to 2 and 5 and column mapping mode to rename the columns in the Delta table, but the other users are rea...
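
For reference, a minimal sketch of the property changes described in the question, assuming a hypothetical table name my_table; whether those versions can later be lowered again is exactly what the thread asks:

  from pyspark.sql import SparkSession

  spark = SparkSession.builder.getOrCreate()

  # Enabling column mapping (used to rename columns) is what raises the
  # protocol to minReaderVersion 2 / minWriterVersion 5 in the first place.
  spark.sql("""
      ALTER TABLE my_table SET TBLPROPERTIES (
          'delta.columnMapping.mode' = 'name',
          'delta.minReaderVersion' = '2',
          'delta.minWriterVersion' = '5'
      )
  """)

  # Check which protocol versions the table now requires.
  (spark.sql("DESCRIBE DETAIL my_table")
        .select("minReaderVersion", "minWriterVersion")
        .show())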

Latest Reply
Kaniz
Community Manager

Hi @Yaswanth velkur, we haven't heard from you since the last response from @Youssef Mrini and me, and I was checking back to see if our suggestions helped you. Or else, if you have any solution, please share it with the community, as it can be hel...

4 More Replies
data_serf
by New Contributor
  • 2102 Views
  • 3 replies
  • 1 kudos

Resolved! How to integrate java 11 code in Databricks

Hi all, we're trying to attach Java libraries which are compiled/packaged using Java 11. After doing some research it looks like even the most recent runtimes use Java 8, which can't run the Java 11 code ("wrong version 55.0, should be 52.0" errors). Is t...

Latest Reply
matthewrj
New Contributor II

I have tried setting JNAME=zulu11-ca-amd64 under Cluster > Advanced options > Spark > Environment variables, but it doesn't seem to work. I still get errors indicating Java 8 is the JRE, and in the Spark UI under "Environment" I still see: Java Home: /u...
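
For reference, the same environment variable can also be supplied through the Clusters API under spark_env_vars; a minimal sketch of such a cluster spec (cluster name, runtime version, node type, and worker count are placeholder values, not from the thread):

  {
    "cluster_name": "java11-cluster",
    "spark_version": "11.3.x-scala2.12",
    "node_type_id": "<node-type>",
    "num_workers": 2,
    "spark_env_vars": {
      "JNAME": "zulu11-ca-amd64"
    }
  }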

2 More Replies
NicolasEscobar
by New Contributor II
  • 6788 Views
  • 8 replies
  • 5 kudos

Resolved! Job fails after runtime upgrade

I have a job running with no issues in Databricks runtime 7.3 LTS. When I upgraded to 8.3 it fails with error An exception was thrown from a UDF: 'pyspark.serializers.SerializationError'... SparkContext should only be created and accessed on the driv...

Latest Reply
User16873042682
New Contributor II

Adding to @Sean Owen's comments: the only reason this is working is that the optimizer is evaluating this locally rather than creating a context on the executors and evaluating it.
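
To make the failure mode concrete, a hypothetical sketch (not the poster's actual job) of a UDF that references the SparkContext, which is the kind of pattern this error points at, together with the usual fix of capturing a plain driver-side value:

  from pyspark.sql import SparkSession
  from pyspark.sql.functions import udf, col
  from pyspark.sql.types import IntegerType

  spark = SparkSession.builder.getOrCreate()
  sc = spark.sparkContext

  # Failing pattern: the UDF closes over the SparkContext, so serializing and
  # evaluating it on executors raises "SparkContext should only be created
  # and accessed on the driver" on newer runtimes.
  @udf(IntegerType())
  def bad_udf(x):
      return x + sc.defaultParallelism

  # Usual fix: read the value on the driver and let the UDF capture a plain int.
  parallelism = sc.defaultParallelism

  @udf(IntegerType())
  def good_udf(x):
      return x + parallelism

  df = spark.range(10).select(col("id").cast("int").alias("x"))
  df.select(good_udf("x").alias("y")).show()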

7 More Replies
sarvesh242
by Contributor
  • 6421 Views
  • 7 replies
  • 2 kudos

Resolved! java.lang.NoSuchMethodError in databricks

I have created a package in Scala. Now I am calling a method from that package and using it in my notebook. At runtime, it throws a java.lang.NoSuchMethodError. The method exists in the package, but still I am getting this error. Plea...

Latest Reply
Kaniz
Community Manager

@sarvesh gakhar, just a friendly follow-up. Do you still need help? Please let us know.

6 More Replies
raymund
by New Contributor III
  • 2222 Views
  • 7 replies
  • 5 kudos

Resolved! Why does adding the package 'org.apache.spark:spark-sql-kafka-0-10_2.12:3.0.1' fail in runtime 9.1.x-scala2.12 but succeed in runtime 8.2.x-scala2.12?

Using a Databricks spark-submit job, setting a new cluster:
1] "spark_version": "8.2.x-scala2.12" => OK, works fine
2] "spark_version": "9.1.x-scala2.12" => FAIL, with errors:
Exception in thread "main" java.lang.ExceptionInInitializerError at com.databricks...

Latest Reply
raymund
New Contributor III

This has been resolved by adding the following spark_conf (not through --conf):
"spark.hadoop.fs.file.impl": "org.apache.hadoop.fs.LocalFileSystem"
Example:
"new_cluster": { "spark_version": "9.1.x-scala2.12", ... "spark_conf": { "spar...
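
Filled out, the shape of that fix looks roughly like the following; only the spark_version and the spark_conf entry come from the reply above, while the node type and worker count are placeholders:

  {
    "new_cluster": {
      "spark_version": "9.1.x-scala2.12",
      "node_type_id": "<node-type>",
      "num_workers": 2,
      "spark_conf": {
        "spark.hadoop.fs.file.impl": "org.apache.hadoop.fs.LocalFileSystem"
      }
    }
  }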

6 More Replies
Nickels
by New Contributor II
  • 1042 Views
  • 4 replies
  • 1 kudos

Resolved! Reply on inline runtime commands

I feel like the answer to this question should be simple, but nonetheless I'm struggling. I run Python code that prompts me with the following warning: On my local machine, I can accept this through my terminal and my machine does not run out of memo...

Latest Reply
jose_gonzalez
Moderator

Hi @Nickels Köhling, in Databricks you will only be able to see the output in the driver logs. If you go to your driver logs, you will see three windows that display the output of "stdout", "stderr" and "log4j". If in your code you do ...

3 More Replies