Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

qwerty1
by Contributor
  • 5093 Views
  • 7 replies
  • 17 kudos

Resolved! When will databricks runtime be released for Scala 2.13?

I see that Spark fully supports Scala 2.13, so I wonder why there is no Databricks runtime with Scala 2.13 yet. Are there any plans to make this available? It would be super useful.

Latest Reply
guersam
New Contributor II

I agree with @777. As Scala 3 is maturing and there are more real use cases for Scala 3 on Spark now, support for Scala 2.13 would be valuable to users, including us. I think the recent upgrade of the Databricks runtime from JDK 8 to 17 was one of a ...

6 More Replies
darkraisisi
by New Contributor
  • 895 Views
  • 0 replies
  • 0 kudos

Is there a way to manually update the CUDA required file in the DB runtime?

Is there a way to manually update the CUDA required file in the DB runtime? There are some rather annoying bugs still in TF 2.11 that have been fixed in TF 2.12. Sadly, the latest DB runtime 13.1 (beta) only supports the older TF 2.11 even though 2.12 was ...
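Until a runtime ships TF 2.12, one common workaround (a sketch only, not an officially supported path; the exact version pin is an assumption, and whether the runtime's bundled CUDA libraries satisfy TF 2.12's requirements must be checked separately) is a notebook-scoped install on top of the runtime:

```
# Notebook cell — upgrades TensorFlow for this notebook's environment only;
# it does not change the runtime's bundled CUDA/cuDNN libraries.
%pip install tensorflow==2.12.*
```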

johnb1
by Contributor
  • 2415 Views
  • 3 replies
  • 0 kudos

Cluster Configuration for ML Model Training

Hi! I am training a Random Forest (pyspark.ml.classification.RandomForestClassifier) on Databricks with 1,000,000 training examples and 25 features. I employ a cluster with one driver (16 GB memory, 4 cores), 2-6 workers (32-96 GB memory, 8-24 cores), ...

Latest Reply
Anonymous
Not applicable

Hi @John B​, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we can ...

2 More Replies
isaac_gritz
by Databricks Employee
  • 2285 Views
  • 4 replies
  • 8 kudos

Databricks Runtime Support

How long are Databricks runtimes supported for? How often are they updated? You can learn more about the Databricks runtime support lifecycle here (AWS | Azure | GCP). Long Term Support (LTS) runtimes are released every 6 months and supported for 2 yea...

Latest Reply
Ajay-Pandey
Esteemed Contributor III

Thanks for the update.

3 More Replies
NSRBX
by Contributor
  • 3410 Views
  • 8 replies
  • 19 kudos

Databricks-connect not available on Databricks Runtime > 10.4

Hello Databricks Team, Databricks-connect doesn't work on Databricks runtime 11.3. Databricks recommends that we use dbx by Databricks Labs instead of databricks-connect, and plans no new feature development for Databricks Connect at this time. D...

Latest Reply
xiangzhu
Contributor III

Thanks @Landan George​, do you have any ETA for the public preview?

7 More Replies
Yaswanth
by New Contributor III
  • 17822 Views
  • 2 replies
  • 12 kudos

Resolved! How can the Delta table protocol version be downgraded, i.e. the table properties minReaderVersion from 2 to 1 and minWriterVersion from 5 to 3?

Is there a possibility to downgrade the Delta table protocol versions minReaderVersion from 2 to 1 and minWriterVersion from 5 to 3? I have set the TBLPROPERTIES to 2 and 5 and column mapping mode to rename the columns in the Delta table, but the other users are rea...

Latest Reply
youssefmrini
Databricks Employee

Unfortunately, you can't downgrade the protocol version; the upgrade is an irreversible operation.
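Since the protocol upgrade is one-way, the usual way out is to rewrite the data into a fresh table created with the lower protocol versions. A sketch (the table names are hypothetical, and this only helps if no feature requiring the higher protocol, such as column mapping, is carried into the new table):

```sql
-- Inspect the current protocol (minReaderVersion / minWriterVersion):
DESCRIBE DETAIL my_table;

-- Rewrite into a new table pinned to the lower protocol versions:
CREATE TABLE my_table_downgraded
TBLPROPERTIES ('delta.minReaderVersion' = '1', 'delta.minWriterVersion' = '3')
AS SELECT * FROM my_table;
```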

1 More Replies
data_serf
by New Contributor
  • 4020 Views
  • 3 replies
  • 1 kudos

Resolved! How to integrate java 11 code in Databricks

Hi all, we're trying to attach Java libraries which are compiled/packaged using Java 11. After doing some research, it looks like even the most recent runtimes use Java 8, which can't run the Java 11 code ("wrong version 55.0, should be 52.0" errors). Is t...

Latest Reply
matthewrj
New Contributor II

I have tried setting JNAME=zulu11-ca-amd64 under Cluster > Advanced options > Spark > Environment variables, but it doesn't seem to work. I still get errors indicating Java 8 is the JRE, and in the Spark UI under "Environment" I still see: Java Home: /u...
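For reference, the setting being attempted is a single cluster environment variable (a sketch of the documented approach; note it is an assumption here that the runtime in use actually bundles a Zulu JDK 11 build, and the variable only takes effect after a full cluster restart, not an edit of a running cluster):

```
# Cluster > Advanced options > Spark > Environment variables
JNAME=zulu11-ca-amd64
```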

2 More Replies
NicolasEscobar
by New Contributor II
  • 9298 Views
  • 7 replies
  • 5 kudos

Resolved! Job fails after runtime upgrade

I have a job running with no issues on Databricks runtime 7.3 LTS. When I upgraded to 8.3, it fails with the error: An exception was thrown from a UDF: 'pyspark.serializers.SerializationError'... SparkContext should only be created and accessed on the driv...

Latest Reply
User16873042682
New Contributor II

Adding to @Sean Owen​'s comments: the only reason this is working is that the optimizer is evaluating it locally rather than creating a context on the executors and evaluating it there.

6 More Replies
sarvesh242
by Contributor
  • 9528 Views
  • 3 replies
  • 2 kudos

Resolved! java.lang.NoSuchMethodError in databricks

I have created a package in Scala. Now I am calling a method from that package and using it in my notebook. At run time, it throws a java.lang.NoSuchMethodError. The method exists in the package, but I still get this error. Plea...

Latest Reply
sarvesh242
Contributor

Hi @Kaniz Fatma​! I am using Scala version 2.11 with Spark 2.4.3. According to the official Apache Spark website https://spark.apache.org/docs/2.4.3/#:~:text=For%20the%20Scala%20API%2C%20Spark,x.) Spark 2.4.3 uses Scala 2.12. (https://spark.apache.org/...
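A run-time java.lang.NoSuchMethodError is the classic symptom of a Scala binary-version mismatch: Scala 2.11 and 2.12 are not binary compatible, so the package's scalaVersion must match the cluster's Scala version exactly at the minor level. A minimal build.sbt sketch (the version numbers are taken from the post above; the dependency shown is an assumption):

```scala
// build.sbt — scalaVersion must match the cluster's Scala version exactly
// at the minor level (2.11 vs 2.12 artifacts are binary incompatible).
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  // "provided": compile against Spark but use the cluster's own jars at run time
  "org.apache.spark" %% "spark-sql" % "2.4.3" % "provided"
)
```

The `%%` operator appends the Scala binary suffix (`_2.11`) to the artifact name, which is what keeps the compiled package and the cluster runtime in agreement.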

2 More Replies
raymund
by New Contributor III
  • 3778 Views
  • 7 replies
  • 5 kudos

Resolved! Why does adding the package 'org.apache.spark:spark-sql-kafka-0-10_2.12:3.0.1' fail on runtime 9.1.x-scala2.12 but succeed on runtime 8.2.x-scala2.12?

Using a Databricks spark-submit job, setting a new cluster: 1] "spark_version": "8.2.x-scala2.12" => OK, works fine. 2] "spark_version": "9.1.x-scala2.12" => FAIL, with errors: Exception in thread "main" java.lang.ExceptionInInitializerError at com.databricks...

Latest Reply
raymund
New Contributor III

This has been resolved by adding the following spark_conf (not through --conf): "spark.hadoop.fs.file.impl": "org.apache.hadoop.fs.LocalFileSystem". Example: "new_cluster": { "spark_version": "9.1.x-scala2.12", ... "spark_conf": { "spar...
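Reconstructed from the truncated snippet above, the job's cluster spec fragment presumably looks something like this (all other job and cluster fields omitted):

```json
{
  "new_cluster": {
    "spark_version": "9.1.x-scala2.12",
    "spark_conf": {
      "spark.hadoop.fs.file.impl": "org.apache.hadoop.fs.LocalFileSystem"
    }
  }
}
```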

6 More Replies
Nickels
by New Contributor II
  • 2101 Views
  • 4 replies
  • 1 kudos

Resolved! Reply on inline runtime commands

I feel like the answer to this question should be simple, but nonetheless I'm struggling. I run Python code that prompts me with the following warning: On my local machine, I can accept this through my terminal and my machine does not run out of memo...

Latest Reply
jose_gonzalez
Databricks Employee

Hi @Nickels Köhling​, in Databricks you will only be able to see the output in the driver logs. If you go to your driver logs, you will see 3 windows displaying the output of "stdout", "stderr" and "log4j". If in your code you do ...

3 More Replies