Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Yaswanth
by New Contributor III
  • 18105 Views
  • 2 replies
  • 12 kudos

Resolved! How can the Delta table protocol version be downgraded, i.e. the table properties minReaderVersion from 2 to 1 and minWriterVersion from 5 to 3?

Is there a possibility to downgrade the Delta table protocol versions, minReaderVersion from 2 to 1 and minWriterVersion from 5 to 3? I have set the table properties to 2 and 5 and enabled column mapping mode to rename the columns in the Delta table, but the other users are rea...

Latest Reply
youssefmrini
Databricks Employee
  • 12 kudos

Unfortunately, you can't downgrade the protocol version; the upgrade is an irreversible operation.
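For context, a minimal sketch (the table name is hypothetical) of how the protocol gets bumped when column mapping is enabled, and of how to check which protocol a table is currently on. Because the upgrade can't be undone, the usual workaround is to write the data out into a freshly created table that stays on the lower protocol.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already defined in a Databricks notebook

# Enabling column mapping is what forces the protocol upgrade in the first place.
spark.sql("""
    ALTER TABLE my_schema.my_table SET TBLPROPERTIES (
        'delta.columnMapping.mode' = 'name',
        'delta.minReaderVersion'   = '2',
        'delta.minWriterVersion'   = '5'
    )
""")

# Inspect the protocol the table is currently on; there is no ALTER that lowers it.
spark.sql("DESCRIBE DETAIL my_schema.my_table") \
    .select("minReaderVersion", "minWriterVersion") \
    .show()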

1 More Replies
Data_Engineer3
by Contributor III
  • 4069 Views
  • 1 replies
  • 7 kudos

Move a folder from a DBFS location to the user workspace directory in Azure Databricks

I need to move a group of files (Python or Scala files) or a folder from a DBFS location to the user workspace directory in Azure Databricks to do testing on the files. It's very difficult to upload each file one by one into the user workspace directory, so is it...

Latest Reply
-werners-
Esteemed Contributor III
  • 7 kudos

dbutils.fs.mv or dbutils.fs.cp can help you.
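A minimal sketch of the copy, with hypothetical source and destination paths; recurse=True takes the whole folder, and dbutils.fs.mv can be swapped in to move rather than copy (dbutils is predefined in any Databricks notebook).

# Paths below are placeholders; adjust to your own DBFS folder and workspace location.
src = "dbfs:/FileStore/my_scripts"
dst = "file:/Workspace/Users/me@example.com/my_scripts"  # workspace files path; availability depends on runtime

# Copy the entire folder; use dbutils.fs.mv(src, dst, recurse=True) to move instead.
dbutils.fs.cp(src, dst, recurse=True)

# Quick sanity check that the files landed.
print(dbutils.fs.ls(dst))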

kthneighbor
by New Contributor II
  • 3201 Views
  • 5 replies
  • 2 kudos

Resolved! What will be the next LTS version after 10.4?

What will be the next LTS version after 10.4?

Latest Reply
youssefmrini
Databricks Employee
  • 2 kudos

Hello, 11.3 LTS is now available: https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/11.3

4 More Replies
IdanYaffe
by New Contributor II
  • 1393 Views
  • 1 replies
  • 2 kudos

Cannot find an AWS Cloudwatch init script that supports runtime 11.x

Hi all, I'm using the AWS CloudWatch global init script in order to monitor my clusters' instances. I'm also using Delta Live Tables with some Auto Loader jobs. Unfortunately, the Delta Live Tables pipelines are now running runtime version 11. As a result, newly created pipe...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 2 kudos

Unfortunately, in Delta Live Tables you cannot specify the runtime (other than the current and preview channels, which you mentioned). It would be helpful if DLT runtime releases were documented on the Databricks side the same way as the SQL, ML, and standard ones @Kaniz ...

Christine
by Contributor II
  • 8774 Views
  • 7 replies
  • 5 kudos

Resolved! 'autoML' is not found when using databricks.automl with runtime 11.2 ML (and runtime 10.4 LTS ML).

I have tried to set up an AutoML experiment with runtime 11.2 ML and data from a Delta table. However, I receive the errors "ModuleNotFoundError: No module named 'databricks.automl'" and "AutoML not available: Use Databricks Runtime 8.3 ML or above," tho...

Latest Reply
Christine
Contributor II
  • 5 kudos

I deleted the cluster and created a new one with runtime 9.1 LTS ML, which solved the problem.
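For anyone hitting the same error: databricks.automl ships only with the ML runtimes, so the cluster must be on an ML DBR. A rough sketch of starting an AutoML classification run once an ML runtime is in place (the table and column names are made up):

from databricks import automl   # only importable on a Databricks ML runtime

# Hypothetical Delta table with a label column called "label".
df = spark.read.table("my_schema.training_data")   # `spark` is predefined in a notebook

summary = automl.classify(
    dataset=df,
    target_col="label",
    timeout_minutes=30,   # stop the experiment after 30 minutes
)
print(summary.best_trial.model_path)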

6 More Replies
ahuarte
by New Contributor III
  • 18668 Views
  • 17 replies
  • 3 kudos

Resolved! Getting Spark & Scala version in Cluster node initialization script

Hi there, I am developing a cluster node initialization script (https://docs.gcp.databricks.com/clusters/init-scripts.html#environment-variables) in order to install some custom libraries. Reading the Databricks docs, we can get some environment var...

Latest Reply
Lingesh
Databricks Employee
  • 3 kudos

We can infer the cluster DBR version using the environment variable $DATABRICKS_RUNTIME_VERSION. (For the exact Spark/Scala version mapping, you can refer to the specific DBR release notes.) Sample usage inside an init script: DBR_10_4_VERSION="10.4" if [[ "$DATABRICKS_...
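The reply's sample is a bash init script (truncated above). Purely as an illustration of the same env-var check, here is a small Python sketch that reads the variable on the cluster and picks a matching, made-up library build; DBR 10.4 ships Spark 3.2 and DBR 11.x ships Spark 3.3.

import os

dbr_version = os.environ.get("DATABRICKS_RUNTIME_VERSION", "")

if dbr_version.startswith("10.4"):
    wheel = "my_custom_lib-1.0-spark3.2.whl"   # hypothetical artifact for DBR 10.4 (Spark 3.2)
elif dbr_version.startswith("11."):
    wheel = "my_custom_lib-1.0-spark3.3.whl"   # hypothetical artifact for DBR 11.x (Spark 3.3)
else:
    raise RuntimeError(f"Unsupported DBR version: {dbr_version!r}")

print(f"DBR {dbr_version} -> would install {wheel}")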

16 More Replies
Hooli
by New Contributor II
  • 2772 Views
  • 3 replies
  • 1 kudos

Resolved! When is the End of Life (EOL) date for Databricks Runtime 7.3 LTS? Can we still use an unsupported version till its EOL date?

The End of Support (EOS) date sneaked up on us, and we are now wondering if we can delay our upgrade past the EOS date. Could you please help us analyze the risks of operating a DBR version past its EOS date?

Latest Reply
Vidula
Honored Contributor
  • 1 kudos

Hi @Syed Zaffar​, does @Prabakar Ammeappin​'s response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!

2 More Replies
77796
by New Contributor II
  • 4399 Views
  • 4 replies
  • 0 kudos

Databricks S3A error - java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.s3a.commit.S3ACommitterFactory not found

We are getting the below error for runtime 10.x and 11.x when writing to S3 via the saveAsNewAPIHadoopFile function. The same jobs are running fine on runtime 9.x and 7.x. The difference between 9.x and 10.x is that the former has Hadoop 2.7 bindings with sp...

Latest Reply
77796
New Contributor II
  • 0 kudos

We have resolved this issue by using the s3 scheme instead of s3a, i.e. pairRDD.saveAsNewAPIHadoopFile("s3://bucket/testout.dat",
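For context, a minimal sketch of a call using the s3:// scheme (the bucket, data, and key/value classes are assumptions, not the poster's exact job):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()   # predefined in a Databricks notebook
sc = spark.sparkContext

# Hypothetical pair RDD; the real job produces its own (key, value) pairs.
pair_rdd = sc.parallelize([("k1", "v1"), ("k2", "v2")])

pair_rdd.saveAsNewAPIHadoopFile(
    "s3://my-bucket/testout.dat",   # s3:// scheme; s3a:// hit the S3ACommitterFactory error on DBR 10.x/11.x
    "org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat",
    keyClass="org.apache.hadoop.io.Text",
    valueClass="org.apache.hadoop.io.Text",
)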

3 More Replies
isaac_gritz
by Databricks Employee
  • 1857 Views
  • 1 replies
  • 6 kudos

Versions of Spark, Python, Scala, R in each Databricks Runtime

What versions of Spark, Python, Scala, and R are included in each Databricks Runtime? What libraries are pre-installed? You can find this info at the Databricks runtime releases page (AWS | Azure | GCP). Let us know if you have any additional questions on t...

Latest Reply
maxdata
Databricks Employee
  • 6 kudos

Wow! Thanks for the help @Isaac Gritz​ !

NicolasEscobar
by New Contributor II
  • 9557 Views
  • 7 replies
  • 5 kudos

Resolved! Job fails after runtime upgrade

I have a job running with no issues on Databricks runtime 7.3 LTS. When I upgraded to 8.3, it fails with the error: An exception was thrown from a UDF: 'pyspark.serializers.SerializationError'... SparkContext should only be created and accessed on the driv...

Latest Reply
User16873042682
New Contributor II
  • 5 kudos

Adding to @Sean Owen​'s comments: the only reason this was working before is that the optimizer was evaluating the expression locally on the driver rather than creating a context on the executors and evaluating it there.
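A minimal sketch of the anti-pattern being discussed (names are illustrative): the UDF closes over the SparkSession, which only works if the expression happens to be evaluated on the driver. The fix is to compute driver-side values up front and pass plain Python values into the UDF.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.range(5)

# Anti-pattern: the UDF references the SparkSession, which is not usable on executors.
@F.udf("long")
def bad_lookup(x):
    return spark.range(1).count() + x   # breaks once it actually runs on an executor

# Safer pattern: do the driver-side work first, then capture only a plain value.
offset = spark.range(1).count()

@F.udf("long")
def good_lookup(x):
    return offset + x   # `offset` is just an int serialized with the closure

df.select(good_lookup(F.col("id"))).show()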

6 More Replies
RiyazAli
by Valued Contributor II
  • 2697 Views
  • 1 replies
  • 3 kudos

Errors in notebooks of Scalable Machine Learning with Apache Spark course in Databricks academy.

Hi there, I'm following the course mentioned from Databricks Academy. I downloaded the .dbc archive and am working alongside the videos from the academy. In the ML-08 - Hyperopt notebook, I see the following error in cmd 13: best_hyperparam = fmin(fn=objectiv...

Latest Reply
RiyazAli
Valued Contributor II
  • 3 kudos

Tagging @Kaniz Fatma​ as there was no response whatsoever! By any chance, do you know how to resolve these errors in the notebook? Thanks!

venkad
by Contributor
  • 1742 Views
  • 1 replies
  • 2 kudos

Resolved! Is Databricks Light Runtime Discontinued?

The last Databricks Light runtime release was 2.4 Extended Support. There was no Light version for Spark 3.x. Is the Databricks Light runtime discontinued? If not, when can we expect the next DBR Light version?

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Venkadeshwaran K​, I looked around, and it does look like there won't be future light runtimes. We can't hire enough engineers to maintain and develop everything, and light is one of the casualties of that.

Emiel_Smeenk
by New Contributor III
  • 13783 Views
  • 5 replies
  • 8 kudos

Resolved! Databricks Runtime 10.4 LTS - AnalysisException: No such struct field id in 0, 1 after upgrading

Hello, we are working to migrate to Databricks runtime 10.4 LTS from 9.1 LTS, but we're running into weird behavioral issues. Our existing code works up until runtime 10.3, and in 10.4 it stopped working. Problem: we have a nested JSON file that we are fl...

Latest Reply
Emiel_Smeenk
New Contributor III
  • 8 kudos

It seems like the issue was miraculously resolved. I did not make any code changes but everything is now running as expected. Maybe the latest runtime 10.4 fix released on April 19th also resolved this issue unintentionally.

4 More Replies
msoczka
by New Contributor II
  • 3267 Views
  • 3 replies
  • 3 kudos

Resolved! Cluster terminated.Reason:Unexpected launch failure

Using Community Edition. The cluster starts up and immediately terminates with the following error message. Why would that be? Message: Cluster terminated. Reason: Unexpected launch failure. An unexpected error was encountered while setting up the cluster. Pleas...

Latest Reply
Manav
New Contributor II
  • 3 kudos

Hi, I am getting the same error. Can @DARSHAN BARGAL​  please help?

2 More Replies