Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

AGnewbie
by New Contributor

Required versus current compute setup

To run the demo and lab notebooks, I am required to have the following Databricks runtime(s): 15.4.x-cpu-ml-scala2.12, but the compute in my setup runs the following runtime version; will that be an issue? 11.3 LTS (includes Apache Spark 3.3.0, Scala 2.1...

  • 182 Views
  • 1 reply
  • 1 kudos
Latest Reply
Alberto_Umana
Databricks Employee

Hello @AGnewbie. Firstly, regarding the Databricks runtime: your compute is currently running version 11.3 LTS, which will indeed be an issue, since it does not match the required version. Hence, you need to update your runtim...

  • 1 kudos
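
For context, here is a minimal sketch of the runtime update Alberto describes, using the Databricks SDK for Python (databricks-sdk). The cluster ID is a placeholder, and because clusters.edit replaces the whole cluster spec, which fields to carry over is an assumption about your setup:

```python
# Sketch: check a cluster's runtime and move it to the required ML runtime.
# Assumes the SDK is installed (pip install databricks-sdk) and auth is
# configured (e.g. DATABRICKS_HOST / DATABRICKS_TOKEN).
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
CLUSTER_ID = "0123-456789-abcdefgh"  # placeholder: your cluster ID

cluster = w.clusters.get(cluster_id=CLUSTER_ID)
print(f"Current runtime: {cluster.spark_version}")  # e.g. 11.3.x-scala2.12

# Verify the required runtime is offered in this workspace.
target = "15.4.x-cpu-ml-scala2.12"
available = {v.key for v in w.clusters.spark_versions().versions}
if target not in available:
    raise ValueError(f"{target} is not available in this workspace")

# clusters.edit replaces the cluster spec, so re-send the fields you want
# to keep (this sketch assumes a fixed-size cluster; autoscale clusters
# carry an autoscale block instead). A running cluster will restart.
w.clusters.edit(
    cluster_id=CLUSTER_ID,
    cluster_name=cluster.cluster_name,
    spark_version=target,
    node_type_id=cluster.node_type_id,
    num_workers=cluster.num_workers,
)
```

Inside a notebook, print(spark.version) is also a quick way to confirm which Spark release (and hence which runtime line) the attached compute is running.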
mathijs-fish
by New Contributor III

Disable personal compute with the Databricks API or UI

For a production environment, I want to disable the personal compute policy, because I do not want all users to be able to create personal compute clusters in production. Unfortunately, I am not able to access the account console, so I want to revoke perm...

(Two screenshots attached.)
Labels: Get Started Discussions, compute, permissions, policies
  • 1475 Views
  • 1 reply
  • 0 kudos
Latest Reply
arpit
Databricks Employee

@mathijs-fish You need to be an admin to disable a policy.

  • 0 kudos
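
For an admin who wants to script this, here is a minimal sketch of revoking access to the Personal Compute policy through the Permissions API with the Databricks SDK for Python; the lookup by display name and the admin group name are assumptions, and permissions.set overwrites the policy's existing ACL:

```python
# Sketch: restrict the built-in Personal Compute policy so ordinary users
# lose CAN_USE. Requires workspace admin rights and the databricks-sdk.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.iam import AccessControlRequest, PermissionLevel

w = WorkspaceClient()

# Look the policy up by its display name ("Personal Compute" by default).
policy = next(p for p in w.cluster_policies.list()
              if p.name == "Personal Compute")

# permissions.set replaces the full ACL on the policy, which drops the
# default all-users CAN_USE grant; only the group below keeps access.
w.permissions.set(
    request_object_type="cluster-policies",
    request_object_id=policy.policy_id,
    access_control_list=[
        AccessControlRequest(
            group_name="cluster-admins",  # placeholder group name
            permission_level=PermissionLevel.CAN_USE,
        )
    ],
)
```

Fully disabling the Personal Compute policy for the whole account is an account-console setting, so either route requires admin rights, as noted above.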
maartenvr
by New Contributor III

Installed Library / Module not found through Databricks Connect LTS 12.2

Hi all, We recently upgraded our Databricks compute cluster from runtime version 10.4 LTS to 12.2 LTS. After the upgrade, one of our Python scripts suddenly fails with a module-not-found error, indicating that our custom module "xml_parser" i...

  • 14395 Views
  • 5 replies
  • 1 kudos
Latest Reply
maartenvr
New Contributor III

FYI: For now we have found a workaround. We are adding the package as a ZIP file to the current Spark session with .addPyFile. So, after creating a Spark session using Databricks Connect, we run the following: spark.sparkContext.addPyFile("C:/path/to/custom...

  • 1 kudos
4 More Replies
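
For reference, here is a minimal sketch of maartenvr's workaround; the path and module name are placeholders, and it assumes the legacy Databricks Connect client for DBR 12.2 LTS, where spark.sparkContext is still exposed:

```python
# Sketch: make a custom package importable on the cluster by shipping it
# as a ZIP through the Spark session created by databricks-connect.
from pyspark.sql import SparkSession

# With legacy Databricks Connect, getOrCreate() returns a session backed
# by the remote cluster set up via `databricks-connect configure`.
spark = SparkSession.builder.getOrCreate()

# Distribute the zipped package; files added this way are placed on the
# executors' Python path (and on the local driver's sys.path).
spark.sparkContext.addPyFile("C:/path/to/custom/xml_parser.zip")

# Import only after addPyFile, so UDFs referencing the module also
# resolve on the workers.
import xml_parser  # placeholder: the custom module from the thread
```

Note that the newer Spark Connect-based Databricks Connect client (DBR 13+) no longer exposes sparkContext, so this workaround is specific to 12.2-era clients.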