Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

niklas
by Contributor
  • 2993 Views
  • 2 replies
  • 1 kudos

Resolved! How can I specify a custom CRAN mirror to be used permanently by default when installing packages within R Notebooks?

When installing Notebook-scoped R libraries, I don't want to manually specify the custom CRAN mirror each time, like this: install.packages("diffdf", repos="my_custom_cran_url"). Instead I want the custom CRAN mirror URL to be used by default so that I don'...

Latest Reply
niklas
Contributor
  • 1 kudos

Got a solution on Stack Overflow for this problem: https://stackoverflow.com/a/76777228/18082636
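The accepted approach boils down to making the mirror the session-wide default so `repos=` is no longer needed per call. A minimal sketch as a cluster init script — the mirror URL, target path, and `RPROFILE_SITE` override are placeholders, not details from the thread (on a real cluster the file would typically live under R's `etc/Rprofile.site`):

```shell
#!/bin/bash
# Hypothetical init script: append a default CRAN mirror to Rprofile.site so
# every R session picks it up and install.packages() works without repos=.
# RPROFILE_SITE and the URL are placeholders for illustration.
RPROFILE="${RPROFILE_SITE:-/tmp/Rprofile.site}"
cat >> "$RPROFILE" <<'EOF'
options(repos = c(CRAN = "https://my-custom-cran.example.com"))
EOF
```

After the cluster restarts with this script, `install.packages("diffdf")` would resolve against the configured mirror by default.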

1 More Replies
rshark
by New Contributor II
  • 7306 Views
  • 3 replies
  • 0 kudos

Error when calling SparkR from within a Python notebook

I’ve had success with R magic (R cells in a Python notebook) and running an R script from a Python notebook, up to the point of connecting R to a Spark cluster. In either case, I can’t get a `SparkSession` to initialize. 2-cell (Python) notebook exa...

Latest Reply
Dooley
Valued Contributor II
  • 0 kudos

The answer I can give you is to call the R notebooks from your Python notebook. Just save each dataframe as a Delta table to pass data between the languages. How to call a notebook from another notebook? Here is a link
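The suggested pattern can be sketched as follows. This is a hypothetical sketch: `dbutils` and `spark` exist only inside a Databricks runtime, and the notebook path and table name below are invented for illustration:

```python
# Hypothetical sketch: orchestrate an R notebook from Python and hand data
# back through a Delta table. `dbutils` and `spark` are provided by the
# Databricks runtime; the notebook path and table name are placeholders.

def run_r_notebook_and_read(table_name="tmp_r_handoff"):
    # 1. Run the R notebook. Inside it, SparkR code would persist its result
    #    to `table_name`, e.g. with saveAsTable(df, "tmp_r_handoff").
    dbutils.notebook.run("/Shared/prepare_data_in_r", timeout_seconds=600)
    # 2. Read the table the R notebook produced back into a PySpark DataFrame.
    return spark.read.table(table_name)
```

This avoids initializing SparkR inside a Python notebook entirely: each language runs in its own notebook, and the Delta table is the hand-off point.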

2 More Replies
User16752239289
by Databricks Employee
  • 3326 Views
  • 1 reply
  • 1 kudos

Resolved! SparkR session failed to initialize

When running sparkR.session() I faced the below error: Spark package found in SPARK_HOME: /databricks/spark Launching java with spark-submit command /databricks/spark/bin/spark-submit sparkr-shell /tmp/Rtmp5hnW8G/backend_porte9141208532d Error: Could not f...

Latest Reply
User16752239289
Databricks Employee
  • 1 kudos

This happens because when users run their R scripts on RStudio, the R session is not shut down gracefully. Databricks is working on handling the R session better and removing the limit. As a workaround, you can create and run the below init script to increase...
