Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

thomann
by New Contributor III
  • 6150 Views
  • 3 replies
  • 6 kudos

Bug? Unity Catalog incompatible with sparklyr in RStudio (on the driver), and also when used on one cluster from multiple notebooks?

If I start an RStudio Server with an in-cluster init script as described here on a Unity Catalog cluster, the sparklyr connection fails with an error about a missing Credential Scope. I tried it both in 11.3 LTS and 12.0 Beta. I tried it only in a Persona...
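For context, a minimal sparklyr connection sketch on a Databricks cluster (a sketch only: it assumes RStudio runs on the driver and uses the documented method = "databricks" connection, and it does not by itself address the Unity Catalog credential-scope error described above):

library(sparklyr)
library(dplyr)

# Attach to the Spark session already running on the Databricks driver
sc <- spark_connect(method = "databricks")

# Quick smoke test: build a small Spark DataFrame and pull it back locally
sdf_len(sc, 5) %>% collect()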

Latest Reply
kunalmishra9
New Contributor III
  • 6 kudos

Have run into this issue as well. Let me know if there was any resolution 

2 More Replies
niklas
by Contributor
  • 2794 Views
  • 2 replies
  • 1 kudos

Resolved! How can I specify a custom CRAN mirror to be used permanently by default when installing packages within R Notebooks?

When installing notebook-scoped R libraries I don't want to manually specify the custom CRAN mirror each time, like this: install.packages("diffdf", repos = "my_custom_cran_url"). Instead I want the custom CRAN mirror URL to be used by default so that I don'...

Latest Reply
niklas
Contributor
  • 1 kudos

Got a solution on Stack Overflow for this problem: https://stackoverflow.com/a/76777228/18082636
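For reference, a minimal R sketch of the general approach (the URL below is a placeholder for your real mirror): setting the repos option once, for example at the top of a notebook or in an .Rprofile/Rprofile.site that the cluster picks up, lets install.packages() use the mirror without repeating repos= on every call.

# Point the R session at a custom CRAN mirror (placeholder URL)
options(repos = c(CRAN = "https://my_custom_cran_url"))

# Subsequent installs pick up the mirror without an explicit repos= argument
install.packages("diffdf")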

1 More Replies
Ross
by New Contributor II
  • 3306 Views
  • 4 replies
  • 0 kudos

Failed to install cluster scoped SparkR library

I'm attempting to install SparkR on the cluster and have successfully installed other packages such as tidyverse via CRAN. The error is copied below; any help you can provide is greatly appreciated! Databricks runtime 10.4 LTS. Library installation attempted on ...

Latest Reply
Vivian_Wilfred
Databricks Employee
  • 0 kudos

Hi @Ross Hamilton, I believe SparkR comes built in with Databricks RStudio and you don't have to install it explicitly. You can import it directly with library(SparkR), and it works for you per your comment above. The error message you see could be re...
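To illustrate the point, a minimal sketch assuming a standard Databricks runtime where SparkR ships preinstalled and a Spark session is already available in the notebook (in RStudio you may still need to call sparkR.session() first):

# SparkR is bundled with the Databricks runtime; no cluster-scoped library is needed
library(SparkR)

# Minimal check: create a Spark DataFrame from a built-in R dataset
df <- createDataFrame(faithful)
head(df)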

3 More Replies
User16752239289
by Databricks Employee
  • 3197 Views
  • 1 reply
  • 1 kudos

Resolved! SparkR session failed to initialize

When I run sparkR.session() I face the error below: Spark package found in SPARK_HOME: /databricks/spark   Launching java with spark-submit command /databricks/spark/bin/spark-submit sparkr-shell /tmp/Rtmp5hnW8G/backend_porte9141208532d   Error: Could not f...

Latest Reply
User16752239289
Databricks Employee
  • 1 kudos

This happens because when users run their R scripts in RStudio, the R session is not shut down gracefully. Databricks is working on handling the R session better and removing this limit. As a workaround, you can create and run the init script below to increase...

dshosseinyousef
by New Contributor II
  • 6232 Views
  • 2 replies
  • 0 kudos

How to extract the year and week number from a column in a SparkDataFrame?

I have the following SparkDataFrame:
sale_id / created_at
1 / 2016-05-28T05:53:31.042Z
2 / 2016-05-30T12:50:58.184Z
3 / 2016-05-23T10:22:18.858Z
4 / 2016-05-27T09:20:15.158Z
5 / 2016-05-21T08:30:17.337Z
6 / 2016-05-28T07:41:14.361Z
I need to add a year-wee...
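One possible SparkR sketch for this (assuming the DataFrame is named df and the timestamp column is created_at; year() and weekofyear() are standard SparkR column functions):

library(SparkR)

# If created_at is still a string, convert it to a timestamp first
df <- withColumn(df, "created_ts", to_timestamp(df$created_at))

# Derive the year and the ISO week number
df <- withColumn(df, "year", year(df$created_ts))
df <- withColumn(df, "week", weekofyear(df$created_ts))

head(select(df, "sale_id", "year", "week"))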

Latest Reply
theodondre
New Contributor II
  • 0 kudos

This is how the documentation looks:

1 More Replies