Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Load GCP data to Databricks using R

dZegpi
New Contributor II

I'm working with Databricks and Google Cloud in the same project. I want to load specific datasets stored in GCP into an R notebook in Databricks. I can currently see the datasets in BigQuery. The problem is that, using the sparklyr package, I'm not able to see the datasets I have in GCP. Instead, I see the names of other datasets I was not aware of.

This is my R code:

 

library(sparklyr)
sc <- spark_connect(method = "databricks")

# List table names in my Spark connection
dplyr::src_tbls(sc)
# Can't see the tables stored in GCP

# Try (and fail) to load the tables
spark_read_table(sc, "my-table-name")

 

How can I access my tables stored in GCP through Databricks notebooks using R? Is sparklyr the right approach?
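
For context, `src_tbls()` only lists tables registered in the cluster's default catalog, not BigQuery datasets. One possible approach is to read the BigQuery table explicitly through the Spark BigQuery connector (available on Databricks on GCP) via `spark_read_source`. This is a sketch, not a confirmed fix: the project, dataset, and table names below are placeholders, and the exact connector options depend on the cluster's setup and permissions.

```r
library(sparklyr)
sc <- spark_connect(method = "databricks")

# Sketch: read a BigQuery table directly via the Spark BigQuery connector.
# "my-gcp-project", "my_dataset", and "my_table" are placeholder names.
tbl <- spark_read_source(
  sc,
  name    = "my_table",
  source  = "bigquery",
  options = list(
    parentProject = "my-gcp-project",                  # project billed for the read
    table         = "my-gcp-project.my_dataset.my_table"
  )
)
```

If this works, the resulting `tbl` can be used with dplyr verbs like any other sparklyr table reference.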

1 REPLY

dZegpi
New Contributor II

Other than library(SparkR), this is the same code I posted. This does not solve my problem.
