I'm working with Databricks and Google Cloud in the same project, and I want to load specific datasets stored in GCP into an R notebook in Databricks. I can currently see the datasets in BigQuery. The problem is that, using the sparklyr package, I'm not able to see those datasets. Instead, I see the names of other datasets I was not aware of.
This is my R code:
library(sparklyr)
sc <- spark_connect(method = "databricks")
# List table names in my spark connection
dplyr::src_tbls(sc)
# Can't see the tables stored in GCP
# Try (and fail) to load the tables
spark_read_table(sc, "my-table-name")
How can I access my tables stored in GCP from a Databricks notebook using R? Is sparklyr the right approach?
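
From what I've read, src_tbls() only lists tables registered in the cluster's metastore, not BigQuery datasets, so I suspect the table has to be loaded explicitly through the spark-bigquery connector (which I understand ships with Databricks on Google Cloud). Is something like the following sketch the intended route? Here "my-project.my_dataset.my_table" and the parentProject option are placeholders for my own values:

library(sparklyr)
library(dplyr)

sc <- spark_connect(method = "databricks")

# Load a BigQuery table through the spark-bigquery connector instead of
# the metastore. "my-project.my_dataset.my_table" is a placeholder for
# the fully qualified BigQuery table ID.
bq_tbl <- spark_read_source(
  sc,
  name = "my_table_view",                      # temp view name on the Spark side
  path = "my-project.my_dataset.my_table",     # BigQuery table to load
  source = "bigquery",
  options = list(parentProject = "my-project") # billing project, if required
)

# The table should now be visible in the connection and usable with dplyr
dplyr::src_tbls(sc)
bq_tbl %>% head(10) %>% collect()

If sparklyr isn't the right layer for this, I'd also be fine with reading the table in a Python cell via spark.read.format("bigquery") and registering a temp view that the R session can then query.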