- 13609 Views
- 3 replies
- 2 kudos
Hi community, I get an AnalysisException when executing the following code in a notebook on a personal compute cluster. It seems to be a permissions issue, but I am logged in with my admin account. Any help would be appreciated. USE CATALOG catalog;
...
Latest Reply
I was having the same issue because I was trying to set the location with an absolute path, just like you did. I solved it by creating an external location, then copying its URL and using it as the location path option.
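For anyone hitting the same error, here is a minimal PySpark sketch of that workaround, assuming a Unity Catalog workspace with an existing storage credential. Every identifier and URL below (the external location name, the storage credential, the abfss path, and the catalog/schema/table names) is a placeholder, not the original poster's setup:

```python
from pyspark.sql import SparkSession

# In a Databricks notebook `spark` is already defined; this is only for self-containment.
spark = SparkSession.builder.getOrCreate()

# Register an external location backed by an existing storage credential,
# then point the table at that governed URL instead of a raw absolute path.
spark.sql("""
    CREATE EXTERNAL LOCATION IF NOT EXISTS ext_sales_loc
    URL 'abfss://data@myaccount.dfs.core.windows.net/sales'
    WITH (STORAGE CREDENTIAL my_storage_credential)
""")

spark.sql("USE CATALOG my_catalog")

spark.sql("""
    CREATE TABLE IF NOT EXISTS my_schema.sales_raw (id BIGINT, amount DOUBLE)
    LOCATION 'abfss://data@myaccount.dfs.core.windows.net/sales/sales_raw'
""")
```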
- 5048 Views
- 5 replies
- 1 kudos
AnalysisException: Column Is There a PO#17748 are ambiguous. It's probably because you joined several Datasets together, and some of these Datasets are the same. This column points to one of the Datasets but Spark is unable to figure out which one. ...
Latest Reply
Hi All, the solution to this problem is very strange. It was caused by the version of the Databricks Runtime. We are using Runtime version 7.0 with Apache Spark 3.0.0. In PRD we are using Runtime version 11.3 LTS with Apache Spark 3.3.0 ver...
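Independently of the runtime version, the usual way to resolve this ambiguous-column error on a self-join is the aliasing that the exception text itself recommends. A minimal PySpark sketch, using a hypothetical `orders` DataFrame whose column name mirrors the one in the original error:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Placeholder data; the "Is There a PO" column name mirrors the original error.
orders = spark.createDataFrame(
    [(1, 0, "yes"), (2, 1, "no")],
    ["order_id", "parent_order_id", "Is There a PO"],
)

# Alias both sides of the self-join so the shared column can be referenced
# by a qualified name instead of ambiguously.
left = orders.alias("l")
right = orders.alias("r")
joined = left.join(right, F.col("l.order_id") == F.col("r.parent_order_id"))

# Backticks quote the column name containing spaces.
result = joined.select(F.col("l.`Is There a PO`"), F.col("r.order_id"))
result.show()
```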
- 4546 Views
- 1 replies
- 2 kudos
While reading data from BigQuery into Databricks I am getting the error: AnalysisException: Multiple sources found for bigquery (com.google.cloud.spark.bigquery.BigQueryRelationProvider, com.google.cloud.spark.bigquery.v2.BigQueryTableProvider), please spe...
Latest Reply
Hi @Parul Paul, could you please check if this is the scenario: https://stackoverflow.com/questions/68623803/load-to-bigquery-via-spark-job-fails-with-an-exception-for-multiple-sources-foun Also, you can refer to: https://github.com/GoogleCloudDatapro...
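A common workaround, suggested by the exception text itself, is to name one of the two provider classes explicitly instead of the short `bigquery` alias. A minimal sketch, assuming the spark-bigquery connector is installed on the cluster; the project, dataset, and table names are placeholders:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Disambiguate between the two BigQuery providers found on the classpath
# by passing a fully qualified class name to format().
df = (
    spark.read
    .format("com.google.cloud.spark.bigquery.BigQueryRelationProvider")
    .option("table", "my-project.my_dataset.my_table")
    .load()
)
df.show()
```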
- 14902 Views
- 5 replies
- 8 kudos
Latest Reply
It seems like the issue was miraculously resolved. I did not make any code changes, but everything is now running as expected. Maybe the latest Runtime 10.4 fix released on April 19th also resolved this issue unintentionally.