08-07-2021 09:55 AM
When I try to read a Snowflake table from my Databricks notebook, I get an error. My code is:
df1 = spark.read.format("snowflake") \
    .options(**options) \
    .option("query", "select * from abc") \
    .load()
It fails with the following error:
java.sql.SQLException: No suitable driver found for jdbc:snowflake://https://snowflake_username.snowflakecomputing.com
The sfUrl option is set to:
sfUrl="https://snowflake_username.snowflakecomputing.com"
I tried the following variations for sfUrl:
1. Removed "https://" and tried sfUrl="snowflake_username.snowflakecomputing.com"
2. Added "jdbc:snowflake://" and tried sfUrl="jdbc:snowflake://https://snowflake_username.snowflakecomputing.com"
3. Removed "https://" and added "jdbc:snowflake://", trying sfUrl="jdbc:snowflake://snowflake_username.snowflakecomputing.com"
In all cases I get the same error. Can anyone please help me with the correct solution?
Thank you very much in advance.
03-07-2022 02:47 AM
Hi @Madman, can you please update your sfUrl as below:
sfUrl="snowflake_username.snowflakecomputing.com"
Please let me know if this helps.
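To make the reason for this fix explicit: the connector builds the `jdbc:snowflake://` URL itself, so sfUrl must be the bare account host with no scheme. A minimal sketch (the `normalize_sf_url` helper and the account name are hypothetical, not part of the connector) that strips the prefixes tried in the question:

```python
def normalize_sf_url(url: str) -> str:
    """Strip scheme prefixes so sfUrl is the bare account host,
    which is what the Spark Snowflake connector expects.
    The connector prepends jdbc:snowflake:// itself."""
    # Order matters: a doubled prefix like "jdbc:snowflake://https://"
    # is stripped in two passes through this list.
    for prefix in ("jdbc:snowflake://", "https://", "http://"):
        if url.startswith(prefix):
            url = url[len(prefix):]
    return url

# All three variations from the question collapse to the same bare host:
print(normalize_sf_url("https://snowflake_username.snowflakecomputing.com"))
print(normalize_sf_url("jdbc:snowflake://https://snowflake_username.snowflakecomputing.com"))
# → snowflake_username.snowflakecomputing.com in both cases
```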
09-02-2021 03:01 AM
Hi @Madman! My name is Kaniz, and I'm a technical moderator here. Great to meet you, and thanks for your question! Let's see if your peers on the forum have an answer first; otherwise I will follow up shortly with a response.
03-08-2022 03:06 AM
The Databricks Runtime 4.2 native Snowflake connector allows your Databricks account to read data from and write data to Snowflake without importing any libraries. Older Databricks runtimes required importing the Spark connector libraries into your clusters.
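As a sketch of the difference (the `read_customer_table` helper and the option values are placeholders, not a definitive setup): on a runtime with the native connector, the short source name `snowflake` works directly, while older runtimes needed the library's fully qualified name.

```python
# Native connector (Databricks Runtime 4.2+): short name, no library import.
SNOWFLAKE_SOURCE_NAME = "snowflake"

# Older runtimes: attach the spark-snowflake library and use its
# fully qualified source name instead.
LEGACY_SOURCE_NAME = "net.snowflake.spark.snowflake"

def read_customer_table(spark, options):
    """Sketch: read the CUSTOMER table with the native connector.
    `spark` is an existing SparkSession; `options` is the usual dict
    of sfUrl/sfUser/sfPassword/... connection options."""
    return (
        spark.read.format(SNOWFLAKE_SOURCE_NAME)
        .options(**options)
        .option("dbtable", "CUSTOMER")
        .load()
    )
```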
10-19-2022 05:02 PM
Hi,
I am unable to read data from Snowflake using the code given by Databricks.
I am able to get the row count of the table, but any other query fails
with a ResultSet error.
dataset = spark.read.format("snowflake").options(**options).option("dbtable", 'CUSTOMER').load()
# This throws an error
display(dataset)
java.sql.SQLException: Status of query associated with resultSet is FAILED_WITH_ERROR. Results not generated.
However, printing the count works:
display(dataset.count())
Greatly appreciate your help.
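One way to read this symptom: `count()` succeeds because the connector can push it down to Snowflake as `SELECT COUNT(*)`, which never materializes row data, while `display(dataset)` has to fetch actual rows, so a failure in result generation only surfaces then. A hypothetical debugging helper (not a library function) that tries to materialize one column at a time, to narrow down which column makes result generation fail:

```python
def isolate_failing_columns(dataset, sample_rows=10):
    """Try to materialize each column separately so the column that
    makes Snowflake's result generation fail can be identified.
    Returns a list of (column_name, error_message) pairs."""
    failing = []
    for col in dataset.columns:
        try:
            # Fetching a small sample forces actual result generation,
            # unlike count(), which is pushed down as SELECT COUNT(*).
            dataset.select(col).limit(sample_rows).collect()
        except Exception as exc:
            failing.append((col, str(exc)))
    return failing
```

If every column fails individually, the problem is more likely on the Snowflake side (for example a suspended warehouse or expired credentials) than in any particular column's data; Snowflake's query history for the failed query usually shows the underlying error.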
08-02-2023 06:04 AM
Hi @Sashi_Gunturu, we are getting the same error. Were you able to find a workaround? Our process was running as expected, but it suddenly started failing with this error this morning.
java.sql.SQLException: Status of query associated with resultSet is FAILED_WITH_ERROR. Results not generated.
08-22-2023 03:13 AM
@anurag2192, did you manage to solve it?