The open source Spark connector for Snowflake is included by default in the Databricks Runtime, so no additional installation is required. To connect, you can use code like the following:
# Use the secrets DBUtil to retrieve Snowflake credentials from a secret scope.
# Each credential is stored under its own secret key.
user = dbutils.secrets.get("<scope>", "<username secret key>")
password = dbutils.secrets.get("<scope>", "<password secret key>")
url = dbutils.secrets.get("<scope>", "<url secret key>")
# Snowflake connection options
options = {
    "sfUrl": url,
    "sfUser": user,
    "sfPassword": password,
    "sfDatabase": "<database>",
    "sfSchema": "<schema>",
    "sfWarehouse": "<warehouse>"
}
# Read data from a Snowflake table into a DataFrame.
df = (spark.read
    .format("snowflake")
    .options(**options)
    .option("dbtable", "<table name>")
    .load())
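
If you only need a subset of the data, the connector also accepts a SQL query in place of a table name via the query option, which pushes the query down to Snowflake. The sketch below reuses the same options dictionary; the query text is a placeholder you would substitute yourself.

# Push a query down to Snowflake instead of reading an entire table.
df = (spark.read
    .format("snowflake")
    .options(**options)
    .option("query", "SELECT * FROM <table name> WHERE <condition>")
    .load())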
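
Writing works the same way through the DataFrame writer. A minimal sketch, assuming the same options dictionary and a placeholder target table name:

# Write the DataFrame back to a Snowflake table, appending if it already exists.
(df.write
    .format("snowflake")
    .options(**options)
    .option("dbtable", "<target table>")
    .mode("append")
    .save())

Other save modes such as "overwrite" behave as they do for any Spark data source; "append" is used here only as an example.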