by ajbush • New Contributor III
- 20124 Views
- 8 replies
- 3 kudos
Hi all, I'm just reaching out to see if anyone has information or can point me in a useful direction. I need to connect to Snowflake from Azure Databricks using the connector: https://learn.microsoft.com/en-us/azure/databricks/external-data/snowflakeT...
Latest Reply
We ended up using device-flow OAuth because, as noted above, it is not possible to launch a browser on the Databricks cluster from a notebook, so you cannot use the "externalBrowser" flow. It gives you a URL and a code, and you open the URL in a new tab an...
7 More Replies
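For reference, a minimal sketch of what that device-flow approach can look like in a notebook, assuming Azure AD as the OAuth provider (via the MSAL library) and the Snowflake Spark connector; the client ID, tenant, scope, and connection options below are placeholders, not values from the thread:

```python
# Sketch: device-flow OAuth with MSAL, then reading from Snowflake via the Spark connector.
# Assumes `msal` is installed (e.g. %pip install msal); all IDs and option values are placeholders.
import msal

app = msal.PublicClientApplication(
    "<app-client-id>",
    authority="https://login.microsoftonline.com/<tenant-id>",
)

# Device flow: prints a URL and a code to open in a separate browser tab,
# since a browser cannot be launched on the cluster itself.
flow = app.initiate_device_flow(scopes=["<snowflake-oauth-scope>"])
print(flow["message"])
result = app.acquire_token_by_device_flow(flow)  # blocks until sign-in completes

options = {
    "sfUrl": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfDatabase": "<database>",
    "sfSchema": "<schema>",
    "sfWarehouse": "<warehouse>",
    "sfAuthenticator": "oauth",
    "sfToken": result["access_token"],
}

df = (
    spark.read.format("snowflake")
    .options(**options)
    .option("dbtable", "<table>")
    .load()
)
df.show()
```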
- 3559 Views
- 2 replies
- 0 kudos
In our Databricks workspace, we have several Delta tables available in the hive_metastore catalog. We are able to access and query the data via Data Science & Engineering persona clusters with no issues. The cluster has credential passthrough en...
Latest Reply
Hi @Rafael Gomez, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so ...
1 More Replies
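As a reference point, querying a hive_metastore table from a notebook on a credential-passthrough cluster looks the same as any other Spark read; the schema and table names below are placeholders:

```python
# Sketch: reading a Delta table registered in hive_metastore (names are placeholders).
df = spark.table("hive_metastore.default.my_delta_table")
df.show(10)

# Equivalent SQL form:
spark.sql(
    "SELECT COUNT(*) AS row_count FROM hive_metastore.default.my_delta_table"
).show()
```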
- 9441 Views
- 8 replies
- 6 kudos
We've set up a premium workspace with a credential-passthrough cluster. While it does work and can access my ADLS Gen2 storage, I can't make it install a library on the cluster from there, and keep getting "Library installation attempted on the driver no...
Latest Reply
Sorry, I can't figure this out. The link you've added is irrelevant for passthrough credentials; if we add it, the cluster won't be passthrough. Is there a way to add this just for a specific folder, while keeping passthrough for the rest?
7 More Replies
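Not discussed in the thread itself, but one commonly used way around cluster-library restrictions is a notebook-scoped install with %pip, which applies only to the current notebook session rather than the whole cluster; the package name and wheel path below are only examples:

```python
# Sketch: notebook-scoped library install, run in its own notebook cell.
# The package and wheel path are examples, not values from the thread.
%pip install requests
# or, for a custom wheel already uploaded to DBFS:
# %pip install /dbfs/FileStore/wheels/my_pkg-0.1-py3-none-any.whl
```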
- 2491 Views
- 2 replies
- 4 kudos
Hi, I would like to deploy Databricks workspaces to build a delta lakehouse to serve both scheduled jobs/processing and ad-hoc/analytical querying workloads. Databricks users comprise both data engineers and data analysts. In terms of requirements...
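At the code level, those two workloads typically reduce to scheduled Delta writes and ad-hoc reads over the same tables; a minimal sketch, with placeholder paths and table names that are not from the thread:

```python
# Sketch: one Delta table serving both a scheduled job (append) and ad-hoc queries.
# Storage paths, schema, and table names are placeholders.
from pyspark.sql import functions as F

# Scheduled processing: ingest raw files and append to a managed Delta table.
raw = spark.read.format("json").load(
    "abfss://raw@<storage-account>.dfs.core.windows.net/events/"
)
(
    raw.withColumn("ingested_at", F.current_timestamp())
    .write.format("delta")
    .mode("append")
    .saveAsTable("lakehouse.events")
)

# Ad-hoc/analytical querying against the same table.
spark.sql(
    """
    SELECT date_trunc('day', ingested_at) AS day, COUNT(*) AS events
    FROM lakehouse.events
    GROUP BY 1
    ORDER BY 1
    """
).show()
```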
- 5726 Views
- 9 replies
- 5 kudos
What we have: Databricks Workspace Premium on Azure; ADLS Gen2 storage for raw data, processed data (tables), and files like CSV, models, etc. What we want to do: We have users that want to work on Databricks to create and work with Python algorithms. We d...
Latest Reply
Hey @Vartika Nain, we are still in the same situation as described above. The Hive Metastore is a weak point. I would love to have the functionality that a mount can be dedicated to a given cluster. Regards, Gerhard
8 More Replies
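For context on the mount discussion: a mount created with dbutils.fs.mount is visible to the whole workspace, which is exactly the limitation the reply describes; it cannot be dedicated to a single cluster. A minimal sketch of the usual OAuth/service-principal mount, with all IDs, secret names, and paths as placeholders:

```python
# Sketch: mounting an ADLS Gen2 container with a service principal.
# Note: mounts are workspace-wide and cannot be restricted to one cluster.
# All IDs, secret scope/key names, and paths are placeholders.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="<secret-scope>", key="<secret-key>"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/<mount-name>",
    extra_configs=configs,
)
```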