by GS2312 • New Contributor II
- 5630 Views
- 6 replies
- 5 kudos
Hi There, I have been trying to create an external table on Azure Databricks with the below statement: df.write.partitionBy("year", "month", "day").format('org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat').option("path", sourcepath).mod...
Latest Reply
Hi @Gaurishankar Sakhare, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best ...
5 More Replies
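The write in the question can be expressed more simply: the short format name "parquet" resolves to the same file format as the fully qualified class path, and supplying an explicit "path" option before saveAsTable is what makes the table external (unmanaged). A minimal sketch, assuming a SparkSession and DataFrame are already in scope; the table name and path here are placeholders, not values from the thread:

```python
# Hedged sketch: create an external, partitioned Parquet table.
# `source_path` and "mydb.events" are illustrative placeholders.

def write_external_table(df, source_path, table_name="mydb.events"):
    """Write df as a partitioned external Parquet table rooted at source_path."""
    (df.write
        .partitionBy("year", "month", "day")  # one directory level per column
        .format("parquet")                    # short name instead of the full class path
        .option("path", source_path)          # explicit path => external (unmanaged) table
        .mode("overwrite")
        .saveAsTable(table_name))

def partition_dir(year, month, day):
    """Directory suffix Spark produces for one partition (shown for illustration)."""
    return f"year={year}/month={month}/day={day}"
```

With this layout, a row with year=2023, month=1, day=5 lands under the directory suffix returned by `partition_dir(2023, 1, 5)`.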
by Long • New Contributor
- 1266 Views
- 1 replies
- 0 kudos
I'm trying to connect to an Azure SQL database using R in Databricks. I want to read the credentials stored in Azure Key Vault rather than hard-coding them in R code. I've seen some examples of it being done in Scala; however, I'm after an R solution...
Latest Reply
Did you find a solution for this, @Long Pham? I am having the same issue.
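One common pattern here is a Key-Vault-backed Databricks secret scope read via `dbutils.secrets.get`, which is also exposed in Databricks R notebooks with the same call shape. A hedged Python sketch of the idea (the scope, key, server, and table names are placeholders, not values from the thread):

```python
# Hedged sketch: read Azure SQL credentials from a Key-Vault-backed secret
# scope, then connect over JDBC. All scope/key/server names are placeholders.

def jdbc_url(server, database):
    """Build the Azure SQL JDBC URL (pure helper, shown for clarity)."""
    return f"jdbc:sqlserver://{server}.database.windows.net:1433;database={database}"

def read_azure_sql(spark, dbutils, server, database, table):
    """Load one table from Azure SQL, credentials pulled from a secret scope."""
    # In an R notebook the equivalent is dbutils.secrets.get(scope = ..., key = ...).
    user = dbutils.secrets.get(scope="kv-scope", key="sql-user")
    password = dbutils.secrets.get(scope="kv-scope", key="sql-password")
    return (spark.read.format("jdbc")
            .option("url", jdbc_url(server, database))
            .option("dbtable", table)
            .option("user", user)
            .option("password", password)
            .load())
```

The secret values never appear in notebook output; Databricks redacts anything fetched through `dbutils.secrets.get`.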
by kinsun • New Contributor II
- 3979 Views
- 2 replies
- 3 kudos
Dear Databricks Expert, I am trying to get a key which is stored in Azure Key Vault, using the Azure Key Vault Keys client library for Python. However, an error was met. Python code: # from azure.identity import DefaultAzureCredential from azure.identity impor...
Latest Reply
Hi @KS LAU, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your q...
1 More Replies
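For reference, the usual shape of the truncated code above is a `DefaultAzureCredential` plus a `KeyClient`. A minimal sketch, assuming the `azure-identity` and `azure-keyvault-keys` packages are installed and that a service principal is configured via environment variables (on a Databricks cluster there is no interactive login, so `DefaultAzureCredential` typically relies on `AZURE_CLIENT_ID`, `AZURE_TENANT_ID`, and `AZURE_CLIENT_SECRET`); the vault name is a placeholder:

```python
# Hedged sketch: fetch a key from Azure Key Vault with azure-keyvault-keys.
# The vault name is a placeholder; credentials come from the environment.

def vault_url(vault_name):
    """Standard Key Vault endpoint for a given vault name (pure helper)."""
    return f"https://{vault_name}.vault.azure.net"

def get_vault_key(vault_name, key_name):
    """Return the named key object from the given vault."""
    # Imports kept inside the function so the sketch reads standalone;
    # requires azure-identity and azure-keyvault-keys.
    from azure.identity import DefaultAzureCredential
    from azure.keyvault.keys import KeyClient

    credential = DefaultAzureCredential()  # env vars / managed identity / CLI login
    client = KeyClient(vault_url=vault_url(vault_name), credential=credential)
    return client.get_key(key_name)
```

Note that keys (cryptographic material) use `KeyClient` from `azure.keyvault.keys`; plain secrets such as passwords use `SecretClient` from `azure.keyvault.secrets` instead.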