Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

kfoster
by Contributor
  • 2747 Views
  • 6 replies
  • 7 kudos

Azure DevOps Repo - Invalid Git Credentials

I have a Repo in Databricks connected to Azure DevOps Repositories. The repo has been working fine for almost a month, until last week. Now when I try to open the Git settings in Databricks, I am getting "Invalid Git Credentials". Nothing has change...

Latest Reply
tbMark
New Contributor II

Same symptoms, same issue. Azure support hasn't figured it out.

5 More Replies
ajbush
by New Contributor III
  • 13715 Views
  • 8 replies
  • 2 kudos

Connecting to Snowflake using an SSO user from Azure Databricks

Hi all, I'm just reaching out to see if anyone has information or can point me in a useful direction. I need to connect to Snowflake from Azure Databricks using the connector: https://learn.microsoft.com/en-us/azure/databricks/external-data/snowflakeT...

Latest Reply
BobGeor_68322
New Contributor II

We ended up using device flow OAuth because, as noted above, it is not possible to launch a browser on the Databricks cluster from a notebook, so you cannot use the "externalBrowser" flow. It gives you a URL and a code, and you open the URL in a new tab an...
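For anyone hitting the same wall, here is a minimal sketch of that device-flow approach, assuming an Azure AD app registration for Snowflake OAuth; every ID, URL, and scope below is a placeholder, not something confirmed in this thread:

import msal

app = msal.PublicClientApplication(
    client_id="<azure-ad-app-client-id>",
    authority="https://login.microsoftonline.com/<tenant-id>",
)

# initiate_device_flow returns a verification URL plus a one-time code;
# open the URL in a separate browser tab (no browser runs on the cluster).
flow = app.initiate_device_flow(scopes=["<snowflake-oauth-scope>/.default"])
print(flow["message"])
result = app.acquire_token_by_device_flow(flow)  # blocks until you finish signing in

# Hand the resulting access token to the Spark Snowflake connector.
df = (spark.read
      .format("snowflake")
      .option("sfUrl", "<account>.snowflakecomputing.com")
      .option("sfUser", "<sso-user@example.com>")
      .option("sfDatabase", "<database>")
      .option("sfSchema", "<schema>")
      .option("sfWarehouse", "<warehouse>")
      .option("sfAuthenticator", "oauth")
      .option("sfToken", result["access_token"])
      .option("dbtable", "<table>")
      .load())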

7 More Replies
Jyo777
by Contributor
  • 4116 Views
  • 5 replies
  • 4 kudos

Need help with Azure Databricks questions on CTE and SQL syntax within notebooks

Hi amazing community folks, feel free to share your experience or knowledge regarding the questions below: 1.) Can we pass a CTE SQL statement into Spark JDBC? I tried to do it and couldn't, but I can pass normal SQL (Select * from) and it works. I heard th...

Latest Reply
vijaypavann
New Contributor II

CTE expressions are supported with the `prepareQuery` option (https://spark.apache.org/docs/latest/sql-data-sources-jdbc.html): "A prefix that will form the final query together with query. As the specified query will be parenthesized as a subquery i...
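A minimal sketch of what that looks like from a notebook, assuming a SQL Server source and a Spark version that supports prepareQuery; the URL, credentials, and table names are placeholders:

# The CTE goes in prepareQuery; Spark prepends it to the final query.
df = (spark.read
      .format("jdbc")
      .option("url", "jdbc:sqlserver://<host>:1433;databaseName=<db>")
      .option("user", "<user>")
      .option("password", "<password>")
      .option("prepareQuery", "WITH recent AS (SELECT * FROM dbo.orders WHERE order_date > '2024-01-01')")
      .option("query", "SELECT customer_id, COUNT(*) AS n FROM recent GROUP BY customer_id")
      .load())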

4 More Replies
Valentin14
by New Contributor II
  • 6825 Views
  • 6 replies
  • 4 kudos

Import module never ends on random branches

Hello, since a week ago our notebooks have been stuck running on the first cells, which import Python modules from our GitHub repository cloned in Databricks. The cells stay in the running state, and when we try to manually cancel the jobs in databric...

Latest Reply
timo199
New Contributor II

@Kaniz_Fatma 

5 More Replies
MRTN
by New Contributor III
  • 3859 Views
  • 3 replies
  • 2 kudos

Resolved! Configure multiple source paths for auto loader

I am currently using two streams to monitor data in two different containers on an Azure storage account. Is there any way to configure an autoloader to read from two different locations? The schemas of the files are identical.

Latest Reply
Anonymous
Not applicable

@Morten Stakkeland: Yes, it's possible to configure an autoloader to read from multiple locations. You can define multiple CloudFiles sources for the autoloader, each pointing to a different container in the same storage account. In your case, since ...
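A minimal sketch of that pattern, unioning one Auto Loader stream per container; the account, container, and checkpoint paths are placeholders, and each stream gets its own schema location:

def make_stream(path, schema_loc):
    return (spark.readStream
            .format("cloudFiles")
            .option("cloudFiles.format", "json")
            .option("cloudFiles.schemaLocation", schema_loc)
            .load(path))

base = "abfss://meta@<account>.dfs.core.windows.net/autoloader"
df = (make_stream("abfss://container1@<account>.dfs.core.windows.net/data", base + "/schema1")
      .unionByName(make_stream("abfss://container2@<account>.dfs.core.windows.net/data", base + "/schema2")))

(df.writeStream
   .option("checkpointLocation", base + "/checkpoint")
   .trigger(availableNow=True)
   .toTable("bronze.events"))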

2 More Replies
FerArribas
by Contributor
  • 8504 Views
  • 6 replies
  • 6 kudos

Resolved! Redirect error in access to web app in Azure Databricks with private front endpoint

I have created a workspace with private endpoint in Azure following this guide: https://learn.microsoft.com/en-us/azure/databricks/administration-guide/cloud-configurations/azure/private-link Once I have created the private link of type browser_authent...

Latest Reply
flomader
New Contributor II

You don't need a CNAME record. Go to your private link resource in Azure and click on Settings > DNS Configuration. Make sure you have created private link A records for all the FQDNs listed under 'Custom DNS records'. You have most likely missed one ...

5 More Replies
jwilliam
by Contributor
  • 3157 Views
  • 4 replies
  • 2 kudos

Resolved! How to mount Azure Blob Storage with OAuth2?

We already know that we can mount Azure Data Lake Gen2 with OAuth2 using this: configs = {"fs.azure.account.auth.type": "OAuth", "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider", ...

Latest Reply
dssatpute
New Contributor II

Try replacing wasbs with abfss and dfs with blob in the URI, should work! 
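For reference, a minimal sketch of the abfss mount; the service principal IDs, secret scope, and container/account names are placeholders:

configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope="<scope>", key="<key>"),
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/<mount-name>",
    extra_configs=configs,
)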

3 More Replies
Arby
by New Contributor II
  • 8868 Views
  • 4 replies
  • 0 kudos

Help With OSError: [Errno 95] Operation not supported: '/Workspace/Repos/Connectors....

Hello, I am experiencing issues with importing the schema file I created from the utils repo. This is the logic we use for all ingestion, and all other schemas live in this repo: utills/schemas. I am unable to access the file I created for a new ingestion pipe...

Latest Reply
Arby
New Contributor II

@Debayan Mukherjee Hello, thank you for your response. Please let me know if these are the correct commands to access the file from the notebook. I can see the files in the repo folder, but I just noticed this: the file I am trying to access has a size of 0 b...
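In case it helps anyone debugging the same thing, a quick sanity check on the file from a notebook; the path below is a hypothetical placeholder:

import os

p = "/Workspace/Repos/<user>/<repo>/utills/schemas/<schema_file>.py"
print(os.path.exists(p), os.path.getsize(p))  # a size of 0 means the file is empty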

3 More Replies
Anonymous
by Not applicable
  • 5602 Views
  • 11 replies
  • 2 kudos

SQL Serverless option is missing when using Azure Databricks Workspace with No Public IP and VNET Injection

Hello, after creating a Databricks Workspace in Azure with No Public IP and VNET Injection, I'm unable to use DBSQL Serverless because the option to enable it in SQL warehouse settings is missing. Is it by design? Is it a limitation when using Privat...

Latest Reply
RomanLegion
New Contributor III

Fixed: go to Profile -> Compute -> SQL Server Serverless -> On -> Save. For some reason this has been disabled for us.

10 More Replies
mk1987c
by New Contributor III
  • 3917 Views
  • 5 replies
  • 1 kudos

Resolved! I am trying to use Databricks Autoloader with File Notification Mode

When I run my command for readStream using .option("cloudFiles.useNotifications", "true"), it starts reading the files from Azure Blob (please note that I did not provide configuration like subscription ID, client ID, connection string and all while...

Latest Reply
jose_gonzalez
Moderator

Hi, I would like to share the following docs that might be able to help you with this issue: https://docs.databricks.com/ingestion/auto-loader/file-notification-mode.html#required-permissions-for-configuring-file-notification-for-adls-gen2-and-azure-b...
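Per those docs, file notification mode needs the Azure identifiers passed explicitly so Auto Loader can create the Event Grid subscription and queue; a minimal sketch, where every ID is a placeholder:

df = (spark.readStream
      .format("cloudFiles")
      .option("cloudFiles.format", "csv")
      .option("cloudFiles.useNotifications", "true")
      .option("cloudFiles.subscriptionId", "<azure-subscription-id>")
      .option("cloudFiles.tenantId", "<tenant-id>")
      .option("cloudFiles.clientId", "<service-principal-client-id>")
      .option("cloudFiles.clientSecret", dbutils.secrets.get("<scope>", "<key>"))
      .option("cloudFiles.resourceGroup", "<resource-group>")
      .option("cloudFiles.schemaLocation", "<schema-path>")
      .load("abfss://<container>@<account>.dfs.core.windows.net/<path>"))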

4 More Replies
Akshith_Rajesh
by New Contributor III
  • 9693 Views
  • 5 replies
  • 5 kudos

Resolved! Call a Stored Procedure in Azure Synapse with input and output Params

driver_manager = spark._sc._gateway.jvm.java.sql.DriverManager
connection = driver_manager.getConnection(mssql_url, mssql_user, mssql_pass)
connection.prepareCall("EXEC sys.sp_tables").execute()
connection.close()
The above code works fine but however...

Latest Reply
judyy
New Contributor III

This blog helped me with the output of the stored procedure: https://medium.com/@judy3.yang/how-to-run-sql-procedure-in-databricks-notebook-e28023555565
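For reference, the same JVM gateway pattern extends to input and output parameters via CallableStatement; a minimal sketch in which the procedure name and parameter types are hypothetical:

jvm = spark._sc._gateway.jvm
connection = jvm.java.sql.DriverManager.getConnection(mssql_url, mssql_user, mssql_pass)

stmt = connection.prepareCall("{call dbo.my_proc(?, ?)}")
stmt.setString(1, "some_input")                           # input parameter
stmt.registerOutParameter(2, jvm.java.sql.Types.INTEGER)  # output parameter
stmt.execute()
print(stmt.getInt(2))  # read the output value

stmt.close()
connection.close()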

4 More Replies
danial
by New Contributor II
  • 4907 Views
  • 3 replies
  • 1 kudos

Connect Databricks hosted on Azure with RDS on AWS

We have Databricks set up and running on Azure. Now we want to connect it with RDS (AWS) to transfer data from RDS to Azure Data Lake using Databricks. I could find documentation on how to do it within the same cloud (either AWS or Azure) but n...

Latest Reply
Anonymous
Not applicable

Hi @Danial Malik, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so ...
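Since the thread ends without a concrete answer: cross-cloud access is ordinary JDBC, as long as the RDS instance is reachable from the cluster (public endpoint, VPN, or peering) and its security group allows the cluster's egress IPs. A minimal sketch, assuming a PostgreSQL RDS instance; the endpoint and credentials are placeholders:

df = (spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://<instance>.<id>.<region>.rds.amazonaws.com:5432/<db>")
      .option("dbtable", "public.my_table")
      .option("user", "<user>")
      .option("password", "<password>")
      .option("driver", "org.postgresql.Driver")
      .load())

df.write.format("delta").save("abfss://<container>@<account>.dfs.core.windows.net/rds/my_table")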

2 More Replies
kDev
by New Contributor
  • 10620 Views
  • 8 replies
  • 3 kudos

UnauthorizedAccessException: PERMISSION_DENIED: User does not have READ FILES on External Location

Our jobs have been running fine so far w/o any issues on a specific workspace. These jobs read data from files on Azure ADLS storage containers and don't use the Hive metastore data at all. Now we attached the Unity metastore to this workspace, created...

Latest Reply
Masha
New Contributor III

@Wojciech_BUK I granted both in the GUI :) You can either search for the display name there (and then it uses the Managed Identity Object ID), or you can search directly for the value of the "Managed Identity Application ID", and then it works correctly! ...
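For anyone else hitting PERMISSION_DENIED after attaching a Unity Catalog metastore: besides the storage-side role assignment, the job's principal also needs an explicit grant on the external location itself. A minimal sketch, with placeholder names:

# Location and principal names are hypothetical placeholders.
spark.sql("GRANT READ FILES ON EXTERNAL LOCATION `my_external_location` TO `data-engineers`")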

7 More Replies
MartinH
by New Contributor II
  • 4936 Views
  • 7 replies
  • 5 kudos

Resolved! Azure Data Factory and Photon

Hello, we have Databricks Python workbooks accessing Delta tables. These workbooks are scheduled/invoked by Azure Data Factory. How can I enable Photon on the linked services that are used to call Databricks? If I specify a new job cluster, there does n...

Latest Reply
CharlesReily
New Contributor III

When you create a cluster on Databricks, you can enable Photon by selecting the "Photon" option in the cluster configuration settings. This is typically done when creating a new cluster, and you would find the option in the advanced cluster configura...
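When the cluster is defined through the API rather than the UI, as with an ADF job cluster, Photon corresponds to the runtime_engine field of the cluster spec; a minimal sketch of the relevant fragment, with placeholder values:

new_cluster = {
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "Standard_DS3_v2",
    "num_workers": 2,
    "runtime_engine": "PHOTON",  # enables Photon on the job cluster
}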

6 More Replies
Jon
by New Contributor II
  • 3056 Views
  • 4 replies
  • 5 kudos

IP address fix

How can I fix the IP address of my Azure cluster so that I can whitelist the IP address to run my job daily on my Python notebook? Or can I find out the IP address to perform whitelisting? Thanks

Latest Reply
-werners-
Esteemed Contributor III

Depends on the scenario. You could expose a single IP address to the external internet, but Databricks itself will always use many addresses.

3 More Replies