Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Jyo777
by Contributor
  • 5171 Views
  • 7 replies
  • 4 kudos

Need help with Azure Databricks questions on CTE and SQL syntax within notebooks

Hi amazing community folks, feel free to share your experience or knowledge regarding the questions below: 1.) Can we pass a CTE SQL statement into Spark JDBC? I tried to do it and couldn't, but I can pass normal SQL (SELECT * FROM) and it works. I heard th...
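As a hedged illustration of the usual workaround (connection details and table names below are placeholders, not from the thread): Spark's JDBC reader accepts a parenthesized subquery as the dbtable option, while a WITH-clause CTE typically fails because Spark wraps whatever you pass in its own subquery.

# Sketch: a plain subselect passed as a JDBC "table" (a CTE usually
# breaks, since Spark rewrites this as SELECT * FROM (<dbtable>) t).
df = (spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://<host>:1433;database=<db>")  # placeholder
    .option("dbtable", "(SELECT col_a, col_b FROM dbo.sales WHERE yr = 2023) AS t")
    .option("user", "<user>")
    .option("password", "<password>")
    .load())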

Latest Reply
Rjdudley
Contributor
  • 4 kudos

Not a comparison, but there is a DB-SQL cheatsheet at https://www.databricks.com/sites/default/files/2023-09/databricks-sql-cheatsheet.pdf/

6 More Replies
Constantine
by Contributor III
  • 5759 Views
  • 5 replies
  • 1 kudos

Resolved! How to use Databricks Query History API (REST API)

I have set up authentication using this page https://docs.databricks.com/sql/api/authentication.html and run curl -n -X GET https://<databricks-instance>.cloud.databricks.com/api/2.0/sql/history/queries to get the history of all SQL endpoint queries, but I...

Latest Reply
yegorski
New Contributor III
  • 1 kudos

Here's how to query with databricks-sdk-py (working code). I had a frustrating time doing it with vanilla Python + requests/urllib and couldn't figure it out. import datetime import os from databricks.sdk import WorkspaceClient from databricks.sdk.se...
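The code in the excerpt is cut off; a minimal sketch along the same lines, assuming the databricks-sdk package is installed and DATABRICKS_HOST/DATABRICKS_TOKEN are set in the environment (attribute names follow the SDK's QueryInfo model, so verify against your SDK version):

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # reads host and token from the environment
# Page through recent warehouse queries via the Query History API.
for q in w.query_history.list(max_results=25):
    print(q.query_id, q.status, (q.query_text or "")[:80])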

4 More Replies
my_community2
by New Contributor III
  • 13408 Views
  • 9 replies
  • 6 kudos

Resolved! Dropping a managed table does not remove the underlying files

The documentation states that DROP TABLE "deletes the table and removes the directory associated with the table from the file system if the table is not an EXTERNAL table. An exception is thrown if the table does not exist." In case of an external table...

Latest Reply
MajdSAAD_7953
New Contributor II
  • 6 kudos

Hi, is there a way to force-delete files after dropping the table rather than waiting 30 days to see the size in S3 decrease? The tables I dropped relate to dev and staging; I don't want to keep their files for 30 days.
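A hedged sketch of the usual shortcut, run before the DROP (the table name is hypothetical, and RETAIN 0 HOURS removes time travel for the table, so treat it as dev/staging-only):

# Disable the safety check that rejects retention below the default,
# then vacuum with zero retention BEFORE dropping the managed table.
spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false")
spark.sql("VACUUM dev_catalog.staging.my_table RETAIN 0 HOURS")
spark.sql("DROP TABLE dev_catalog.staging.my_table")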

8 More Replies
Viren123
by Contributor
  • 2271 Views
  • 4 replies
  • 2 kudos

User entry via portal/UI

Hello Experts, we have Databricks on Azure. We need to provide a user interface so that end users can enter customizing table entries, which in turn are saved in a Delta table. Is there any feature in Databricks or tools that w...

Latest Reply
MikaelB
New Contributor II
  • 2 kudos

Hi Viren, I have the same use case on my side and am looking for some tips. I was thinking of Power Apps and Power Automate with API calls to a job that will run the process, but I'm happy to check other solutions as well! Regards
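For anyone weighing the API-call route mentioned here, one option to sketch is the SQL Statement Execution API; the host, token, warehouse id, and target table below are placeholders:

import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. adb-123.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]
resp = requests.post(
    f"https://{host}/api/2.0/sql/statements",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "warehouse_id": "<warehouse-id>",
        "statement": "INSERT INTO config.customizing VALUES (:k, :v)",
        "parameters": [
            {"name": "k", "value": "region"},
            {"name": "v", "value": "emea"},
        ],
    },
)
resp.raise_for_status()   # raises if the insert was rejected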

3 More Replies
kfoster
by Contributor
  • 3612 Views
  • 6 replies
  • 7 kudos

Azure DevOps Repo - Invalid Git Credentials

I have a Repo in Databricks connected to Azure DevOps Repositories. The repo has been working fine for almost a month, until last week. Now when I try to open the Git settings in Databricks, I am getting "Invalid Git Credentials". Nothing has change...

Latest Reply
tbMark
New Contributor II
  • 7 kudos

Same symptoms, same issue. Azure support hasn't figured it out.
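While support digs in, one thing that sometimes clears this is re-registering the PAT. A sketch with databricks-sdk (the provider id and values are assumptions, not from the thread):

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
# Register a fresh Azure DevOps PAT; if a credential already exists,
# list() it and call update() instead of create().
w.git_credentials.create(
    git_provider="azureDevOpsServices",    # assumed provider id
    git_username="user@example.com",       # placeholder
    personal_access_token="<new-ado-pat>", # placeholder
)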

5 More Replies
ajbush
by New Contributor III
  • 17186 Views
  • 8 replies
  • 2 kudos

Connecting to Snowflake using an SSO user from Azure Databricks

Hi all, I'm just reaching out to see if anyone has information or can point me in a useful direction. I need to connect to Snowflake from Azure Databricks using the connector: https://learn.microsoft.com/en-us/azure/databricks/external-data/snowflake T...

Latest Reply
BobGeor_68322
New Contributor II
  • 2 kudos

We ended up using device-flow OAuth because, as noted above, it is not possible to launch a browser on the Databricks cluster from a notebook, so you cannot use the "externalBrowser" flow. It gives you a URL and a code, and you open the URL in a new tab an...
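A sketch of that device-flow shape using msal with the Snowflake Python connector (client, tenant, scope, and account values are placeholders; the thread itself used the Spark connector, so this only illustrates the token hand-off):

import msal
import snowflake.connector

app = msal.PublicClientApplication(
    "<client-id>",
    authority="https://login.microsoftonline.com/<tenant-id>",
)
flow = app.initiate_device_flow(scopes=["<snowflake-app-uri>/.default"])
print(flow["message"])  # URL plus one-time code to open in any browser
result = app.acquire_token_by_device_flow(flow)  # blocks until login completes

conn = snowflake.connector.connect(
    account="<account>",
    user="<user>",
    authenticator="oauth",
    token=result["access_token"],
)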

7 More Replies
Valentin14
by New Contributor II
  • 7343 Views
  • 5 replies
  • 4 kudos

Import module never ends on random branches

Hello, since a week ago our notebooks have been stuck running on the first cells, which import Python modules from our GitHub repository cloned in Databricks. The cells stay in the running state, and when we try to manually cancel the jobs in Databric...

Latest Reply
timo199
New Contributor II
  • 4 kudos

@Retired_mod 

4 More Replies
MRTN
by New Contributor III
  • 4682 Views
  • 3 replies
  • 2 kudos

Resolved! Configure multiple source paths for auto loader

I am currently using two streams to monitor data in two different containers on an Azure storage account. Is there any way to configure an autoloader to read from two different locations? The schemas of the files are identical.

Latest Reply
Anonymous
Not applicable
  • 2 kudos

@Morten Stakkeland: Yes, it's possible to configure an autoloader to read from multiple locations. You can define multiple CloudFiles sources for the autoloader, each pointing to a different container in the same storage account. In your case, since ...
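A sketch of that layout (storage account, container names, and schema locations are placeholders):

s1 = (spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/_schemas/stream1")
    .load("abfss://container-one@<account>.dfs.core.windows.net/data"))
s2 = (spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/_schemas/stream2")
    .load("abfss://container-two@<account>.dfs.core.windows.net/data"))
# The schemas are identical, so the two sources union into one stream.
merged = s1.unionByName(s2)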

2 More Replies
FerArribas
by Contributor
  • 10245 Views
  • 4 replies
  • 6 kudos

Resolved! Redirect error in access to web app in Azure Databricks with private front endpoint

I have created a workspace with a private endpoint in Azure following this guide: https://learn.microsoft.com/en-us/azure/databricks/administration-guide/cloud-configurations/azure/private-link Once I have created the private link of type browser_authent...

Latest Reply
flomader
New Contributor II
  • 6 kudos

You don't need a CNAME record. Go to your private link resource in Azure and click on Settings > DNS Configuration. Make sure you have created private link A records for all the FQDNs listed under 'Custom DNS records'. You have most likely missed one ...

3 More Replies
jwilliam
by Contributor
  • 3879 Views
  • 3 replies
  • 2 kudos

Resolved! How to mount Azure Blob Storage with OAuth2?

We already know that we can mount Azure Data Lake Gen2 with OAuth2 using this: configs = {"fs.azure.account.auth.type": "OAuth", "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider", ...

Latest Reply
dssatpute
New Contributor II
  • 2 kudos

Try replacing wasbs with abfss and dfs with blob in the URI; it should work!
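For reference, the mount pattern the thread converges on, sketched with placeholder tenant, app, and secret values:

configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="<scope>", key="<key>"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}
dbutils.fs.mount(
    source="abfss://<container>@<account>.dfs.core.windows.net/",
    mount_point="/mnt/<name>",
    extra_configs=configs,
)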

2 More Replies
Arby
by New Contributor II
  • 11499 Views
  • 4 replies
  • 0 kudos

Help With OSError: [Errno 95] Operation not supported: '/Workspace/Repos/Connectors....

Hello, I am experiencing issues importing the schema file I created from the utils repo. This is the logic we use for all ingestion, and all other schemas live in this repo under utills/schemas. I am unable to access the file I created for a new ingestion pipe...

Latest Reply
Arby
New Contributor II
  • 0 kudos

@Debayan Mukherjee Hello, thank you for your response. Please let me know if these are the correct commands to access the file from a notebook. I can see the files in the repo folder, but I just noticed this: the file I am trying to access is 0 b...
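In case it helps others landing here, a sketch of the usual import pattern for repo modules (the repo path and module name are hypothetical stand-ins for the thread's utills/schemas layout):

import sys

# Make the repo folder importable from a notebook, then import normally.
sys.path.append("/Workspace/Repos/<user>/<repo>")
from schemas import my_schema  # hypothetical module and object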

3 More Replies
Anonymous
by Not applicable
  • 6855 Views
  • 11 replies
  • 2 kudos

SQL Serverless option is missing when using Azure Databricks Workspace with No Public IP and VNET Injection

Hello, after creating a Databricks workspace in Azure with No Public IP and VNET injection, I'm unable to use DBSQL Serverless because the option to enable it in SQL warehouse settings is missing. Is it by design? Is it a limitation when using Privat...

Latest Reply
RomanLegion
New Contributor III
  • 2 kudos

Fixed: go to Profile -> Compute -> SQL Server Serverless -> On -> Save. For some reason this had been disabled for us.

10 More Replies
mk1987c
by New Contributor III
  • 4876 Views
  • 5 replies
  • 1 kudos

Resolved! I am trying to use Databricks Autoloader with File Notification Mode

When I run my readStream command using .option("cloudFiles.useNotifications", "true"), it starts reading the files from Azure Blob (please note that I did not provide configuration like subscription ID, client ID, connection string, and all while...

Latest Reply
jose_gonzalez
Databricks Employee
  • 1 kudos

Hi, I would like to share the following docs that might help you with this issue: https://docs.databricks.com/ingestion/auto-loader/file-notification-mode.html#required-permissions-for-configuring-file-notification-for-adls-gen2-and-azure-b...
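Alongside the permissions in that doc, file notification mode expects the service principal and subscription details as explicit options; a sketch with placeholder values:

df = (spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "csv")
    .option("cloudFiles.useNotifications", "true")
    .option("cloudFiles.subscriptionId", "<subscription-id>")
    .option("cloudFiles.tenantId", "<tenant-id>")
    .option("cloudFiles.clientId", "<application-id>")
    .option("cloudFiles.clientSecret", "<client-secret>")
    .option("cloudFiles.resourceGroup", "<resource-group>")
    .load("abfss://<container>@<account>.dfs.core.windows.net/input"))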

4 More Replies
Akshith_Rajesh
by New Contributor III
  • 11042 Views
  • 5 replies
  • 6 kudos

Resolved! Call a Stored Procedure in Azure Synapse with input and output Params

driver_manager = spark._sc._gateway.jvm.java.sql.DriverManager
connection = driver_manager.getConnection(mssql_url, mssql_user, mssql_pass)
connection.prepareCall("EXEC sys.sp_tables").execute()
connection.close()
The above code works fine, however...

Latest Reply
judyy
New Contributor III
  • 6 kudos

This blog helped me with the output of the stored procedure: https://medium.com/@judy3.yang/how-to-run-sql-procedure-in-databricks-notebook-e28023555565
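Building on the thread's py4j approach, a sketch of binding an input parameter and registering an output parameter (the procedure name and types are hypothetical; mssql_url, mssql_user, and mssql_pass are assumed defined as in the original post):

jvm = spark._sc._gateway.jvm
connection = jvm.java.sql.DriverManager.getConnection(mssql_url, mssql_user, mssql_pass)
stmt = connection.prepareCall("{call dbo.my_proc(?, ?)}")
stmt.setInt(1, 42)                                        # input parameter
stmt.registerOutParameter(2, jvm.java.sql.Types.INTEGER)  # output parameter
stmt.execute()
print(stmt.getInt(2))                                     # read the output value
connection.close()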

4 More Replies
danial
by New Contributor II
  • 6211 Views
  • 3 replies
  • 1 kudos

Connect Databricks hosted on Azure with RDS on AWS

We have Databricks set up and running on Azure. Now we want to connect it with RDS (AWS) to transfer data from RDS to Azure Data Lake using Databricks. I could find documentation on how to do it within the same cloud (either AWS or Azure), but n...
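Cross-cloud doesn't change the mechanics: if the cluster can reach the RDS endpoint (public access or VPN/peering), it's an ordinary JDBC read. A sketch assuming a PostgreSQL RDS instance with placeholder host and credentials:

df = (spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://<rds-endpoint>:5432/<database>")
    .option("dbtable", "public.my_table")
    .option("user", "<user>")
    .option("password", "<password>")
    .load())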

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Danial Malik, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so ...

2 More Replies