Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

kfoster
by Contributor
  • 4238 Views
  • 7 replies
  • 7 kudos

Azure DevOps Repo - Invalid Git Credentials

I have a Repo in Databricks connected to Azure DevOps Repositories. The repo has been working fine for almost a month, until last week. Now when I try to open the Git settings in Databricks, I am getting "Invalid Git Credentials". Nothing has change...

Latest Reply
tbMark
New Contributor II

Same symptoms, same issue. Azure support hasn't figured it out

6 More Replies
Viren123
by Contributor
  • 2713 Views
  • 5 replies
  • 2 kudos

User entry via portal/UI

Hello Experts, We have Databricks on Azure. We need to provide a user interface so that end users can enter certain customizing table entries, which in turn are saved in a Delta table. Is there any feature in Databricks or tools that w...

Latest Reply
RajeevKum
New Contributor II

I achieved this using a textbox widget: the user enters a value and runs the dashboard, and Python code validates and inserts the data in Databricks. A second method: I created an Excel VBA application that uses an ODBC connection to insert, delete, and update data in the Delta tab...
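
A minimal sketch of the widget approach, assuming a hypothetical target table `customizing_entries` with a single string column:

```python
# Sketch of the widget approach described above; the table and column
# names are hypothetical placeholders.
dbutils.widgets.text("entry_value", "", "New entry")

value = dbutils.widgets.get("entry_value").strip()

# Validate before writing, e.g. reject empty input.
if not value:
    raise ValueError("Entry must not be empty")

# Append the validated entry to the Delta table.
(spark.createDataFrame([(value,)], "entry_value STRING")
    .write.format("delta").mode("append")
    .saveAsTable("customizing_entries"))
```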

4 More Replies
William_Scardua
by Valued Contributor
  • 9861 Views
  • 3 replies
  • 1 kudos

How to read data from Azure Log Analytics?

Hi guys, I need to read data from an Azure Log Analytics Workspace directly. Any ideas? Thank you

Latest Reply
alexott
Databricks Employee

You can use the Kusto Spark connector for that: https://github.com/Azure/azure-kusto-spark/blob/master/docs/KustoSource.md#source-read-command It heavily depends on how you access the data; there could be a need to use an ADX cluster for it: https://learn.mi...
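
A sketch of such a read through the connector, assuming a service principal with access and the Log Analytics ADX proxy endpoint (all identifiers below are placeholders):

```python
# Read from Log Analytics through the Kusto Spark connector (the
# azure-kusto-spark library must be installed on the cluster).
# Cluster URL, workspace, query, and credentials are placeholders.
df = (spark.read
      .format("com.microsoft.kusto.spark.datasource")
      .option("kustoCluster",
              "https://ade.loganalytics.io/subscriptions/<sub-id>"
              "/resourcegroups/<rg>/providers/microsoft.operationalinsights"
              "/workspaces/<workspace>")
      .option("kustoDatabase", "<workspace>")
      .option("kustoQuery", "AppRequests | where TimeGenerated > ago(1d)")
      .option("kustoAadAppId", "<app-id>")
      .option("kustoAadAppSecret", dbutils.secrets.get("kv-scope", "app-secret"))
      .option("kustoAadAuthorityID", "<tenant-id>")
      .load())
```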

2 More Replies
Rita
by New Contributor III
  • 8506 Views
  • 7 replies
  • 6 kudos

How to connect Cognos 11.1.7 to Azure Databricks

We are trying to connect Cognos 11.1.7 to Azure Databricks, but with no success. Can you please help or guide us on how to connect Cognos 11.1.7 to Azure Databricks? This is very critical to our user community. Can you please help or guide us how to connect Co...

Latest Reply
Hans2
New Contributor II

Has anyone got the Simba JDBC driver going with CA 11.1.7? The ODBC driver works fine but I can't get the JDBC running. Regards

6 More Replies
Maksym
by New Contributor III
  • 9352 Views
  • 5 replies
  • 7 kudos

Resolved! Databricks Autoloader is getting stuck and does not pass to the next batch

I have a simple job scheduled every 5 min. Basically it listens to cloudFiles on a storage account and writes them into a Delta table; extremely simple. The code is something like this: df = (spark .readStream .format("cloudFiles") .option('cloudFil...

Latest Reply
lassebe
New Contributor II

I had the same issue: files would randomly not be loaded. Setting `.option("cloudFiles.useIncrementalListing", False)` seemed to do the trick!
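
For reference, the option slots into a stream definition like the one in the question roughly as follows (format, paths, and table name are placeholders):

```python
# Auto Loader stream with incremental listing disabled, per the fix above.
df = (spark.readStream
      .format("cloudFiles")
      .option("cloudFiles.format", "json")
      .option("cloudFiles.useIncrementalListing", "false")
      .option("cloudFiles.schemaLocation", "/mnt/checkpoints/schema")
      .load("abfss://input@account.dfs.core.windows.net/events"))

(df.writeStream
   .option("checkpointLocation", "/mnt/checkpoints/events")
   .trigger(availableNow=True)
   .toTable("bronze_events"))
```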

4 More Replies
krishnachaitany
by New Contributor II
  • 5082 Views
  • 3 replies
  • 4 kudos

Resolved! Spot instance in Azure Databricks

When I run a job using spot instances, I would like to know how many workers are using spot instances and how many are using on-demand instances for a given job run. In order to identify the spot instances we got for any...

Latest Reply
drumcircle
New Contributor II

This remains a challenge using system tables.

2 More Replies
Hubert-Dudek
by Esteemed Contributor III
  • 21217 Views
  • 4 replies
  • 26 kudos

How to connect your Azure Data Lake Storage to Azure Databricks (Standard Workspace, Private Link)

How to connect your Azure Data Lake Storage to Azure Databricks, Standard Workspace, Private Link. In your storage accounts please go to "Networking" -> "Private endpoint connections" and click Add Private Endpoint. It is important to add private links in ...

Latest Reply
dollyb
Contributor

This should be updated for Unity Catalog workspaces. 

3 More Replies
Jyo777
by Contributor
  • 5982 Views
  • 7 replies
  • 4 kudos

Need help with Azure Databricks questions on CTE and SQL syntax within notebooks

Hi amazing community folks, feel free to share your experience or knowledge regarding the questions below: 1.) Can we pass a CTE SQL statement into Spark JDBC? I tried to do it and couldn't, but I can pass normal SQL (Select * from ) and it works. I heard th...
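
On question 1: Spark's JDBC reader wraps whatever statement it is given in a subquery (SELECT * FROM (...) alias), which is why a bare WITH clause tends to fail while a plain SELECT works. The usual workaround is to rewrite the CTE as an inline subquery; a sketch, with placeholder connection details:

```python
# A CTE such as "WITH open_orders AS (...) SELECT ..." breaks because Spark
# wraps the statement in a subquery; rewriting it inline works.
query = """
SELECT o.id, o.total
FROM (SELECT * FROM orders WHERE status = 'OPEN') AS o
"""

df = (spark.read
      .format("jdbc")
      .option("url", "jdbc:sqlserver://<host>:1433;databaseName=<db>")
      .option("query", query)
      .option("user", "<user>")
      .option("password", dbutils.secrets.get("kv-scope", "jdbc-pw"))
      .load())
```

Newer Spark runtimes also document a `prepareQuery` JDBC option aimed at CTEs, which may be worth checking on your DBR version.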

Latest Reply
Rjdudley
Honored Contributor

Not a comparison, but there is a DB-SQL cheatsheet at https://www.databricks.com/sites/default/files/2023-09/databricks-sql-cheatsheet.pdf/

6 More Replies
Constantine
by Contributor III
  • 6312 Views
  • 5 replies
  • 1 kudos

Resolved! How to use Databricks Query History API (REST API)

I have set up authentication using this page https://docs.databricks.com/sql/api/authentication.html and run curl -n -X GET https://<databricks-instance>.cloud.databricks.com/api/2.0/sql/history/queries to get the history of all SQL endpoint queries, but I...

Latest Reply
yegorski
New Contributor III

Here's how to query with databricks-sdk-py (working code). I had a frustrating time doing it with vanilla Python + requests/urllib and couldn't figure it out. import datetime import os from databricks.sdk import WorkspaceClient from databricks.sdk.se...
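
The truncated snippet presumably continues along these lines; a sketch with databricks-sdk-py, using an illustrative last-24-hours filter:

```python
import datetime
import os

from databricks.sdk import WorkspaceClient
from databricks.sdk.service import sql

# Authenticate from environment variables (or use a config profile).
w = WorkspaceClient(host=os.environ["DATABRICKS_HOST"],
                    token=os.environ["DATABRICKS_TOKEN"])

# Query history for the last 24 hours, in epoch milliseconds.
now = datetime.datetime.now(datetime.timezone.utc)
start_ms = int((now - datetime.timedelta(days=1)).timestamp() * 1000)
end_ms = int(now.timestamp() * 1000)

queries = w.query_history.list(
    filter_by=sql.QueryFilter(
        query_start_time_range=sql.TimeRange(start_time_ms=start_ms,
                                             end_time_ms=end_ms)))
for q in queries:
    print(q.query_id, q.status, q.query_text)
```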

4 More Replies
my_community2
by New Contributor III
  • 15700 Views
  • 9 replies
  • 6 kudos

Resolved! dropping a managed table does not remove the underlying files

The documentation states that "drop table": Deletes the table and removes the directory associated with the table from the file system if the table is not an EXTERNAL table. An exception is thrown if the table does not exist. In case of an external table...

Latest Reply
MajdSAAD_7953
New Contributor II

Hi, is there a way to force-delete files after dropping the table, rather than waiting 30 days to see the size in S3 decrease? The tables I dropped relate to dev and staging; I don't want to keep their files for 30 days.
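
Not an answer from this thread, but one destructive pattern for reclaiming the space immediately, assuming the table's storage path is directly accessible (it forfeits time travel on those files, so use with care; names are placeholders):

```python
# Capture the table's location before dropping it, then delete the files
# immediately instead of waiting out the retention window. Irreversible.
detail = spark.sql("DESCRIBE DETAIL dev.my_table").collect()[0]
location = detail["location"]

spark.sql("DROP TABLE dev.my_table")

# Remove the underlying data files right away.
dbutils.fs.rm(location, True)
```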

8 More Replies
ajbush
by New Contributor III
  • 20123 Views
  • 8 replies
  • 3 kudos

Connecting to Snowflake using an SSO user from Azure Databricks

Hi all, I'm just reaching out to see if anyone has information or can point me in a useful direction. I need to connect to Snowflake from Azure Databricks using the connector: https://learn.microsoft.com/en-us/azure/databricks/external-data/snowflakeT...

Latest Reply
BobGeor_68322
New Contributor III

We ended up using device flow OAuth because, as noted above, it is not possible to launch a browser on the Databricks cluster from a notebook, so you cannot use the "externalBrowser" flow. It gives you a URL and a code, and you open the URL in a new tab an...
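
A rough sketch of that device flow, assuming an Azure AD app registration for Snowflake and the connector's OAuth options (all IDs and scopes are placeholders):

```python
import msal

# Device flow against Azure AD; prints a URL + code to open in another tab.
app = msal.PublicClientApplication(
    "<app-id>", authority="https://login.microsoftonline.com/<tenant-id>")
flow = app.initiate_device_flow(scopes=["<snowflake-scope>/session:role-any"])
print(flow["message"])
token = app.acquire_token_by_device_flow(flow)  # blocks until you sign in

# Hand the access token to the Snowflake connector as an OAuth token.
df = (spark.read
      .format("snowflake")
      .option("sfUrl", "<account>.snowflakecomputing.com")
      .option("sfUser", "<user@domain.com>")
      .option("sfDatabase", "<db>")
      .option("sfSchema", "<schema>")
      .option("sfWarehouse", "<warehouse>")
      .option("sfAuthenticator", "oauth")
      .option("sfToken", token["access_token"])
      .option("dbtable", "<table>")
      .load())
```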

7 More Replies
Valentin14
by New Contributor II
  • 7691 Views
  • 5 replies
  • 4 kudos

Import module never ends on random branches

Hello, since a week ago our notebooks have been getting stuck running the first cells, which import Python modules from our GitHub repository cloned in Databricks. The cells stay in a running state, and when we try to manually cancel the jobs in Databric...

Latest Reply
timo199
New Contributor II

@Retired_mod 

4 More Replies
MRTN
by New Contributor III
  • 5215 Views
  • 3 replies
  • 2 kudos

Resolved! Configure multiple source paths for auto loader

I am currently using two streams to monitor data in two different containers on an Azure storage account. Is there any way to configure an autoloader to read from two different locations? The schemas of the files are identical.

Latest Reply
Anonymous
Not applicable

@Morten Stakkeland: Yes, it's possible to configure an autoloader to read from multiple locations. You can define multiple cloudFiles sources for the autoloader, each pointing to a different container in the same storage account. In your case, since ...
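
A sketch of that layout, unioning two Auto Loader sources into one output table (paths, format, and names are placeholders):

```python
# Two cloudFiles sources over different containers, merged into one stream.
def autoload(path, schema_loc):
    return (spark.readStream
            .format("cloudFiles")
            .option("cloudFiles.format", "csv")
            .option("cloudFiles.schemaLocation", schema_loc)
            .load(path))

df = autoload("abfss://container-a@account.dfs.core.windows.net/data",
              "/mnt/checkpoints/schema-a") \
    .unionByName(
        autoload("abfss://container-b@account.dfs.core.windows.net/data",
                 "/mnt/checkpoints/schema-b"))

(df.writeStream
   .option("checkpointLocation", "/mnt/checkpoints/two-sources")
   .toTable("merged_table"))
```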

2 More Replies
FerArribas
by Contributor
  • 11314 Views
  • 4 replies
  • 6 kudos

Resolved! Redirect error in access to web app in Azure Databricks with private front endpoint

I have created a workspace with a private endpoint in Azure following this guide: https://learn.microsoft.com/en-us/azure/databricks/administration-guide/cloud-configurations/azure/private-link Once I have created the private link of type browser_authent...

Latest Reply
flomader
New Contributor II

You don't need a CNAME record. Go to your private link resource in Azure and click on Settings > DNS Configuration. Make sure you have created private link A records for all the FQDNs listed under 'Custom DNS records'. You have most likely missed one ...

3 More Replies
jwilliam
by Contributor
  • 4369 Views
  • 3 replies
  • 2 kudos

Resolved! How to mount Azure Blob Storage with OAuth2?

We already know that we can mount Azure Data Lake Gen2 with OAuth2 using this: configs = {"fs.azure.account.auth.type": "OAuth", "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider", ...

Latest Reply
dssatpute
New Contributor II

Try replacing wasbs with abfss and dfs with blob in the URI; it should work!
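
Putting that together with the configs from the question, a mount sketch over abfss (service principal values, container, and account are placeholders):

```python
# OAuth (service principal) mount of ADLS Gen2 over abfss.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<app-id>",
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get("kv-scope", "sp-secret"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://<container>@<account>.dfs.core.windows.net/",
    mount_point="/mnt/mydata",
    extra_configs=configs)
```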

2 More Replies