Data Engineering

Forum Posts

ajbush
by New Contributor III
  • 9268 Views
  • 6 replies
  • 2 kudos

Connecting to Snowflake using an SSO user from Azure Databricks

Hi all, I'm just reaching out to see if anyone has information or can point me in a useful direction. I need to connect to Snowflake from Azure Databricks using the connector: https://learn.microsoft.com/en-us/azure/databricks/external-data/snowflakeT...

Latest Reply
aagarwal
New Contributor
  • 2 kudos

@ludgervisser We are trying to connect to Snowflake via Azure AD user through the externalbrowser method but the browser window doesn't open. Could you please share an example code of how you managed to achieve this, or to some documentation? @BobGeo...

5 More Replies
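For readers landing on this thread, a minimal sketch of the external-browser SSO flow being discussed, assuming the snowflake-connector-python package is installed and using placeholder account values; note that externalbrowser authentication needs an interactive browser on the machine running the code, which a headless cluster driver typically cannot open (the very limitation raised above).

import snowflake.connector  # assumes snowflake-connector-python is installed

# Placeholder values; the authenticator="externalbrowser" option launches a local
# browser window for Azure AD SSO, so this only works where a browser is available.
conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user@your-domain.com>",
    authenticator="externalbrowser",
    warehouse="<warehouse>",
    database="<database>",
    schema="<schema>",
)
cur = conn.cursor()
cur.execute("SELECT CURRENT_USER()")
print(cur.fetchone())
conn.close()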
Jon
by New Contributor II
  • 1851 Views
  • 4 replies
  • 5 kudos

IP address fix

How can I fix the IP address of my Azure Cluster so that I can whitelist the IP address to run my job daily on my python notebook? Or can I find out the IP address to perform whitelisting? Thanks

Latest Reply
-werners-
Esteemed Contributor III
  • 5 kudos

Depends on the scenario. You could expose a single IP address to the external internet, but Databricks itself will always use many addresses.

3 More Replies
michael_mehrten
by New Contributor III
  • 15915 Views
  • 27 replies
  • 14 kudos

Resolved! How to use Databricks Repos with a service principal for CI/CD in Azure DevOps?

Databricks Repos best practices recommend using the Repos REST API to update a repo via your git provider. The REST API requires authentication, which can be done in one of two ways: a user / personal access token, or a service principal access token. Using a u...

Latest Reply
martindlarsson
New Contributor III
  • 14 kudos

Having the exact same problem. Did you find a solution, @michael_mehrten? In my case I'm using a managed identity, so the solution some topics suggest of generating an access token from an Entra ID service principal is not applicable.

26 More Replies
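A hedged sketch of the pattern this thread discusses, assuming an Entra ID service principal with a client secret and placeholder workspace/repo IDs; the principal also needs Git credentials for Azure DevOps configured in the workspace, which is the part that trips up managed-identity setups like the one in the reply above.

import requests

TENANT_ID = "<tenant-id>"            # placeholders throughout
CLIENT_ID = "<sp-application-id>"
CLIENT_SECRET = "<sp-client-secret>"
HOST = "https://adb-<workspace-id>.<region>.azuredatabricks.net"
REPO_ID = "<repo-id>"

# Acquire an Azure AD token for the Azure Databricks resource
# (2ff814a6-3304-4ab8-85cb-cd0e6f879c1d is the well-known Databricks application ID).
token = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default",
    },
).json()["access_token"]

# Update (pull) the repo to the head of a branch via the Repos API.
resp = requests.patch(
    f"{HOST}/api/2.0/repos/{REPO_ID}",
    headers={"Authorization": f"Bearer {token}"},
    json={"branch": "main"},
)
resp.raise_for_status()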
pSdatabricks
by New Contributor II
  • 2071 Views
  • 3 replies
  • 0 kudos

Azure Databricks Monitoring & Alerting (Data Observability) Tools / Frameworks for Enterprise

I am trying to evaluate options for monitoring and alerting tools like New Relic, Datadog, and Grafana with Databricks on Azure. None of them confirmed support when we reached out. I would like to hear from the Databricks team on the recommended tool / framework ...

Latest Reply
Sruthivika
New Contributor II
  • 0 kudos

I'd recommend this new tool we've been trying out. It's really helpful for monitoring and provides good insights on how Azure Databricks clusters, pools & jobs are doing – like if they're healthy or having issues. It brings everything together, makin...

2 More Replies
Constantine
by Contributor III
  • 2832 Views
  • 5 replies
  • 1 kudos

Resolved! How to use Databricks Query History API (REST API)

I have set up authentication using this page https://docs.databricks.com/sql/api/authentication.html and run curl -n -X GET https://<databricks-instance>.cloud.databricks.com/api/2.0/sql/history/queries to get the history of all SQL endpoint queries, but I...

Latest Reply
MorpheusGoGo
New Contributor II
  • 1 kudos

Are you sure this works?

payload = { "filter_by": { }, "max_results": 1 }

Returns 1 result.

payload = { "filter_by": { "query_start_time_range": { "start_time_ms": 1640995200000, "end_time_ms": 1641081599000 } }, "max_results": 1...

4 More Replies
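Tying the question and the reply together, a hedged sketch of calling the Query History API with a time-range filter, using placeholder workspace URL and token; the filters are sent in the body of the GET request, which is easy to get wrong with plain curl.

import requests

HOST = "https://<databricks-instance>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                             # placeholder

payload = {
    "filter_by": {
        "query_start_time_range": {
            "start_time_ms": 1640995200000,  # 2022-01-01 00:00:00 UTC
            "end_time_ms": 1641081599000,    # 2022-01-01 23:59:59 UTC
        }
    },
    "max_results": 100,
}

resp = requests.get(
    f"{HOST}/api/2.0/sql/history/queries",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,  # filter criteria travel in the request body
)
resp.raise_for_status()
print(resp.json())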
gregorymig
by New Contributor III
  • 2855 Views
  • 9 replies
  • 2 kudos

Sql Serverless Option is missing when using Azure Databricks Workspace with No Public IP and VNET Injection

Hello. After creating a Databricks Workspace in Azure with No Public IP and VNET Injection, I'm unable to use DBSQL Serverless because the option to enable it in the SQL warehouse settings is missing. Is it by design? Is it a limitation when using Privat...

Latest Reply
zzthatch
New Contributor II
  • 2 kudos

I am starting out with Databricks and having the same issue. This is very unexpected, since SQL Serverless is advertised so heavily. I have only seen it noted in one place thus far that restricted networking can preclude you from using this. Please le...

8 More Replies
Hal
by New Contributor II
  • 604 Views
  • 1 replies
  • 3 kudos

Connecting Power BI on Azure to Databricks on AWS?

Can someone share with me the proper way to connect Power BI running on Azure to Databricks running on AWS?

Latest Reply
bhanadi
New Contributor II
  • 3 kudos

Have the same question. Do we have to take care of any specific tasks to make it work? Anyone who has implemented it?

Bas1
by New Contributor III
  • 5957 Views
  • 17 replies
  • 20 kudos

Resolved! network security for DBFS storage account

In Azure Databricks the DBFS storage account is open to all networks. Changing that to use a private endpoint or minimizing access to selected networks is not allowed. Is there any way to add network security to this storage account? Alternatively, is...

Latest Reply
Odee79
New Contributor II
  • 20 kudos

How can we secure the storage account in the managed resource group which holds the DBFS with restricted network access, since access from all networks is blocked by our Azure storage account policy?

16 More Replies
Baldrez
by New Contributor II
  • 2289 Views
  • 4 replies
  • 5 kudos

Resolved! REST API for Stream Monitoring

Hi, everyone. I just recently started using Databricks on Azure, so my question is probably very basic, but I am really stuck right now. I need to capture some streaming metrics (number of input rows and their time), so I tried using the Spark REST API ...

Latest Reply
jose_gonzalez
Moderator
  • 5 kudos

Hi @Roberto Baldrez, if you think that @Gaurav Rupnar solved your question, then please select it as the best response so it can be moved to the top of the topic and help more users in the future. Thank you.

3 More Replies
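As an alternative to scraping the Spark REST API, a minimal sketch of reading the same metrics from the Structured Streaming progress objects, assuming a notebook with at least one active streaming query:

# Each active streaming query exposes its most recent micro-batch progress as a dict.
for q in spark.streams.active:
    progress = q.lastProgress  # None until the first micro-batch completes
    if progress:
        print(q.name, progress["timestamp"], progress["numInputRows"])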
jwilliam
by Contributor
  • 1886 Views
  • 3 replies
  • 2 kudos

Resolved! How to mount Azure Blob Storage with OAuth2?

We already know that we can mount Azure Data Lake Gen2 with OAuth2 using this: configs = {"fs.azure.account.auth.type": "OAuth", "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider", ...

Latest Reply
mathijs-fish
New Contributor III
  • 2 kudos

Is there any update on this feature request? OAuth still seems not to be working with Azure Blob Storage... Configuration works fine for ADLS gen 2, but for Azure Blob Storage still only SAS and Account key seems to be working.

2 More Replies
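For completeness, the full shape of the ADLS Gen2 OAuth mount that the original post abbreviates, with placeholder tenant/application/secret values; as the reply above notes, these OAuth options apply to the abfss:// driver, while the legacy Blob Storage (wasbs://) driver still expects SAS or account keys.

configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope="<scope>", key="<secret-key>"),
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/<mount-name>",
    extra_configs=configs,
)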
joao_vnb
by New Contributor III
  • 21084 Views
  • 7 replies
  • 11 kudos

Resolved! Automate the Databricks workflow deployment

Hi everyone, do you guys know if it's possible to automate Databricks workflow deployment through Azure DevOps (like what we do with the deployment of notebooks)?

Latest Reply
asingamaneni
New Contributor II
  • 11 kudos

Did you get a chance to try Brickflow? https://github.com/Nike-Inc/brickflow
You can find the documentation here: https://engineering.nike.com/brickflow/v0.11.2/
Brickflow uses Databricks Asset Bundles (DAB) under the hood but provides a Pythonic w...

6 More Replies
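Brickflow and Databricks Asset Bundles both end up driving the Jobs API, so as a lowest-common-denominator sketch of what an Azure DevOps pipeline step could run, here is a hedged example of creating a workflow with the Jobs 2.1 API; the host, token, notebook path, and cluster ID are all placeholders.

import requests

HOST = "https://adb-<workspace-id>.<region>.azuredatabricks.net"  # placeholders
TOKEN = "<pat-or-service-principal-token>"

job_spec = {
    "name": "nightly-etl",
    "tasks": [
        {
            "task_key": "main",
            "notebook_task": {"notebook_path": "/Repos/<user>/<repo>/etl"},
            "existing_cluster_id": "<cluster-id>",
        }
    ],
}

resp = requests.post(
    f"{HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=job_spec,
)
resp.raise_for_status()
print(resp.json())  # returns the new job_id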
ChrisS
by New Contributor III
  • 1917 Views
  • 7 replies
  • 8 kudos

How to get data scraped from the web into your data storage

I am learning Databricks for the first time, following a book that was copyrighted in 2020, so I imagine it might be a little outdated at this point. What I am trying to do is move data from an online source (in this specific case using a shell script, but ...

Latest Reply
CharlesReily
New Contributor III
  • 8 kudos

In Databricks, you can install external libraries by going to the Clusters tab, selecting your cluster, and then adding the Maven coordinates for Deequ. In your notebook or script, y...

6 More Replies
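A minimal, hedged sketch of the underlying idea the thread is after: pull a file from a URL on the driver and land it in DBFS so Spark can read it. The URL and paths are placeholders, and this assumes a classic cluster where the /dbfs local file API is available.

import os
import requests

url = "https://example.com/data.csv"          # placeholder source
landing_path = "/dbfs/tmp/scraped/data.csv"   # /dbfs/... is the driver-local view of DBFS

r = requests.get(url, timeout=60)
r.raise_for_status()
os.makedirs(os.path.dirname(landing_path), exist_ok=True)
with open(landing_path, "wb") as f:
    f.write(r.content)

# Read it back with Spark from the dbfs:/ URI
df = spark.read.option("header", True).csv("dbfs:/tmp/scraped/data.csv")
df.show(5)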
MartinH
by New Contributor II
  • 2349 Views
  • 6 replies
  • 3 kudos

Azure Data Factory and Photon

Hello, we have Databricks Python notebooks accessing Delta tables. These notebooks are scheduled/invoked by Azure Data Factory. How can I enable Photon on the linked services that are used to call Databricks? If I specify a new job cluster, there does n...

Latest Reply
CharlesReily
New Contributor III
  • 3 kudos

When you create a cluster on Databricks, you can enable Photon by selecting the "Photon" option in the cluster configuration settings. This is typically done when creating a new cluster, and you would find the option in the advanced cluster configura...

5 More Replies
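For the job-cluster case mentioned in the question, a hedged sketch of where Photon lives in a Jobs API cluster spec, with placeholder runtime version and node type; whether ADF's linked-service UI exposes this field is a separate question.

# Hypothetical new_cluster block for a Jobs API job definition.
new_cluster = {
    "spark_version": "13.3.x-scala2.12",   # placeholder Photon-capable runtime
    "node_type_id": "Standard_DS3_v2",     # placeholder Azure VM type
    "num_workers": 2,
    "runtime_engine": "PHOTON",            # enables Photon for this cluster
}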
pgruetter
by Contributor
  • 3095 Views
  • 7 replies
  • 2 kudos

Run Task as Service Principal with Code in Azure DevOps Repo

Hi all, I have a task of type Notebook whose source is Git (Azure DevOps). This task runs fine with my user, but if I change the owner to a service principal, I get the following error: Run result unavailable: run failed with error message Failed to checkout...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

@pgruetter: To enable a service principal to access a specific Azure DevOps repository, you need to grant it the necessary permissions at both the organization and repository levels. Here are the steps to grant the service principal the necessary per...

6 More Replies