Data Engineering

Forum Posts

gregorymig
by New Contributor III
  • 2863 Views
  • 9 replies
  • 2 kudos

Sql Serverless Option is missing when using Azure Databricks Workspace with No Public IP and VNET Injection

Hello. After creating a Databricks Workspace in Azure with No Public IP and VNET Injection, I'm unable to use DBSQL Serverless because the option to enable it in SQL warehouse Settings is missing. Is it by design? Is it a limitation when using Privat...

Latest Reply
zzthatch
New Contributor II
  • 2 kudos

I am starting up with databricks and having the same issue. This is very unexpected, since SQL Serverless is advertised so heavily. I have only seen it noted in one place thus far that restricted networking can preclude you from using this. Please le...

8 More Replies
Kazer
by New Contributor III
  • 3209 Views
  • 2 replies
  • 1 kudos

Resolved! com.microsoft.sqlserver.jdbc.SQLServerException: The driver could not establish a secure connection to SQL Server by using Secure Sockets Layer (SSL) encryption.

Hi. I am trying to read from our Microsoft SQL Server from Azure Databricks via spark.read.jdbc() as described here: Query databases using JDBC - Azure Databricks | Microsoft Learn. The SQL Server is on an Azure VM in a virtual network peered with th...

Latest Reply
databricks26
New Contributor II
  • 1 kudos

Hi @Kazer, Even if I use a new table name, I get the same error. Do you have any suggestions? Thanks,

1 More Reply
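For context, here is a minimal PySpark sketch of the kind of JDBC read described in this thread. A common workaround for this SSL error when the SQL Server presents a self-signed certificate is to set the encrypt/trustServerCertificate properties in the JDBC URL (or, better, install a trusted certificate); the host, database, table and secret scope names below are placeholders, not values from the original post.

```python
# Hedged sketch: reading from SQL Server over JDBC when the server presents a
# self-signed certificate. Host, database, table, and secret scope names are
# placeholders, not values from the original thread.
jdbc_url = (
    "jdbc:sqlserver://10.0.0.4:1433;"   # private IP of the SQL Server VM (placeholder)
    "database=mydb;"
    "encrypt=true;"
    "trustServerCertificate=true"       # skips CA validation; prefer a trusted cert in production
)

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.my_table")
    .option("user", dbutils.secrets.get("my-scope", "sql-user"))
    .option("password", dbutils.secrets.get("my-scope", "sql-password"))
    .load()
)
display(df)
```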
ravi28
by New Contributor III
  • 6282 Views
  • 10 replies
  • 8 kudos

How to set up job notifications using a Microsoft Teams webhook?

A couple of things I tried: 1. I created a webhook connector in Microsoft Teams and copied it to Notification destinations via Admin page -> New destination -> from the dropdown I selected Microsoft Teams -> added the webhook URL and saved it. Outcome: I don't get the ...

Latest Reply
youssefmrini
Honored Contributor III
  • 8 kudos

You can set up job notifications for Databricks jobs using Microsoft Teams webhooks by following these steps: Set up a Microsoft Teams webhook: Go to the channel where you want to receive notifications in Microsoft Teams. Click on the "..." icon next to...

9 More Replies
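For reference, a hedged sketch of attaching an existing Teams notification destination to a job through the Jobs 2.1 API. The workspace URL, token, job ID and destination ID below are placeholders, and the destination itself must already be created under Admin settings as described in the reply above.

```python
# Hedged sketch: wire an existing "Microsoft Teams" notification destination to a
# job via the Jobs 2.1 API. Workspace URL, token, job_id and destination id are
# placeholders; the destination must already exist in the admin settings.
import requests

resp = requests.post(
    "https://<workspace-url>/api/2.1/jobs/update",
    headers={"Authorization": "Bearer <personal-access-token>"},
    json={
        "job_id": 123456789,
        "new_settings": {
            "webhook_notifications": {
                "on_failure": [{"id": "<notification-destination-id>"}],
                "on_success": [{"id": "<notification-destination-id>"}],
            }
        },
    },
)
resp.raise_for_status()
print(resp.json())
```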
Sujitha
by Community Manager
  • 761 Views
  • 1 reply
  • 2 kudos

Take advantage of the Data + AI Summit Virtual Experience next week!

Take advantage of the Data + AI Summit Virtual Experience next week! Data + AI Summit is just a few days away! With data professionals from 155+ countries already registered, this is truly the premier event for the global data, analytics and AI commun...

Latest Reply
Priyag1
Honored Contributor II
  • 2 kudos

Thank you for sharing the highlights of the upcoming sessions.

bshirdi
by New Contributor II
  • 1302 Views
  • 1 reply
  • 2 kudos

Getting HTTP 502 bad gateway error!

Hello all, I am suddenly getting an HTTP 502 and DRIVER_LIBRARY_INSTALLATION_FAILURE error during Python library installation when the cluster gets initialized. I have around 10 Python packages, of which 2-3 packages always fail to install a...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Bhargav Shir, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.

AhSon
by New Contributor II
  • 1109 Views
  • 2 replies
  • 5 kudos

Resolved! Databricks Certificate Renewal

I received an email reminder that my Databricks certification is going to expire next month. May I check where we can renew the certification, like how we did with Microsoft? Thank you.

Latest Reply
Anonymous
Not applicable
  • 5 kudos

Hi @Jason Yap, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers you...

1 More Reply
Stokholm
by New Contributor III
  • 4409 Views
  • 9 replies
  • 1 kudos

Pushdown of datetime filter to date partition.

Hi everybody, I have 20 years of data, 600M rows. I have partitioned them on year and month to generate a file size which seems reasonable (128 MB). All data is queried using a timestamp, as all queries need to filter on the exact hours. So my requirement...

Latest Reply
Stokholm
New Contributor III
  • 1 kudos

Hi guys, thanks for your advice. I found a solution: we upgraded the Databricks Runtime to 12.2 and now the pushdown of the partition filter works. The documentation said that 10.4 would be adequate, but obviously it wasn't enough.

8 More Replies
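To illustrate the layout discussed in this thread, here is a minimal PySpark sketch of a Delta table partitioned by year and month and queried on an exact-hour timestamp; the table and column names are assumptions. On runtimes where the timestamp filter alone is not pushed down, adding the derived partition columns to the filter forces pruning.

```python
# Hedged sketch of the layout described in the thread: data partitioned by year
# and month, queried on an exact-hour timestamp. Names are assumptions.
from pyspark.sql import functions as F

events = (spark.table("raw_events")
          .withColumn("year", F.year("event_ts"))
          .withColumn("month", F.month("event_ts")))

(events.write.format("delta")
       .partitionBy("year", "month")
       .mode("overwrite")
       .saveAsTable("events_partitioned"))

# On newer runtimes the timestamp filter alone is pushed down to the partitions;
# on older ones, adding the partition columns explicitly forces pruning.
df = (spark.table("events_partitioned")
      .filter("event_ts >= '2023-06-01 08:00:00' AND event_ts < '2023-06-01 09:00:00'")
      .filter("year = 2023 AND month = 6"))
df.explain()  # check that PartitionFilters contains year/month
```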
KVNARK
by Honored Contributor II
  • 1043 Views
  • 2 replies
  • 6 kudos

Resolved! Microsoft ADB file from BLOB into Azure Databricks

Can anyone let me know how we can load the database file into Azure Databricks from Azure Blob Storage?

Latest Reply
Anonymous
Not applicable
  • 6 kudos

Hi @KVNARK. Thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedback wil...

1 More Reply
Chris_Shehu
by Valued Contributor III
  • 2262 Views
  • 7 replies
  • 4 kudos

On-Behalf of tokens disabled for Azure Environments?

While trying to set up a Power BI connection to the Azure Delta Lake, we ran into several issues around Service Principals. 1) The API listed on the learn.microsoft site (link 1 below) indicates that there is an API you can use to create SP tokens. Wh...

Latest Reply
meetskorun
New Contributor II
  • 4 kudos

Hello, I am new here from India, here to share some thoughts with you all.

6 More Replies
juned
by New Contributor III
  • 1205 Views
  • 2 replies
  • 1 kudos

How to install a library that is under the /Workspace/Shared/ directory using the init.sh script in a cluster?

I would like to install a library that is under the /Workspace/Shared/ directory using the init.sh script in a cluster. How to access the /Workspace/Shared/ folder in shell? This page only shows how to access manually but doesn't show how to access i...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Juned Mala, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!

1 More Reply
Phani1
by Valued Contributor
  • 1618 Views
  • 2 replies
  • 0 kudos

SUBNET_EXHAUSTED_FAILURE(CLOUD_FAILURE): or No more address space to create NIC within injected virtual network

Currently we are using an all-purpose compute cluster. When we tried to allocate the scheduled jobs to a job cluster, we were blocked by the following error: SUBNET_EXHAUSTED_FAILURE(CLOUD_FAILURE): azure_error_code: SubnetIsFull, azure_error_message: No mo...

Latest Reply
daniel_sahal
Esteemed Contributor
  • 0 kudos

Answering your questions: yes, your VNet/subnet is out of unoccupied IPs, and this can be fixed by allocating more IPs to your network address space. Each cluster requires its own IPs, so if there are none available, it simply cannot start.

1 More Reply
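As a rough, hedged sanity check for the SubnetIsFull error, you can count how many addresses a subnet CIDR actually offers: Azure reserves 5 addresses per subnet, and with VNet injection each cluster node consumes an address in each of the two delegated subnets. The CIDRs below are examples only, not values from the thread.

```python
# Hedged back-of-the-envelope check: how many addresses remain usable in a
# Databricks subnet of a given size (Azure reserves 5 per subnet).
import ipaddress

def usable_addresses(cidr: str, azure_reserved: int = 5) -> int:
    subnet = ipaddress.ip_network(cidr)
    return subnet.num_addresses - azure_reserved

for cidr in ["10.0.1.0/26", "10.0.1.0/24", "10.0.0.0/22"]:
    print(cidr, "->", usable_addresses(cidr), "usable addresses")
```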
Rishabh264
by Honored Contributor II
  • 1253 Views
  • 0 replies
  • 5 kudos

To connect Delta Lake with Microsoft Excel, you can use the Microsoft Power Query for Excel add-in.

To connect Delta Lake with Microsoft Excel, you can use the Microsoft Power Query for Excel add-in. Power Query is a data connection tool that allows you to connect to various data sources, including Delta Lake. Here's how to do it: Install the Micros...

Ajay-Pandey
by Esteemed Contributor III
  • 3308 Views
  • 5 replies
  • 18 kudos

Resolved! Fetching data in excel through delta sharing

Hi all, is there any way we can access or push data in Delta Sharing using Microsoft Excel?

Latest Reply
Rishabh264
Honored Contributor II
  • 18 kudos

Hey @Ajay Pandey, yes, a new Excel feature recently came to market that lets you enable Delta Sharing from Excel as well, so whatever changes you make to the Delta table will automatically be reflected in the Excel file too. Refer to this lin...

4 More Replies
Chris_Shehu
by Valued Contributor III
  • 669 Views
  • 1 reply
  • 2 kudos

What are the options for extracting data from the delta lake for a vendor?

Our vendor is looking to use Microsoft API Manager to retrieve data from a variety of sources. Is it possible to extract records from the delta lake by using an API? What I've tried: I reviewed the available Databricks APIs; it looks like most of them ...

Latest Reply
Chris_Shehu
Valued Contributor III
  • 2 kudos

Another possibility is to stand up a cluster and have a notebook running Flask to create an API interface. I'm still looking into options, but it seems like there should be a baked-in solution besides the JDBC connector. I'm not ...

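As a rough sketch of the "notebook running Flask" idea mentioned in the reply (not an endorsement over a SQL warehouse, JDBC/ODBC, or Delta Sharing), something like the following could expose a bounded, read-only slice of a Delta table; the table name and port are assumptions.

```python
# Hedged sketch: a tiny read-only endpoint over a Delta table, run from a
# Databricks notebook where `spark` is the session. Table name and port are
# assumptions, not values from the thread.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/records")
def records():
    # Pull a small, bounded slice so the endpoint stays responsive.
    rows = spark.table("my_catalog.my_schema.my_table").limit(100).collect()
    return jsonify([row.asDict() for row in rows])

if __name__ == "__main__":
    # Reachable only from inside the cluster/VNet unless you put a proxy in front.
    app.run(host="0.0.0.0", port=8080)
```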
hitesh1
by New Contributor III
  • 4412 Views
  • 1 reply
  • 5 kudos

java.util.NoSuchElementException: key not found

Hello, we are using Azure Databricks with a Standard DS14_v2 cluster on Runtime 9.1 LTS, Spark 3.1.2 and Scala 2.12, and we face the below issue frequently when running our ETL pipeline. As part of the operation that is failing there are several joins...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 5 kudos

Hey man, please use these configurations in your cluster and it will work:
spark.sql.storeAssignmentPolicy LEGACY
spark.sql.parquet.binaryAsString true
spark.speculation false
spark.sql.legacy.timeParserPolicy LEGACY
If it doesn't work, let me know what problem...

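For convenience, a small sketch of applying the configs from the reply in a notebook session. Note that spark.speculation is a cluster-level setting and is better placed in the cluster's Spark config (Advanced options), while the SQL settings can be set per session.

```python
# Hedged sketch: session-level application of the SQL configs from the reply.
# spark.speculation belongs in the cluster's Spark config, not in a notebook.
spark.conf.set("spark.sql.storeAssignmentPolicy", "LEGACY")
spark.conf.set("spark.sql.parquet.binaryAsString", "true")
spark.conf.set("spark.sql.legacy.timeParserPolicy", "LEGACY")
```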