Data Engineering
Forum Posts

MathewDRitch
by Visitor
  • 19 Views
  • 2 replies
  • 0 kudos

Connecting from Databricks to Network Path

Hi All, I would appreciate some reference links on connecting from Databricks to an external network path. I have Databricks on AWS and previously used to connect to files on an external network path using the mount method. Now Databri...

Latest Reply
MathewDRitch

Currently we connect to cloud storage as external storage using Unity Catalog. We have not yet connected to the on-premises network storage, which is the solution we are currently looking for.
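
For readers comparing this with the old mount approach, here is a minimal sketch of the Unity Catalog external-location pattern described above. It covers cloud storage only (not on-premises network paths, which remain the open question); all names and paths are placeholders, and the storage credential is assumed to exist already.

```python
# Minimal sketch of the Unity Catalog external-location approach mentioned above.
# All names (ext_raw_data, my_storage_credential, the bucket path, the group) are
# placeholders, and the storage credential is assumed to exist already.
spark.sql("""
    CREATE EXTERNAL LOCATION IF NOT EXISTS ext_raw_data
    URL 's3://my-bucket/raw-data'
    WITH (STORAGE CREDENTIAL my_storage_credential)
""")

# Grant read access on the location to a group.
spark.sql("GRANT READ FILES ON EXTERNAL LOCATION ext_raw_data TO `data_engineers`")

# Files under the location can then be read directly by cloud path (no mount needed).
df = spark.read.format("csv").option("header", "true").load("s3://my-bucket/raw-data/")
```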

CDICSteph
by New Contributor
  • 649 Views
  • 3 replies
  • 0 kudos

permission denied listing external volume when using vscode databricks extension

Hey, I'm using the Databricks extension for VS Code (Databricks Connect v2). When using dbutils to list an external volume defined in UC like so: dbutils.fs.ls("/Volumes/dev/bronze/rawdatafiles/") I get this error: "databricks.sdk.errors.mapping.PermissionD...
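
The thread does not confirm a fix, but here is a hedged sketch of the Unity Catalog privileges typically needed before such a listing succeeds. The catalog/schema/volume names come from the post; the principal is a placeholder, and cluster access mode can also matter.

```python
# Hedged sketch: Unity Catalog privileges typically required before dbutils.fs.ls()
# can list a volume. Catalog/schema/volume names come from the post above;
# `someone@example.com` is a placeholder principal. Cluster access mode (shared /
# single user) can also affect whether the call is allowed.
spark.sql("GRANT USE CATALOG ON CATALOG dev TO `someone@example.com`")
spark.sql("GRANT USE SCHEMA ON SCHEMA dev.bronze TO `someone@example.com`")
spark.sql("GRANT READ VOLUME ON VOLUME dev.bronze.rawdatafiles TO `someone@example.com`")

# With those grants in place, the original call should succeed:
dbutils.fs.ls("/Volumes/dev/bronze/rawdatafiles/")
```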

Latest Reply
lukasjh
Visitor

We still face the problem (UC-enabled shared cluster). Is there any resolution? @Kaniz

Kroy
by Contributor
  • 1360 Views
  • 8 replies
  • 1 kudos

Resolved! What is the difference between streaming and a streaming live table?

Can anyone explain in layman's terms what the difference is between streaming and a streaming live table?

Latest Reply
CharlesReily
New Contributor III

Streaming, in a broad sense, refers to the continuous flow of data over a network. It allows you to watch or listen to content in real time without having to download the entire file first. A "Streaming Live Table" might refer to a specific type of ...
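
To make the distinction concrete, here is a minimal sketch of a streaming table declared inside a Delta Live Tables pipeline, contrasted with a plain Structured Streaming query. The table name and source path are placeholders.

```python
# Minimal sketch (placeholder names/paths) of a streaming table declared inside a
# Delta Live Tables pipeline. DLT manages checkpoints, retries and orchestration;
# a "plain" Structured Streaming query is started and managed by you instead.
import dlt

@dlt.table(name="events_bronze", comment="Incrementally ingested raw events")
def events_bronze():
    # Auto Loader keeps picking up new files as they arrive.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/Volumes/dev/raw/events/")   # placeholder path
    )

# Roughly equivalent "plain" Structured Streaming, outside DLT:
# (spark.readStream.format("cloudFiles").option("cloudFiles.format", "json")
#      .load("/Volumes/dev/raw/events/")
#      .writeStream.option("checkpointLocation", "/tmp/chk").table("events_bronze"))
```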

kiko_roy
by New Contributor III
  • 1454 Views
  • 5 replies
  • 3 kudos

Resolved! Permission error loading DataFrame from Azure Unity Catalog to GCS bucket

I am creating a DataFrame by reading a table's data residing in an Azure-backed Unity Catalog. I need to write the DataFrame or file to a GCS bucket. I have configured the Spark cluster config using the GCP service account JSON values. On running df1.write.for...

Labels: Data Engineering, GCS bucket, permission error
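
Not the thread's confirmed resolution, but a hedged sketch of the write path being discussed, assuming the cluster is configured with a GCS service-account key (the Spark config keys shown follow the usual GCS-connector naming) and that the service account has storage permissions on the bucket; table and bucket names are placeholders.

```python
# Hedged sketch of the GCS write discussed above; not the thread's confirmed fix.
# Bucket, catalog and table names are placeholders. The service account must have
# storage permissions on the bucket, which is the usual cause of PERMISSION_DENIED.
df1 = spark.read.table("my_catalog.my_schema.my_table")   # placeholder UC table

# Typically set in the cluster's Spark config rather than in the notebook
# (key names follow the usual Hadoop GCS-connector convention):
#   spark.hadoop.google.cloud.auth.service.account.enable true
#   spark.hadoop.fs.gs.auth.service.account.email <sa-email>
#   spark.hadoop.fs.gs.auth.service.account.private.key <private-key>
#   spark.hadoop.fs.gs.auth.service.account.private.key.id <key-id>
#   spark.hadoop.fs.gs.project.id <gcp-project-id>

(df1.write.format("delta")
     .mode("overwrite")
     .save("gs://my-gcs-bucket/exports/my_table"))         # placeholder bucket path
```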
Latest Reply
ruloweb
New Contributor

Hi, is there any Terraform resource to apply this GRANT, or does this always have to be done manually?

leireroman
by New Contributor
  • 69 Views
  • 1 reply
  • 0 kudos

Bootstrap Timeout during job cluster start

My job was not able to start because I got this problem in the job cluster. This job is running on an Azure Databricks workspace that has been deployed for almost a year, and I have not had this error before. It is deployed in North Europe. After getting...

(Screenshot attached: leireroman_0-1713160992292.png)
Latest Reply
lukasjh
Visitor

We have the same problem occurring randomly since yesterday in two workspaces. The cluster started fine this morning at 08:00, but failed again from around 09:00 onwards.

MrJava
by New Contributor III
  • 3282 Views
  • 9 replies
  • 9 kudos

How to know, who started a job run?

Hi there! We have different jobs/workflows configured in our Databricks workspace running on AWS and would like to know who actually started a job run. Are they started by a user or by a service principal using curl? Currently one can only see who is t...
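
One hedged way to answer this today is to query the audit log system table, assuming system tables are enabled in the workspace; the table, column, and action names below follow the documented system.access.audit schema but should be treated as assumptions and verified.

```python
# Hedged sketch: reading the audit log system table to see who triggered job runs.
# Assumes system tables are enabled; the schema (user_identity.email, request_params,
# action names such as 'runNow'/'submitRun') follows the documented audit log format
# but should be verified in your workspace.
runs = spark.sql("""
    SELECT event_time,
           user_identity.email      AS triggered_by,
           request_params['job_id'] AS job_id,
           action_name
    FROM system.access.audit
    WHERE service_name = 'jobs'
      AND action_name IN ('runNow', 'submitRun')
    ORDER BY event_time DESC
""")
display(runs)
```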

Latest Reply
leonorgrosso

I've just posted this idea on the Databricks Idea Portal regarding this subject. Upvote it so it may get developed! https://feedback.azure.com/d365community/idea/5d0fdbbf-eefb-ee11-a73c-0022485313bb

Karene
by New Contributor
  • 47 Views
  • 1 reply
  • 0 kudos

Databricks Connection to Redash

Hello, I am trying to connect my Redash account with Databricks so that my organization can run queries on the data in Unity Catalog from Redash. I followed the steps in the documentation and managed to connect successfully. However, I am only ...

Latest Reply
Kaniz
Community Manager

Hi @Karene, Thanks for reaching out! We'll look into this and get back to you with an answer shortly. Thanks for your patience!

AnkithP
by Visitor
  • 33 Views
  • 0 replies
  • 0 kudos

Infer schema eliminating leading zeros.

When reading a CSV file with schema inference enabled, I've noticed that a column originally designated as string datatype contains numeric values with leading zeros. However, upon reading the data into a PySpark DataFrame, it undergoes automatic conver...
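
A minimal sketch of the usual workaround: supply an explicit schema (or disable inference) so the column stays a string and keeps its leading zeros; the file path and column names are placeholders.

```python
# Minimal sketch: preserve leading zeros by declaring the column as a string instead
# of relying on schema inference. File path and column names are placeholders.
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

schema = StructType([
    StructField("account_id", StringType(), True),   # keeps values like "000123"
    StructField("amount", DoubleType(), True),
])

df = (spark.read
      .option("header", "true")
      .schema(schema)              # no inferSchema, so "000123" stays a string
      .csv("/Volumes/dev/raw/accounts.csv"))

# Alternative: read everything as strings, then cast the columns you need:
# df = spark.read.option("header", "true").option("inferSchema", "false").csv(path)
```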

Mutharasu
by New Contributor II
  • 1195 Views
  • 6 replies
  • 6 kudos

SAP BusinessObjects (BO) Integration with Databricks

Hi Team, We are analyzing how SAP BusinessObjects can connect with Databricks and build a report on top of the data in the data lakehouse. In our current architecture we have Delta tables on top of S3 storage. Please let us know any connectors/d...

Latest Reply
bharat4880
Visitor

Hi @HB83, may I know which version of BO you are using? We have a similar requirement.

Ravikumashi
by Contributor
  • 315 Views
  • 3 replies
  • 0 kudos

Issue with Azure Databricks workspace after we disable public network access

Hi All, We had Azure Databricks workspaces created through Terraform with public network access set to true, and everything was working great. Recently we disabled public network access and started to face issues. Terraform is unable to add us...

Latest Reply
Ravikumashi
Contributor

We use the following code to create the private endpoint, and in the UI we can see the private endpoint connection status as approved: resource "azurerm_private_endpoint" "example" { name = "example-endpoint" location = azurerm_re...

Anske
by New Contributor
  • 414 Views
  • 1 reply
  • 0 kudos

One-time backfill for DLT streaming table before apply_changes

Hi, absolute Databricks noob here, but I'm trying to set up a DLT pipeline that processes CDC records from an external SQL Server instance to create a mirrored table in my Databricks delta lakehouse. For this, I need to do some initial one-time backfi...

Labels: Data Engineering, Delta Live Tables
Latest Reply
Anske
New Contributor

Since nobody responded, I decided to try my own suggestion and hack the snapshot data into the table that gathers the change data capture. After some experimenting, I ended up with the notebook as attached. The notebook first creates 2 DLT tables (lookup...
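
For readers landing here, a hedged sketch of the apply_changes pattern the notebook describes, with the one-time snapshot unioned into the CDC source view; all table, key, and sequence column names are placeholders, and sequencing snapshot rows at 0 is an assumption, not a confirmed solution.

```python
# Hedged sketch of the pattern described above: union a one-time snapshot into the
# CDC source view, then let apply_changes build the mirrored table. All table, key
# and sequence column names are placeholders; tagging snapshot rows with sequence 0
# (so later CDC rows win) is an assumption, not a confirmed solution.
import dlt
from pyspark.sql.functions import expr, lit, col

@dlt.view(name="customer_changes")
def customer_changes():
    cdc = spark.readStream.table("cdc_landing.customer_changes")         # ongoing CDC feed
    snapshot = (spark.readStream.table("cdc_landing.customer_snapshot")  # one-time backfill
                .withColumn("operation", lit("INSERT"))
                .withColumn("change_seq", lit(0)))
    return cdc.unionByName(snapshot, allowMissingColumns=True)

dlt.create_streaming_table("customer_mirror")

dlt.apply_changes(
    target="customer_mirror",
    source="customer_changes",
    keys=["customer_id"],
    sequence_by=col("change_seq"),
    apply_as_deletes=expr("operation = 'DELETE'"),
)
```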
