Data Engineering
Forum Posts

charlieyou
by New Contributor
  • 1075 Views
  • 1 replies
  • 0 kudos

StreamingQueryException: Read timed out // Reading from delta share'd dataset

I have a workspace in GCP that's reading from a Delta-shared dataset hosted in S3. When trying to run a very basic DLT pipeline, I'm getting the below error. Any help would be awesome! Code: import dlt @dlt.table def fn(): return (spark.readStr...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Charlie You​: The error message you're encountering suggests a timeout issue when reading from the Delta-shared dataset hosted in S3. There are a few potential reasons and solutions you can explore: Network connectivity: Verify that the network conne...
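For reference, a minimal sketch of the kind of streaming DLT definition in question, assuming the share is accessed through a Delta Sharing profile file; the profile path and the share.schema.table coordinates below are hypothetical placeholders:

import dlt

SHARE_PROFILE = "/dbfs/FileStore/config.share"    # hypothetical path to the Delta Sharing profile file
SHARE_TABLE = "my_share.my_schema.my_table"       # hypothetical share.schema.table coordinates

@dlt.table(name="shared_events")
def shared_events():
    # Streaming read of a Delta Sharing table: the "deltaSharing" source takes
    # the profile file path and the table coordinates joined with "#".
    return (
        spark.readStream
        .format("deltaSharing")
        .load(f"{SHARE_PROFILE}#{SHARE_TABLE}")
    )

If the read itself is sound, the reply above is right to look first at network connectivity between the GCP workspace and the S3-hosted share.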

shubhadip
by New Contributor
  • 395 Views
  • 1 replies
  • 0 kudos

If we do Z-Order on a particular column, will Delta log stats collection be affected?

Let's assume a table contains more than 40 columns; we know Delta automatically collects stats for the first 32 columns. If we run a Z-Order on a particular column (let's say column 1), will the log file collect stats for all 32 columns or wi...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Shubhadip Ghosh​: Hope this helps. In Delta Lake, when you perform Z-Ordering on a particular column, it reorganizes the data within the files based on the values of that column. However, Z-Ordering itself does not directly affect the statistics co...
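To make the distinction concrete, a short sketch (table and column names are hypothetical; delta.dataSkippingNumIndexedCols is the standard Delta Lake property that controls how many leading columns get statistics):

# Raise the number of leading columns for which data-skipping stats are collected
# (default is 32); this is independent of Z-Ordering.
spark.sql("""
    ALTER TABLE my_schema.wide_table
    SET TBLPROPERTIES ('delta.dataSkippingNumIndexedCols' = '40')
""")

# Z-Order on column1: this clusters and rewrites data files, but does not by
# itself change which columns statistics are collected for.
spark.sql("OPTIMIZE my_schema.wide_table ZORDER BY (column1)")

Note that stats are gathered when files are written, so the wider setting applies to files rewritten by OPTIMIZE or by later writes.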

shubhadip
by New Contributor
  • 528 Views
  • 1 replies
  • 0 kudos

Will consecutive delete/insert operations affect Z-Ordering?

Let's say there is a Delta table with a date field as its partition. Using a WHERE condition, we delete all the rows for that partition. The data is then inserted into the same date partition. If we do a Z-Order after inserting the d...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Shubhadip Ghosh​: In Delta Lake, when you perform a delete operation on a table, it doesn't physically remove the data from the files. Instead, it marks the affected rows for deletion by adding a tombstone marker to the Delta transaction log. This e...
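A hedged sketch of the sequence being discussed, assuming a table partitioned by a date column (the table, partition value, Z-Order column, and new_rows_df DataFrame below are hypothetical):

# Delete all rows for one date partition; this only tombstones the old files
# in the transaction log, it does not rewrite or re-cluster anything.
spark.sql("DELETE FROM my_schema.events WHERE event_date = '2023-06-01'")

# Newly inserted rows land in fresh, un-Z-Ordered files.
new_rows_df.write.format("delta").mode("append").saveAsTable("my_schema.events")

# Re-cluster just the affected partition after the insert.
spark.sql("""
    OPTIMIZE my_schema.events
    WHERE event_date = '2023-06-01'
    ZORDER BY (sensor_id)
""")

# Optionally remove the tombstoned files once they pass the retention window.
spark.sql("VACUUM my_schema.events")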

Anonymous
by Not applicable
  • 474 Views
  • 1 replies
  • 1 kudos

Presenting the top 3 members who contributed to the Community last week, between 11th June and 17th June: @Tyler Heflin​, @Werner Stinckens​ and @Bharathan K​ ...

Presenting the top 3 members who contributed to the Community last week, between 11th June and 17th June: @Tyler Heflin​, @Werner Stinckens​ and @Bharathan K​. We would like to express our gratitude for your participation and dedication in the Databricks Commun...

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Wow!!! Exciting metrics - @Werner Stinckens​, @Tyler Heflin​, and @Bharathan K​! Congratulations!!!

Aanchal
by New Contributor III
  • 1187 Views
  • 4 replies
  • 2 kudos

Resolved! Unable to launch cluster in Databricks as my Azure subscription has been disabled

Azure subscription: disabled. Databricks subscription: free trial, 13 days left. Databricks host: Azure. The cluster is not getting created as my Azure subscription has been disabled after a month of free trial. However, Databricks subscription has still got...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Aanchal Soni​, we haven't heard from you since the last response from @Tyler Retzlaff​, and I was checking back to see if her suggestions helped you. Or else, if you have any solution, please share it with the community, as it can be helpful to o...

3 More Replies
PK225
by New Contributor III
  • 811 Views
  • 3 replies
  • 2 kudos
Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Pavan Kumar​, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks...

2 More Replies
Nikhil3107
by New Contributor III
  • 5139 Views
  • 2 replies
  • 0 kudos

Resolved! Model Serving error - Java gateway process exited before sending its port number

Hello, I am trying to serve a model endpoint (using the Databricks GUI) for a model that was successfully logged to the Model Registry. However, the endpoint creation failed with the following errors: endpoint logs with error messages; endpoint events with...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Nikhil Gajghate​, we haven't heard from you since the last response from @Kaniz Fatma​, and I was checking back to see if her suggestions helped you. Or else, if you have any solution, please share it with the community, as it can be helpful to o...

1 More Replies
shishirs29
by New Contributor II
  • 706 Views
  • 2 replies
  • 5 kudos

Resolved! Databricks voucher

I attended Databricks training last year to gain knowledge and help clients, but later found out that vouchers are also available, for which a survey needs to be completed, which I have now done. I have already given some of the exam...

Latest Reply
Anonymous
Not applicable
  • 5 kudos

Hi @Shishir Shivhare​, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answ...

1 More Replies
ashu_aith1991
by New Contributor II
  • 515 Views
  • 1 replies
  • 3 kudos

delta table

Can we connect to a Delta table in Databricks from one workspace to another in a different subscription and run the VACUUM command?

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @ASHUTOSH YADAV​ Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.

alemo
by New Contributor III
  • 467 Views
  • 1 replies
  • 1 kudos

DLT started by SERVICE_UPGRADE

Hello, I'm developing a DLT pipeline configured in continuous mode. I'm still in dev mode, so I stop my pipeline when I'm not working on it. My problem is that the pipeline is frequently started by SERVICE_UPGRADE. Example of message: 'Update xxxxx starte...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @alex mo​ Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.

Paddy_chu
by New Contributor
  • 11072 Views
  • 1 replies
  • 0 kudos

How to restart the kernel on my notebook in databricks?

While installing a Python package on my Databricks notebook, I kept getting a message saying: "Note: you may need to restart the kernel using dbutils.library.restartPython() to use updated packages." I've tried restarting my cluster, also detach ...

Latest Reply
Evan_MCK
Contributor
  • 0 kudos

dbutils.library.restartPython() - just run this code in the notebook without restarting the cluster or using pip install again. Restarting the cluster erases what you just installed with pip and you are back to square one. Restarting Python after the pi...
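A minimal sketch of that flow (the package name is just a placeholder), with the install and the restart in separate cells since %pip is a cell magic:

# Cell 1: install into the notebook-scoped Python environment
%pip install some-package==1.2.3

# Cell 2: restart only the notebook's Python process; the cluster keeps running
# and the freshly %pip-installed package is picked up after the restart
dbutils.library.restartPython()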

apiury
by New Contributor III
  • 2631 Views
  • 9 replies
  • 14 kudos

Resolved! Pipeline workflow doubt

Hi! I have a problem. I'm using an autoloader to ingest data from raw to a Delta Lake, but when my pipeline starts, I want to apply the pipeline only to the new data. The autoloader ingests data into the Delta Lake, but now, how can I distinguish the...
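A hedged sketch of one common pattern, where each stage is itself a stream with its own checkpoint so it only ever sees records it has not processed yet (paths, formats, and table names below are hypothetical):

# Stage 1: Auto Loader ingests only files it has not seen before, tracked in its
# checkpoint and schema locations.
bronze = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "s3://my-bucket/_schemas/raw_events")
    .load("s3://my-bucket/raw/")
)
(
    bronze.writeStream
    .option("checkpointLocation", "s3://my-bucket/_checkpoints/raw_to_bronze")
    .trigger(availableNow=True)   # drain the backlog of new files, then stop
    .toTable("bronze.raw_events")
)

# Stage 2: read the bronze Delta table as a stream, so this step also only
# processes rows appended since its last checkpointed run.
silver = spark.readStream.table("bronze.raw_events")
(
    silver.writeStream
    .option("checkpointLocation", "s3://my-bucket/_checkpoints/bronze_to_silver")
    .trigger(availableNow=True)
    .toTable("silver.events")
)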

Latest Reply
Anonymous
Not applicable
  • 14 kudos

Hi @Alejandro Piury Pinzón​, we haven't heard from you since the last response from @Tyler Retzlaff​, and I was checking back to see if her suggestions helped you. Or else, if you have any solution, please share it with the community, as it can be he...

8 More Replies
Tjomme
by New Contributor III
  • 5515 Views
  • 7 replies
  • 8 kudos

Resolved! How to manipulate files in an external location?

According to the documentation, the use of external locations is preferred over mount points. Unfortunately, the basic functionality to manipulate files seems to be missing. This is my scenario: create a download folder in an external locatio...

Latest Reply
Tjomme
New Contributor III
  • 8 kudos

The main problem was related to the network configuration of the storage account: Databricks did not have access. Quite strange that it did manage to create folders... Currently dbutils.fs functionality is working. For the zipfile manipulation: that on...
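For anyone landing here, a short sketch of the kind of dbutils.fs calls that work against an external location once the storage network configuration allows access (the abfss URL and file names are hypothetical placeholders):

# Basic file manipulation against an external location path.
base = "abfss://container@mystorageaccount.dfs.core.windows.net/downloads"

dbutils.fs.mkdirs(f"{base}/incoming")                                   # create a folder
dbutils.fs.cp(f"{base}/incoming/data.zip", f"{base}/archive/data.zip")  # copy a file
display(dbutils.fs.ls(base))                                            # list contents
dbutils.fs.rm(f"{base}/incoming/data.zip")                              # delete a file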

6 More Replies
simensma
by New Contributor II
  • 755 Views
  • 3 replies
  • 1 kudos

Resolved! Autoload files in wide table format, but store them unpivoted in a Streaming Table

Hey, I get data in wide-table format in a CSV file, where each sensor has its own column. I want to store it in a Delta Live streaming table. But since that is inefficient to process and store, due to varying frequency and sensor count, I want to tran...
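A hedged sketch of one way to do that unpivot inside a DLT streaming table, using stack() to turn sensor columns into rows (the column names, paths, and sensor list below are hypothetical placeholders):

import dlt
from pyspark.sql import functions as F

SENSOR_COLS = ["sensor_a", "sensor_b", "sensor_c"]   # hypothetical sensor columns

@dlt.table(name="sensor_readings_long")
def sensor_readings_long():
    wide = (
        spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "csv")
        .option("cloudFiles.schemaLocation", "/tmp/_schemas/sensors")
        .option("header", "true")
        .load("/mnt/landing/sensors/")
    )
    # stack(N, 'name1', col1, 'name2', col2, ...) emits one (sensor, value) row per
    # sensor column; dropping nulls handles sensors that report at lower frequency.
    stack_expr = ", ".join(f"'{c}', `{c}`" for c in SENSOR_COLS)
    return (
        wide.selectExpr("timestamp", f"stack({len(SENSOR_COLS)}, {stack_expr}) AS (sensor, value)")
            .where(F.col("value").isNotNull())
    )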

Latest Reply
Vartika
Moderator
  • 1 kudos

Hi @Simen Småriset​, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us s...

2 More Replies
matkap
by New Contributor II
  • 845 Views
  • 2 replies
  • 2 kudos

In the VSCode Databricks Extension, how can one specify notebook parameters to pass to a workflow job?

I have successfully used the VSCode extension for Databricks to run a notebook on a cluster from my IDE. However, in order to test effectively without changing the source, I need a way to pass parameters to the workflow job. I have tried various ways ...

Latest Reply
AsphaltDataRide
New Contributor III
  • 2 kudos

@matthew kaplan​ I am not using widgets but what works is running it by pressing F5 in the python file you want to run.
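For completeness, a hedged sketch of a widget-based pattern (parameter names are hypothetical); defaults let the same notebook run unparameterized from the IDE and parameterized when launched as a workflow job:

# Declare widgets with defaults so the notebook runs with or without job parameters.
dbutils.widgets.text("run_date", "2023-01-01")
dbutils.widgets.text("env", "dev")

run_date = dbutils.widgets.get("run_date")
env = dbutils.widgets.get("env")

print(f"Running for run_date={run_date} in env={env}")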

1 More Replies