Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

slothPetete
by New Contributor II
  • 2042 Views
  • 1 reply
  • 0 kudos

Error with mosaic.enable_mosaic() when creating a DLT pipeline with the Mosaic lib

The error was raised when I tried to start a DLT pipeline with simple code, just to start experimenting with DLT. The primary library was Mosaic, which the docs instruct you to install before importing. The code is roughly as follows: %pip ins...

Data Engineering
Delta Live Table
dlt
geospatial
mosaic
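
A minimal sketch of the install-then-enable pattern the post describes, assuming the databricks-mosaic package; the table and column names below are hypothetical, and the %pip line would live in its own notebook cell:

    # %pip install databricks-mosaic   (notebook-scoped install, run before the imports)

    import dlt
    import mosaic as mos

    mos.enable_mosaic(spark, dbutils)  # the call the post reports failing inside a DLT pipeline

    @dlt.table(comment="Example table using a Mosaic geometry function")
    def geo_points():
        df = spark.createDataFrame([("POINT (30 10)",)], ["wkt"])
        return df.withColumn("geom", mos.st_geomfromwkt("wkt"))
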
Jaynab_1
by New Contributor
  • 944 Views
  • 0 replies
  • 0 kudos

Trying to calculate Zonal_stats using Mosaic and H3

I am trying to calculate Zonal_stats for raster data using Mosaic and H3. I created a dataframe mapping the geometry data to H3 indexes. Previously I was calculating Zonal_stats in Python using rasterio, a TIF file, and the geometry data, which is slow. Now I want to ex...

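A hedged sketch of the aggregation step being described, assuming the raster has already been flattened into one row per pixel (lon, lat, value) and that the built-in h3_longlatash3 function (Databricks Runtime 11.3+) is available; the table, column names, and resolution are hypothetical:

    from pyspark.sql import functions as F

    pixels = spark.table("raster_pixels")  # hypothetical table with one row per raster pixel

    zonal_stats = (
        pixels
        .withColumn("h3_cell", F.expr("h3_longlatash3(lon, lat, 9)"))  # resolution 9 is an arbitrary choice
        .groupBy("h3_cell")
        .agg(
            F.count("value").alias("pixel_count"),
            F.mean("value").alias("mean_value"),
            F.min("value").alias("min_value"),
            F.max("value").alias("max_value"),
        )
    )
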
ac0
by Contributor
  • 2478 Views
  • 3 replies
  • 4 kudos

Resolved! Can I have additional logic in a DLT notebook that is unrelated to directly creating DLTs?

I have an Azure Storage Data Table that I would like to update based on records that were just streamed into a Delta Live Table. Below is example code: @dlt.create_table( comment="comment", table_properties={ "pipelines.autoOptimize.managed": ...

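For context, a minimal sketch of the table definition the post's snippet starts from, using the current dlt.table decorator; the property value and the streaming source are assumptions:

    import dlt

    @dlt.table(
        comment="comment",
        table_properties={"pipelines.autoOptimize.managed": "true"},
    )
    def my_table():
        # the post asks whether unrelated side-effect logic (e.g. updating an
        # Azure Storage Table) can live in the same notebook as this definition
        return spark.readStream.table("source_table")  # hypothetical source
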
Latest Reply
jose_gonzalez
Databricks Employee
  • 4 kudos

Hi @ac0, please check @raphaelblg's response and let us know if it helped resolve your issue. If it did, please mark it as the accepted solution.

2 More Replies
Jennifer
by New Contributor III
  • 6685 Views
  • 5 replies
  • 1 kudos

Resolved! Importing a Python file into a notebook doesn't work

I followed the documentation here, under the section "Import a file into a notebook", to import a Python file shared among the notebooks used by a Delta Live Table pipeline. But it can sometimes find the module and sometimes not, returning the exception No module named ...

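A small sketch of the pattern that documentation section describes: append the directory containing the shared file to sys.path before importing it. The workspace path, module, and function names are hypothetical:

    import os
    import sys

    sys.path.append(os.path.abspath("/Workspace/Repos/my_user/my_repo/shared"))

    from my_helpers import build_silver_frame  # resolves once the directory is on sys.path
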
Latest Reply
Vartika
Databricks Employee
  • 1 kudos

Thank you so much for getting back to us, @Jennifer MJ. It's really great of you to send in the solution. Would you be happy to mark the answer as best so other community members can find the solution quickly and easily? We really appreciate your ti...

4 More Replies
117074
by New Contributor III
  • 5492 Views
  • 2 replies
  • 0 kudos

Notebook Visualisations suddenly not working

Hi all, I have a Python script which runs SQL code against our Delta Live Tables and returns a pandas dataframe. I do this multiple times and then use 'display(pandas_dataframe)'. Once this displays, I then create a visualization from the UI, which is t...

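A minimal sketch of the pattern the post describes, with a hypothetical query; display() renders the pandas result so a visualization can then be added from the UI:

    sdf = spark.sql("SELECT category, SUM(amount) AS total FROM my_dlt_table GROUP BY category")
    pandas_dataframe = sdf.toPandas()
    display(pandas_dataframe)
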
Latest Reply
117074
New Contributor III
  • 0 kudos

Thank you for the detailed response Kaniz, I appreciate it! I do think it may have been cache issues, due to there being no Spark computation running when the error occurred. It did lead me down a train of thought... is it possible to extract t...

1 More Replies
MichTalebzadeh
by Valued Contributor
  • 2394 Views
  • 1 reply
  • 1 kudos

Working with a text file compressed by bz2 and then zip in PySpark

I have downloaded Amazon reviews for sentiment analysis from here. The file is not particularly large (just over 500MB) but comes in the following format: test.ft.txt.bz2.zip. So it is a text file that is compressed by bz2 and then zipped. Now I would like t...

Data Engineering
pyspark
zip
Latest Reply
MichTalebzadeh
Valued Contributor
  • 1 kudos

Thanks for your reply @Retired_mod. On the face of it, Spark can handle both .bz2 and .zip. In practice it does not work with both at the same time. You end up with illegible characters as text. I suspect it handles decompression of the outer layer (in t...

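A hedged sketch of the two-step workaround the reply implies: strip the outer zip layer with plain Python first, then let Spark decompress the inner .bz2 natively while reading; the paths are hypothetical:

    import zipfile

    # 1) remove the outer zip layer (Spark will not unwrap both layers in one read)
    with zipfile.ZipFile("/dbfs/tmp/test.ft.txt.bz2.zip") as zf:
        zf.extractall("/dbfs/tmp/amazon_reviews/")

    # 2) Spark reads bz2-compressed text files transparently
    df = spark.read.text("dbfs:/tmp/amazon_reviews/test.ft.txt.bz2")
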
Miro_ta
by New Contributor III
  • 10470 Views
  • 8 replies
  • 4 kudos

Resolved! Can't query delta tables, token missing required scope

Hello, I've correctly set up a stream from Kinesis, but I can't read anything from my Delta table. I'm actually reproducing the demo from Frank Munz: https://github.com/fmunz/delta-live-tables-notebooks/tree/main/motion-demo and I'm running the following...

Latest Reply
holly
Databricks Employee
  • 4 kudos

Hello, I also had this issue. It was because I was trying to read a DLT table with a Machine Learning Runtime. At time of writing, Machine Learning Runtimes are not compatible with shared access mode, so I ended up setting up two clusters, one MLR as...

7 More Replies
Ela
by New Contributor III
  • 1376 Views
  • 1 reply
  • 1 kudos

Checking for availability of dynamic data masking functionality in SQL.

I am looking for functionality similar to Snowflake's, which allows attaching a masking policy to an existing column. The documents I found relate to masking with encryption, but my use case is on an existing table. Solutions using views along with Dynamic Vie...

Latest Reply
sivankumar86
New Contributor II
  • 1 kudos

Unity Catalog provides a similar feature: https://docs.databricks.com/en/data-governance/unity-catalog/row-and-column-filters.html

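To illustrate the linked feature, a hedged sketch of a Unity Catalog column mask attached to an existing column, following the syntax on that page; the function, group, table, and column names are hypothetical:

    # masking function: only members of the named group see the clear value
    spark.sql("""
        CREATE OR REPLACE FUNCTION main.default.ssn_mask(ssn STRING)
        RETURN CASE WHEN is_account_group_member('hr_admins') THEN ssn ELSE '***-**-****' END
    """)

    # attach the mask to a column of an existing table
    spark.sql("ALTER TABLE main.default.employees ALTER COLUMN ssn SET MASK main.default.ssn_mask")
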
thethirtyfour
by New Contributor III
  • 2468 Views
  • 2 replies
  • 1 kudos

Resolved! Error installing the igraph and networkD3 libraries

Hi! I am trying to install the igraph and networkD3 CRAN packages for use within a notebook. However, I am receiving the installation error below when attempting to do so. Could someone please assist? Thank you! * installing *source* package ‘igraph’ .....

Latest Reply
haleight-dc
New Contributor III
  • 1 kudos

Hi! I just figured this out myself. I'm not sure why this is suddenly occurring, since igraph has always loaded fine for me in Databricks but didn't this week. I found that the following solution worked. In your notebook, before installing your R libra...

1 More Replies
addy
by New Contributor III
  • 6105 Views
  • 2 replies
  • 1 kudos

Reading a table from a catalog that is in a different/external workspace

I am trying to read a table that is hosted on a different workspace. We have been told to establish a connection to said workspace and consume the table. The code I am using is: from databricks import sql; connection = sql.connect(server_hostnam...

Data Engineering
catalog
Databricks
sql
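
A minimal sketch of the connection pattern the post's snippet starts from, using the databricks-sql-connector package; the hostname, HTTP path, token, and table name are hypothetical placeholders:

    from databricks import sql  # pip install databricks-sql-connector

    connection = sql.connect(
        server_hostname="adb-1234567890123456.7.azuredatabricks.net",  # the other workspace
        http_path="/sql/1.0/warehouses/abc123def456",                   # a SQL warehouse there
        access_token="<personal-access-token>",
    )

    with connection.cursor() as cursor:
        cursor.execute("SELECT * FROM my_catalog.my_schema.my_table LIMIT 10")
        rows = cursor.fetchall()

    connection.close()
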
Latest Reply
AlliaKhosla
Databricks Employee
  • 1 kudos

Hi Addy, greetings! You can also use Delta Sharing to share the data across multiple workspaces. Since you want to read tables from another workspace, you can use Databricks-to-Databricks Delta Sharing: https://docs.databricks.com/en/data-sharing/read...

1 More Replies
Data_Engineer3
by Contributor III
  • 2023 Views
  • 3 replies
  • 0 kudos

Live Spark driver log analysis

In Databricks, if we want to see the live log of the execution, we can view it on the driver log page of the cluster. But there we can't search by keyword; instead we have to download each one-hour log file, and live logs are ...

Latest Reply
Data_Engineer3
Contributor III
  • 0 kudos

Hi @shan_chandra, that is like putting our driver log into another cloud platform. But here I want to check the live log with local machine tools; is this possible?

2 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group