Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Being_Bane
by New Contributor
  • 1265 Views
  • 1 replies
  • 1 kudos

Alternative to the inbuilt Scan method of the Databricks SDK for Golang

The implementation of the Scan method in the Databricks SDK: if the default value of any column is NULL, the Scan method returns an exception. What can we use as an alternative to this? While implementing the logic with the Scan method as per the doc...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Shivansh Bhatnagar, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.

Being_Bane
by New Contributor
  • 1435 Views
  • 1 replies
  • 0 kudos

Alternative to Scan method in Databricks SDK for Golang

We want to fetch multiple rows/records with 100+ columns (which are dynamic as per the request), but the way shown in the doc is to scan each column one by one (through the Scan method) rather than scanning the complete object at once, which is making it very tedi...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Shivansh Bhatnagar, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.

Saurabh707344
by New Contributor III
  • 18826 Views
  • 1 replies
  • 1 kudos

Platform and Approach Comparison

Does anyone have a structured and crisp comparison of the benefits of performing MLOps in the ways below, and of the strong areas of each platform? a) Standalone Databricks, where all pipelines and orchestration are done on Databricks and external third pa...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Saurabh Singh, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.

luiso
by New Contributor
  • 970 Views
  • 1 replies
  • 0 kudos
Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Luis Lopez, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.

DB_795688_DB_44
by New Contributor II
  • 2448 Views
  • 4 replies
  • 2 kudos

error: at least one column must be specified for the table.

error: at least one column must be specified for the table.

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @anand R, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we ca...

3 More Replies
selvakumar092
by New Contributor II
  • 7312 Views
  • 5 replies
  • 0 kudos

Resolved! Incremental Load without Last Modified Date and Primary Key field in Azure Data Factory to create bronze data in Databricks

I am trying to do an incremental load in Azure Data Factory. Most of the tables in the Oracle database don't have a last modified date or a primary key column. Is there any way to do incremental loading without a last modified date and primary key column?
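
A workaround sometimes used when the source offers neither a watermark column nor a key is to land a full extract each run and detect new rows by hashing them; a minimal PySpark sketch for the bronze side, with hypothetical table names staging.oracle_orders_full_extract and bronze.oracle_orders:

    from pyspark.sql import functions as F

    staged = spark.read.table("staging.oracle_orders_full_extract")  # full pull landed by ADF
    bronze = spark.read.table("bronze.oracle_orders")                # existing bronze table

    # Fingerprint each row by hashing all columns; coalesce NULLs so they hash deterministically.
    def with_row_hash(df):
        cols = [F.coalesce(F.col(c).cast("string"), F.lit("<null>")) for c in df.columns]
        return df.withColumn("row_hash", F.sha2(F.concat_ws("||", *cols), 256))

    # Append only the rows whose fingerprint is not already present in bronze.
    new_rows = (with_row_hash(staged)
                .join(with_row_hash(bronze).select("row_hash"), "row_hash", "left_anti")
                .drop("row_hash"))
    new_rows.write.format("delta").mode("append").saveAsTable("bronze.oracle_orders")

This only captures inserts; if source rows can change in place, the changed versions arrive as additional bronze rows and have to be deduplicated downstream.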

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Selva Kumar Ponnusamy, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please te...

4 More Replies
Eelke
by New Contributor II
  • 2845 Views
  • 2 replies
  • 2 kudos

I would like to create a schedule in Databricks that runs a job every two weeks on Monday night at 0:00

This seems impossible with the cron syntax that Databricks is using, but maybe I am wrong? However, if this is not possible, it seems to me to be a missing feature, and I would therefore like to suggest it to you.
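
Quartz cron, which Databricks job schedules use, has no every-second-week syntax. A common workaround is to keep a weekly Monday 00:00 schedule and let the first notebook task skip alternating ISO weeks; a minimal sketch, where the schedule expression and the chosen week parity are assumptions to adapt:

    import datetime

    # Assumes the job itself is scheduled weekly with Quartz cron "0 0 0 ? * MON"
    # (every Monday at 00:00). Skip odd ISO weeks to get an every-two-weeks cadence.
    if datetime.date.today().isocalendar()[1] % 2 != 0:
        dbutils.notebook.exit("Odd ISO week - skipping this run")  # dbutils is available in Databricks notebooks

    # ... actual job logic continues here on even ISO weeks ...

Pick the parity that matches the Monday you want the first real run on; note that ISO week numbering resets at year boundaries, so the cadence can occasionally shift by a week around New Year.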

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Eelke van Foeken, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. T...

1 More Replies
varad
by New Contributor III
  • 4880 Views
  • 6 replies
  • 8 kudos

Resolved! My exam has been suspended, need help urgently (10/06/2023)

Hello Team, I had a pathetic experience while attempting my 1st Databricks certification. Abruptly, the proctor asked me to show my desk; after I showed it, he/she asked multiple times, wasted my time, and then suspended my exam. I want to file a compla...

Latest Reply
Anonymous
Not applicable
  • 8 kudos

Hi @Varad Manglekar, glad to hear! Help us build a vibrant and resourceful community by recognizing and highlighting insightful contributions. Mark the best answers and show your appreciation!

5 More Replies
mikimiki309
by New Contributor II
  • 3157 Views
  • 4 replies
  • 2 kudos

Resolved! Voucher not received

Hi, I have attended the recent Lakehouse webinar (May) and completed Lakehouse fundamentals but still have not received the certification voucher. Kindly help.

Latest Reply
mikimiki309
New Contributor II
  • 2 kudos

Thanks @Vidula Khanna​ for your help. I have raised a ticket as suggested for this.

3 More Replies
naga_databricks
by Contributor
  • 12802 Views
  • 8 replies
  • 6 kudos

Resolved! Set timestamp column to blank when inserting a record into delta table

I am trying to insert a record into a Delta table using a notebook written in Python. This record has a timestamp column that should be blank initially; later I plan to update the timestamp value. How am I inserting the record: stmt_insert_audit_r...
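
One way to keep the column blank at insert time (a sketch, not the original stmt_insert_audit_r... statement; the table and column names audit.run_log, run_id, status and completed_at are hypothetical) is to insert an explicit NULL cast to TIMESTAMP and fill it in later with a Delta UPDATE:

    # Insert the record with the timestamp column left NULL ("blank").
    spark.sql("""
        INSERT INTO audit.run_log (run_id, status, completed_at)
        VALUES ('run-001', 'STARTED', CAST(NULL AS TIMESTAMP))
    """)

    # Later, set the timestamp with a Delta UPDATE.
    spark.sql("""
        UPDATE audit.run_log
        SET completed_at = current_timestamp(), status = 'FINISHED'
        WHERE run_id = 'run-001'
    """)

This assumes the timestamp column is nullable; if it was created with a NOT NULL constraint, the insert itself will fail.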

Latest Reply
Anonymous
Not applicable
  • 6 kudos

Hi @Naga Vaibhav Elluru, elevate our community by acknowledging exceptional contributions. Your participation in marking the best answers is a testament to our collective pursuit of knowledge.

7 More Replies
Christine
by Contributor II
  • 31711 Views
  • 4 replies
  • 1 kudos

Resolved! Is it possible to import functions from a module in Workspace/Shared instead of Repos?

Hi, I am considering creating libraries for my Databricks notebooks, and found that it is possible to import functions from modules saved in Repos. Is it possible to move the .py files with the functions to Workspace/Shared and still import functions ...
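
On cluster runtimes that expose workspace files under the /Workspace path, one approach is to append the shared folder to sys.path and import as usual; a sketch, assuming a hypothetical module at /Workspace/Shared/my_lib/helpers.py that defines clean_column_names:

    import sys

    # Hypothetical location of the shared module folder.
    sys.path.append("/Workspace/Shared/my_lib")

    # helpers.py and clean_column_names are placeholders for your own module and function.
    from helpers import clean_column_names

Repos add their root to sys.path automatically for notebooks, which is why the extra line is needed once the files move to Workspace/Shared.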

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Christine Pedersen, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell ...

3 More Replies
KarthikeyanB
by New Contributor II
  • 2715 Views
  • 1 replies
  • 2 kudos

Window function + Multiple simultaneous aggregations

Hi team, why is there no support for performing multiple aggregations together with a single window spec? I.e., I don't want to specify each aggregation separately, and I don't want to see each aggregation performed as a separate piece of work. Or if there is ind...
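
For context, one window spec can already be shared by several window functions in a single select; whether Spark evaluates them in one pass is what the reply below confirms. A minimal PySpark sketch over a hypothetical orders DataFrame with customer_id, order_ts and amount columns:

    from pyspark.sql import functions as F
    from pyspark.sql.window import Window

    # One window spec, shared by all three running aggregations below.
    w = (Window.partitionBy("customer_id")
               .orderBy("order_ts")
               .rowsBetween(Window.unboundedPreceding, Window.currentRow))

    result = orders.select(
        "*",
        F.sum("amount").over(w).alias("running_amount"),
        F.count("*").over(w).alias("running_order_count"),
        F.avg("amount").over(w).alias("running_avg_amount"),
    )

Because all three functions share the same partitioning and ordering, Spark can evaluate them over a single window rather than repeating the work for each aggregation.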

Latest Reply
KarthikeyanB
New Contributor II
  • 2 kudos

Hi @Kaniz Fatma, firstly, thank you very much for responding. Thank you for confirming that performing multiple aggregations using a single window spec does NOT evaluate the window spec separately each time. My bad for the wrong understanding earlier.

leelingmin
by Databricks Employee
  • 1457 Views
  • 0 replies
  • 1 kudos

dbricks.co

Hi everyone! We're excited to gather everyone for Data + AI Summit 2023, the premier AI and LLM global event for the data, analytics and AI community. Join thousands of data engineers, data scientists and data analyst experts virtually from June 29–3...

Eelke
by New Contributor II
  • 7093 Views
  • 3 replies
  • 0 kudos

I want to perform interpolation on a streaming table in delta live tables.

I have the following code:

    from pyspark.sql.functions import *
    !pip install dbl-tempo
    from tempo import TSDF

    # interpolate target_cols column linearly for tsdf dataframe
    def interpolate_tsdf(tsdf_data, target_c...

Latest Reply
Eelke
New Contributor II
  • 0 kudos

The issue was not resolved because we were trying to use a streaming table within TSDF which does not work.
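
For reference, the batch-read variant of the same idea; a sketch assuming the dbl-tempo TSDF API (TSDF(df, ts_col=..., partition_cols=...) and interpolate(freq, func, target_cols, method)) and a hypothetical sensor_readings table with device_id, event_ts and value columns. The table is read with spark.read rather than as a stream, since, as noted above, TSDF does not work on a streaming table:

    # %pip install dbl-tempo
    from tempo import TSDF

    df = spark.read.table("sensor_readings")  # batch read, not readStream

    tsdf = TSDF(df, ts_col="event_ts", partition_cols=["device_id"])

    # Resample to 1-minute buckets and linearly interpolate gaps in `value`.
    interpolated = tsdf.interpolate(
        freq="1 minute",
        func="mean",
        target_cols=["value"],
        method="linear",
    ).df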

2 More Replies
