Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

pjp94
by Contributor
  • 2166 Views
  • 4 replies
  • 9 kudos

Databricks Job - Notebook Execution

Question - When you set a recurring job to simply run a notebook, does Databricks clear the state of the notebook prior to executing it? If not, can I configure it to make sure it clears the state before running?

Latest Reply
Anonymous
Not applicable
  • 9 kudos

@Paras Patel - Would you be happy to mark Hubert's answer as best so that other members can find the solution more easily? Thanks!

3 More Replies
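As background to the question above, a scheduled notebook job runs the notebook in a fresh execution context each run, so cell state does not carry over between runs. A minimal sketch of defining such a job through the Jobs API 2.1 follows; the job name, notebook path, cluster id and cron expression are placeholders, not values from this thread:

```python
# Hedged sketch: a Jobs API 2.1 `create` request body that schedules a
# notebook task. Each scheduled run executes the notebook in a fresh
# context. All concrete values below are placeholders.

def notebook_job_payload(name, notebook_path, cluster_id, cron):
    """Return a dict shaped like a Jobs API 2.1 job-create request body."""
    return {
        "name": name,
        "tasks": [
            {
                "task_key": "run_notebook",
                "notebook_task": {"notebook_path": notebook_path},
                "existing_cluster_id": cluster_id,
            }
        ],
        "schedule": {
            "quartz_cron_expression": cron,
            "timezone_id": "UTC",
        },
    }

payload = notebook_job_payload(
    "nightly-refresh", "/Repos/demo/refresh", "0000-000000-abcd123", "0 0 2 * * ?"
)
```

The payload would typically be POSTed to `/api/2.1/jobs/create` with a bearer token.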
morganmazouchi
by Databricks Employee
  • 7250 Views
  • 7 replies
  • 2 kudos

Resolved! Incremental updates in Delta Live Tables

What happens if we change the logic for the Delta Live Tables and do an incremental update? Does the table get reset (refreshed) automatically, or would the logic only be applied to new incoming data? Would we have to trigger a reset in this case?

Latest Reply
morganmazouchi
Databricks Employee
  • 2 kudos

Here is my finding on when to refresh (reset) the table: if it is a complete table, all the changes are applied automatically. If the table is an incremental table, you need to do a manual reset (full refresh).

6 More Replies
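A manual reset like the one described above can also be triggered programmatically. A sketch of what the request to the Delta Live Tables pipelines REST API could look like; the endpoint path and the `full_refresh` flag follow the public API, while the host and pipeline id are placeholders:

```python
# Hedged sketch: starting a DLT pipeline update with a full refresh via the
# REST API. Host and pipeline id below are made-up placeholders.

def full_refresh_request(host, pipeline_id):
    """Return (url, body) for a pipeline update that fully refreshes tables."""
    url = f"{host}/api/2.0/pipelines/{pipeline_id}/updates"
    body = {"full_refresh": True}  # re-materialises incremental tables
    return url, body

url, body = full_refresh_request("https://example.cloud.databricks.com", "1234-abcd")
# One would then POST `body` to `url` with a personal access token.
```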
Kody_Devl
by New Contributor II
  • 4968 Views
  • 3 replies
  • 2 kudos

%SQL Append null values into a SQL Table

Hi All, I am new to Databricks and am writing my first program. Note: code shown below. I am creating a table with 3 columns to store data. 2 of the columns will be appended in from data that I have in another table. When I run my append query into the...

Latest Reply
Kody_Devl
New Contributor II
  • 2 kudos

Hi Hubert, your answer moves me closer to being able to update a 26-field MMR_Restated table in pieces, as the correct field values are calculated through the process. I have been looking for a way to be able to update in "pieces"...... 2 fie...

2 More Replies
RiyazAli
by Valued Contributor II
  • 12112 Views
  • 7 replies
  • 4 kudos

Issue while trying to read a text file in Databricks using the local file APIs instead of the Spark API.

I'm trying to read a small txt file which is added as a table to the default DB on Databricks. While trying to read the file via the local file API, I get a `FileNotFoundError`, but I'm able to read the same file as a Spark RDD using SparkContext. Please fi...

Latest Reply
-werners-
Esteemed Contributor III
  • 4 kudos

Can you try with /dbfs/FileStore/tables/boringwords.txt?

6 More Replies
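The suggestion above reflects how the same DBFS file has two addresses: Spark APIs take the `dbfs:/...` URI, while the local file APIs (plain `open()`) need the `/dbfs/...` FUSE mount path. A small converter, assuming the standard mount:

```python
# Hedged sketch: translate a dbfs:/ URI into the /dbfs/ local (FUSE) path
# that Python's built-in file APIs can open on a Databricks cluster.

def to_local_path(dbfs_uri: str) -> str:
    """Translate a dbfs:/ URI to its /dbfs/ local path; pass others through."""
    if dbfs_uri.startswith("dbfs:/"):
        return "/dbfs/" + dbfs_uri[len("dbfs:/"):]
    return dbfs_uri

local = to_local_path("dbfs:/FileStore/tables/boringwords.txt")
# On a cluster one would then read it with: open(local).read()
```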
ak09
by New Contributor
  • 856 Views
  • 0 replies
  • 0 kudos

Triggering Notebook in Azure Repos via Azure DevOps

I have been using the Databricks workspace for all my data science projects in my firm. In my current project, I have built a CI pipeline using databricks-cli & Azure DevOps. Using databricks-cli I can trigger the notebook which is present in my workspa...

tarente
by New Contributor III
  • 3508 Views
  • 3 replies
  • 3 kudos

Partitioned parquet table (folder) with different structure

Hi, we have a parquet table (folder) in an Azure Storage Account. The table is partitioned by column PeriodId (representing a day in the format YYYYMMDD) and has data from 20181001 until 20211121 (yesterday). We have a new development that adds a new column ...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 3 kudos

I think the problem is in the overwrite: when you overwrite, it overwrites all folders. The solution is to mix append with dynamic overwrite so it will overwrite only the folders which have data and doesn't affect old partitions: spark.conf.set("spark.sql.sources.pa...

2 More Replies
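A sketch of the dynamic-overwrite approach described in that reply: with `partitionOverwriteMode` set to `dynamic`, an overwrite only replaces the partitions present in the incoming DataFrame instead of wiping the whole folder. The conf key is the standard Spark one; `df`, the path and the partition column are placeholders and the write itself is untested here:

```python
# Hedged sketch of dynamic partition overwrite. Assumes a Databricks/Spark
# runtime; the function is illustrative and is not executed below.

DYNAMIC_OVERWRITE_CONF = {"spark.sql.sources.partitionOverwriteMode": "dynamic"}

def overwrite_changed_partitions(df, path, partition_col="PeriodId"):
    """Overwrite only the partitions that appear in `df`, leaving the rest intact."""
    spark = df.sparkSession
    for key, value in DYNAMIC_OVERWRITE_CONF.items():
        spark.conf.set(key, value)
    (df.write
       .mode("overwrite")
       .partitionBy(partition_col)
       .parquet(path))
```

With the `static` default, the same `overwrite` would delete all existing PeriodId folders first, which matches the problem described in the thread.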
Khaled
by New Contributor III
  • 3699 Views
  • 4 replies
  • 2 kudos

Uploading CSV to Databricks community edition

When I upload a 1 GB CSV file from my PC in the upload area, it uploads until the file reaches some point and then disappears; for example, it reaches 600 MB and then disappears from that place.

Latest Reply
jose_gonzalez
Databricks Employee
  • 2 kudos

Hi @Khaled ALZHARANI, I would also recommend splitting up your CSV files into smaller files.

3 More Replies
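A minimal sketch of that splitting suggestion in pure Python, keeping the header row on every chunk so each piece remains a valid CSV on its own; the sample data is made up:

```python
# Hedged sketch: split a large CSV (as a list of lines) into smaller chunks,
# each carrying the header row, so they can be uploaded one at a time.

def split_csv(lines, rows_per_chunk):
    """Yield lists of lines; every chunk starts with the header row."""
    header, *rows = lines
    for start in range(0, len(rows), rows_per_chunk):
        yield [header] + rows[start:start + rows_per_chunk]

chunks = list(split_csv(["id,name", "1,a", "2,b", "3,c"], 2))
# chunks[0] == ["id,name", "1,a", "2,b"]; chunks[1] == ["id,name", "3,c"]
```

For a real 1 GB file one would stream line-by-line from disk rather than hold all lines in memory.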
tap
by New Contributor III
  • 9813 Views
  • 8 replies
  • 10 kudos

Could Not Connect to ADLS Gen2 Using ABFSS

I'm new to Databricks and not sure what I can do about this issue. I run a simple command to list all file paths but get an SSLHandshakeException. Is there any way to resolve this? The full error message: ExecutionError Traceback (most recent ca...

Latest Reply
Anonymous
Not applicable
  • 10 kudos

@suet pooi tan - Thank you for letting us know.

7 More Replies
pantelis_mare
by Contributor III
  • 5500 Views
  • 6 replies
  • 1 kudos

Delta merge file size control

Hello community! I have a rather weird issue where a delta merge is writing very big files (~1GB) that slow down my pipeline. Here is some context: I have a dataframe containing updates for several dates in the past. The current and last day contain the vast...

Latest Reply
pantelis_mare
Contributor III
  • 1 kudos

Hello Jose, I just went with splitting the merge in 2, so I have one merge that touches many partitions but few rows per file, and a second that touches 2-3 partitions but contains the bulk of the data.

5 More Replies
-werners-
by Esteemed Contributor III
  • 2350 Views
  • 5 replies
  • 22 kudos

Look what I just saw appearing in my notebook: a data histogram of your dataframe!

Latest Reply
-werners-
Esteemed Contributor III
  • 22 kudos

You heard it first in here! https://databricks.com/blog/2021/12/07/introducing-data-profiles-in-the-databricks-notebook.html

4 More Replies
Nilave
by New Contributor III
  • 4698 Views
  • 2 replies
  • 1 kudos

Resolved! Solution for API hosted on Databricks

I'm using Azure Databricks Python notebooks. We are preparing a front end to display the Databricks tables via an API to query the tables. Is there a solution from Databricks to host callable APIs for querying its tables and sending the result as a response to the fro...

Latest Reply
Nilave
New Contributor III
  • 1 kudos

@Prabakar Ammeappin - Thanks for the link. I was also wondering whether, for a web page front end, it would be more effective to query from a SQL database or from Azure Databricks tables. If from an Azure SQL database, is there any efficient way to sync the tables from Az...

1 More Replies
SailajaB
by Valued Contributor III
  • 1972 Views
  • 4 replies
  • 4 kudos

Facing a format issue while converting one type of nested JSON to a brand new JSON schema

Hi, we are writing our flattened JSON dataframe to a user-defined nested-schema JSON using PySpark in Databricks. But we are not getting the expected format. Expecting: {"ID":"aaa","c_id":[{"con":null,"createdate":"2015-10-09T00:00:00Z","data":null,"id":"1"},...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 4 kudos

As @werners said, you need to share the code. If it is dataframe-to-JSON, you probably need to use StructType with an Array to get that list, but without the code it is hard to help.

3 More Replies
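To make the target shape concrete, here is the same reshaping in plain Python rather than the poster's PySpark: grouping flat records under a parent key so each parent carries a list of child dicts, matching the expected `{"ID": ..., "c_id": [...]}` format. The field names follow the snippet in the post; the grouping logic is an assumption about what was intended:

```python
# Hedged sketch: nest flat rows into {"ID": ..., "c_id": [...]} records,
# mirroring the expected output quoted in the question.

def nest_records(flat_rows):
    """Group every non-ID field of each row into a `c_id` array per parent ID."""
    out = {}
    for row in flat_rows:
        parent = out.setdefault(row["ID"], {"ID": row["ID"], "c_id": []})
        child = {k: v for k, v in row.items() if k != "ID"}
        parent["c_id"].append(child)
    return list(out.values())

nested = nest_records([
    {"ID": "aaa", "con": None, "createdate": "2015-10-09T00:00:00Z",
     "data": None, "id": "1"},
])
```

In PySpark the equivalent would collect the child columns into an array of structs per ID, e.g. via `groupBy("ID")` with `collect_list(struct(...))`.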
JD2
by Contributor
  • 5055 Views
  • 4 replies
  • 4 kudos

Resolved! Databricks Delta Table

Hello: I am new to Databricks and need a little help with Delta table creation. I am having great difficulty understanding the creation of Delta tables: - Do I need to create an S3 bucket for the Delta table? If YES, then do I have to mount it on the mountpoint...

Latest Reply
mathan_pillai
Databricks Employee
  • 4 kudos

Hi Jay, I would suggest starting by creating a managed Delta table. Please run a simple command: CREATE TABLE events(id long) USING DELTA. This will create a managed Delta table called "events". Then perform %sql describe extended events. The above command ...

3 More Replies
Siddhesh2525
by New Contributor III
  • 8623 Views
  • 4 replies
  • 4 kudos

How to set retry attempts and an email alert with the error message of a Databricks notebook

How do I set retry attempts in a Databricks notebook, such that if any cmd/cell fails (for example because of a connection issue), that particular cmd/cell is rerun?

Latest Reply
Siddhesh2525
New Contributor III
  • 4 kudos

"You can just implement try/except in a cell, handling it by using dbutils.notebook.exit(jobId); using other dbutils can also help." @HubertDudek, as I am a fresher in Databricks, could you please suggest/explain this to me in detail?

3 More Replies
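A minimal sketch of the per-cell try/except idea discussed above, for transient failures such as connection issues. Whole-notebook retries are better configured on the job itself (the task's max retries setting); this helper only retries one operation inside a cell, and the `flaky` function below is a stand-in for real work:

```python
# Hedged sketch: retry a single callable inside a notebook cell, re-raising
# the last error if all attempts fail.
import time

def with_retries(fn, attempts=3, delay_seconds=0.0):
    """Call fn(); on failure retry up to `attempts` times, re-raising the last error."""
    last_error = None
    for _ in range(attempts):
        try:
            return fn()
        except Exception as exc:  # narrow to the expected error type in practice
            last_error = exc
            time.sleep(delay_seconds)
    raise last_error

calls = []
def flaky():
    """Placeholder workload: fails twice, then succeeds."""
    calls.append(1)
    if len(calls) < 3:
        raise ConnectionError("transient")
    return "ok"

result = with_retries(flaky, attempts=5)
```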
Brose
by New Contributor III
  • 13255 Views
  • 9 replies
  • 2 kudos

Creating a delta table Mismatch Input Error

I am trying to create a Delta table for streaming data, but I am getting the following error: Error in SQL statement: ParseException: mismatched input 'CREATE' expecting {<EOF>, ';'} (line 2, pos 0). My statement is as follows: %sql DROP TABLE IF EXISTS ...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

@Ambrose Walker​ - If Jose's answer resolved your issue, would you be happy to mark that post as best? That will help others find the solution more quickly.

8 More Replies
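That ParseException typically appears when several SQL statements reach a parser that expects exactly one. One workaround is to run the statements one at a time; the splitter below is a naive illustration (it does not handle `;` inside string literals), and the table DDL is adapted from the question:

```python
# Hedged sketch: split a multi-statement SQL script on ';' so each statement
# can be executed individually (e.g. via spark.sql on Databricks).

def split_statements(sql_text):
    """Split a script on ';' into individual non-empty statements."""
    return [s.strip() for s in sql_text.split(";") if s.strip()]

stmts = split_statements("""
DROP TABLE IF EXISTS events;
CREATE TABLE events (id LONG) USING DELTA;
""")
# On Databricks one would then run: for s in stmts: spark.sql(s)
```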

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group