Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

hpant
by New Contributor III
  • 2898 Views
  • 5 replies
  • 1 kudos

Autoloader error "Failed to infer schema for format json from existing files in input"

I have two JSON files in a location in Azure Gen2 storage, e.g. '/mnt/abc/Testing/'. When I try to read the files using Auto Loader I get this error: "Failed to infer schema for format json from existing files in input path /mnt/abc...

Latest Reply
holly
Databricks Employee
  • 1 kudos

Hi @hpant, would you consider testing the new VARIANT type for your JSON data? I appreciate it will require rewriting the next step in your pipeline, but it should be more robust with respect to errors. Disclaimer: I haven't personally tested variant with Autoloade...

4 More Replies
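A hedged sketch of a common workaround for this error: supply the schema explicitly (or via cloudFiles.schemaHints) so Auto Loader does not have to infer it from the existing files. The column names and schema location below are hypothetical, not from the thread:

from pyspark.sql.types import StructType, StructField, StringType, LongType

# An explicit schema sidesteps the failing inference step
schema = StructType([
    StructField("id", LongType(), True),          # hypothetical column
    StructField("payload", StringType(), True),   # hypothetical column
])

df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/abc/_schemas/testing")  # hypothetical path
    .schema(schema)
    .load("/mnt/abc/Testing/")
)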
Devsql
by New Contributor III
  • 383 Views
  • 1 reply
  • 1 kudos

For a given Notebook, how to find the calling Job

Hi Team, I came across a situation where I have a Notebook but I am not able to find the Job/DLT pipeline that calls it. Is there any query or mechanism by which I can find out (or list) the Jobs/scripts that have called a given Notebo...

Data Engineering
Azure Databricks
Latest Reply
Devsql
New Contributor III
  • 1 kudos

Hi @Retired_mod, could you please help with the above question?

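There is no built-in reverse lookup from a notebook to its callers, but the Jobs API can be scanned for tasks that reference the notebook. A minimal sketch using the databricks-sdk package; the notebook path is a placeholder:

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
target = "/Workspace/path/to/MyNotebook"  # hypothetical path of the notebook in question

# expand_tasks=True includes each job's task definitions in the listing
for job in w.jobs.list(expand_tasks=True):
    for task in (job.settings.tasks or []):
        if task.notebook_task and task.notebook_task.notebook_path == target:
            print(job.job_id, job.settings.name)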
semsim
by Contributor
  • 1103 Views
  • 1 reply
  • 0 kudos

List and iterate over files in Databricks workspace

Hi DE Community, I need to be able to list/iterate over a set of files in a specific directory within the Databricks workspace. For example: "/Workspace/SharedFiles/path/to/file_1" ... "/Workspace/SharedFiles/path/to/file_n". Thanks for your direction and ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @semsim, you can use the file system utility (dbutils.fs):
Databricks Utilities (dbutils) reference | Databricks on AWS
Work with files on Databricks | Databricks on AWS
dbutils.fs.ls("file:/Workspace/Users/<user-folder>/")

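Since workspace files are also exposed to the driver through a FUSE mount on recent runtimes, standard Python file APIs work alongside dbutils.fs. A sketch using the directory from the question:

import os

root_dir = "/Workspace/SharedFiles/path/to"
for root, dirs, files in os.walk(root_dir):
    for name in files:
        print(os.path.join(root, name))  # full path of each file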
Zeruno
by New Contributor II
  • 918 Views
  • 1 reply
  • 0 kudos

DLT - Get pipeline_id and update_id

I need to insert pipeline_id and update_id in my Delta Live Table (DLT), the point being to know which pipeline created which row. How can I obtain this information? I know you can get job_id and run_id from widgets, but I don't know if these are the s...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @Zeruno, those values are rather static. Maybe you can design a process that, as a first step, extracts the information from the List Pipelines API and saves it in a Delta table (List pipelines | Pipelines API | REST API reference | Databricks on AWS). Then in...

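A sketch of the suggested approach via the databricks-sdk wrapper around the List Pipelines API; the target table name is hypothetical:

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Snapshot pipeline metadata into a Delta table so rows can later be joined on pipeline_id
rows = [(p.pipeline_id, p.name) for p in w.pipelines.list_pipelines()]
(spark.createDataFrame(rows, "pipeline_id string, name string")
    .write.mode("overwrite")
    .saveAsTable("ops.pipeline_registry"))  # hypothetical table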
Shazaamzaa
by New Contributor III
  • 1043 Views
  • 1 reply
  • 0 kudos

Set up dbt-core with Azure Entra ID

Hey team, I'm trying to standardize the development environment setup in our team. I've written up a shell script that I want our devs to run in WSL2 after setup. The shell script does the following: 1. Set up Azure CLI - install and authenticate. 2. Ins...

Latest Reply
Shazaamzaa
New Contributor III
  • 0 kudos

Hey @Retired_mod, thanks for the response. I persisted a little more with the logs, and the issue appears to be related to WSL2 not having a backend credential manager to manage the tokens supplied by the OAuth process. To be honest, this is ...

acj1459
by New Contributor
  • 302 Views
  • 0 replies
  • 0 kudos

Azure Databricks Data Load

Hi All, I have 10 tables on an on-prem MS SQL DB and want to load the data of all 10 tables incrementally into Bronze Delta tables as append-only. From Bronze to Silver, I want to use a merge query to load the latest records into the Silver Delta tables. Whatever latest...

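For the Bronze-to-Silver step, the Delta Lake MERGE API covers the "latest record wins" pattern. A minimal sketch; table, key, and timestamp columns are hypothetical:

from delta.tables import DeltaTable

# Keep only the latest Bronze record per key before merging
updates = spark.sql("""
    SELECT * FROM (
        SELECT *, ROW_NUMBER() OVER (PARTITION BY id ORDER BY updated_at DESC) AS rn
        FROM bronze.customer
    ) latest WHERE rn = 1
""").drop("rn")

silver = DeltaTable.forName(spark, "silver.customer")
(silver.alias("t")
    .merge(updates.alias("s"), "t.id = s.id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())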
MRTN
by New Contributor III
  • 4956 Views
  • 3 replies
  • 2 kudos

Resolved! Configure multiple source paths for auto loader

I am currently using two streams to monitor data in two different containers on an Azure storage account. Is there any way to configure an autoloader to read from two different locations? The schemas of the files are identical.

Latest Reply
Anonymous
Not applicable
  • 2 kudos

@Morten Stakkeland: Yes, it's possible to configure an autoloader to read from multiple locations. You can define multiple cloudFiles sources for the autoloader, each pointing to a different container in the same storage account. In your case, since ...

2 More Replies
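One way to realize this, as a hedged sketch: define one cloudFiles source per container and union them, which works here because the schemas are identical. Paths and schema locations are placeholders:

def source_stream(path, schema_path):
    # One Auto Loader source per container
    return (spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", schema_path)
        .load(path))

df = (source_stream("abfss://container-a@account.dfs.core.windows.net/data", "/tmp/schemas/a")
    .unionByName(source_stream("abfss://container-b@account.dfs.core.windows.net/data", "/tmp/schemas/b")))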
jfvizoso
by New Contributor II
  • 9658 Views
  • 4 replies
  • 0 kudos

Can I pass parameters to a Delta Live Table pipeline at running time?

I need to execute a DLT pipeline from a Job, and I would like to know if there is any way of passing a parameter. I know you can have settings in the pipeline that you use in the DLT notebook, but it seems you can only assign values to them when crea...

Latest Reply
lprevost
Contributor
  • 0 kudos

This seems to be the key to this question: parameterize for dlt. My understanding is that you can add the parameter either in the DLT settings UI via the Advanced Config / Add Configuration key-value dialog, or via the corresponding pipeline set...

3 More Replies
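Configuration entries set on the pipeline (in the UI or the settings JSON) surface inside the DLT notebook through spark.conf, which is how a calling job can vary behavior per run. A sketch with a hypothetical key and table names:

import dlt

# Value set under Advanced Config / Add Configuration, with a fallback default
start_date = spark.conf.get("mypipeline.start_date", "2024-01-01")

@dlt.table
def filtered_events():
    return spark.read.table("source.events").where(f"event_date >= '{start_date}'")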
N_M
by Contributor
  • 10937 Views
  • 7 replies
  • 3 kudos

Resolved! use job parameters in scripts

Hi Community, I did some research but wasn't lucky, and I'm a bit surprised I can't find anything about it. I would simply like to access the job parameters when using Python scripts (not notebooks). My flow doesn't use notebooks, but I still need to dri...

Latest Reply
N_M
Contributor
  • 3 kudos

The only working workaround I found was provided in another thread: Re: Retrieve job-level parameters in Python - Databricks Community - 44720. I will repost it here (thanks @julio_resende). You need to push your parameters down to the task level. E.g.: C...

6 More Replies
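The workaround in sketch form: reference the job parameter in the Python task's parameter list with a dynamic value reference such as {{job.parameters.run_date}}, then parse it as an ordinary command-line argument in the script (the parameter name is hypothetical):

import argparse

# The task definition passes: ["--run_date", "{{job.parameters.run_date}}"]
parser = argparse.ArgumentParser()
parser.add_argument("--run_date")
args = parser.parse_args()
print(f"Running for {args.run_date}")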
Shiva3
by New Contributor III
  • 640 Views
  • 1 reply
  • 0 kudos

How to know the actual size of delta and non-delta tables, and the number of files that actually exist on S3

I have a set of delta and non-delta tables whose data is on AWS S3. I want to know the actual total size of my delta and non-delta tables, excluding files that belong to operations such as DELETE, VACUUM, etc. I also need to know how many files each delta versi...

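For the Delta side, DESCRIBE DETAIL reports the file count and size of the current snapshot only (files retained for older versions are excluded). A sketch with a hypothetical table name:

detail = spark.sql("DESCRIBE DETAIL main.sales.orders").collect()[0]  # hypothetical table
print(detail.numFiles, detail.sizeInBytes)  # current-version file count and bytes

# Per-version write metrics (e.g. numFiles, where present) can be read from the history
spark.sql("DESCRIBE HISTORY main.sales.orders").select("version", "operationMetrics").show(truncate=False)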
pjv
by New Contributor III
  • 766 Views
  • 1 reply
  • 1 kudos

Resolved! Connection error when accessing dbutils secrets

We have daily running pipelines that need to access dbutils secrets for API keys. However, when calling the dbutils.secrets.get function within our Python code we get the following error: org.apache.http.conn.HttpHostConnectException: Connect to us-central1.gcp.da...

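If the failures are transient connection errors, wrapping the lookup in a small retry with exponential backoff can make a daily run resilient. A generic sketch; the scope and key names are hypothetical:

import time

def get_secret_with_retry(scope, key, attempts=5, base_delay=2.0):
    # Retry transient connection errors with exponential backoff
    for i in range(attempts):
        try:
            return dbutils.secrets.get(scope=scope, key=key)
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** i)

api_key = get_secret_with_retry("my-scope", "my-api-key")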
erigaud
by Honored Contributor
  • 5971 Views
  • 2 replies
  • 3 kudos

Get total number of files of a Delta table

I'm looking to know programmatically how many files a delta table is made of. I know I can do %sql DESCRIBE DETAIL my_table, but that only gives me the number of files in the current version. I am looking for the total number of files (basically ...

Latest Reply
ADavid
New Contributor II
  • 3 kudos

What was the solution?

1 More Replies
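Because DESCRIBE DETAIL covers only the current snapshot, one way to count every file still present under the table location (all versions not yet vacuumed) is a recursive listing. A sketch assuming direct access to the table path, which is hypothetical here:

def count_files(path):
    # Recursively count leaf files under the table location
    total = 0
    for info in dbutils.fs.ls(path):
        if info.isDir():
            total += count_files(info.path)
        else:
            total += 1
    return total

print(count_files("abfss://container@account.dfs.core.windows.net/tables/my_table"))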
Brian-Nowak
by New Contributor II
  • 1417 Views
  • 3 replies
  • 5 kudos

DBR 15.4 LTS Beta Unable to Write Files to Azure Storage Account

Hi there! I believe I might have identified a bug with DBR 15.4 LTS Beta. The basic task of saving data to a delta table, as well as the even more basic operation of saving a file to cloud storage, is failing on 15.4 but working perfectly fine on 15.3...

Latest Reply
Ricklen
New Contributor III
  • 5 kudos

We have had the same issue since yesterday (6/8/2024), running on DBR 15.3 or 15.4 LTS Beta. It does seem to have something to do with large tables. Tried with multiple partition sizes.

2 More Replies
