Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

thisisadarshsin
by New Contributor II
  • 4235 Views
  • 12 replies
  • 0 kudos

Permission issue in Fundamentals of the Databricks Lakehouse Platform Quiz

Hi, I am getting this error when I try to take the Fundamentals of the Databricks Lakehouse Platform exam: 403 FORBIDDEN. You don't have permission to access this page. 2023-05-20 12:37:41 | Error 403 | https://customer-academy.databricks.com/ I al...

Latest Reply
Advika_
Databricks Employee
  • 0 kudos

Hello, everyone! We are sorry to hear you're having trouble accessing the quiz. Please note that the Lakehouse Fundamentals course has been replaced by Databricks Fundamentals, along with updated content. Try logging into your account directly by u...

11 More Replies
eballinger
by Contributor
  • 81 Views
  • 2 replies
  • 0 kudos

Email notification to end users

Is there a way we can notify all of our Databricks end users by email when there is an issue? We currently have our jobs set up to notify the technical team when a job workflow fails. That part works fine. But we would like the ability to maybe us...

Latest Reply
Isi
Contributor III
  • 0 kudos

Hey @eballinger, Step 1: Add a notification task to your workflow. The first thing you should do is add an extra task to your Databricks job/Airflow DAG/etc., and set its dependency to "at least one failed". This way, if any upstream task fails, the n...
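One way to sketch that notification task (recipients, SMTP host, and job names below are assumptions, not from the thread): a small notebook task that only runs when an upstream task fails, composing a plain-text email to end users.

```python
# Sketch of a notification task: in the Databricks Jobs UI this task would
# depend on the rest of the workflow with its "Run if" condition set to
# "At least one failed". Recipients and the SMTP host are hypothetical.
import smtplib
from email.message import EmailMessage

ALERT_RECIPIENTS = ["end-users@example.com"]  # hypothetical distribution list

def build_failure_email(job_name: str, run_url: str) -> EmailMessage:
    """Compose a plain-text alert telling end users a workflow failed."""
    msg = EmailMessage()
    msg["Subject"] = f"[Databricks] Job '{job_name}' failed"
    msg["To"] = ", ".join(ALERT_RECIPIENTS)
    msg.set_content(
        f"The workflow '{job_name}' did not complete successfully.\n"
        f"Run details: {run_url}\n"
        "The technical team has been notified and is investigating."
    )
    return msg

# In the task body you would then deliver it, e.g.:
# with smtplib.SMTP("smtp.example.com") as server:  # hypothetical host
#     server.send_message(build_failure_email("daily_etl", run_url))
```

Because the task only fires on the "at least one failed" condition, end users get mailed exactly when something upstream breaks, while the existing technical-team notifications stay untouched.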

1 More Replies
dbxlearner
by New Contributor II
  • 109 Views
  • 1 reply
  • 0 kudos

Deploying using Databricks asset bundles (DABs) in a closed network

Hello, I'm trying to deploy Databricks workflows using DABs from an Azure DevOps pipeline, in a network that cannot download the required Terraform Databricks provider package online due to firewall/network restrictions. I have followed this post: https://...

Latest Reply
dbxlearner
New Contributor II
  • 0 kudos

Another thing I noticed is that when running the 'databricks bundle debug terraform' command, it mentions these variables: I have tried setting these variables as environment variables in my ADO pipeline, especially the Databricks Terraform provider variab...

NathanC0926
by New Contributor
  • 177 Views
  • 1 reply
  • 0 kudos

Delta Live Tables (streaming tables) for Excel (.xlsx, .xls)

What's the native way to ingest Excel files using a streaming table? I would like it so that when the Excel files land in Unity Catalog, they get picked up and loaded into the streaming table. The data is small, so we can afford some kind of UDF, but we really n...

Latest Reply
LRALVA
Honored Contributor
  • 0 kudos

Hi @NathanC0926, ingesting Excel files with streaming tables requires a combination of Databricks Auto Loader (for file discovery and exactly-once processing) and a custom UDF for Excel parsing. Here's the native approach. Key features of this solution: 1. E...
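A rough sketch of that Auto Loader + UDF combination (all paths and names are placeholders, and pandas/openpyxl are assumed to be installed on the cluster, none of which is confirmed in the thread):

```python
# Sketch: Auto Loader discovers new Excel files as binary, and a parsing
# helper turns each workbook into rows. pandas/openpyxl are assumed cluster
# libraries; the volume path below is a placeholder.
import io

EXCEL_SUFFIXES = (".xlsx", ".xls")

def is_excel_file(path: str) -> bool:
    """Filter discovered files down to Excel workbooks."""
    return path.lower().endswith(EXCEL_SUFFIXES)

def parse_excel_bytes(content: bytes) -> list:
    """Parse one workbook (raw bytes) into a list of row dicts."""
    import pandas as pd  # assumed available on the cluster (openpyxl engine)
    return pd.read_excel(io.BytesIO(content)).to_dict("records")

# On a cluster, the streaming side might look like (placeholder path):
# raw = (spark.readStream.format("cloudFiles")
#        .option("cloudFiles.format", "binaryFile")
#        .load("/Volumes/main/ingest/excel_drop"))
# ...then apply parse_excel_bytes via a UDF and write to the streaming table.
```

Reading the files as `binaryFile` keeps Auto Loader's exactly-once file tracking, while the actual Excel parsing happens in ordinary Python, which is fine here since the data is small.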

FranPérez
by New Contributor III
  • 12845 Views
  • 8 replies
  • 4 kudos

set PYTHONPATH when executing workflows

I set up a workflow using 2 tasks. Just for demo purposes, I'm using an interactive cluster for running the workflow. { "task_key": "prepare", "spark_python_task": { "python_file": "file...
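One hedged workaround for this kind of setup (not confirmed in the thread, and the path below is a placeholder): since an environment-level PYTHONPATH does not always reach each task's interpreter, each task can prepend its source root to `sys.path` before importing shared modules.

```python
# Sketch: make shared modules importable inside a job task by prepending
# the project root to sys.path. The repo path is a placeholder.
import sys

def ensure_on_path(path: str) -> None:
    """Idempotently put a directory at the front of sys.path."""
    if path not in sys.path:
        sys.path.insert(0, path)

ensure_on_path("/Workspace/Repos/project/src")  # placeholder repo root
# import shared_utils  # hypothetical module, now resolvable
```

Calling this at the top of every task's entry script avoids depending on how the cluster propagates environment variables to `spark_python_task` processes.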

Latest Reply
jose_gonzalez
Databricks Employee
  • 4 kudos

Hi @Fran Pérez, just a friendly follow-up. Did any of the responses help you resolve your question? If so, please mark it as best. Otherwise, please let us know if you still need help.

7 More Replies
lumen
by New Contributor
  • 171 Views
  • 3 replies
  • 1 kudos

Notebook ID in DESCRIBE HISTORY not showing

We've recently installed Databricks 14.3 LTS with Unity Catalog, and for some reason that escapes me, the notebook ID is not showing up when I execute the DESCRIBE HISTORY SQL command. Example below for table test_catalog.lineagedemo.lm_lineage_tes...

Latest Reply
lumen
New Contributor
  • 1 kudos

Hi @RameshRetnasamy, first off, thank you so much for taking the time to reply to my question. In my case they were indeed created via notebooks, but I'll re-evaluate on my end as I might've missed something. If the issue persists, I'll re-assert the q...

2 More Replies
fellipeao
by New Contributor
  • 109 Views
  • 2 replies
  • 0 kudos

How to create parameters that work in Power BI Report Builder (SSRS)

Hello! I'm trying to create an item in Power BI Report Server (SSRS) connected to Databricks. I can connect normally, but I'm having trouble using a parameter that Databricks recognizes. First, I'll illustrate what I do when I connect to SQL Server and...

Latest Reply
fellipeao
New Contributor
  • 0 kudos

We can't use linked servers/procedures because we will stop using SQL Server Management Studio and focus only on the lakehouse/Databricks. But it's a path that would work for now. Because of this, we are looking for a solution 100% in Databricks.

1 More Replies
Vinoth_nirmal
by New Contributor
  • 132 Views
  • 4 replies
  • 0 kudos

Not able to create and start a cluster

Hi Team, I am trying to use Community Edition for learning; below are my URL details: https://community.cloud.databricks.com/compute/interactive?o=2059657917292434 For some reason, my clusters are taking nearly 45 to 60 minutes to create, and after if ...

Latest Reply
Vinoth_nirmal
New Contributor
  • 0 kudos

Hi @nikhilj0421, still the same issue. I can see it works up to Spark 15.4 LTS; above 15.4 LTS, whatever version I use, it's not working and I am not able to create any cluster.

3 More Replies
hao-uit
by New Contributor
  • 256 Views
  • 1 reply
  • 0 kudos

Spark Streaming job gets stuck in the "Stream Initializing" stage

Hello all, I am having an issue with my Spark Streaming job. It is stuck at the "Stream Initializing" stage. I need your help to understand what is happening inside the "Stream Initializing" stage of a Spark Streaming job that makes it take so long. Here are...

Latest Reply
nikhilj0421
Databricks Employee
  • 0 kudos

Hi @hao-uit, do you see any kind of load on the driver, and in the event logs? Also, what libraries have you installed on your cluster?

dipanjannet
by New Contributor
  • 123 Views
  • 1 reply
  • 0 kudos

Anyone using Databricks Query Federation for ETL purposes?

Hello All, we have a use case to fetch data from a SQL Server where we have some tables to consume. This is typically an OLTP setup where the data comes in at a regular interval. Now, as we have Unity Catalog enabled, we are interested in exploring Databr...

Latest Reply
nikhilj0421
Databricks Employee
  • 0 kudos

Hi @dipanjannet, you can leverage the DLT feature to do so. Please check: https://docs.databricks.com/aws/en/dlt/transform and https://docs.databricks.com/aws/en/dlt/stateful-processing Here is the step-by-step tutorial: https://docs.databricks.com/aws/en/d...
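As a hedged illustration of the combination (catalog, schema, and table names below are placeholders, not from the thread): once the SQL Server connection is registered as a Unity Catalog foreign catalog via Lakehouse Federation, its tables can feed a DLT pipeline like any other UC table.

```python
# Sketch: reference a federated SQL Server table by its three-level
# Unity Catalog name. All identifiers here are placeholders.

def uc_table_name(catalog: str, schema: str, table: str) -> str:
    """Build the three-level Unity Catalog identifier."""
    return f"{catalog}.{schema}.{table}"

# Inside a DLT pipeline notebook (runs only in a pipeline context):
# import dlt
# @dlt.table(name="orders_bronze")
# def orders_bronze():
#     return spark.read.table(uc_table_name("sqlsrv_fed", "dbo", "orders"))
```

The federated table is read on a schedule by the pipeline, so the OLTP source is only queried at the pipeline's cadence rather than continuously.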

dev_puli
by New Contributor III
  • 44731 Views
  • 7 replies
  • 8 kudos

How to read a CSV file from a user's workspace

Hi! I have been carrying out a POC, so I created a CSV file in my workspace and tried to read the content using the techniques below in a Python notebook, but it did not work. Option 1: repo_file = "/Workspace/Users/u1@org.com/csv files/f1.csv" tmp_file_na...

Latest Reply
MujtabaNoori
New Contributor III
  • 8 kudos

Hi @Dev, generally what happens is that Spark reader APIs point to DBFS by default. To read a file from the user workspace, we need to prepend 'file:/' to the path. Thanks
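A minimal sketch of that fix (the path echoes the question's example; the helper name is an assumption): prefix the workspace path with the `file:` scheme so Spark reads the driver's local filesystem instead of DBFS.

```python
# Sketch: Spark readers resolve bare paths against DBFS, so a Workspace
# file needs an explicit "file:" scheme to be read from the local filesystem.
def to_local_uri(path: str) -> str:
    """Prefix a path with file: unless it already carries the scheme."""
    return path if path.startswith("file:") else f"file:{path}"

repo_file = "/Workspace/Users/u1@org.com/csv files/f1.csv"  # from the question
# On a cluster:
# df = spark.read.csv(to_local_uri(repo_file), header=True)
```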

6 More Replies
Miloud_G
by New Contributor
  • 63 Views
  • 0 replies
  • 0 kudos

Issue with databricks bundle deploy

Hi, I am trying to configure a Databricks Asset Bundle, but got an error on deployment:
databricks bundle init ----------- OK
databricks bundle validate ----- OK
databricks bundle deploy ------ Fail
Error: PS C:\Databricks_DABs\DABs_Init\DABS_Init> databricks b...

  • 63 Views
  • 0 replies
  • 0 kudos
jigar191089
by New Contributor II
  • 388 Views
  • 6 replies
  • 0 kudos

Multiple concurrent jobs using interactive cluster

Hi All, I have a notebook in Databricks. This notebook is executed from an Azure Data Factory pipeline via a Databricks Notebook activity, with a linked service connected to an interactive cluster. When multiple concurrent runs of this pipeline are created, I...

Labels: Data Engineering, azure, Databricks, interactive cluster
Latest Reply
nikhilj0421
Databricks Employee
  • 0 kudos

What libraries are you installing in your cluster?

5 More Replies
