Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

lezwon
by Contributor
  • 1605 Views
  • 2 replies
  • 1 kudos

Resolved! At least 1 "file_arrival" blocks are required.

Hi folks, I'm trying to set up a Databricks Asset Bundle for a job that loads some product data into Databricks. The job was created in Databricks and loads the data from a location hardcoded into the notebook (for now). It is supposed to run every 3 h...

Latest Reply
Tharani
New Contributor III

I think since it is a scheduled job, you have to explicitly specify a cron-based schedule instead of using `file_arrival` in the `trigger` section of the YAML file.
1 More Replies
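The suggestion above can be sketched in the bundle YAML. A minimal fragment (the job, task, and notebook names here are made up for illustration) that uses a cron schedule instead of a `file_arrival` trigger:

```yaml
# databricks.yml fragment - job/task/notebook names are hypothetical.
# A time-based job declares `schedule` with a Quartz cron expression;
# `trigger.file_arrival` is only for jobs that fire on new files.
resources:
  jobs:
    load_product_data:
      name: load_product_data
      schedule:
        quartz_cron_expression: "0 0 0/3 * * ?"  # every 3 hours
        timezone_id: UTC
      tasks:
        - task_key: load
          notebook_task:
            notebook_path: ./src/load_product_data.py
```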
Vittorio
by New Contributor II
  • 1444 Views
  • 1 reply
  • 1 kudos

Month-on-month growth with pivot

I need to create pivot tables with revenue/cost data presented monthly, and I need to show month-on-month growth. It seems like mission impossible with Dashboards on SQL Warehouse, despite being quite obviously a very typical task. Pivot tabl...

Latest Reply
Brahmareddy
Esteemed Contributor

Hi Vittorio, how are you doing today? As per my understanding, you're absolutely right: creating a proper pivot table with dynamic month-over-month (MoM) growth in Databricks SQL Dashboards is surprisingly tricky for such a common use case. The built...
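For the underlying query, a `LAG` window function gives the MoM growth directly. The sketch below uses sqlite3 with made-up revenue figures purely to illustrate the SQL pattern; the same query shape works against a table on a Databricks SQL warehouse feeding the dashboard:

```python
import sqlite3

# Hypothetical monthly revenue figures, used only to demonstrate the
# LAG-based month-over-month (MoM) growth calculation.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE revenue (month TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO revenue VALUES (?, ?)",
    [("2024-01", 100.0), ("2024-02", 110.0), ("2024-03", 121.0)],
)

# LAG(amount) fetches the previous month's value, so growth is
# (current - previous) / previous, expressed here as a percentage.
# The first month has no predecessor, so its growth is NULL.
rows = conn.execute("""
    SELECT month,
           amount,
           ROUND((amount - LAG(amount) OVER (ORDER BY month)) * 100.0
                 / LAG(amount) OVER (ORDER BY month), 1) AS mom_growth_pct
    FROM revenue
    ORDER BY month
""").fetchall()

for month, amount, growth in rows:
    print(month, amount, growth)
conn.close()
```

In a dashboard, this query becomes the dataset and the pivot widget only has to display `mom_growth_pct` per month, rather than compute it.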
standup1
by Contributor
  • 4175 Views
  • 3 replies
  • 1 kudos

Recover a deleted DLT pipeline

Hello, does anyone know how to recover a deleted DLT pipeline, or at least the deleted tables that were managed by the DLT pipeline? We have a pipeline that stopped working and was throwing all kinds of errors, so we decided to create a new one and de...

Latest Reply
Nishair05
New Contributor II

Found a way to recover the tables, but it seems like we need to recover the pipeline as well. Any idea on how to recover the pipeline?
2 More Replies
Phani1
by Databricks MVP
  • 3426 Views
  • 1 reply
  • 0 kudos

Dashboard deployment

Hi Team, how can we deploy a dashboard from one Databricks account to another Databricks/client account without revealing the underlying notebook code? Regards, Phani

Latest Reply
Advika
Community Manager

Hello @Phani1! Could you confirm whether you're referring to an AI/BI Dashboard or a Notebook Dashboard?
balu_9309
by New Contributor II
  • 1377 Views
  • 3 replies
  • 0 kudos

Connect Databricks job runs with Power BI

Hi, I have Databricks job runs. How can I connect them to a Power BI app, or save the run output to Blob storage or a Delta table?

Latest Reply
chexa_Wee
New Contributor III

You can connect Databricks to Power BI using the "Get Data" option. To do this, you need to provide the necessary cluster details and then connect to the Delta tables in Databricks. This allows Power BI to access and analyze data stored in Databricks...
2 More Replies
RolandCVaillant
by New Contributor II
  • 4073 Views
  • 1 reply
  • 2 kudos

Databricks notebook dashboard export

After the latest Databricks update, my team can no longer download internal notebook dashboards in the dashboard view as .html files. When downloading, the entire code is always exported as an .html file. Is there a way to export just the notebook da...

Latest Reply
Advika
Community Manager

Hello @RolandCVaillant! For AI/BI dashboards, you can download the rendered view as a PDF after publishing the dashboard. However, there isn’t a direct way for notebook dashboards. Instead, you can export the notebook by selecting File > Export in th...
carlos_tasayco
by Contributor
  • 2017 Views
  • 1 reply
  • 0 kudos

Databricks app share with people out of your organization

Yes, that is it: can we share our app with people who are not from our organization? https://www.databricks.com/blog/introducing-databricks-apps

Latest Reply
Shua42
Databricks Employee

Currently, apps can only be accessed by users in the workspace where they are deployed. One workaround, depending on the number of external users you want to grant access to, would be the following: create a new workspace. Add external users to...
shubham_meshram
by Databricks Partner
  • 1504 Views
  • 2 replies
  • 2 kudos

Databricks Dashboards Maps

I am currently unable to use the Maps (Choropleth) feature in Databricks dashboards; however, I am able to use it in the legacy dashboard. I would like my users to use a single dashboard that has all the combined features rather than exposing them to 2 differ...

Latest Reply
shubham_meshram
Databricks Partner

Thanks for your input @Brahmareddy. I just met some folks from Databricks at a booth in Missouri and they confirmed end of April would be the planned timeline for the maps feature release.
1 More Replies
jeremy98
by Honored Contributor
  • 3431 Views
  • 1 reply
  • 0 kudos

Where is the wheel package saved?

Hi community, we have deployed the wheel package internally in our bundle repository:

artifacts:
  rnc_lib:
    type: whl
    build: poetry build
    path: .

# For passing wheel package to workspace
sync:
  include:
    - ./dist/*.whl

The problem is t...

Latest Reply
Renu_
Valued Contributor II

Hi @jeremy98, you can upload the wheel to a shared workspace location and configure it for cluster-level installation by attaching it as a library. Or you can automate the process by adding the wheel to the libraries section of your databricks.yml...
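Renu_'s second option, adding the wheel to the libraries section of databricks.yml, could look roughly like the fragment below (the job name, task key, and entry point are assumptions; `rnc_lib` comes from the post):

```yaml
# databricks.yml fragment - attaching the built wheel to a job task.
# `entry_point` must match a console-script name declared when the
# wheel is built (e.g. under [tool.poetry.scripts]).
resources:
  jobs:
    rnc_job:
      tasks:
        - task_key: main
          python_wheel_task:
            package_name: rnc_lib
            entry_point: main      # assumed entry point
          libraries:
            - whl: ./dist/*.whl
```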
naineel
by New Contributor
  • 701 Views
  • 1 reply
  • 0 kudos

Scheduling a Complete Python Project in Databricks

Hi everyone, I have a simple Python project with the following structure:

root/
│── src/
│   ├── package_name/
│   │   ├── __init__.py
│   │   ├── main.py
│   │   ├── submodules1/
│   │   │   ├── __init__.py
│   │   │   ├── base1.py
│   ...

Latest Reply
ashraf1395
Honored Contributor

Hi there @naineel, one approach is to convert your project into a .whl file, then create a Python wheel task for it and schedule that: https://docs.databricks.com/aws/en/jobs/python-wheel
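For a src/ layout like the one in the post, packaging it as a wheel might look like this sketch (all metadata and the entry-point function are assumptions); running `python -m build` then produces a `dist/*.whl` that a Databricks Python wheel task can run on a schedule:

```toml
# pyproject.toml sketch - names and version are hypothetical.
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"

[project]
name = "package_name"
version = "0.1.0"

# Console script the wheel task's entry_point would reference;
# assumes main.py defines a main() function.
[project.scripts]
main = "package_name.main:main"

# Tell setuptools the packages live under src/.
[tool.setuptools.packages.find]
where = ["src"]
```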
Subhasis
by New Contributor III
  • 949 Views
  • 2 replies
  • 0 kudos

Unity catalog codebase

How can I get the code base of Unity Catalog?

Latest Reply
Subhasis
New Contributor III

I'm referring to the .dbc file that Databricks used to provide earlier for practicing queries. Is it still possible to get that?
1 More Replies
gchandra
by Databricks Employee
  • 3564 Views
  • 5 replies
  • 3 kudos

Resolved! Databricks Community Edition - DBFS Alternative Solutions

Option 1: Mount an AWS S3 bucket

access_key = ""
secret_key = ""
encoded_secret_key = secret_key.replace("/", "%2F")

aws_bucket_name = "yourawsbucketname/"
mount_name = "youraliasmountname"

# dbutils.fs.unmount(f"/mnt/{mount_name}")
dbutils.fs.mount(f"s3a:/...

Latest Reply
DanT
New Contributor II

Seems to be removed again? Can't see options.
4 More Replies
jhgorse
by New Contributor III
  • 3099 Views
  • 1 reply
  • 0 kudos

mqtt to Delta Live Table

Greetings, I see that Delta Live Tables has various real-time connectors such as Kafka, Kinesis, Google's Pub/Sub, and so on. I also see that Apache had maintained an MQTT connector to Spark through the 2.x series, called Bahir, but dropped it in versi...

Latest Reply
KK0001
New Contributor II

How did this solution end up?
alex_crow
by New Contributor II
  • 23622 Views
  • 7 replies
  • 1 kudos

ModuleNotFoundError: No module named 'databricks.sdk' in module installed via Pip

Hello. I'm currently having an issue that I simply cannot understand nor find an adequate work-around for. Recently, my team within our organization has undergone the effort of migrating our Python code from Databricks notebooks into regular Python m...

Latest Reply
ferdinand
New Contributor II

Lol, OK, so in my case it was because I had a file called databricks.py which clashed with the installed databricks package. Renaming my file to databricks_utils.py solved it.
6 More Replies
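The failure mode in this thread is generic Python module shadowing: a local file named after a package wins over the installed one, because the script's directory comes first on sys.path. A small self-contained repro, using the stdlib `json` module as a stand-in for `databricks`:

```python
import os
import subprocess
import sys
import tempfile

# A local json.py shadows the stdlib json module, just as a local
# databricks.py shadows the installed databricks package.
with tempfile.TemporaryDirectory() as d:
    # The shadowing file defines none of the real module's API.
    with open(os.path.join(d, "json.py"), "w") as f:
        f.write("shadowed = True\n")
    script = os.path.join(d, "main.py")
    with open(script, "w") as f:
        f.write("import json; print(hasattr(json, 'loads'))\n")
    out = subprocess.run(
        [sys.executable, script], capture_output=True, text=True
    )

# The local json.py wins, so the real json.loads is missing.
print(out.stdout.strip())  # prints "False"
```

Renaming the local file, as ferdinand did, removes the shadow and restores the installed package.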
Niil
by New Contributor II
  • 2986 Views
  • 1 reply
  • 2 kudos

Resolved! AI Agents in ETL

Hi, I recently found a blog online about Databricks using AI Agents to automate ETL, but I can't find where these capabilities are located in Databricks. Does anyone know? Here is the blog: https://www.heliverse.com/blog/databricks-ai-agents-streamlini...

Latest Reply
santhakumar11
New Contributor III

Hi Niil, Databricks has introduced AI Agent sub-categories as part of its Generative AI capabilities. These can automate tasks such as Extract, Transform, and Load (ETL). For example, the Information Extraction Agent can transform a large volume ...