Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

prashantjjain33
by New Contributor II
  • 1874 Views
  • 3 replies
  • 0 kudos

databricks_error_message:REQUEST_LIMIT_EXCEEDED:

A Databricks job failed unexpectedly with the error below. There were only 5 jobs running at the time and no major operations. What could be the root cause, and how can we avoid this in future? Cluster '0331-xxxxxx-zs8i8pcn' was terminated. Reason: INIT_SC...

Latest Reply
saurabh18cs
Honored Contributor III
  • 0 kudos

Are you generating any Databricks tokens in this process? If yes, there is a limit of 600.

  • 0 kudos
2 More Replies
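The 600-token cap mentioned in the reply above can be monitored programmatically. A minimal sketch, assuming the token list has already been fetched (the dict shape mirrors the token_infos array returned by the Databricks Token API's GET /api/2.0/token/list); the function name and headroom value are illustrative, not part of any Databricks API:

```python
# Sketch: given the token list for a workspace, pick the oldest
# tokens to revoke so the count stays safely under the 600 limit.
# tokens_to_revoke() and headroom are hypothetical helpers, not a
# Databricks API; verify field names against your workspace's
# actual GET /api/2.0/token/list response.

def tokens_to_revoke(token_infos, limit=600, headroom=10):
    """Return the oldest tokens to delete so the total stays
    below `limit - headroom`; empty list if already under it."""
    target = limit - headroom
    if len(token_infos) <= target:
        return []
    excess = len(token_infos) - target
    oldest_first = sorted(token_infos, key=lambda t: t["creation_time"])
    return oldest_first[:excess]
```

Each returned entry's token_id could then be passed to the token-delete endpoint to free up quota before the limit is hit.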
kleanthis
by New Contributor III
  • 953 Views
  • 1 reply
  • 0 kudos

Resolved! dbutils.fs.cp() fails in Runtime 16.3 Beta when using abfss://

Hello, I am not sure if this is the right place to post this; however, I am reporting what seems to me to be a breaking issue with the 16.3 Beta Runtime when performing dbutils.fs.cp() operations between abfss:// URIs. This is not a permissions issue — let's get that o...

Latest Reply
kleanthis
New Contributor III
  • 0 kudos

To close the loop: the ClassCastException has been resolved in 16.3.

  • 0 kudos
JangaReddy
by New Contributor
  • 909 Views
  • 1 reply
  • 0 kudos

Serverless Access

Hi Team, can you help us with how to restrict serverless access to only specific users/groups (through the workspace admin or account admin)? Regards, Phani

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

You cannot at the moment. I suppose that is something that is coming soon (there have been many complaints about it). It is only possible to toggle serverless compute on/off.

  • 0 kudos
lezwon
by Contributor
  • 1544 Views
  • 2 replies
  • 1 kudos

Resolved! At least 1 "file_arrival" blocks are required.

Hi folks, I'm trying to set up a Databricks Asset Bundle for a job to load some product data into Databricks. This job was created in Databricks and loads the data from a location hardcoded into the notebook (for now). It is supposed to run every 3 h...

Latest Reply
Tharani
New Contributor III
  • 1 kudos

I think that since it is a scheduled job, you have to explicitly specify a cron-based schedule instead of using file_arrival in the trigger section of the YAML file.

  • 1 kudos
1 More Replies
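For a job that should run every 3 hours, the bundle would declare a cron schedule rather than a file_arrival trigger. A minimal databricks.yml sketch; the job and resource names are invented for illustration, and the exact schema should be checked against the Databricks Asset Bundles documentation:

```yaml
# databricks.yml fragment (names are placeholders)
resources:
  jobs:
    load_product_data:
      name: load-product-data
      # Cron schedule (every 3 hours) instead of a trigger/file_arrival block
      schedule:
        quartz_cron_expression: "0 0 0/3 * * ?"
        timezone_id: "UTC"
```

A trigger block with file_arrival would instead watch a storage location, which is why the validator demands at least one file_arrival entry once trigger is present.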
Vittorio
by New Contributor II
  • 1360 Views
  • 1 reply
  • 1 kudos

Month-on-month growth with pivot

I need to create pivot tables with revenue/cost data presented monthly, and I need to show month-on-month growth. It seems like mission impossible with Dashboards on a SQL warehouse, despite being quite obviously a very typical task. Pivot tabl...

Latest Reply
Brahmareddy
Esteemed Contributor
  • 1 kudos

Hi Vittorio, how are you doing today? As per my understanding, you're absolutely right: creating a proper pivot table with dynamic month-over-month (MoM) growth in Databricks SQL Dashboards is surprisingly tricky for such a common use case. The built...

  • 1 kudos
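For reference, the month-over-month arithmetic itself is simple; the dashboard pivot is the hard part. A standalone sketch of the calculation, with the function name and sample data invented for illustration:

```python
# Sketch: month-over-month (MoM) growth over a chronological
# series of (month, revenue) pairs. Growth is the fractional
# change versus the previous month; None for the first month
# (or when the previous value is zero, to avoid division by zero).

def mom_growth(monthly_revenue):
    """Return [(month, revenue, growth), ...]."""
    rows = []
    prev = None
    for month, revenue in monthly_revenue:
        growth = None if prev in (None, 0) else (revenue - prev) / prev
        rows.append((month, revenue, growth))
        prev = revenue
    return rows
```

In SQL this is typically the LAG window function partitioned by the pivot dimension and ordered by month, which is the piece Databricks SQL Dashboards make awkward to combine with a pivot.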
standup1
by Contributor
  • 4059 Views
  • 3 replies
  • 1 kudos

Recover a deleted DLT pipeline

Hello, does anyone know how to recover a deleted DLT pipeline, or at least recover deleted tables that were managed by the DLT pipeline? We had a pipeline that stopped working and kept throwing all kinds of errors, so we decided to create a new one and de...

Latest Reply
Nishair05
New Contributor II
  • 1 kudos

Found out a way to recover the tables. But it seems like we need to recover the pipeline as well. Any idea on how to recover the pipeline?

  • 1 kudos
2 More Replies
Phani1
by Databricks MVP
  • 3408 Views
  • 1 reply
  • 0 kudos

Dashboard deployment

Hi Team, how can we deploy a dashboard from one Databricks account to another Databricks/client account without revealing the underlying notebook code? Regards, Phani

Latest Reply
Advika
Community Manager
  • 0 kudos

Hello @Phani1! Could you confirm whether you're referring to an AI/BI Dashboard or a Notebook Dashboard?

  • 0 kudos
balu_9309
by New Contributor II
  • 1337 Views
  • 3 replies
  • 0 kudos

databricks job runs connect with powerbi

Hi, I have Databricks job runs. How can I connect them to the Power BI app, or save those runs' output to Blob storage or a Delta table?

Latest Reply
chexa_Wee
New Contributor III
  • 0 kudos

You can connect Databricks to Power BI using the "Get Data" option. To do this, you need to provide the necessary cluster details and then connect to the Delta tables in Databricks. This allows Power BI to access and analyze data stored in Databricks...

  • 0 kudos
2 More Replies
RolandCVaillant
by New Contributor II
  • 3951 Views
  • 1 reply
  • 2 kudos

Databricks notebook dashboard export

After the latest Databricks update, my team can no longer download internal notebook dashboards in the dashboard view as .html files. When downloading, the entire code is always exported as an .html file. Is there a way to export just the notebook da...

Latest Reply
Advika
Community Manager
  • 2 kudos

Hello @RolandCVaillant! For AI/BI dashboards, you can download the rendered view as a PDF after publishing the dashboard. However, there isn’t a direct way for notebook dashboards. Instead, you can export the notebook by selecting File > Export in th...

  • 2 kudos
carlos_tasayco
by Contributor
  • 1877 Views
  • 1 reply
  • 0 kudos

Databricks app share with people out of your organization

Yes, that is it: can we share our app with people who are not from our organization? https://www.databricks.com/blog/introducing-databricks-apps

Latest Reply
Shua42
Databricks Employee
  • 0 kudos

Currently, apps can only be accessed by users in the workspace where they are deployed. One workaround, depending on the number of external users you want to grant access to, would be the following: create a new workspace, add external users to...

  • 0 kudos
shubham_meshram
by New Contributor II
  • 1455 Views
  • 2 replies
  • 2 kudos

Databrick Dashboards Maps

I am currently unable to use the MAPS (Choropleth) feature in the Databricks dashboard; however, I am able to use it in the legacy dashboard. I would like my users to use a single dashboard that has all the combined features rather than exposing them to 2 differ...

Latest Reply
shubham_meshram
New Contributor II
  • 2 kudos

Thanks for your input, @Brahmareddy. I just met some folks from Databricks at a booth in Missouri, and they confirmed end of April would be the planned timeline for the maps feature release.

  • 2 kudos
1 More Replies
jeremy98
by Honored Contributor
  • 3406 Views
  • 1 reply
  • 0 kudos

Where is the wheel package saved?

Hi community, we have deployed the wheel package internally in our bundle repository:

artifacts:
  rnc_lib:
    type: whl
    build: poetry build
    path: .

# For passing the wheel package to the workspace
sync:
  include:
    - ./dist/*.whl

The problem is t...

Latest Reply
Renu_
Valued Contributor II
  • 0 kudos

Hi @jeremy98, you can upload the wheel to a shared workspace location and configure it for cluster-level installation by attaching it as a library. Or you can automate the process by adding the wheel to the libraries section of your databricks.yml...

  • 0 kudos
naineel
by New Contributor
  • 675 Views
  • 1 reply
  • 0 kudos

Scheduling a Complete Python Project in Databricks

Hi everyone, I have a simple Python project with the following structure:

root/
├── src/
│   ├── package_name/
│   │   ├── __init__.py
│   │   ├── main.py
│   │   ├── submodules1/
│   │   │   ├── __init__.py
│   │   │   ├── base1.py
│   ...

Latest Reply
ashraf1395
Honored Contributor
  • 0 kudos

Hi there @naineel, one approach can be to convert your project into a whl file, then create a Python wheel task for it and schedule it: https://docs.databricks.com/aws/en/jobs/python-wheel

  • 0 kudos
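The approach suggested above (build a wheel, then schedule it as a Python wheel task) looks roughly like this in a bundle's databricks.yml. The job name, package name, entry point, and cron expression are placeholders; verify the schema against the python-wheel docs linked in the reply:

```yaml
# databricks.yml fragment (all names are placeholders)
resources:
  jobs:
    my_project_job:
      name: my-project-job
      tasks:
        - task_key: run_main
          python_wheel_task:
            package_name: package_name   # your wheel's distribution name
            entry_point: main            # console-script entry point in the wheel
          libraries:
            - whl: ./dist/*.whl
      schedule:
        quartz_cron_expression: "0 0 6 * * ?"  # e.g. daily at 06:00
        timezone_id: "UTC"
```

Note that entry_point must refer to a console-script entry point declared in the project's packaging metadata (e.g. in pyproject.toml), not to a bare .py file.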
Subhasis
by New Contributor III
  • 907 Views
  • 2 replies
  • 0 kudos

Unity catalog codebase

How can I get the code base of Unity Catalog?

Latest Reply
Subhasis
New Contributor III
  • 0 kudos

Here I am referring to the .dbc file which Databricks used to provide earlier for practicing queries. So, is it possible to get those queries?

  • 0 kudos
1 More Replies
gchandra
by Databricks Employee
  • 3450 Views
  • 5 replies
  • 3 kudos

Resolved! Databricks Community Edition - DBFS Alternative Solutions

Option 1: Mount an AWS S3 bucket

access_key = ""
secret_key = ""
encoded_secret_key = secret_key.replace("/", "%2F")
aws_bucket_name = "yourawsbucketname/"
mount_name = "youraliasmountname"
# dbutils.fs.unmount(f"/mnt/{mount_name}")
dbutils.fs.mount(f"s3a:/...

Latest Reply
DanT
New Contributor II
  • 3 kudos

It seems to have been removed again? I can't see the options.

  • 3 kudos
4 More Replies