Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

Iguinrj11
by New Contributor II
  • 109 Views
  • 1 reply
  • 0 kudos

Databricks x Power BI Report Server

I connected two .pbix files to the local server. In the first, I used Import connectivity, and in the second, Direct Query connectivity. However, I encountered the following problems: Import connection: The data is viewed successfully, but it is not ...

Iguinrj11_0-1737741174035.png Iguinrj11_1-1737741227201.png
Latest Reply
peter598philip
New Contributor II
  • 0 kudos

@Iguinrj11 wrote: I connected two .pbix files to the local server. In the first, I used Import connectivity, and in the second, Direct Query connectivity. However, I encountered the following problems: Import connection: The data is viewed successfull...

Sase
by New Contributor II
  • 272 Views
  • 5 replies
  • 0 kudos

Building a Custom Usage Dashboard using APIs for Job-Level Cost Insights

Since Databricks does not provide individual cost breakdowns for components like Jobs or Compute, we aim to create a custom usage dashboard leveraging APIs to display the cost of each job run across Databricks, Azure Data Factory (ADF), or serverless...

Community Platform Discussions
apis
Cost Analysis
jobs
Latest Reply
Isi
New Contributor III
  • 0 kudos

Hey, yes. I'm not an Azure expert, but the Databricks REST API can help you extract usage data for serverless resources, allowing you to integrate this information into custom dashboards or external tools like Grafana. On the Azure side, costs related to wil...

4 More Replies
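To make the reply above concrete: once per-workload usage rows are in hand (the field names below mirror Databricks' `system.billing.usage` system table, but the rows and DBU prices here are made-up sample data, not real API output), per-job cost can be aggregated with a small helper like this:

```python
from collections import defaultdict

def cost_per_job(usage_rows, dbu_prices):
    """Sum DBU usage per job and convert to dollars via a per-SKU price."""
    totals = defaultdict(float)
    for row in usage_rows:
        # Job runs carry a job_id in usage_metadata; interactive usage does not.
        job_id = row.get("usage_metadata", {}).get("job_id")
        if job_id is not None:
            totals[job_id] += row["usage_quantity"] * dbu_prices.get(row["sku_name"], 0.0)
    return dict(totals)

# Made-up sample rows; in practice these would come from a usage query or export.
sample = [
    {"usage_metadata": {"job_id": "101"}, "usage_quantity": 4.0, "sku_name": "JOBS_COMPUTE"},
    {"usage_metadata": {"job_id": "101"}, "usage_quantity": 2.0, "sku_name": "JOBS_COMPUTE"},
    {"usage_metadata": {"job_id": "202"}, "usage_quantity": 1.5, "sku_name": "SERVERLESS_JOBS"},
    {"usage_metadata": {}, "usage_quantity": 9.9, "sku_name": "ALL_PURPOSE"},  # no job_id
]
prices = {"JOBS_COMPUTE": 0.15, "SERVERLESS_JOBS": 0.35}  # illustrative $/DBU
print(cost_per_job(sample, prices))
```

The same aggregation would feed a dashboard tile per job; joining in ADF pipeline costs would need a second source on the Azure side.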
Daan
by New Contributor III
  • 226 Views
  • 4 replies
  • 1 kudos

Resolved! Permission denied during write

Hey everyone, I have a pipeline that fetches data from S3 and stores it under the Databricks .tmp/ folder. The pipeline is always able to write around 200 000 files before I get a Permission Denied error. This happens in the following code block: os....

Latest Reply
Daan
New Contributor III
  • 1 kudos

Thanks for your reply, Walter! The filenames are already unique, retries produce the same result, and I have the necessary permissions, as I was able to write the other 200 000 files (with the same program, which runs continuously). It does make sense...

3 More Replies
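The OP notes that plain retries reproduced the failure in this case, but for genuinely transient permission errors on S3/DBFS a bounded retry with exponential backoff is a common first line of defense. A minimal sketch; `flaky_write` below only simulates a transient failure and stands in for whatever function does the real write:

```python
import time

def write_with_retry(write_fn, path, data, attempts=3, base_delay=0.01):
    """Call write_fn(path, data), retrying on PermissionError/OSError
    with exponential backoff; re-raise after the last attempt."""
    for attempt in range(attempts):
        try:
            return write_fn(path, data)
        except (PermissionError, OSError):
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Simulated flaky writer: fails twice, then succeeds.
calls = {"n": 0}
def flaky_write(path, data):
    calls["n"] += 1
    if calls["n"] < 3:
        raise PermissionError(f"simulated transient failure for {path}")
    return len(data)

written = write_with_retry(flaky_write, ".tmp/example.txt", "payload")
print(written)  # prints 7
```

If retries consistently fail at the same file count, as here, the cause is more likely a quota or path-level policy than a transient error.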
Ashishkumar6921
by New Contributor III
  • 12838 Views
  • 12 replies
  • 0 kudos

Resolved! databricks data engineer associate exam

Hello Team, I had a pathetic experience while attempting my 1st Databricks certification. I was taking the exam and, abruptly, the proctor asked me to show my desk. I showed everything, every corner of my bed. It was neat and clean with no suspiciou...

Latest Reply
Cert-Team
Databricks Employee
  • 0 kudos

Hi @gokul2 the badge was issued on Dec 2. We just resent the email. Please check your spam. If you continue to have issues, please file a ticket with our support team: https://help.databricks.com/s/contact-us?ReqType=training

11 More Replies
mrstevegross
by New Contributor III
  • 253 Views
  • 3 replies
  • 0 kudos

Resolved! How to grant custom container AWS credentials for reading init script?

I'm using a custom container *and* init scripts. At runtime, I get this error: Cluster '...' was terminated. Reason: INIT_SCRIPT_FAILURE (CLIENT_ERROR). Parameters: instance_id:i-0440ddd3a2d5cce79, databricks_error_message:Cluster scoped init script...

Latest Reply
mrstevegross
New Contributor III
  • 0 kudos

Followup: I got the AWS creds working by amending our AWS role to permit read/write access to our S3 bucket. Woohoo!

2 More Replies
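The fix described in the reply, amending the cluster's AWS role so it can fetch the init script from S3, typically boils down to an IAM statement like the one sketched below. The bucket name is a placeholder, and a real policy may need to be scoped more tightly:

```python
import json

# Minimal S3 read statement for an init-script bucket (placeholder names).
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::my-init-scripts-bucket",
                "arn:aws:s3:::my-init-scripts-bucket/*",
            ],
        }
    ],
}
print(json.dumps(policy, indent=2))
```

The OP's resolution also added write access, which only matters if the job writes back to the same bucket.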
mrstevegross
by New Contributor III
  • 225 Views
  • 3 replies
  • 0 kudos

Resolved! Format when specifying docker_image url?

I am providing a custom Docker image to my Databricks/Spark job. I've created the image and uploaded it to our private ECR registry (the URL is `472542229217.dkr.ecr.us-west-2.amazonaws.com/tectonai/mrstevegross-testing:latest`). Based on the docs (h...

Latest Reply
mrstevegross
New Contributor III
  • 0 kudos

Thanks, that's pretty much what I did; a lot of terraform configuration to get the AWS account set up properly, and now I'm able to tell DBR to load the container. (FWIW, I'm encountering *new* access issues; I started a thread here (https://communit...

2 More Replies
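For reference, the `docker_image` block in the cluster spec takes the bare registry URL (the ECR URL quoted in the thread) plus an optional `basic_auth`; the rest of this spec is an illustrative assumption, not the OP's actual configuration:

```python
import json

cluster_spec = {
    "cluster_name": "custom-container-demo",  # illustrative
    "spark_version": "15.4.x-scala2.12",      # assumed runtime version
    "node_type_id": "i3.xlarge",              # assumed node type
    "num_workers": 1,
    "docker_image": {
        # Bare registry host/repo:tag, no https:// scheme.
        "url": "472542229217.dkr.ecr.us-west-2.amazonaws.com/tectonai/mrstevegross-testing:latest",
        # For ECR the instance profile can authenticate; for other private
        # registries supply credentials instead:
        # "basic_auth": {"username": "...", "password": "..."},
    },
}
print(json.dumps(cluster_spec["docker_image"], indent=2))
```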
Sudheer2
by New Contributor III
  • 99 Views
  • 1 reply
  • 1 kudos

Unable to Register Models After Uploading Artifacts to DBFS in Databricks

Hi everyone, I'm currently working on a project where I'm migrating models and artifacts from a source Databricks workspace to a target one. I've written a script to upload the model artifacts from my local system to DBFS in the target workspace (usi...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

Hi @Sudheer2, Does it give you any error while trying to register the model?

Greg_c
by New Contributor II
  • 111 Views
  • 2 replies
  • 0 kudos

Scheduling multiple jobs (workflows) in DABs

Hello, I'm wondering how I can schedule multiple jobs (workflows). I'd like to do something like this, but at the workflow level: tasks: - task_key: task_1 sql_task: warehouse_id: ${var.warehouse_id} paramet...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @Greg_c, you can try this structure. In the main databricks.yml: # databricks.yml bundle: name: master-bundle include: - resources/*.yml # Other bundle configurations... In the resources directory, create a YAML for each job: # resources/job1.yml re...

1 More Replies
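Fleshing out the reply above, here is a sketch of how the split might look. The job name and cron expression are illustrative assumptions; the `schedule` block on each job resource is what gives every workflow its own trigger:

```yaml
# databricks.yml
bundle:
  name: master-bundle

include:
  - resources/*.yml

# resources/job1.yml (one file per job)
resources:
  jobs:
    job_1:
      name: job-1
      schedule:
        quartz_cron_expression: "0 0 6 * * ?"  # daily at 06:00
        timezone_id: UTC
      tasks:
        - task_key: task_1
          sql_task:
            warehouse_id: ${var.warehouse_id}
            # ...plus the query/parameters from the original task
```

Each `resources/*.yml` file carries its own `schedule`, so scheduling is defined per workflow rather than per task.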
malhm
by New Contributor II
  • 221 Views
  • 4 replies
  • 1 kudos

ALIAS Not accepted 42601

I am unable to run the following query, generated from my backend, on the Databricks side. Query: SELECT "A".`cut` AS "Cut", "A".`color` AS "Color", "A".`carat` AS "Carat", "A".`clarity` AS "Clarity" FROM databricksconnect.default.diamonds "A" Error logs...

Latest Reply
filipniziol
Contributor III
  • 1 kudos

Hi @malhm, double quotes are not supported for column aliases. In Databricks SQL/Spark SQL, backticks are used instead of double quotes (unlike in PostgreSQL). Check the docs: https://spark.apache.org/docs/3.5.1/sql-ref-identifier.html

3 More Replies
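To make the fix concrete, here is a backtick-quoted version of the query from the thread (same table and columns; by default Spark SQL treats double-quoted text as a string literal, so identifiers need backticks):

```sql
SELECT
  `A`.`cut`     AS `Cut`,
  `A`.`color`   AS `Color`,
  `A`.`carat`   AS `Carat`,
  `A`.`clarity` AS `Clarity`
FROM databricksconnect.default.diamonds AS `A`
```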
Iguinrj11
by New Contributor II
  • 258 Views
  • 3 replies
  • 0 kudos

Resolved! Databricks x Query Folding Power BI

I ran a native Power BI query against Databricks in Import mode, and query folding was not enabled. No query folding?

Iguinrj11_0-1737819675982.png
Latest Reply
filipniziol
Contributor III
  • 0 kudos

Hi @Iguinrj11, the trick is to use Databricks.Query instead of Databricks.Catalogs. Check this article and let us know if it helps: https://www.linkedin.com/pulse/query-folding-azure-databricks-tushar-desai/

2 More Replies
tomvogel01
by New Contributor II
  • 219 Views
  • 2 replies
  • 0 kudos

Dynamic Bloom Filters for Inner Joins

I have a question regarding combining the use of Bloom filters with Liquid Clustering to further reduce the data read during a join/merge on top of dynamic file pruning. Testing both combined worked extremely well together for point queries. However ...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

We do not recommend Bloom filter indexes on Delta tables, as they have to be manually maintained. If you use Photon, please try predictive I/O with Liquid Clustering.

1 More Replies
MadelynM
by Databricks Employee
  • 1272 Views
  • 9 replies
  • 0 kudos

Who's hiring? Latest Job Postings from the Databricks Community!

More than 10,000 organizations worldwide — including Block, Comcast, Condé Nast, Rivian, Shell and over 60% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to take control of their data and put it to work with AI. Use this thre...

Latest Reply
MadelynM
Databricks Employee
  • 0 kudos

Cedar is seeking a seasoned Engineering Director to lead our Data Integration and Platforms team. This role will help us tackle challenges of scale, quality and insight as the company grows its market share and product scope.  Job Title:  Engineering...

8 More Replies
ivvande
by New Contributor II
  • 300 Views
  • 4 replies
  • 0 kudos

Automate run as workflow parameter to default to current user

I am trying to run a workflow within Databricks. I have 2 workflows: workflow 1, which always runs as the service principal, as all data gets accessed and wrangled within this workflow, and workflow 2, which always defaults to the last-run account. I...

ivvande_0-1737706760905.png
Latest Reply
saurabh18cs
Valued Contributor III
  • 0 kudos

Hi, how are you expecting to achieve this? Do you want users who manually trigger this workflow to first update the run_as, or do you want this to happen programmatically?

3 More Replies
subhadeep
by New Contributor II
  • 244 Views
  • 2 replies
  • 0 kudos

Create CSV and upload to Azure

Can someone write a SQL query which queries a table (e.g., SELECT * FROM stages.benefit), creates a CSV, and uploads it to Azure?

Latest Reply
hari-prasad
Valued Contributor II
  • 0 kudos

Hi @subhadeep, you can achieve this in SQL similarly to how you write a DataFrame to a table or blob path: create an external table pointing to the blob path or a mounted blob path. Note that this table does not support ACID transactions and ...

1 More Replies
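A sketch of the external-table approach the reply describes, with a hypothetical ABFSS path; it assumes the executing principal has write access to that location:

```sql
-- CTAS into an external CSV location; the storage account, container,
-- and table name below are placeholders.
CREATE TABLE stages.benefit_export
USING CSV
OPTIONS (header 'true')
LOCATION 'abfss://exports@mystorageaccount.dfs.core.windows.net/benefit/'
AS SELECT * FROM stages.benefit;
```

Databricks writes the result as one or more part-*.csv files under that path, and, as the reply notes, such a table gives no ACID guarantees.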
idanyow
by New Contributor III
  • 653 Views
  • 10 replies
  • 2 kudos

01_demo_setup error

Hello, I was following "Demo: Creating and Working with a Delta Table" with a Community Edition user. The first command in the notebook is: %run ./setup/01_demo_setup But I got the following error: Notebook not found: Users/<my-email-was-here..>/s...

Latest Reply
Isi
New Contributor III
  • 2 kudos

Hey! Sad news, guys... if you go to Course Logistics Review you can read: "We are pleased to offer a version of this course that also contains hands-on practice via a Databricks Academy Labs subscription. With a Databricks Academy Labs subscription, you...

9 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group