Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Phani1
by Valued Contributor II
  • 2238 Views
  • 0 replies
  • 1 kudos

cost finding and optimization

Hi Team, Could you please suggest the best way to track the cost of Databricks objects/components? Could you please share any best practices for optimizing costs and conducting detailed cost analysis? Regards, Phanindra

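Not an answer from the thread, but a common starting point for this kind of cost tracking is the billing system tables. A minimal sketch, assuming system tables (system.billing.usage) are enabled for the account:

```python
# Minimal sketch: aggregate DBU usage by day and SKU from the billing system table.
# Assumes system tables are enabled; joining system.billing.list_prices would add dollar amounts.
usage = spark.sql("""
    SELECT usage_date,
           sku_name,
           SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE usage_date >= date_sub(current_date(), 30)
    GROUP BY usage_date, sku_name
    ORDER BY usage_date DESC, dbus DESC
""")
display(usage)
```

Tagging clusters and jobs (surfaced in the custom_tags column of the same table) makes it possible to attribute usage to specific teams or components.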
Phani1
by Valued Contributor II
  • 2075 Views
  • 1 replies
  • 0 kudos

Capture changes at the object level in Databricks

Could you please suggest how to capture changes at the object level in Databricks, such as changes to notebooks, table DDL, view DDL, function DDL, workflows, etc.? We would like to build a dashboard of these changes.

Latest Reply
Ayushi_Suthar
Databricks Employee
  • 0 kudos

Hi @Phani1, good day! Could you kindly clarify your question about capturing changes? What type of change do you want to capture? If you want to see what modifications the user made to the notebooks, tables, and workflows, you can check the audi...

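The reply above points toward the audit logs; a hedged sketch of that approach using the system.access.audit system table (assuming system tables are enabled) could look like this:

```python
# Hedged sketch: list recent user actions (notebook edits, table changes, job updates, ...)
# from the audit-log system table. Assumes system tables are enabled in the account.
audit = spark.sql("""
    SELECT event_date,
           user_identity.email AS user,
           service_name,
           action_name
    FROM system.access.audit
    WHERE event_date >= date_sub(current_date(), 7)
    ORDER BY event_time DESC
""")
display(audit)
```

Filtering on service_name (for example notebook, jobs, or unityCatalog) narrows the result to the object types you want on the dashboard.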
Ruchita
by New Contributor
  • 1698 Views
  • 1 replies
  • 0 kudos

Getting internal server error while creating a new query definition

Hi, I am trying to create a query definition using the API '/api/2.0/preview/sql/queries' in Postman but am getting an internal server error. Below is a screenshot of the request. Let me know if I am doing anything wrong here.

Latest Reply
UplightDrew
New Contributor II
  • 0 kudos

I'm interested to know how this error was resolved. I'm getting an "Internal Server Error" returned when trying to create queries with version 1.36.1 of the Databricks Terraform Provider. The error provides no other information.

cesarc
by New Contributor II
  • 7519 Views
  • 2 replies
  • 0 kudos

Parallel jobs with individual contexts

I was wondering if someone could help us with implementation here. Our current program will spin up 5 jobs through the Databricks API using the same Databricks cluster, but each one needs its own Spark context (specifically, each one will connect to ...

Latest Reply
feiyun0112
Honored Contributor
  • 0 kudos

You can set up buckets with different credentials, endpoints, and so on: https://docs.databricks.com/en/connect/storage/amazon-s3.html#per-bucket-configuration

1 More Replies
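For the original question of giving each of the five jobs its own storage credentials on a shared cluster, the per-bucket configuration linked above can also be set from a notebook. A minimal sketch, where the bucket name and secret scope/keys are hypothetical:

```python
# Hedged sketch: per-bucket S3 credentials so jobs on the same cluster can use different buckets.
# The docs also show putting these keys in the cluster's Spark config as spark.hadoop.* entries.
access_key = dbutils.secrets.get(scope="team-a-creds", key="aws-access-key")   # hypothetical scope/keys
secret_key = dbutils.secrets.get(scope="team-a-creds", key="aws-secret-key")

hadoop_conf = spark.sparkContext._jsc.hadoopConfiguration()
hadoop_conf.set("fs.s3a.bucket.team-a-bucket.access.key", access_key)          # hypothetical bucket
hadoop_conf.set("fs.s3a.bucket.team-a-bucket.secret.key", secret_key)

df = spark.read.parquet("s3a://team-a-bucket/raw/events/")
display(df.limit(5))
```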
wellington
by New Contributor III
  • 2042 Views
  • 1 replies
  • 0 kudos

Log notebook activities

Hi friends; I'm working on a project where we are 4 programmers. We are working in a single environment, using only the "Workspaces" folder. Each of us has their own user, which is managed by Azure AD. We had a peak in consumption on the 5th of Feb, so I can see ...

Latest Reply
wellington
New Contributor III
  • 0 kudos

Hi @Retired_mod, thanks for your quick answer. Is there no other way to monitor notebook runs? I ask this because adding tags to the cluster and workspace does not solve my problem, considering that everyone uses the same cluster and the same workspa...

JasonAckman
by New Contributor
  • 7975 Views
  • 0 replies
  • 0 kudos

Data Engineer – Databricks - Remote

Data Engineer – Databricks - Remote. Apply here: Job Application for Data Engineer – Databricks at Jenzabar (greenhouse.io). Jenzabar website: Higher Education Software Solutions - Jenzabar. For over four decades, the higher education experts at Jenzabar h...

Sujitha
by Databricks Employee
  • 11140 Views
  • 0 replies
  • 1 kudos

Calling all innovators and visionaries! The 2024 Data Team Awards are open for nominations

Each year, we celebrate the amazing customers that rely on Databricks to innovate and transform their organizations — and the world — with the power of data and AI. The nomination form is now open to submit nominations. Nominations will close on Marc...

RobsonNLPT
by Contributor III
  • 3857 Views
  • 4 replies
  • 0 kudos

Databricks XML - Bypassing rootTag and rowTag

I see the current conversion of a DataFrame to XML needs to be improved. My DataFrame schema is a perfectly nested schema based on structs, but when I create an XML file I have the following issues: 1) I can't add elements to the root; 2) rootTag and rowTag are required. In ...

Latest Reply
sandip_a
Databricks Employee
  • 0 kudos

Here is one of the ways to use the struct field name as rowTag: import org.apache.spark.sql.types._ val schema = new StructType().add("Record", new StructType().add("age", IntegerType).add("name", StringType)) val data = Seq(Row(Row(18, "John ...

3 More Replies
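The reply above is cut off; a hedged completion of the same idea (the struct field name "Record" reused as the rowTag when writing XML), with illustrative values and a hypothetical output path:

```python
# Hedged sketch: build the DataFrame from the reply's schema, then write XML so the
# struct field name ("Record") becomes the row element. Assumes XML support in the runtime
# (native XML in recent Databricks Runtime, or the spark-xml library).
from pyspark.sql import Row
from pyspark.sql.types import StructType, StructField, IntegerType, StringType

schema = StructType([
    StructField("Record", StructType([
        StructField("age", IntegerType()),
        StructField("name", StringType()),
    ]))
])

df = spark.createDataFrame([Row(Record=Row(age=18, name="John"))], schema)   # illustrative data

(df.select("Record.*")               # flatten so the struct's fields sit directly under each row element
   .write.format("xml")
   .option("rowTag", "Record")       # reuse the struct field name as the XML row element
   .option("rootTag", "Records")     # hypothetical root element name
   .mode("overwrite")
   .save("/tmp/records_xml"))        # hypothetical path
```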
Israel_H
by New Contributor III
  • 3248 Views
  • 3 replies
  • 5 kudos

The risks of code execution by default on widget change

Speaking from my experience, the default behavior of widgets triggering code execution upon value change poses risks that outweigh the convenience in certain scenarios. While this feature may seem advantageous in some cases, it can lead to unintended con...

Latest Reply
Kayla
Valued Contributor II
  • 5 kudos

I definitely have to agree with the original point: if you have a notebook that you import and you touch any widget value, you're running code, most likely accidentally. I'd love to see a workspace or user-level option where you can change the default...

2 More Replies
RobsonNLPT
by Contributor III
  • 1959 Views
  • 2 replies
  • 1 kudos

databricks spark XML Writer

Hi. I'm trying to generate XML output based on my nested DataFrame. Everything is OK except that I don't know how to add elements to the rootTag. I can add elements under the rowTag but not in the rootTag. Same problem when adding attributes to the root: <books version = "...

Latest Reply
Ayushi_Suthar
Databricks Employee
  • 1 kudos

Hi @RobsonNLPT, thanks for bringing up your concerns, always happy to help. Can you please refer to the document below on reading and writing XML files? https://docs.databricks.com/en/query/formats/xml.html Please let me know if this helps and leave a...

1 More Replies
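As a complement to the doc link above, here is a minimal sketch of writing a nested DataFrame as XML with explicit root and row tags; the column names and output path are hypothetical:

```python
# Minimal sketch: write a small nested DataFrame as XML with the rootTag/rowTag options.
from pyspark.sql import functions as F

books = spark.createDataFrame(
    [(1, "Spark in Action", "T. Author")],          # illustrative row
    ["id", "title", "author"],
)

(books
   .select("id", F.struct("title", "author").alias("details"))  # struct column becomes a nested element
   .write.format("xml")
   .option("rootTag", "books")     # document root element
   .option("rowTag", "book")       # one <book> element per row
   .mode("overwrite")
   .save("/tmp/books_xml"))        # hypothetical path
```

Whether attributes can be attached to the root element itself, as the original question asks, depends on the XML writer options described in the linked docs.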
Miasu
by New Contributor II
  • 2729 Views
  • 1 replies
  • 0 kudos

FileAlreadyExistsException error while analyzing table in Notebook

Databricks experts, I'm new to Databricks and encountered an issue with the ANALYZE TABLE command in a notebook. I created two tables, nyc_taxi and nyc_taxi2, from one CSV file. When executing the following command in the notebook, analyze table nyc_taxi2...

Dlt
by New Contributor III
  • 10300 Views
  • 4 replies
  • 0 kudos

Running an exe file in databricks

Hello, I have an executable file which I want to host and run from Databricks. Is this possible in Databricks using DBFS? If not, what are the other ways to do it in Databricks?

Latest Reply
BR_DatabricksAI
Contributor III
  • 0 kudos

Hello, I don't have much information on what kind of executable you would like to run in Databricks; however, I can think of two solutions. Solution 1: Deploy your code as an image in Azure Container Registry and use the endpoint in Databricks. Sol...

3 More Replies
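Beyond the solutions in the reply above, a binary uploaded to DBFS can also be copied to the driver's local disk and launched from a notebook. A hedged sketch with hypothetical paths and arguments:

```python
# Hedged sketch: copy an uploaded executable from DBFS to local disk on the driver,
# mark it executable, and run it. The paths and arguments are hypothetical.
import os
import stat
import subprocess

src = "dbfs:/FileStore/tools/my_tool"    # hypothetical upload location
dest = "/tmp/my_tool"                    # local path on the driver

dbutils.fs.cp(src, f"file:{dest}")                       # DBFS -> driver filesystem
os.chmod(dest, os.stat(dest).st_mode | stat.S_IEXEC)     # make it executable

result = subprocess.run([dest, "--help"], capture_output=True, text=True)
print(result.returncode)
print(result.stdout or result.stderr)
```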
Wycliff
by New Contributor II
  • 3129 Views
  • 1 replies
  • 0 kudos

JWT Encoding error while using Azure secret key

My secret value in Azure Key Vault is like below: private_key="""-----BEGIN RSA PRIVATE KEY-----********-----END RSA PRIVATE KEY-----""". Running this command in a Databricks notebook: jwt.encode(claim_set, private_key, algorithm='RS256'). While using the ab...

Latest Reply
Wycliff
New Contributor II
  • 0 kudos

Thanks much for your troubleshooting methods. I validated the secret scopes and accessing the secrets; these look fine. Key format: I feel the problem is with the key format only. As of now I'm awaiting Azure subscription access, but I printed the secret value...

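One common cause of this symptom, offered as a hedged guess rather than the thread's resolution: Key Vault-backed secrets often store the PEM key with its newlines escaped as literal "\n", which makes jwt.encode fail. A sketch with a hypothetical scope, key name, and claim set:

```python
# Hedged sketch: fetch the PEM key from a secret scope, restore real newlines if they were
# escaped, then sign the JWT. Scope, key name, and claims are hypothetical.
import jwt  # PyJWT

private_key = dbutils.secrets.get(scope="my-keyvault-scope", key="private-key")
private_key = private_key.replace("\\n", "\n")   # undo escaped newlines, if present

claim_set = {"iss": "my-app", "sub": "user@example.com"}
token = jwt.encode(claim_set, private_key, algorithm="RS256")
print(token[:20], "...")
```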
sanjay
by Valued Contributor II
  • 9671 Views
  • 3 replies
  • 1 kudos

stop autoloader with continuous trigger programatically

Hi, I am running Auto Loader with a continuous trigger. How can I stop this trigger at a specific time, but only if no data is pending and the current batch has completed? How can I check how many records are pending in the queue and the current state? Regards, Sanjay

Latest Reply
RamonaMraz
New Contributor II
  • 1 kudos

Hello, I am new here. Can I ask a question?

2 More Replies
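For the original question, one way to approach this (a minimal sketch, not taken from the thread) is to poll the StreamingQuery handle returned by writeStream.start() and stop it only when no micro-batch is running and the source reports no new data:

```python
# Minimal sketch: stop a continuously triggered Auto Loader stream once it is idle.
# `query` is assumed to be the StreamingQuery returned by writeStream.start().
import time

def stop_when_idle(query, check_interval_s=60):
    while True:
        status = query.status                       # {'message', 'isDataAvailable', 'isTriggerActive'}
        progress = query.lastProgress or {}
        last_batch_rows = progress.get("numInputRows", 0)

        # isTriggerActive False -> no micro-batch currently running
        # isDataAvailable False -> the source has nothing new queued
        if not status["isTriggerActive"] and not status["isDataAvailable"] and last_batch_rows == 0:
            query.stop()
            break
        time.sleep(check_interval_s)
```

query.recentProgress (or the numInputRows field shown above) is also where to look for how much data recent triggers actually processed.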
