Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Shahe
by New Contributor
  • 2601 Views
  • 2 replies
  • 0 kudos

Azure Databricks Metrics to Prometheus?

What is the best method to expose Azure Databricks metrics specifically to Prometheus? And is it possible to get the underlying Spark metrics as well? All I can see clearly defined in the documentation is the serving endpoint metrics: https://learn.micro...

Latest Reply
DanielB
New Contributor II
  • 0 kudos

Hello, I don't have Databricks running as a pod in an AKS cluster; it's running on Azure as SaaS. What should I do to export the metrics to Prometheus?
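For the Spark-level metrics in the original question, one possible starting point (an assumption, not something confirmed in this thread) is Spark 3's built-in Prometheus support, enabled through the cluster's Spark config; whether Prometheus can reach the driver UI endpoints on a managed Azure workspace still needs to be verified.

# Hypothetical spark_conf for a Databricks cluster (set in the cluster UI or via
# the Clusters API) enabling Spark's native Prometheus-format endpoints.
spark_conf = {
    # Executor metrics served at <driver UI>/metrics/executors/prometheus
    "spark.ui.prometheus.enabled": "true",
    # Driver metrics exposed through Spark's PrometheusServlet sink
    "spark.metrics.conf.*.sink.prometheusServlet.class":
        "org.apache.spark.metrics.sink.PrometheusServlet",
    "spark.metrics.conf.*.sink.prometheusServlet.path": "/metrics/prometheus",
}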

1 More Replies
david1144
by New Contributor III
  • 467 Views
  • 1 reply
  • 0 kudos

Streaming data with kafka

What is the best way to implement a streaming data flow from Kafka to Databricks (Delta tables)?

Latest Reply
GCosta
New Contributor II
  • 0 kudos

Structured Streaming: https://spark.apache.org/docs/latest/structured-streaming-programming-guide.html
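As a minimal sketch of what that guide describes for Kafka-to-Delta (broker, topic, table, and checkpoint names below are placeholders):

from pyspark.sql.functions import col

# Read the Kafka topic as a streaming DataFrame (placeholder broker and topic)
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker1:9092")
       .option("subscribe", "events_topic")
       .load())

# Kafka keys/values arrive as binary; cast (or parse with from_json) before storing
events = raw.select(col("key").cast("string"), col("value").cast("string"))

# Continuously append into a Delta table; the checkpoint gives exactly-once tracking
(events.writeStream
       .format("delta")
       .option("checkpointLocation", "/tmp/checkpoints/events_topic")
       .toTable("bronze.events_raw"))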

GCosta
by New Contributor II
  • 580 Views
  • 0 replies
  • 0 kudos

How to write data to Confluent Kafka with Schema Registry format in Spark Structured Streaming?

Hi there! I am trying to write batch data to a Kafka topic with Schema Registry in Databricks using PySpark. I serialize the data with PySpark's to_avro function and write it to the topic, but the consumers can't read the schema id. If they do not separ...
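One common explanation (an assumption here, not confirmed in this thread) is that Spark's to_avro emits a plain Avro body, while Confluent consumers expect the wire format of a 0x00 magic byte plus a 4-byte big-endian schema id in front of it; the sketch below prepends that header manually, with the schema id, schema, broker, and topic as placeholders.

import struct
from pyspark.sql.functions import col, concat, lit, struct as sql_struct
from pyspark.sql.avro.functions import to_avro

schema_id = 42                       # placeholder: id returned by Schema Registry
value_schema = """{"type": "record", "name": "Event",
                   "fields": [{"name": "field_a", "type": "string"},
                              {"name": "field_b", "type": "long"}]}"""

# Confluent wire format: magic byte 0x00 + 4-byte big-endian schema id + Avro body
header = struct.pack(">bI", 0, schema_id)

payload = concat(lit(header),
                 to_avro(sql_struct(col("field_a"), col("field_b")), value_schema))

(df.select(payload.alias("value"))
   .write
   .format("kafka")
   .option("kafka.bootstrap.servers", "broker1:9092")
   .option("topic", "events_topic")
   .save())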

Mlaricobar94
by New Contributor
  • 425 Views
  • 0 replies
  • 0 kudos

Collect Spark UI statistics to analyze the performance for several Spark Applications

To identify the reasons for a data process's poor performance, we need to navigate and analyze the metrics in the Spark UI manually... However, replicating those steps for a large group of Spark applications would be very expensive in time... Given thi...
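One way to avoid clicking through the UI (suggested here as an assumption, not something from the thread) is Spark's monitoring REST API, which serves the same stage and task metrics as JSON; the host and port below are placeholders, and on Databricks the driver UI is proxied, so the reachable URL will differ.

import requests

base = "http://driver-host:4040/api/v1"   # placeholder Spark UI address

for app in requests.get(f"{base}/applications").json():
    app_id = app["id"]
    # Per-stage metrics: run time, shuffle read/write, input/output sizes, etc.
    for stage in requests.get(f"{base}/applications/{app_id}/stages").json():
        print(app_id, stage["stageId"], stage["status"], stage["executorRunTime"])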

Serena
by New Contributor
  • 238 Views
  • 0 replies
  • 0 kudos

Data architecture

Check out our platform architecture italgas-from-gas-pipelines-to-data-pipelines-fueling-our-reporting-with-the-latest-innovations-7f00e20ba115?source=social.linkedin 

Gaurav19
by New Contributor III
  • 1760 Views
  • 3 replies
  • 1 kudos

Resolved! Databricks API - list job runs doesn't have 'task run id'

Hi all, I am calling the get job run list API to get all task ids and reference them in the dbt-artifacts view created by a dbt job run. The question is: I can see 'task run id' on screen, but it doesn't come back in the API response. Is there a way to get it? I checked ...

Latest Reply
Gaurav19
New Contributor III
  • 1 kudos

Never mind, I have found task_run_id present in the getrun API https://docs.databricks.com/api/azure/workspace/jobs/getrun. I overlooked it at first as it is buried under the nested JSON structure: tasks[] > run_id. This clarifies and solves my problem!
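For anyone landing here later, a small sketch of that lookup (workspace URL, token, and run id are placeholders):

import requests

host = "https://<workspace-url>"           # placeholder workspace URL
token = "<personal-access-token>"          # placeholder credential
job_run_id = 123456789                     # parent run id from the runs list

resp = requests.get(f"{host}/api/2.1/jobs/runs/get",
                    headers={"Authorization": f"Bearer {token}"},
                    params={"run_id": job_run_id})
resp.raise_for_status()

# Each element of tasks[] carries its own run_id: the task run id shown in the UI
for task in resp.json().get("tasks", []):
    print(task["task_key"], task["run_id"])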

2 More Replies
thiagoawstest
by Contributor
  • 1501 Views
  • 1 reply
  • 0 kudos

Resolved! Unity Catalog mount S3

Hi, I still have some questions. I have Databricks on AWS and I need to mount S3 buckets. According to the documentation, it is recommended to do it through Unity Catalog, but how would I go about reading data from a notebook that would be mount...

Latest Reply
thiagoawstest
Contributor
  • 0 kudos

Coming back to this: I've understood it now, so I'm marking it as resolved.
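For readers with the same question: with Unity Catalog the usual pattern is an external location backed by a storage credential rather than a DBFS mount, after which a notebook reads the S3 path directly; the bucket, credential, and table names below are placeholders, and this is only a sketch of that flow.

# One-time setup by an admin (SQL), granting governed access to the bucket:
spark.sql("""
  CREATE EXTERNAL LOCATION IF NOT EXISTS raw_s3
  URL 's3://my-bucket/raw'
  WITH (STORAGE CREDENTIAL my_credential)
""")

# A notebook can then read the governed path directly, with no dbutils.fs.mount():
df = spark.read.format("parquet").load("s3://my-bucket/raw/events/")
display(df)

# Or expose it as an external table for SQL users:
spark.sql("""
  CREATE TABLE IF NOT EXISTS main.bronze.events
  USING PARQUET
  LOCATION 's3://my-bucket/raw/events/'
""")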

