Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

mbdata
by New Contributor II
  • 31769 Views
  • 7 replies
  • 5 kudos

Resolved! Toggle line comment

I work with Azure Databricks. The shortcut Ctrl + / to toggle a line comment doesn't work on an AZERTY keyboard in Firefox... Are you aware of this issue? Is there another shortcut I can try? Thanks!

Latest Reply
Flo
New Contributor III
  • 5 kudos

'Cmd + Shift + 7' works for me! I'm using an AZERTY keyboard in Chrome on macOS.

6 More Replies
JordanYaker
by Contributor
  • 1117 Views
  • 0 replies
  • 0 kudos

Integration options for Databricks Jobs and DataDog?

I know there is already a Databricks (technically Spark) integration for DataDog. Unfortunately, that integration only covers the cluster execution itself, which means only Cluster Metrics and Spark Jobs and Tasks. I'm looking for somethin...
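
For anyone landing on this unanswered thread: one common workaround is to emit job-level metrics yourself from inside the job. The sketch below is one possible approach, not an official integration; it assumes the datadog PyPI package is installed on the cluster, and the secret scope name, key name, metric name, and tag are all hypothetical.

import time
from datadog import initialize, api

# dbutils is provided by the Databricks notebook runtime.
# "datadog"/"api-key" is a hypothetical secret scope/key holding the DD API key.
initialize(api_key=dbutils.secrets.get(scope="datadog", key="api-key"))

job_start = time.time()
# ... the job's actual work runs here ...

# Report the job's wall-clock duration as a custom metric, tagged by job name.
api.Metric.send(
    metric="databricks.job.duration_seconds",  # hypothetical metric name
    points=[(time.time(), time.time() - job_start)],
    tags=["job:my-etl-job"],  # hypothetical tag
)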

Direo
by Contributor
  • 1537 Views
  • 1 reply
  • 1 kudos

Azure Databricks integration with Datadog

Before running the script that creates an agent on a cluster, you have to provide the SPARK_LOCAL_IP variable. How can I find it? Does it change over time, or is it a constant?

Latest Reply
Debayan
Esteemed Contributor III
  • 1 kudos

Hi, could you please refer to https://www.datadoghq.com/blog/databricks-monitoring-datadog/ and let us know if this helps? FYI, SPARK_LOCAL_IP is an environment variable; see https://spark.apache.org/docs/latest/configuration.html
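
As a quick inspection sketch: since SPARK_LOCAL_IP is an environment variable on the cluster nodes, you can read it from a notebook on the driver. It is assigned per node, so it will typically change when the cluster is terminated and recreated.

import os

# Check what the driver's environment has; an empty result means the
# variable isn't set in this notebook's environment.
print(os.environ.get("SPARK_LOCAL_IP", "not set"))

# The driver's address is also visible in the Spark configuration.
print(spark.sparkContext.getConf().get("spark.driver.host"))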

julie
by New Contributor III
  • 3357 Views
  • 5 replies
  • 3 kudos

Resolved! Scope creation in Databricks or Confluent?

Hello, I am a newbie in this field, trying to access a Confluent Kafka stream in Azure Databricks based on a beginner's video by Databricks. I have a free-trial Databricks cluster right now. When I run the notebook below, it errors out on line 5 o...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 3 kudos

For testing, create it without a secret scope. It will be unsafe, but you can paste secrets as plain strings in the notebook for testing. Here is the code I used for loading data from Confluent: inputDF = (spark .readStream .format("kafka") .option("kafka.b...
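
For readers, here is an expanded, hedged version of that truncated snippet. The bootstrap server, topic name, and credentials are placeholders; in real use, pull the key and secret from a secret scope instead of plain strings.

# Sketch only: reading a Confluent Cloud topic with Structured Streaming.
inputDF = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "pkc-xxxxx.region.cloud:9092")  # placeholder
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option(
        "kafka.sasl.jaas.config",
        # Databricks shades the Kafka classes, hence the "kafkashaded" prefix.
        'kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule required '
        'username="<CONFLUENT_API_KEY>" password="<CONFLUENT_API_SECRET>";'
    )
    .option("subscribe", "my-topic")  # placeholder topic
    .option("startingOffsets", "earliest")
    .load()
)
display(inputDF)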

4 More Replies
Lizzz
by New Contributor II
  • 2891 Views
  • 2 replies
  • 3 kudos

Resolved! Forward Spark structured streaming metrics to Datadog

We have a Spark streaming application written in PySpark that we'd like to monitor with Datadog. By default, Datadog collects a couple of streaming metrics like 'spark.structured_streaming.processing_rate' and 'spark.structured_streaming.latency'. Ho...

Latest Reply
Kaniz_Fatma
Community Manager
  • 3 kudos

Hi @Liz Zhang, we haven't heard from you since the last response from @Shanmugavel Chandrakasu, and I was checking back to see if his suggestions helped you. Otherwise, if you have a solution, please share it with the community, as it can be helpful ...
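
For readers with the same question, one possible pattern is to forward query progress metrics yourself. This is a sketch, assuming PySpark 3.4+ (where StreamingQueryListener is available in Python), the datadog PyPI package, and a Datadog agent with DogStatsD reachable from the driver; the metric names and tags are assumptions.

from pyspark.sql.streaming import StreamingQueryListener
from datadog import initialize, statsd

# Assumes a DogStatsD endpoint on the driver node.
initialize(statsd_host="localhost", statsd_port=8125)

class DatadogListener(StreamingQueryListener):
    def onQueryStarted(self, event):
        pass

    def onQueryProgress(self, event):
        # Push per-batch progress values as gauges; names are hypothetical.
        p = event.progress
        statsd.gauge("spark.streaming.input_rows_per_second",
                     p.inputRowsPerSecond or 0.0,
                     tags=[f"query:{p.name}"])
        statsd.gauge("spark.streaming.trigger_execution_ms",
                     p.durationMs.get("triggerExecution", 0),
                     tags=[f"query:{p.name}"])

    def onQueryTerminated(self, event):
        pass

spark.streams.addListener(DatadogListener())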

1 More Reply
User16826994223
by Honored Contributor III
  • 3585 Views
  • 1 reply
  • 0 kudos

How to export full results in Azure Databricks

What is the best way to see all the data? I see that display shows up to 100000 rows only. Is there any way I can see all the data, or do I need to download or export it to a different file?

Latest Reply
User16826994223
Honored Contributor III
  • 0 kudos

Yes, Databricks displays only a limited DataFrame, and it lets you download the data as a CSV. You can save the DataFrame as a table in the Databricks database with this: predictions.select("salry", "dept").write.saveAsTable("depsalry") Then you ca...
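
A fuller sketch of that approach, carrying over the thread's own column names; the DBFS output path is hypothetical.

# Persist the result as a table, as the reply suggests.
predictions.select("salry", "dept").write.saveAsTable("depsalry")

# Or write the complete result to a single CSV on DBFS for download.
(predictions
 .select("salry", "dept")
 .coalesce(1)  # single output file; fine for modest result sizes
 .write.option("header", True)
 .mode("overwrite")
 .csv("dbfs:/FileStore/exports/depsalry"))  # hypothetical path; FileStore files are browser-downloadable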
