Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

talha
by New Contributor III
  • 2839 Views
  • 3 replies
  • 2 kudos

How to fetch runs from cluster id in api

Task to achieve: we have a cluster ID and want to fetch all runs against it. Currently I have to list all the runs, iterate through them, and filter out the runs with the required cluster ID. Similarly, how can I fetch all the runs that are active?
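A rough sketch of the workaround the post describes, assuming a personal access token: the Jobs API 2.1 `runs/list` endpoint supports an `active_only` flag but no cluster-id filter, so matching on cluster ID has to happen client-side (the host and token below are placeholders):

```python
import json
import urllib.request

def list_runs(host, token, active_only=False, page_size=25):
    """Page through /api/2.1/jobs/runs/list and return all runs."""
    runs, offset = [], 0
    while True:
        url = (f"{host}/api/2.1/jobs/runs/list"
               f"?active_only={str(active_only).lower()}"
               f"&offset={offset}&limit={page_size}")
        req = urllib.request.Request(
            url, headers={"Authorization": f"Bearer {token}"})
        with urllib.request.urlopen(req) as resp:
            payload = json.load(resp)
        runs.extend(payload.get("runs", []))
        if not payload.get("has_more"):
            return runs
        offset += page_size

def runs_for_cluster(runs, cluster_id):
    """The API has no cluster-id filter, so match on cluster_instance here."""
    return [r for r in runs
            if r.get("cluster_instance", {}).get("cluster_id") == cluster_id]
```

`runs_for_cluster` works on any list shaped like the `runs/list` response, so the same filter can be reused over `list_runs(host, token, active_only=True)` for the active-runs case.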

Latest Reply
Vidula
Databricks Partner
  • 2 kudos

Hi @Muhammad Talha Jamil​, hope all is well! Just wanted to check in on whether you were able to resolve your issue, and if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from y...

2 More Replies
Hooli
by New Contributor II
  • 4030 Views
  • 3 replies
  • 1 kudos

Resolved! When is the End of Life (EOL) date for Databricks Runtime 7.3 LTS? Can we still use an unsupported version till its EOL date?

The End of Support (EOS) date sneaked up on us, and we are now wondering if we can delay our upgrade past the EOS date. Could you please help us assess the risks of operating a DBR version past its EOS date?

Latest Reply
Vidula
Databricks Partner
  • 1 kudos

Hi @Syed Zaffar​, does @Prabakar Ammeappin​'s response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!

2 More Replies
Stephanraj
by Databricks Partner
  • 8388 Views
  • 7 replies
  • 7 kudos

Resolved! Spark eventlog for Cluster pools

Hi, I want to set up cluster logging (to capture event logs to the /dbfs/cluster-logs directory) in my cluster pool configuration. Is that possible? If I create a cluster manually, I am able to set up cluster logging as described here: https://docs.mic...
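For what it's worth, a sketch of how this is usually wired up: `cluster_log_conf` lives on the cluster spec rather than on the pool, so a pool-backed cluster can still ship its logs to DBFS. The pool id and names below are placeholders:

```python
# Sketch of a cluster spec that draws nodes from an instance pool while still
# delivering cluster/event logs to DBFS. Logging is configured per cluster,
# not per pool; the ids here are placeholders.
cluster_spec = {
    "cluster_name": "pool-backed-cluster",
    "spark_version": "10.4.x-scala2.12",
    "instance_pool_id": "pool-0123456789",   # placeholder pool id
    "num_workers": 2,
    "cluster_log_conf": {
        "dbfs": {"destination": "dbfs:/cluster-logs"}
    },
}
```

This spec would be posted to the cluster-create endpoint as usual; the pool only controls where the nodes come from.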

Latest Reply
Prabakar
Databricks Employee
  • 7 kudos

Hi @Stephanraj C​, an instance pool is used to reduce cluster start and auto-scaling times. Are you using any API to create clusters? If so, could you please share the API request?

6 More Replies
pgaddam
by New Contributor II
  • 4582 Views
  • 2 replies
  • 5 kudos

Error while mounting ADLS Gen 2 storage account to Az Databricks

Hello Team, I am facing trouble while mounting a storage account onto my Databricks workspace. Some background on my setup:
Storage Account - stgAcc1 - attached to vnet1 and its subnets
Databricks - databricks1 - attached to 'workers-vnet' and subnets - these were...
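A hedged sketch of the mount configuration this setup typically needs (service-principal OAuth); every id below is a placeholder, and with VNet-attached storage the workspace subnets must also be allowed through the storage account's firewall or the mount will fail:

```python
# Placeholder service-principal OAuth settings for an ADLS Gen2 mount.
tenant_id = "<tenant-id>"          # placeholder
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",         # placeholder
    "fs.azure.account.oauth2.client.secret": "<service-credential>", # placeholder
    "fs.azure.account.oauth2.client.endpoint":
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
}
# Inside a Databricks notebook the mount call would then look like:
# dbutils.fs.mount(
#     source="abfss://<container>@stgAcc1.dfs.core.windows.net/",
#     mount_point="/mnt/stgacc1",
#     extra_configs=configs,
# )
```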

Latest Reply
Vidula
Databricks Partner
  • 5 kudos

Hi @Pranith Gaddam​, does @Debayan Mukherjee​'s response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!

1 More Replies
RohitKulkarni
by Contributor II
  • 6298 Views
  • 2 replies
  • 1 kudos

Salesforce to Databricks

Hello Team, I am trying to connect to Salesforce and extract the data. I am facing the issue below:
SOURCE_SYSTEM_NAME = 'Salesforce'
TABLE_NAME = 'XY'
desc = eval("sf." + TABLE_NAME + ".describe()")
print(desc)
for field in desc['fields']...
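A sketch of a safer shape for this snippet, assuming the `simple-salesforce` client (`sf`) from the post; `getattr` avoids the fragile `eval` on a built string, and the field-extraction helper is plain Python that works on any `describe()`-shaped payload:

```python
def field_names(desc):
    """Pull the API names out of a Salesforce describe() payload."""
    return [field["name"] for field in desc.get("fields", [])]

# In the original notebook this would be driven like:
# from simple_salesforce import Salesforce
# sf = Salesforce(username=..., password=..., security_token=...)
# desc = getattr(sf, TABLE_NAME).describe()   # instead of eval("sf." + ...)
# print(field_names(desc))
```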

Latest Reply
Vidula
Databricks Partner
  • 1 kudos

Hi @Rohit Kulkarni​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Tha...

1 More Replies
Priyanka48
by Databricks Partner
  • 3666 Views
  • 2 replies
  • 0 kudos

Is there any way we can use usermetadataAsOf option in time travelling query or can we modify the timestamps of delta lake that seems to be immutable?

We are using the Delta Lake time travel capability in our current project. We can use a SELECT * FROM ... TIMESTAMP/VERSION AS OF query. However, there might be some change in our approach, and we might need to recreate the Delta lake while persisting the tim...
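There is no `userMetadataAsOf` option, but one possible workaround, sketched here: stamp each commit with `userMetadata`, resolve that metadata to a version from the table history yourself, then time-travel with `versionAsOf`. The table name and metadata values below are made up, and the lookup runs on rows shaped like DESCRIBE HISTORY output:

```python
def version_for_metadata(history_rows, user_metadata):
    """Return the newest version whose commit carried the given userMetadata."""
    matches = [row["version"] for row in history_rows
               if row.get("userMetadata") == user_metadata]
    return max(matches) if matches else None

# In a notebook this would be driven like:
# spark.conf.set("spark.databricks.delta.commitInfo.userMetadata", "batch-42")
# df.write.format("delta").mode("append").saveAsTable("events")
# history = [r.asDict() for r in spark.sql("DESCRIBE HISTORY events").collect()]
# v = version_for_metadata(history, "batch-42")
# spark.read.format("delta").option("versionAsOf", v).table("events")
```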

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi Priyanka, thanks for reaching out to community.databricks.com. As of now, the "AS OF" query has only two parameters: timestamp and version. Please refer to: https://docs.delta.io/latest/delta-batch.html#sql-as-of-syntax Please let us know in case you ha...

1 More Replies
109005
by Databricks Partner
  • 4086 Views
  • 5 replies
  • 5 kudos

Not able to install geomesa on my Databricks cluster

Hi team, I have been attempting to install the GeoMesa (2.12:3.4.1) library on my cluster, but it keeps failing with the error below: Library installation attempted on the driver node of cluster 0824-052900-76icyj32 and failed. Please refer to the following err...

Latest Reply
Prabakar
Databricks Employee
  • 5 kudos

Hi @Ayushi Pandey​, I can see the package is available in the Maven repo: https://mvnrepository.com/artifact/org.locationtech.geomesa/geomesa_2.12/3.4.1 Have you tried downloading the package to a DBFS location and installing it on the cluster?
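For reference, a sketch of the equivalent Libraries API payload for those Maven coordinates, using the cluster id from the post; the payload shape follows the libraries/install endpoint as I understand it:

```python
# Placeholder payload for POST /api/2.0/libraries/install.
install_payload = {
    "cluster_id": "0824-052900-76icyj32",
    "libraries": [
        {"maven": {"coordinates": "org.locationtech.geomesa:geomesa_2.12:3.4.1"}}
    ],
}
```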

4 More Replies
Ank
by New Contributor II
  • 2299 Views
  • 1 reply
  • 2 kudos

Why am I getting a FileNotFoundError after providing the file path?

I used "Copy file path" to get the file path of the notebook I am trying to run from another notebook.
file_path = "/Users/ankur.lohiya@workday.com/PAS/Training/Ingest/TrainingQueries-Cloned.py/"
ddi = DatabricksDataIngestion(file_path=file_path,        ...
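A guess at the likely cause, sketched below: paths copied from the workspace browser refer to notebooks, not files on DBFS, so file-oriented APIs raise FileNotFoundError on them, and notebook-run APIs expect the workspace path without a trailing slash (and usually without the .py suffix). The helper is purely illustrative:

```python
def workspace_notebook_path(path):
    """Strip the trailing slash and .py suffix from a copied notebook path."""
    path = path.rstrip("/")
    return path[:-3] if path.endswith(".py") else path

# dbutils.notebook.run(workspace_notebook_path(file_path), timeout_seconds=600)
```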

Latest Reply
Vidula
Databricks Partner
  • 2 kudos

Hello @Ankur Lohiya​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Th...

umarkhan
by New Contributor II
  • 4699 Views
  • 2 replies
  • 1 kudos

Driver context not found for python spark for spark_submit_task using Jobs API submit run endpoint

I am trying to run a multi-file Python job in Databricks without using notebooks. I have tried setting this up by: creating a Docker image using DBR 10.4 LTS as a base and adding the zipped Python application to it; making a call to the run submit...
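A sketch of a runs-submit payload for a `spark_submit_task` (Jobs API 2.1); all paths and the node type below are placeholders. Note that spark-submit runs do not get the usual notebook driver context, which is consistent with the error here, so a `spark_python_task` or `python_wheel_task` may sidestep it for multi-file apps:

```python
# Placeholder payload for POST /api/2.1/jobs/runs/submit.
submit_payload = {
    "run_name": "multi-file-python-app",
    "new_cluster": {
        "spark_version": "10.4.x-scala2.12",
        "node_type_id": "Standard_DS3_v2",   # placeholder node type
        "num_workers": 2,
    },
    "spark_submit_task": {
        "parameters": [
            "--py-files", "dbfs:/apps/myapp.zip",   # placeholder paths
            "dbfs:/apps/main.py",
        ]
    },
}
```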

Latest Reply
Vidula
Databricks Partner
  • 1 kudos

Hi @Umar Khan​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!

1 More Replies
data_boy_2022
by New Contributor III
  • 5190 Views
  • 2 replies
  • 1 kudos

Resolved! What are the options to offer a low latency API for small tables derived from big tables?

I have a big dataset which gets divided into smaller datasets. For some of these smaller datasets I'd like to offer a low-latency API (*** ms) to query them.
Big dataset: 1B entries
Smaller dataset: 1 million entries
What's the best way to do it? I thought ab...
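One common pattern, sketched with sqlite3 standing in for whatever low-latency store would actually be used: export each small derived dataset out of the big table and serve point lookups from an indexed store, keeping Spark out of the request path entirely:

```python
import sqlite3

# In-memory store standing in for the serving database the API reads from.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE small (key TEXT PRIMARY KEY, value TEXT)")
conn.executemany("INSERT INTO small VALUES (?, ?)",
                 [("a", "1"), ("b", "2")])   # stand-in for the exported rows

def lookup(key):
    """Point lookup served from the indexed store, not from Spark."""
    row = conn.execute(
        "SELECT value FROM small WHERE key = ?", (key,)).fetchone()
    return row[0] if row else None
```

The batch job periodically rewrites the small table; the API process only ever touches the serving store, which is what keeps per-request latency in the millisecond range.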

Latest Reply
Vidula
Databricks Partner
  • 1 kudos

Hi @Jan R​, does @Tian Tan​'s response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!

1 More Replies
data_boy_2022
by New Contributor III
  • 4069 Views
  • 2 replies
  • 0 kudos

Resolved! Writing transformed DataFrame to a persistent table is unbearable slow

I want to transform a DF with a simple UDF. Afterwards I want to store the resulting DF in a new table (see code below):
key = "test_key"
schema = StructType([
    StructField("***", StringType(), True),
    StructField("yyy", StringType(), True),
    StructF...

Latest Reply
Vidula
Databricks Partner
  • 0 kudos

Hello @Jan R​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!

1 More Replies
komplex
by New Contributor
  • 1925 Views
  • 2 replies
  • 1 kudos

I need help finding the right mode for my course

How do I find the Databricks Community Edition?

Latest Reply
Vidula
Databricks Partner
  • 1 kudos

Hi @Kester Truman​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Than...

1 More Replies
Jessevds
by New Contributor II
  • 4434 Views
  • 2 replies
  • 2 kudos

Create dropdown-list in Markdown

In the first cell of my notebooks, I keep a changelog in Markdown for all changes made in the notebook. However, as this list becomes longer and longer, I want to implement a dropdown list. Is there any way to do this in Markdown in Databricks? For t...
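One option, assuming the notebook Markdown cell renders inline HTML (which Databricks `%md` cells generally do): a details/summary element gives a collapsible section. The changelog entries below are placeholders:

```html
<details>
  <summary>Changelog</summary>

  <ul>
    <li>2022-08-01: initial version</li>
    <li>2022-08-15: added ingestion step</li>
  </ul>
</details>
```

Everything inside the element stays hidden until the summary line is clicked, so the changelog no longer pushes the notebook content down.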

Latest Reply
Vidula
Databricks Partner
  • 2 kudos

Hi @Jesse vd S​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!

1 More Replies
mghildiy
by New Contributor
  • 1675 Views
  • 1 reply
  • 0 kudos

A basic DataFrame transformation query

I want to know how DataFrame transformations work. Suppose I have a DataFrame instance df1. I apply some operation on it, say a filter. Since every operation gives a new DataFrame, let's say we now have df2. So we have two DataFrame instances now, df1 ...
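A toy illustration in plain Python (not Spark) of the idea: each transformation returns a new immutable object that only records its parent and operation, and nothing executes until results are requested, which mirrors how `df1.filter(...)` yields a new DataFrame while leaving df1 untouched:

```python
class ToyDF:
    """Minimal stand-in for a lazy, immutable DataFrame lineage node."""
    def __init__(self, data=None, parent=None, op=None):
        self._data, self._parent, self._op = data, parent, op

    def filter(self, pred):
        # Returns a NEW ToyDF recording the filter; self is never modified.
        return ToyDF(parent=self,
                     op=lambda rows: [r for r in rows if pred(r)])

    def collect(self):
        # Only here does the recorded plan actually run, root-first.
        if self._parent is None:
            return list(self._data)
        return self._op(self._parent.collect())

df1 = ToyDF(data=[1, 2, 3, 4])
df2 = df1.filter(lambda x: x > 2)   # df1 is unchanged; df2 adds a plan step
```

In Spark the same shape holds, with the extra point that `collect()`-style actions trigger distributed execution of the whole recorded plan.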

Latest Reply
Vidula
Databricks Partner
  • 0 kudos

Hi @mghildiy​, does @Kaniz Fatma​'s response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!
