Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

lily1
by New Contributor III
  • 6339 Views
  • 3 replies
  • 2 kudos

Resolved! NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor

When I execute a function in the google-cloud-bigquery:2.7.0 jar, it executes a function in the gax:2.12.2 jar, and then this gax jar executes a function in the Guava jar. This Guava jar is a Databricks default library located at /databrick...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hey there @Lily Kim​, hope you are doing well! Thank you for posting your question. We are happy that you were able to find the solution. Would you please mark the answer as best? We'd love to hear from you.

2 More Replies
AJ270990
by Contributor III
  • 10141 Views
  • 8 replies
  • 3 kudos

Resolved! Powerpoint file operations in Databricks

Hi Team, I am writing Python code in Azure Databricks where I have mounted Azure storage and am accessing the input dataset from the Azure storage resource. I am accessing the input data from Azure storage and generating charts from that data in Databri...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Abhishek Jain​, thanks for sending in your query. We are glad that you found a solution. Would you like to mark the answer as best so that other members can benefit from it too? Cheers!

7 More Replies
MarcoData01
by New Contributor III
  • 4392 Views
  • 6 replies
  • 4 kudos

Resolved! Is there the possibility to protect Init script folder on DBFS

Hi everyone, we are looking for a way to protect the folder where the init script is hosted from editing. This is because we have implemented, inside the init script, a parameter that blocks file downloads from the R Studio APP Emulator, and we would like to avoid th...

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Marco Data​, thank you for sending in your question. It is awesome that you found a solution. Would you like to mark the answer as best so others can find the solution quickly? Cheers!

5 More Replies
ChriChri
by New Contributor II
  • 6575 Views
  • 2 replies
  • 4 kudos

Azure Databricks Delta live table tab is missing

In my Azure Databricks workspace UI I do not have the tab "Delta live tables". In the documentation it says that there is a tab after clicking on Jobs in the main menu. I just created this Databricks resource in Azure and from my understanding the DL...

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Chr Jon​, how are you doing? Thanks for posting your question. Just checking in to see if one of the answers helped. Would you let us know?

1 More Replies
Mark1
by New Contributor II
  • 3537 Views
  • 2 replies
  • 2 kudos

Resolved! Using Delta Tables without Time Travel features?

Hi Everyone / Experts, is it possible to use Delta Tables without the Time Travel features? We are primarily interested in using the DML features (delete, update, merge into, etc.). Thanks, Mark

Latest Reply
Mark1
New Contributor II
  • 2 kudos

Thank you Hubert

1 More Replies
ADB_482115_ADB_
by New Contributor
  • 1321 Views
  • 0 replies
  • 0 kudos

Reading multiple files and writing final results in Databricks, getting the below error

Job aborted due to stage failure: Task 12 in stage 1446.0 failed 4 times, most recent failure: Lost task 12.3 in stage 1446.0 (TID 2922) (10.24.175.143 executor 41): ExecutorLostFailure (executor 41 exited caused by one of the running tasks) Reason: ...

haseebkhan1421
by New Contributor
  • 16859 Views
  • 2 replies
  • 1 kudos

Resolved! How can I access python variable in Spark SQL?

I have a Python variable created under %python in my Jupyter notebook file in Azure Databricks. How can I access the same variable to make comparisons under %sql? Below is the example: %python RunID_Goal = sqlContext.sql("SELECT CONCAT(SUBSTRING(RunID,...

Latest Reply
Nirupam
New Contributor III
  • 1 kudos

You can use {} in spark.sql() of PySpark/Scala instead of making a SQL cell using %sql. This will result in a dataframe. If you want, you can create a view on top of it using createOrReplaceTempView(). Below is an example of using a variable: # A variab...

1 More Replies
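The pattern in the reply above can be sketched as follows (the table name, column, and variable value are illustrative, not taken from the thread):

```python
# Build the SQL text in Python, interpolating the notebook variable,
# then pass it to spark.sql() instead of using a %sql cell.
run_id_goal = "R42"  # hypothetical value computed earlier in Python

query = f"""
SELECT *
FROM run_results
WHERE RunID = '{run_id_goal}'
"""

# In a Databricks notebook you would then run:
# df = spark.sql(query)
# df.createOrReplaceTempView("filtered_runs")  # exposes the result to later %sql cells
```

The f-string embeds the Python value directly in the query text, which is what the {} placeholder in the reply refers to.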
Sugumar_Sriniva
by New Contributor III
  • 11318 Views
  • 11 replies
  • 5 kudos

Resolved! Databricks cluster creation is failing while running a cron job scheduling script through the init script method on Azure Databricks.

Dear connections, I'm unable to run a shell script that schedules a cron job through the init script method on Azure Databricks cluster nodes. Error from the Azure Databricks workspace: "databricks_error_message": "Cluster scoped init script dbfs:/...

Latest Reply
User16764241763
Databricks Employee
  • 5 kudos

Hello @Sugumar Srinivasan​, could you please enable cluster log delivery and inspect the init script logs under dbfs:/cluster-logs/<clusterId>/init_scripts? https://docs.databricks.com/clusters/configure.html#cluster-log-delivery-1

10 More Replies
sannycse
by New Contributor II
  • 5529 Views
  • 4 replies
  • 6 kudos

Resolved! Read the CSV file as shown in the description

Project_Details.csv:
ProjectNo|ProjectName|EmployeeNo
100|analytics|1
100|analytics|2
101|machine learning|3
101|machine learning|1
101|machine learning|4
Find the employees working on each project, in the form of a list.
Output:
ProjectNo|employeeNo
100|[1,2]
101|...

Latest Reply
User16764241763
Databricks Employee
  • 6 kudos

@SANJEEV BANDRU​ You can simply do this, just change the file path:
CREATE TEMPORARY VIEW readcsv USING CSV OPTIONS (path "dbfs:/docs/test.csv", header "true", delimiter "|", mode "FAILFAST");
select ProjectNo, collect_list(EmployeeNo) Employees from re...

3 More Replies
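Spelled out, the reply's approach looks like the following sketch (the dbfs path is the reply's placeholder, not a real file; adjust it to your own location):

```python
# Register the pipe-delimited CSV as a temporary view, then collect the
# employees of each project into a list with collect_list().
create_view_sql = """
CREATE TEMPORARY VIEW readcsv
USING CSV
OPTIONS (path "dbfs:/docs/test.csv", header "true", delimiter "|", mode "FAILFAST")
"""

aggregate_sql = """
SELECT ProjectNo, collect_list(EmployeeNo) AS Employees
FROM readcsv
GROUP BY ProjectNo
"""

# In a Databricks notebook:
# spark.sql(create_view_sql)
# display(spark.sql(aggregate_sql))
# Expected shape: one row per ProjectNo with a list of its EmployeeNo values.
```

collect_list() does not guarantee element order, which is fine here since the question only asks for the set of employees per project.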
weldermartins
by Honored Contributor
  • 5702 Views
  • 5 replies
  • 13 kudos

Hello everyone, I have a directory with 40 files. File names are divided into prefixes. I need to rename the prefix k3241 according to the name in the...

Hello everyone, I have a directory with 40 files. File names are divided into prefixes. I need to rename the prefix k3241 according to the name in the last prefix. I even managed to insert the CSV extension at the end of the file, but renaming files ba...

Latest Reply
Anonymous
Not applicable
  • 13 kudos

Hi @welder martins​, how are you doing? Thank you for posting that question. We are glad you could resolve the issue. Would you want to mark an answer as the best solution? Cheers

4 More Replies
cristianc
by Contributor
  • 4186 Views
  • 5 replies
  • 3 kudos

Is it required to run OPTIMIZE after doing GDPR DELETEs?

Greetings, I have been reading the excellent article at https://docs.databricks.com/security/privacy/gdpr-delta.html?_ga=2.130942095.1400636634.1649068106-1416403472.1644480995&_gac=1.24792648.1647880283.CjwKCAjwxOCRBhA8EiwA0X8hi4Jsx2PulVs_FGMBdByBk...

Latest Reply
cristianc
Contributor
  • 3 kudos

@Hubert Dudek​ thanks for the hint. Exactly as written in the article, VACUUM is required after the GDPR delete operation. However, do we need to OPTIMIZE ZORDER the table again, or is the ordering maintained?

4 More Replies
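For context, a sketch of the sequence being discussed (table and column names are hypothetical; in general Delta does not maintain Z-ordering incrementally, so re-running OPTIMIZE ... ZORDER BY after large deletes is the common practice):

```python
# Hypothetical GDPR-delete sequence for a Delta table named `users`.
statements = [
    # 1. Logically delete the data subject's rows.
    "DELETE FROM users WHERE user_id = 'subject-123'",
    # 2. Re-cluster: Z-ordering is not preserved automatically across file rewrites.
    "OPTIMIZE users ZORDER BY (user_id)",
    # 3. Physically remove the stale files once the retention window allows it.
    "VACUUM users RETAIN 168 HOURS",
]

# In a Databricks notebook each statement would be run as:
# for stmt in statements:
#     spark.sql(stmt)
```

Running OPTIMIZE before VACUUM means the rewritten, re-clustered files are in place before the old ones are physically removed.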
kdkoa
by New Contributor III
  • 4768 Views
  • 4 replies
  • 2 kudos

Resolved! Random SMTP authentication failures to Office 365 (Exchange)

Hey all, I have a Python script running in a Databricks notebook which uses smtplib to connect and send email via our Exchange Online server. At random times, it will start getting authentication failures and I can't figure out why. I've confirmed that ...

Latest Reply
Hubert-Dudek
Databricks MVP
  • 2 kudos

If the message is "bad username or password", my guess is that the problem is on the Exchange side.

3 More Replies
athjain
by New Contributor III
  • 8563 Views
  • 5 replies
  • 7 kudos

Resolved! How to query Delta tables stored in S3 through the Databricks SQL endpoint?

The Delta tables after ETL are stored in S3 in CSV or Parquet format, so now the question is how to allow the Databricks SQL endpoint to run queries over the files saved in S3.

Latest Reply
Anonymous
Not applicable
  • 7 kudos

Hey @Athlestan Jain​, how are you doing? Thanks for posting your question. Do you think you were able to resolve the issue? We'd love to hear from you.

4 More Replies
_Orc
by New Contributor
  • 5138 Views
  • 2 replies
  • 1 kudos

Resolved! Checkpoint is getting created even though the microbatch append has failed

Use case: read data from a source table using Spark Structured Streaming (round the clock), apply transformation logic, and finally merge the dataframe into the target table. If there is any failure during transformation or merge, the Databricks job should...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Om Singh​, hope you are doing well. Just wanted to check in and see if you were able to find a solution to your question. Cheers

1 More Replies
Databricks_7045
by New Contributor III
  • 5451 Views
  • 2 replies
  • 4 kudos

Resolved! Connecting Delta Tables from any Tools

Hi Team, to access SQL tables we use tools like TOAD and SQL Server Management Studio (SSMS). Is there any tool to connect to and access Databricks Delta tables? Please let us know. Thank you.

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Rajesh Vinukonda​ Hope you are doing well. Thanks for sending in your question. Were you able to find a solution to your query?

1 More Replies