12-07-2021 10:48 PM
I'm running a scheduled job on job clusters. I didn't specify a log location for the cluster. Where can I find the location of the stored logs?
Yes, I can see the logs in the runs, but I need the storage location.
12-08-2021 06:28 AM
Hi @karthick J,
You may find more information here:
https://docs.databricks.com/clusters/configure.html#cluster-log-delivery
12-07-2021 11:49 PM
Hi @kjoth! Thanks for your question! Let's see if your peers in the community have an answer first. Otherwise I will get back to you soon. Thanks.
12-08-2021 12:57 AM
I do not know the default location of the logs.
But it is very easy to define where the logs should be shipped:
https://docs.microsoft.com/en-us/azure/databricks/clusters/configure#cluster-log-delivery
That way you control the location.
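For example, when creating a cluster via the Clusters API, log delivery is configured with the `cluster_log_conf` field. A minimal sketch (the DBFS destination path here is just an illustrative choice, not a default):

```json
{
  "cluster_log_conf": {
    "dbfs": {
      "destination": "dbfs:/cluster-logs"
    }
  }
}
```

With this set, driver and executor logs are delivered every few minutes to a folder under the destination named after the cluster ID.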
12-08-2021 02:22 AM
Yes, I'm aware of the option to provide a log location. I ran some jobs without configuring it, so I need to know the default location in order to export the logs.
09-22-2022 05:03 AM
Hi @karthick J, did you get this?
09-22-2022 05:05 AM
@Kaniz Fatma
I am unable to find where the logs are stored by default when no path is given in Advanced Options. Would you please help me with this?
09-25-2022 09:54 PM
Hi @Sai Kalyani P, yes, that helped. Thanks!
09-26-2022 03:11 AM
@karthick J
Would you please help me find the location of the logs? I was unable to find it.
09-26-2022 07:52 AM
Hi @Sai Kalyani P,
In Databricks, run this command on one of the clusters to list all Spark configs:
spark.sparkContext.getConf().getAll()
In the returned result, search for this config:
('spark.databricks.eventLog.dir', 'eventlogs')
This is the place where the event logs are stored.
Also check this property, which is disabled by default:
('spark.eventLog.enabled', 'false')
Then check these two locations on the driver:
ls -l /databricks/driver/logs/stdout
ls -l /databricks/spark/logs/
By default these logs are not retained persistently.
We need to set a cluster log location if we want them to be stored reliably.
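Since `getAll()` returns a plain list of `(key, value)` tuples, the log-related settings can be picked out with a short filter. A sketch in plain Python (the sample list below is illustrative; the real values come from `spark.sparkContext.getConf().getAll()` and vary by cluster):

```python
# Illustrative sample of what getAll() returns; real clusters
# will have many more entries with workspace-specific values.
sample_confs = [
    ("spark.databricks.eventLog.dir", "eventlogs"),
    ("spark.eventLog.enabled", "false"),
    ("spark.app.name", "Databricks Shell"),
]

def find_log_confs(confs):
    """Return only the (key, value) pairs related to event logging."""
    return [(k, v) for k, v in confs if "eventLog" in k]

print(find_log_confs(sample_confs))
```

On a real cluster you would call `find_log_confs(spark.sparkContext.getConf().getAll())` in a notebook cell.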
09-26-2022 09:48 AM
@karthick J
Thank you!!