by frank7 • New Contributor II
- 3743 Views
- 2 replies
- 1 kudos
I have a PySpark DataFrame that contains information about the tables I have in a SQL database (creation date, number of rows, etc.). Sample data: {
"Day":"2023-04-28",
"Environment":"dev",
"DatabaseName":"default",
"TableName":"discount"...
Latest Reply
@Bruno Simoes: Yes, it is possible to write a PySpark DataFrame to a custom log table in a Log Analytics workspace using the Azure Log Analytics Workspace API. Here's a high-level overview of the steps you can follow: Create an Azure Log Analytics Works...
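For reference, a minimal sketch of that approach using the Log Analytics HTTP Data Collector API. The workspace ID, shared key, and custom log name below are placeholders rather than values from the thread, and rows are collected to the driver, so this only suits small DataFrames:

```python
import base64
import hashlib
import hmac
import json
from datetime import datetime, timezone

import requests

WORKSPACE_ID = "<log-analytics-workspace-id>"  # placeholder
SHARED_KEY = "<log-analytics-primary-key>"     # placeholder (base64-encoded key)
LOG_TYPE = "TableMetadata"                     # hypothetical custom log name

def build_signature(date: str, content_length: int) -> str:
    # String-to-sign format required by the Data Collector API
    string_to_sign = (
        f"POST\n{content_length}\napplication/json\nx-ms-date:{date}\n/api/logs"
    )
    digest = hmac.new(
        base64.b64decode(SHARED_KEY), string_to_sign.encode("utf-8"), hashlib.sha256
    ).digest()
    return base64.b64encode(digest).decode("utf-8")

def post_to_log_analytics(df) -> None:
    # Serialize the DataFrame rows as a JSON array (default=str covers dates)
    body = json.dumps([row.asDict() for row in df.collect()], default=str).encode("utf-8")
    rfc1123_date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"SharedKey {WORKSPACE_ID}:{build_signature(rfc1123_date, len(body))}",
        "Log-Type": LOG_TYPE,
        "x-ms-date": rfc1123_date,
    }
    url = f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01"
    requests.post(url, data=body, headers=headers).raise_for_status()
```

Records posted this way land in a custom table named after Log-Type with a _CL suffix (here, TableMetadata_CL) in the workspace.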
1 More Replies
- 2541 Views
- 2 replies
- 0 kudos
ERROR Description: java.lang.Exception: Cannot run program "/local_disk0/pythonVirtualEnvDirs/virtualEnv-5acc1ea9-d03f-4de3-b76b-203d42614000/bin/python" (in directory "."): error=2, No such file or directory at java.lang.ProcessB...
Latest Reply
@Akanksha Gupta: The error message suggests that the Python executable file specified in the configuration of the Databricks cluster cannot be found or accessed. Specifically, it seems that the Python executable file at the path "/local_disk0/python...
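A quick way to verify that diagnosis from a notebook on the affected cluster is a sketch like this (the path is copied verbatim from the stack trace above):

```python
# Diagnostic sketch: check which Python the cluster is configured to use
# and whether the virtualenv path from the error message actually exists.
import os
import sys

print("PYSPARK_PYTHON:", os.environ.get("PYSPARK_PYTHON"))  # executable Spark is told to use
print("Driver Python: ", sys.executable)                    # executable actually running

# Path taken verbatim from the stack trace in the question
missing = "/local_disk0/pythonVirtualEnvDirs/virtualEnv-5acc1ea9-d03f-4de3-b76b-203d42614000/bin/python"
print("Path exists on this node:", os.path.exists(missing))
```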
1 More Replies
- 12933 Views
- 5 replies
- 7 kudos
Hi, how can we integrate Log Analytics with Databricks to log notebook run details and code validations? Thank you.
Latest Reply
I think you are looking to send application logs. I'd use Log4j, as this is already used by Databricks. The link does not use notebooks, but it should work in notebooks too.
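As a rough sketch of that suggestion, a notebook can log through the cluster's Log4j directly via the JVM gateway PySpark exposes (sc is the notebook's SparkContext; the logger name is hypothetical, and on newer runtimes these calls go through the Log4j 1.x compatibility API):

```python
# Log from a notebook through the driver's Log4j, so messages end up in the
# same driver logs that Databricks already collects.
log4j = sc._jvm.org.apache.log4j
logger = log4j.LogManager.getLogger("notebook-logger")  # hypothetical logger name

logger.info("Notebook run started")
logger.warn("Validation failed for table 'discount'")  # example message
```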
4 More Replies