Murthy1
Contributor II
since 11-30-2022
05-01-2024

User Stats

  • 19 Posts
  • 5 Solutions
  • 8 Kudos given
  • 7 Kudos received

User Activity

I am looking to install Python Egg files on all my clusters. The egg file is located in an S3 location. I tried using the following code, which didn't work: resource "databricks_dbfs_file" "app" { source = "${S3_Path}/foo.egg" path = "/FileStore...
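One alternative worth noting: the Terraform databricks_dbfs_file resource's source argument expects a local file path rather than an S3 URI, whereas the Databricks Libraries API can install an egg directly from S3. A minimal Python sketch; the workspace URL, token, cluster ID, and bucket below are hypothetical placeholders:

    # Install an S3-hosted egg on a cluster via the Databricks Libraries API
    # (POST /api/2.0/libraries/install). All identifiers are placeholders.
    import requests

    host = "https://<workspace>.cloud.databricks.com"
    token = "<personal-access-token>"

    resp = requests.post(
        f"{host}/api/2.0/libraries/install",
        headers={"Authorization": f"Bearer {token}"},
        json={
            "cluster_id": "<cluster-id>",
            "libraries": [{"egg": "s3://my-bucket/foo.egg"}],
        },
    )
    resp.raise_for_status()

Note that egg libraries are deprecated on newer Databricks runtimes in favor of wheels.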
I understand that DLT is a separate job compute, but I would like to use an existing all-purpose cluster for the DLT pipeline. Is there a way I can achieve this?
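For background, a DLT pipeline's compute is defined inside the pipeline spec itself and provisioned by the pipeline; there is no setting that points at an existing all-purpose cluster. A sketch of where that configuration lives, using the Pipelines REST API with hypothetical names and sizes:

    # Create a DLT pipeline via POST /api/2.0/pipelines; the "clusters" block
    # describes compute the pipeline creates itself. Names are placeholders.
    import requests

    host = "https://<workspace>.cloud.databricks.com"
    token = "<personal-access-token>"

    resp = requests.post(
        f"{host}/api/2.0/pipelines",
        headers={"Authorization": f"Bearer {token}"},
        json={
            "name": "example-dlt-pipeline",
            "libraries": [{"notebook": {"path": "/Repos/me/dlt_notebook"}}],
            "clusters": [{"label": "default", "num_workers": 2}],
        },
    )
    resp.raise_for_status()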
Can I run multiple jobs (for example, 100+) in parallel that refer to the same notebook? I supply each job with a different parameter. If we can do this, what would be the impact (for example: reliability, performance, troubleshooting etc.)? Example: N...
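One common pattern for this, sketched below: drive many parallel runs of the same notebook from a controller notebook with dbutils.notebook.run (available inside Databricks notebooks) and a thread pool. The notebook path and parameter name are hypothetical:

    # Fan out N runs of one notebook, each with a different parameter.
    from concurrent.futures import ThreadPoolExecutor

    def run_one(param):
        # Each call starts a separate ephemeral run of the same notebook.
        return dbutils.notebook.run(
            "/Users/me@example.com/ingest_notebook",
            timeout_seconds=3600,
            arguments={"source_id": str(param)},
        )

    # Cap concurrency: 100+ truly simultaneous runs can strain the
    # controller's driver and hit workspace rate limits, which bears on
    # the reliability and troubleshooting questions above.
    with ThreadPoolExecutor(max_workers=16) as pool:
        results = list(pool.map(run_one, range(100)))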
I would like to send some custom logs (in Python) from my Databricks notebook to AWS CloudWatch. For example: df = spark.read.json(".......................") logger.info("Successfully ingested data from json") Has someone succeeded in doing this before...
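One approach, sketched with boto3 (preinstalled on Databricks AWS clusters): write log events to CloudWatch Logs directly. The log group and stream names are hypothetical, and the cluster's instance profile or credentials must allow logs:PutLogEvents:

    # Send a custom log line to CloudWatch Logs.
    import time
    import boto3

    logs = boto3.client("logs", region_name="us-east-1")
    group, stream = "/databricks/notebook-logs", "ingest-job"

    # Create the group/stream once; ignore errors if they already exist.
    for create in (
        lambda: logs.create_log_group(logGroupName=group),
        lambda: logs.create_log_stream(logGroupName=group, logStreamName=stream),
    ):
        try:
            create()
        except logs.exceptions.ResourceAlreadyExistsException:
            pass

    logs.put_log_events(
        logGroupName=group,
        logStreamName=stream,
        logEvents=[{
            "timestamp": int(time.time() * 1000),  # milliseconds since epoch
            "message": "Successfully ingested data from json",
        }],
    )

A logging.Handler wrapper such as the third-party watchtower package is another option if the goal is to route an existing logger.info(...) call unchanged.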