Monday
Hi Team,
I have a scenario where I need to put a JAR file (24 MB) into a workspace directory, but the ownership should be associated with the SP, without any individual ID ownership. I tried the Databricks CLI export option, but it has a limit of 10 MB max.
Please suggest.
Monday
Are you trying from the CLI?
I would try uploading it to a Unity Catalog volume using the UC Files API running with SP auth. It allows uploading files up to 5 GB in size.
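A minimal sketch of that approach, assuming a service principal OAuth/PAT token is already in hand; the host, token, and volume path below are placeholders, not values from this thread:

```python
# Sketch: upload a JAR to a Unity Catalog volume with the Files API
# (PUT /api/2.0/fs/files/{path}), authenticating as a service principal.
import urllib.parse


def files_api_url(host: str, volume_path: str) -> str:
    """Build the Files API endpoint for a /Volumes path."""
    return f"{host}/api/2.0/fs/files{urllib.parse.quote(volume_path)}"


def upload_jar(host: str, sp_token: str, local_path: str, volume_path: str) -> None:
    """Stream the file as raw bytes; the Files API accepts up to 5 GB."""
    import requests  # third-party: pip install requests

    with open(local_path, "rb") as f:
        resp = requests.put(
            files_api_url(host, volume_path),
            headers={"Authorization": f"Bearer {sp_token}"},
            data=f,  # raw body, not multipart form data
        )
    resp.raise_for_status()


# Example (placeholder values):
# upload_jar("https://adb-123.azuredatabricks.net", sp_token,
#            "app.jar", "/Volumes/main/default/jars/app.jar")
```

Because the token belongs to the SP, the uploaded file carries no individual user identity.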
Monday
Trick I use to do something similar: run a job as that service principal. In theory, all new objects created will be owned by that service principal. The job can be as simple as a single task with a notebook that imports your file.
In my case, "Creator" is the same as "Run as", but those can differ. In my scenario they are the same because I use DAB with the same SP to deploy jobs.
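A sketch of what that looks like in a Databricks Asset Bundles job definition; the job name, SP application ID, and notebook path are placeholders:

```yaml
# databricks.yml fragment (illustrative): "run_as" makes the job execute
# as the service principal, so objects it creates are owned by that SP.
resources:
  jobs:
    import_jar_job:
      name: import-jar-as-sp
      run_as:
        service_principal_name: "00000000-0000-0000-0000-000000000000"  # placeholder SP application ID
      tasks:
        - task_key: import_file
          notebook_task:
            notebook_path: /Workspace/Shared/import_jar_notebook  # placeholder
```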
Monday - last edited Monday
Hi @Naveenkumar1811 ,
Set up a Unity Catalog volume and then you can use the following REST API call. It supports files up to 5 GB:
Upload a file | Files API | REST API reference | Azure Databricks
I guess the databricks fs cp command should also work with volumes; that's another option. So once you have a Databricks volume, you can try the following:
databricks fs cp path_to_your_file dbfs:/Volumes/your_catalog_name/your_schema_name/volume_name/path/to/data
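The same upload can also be done from the Databricks Python SDK, which handles SP auth from environment variables or a config profile; the catalog/schema/volume names below are placeholders:

```python
# Sketch using the Databricks Python SDK (pip install databricks-sdk).
def volume_file_path(catalog: str, schema: str, volume: str, filename: str) -> str:
    """Build the /Volumes path that the Files API and SDK expect."""
    return f"/Volumes/{catalog}/{schema}/{volume}/{filename}"


def upload_with_sdk(local_path: str, dest: str) -> None:
    """Upload a local file to a UC volume as the authenticated principal."""
    from databricks.sdk import WorkspaceClient  # picks up SP credentials from env

    w = WorkspaceClient()
    with open(local_path, "rb") as f:
        w.files.upload(dest, f, overwrite=True)


# Example (placeholder names):
# upload_with_sdk("app.jar", volume_file_path("main", "default", "jars", "app.jar"))
```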
Monday
Reference Link - https://docs.databricks.com/aws/en/volumes/volume-files#upload-files-to-a-volume
yesterday
Hi Team,
My workspace is not Unity Catalog enabled... Do we have any solution for a workspace without UC?
Thanks,
Naveen