08-20-2025 11:07 AM
#dbfs
I am unable to upload a JAR file to DBFS for my job cluster since that's deprecated now. I need to upload it to the workspace instead and install it on the cluster; however, my JAR is 70 MB and I can't upload it through the API or CLI because the max size is 50 MB. Is there an alternative way to achieve this?
Thanks
08-20-2025 11:17 AM
Hi @Srinivas5,
Did you try uploading your JAR file to a Unity Catalog volume? That might help.
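As a minimal sketch, with the new Databricks CLI you could copy the JAR straight into a volume (the catalog/schema/volume names below are placeholders; replace them with your own):

```
# Copy the local JAR into a Unity Catalog volume via the Databricks CLI.
# "main", "default", and "libs" are hypothetical names for illustration.
databricks fs cp ./myfile.jar dbfs:/Volumes/main/default/libs/myfile.jar
```

Once it's in the volume, you should be able to point the cluster library config at the volume path (e.g. /Volumes/main/default/libs/myfile.jar) instead of a DBFS path.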
08-20-2025 11:25 AM
Hi @WiliamRosa ,
Thank you for the response.
I didn't try that. Is there any other way to upload it to the Databricks workspace?
08-20-2025 12:01 PM - edited 08-20-2025 12:22 PM
Hi @Srinivas5 ,
You can split your JAR file into a multi-part zip archive and upload the parts separately. Then just use the %sh magic command and unzip them in your workspace:
zip -s 40m archive.zip myfile.jar
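After uploading both parts, a notebook cell along these lines should rejoin and extract them (file names assumed to match the split command above):

```
%sh
# Rejoin the split parts (archive.zip + archive.z01) into a single archive,
# then extract the original JAR from it.
zip -s 0 archive.zip --out combined.zip
unzip combined.zip
```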
โ08-21-2025 10:52 AM
For every CI/CD release pipeline a new JAR artifact is created. Usually the pipeline automatically uploads that JAR to DBFS and installs it on the job clusters, but in the latest runtime versions DBFS libraries are deprecated, so I need to automate the JAR installation through the workspace. For this I first have to upload the JAR to the workspace, and I tried uploading it in chunks since the JAR is 70 MB, but that also isn't working. Is there another approach to fulfill this requirement?
08-21-2025 12:22 PM
Why isn't it working when you send it in chunks? What error did you get?
But I think you should consider using UC volumes. Then you could just use the Files API, which lets you send files up to 5 GB.
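For example, a single PUT against the Files API can upload the whole 70 MB JAR into a volume in one request (workspace host, token, and volume path below are placeholders):

```
# Upload the JAR directly to a Unity Catalog volume via the Files API.
# <workspace-host> and the Volumes path are hypothetical; adjust to your setup.
curl -X PUT "https://<workspace-host>/api/2.0/fs/files/Volumes/main/default/libs/myfile.jar" \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  --data-binary @myfile.jar
```

That would fit neatly into the CI/CD pipeline as a release step, replacing the old DBFS upload.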
3 weeks ago
Hi @Srinivas5!
Were you able to find a solution or approach that worked? If so, please mark the helpful reply as the Accepted Solution, or share your approach so others can benefit as well.