2 weeks ago
#dbfs
I am unable to upload a JAR file to DBFS for a job cluster since DBFS libraries are deprecated now, so I need to upload it to the workspace and install it on the cluster. However, my JAR is 70 MB and I can't upload it through the API or CLI, as the max size is 50 MB. Are there alternative ways to achieve this?
Thanks
2 weeks ago
Hi @Srinivas5,
Did you try uploading your JAR file to a Unity Catalog volume? That might help.
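If you go that route, a minimal sketch with the Databricks CLI could look like this (the catalog/schema/volume names are placeholders, and the volume must already exist):

# Copy the JAR into a Unity Catalog volume; volume uploads
# are not subject to the 50 MB workspace import limit
databricks fs cp ./myfile.jar dbfs:/Volumes/main/default/libs/myfile.jar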
2 weeks ago
Hi @WiliamRosa,
Thank you for the response.
I haven't tried that yet. Is there any other way to upload it to the Databricks workspace?
2 weeks ago - last edited 2 weeks ago
Hi @Srinivas5 ,
You can split your JAR into a multi-part zip archive and upload the parts separately, since each part stays under the size limit. Then just use the %sh magic command to reassemble and unzip them in your workspace:
zip -s 40m archive.zip myfile.jar
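For the record, a rough end-to-end sketch of that approach; the workspace path and file names here are placeholders:

# Locally: split into parts no larger than 40 MB
zip -s 40m archive.zip myfile.jar
# -> produces archive.z01 plus archive.zip, each small enough to upload

# Then, after uploading both parts, in a notebook cell:
%sh
cd /Workspace/Users/you@example.com/libs
zip -s 0 archive.zip --out combined.zip   # merge the split parts back into a single archive
unzip combined.zip                        # extracts myfile.jar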
2 weeks ago
For every CI/CD release pipeline run a new JAR artifact is created. Previously the pipeline automatically uploaded that JAR to DBFS and installed it on the job clusters, but on the latest runtime versions DBFS libraries are deprecated. I need to automate the JAR installation through the workspace instead, which means first uploading the JAR to the workspace. I tried uploading it in chunks since the JAR is 70 MB, but that isn't working either. Is there another approach to fulfill this requirement?
2 weeks ago
Why isn't it working when you send it in chunks? What error did you get?
But I think you should consider using UC volumes. Then you could just use the Files API, which lets you send files up to 5 GB:
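Something along these lines, where the host, token, and volume path are placeholders:

# Upload the JAR to a UC volume via the Files API (PUT with a binary body)
curl -X PUT \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  --data-binary @myfile.jar \
  "https://<workspace-host>/api/2.0/fs/files/Volumes/main/default/libs/myfile.jar?overwrite=true"

Once the JAR is in the volume, the job cluster can install it directly by referencing the volume path in the job's libraries spec, e.g. {"jar": "/Volumes/main/default/libs/myfile.jar"}, so your CI/CD pipeline only needs to overwrite that one file on each release.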
Friday
Hi @Srinivas5!
Were you able to find a solution or approach that worked? If so, please mark the helpful reply as the Accepted Solution, or share your approach so others can benefit as well.