PyTorch DDP on Databricks
04-29-2022 06:42 AM
Hello!
I am trying to use PyTorch Lightning inside Databricks on a cluster with 2 GPUs. Training my Transformer model on 1 GPU with the DP strategy works fine, but when I try to use both GPUs with a DDP strategy I get the following error:
MisconfigurationException: `Trainer(strategy='ddp_spawn')` or `Trainer(accelerator='ddp_spawn')` is not compatible with an interactive environment. Run your code as a script, or choose one of the compatible strategies: Trainer(strategy=None|dp|tpu_spawn). In case you are spawning processes yourself, make sure to include the Trainer creation inside the worker function.
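For reference, here is a minimal sketch of the two Trainer configurations involved (assuming PyTorch Lightning ~1.6; the model and datamodule names are placeholders, and the DDP strategy may also be selected implicitly when multiple devices are requested in a notebook):

```python
import pytorch_lightning as pl

# Placeholder names for illustration only.
model = MyTransformer()
datamodule = MyDataModule()

# Works in a Databricks notebook: single-process DataParallel over both GPUs.
trainer = pl.Trainer(accelerator="gpu", devices=2, strategy="dp")
trainer.fit(model, datamodule=datamodule)

# Raises the MisconfigurationException above: ddp_spawn needs to launch
# worker processes, which Lightning does not allow from an interactive
# environment such as a notebook; it expects the code to run as a script.
trainer = pl.Trainer(accelerator="gpu", devices=2, strategy="ddp_spawn")
trainer.fit(model, datamodule=datamodule)
```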
05-18-2022 11:11 PM
Hi @Marco Capusso, I am facing a similar issue. Were you able to find a fix? It would be great if you could share some details.

