It's not recommended to leave Spark speculative execution turned on permanently. For jobs where tasks run slowly or get stuck because of transient network or storage issues, speculative execution can be very handy. However, it only masks the underlying problem by launching a retry of the slow task.
Speculative execution should be treated as a temporary workaround until the root cause of the slow or stuck task is found.
Left enabled, speculative execution can trigger unnecessary task retries and degrade the performance of jobs and stages where no tasks are genuinely stuck.
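As a sketch of how it is typically enabled temporarily, the relevant settings can be passed per job via `spark-submit` rather than baked into cluster defaults (the threshold values below are illustrative, not recommendations):

```shell
# Enable speculation for this one job only, instead of setting it
# cluster-wide in spark-defaults.conf.
spark-submit \
  --conf spark.speculation=true \
  --conf spark.speculation.interval=100ms \   # how often to check for slow tasks
  --conf spark.speculation.multiplier=1.5 \   # task is "slow" if 1.5x the median duration
  --conf spark.speculation.quantile=0.75 \    # fraction of tasks that must finish before speculating
  my_job.py
```

Tuning `spark.speculation.multiplier` and `spark.speculation.quantile` upward makes speculation more conservative, which limits the unnecessary retries described above.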