You can run Spark applications using the spark-submit and pyspark commands. Both are available in the $SPARK_HOME/bin directory, and each comes in two forms: extensionless shell scripts for Linux/macOS and .cmd files for Windows.
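As a quick sketch of that layout (the install path /opt/spark here is just an assumption for illustration):

```shell
# Hypothetical Spark install location; adjust to your environment
SPARK_HOME=/opt/spark

# Linux/macOS launcher scripts (no file extension)
echo "$SPARK_HOME/bin/spark-submit"
echo "$SPARK_HOME/bin/pyspark"

# Windows equivalents ship alongside them as .cmd files
echo "$SPARK_HOME/bin/spark-submit.cmd"
echo "$SPARK_HOME/bin/pyspark.cmd"
```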
If you are using EMR, there are three ways to run a Spark application:
1. using pyspark (or spark-shell)
2. using spark-submit without --master and --deploy-mode
3. using spark-submit with --master and --deploy-mode
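Assuming a hypothetical script my_app.py on a YARN-backed EMR cluster, the three variants look like this (the commands are built as strings and printed, so the sketch runs without a cluster):

```shell
APP=my_app.py   # hypothetical application script

# 1. interactive shell session
CMD1="pyspark"

# 2. spark-submit with defaults (no --master / --deploy-mode)
CMD2="spark-submit $APP"

# 3. spark-submit with an explicit master and deploy mode
CMD3="spark-submit --master yarn --deploy-mode cluster $APP"

printf '%s\n' "$CMD1" "$CMD2" "$CMD3"
```

On EMR the YARN resource manager handles the application in all three cases; the difference, as described below, is where the driver process lives.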
Although all three run the application on the Spark cluster, they differ in where the driver program runs.
In the 1st and 2nd the driver runs in client mode (on the node where you launch the command), whereas in the 3rd the driver also runs inside the cluster.
In the 1st and 2nd, you have to wait for one application to complete before running another, but with the 3rd you can run multiple applications in parallel.
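A minimal sketch of that parallel submission, assuming hypothetical scripts job_a.py and job_b.py: in cluster mode, spark-submit hands the application off to YARN and returns, so several jobs can be launched back to back (the actual submit is shown as a comment so the sketch runs anywhere):

```shell
for app in job_a.py job_b.py; do
  # Real usage on EMR would be:
  #   spark-submit --master yarn --deploy-mode cluster "$app" &
  echo "would submit: $app"   # stand-in so this sketch runs without a cluster
done
wait  # in real usage, block until all background submits have returned
```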