Launch a Spark cluster with sbt sparkLaunchCluster. You can also run sbt first and then type sparkLaunchCluster (TAB completion is supported) in the sbt console.
Once the cluster is launched, submit your job with sbt sparkSubmitJob <args>.
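A typical session chains a few of the commands listed below. The sketch wraps each step in a small helper that prints the command before running it; DRY_RUN, the run helper, and the job argument --input are all hypothetical additions for illustration, not part of the plugin.

```shell
# Print each spark-deployer step before (optionally) running it.
# DRY_RUN=1 only prints the commands, so the sketch runs without an AWS account.
run() {
  echo "+ $*"
  if [ -z "$DRY_RUN" ]; then "$@"; fi
}

DRY_RUN=1
run sbt sparkLaunchCluster                       # create the cluster
run sbt sparkUploadJar                           # ship the assembly jar
run sbt sparkSubmitJob --input s3://bucket_name/input   # hypothetical job args
run sbt sparkDestroyCluster                      # tear everything down
```

Dropping DRY_RUN=1 would execute the sbt commands for real, in order.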
All available commands:
sbt sparkConf
sbt sparkEc2Dir
sbt sparkEc2Run
sbt sparkLaunchCluster
sbt sparkDestroyCluster
sbt sparkStartCluster
sbt sparkStopCluster
sbt sparkShowMachines
sbt sparkLoginMaster
sbt sparkShowSpaceUsage Show the i-node and disk usage of all the instances.
sbt sparkUploadJar
sbt sparkSubmitJob <args> Start the job on the remote cluster from your local machine.
sbt sparkRemoveS3Dir <dir-name> Remove the S3 directory together with its _$folder$ marker file (e.g. sbt sparkRemoveS3Dir s3://bucket_name/middle_folder/target_folder).
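The _$folder$ file mentioned above is the empty marker key that Hadoop's S3 file systems typically create alongside a "directory": for a directory key, the marker is usually the same path with _$folder$ appended. A minimal sketch of that naming, using the example path from above:

```shell
# Derive the conventional "_$folder$" marker key for an S3 directory.
dir="s3://bucket_name/middle_folder/target_folder"
marker="${dir%/}_\$folder\$"   # strip any trailing slash, append the suffix
echo "$marker"
```

sparkRemoveS3Dir removes both the directory contents and this marker, which plain recursive deletes often leave behind.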