I want to run two Spark applications in parallel. The first is a PySpark script — there it's no problem to cap the amount of memory and the number of cores via SparkConf.
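For reference, this is roughly how I cap resources for the PySpark script (a configuration sketch — app name, master URL, and the values themselves are placeholders, not my real setup):

```python
from pyspark import SparkConf, SparkContext

# Placeholder values: cap the job at 2 GB per executor and 4 cores total.
conf = (
    SparkConf()
    .setAppName("my-pyspark-job")          # hypothetical app name
    .setMaster("spark://master-host:7077") # placeholder master URL
    .set("spark.executor.memory", "2g")
    .set("spark.cores.max", "4")
)

sc = SparkContext(conf=conf)
```

With this in place, the PySpark job stays within its limits on the standalone cluster.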
Then I'd like to run an analytic graph query. But the Spark application started by Apache TinkerPop's Spark-Gremlin consumes the maximum number of cores and all available memory.
Any suggestion where I can configure the number of cores and the amount of memory the analytic graph application consumes?
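For context, the Spark-Gremlin job is driven by a Hadoop-Gremlin properties file; mine looks roughly like this (the input location and master URL are placeholders for my actual setup):

```properties
gremlin.graph=org.apache.tinkerpop.gremlin.hadoop.structure.HadoopGraph
gremlin.hadoop.graphReader=org.apache.tinkerpop.gremlin.hadoop.structure.io.gryo.GryoInputFormat
gremlin.hadoop.inputLocation=data/my-graph.kryo
spark.master=spark://master-host:7077
spark.serializer=org.apache.spark.serializer.KryoSerializer
```

Is this properties file the right place to put resource limits, or do they belong somewhere else?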