I want to run two Spark apps in parallel. The first is a PySpark script; there it's no problem to define the maximum amount of memory and cores via SparkConf.
Then I'd like to run an analytic graph query, but the resulting Spark app ("Apache TinkerPop's Spark-Gremlin") consumes the maximum number of cores and amount of memory.
Any suggestion where I can configure the number of cores and the amount of memory the analytic graph app consumes?
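In case it helps: the properties below are standard Spark resource settings (`spark.cores.max`, `spark.executor.memory`), which TinkerPop's SparkGraphComputer forwards to the Spark application from its graph properties file. This is a minimal sketch; the exact file name and location depend on your TinkerPop/DSE setup, and the master URL here is a placeholder:

```properties
# Sketch of a Hadoop graph properties file read by SparkGraphComputer.
# spark://master:7077 is a placeholder for your actual Spark master URL.
spark.master=spark://master:7077
# Cap the total cores the OLAP job may claim on a standalone cluster
spark.cores.max=4
# Limit memory per executor so the PySpark app keeps its share
spark.executor.memory=2g
```

With a cap like `spark.cores.max` set, a standalone Spark master can leave cores free for the concurrently running PySpark app instead of handing everything to the Gremlin job.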
It's not clear to me whether you're using the Python driver for DSE. I'm going to close this for now, but feel free to open a new ticket as a feature request if it's relevant to the driver.
If you're looking for help with DataStax products, I recommend going through your customer service channels, or asking in the DataStax Academy Slack.