How to set executor cores and memory for DSE Graph Analytics

Description

Hi,

I want to run two Spark apps in parallel. The first is a PySpark script - there it's no problem to define the maximum amount of memory and cores via SparkConf (see the sketch below).
Then I want to run an analytic graph query. But the Spark app it starts, "Apache TinkerPop's Spark-Gremlin", consumes the maximum of cores and memory.
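For reference, a minimal sketch of what I mean by capping the PySpark side via SparkConf (the property names are standard Spark settings; the app name and values are just placeholders):

    from pyspark import SparkConf, SparkContext

    # Cap the resources this PySpark app may take from the cluster.
    # "spark.cores.max" limits total cores across all executors;
    # "spark.executor.memory" limits memory per executor.
    conf = (SparkConf()
            .setAppName("my-pyspark-job")          # placeholder app name
            .set("spark.cores.max", "4")           # placeholder value
            .set("spark.executor.memory", "2g"))   # placeholder value

    sc = SparkContext(conf=conf)

I'm looking for the equivalent knobs for the Spark-Gremlin app that DSE Graph Analytics launches.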

Any suggestion where I can configure the amount of cores and memory the analytic graph app consumes?

Thanks!

Environment

None

Pull Requests

None

Activity

Jim Witschey
August 2, 2017, 6:21 AM

It's not clear to me whether you're using the Python driver for DSE. I'm going to close this for now, but feel free to open a new ticket as a feature request if it's relevant to the driver.

If you're looking for help with DataStax products, I recommend going through your customer service channels, or asking in the DataStax Academy Slack.

Not a Problem

Assignee

Unassigned

Reporter

Michael Weber

Fix versions

None

Labels

PM Priority

None

External issue ID

None

Doc Impact

None

Reviewer

None

Size

None

Pull Request

None

Components

Sprint

Py P-ENG-TRIAGE

Affects versions

Priority

Major