Details

    • Type: Improvement
    • Status: Closed
    • Priority: Major
    • Resolution: Won't Fix
    • Affects Version/s: 2.1.7.1
    • Fix Version/s: None
    • Component/s: Core
    • Labels:
      None
    • Environment:
      Spark 1.4.1, Hadoop 2.6.0

Description

We are trying to use the Datastax Cassandra Java driver in a Spark application, but we are running into dependency hell: Hadoop 2.6 ships with Guava 11, while the Cassandra Java driver requires Guava 14.
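A quick way to confirm which Guava jar actually wins on the classpath is to ask the classloader where a Guava class was loaded from. A minimal sketch (the probe class name and the returned strings are illustrative, not part of any driver API):

```java
import java.security.CodeSource;

public class GuavaLocator {
    // Returns the location a class was loaded from, or a note if it is
    // absent or loaded by the bootstrap/platform loader.
    static String locate(String className) {
        try {
            Class<?> c = Class.forName(className);
            CodeSource src = c.getProtectionDomain().getCodeSource();
            return src == null ? "loaded by bootstrap/platform loader"
                               : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        // ImmutableList is used here as a probe because it exists in
        // every Guava version involved (an assumption, but a safe one).
        System.out.println("guava: " + locate("com.google.common.collect.ImmutableList"));
    }
}
```

Running this inside the Spark driver and executor separately shows whether Hadoop's Guava 11 jar or the application's Guava 14 jar is being picked up in each JVM.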

To avoid such dependency conflicts, it would be great if the Cassandra Java driver shaded and embedded these dependencies, making it independent of the environment.
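Until the driver itself ships a shaded artifact, the same relocation can be done on the application side with the Maven Shade Plugin. A sketch of the relevant `pom.xml` fragment (the `shadedPattern` prefix is an arbitrary choice, not a convention of the driver):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <!-- Rewrite Guava packages so the app's Guava 14 cannot
               collide with Hadoop's Guava 11 on the cluster. -->
          <relocation>
            <pattern>com.google.common</pattern>
            <shadedPattern>shaded.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

This rewrites both the bundled Guava classes and the application's references to them, so the resulting fat jar no longer cares which Guava version Hadoop puts on the classpath.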

Note: we did try to work around the problem with spark.driver.userClassPathFirst and spark.executor.userClassPathFirst. With those options enabled we ran into many more classpath issues (Netty, etc.). While we eventually got it working on the driver, it still fails on the executor; the executor does not seem to pick up spark.executor.userClassPathFirst.
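For reference, these are the settings that were tried, as they would appear in spark-defaults.conf (they can equally be passed as --conf flags to spark-submit; both options are marked experimental in Spark 1.4):

```properties
# Prefer the application's jars over Spark's/Hadoop's bundled classes
spark.driver.userClassPathFirst    true
spark.executor.userClassPathFirst  true
```

Because these flags invert classloading order for everything, not just Guava, they tend to surface new conflicts (Netty, Jackson, etc.), which matches the behavior described above.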

People

    • Assignee: Unassigned
    • Reporter: gunnar (Gunnar Wagenknecht)
    • Votes: 0
    • Watchers: 2

Dates

    • Created:
    • Updated:
    • Resolved: