Shade Guava dependency in Java driver

Description

We are trying to use the DataStax Cassandra Java driver in a Spark application. However, we are running into dependency hell: Hadoop 2.6 ships with Guava 11, while the Cassandra Java driver requires Guava 14.

In order to avoid such dependency conflicts, it would be great if the Cassandra Java driver would shade and embed these dependencies, making it independent of the environment.

Note, we did try to work around the problem with spark.driver.userClassPathFirst and spark.executor.userClassPathFirst. When using those options we ran into many more classpath issues (Netty, etc.). While we got it working successfully on the driver, it still fails on the executor; the executor does not seem to pick up spark.executor.userClassPathFirst.
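For reference, the workaround mentioned above amounts to passing the two properties at submit time. A minimal sketch (the application class and JAR name are hypothetical placeholders):

```shell
# Attempted workaround: prefer the application's classes over Spark/Hadoop's.
# Both properties are marked experimental in Spark 1.4.
spark-submit \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --class com.example.MyApp \
  my-app-assembly.jar
```

As described above, this helped on the driver but did not take effect on the executors, and it surfaced further conflicts (e.g. Netty) because it inverts classloading for all dependencies, not just Guava.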

Environment

Spark 1.4.1, Hadoop 2.6.0

Pull Requests

None

Activity


Gunnar Wagenknecht September 15, 2015 at 9:49 AM

Thanks for the insights. That's a bit unfortunate. I'd recommend encapsulating those third-party APIs away in a future major release. Feel free to modify the request or close it as won't fix.

Olivier Michallat September 15, 2015 at 8:15 AM

Unfortunately we can't shade Guava because it's part of our public API. For example some of our public methods return ListenableFutures.

What you could do instead is shade it yourself when building the JAR that you deploy to Hadoop. Does that sound like a workable approach?
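The suggested user-side shading could look like the following Maven fragment, which relocates Guava's packages inside the application's fat JAR so they no longer clash with Hadoop's Guava 11. This is a sketch, not an official recommendation: the plugin version and the shadedPattern prefix are assumptions, and any code calling driver methods that expose Guava types (e.g. ListenableFuture) must be compiled against the relocated classes in the same build.

```xml
<!-- In the application's pom.xml: relocate Guava during packaging -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.1</version> <!-- assumed version, contemporary with Spark 1.4 -->
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- rewrite com.google.common.* to a private namespace -->
            <pattern>com.google.common</pattern>
            <shadedPattern>myapp.shaded.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

Because the relocation rewrites bytecode references in both the driver and the application classes, the resulting JAR carries its own Guava 14 copy without touching the Guava 11 on Hadoop's classpath.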

Won't Fix

Details

Created September 14, 2015 at 2:07 PM
Updated May 22, 2017 at 3:27 PM
Resolved September 15, 2015 at 10:36 AM