Have to manually add spark-sql dependency to get around bad symbolic reference

Description

Hi,

I understand that spark-sql is presumably meant to come in as a transitive dependency, but my basic "hello world" sample did not compile (in sbt).

So with this setup in sbt:

resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.0-M3"
I get this:

[error] bad symbolic reference to org.apache.spark.sql.package encountered in class file 'package.class'.
[error] Cannot access term package in package org.apache.spark.sql. The current classpath may be
[error] missing a definition for org.apache.spark.sql.package, or package.class may have been compiled against a version that's
[error] incompatible with the one found on the current classpath.
[error] bad symbolic reference to org.apache.spark.sql.package.DataFrame encountered in class file 'package.class'.
[error] Cannot access type DataFrame in value org.apache.spark.sql.package. The current classpath may be
[error] missing a definition for org.apache.spark.sql.package.DataFrame, or package.class may have been compiled against a version that's
[error] incompatible with the one found on the current classpath.
[error] two errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 1 s, completed 29-Jan-2017 23:41:05
And I had to add this line to work around it:

libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.1.0"
I recently deleted my .ivy2 cache, so I doubt this is cache corruption.
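For reference, a minimal build.sbt sketch of the working configuration described above (assuming Scala 2.11, which is what Spark 2.1.0 artifacts are published for; using the cross-built `%%` form, which appends the `_2.11` suffix automatically and is equivalent to the explicit `% "spark-sql_2.11"` line):

```scala
// build.sbt — minimal sketch of the workaround described above.
// Assumes Scala 2.11 to match the Spark 2.1.0 binary artifacts.
scalaVersion := "2.11.8"

resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0",
  // Workaround: declare spark-sql explicitly so org.apache.spark.sql
  // is on the compile classpath for the connector's classes.
  "org.apache.spark" %% "spark-sql" % "2.1.0",
  "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.0-M3"
)
```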

Thanks for the great work. This issue kept me stuck for hours until I worked around it.

Environment

None

Pull Requests

None

Status

Assignee

Piotr Kołaczkowski

Reporter

Ali Kheyrollahi

Labels

None

Reviewer

None

Reviewer 2

None

Tester

None

Pull Request

None

Components

Affects versions

Priority

Major