Unable to make SSL connection to Cassandra through Spark
SPARKC-521
java.util.NoSuchElementException: key not found: 'org.apache.cassandra.db.marshal.DateType'
SPARKC-282
Provide Listen Address as Preferred Location in Spark 1.3
SPARKC-131
Undeclared dependency on commons-configuration
SPARKC-566
Default read and write consistency levels cause issues
SPARKC-564
Spark 2.4.0 depends on netty-all 4.1.17.Final
SPARKC-557
Allow Custom Converters for Collection Types
SPARKC-556
JavaInstantConverter needs conversion from java.util.Date
SPARKC-555
Add support for custom rate limiters
SPARKC-554
Cassandra Table count retrieves different values when running with multiple executors.
SPARKC-549
joinWithCassandraTable does not support PrimaryKeyColumns
SPARKC-547
Allow user to have a custom rate limiter
SPARKC-546
Error when reading a Cassandra table that has a UDT column
SPARKC-543
Upgrade Cassandra for Integration Testing with Java 8u162+
SPARKC-539
Explicit shutdown order for EmbeddedCassandra and SessionCache
SPARKC-538
Access WRITETIME in PySpark?
SPARKC-535
Support for Cassandra 4.0 - Driver Update
SPARKC-534
WRITETIME and TTL cannot be read by using the DataFrame API
SPARKC-528
Writing UDT camelCase column doesn't work
SPARKC-526
dataset.createCassandraTable is not available
SPARKC-525
Generate a CQL CREATE TABLE statement from a DataFrame, with all the parameters required to construct it
SPARKC-524
StructType is Unsupported in createCassandraTable
SPARKC-523
Cannot use null for primitive Java types
SPARKC-517
Show more information if any DataFrame value cannot be converted
SPARKC-516
Add spanBy and spanByKey in Python API
SPARKC-514
Support partially specified writes from case classes
SPARKC-512
Exception in thread "Shutdown-checker" java.lang.RuntimeException: eventfd_write() failed: Bad file descriptor
SPARKC-511
Support Spark Structured Streaming
SPARKC-510
PostgreSQL UUID[] to Cassandra: Conversion Error
SPARKC-506
Unable to save RDD with NULL UDT fields.
SPARKC-504
C* secondary index usage
SPARKC-502
leftJoinWithCassandraTable is not available for the DStream API
SPARKC-500
Left join with a Cassandra table always reports read-row-meter = 0 and read-byte-meter = 0
SPARKC-499
Depending on netty-all considered bad practice
SPARKC-498
IN clause does not work with clustering key
SPARKC-497
Allow setting of Custom Load Balancing Policy for Cassandra Connector
SPARKC-489
Connection factory configuration is ignored for cluster level settings
SPARKC-488
Cassandra Connection closed using Spark Cassandra Connector in Spark Streaming
SPARKC-487
Using joinWithCassandraTable in Python/PySpark
SPARKC-483
Logical error in org.apache.spark.sql.cassandra.BasicCassandraPredicatePushDown#partitionKeyPredicatesToPushDown
SPARKC-482
Reduce Allocation Pressure Caused by the SCC
SPARKC-478
Look up Codecs on Write only Once
SPARKC-477
Have to manually add spark-sql dependency to get around bad symbolic reference
SPARKC-468
SPARKC-363 for Tuples: Reflection on the Wrong Classloader leads to Class Not Found
SPARKC-467
Add Python-Specific DataFrame Option Docs
SPARKC-465
Repeated TypeConverter Registration Causes Degraded Performance and Crashes
SPARKC-464
Unable to persist UDT in Cassandra
SPARKC-453
Improve support for PairRDD joins
SPARKC-445
Add Tunable Read Parallelism for Full Table Scans
SPARKC-444
Introduce Data Locality for Single Partition Queries
SPARKC-439
Issue 1 of 122

Unable to make SSL connection to Cassandra through Spark

Description

According to the connector's reference documentation, the variables used should be valid: https://github.com/datastax/spark-cassandra-connector/blob/master/doc/reference.md

ERROR ApplicationMaster: User class threw exception: java.util.concurrent.ExecutionException: com.datastax.spark.connector.util.ConfigCheck$ConnectorConfigurationException: Invalid Config Variables
Only known spark.cassandra.* variables are allowed when using the Spark Cassandra Connector.
spark.cassandra.connection.ssl.keyStore.password is not a valid Spark Cassandra Connector variable.
No likely matches found.
spark.cassandra.connection.ssl.keyStore.path is not a valid Spark Cassandra Connector variable.
No likely matches found.
java.util.concurrent.ExecutionException: com.datastax.spark.connector.util.ConfigCheck$ConnectorConfigurationException: Invalid Config Variables
Only known spark.cassandra.* variables are allowed when using the Spark Cassandra Connector.
spark.cassandra.connection.ssl.keyStore.password is not a valid Spark Cassandra Connector variable.
No likely matches found.
spark.cassandra.connection.ssl.keyStore.path is not a valid Spark Cassandra Connector variable.
No likely matches found.
at com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:299)
at com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:286)
at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:116)
at com.google.common.util.concurrent.Uninterruptibles.getUninterruptibly(Uninterruptibles.java:135)
at com.google.common.cache.LocalCache$Segment.getAndRecordStats(LocalCache.java:2346)
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2318)
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2280)
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2195)
at com.google.common.cache.LocalCache.get(LocalCache.java:3934)
at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3938)
at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4821)
at org.apache.spark.sql.cassandra.CassandraCatalog.lookupRelation(CassandraCatalog.scala:33)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.getTable(Analyzer.scala:302)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$9.applyOrElse(Analyzer.scala:314)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$9.applyOrElse(Analyzer.scala:309)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolveOperators$1.apply(LogicalPlan.scala:57)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolveOperators$1.apply(LogicalPlan.scala:57)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:69)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:56)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:54)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:54)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:281)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
at scala.collection.AbstractIterator.to(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformChildren(TreeNode.scala:321)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:54)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.apply(Analyzer.scala:309)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.apply(Analyzer.scala:299)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:83)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:80)
at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:111)
at scala.collection.immutable.List.foldLeft(List.scala:84)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:80)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:72)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:72)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:36)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:36)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:34)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:133)
at org.apache.spark.sql.cassandra.CassandraSQLContext.cassandraSql(CassandraSQLContext.scala:70)
at org.apache.spark.sql.cassandra.CassandraSQLContext.sql(CassandraSQLContext.scala:73)
at com.drishti.base.ConfigurationUtil$.getConfigurationTableData(ConfigurationUtil.scala:12)
at com.drishti.ameyodbconnector.UpdateCustomer$.main(UpdateCustomerInBTTC.scala:74)
at com.drishti.ameyodbconnector.UpdateCustomer.main(UpdateCustomerInBTTC.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:559)
Caused by: com.datastax.spark.connector.util.ConfigCheck$ConnectorConfigurationException: Invalid Config Variables
Only known spark.cassandra.* variables are allowed when using the Spark Cassandra Connector.
spark.cassandra.connection.ssl.keyStore.password is not a valid Spark Cassandra Connector variable.
No likely matches found.
spark.cassandra.connection.ssl.keyStore.path is not a valid Spark Cassandra Connector variable.
No likely matches found.
at com.datastax.spark.connector.util.ConfigCheck$.checkConfig(ConfigCheck.scala:50)
at com.datastax.spark.connector.cql.CassandraConnectorConf$.apply(CassandraConnectorConf.scala:256)
at org.apache.spark.sql.cassandra.CassandraSourceRelation$.apply(CassandraSourceRelation.scala:252)
at org.apache.spark.sql.cassandra.CassandraCatalog.org$apache$spark$sql$cassandra$CassandraCatalog$$buildRelation(CassandraCatalog.scala:41)
at org.apache.spark.sql.cassandra.CassandraCatalog$$anon$1.load(CassandraCatalog.scala:26)
at org.apache.spark.sql.cassandra.CassandraCatalog$$anon$1.load(CassandraCatalog.scala:23)
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3524)
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2317)
... 54 more
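
For context, below is a minimal sketch of how these SSL properties are typically set on a SparkConf. All host names, paths, and passwords are placeholders; note that the two keyStore.* keys are exactly the ones ConfigCheck rejects above, which suggests they are only recognized by later connector releases.

import org.apache.spark.{SparkConf, SparkContext}

// All values below are placeholders for illustration only.
val conf = new SparkConf()
  .setAppName("cassandra-ssl-example")
  .set("spark.cassandra.connection.host", "10.0.0.1")
  .set("spark.cassandra.connection.ssl.enabled", "true")
  // Server certificate validation via a trust store:
  .set("spark.cassandra.connection.ssl.trustStore.path", "/path/to/truststore.jks")
  .set("spark.cassandra.connection.ssl.trustStore.password", "truststore-password")
  // Client certificate authentication; these are the keys rejected in the
  // log above, so they appear to require a newer connector version:
  .set("spark.cassandra.connection.ssl.keyStore.path", "/path/to/keystore.jks")
  .set("spark.cassandra.connection.ssl.keyStore.password", "keystore-password")

val sc = new SparkContext(conf)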

Environment

cassandra - 3.7.0
cassandra-driver-core - 3.0.3
spark-cassandra-connector - 1.6.0
spark-core - 1.6.2
spark-sql - 1.6.2
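
For reproduction, a build.sbt sketch pinning the versions listed above (Scala 2.10 is assumed here, as is typical for Spark 1.6.x builds; adjust to your setup):

// Versions taken from the Environment list above.
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  "org.apache.spark"       %% "spark-core"                % "1.6.2" % "provided",
  "org.apache.spark"       %% "spark-sql"                 % "1.6.2" % "provided",
  "com.datastax.spark"     %% "spark-cassandra-connector" % "1.6.0",
  "com.datastax.cassandra" %  "cassandra-driver-core"     % "3.0.3"
)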

Pull Requests

None

Status

Assignee

Unassigned

Reporter

Samarth Goel

Labels

Reviewer

None

Reviewer 2

None

Tester

None

Pull Request

None

Time tracking

8h

Priority

Critical