Fixed
Details
Assignee: Alex Liu (Deactivated)
Reporter: faraz waseem
Reviewer: Russell Spitzer
Reviewer 2: Jacek
Pull Request:
Components:
Fix versions:
Priority:
Created March 15, 2016 at 10:31 PM
Updated August 21, 2018 at 3:10 PM
Resolved June 8, 2016 at 9:01 PM
We want to add support for using a "ttl" field. For example, the RDD API currently supports a per-row TTL field using
rdd.saveToCassandra("test", "tab", writeConf = WriteConf(ttl = TTLOption.perRow("ttl")))
where rdd is something like the following (the last field is the TTL in seconds):
val rdd = sc.makeRDD(Seq(
  KeyValueWithTTL(1, 1L, "value1", 100),
  KeyValueWithTTL(2, 2L, "value2", 200),
  KeyValueWithTTL(3, 3L, "value3", 300)))
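For reference, a minimal self-contained sketch of this RDD-based write is below; the KeyValueWithTTL field names and the test/tab table schema are assumptions, since the ticket does not spell them out.

import com.datastax.spark.connector._
import com.datastax.spark.connector.writer.{TTLOption, WriteConf}

// Assumed row class; the last field carries the per-row TTL in seconds.
case class KeyValueWithTTL(key: Int, group: Long, value: String, ttl: Int)

val rdd = sc.makeRDD(Seq(
  KeyValueWithTTL(1, 1L, "value1", 100),
  KeyValueWithTTL(2, 2L, "value2", 200),
  KeyValueWithTTL(3, 3L, "value3", 300)))

// TTLOption.perRow("ttl") makes the connector read each row's TTL (in seconds)
// from the "ttl" field of the object being written.
rdd.saveToCassandra("test", "tab", writeConf = WriteConf(ttl = TTLOption.perRow("ttl")))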
We want to do something similar with the DataFrame API, for example:
data.write
  .format("org.apache.spark.sql.cassandra")
  .options(Map(
    "table" -> tableName,
    "keyspace" -> dbName,
    "spark.cassandra.connection.host" -> createStringList(hosts),
    "spark.cassandra.output.metrics" -> "false",
    "spark.cassandra.output.concurrent.writes" -> "20"))
  .mode(SaveMode.Append)
  .save()
We want to use a "ttl" field in our DataFrame and have Cassandra apply it per row.
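As a purely illustrative sketch of the requested behaviour (not an existing connector option at the time of this ticket), the per-row TTL could be exposed as a write option that names the DataFrame column to use. The "ttl" option key below is an assumption, and data, tableName, dbName, and createStringList(hosts) are the same placeholders as in the snippet above.

import org.apache.spark.sql.SaveMode

// Hypothetical API: a "ttl" write option naming the DataFrame column that
// holds each row's TTL in seconds. The option key is an assumption.
data.write
  .format("org.apache.spark.sql.cassandra")
  .options(Map(
    "table" -> tableName,
    "keyspace" -> dbName,
    "spark.cassandra.connection.host" -> createStringList(hosts),
    "ttl" -> "ttl")) // hypothetical: column "ttl" supplies the per-row TTL
  .mode(SaveMode.Append)
  .save()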