Cannot save UDT columns back to C* from rows with UDTs read into DataFrames
Description
Easy to reproduce: try copying a table containing UDTs to another table with the same schema.
val cc = new CassandraSQLContext(sc)
cc.setKeyspace("sql_test")
val result = cc
  .read
  .format("org.apache.spark.sql.cassandra")
  .options(
    Map(
      "table" -> "objects",
      "keyspace" -> "sql_test"
    )
  )
  .load()
  .write
  .format("org.apache.spark.sql.cassandra")
  .options(
    Map(
      "table" -> "objects_copy",
      "keyspace" -> "sql_test"
    )
  ).save()
Cause: com.datastax.spark.connector.types.TypeConversionException: Cannot convert object [foo,[thermostat,WrappedArray()],0] of type class org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema to com.datastax.spark.connector.UDTValue.
where the `objects` table contains UDT columns.
We could fix this by automatically converting GenericRowWithSchema instances to connector UDTValue instances.
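A minimal sketch of such a converter, assuming UDTValue can be constructed from parallel sequences of column names and values (the exact factory signature may differ across connector versions, and `toUdtValue`/`convertValue` are hypothetical helper names, not connector API):

```scala
import org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema
import com.datastax.spark.connector.UDTValue

object UdtRowConverter {

  // Recursively convert any nested GenericRowWithSchema values,
  // including those inside collections, into UDTValues.
  private def convertValue(v: Any): AnyRef = v match {
    case row: GenericRowWithSchema => toUdtValue(row)
    case seq: Seq[_]               => seq.map(convertValue)
    case other                     => other.asInstanceOf[AnyRef]
  }

  // Build a UDTValue from the row's schema field names and its values.
  def toUdtValue(row: GenericRowWithSchema): UDTValue = {
    val names  = row.schema.fieldNames.toIndexedSeq
    val values = row.toSeq.map(convertValue).toIndexedSeq
    UDTValue(names, values)
  }
}
```

The connector's type converter for UDTValue could then accept GenericRowWithSchema as an input type and route it through a conversion like this, instead of throwing TypeConversionException.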
Environment
None
Pull Requests
None
Activity
jharnad October 29, 2015 at 1:54 PM
Can we please plan to include this in the next release?