Status: Fixed
Assignee: Jarek Grabowski
Reporter: Alexey Ott (Deactivated)
Created: June 16, 2020 at 1:24 PM
Updated: September 7, 2020 at 4:06 PM
Resolved: September 7, 2020 at 4:06 PM
According to the SPIP, we should be able to run a

USE catalog

command to select the default catalog. But when I execute it, I get the following error:

scala> spark.sql("USE cass").show
org.apache.spark.sql.connector.catalog.CatalogNotFoundException: Catalog 'Catalog cass For Cassandra Cluster At {10.101.34.176:9042, 10.101.34.94:9042} ' plugin class not found: spark.sql.catalog.Catalog cass For Cassandra Cluster At {10.101.34.176:9042, 10.101.34.94:9042} is not defined
  at org.apache.spark.sql.connector.catalog.Catalogs$.load(Catalogs.scala:51)
  at org.apache.spark.sql.connector.catalog.CatalogManager.$anonfun$catalog$1(CatalogManager.scala:52)
  at scala.collection.mutable.HashMap.getOrElseUpdate(HashMap.scala:86)
  at org.apache.spark.sql.connector.catalog.CatalogManager.catalog(CatalogManager.scala:52)
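The error text suggests what is going wrong: Spark builds the plugin-class config key as spark.sql.catalog.<name>, but the name it plugs in here is the catalog's descriptive string rather than the registered alias cass, so the derived key matches nothing. A minimal sketch of that lookup (the config-key pattern comes from the error message above; the map contents and the CassandraCatalog class name are illustrative assumptions, not Spark's actual code):

```scala
// Sketch (assumption, not Spark internals): "USE <name>" resolves a catalog
// by looking up the config key "spark.sql.catalog.<name>". If the catalog
// reports a descriptive name instead of its registered alias, the derived
// key never matches the configured one.
def catalogConfigKey(name: String): String = s"spark.sql.catalog.$name"

// Hypothetical session config: the catalog was registered under alias "cass".
val registered = Map(
  "spark.sql.catalog.cass" ->
    "com.datastax.spark.connector.datasource.CassandraCatalog")

// Lookup via the alias succeeds:
assert(registered.contains(catalogConfigKey("cass")))

// Lookup via the descriptive name (as in the error above) fails:
assert(!registered.contains(catalogConfigKey(
  "Catalog cass For Cassandra Cluster At {10.101.34.176:9042, 10.101.34.94:9042}")))
```

Under this reading, the fix would be for the connector's catalog to report its registered alias as its name, so the round trip back to spark.sql.catalog.cass succeeds.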
although I am able to list keyspaces in the given catalog:

scala> spark.sql("SHOW NAMESPACES FROM cass").show
+--------------------+
|           namespace|
+--------------------+
|system_virtual_sc...|
|           baselines|
|                 scc|
|           OpsCenter|
|          dse_leases|
|       system_traces|
....