CassandraSparkExtensions crash with Spark 3.1.0 & Databricks Runtime 7.x

Description

If I use DirectJoin on Spark 3.1.0-SNAPSHOT, or on Databricks Runtime 7.x, like this:
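(The exact original snippet is not preserved here; the following is a minimal sketch of the kind of query that triggers the problem, assuming a hypothetical Cassandra table test.kv with partition key k.)

    import org.apache.spark.sql.SparkSession

    // Session with the Cassandra extensions that enable the DirectJoin optimization
    val spark = SparkSession.builder()
      .config("spark.sql.extensions", "com.datastax.spark.connector.CassandraSparkExtensions")
      .config("spark.cassandra.connection.host", "127.0.0.1")
      .getOrCreate()

    // Cassandra table read through the connector's DataSource
    val cassandraTable = spark.read
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "test", "table" -> "kv"))
      .load()

    // Small DataFrame of keys; a join on the partition key is eligible for DirectJoin
    val keys = spark.range(1, 100).withColumnRenamed("id", "k")
    val joined = keys.join(cassandraTable, Seq("k"))

    joined.explain()  // or joined.show()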

then trying to explain or execute the query throws an exception.

This problem is caused by the refactoring done as part of SPARK-31719, which moved the BuildSide/BuildLeft/BuildRight classes into the org.apache.spark.sql.catalyst.optimizer package.

Unfortunately, after a discussion with the Spark developers, it looks like moving them back would require another big refactoring that is hard to do before the Spark 3.1 freeze. The fix on the SCC side is just changing the imports in two files (a PR will follow shortly).
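As an illustration (a sketch of the kind of change, not the exact PR diff, and assuming the pre-3.1 classes lived under org.apache.spark.sql.execution.joins), the fix amounts to importing the build-side types from their new Spark 3.1 location:

    // Spark 3.0.x location, no longer available after SPARK-31719:
    //   import org.apache.spark.sql.execution.joins.{BuildSide, BuildLeft, BuildRight}

    // Spark 3.1.0+ location:
    import org.apache.spark.sql.catalyst.optimizer.{BuildSide, BuildLeft, BuildRight}

    // Example of code that depends on these types and only needs its import updated
    def describe(side: BuildSide): String = side match {
      case BuildLeft  => "build the left side of the join"
      case BuildRight => "build the right side of the join"
    }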

Environment

Databricks Runtime 7.x (7.0/7.2/7.3), OSS Spark 3.1.0-SNAPSHOT

Activity

Alex Ott
May 1, 2021, 10:58 AM

You can build the connector from the open pull request.

wassim almaaoui
April 27, 2021, 10:52 AM
Edited

We are hitting the same issue when upgrading Spark too, thanks in advance!

Jatin Puri
April 27, 2021, 7:39 AM

Dear team, any update on this? It looks like this is a blocker for the 3.1.1 upgrade :)

Assignee

Jaroslaw Grabowski

Reporter

Alex Ott

Fix versions