The Connector relies on the org.apache.commons.configuration.ConfigurationException class, but it only gets commons-configuration transitively through its hadoop-2.7 dependencies. On a Spark cluster built against Hadoop 3.1 this class isn't on the classpath, since Hadoop 3 moved to the repackaged commons-configuration2 library and no longer pulls in the original commons-configuration artifact.
I'm new to Scala, but if dependency conventions work the same as in Java, the connector should declare commons-configuration as a direct dependency rather than relying on it transitively. I don't know enough SBT to check for other undeclared dependencies myself. If there are other issues that make the connector knowingly or intentionally incompatible with Hadoop 3, ideally that should be documented as well.
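For illustration, a minimal sketch of what declaring the dependency explicitly might look like in the connector's build.sbt. The coordinates are the real Maven coordinates for commons-configuration, but the version shown (1.10, the last 1.x release) is an assumption; the connector should pin whichever version it is actually compiled against:

```scala
// build.sbt (sketch, not the connector's actual build file)
// Declare commons-configuration directly instead of relying on the
// transitive copy that hadoop-2.7 happens to bring in.
libraryDependencies += "commons-configuration" % "commons-configuration" % "1.10"  // version is an assumption
```

As a diagnostic aid, recent sbt versions (1.4+) ship a built-in `dependencyTree` task, so running `sbt dependencyTree` should show which artifacts arrive only transitively through the Hadoop dependency and would therefore disappear on a Hadoop 3 cluster.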