Config deprecations are not checked when instantiating CassandraConnector
Description
When CassandraConnector is instantiated with `CassandraConnector(SparkContext)`, config deprecations are not checked (nor automatically replaced).
This is an issue when using any Cassandra RDD function because every default implicit `CassandraConnector` in class `RDDFunctions` is instantiated this way.
For example, when `rdd.saveToCassandra(keyspaceName, tableName, columns, WriteConf())` is the first call and no implicit `CassandraConnector` has been initialized yet, config deprecations are neither checked nor replaced, so deprecated settings are silently ignored.
Note, however, that config deprecations are checked by the methods `ReadConf.fromSparkConf()` and `WriteConf.fromSparkConf()`, so the issue only occurs if neither of these methods has been called beforehand.
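For illustration, the kind of deprecation check that `CassandraConnector(SparkContext)` skips can be sketched as below. This is a minimal, self-contained sketch: the `DeprecationCheck` object and the property names are hypothetical, not the connector's actual API.

```scala
// Hypothetical sketch of a config-deprecation check: map deprecated keys
// to their replacements, warn once per hit, and rewrite the configuration.
// Both the object and the property names are illustrative only.
object DeprecationCheck {
  // hypothetical mapping: deprecated property name -> replacement name
  val deprecated: Map[String, String] = Map(
    "spark.cassandra.old.option" -> "spark.cassandra.new.option"
  )

  // Returns a configuration in which every deprecated key has been
  // renamed to its replacement, keeping the original value.
  def checkAndReplace(conf: Map[String, String]): Map[String, String] =
    conf.map { case (key, value) =>
      deprecated.get(key) match {
        case Some(newKey) =>
          System.err.println(s"Warning: '$key' is deprecated; use '$newKey' instead")
          newKey -> value
        case None =>
          key -> value
      }
    }
}
```

Under this sketch, the fix would be to run such a check in the `CassandraConnector(SparkContext)` path as well, rather than only inside `ReadConf.fromSparkConf()` and `WriteConf.fromSparkConf()`.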