We have a DSE environment in production that is divided into two data centers: one datacenter handles SOLR search and the other handles Spark analytics. Both SOLR and Spark run fine, except that they sometimes cause issues because of the amount of data we store.
To reduce the storage load, we are planning to move some analytics data into a completely separate DSE environment. For that, I am planning to run a Spark service in the new DSE environment that will fetch existing data from current production, perform some operations on it, and store the results back in the new DSE environment.
So now the question is: is it a good idea to point two Spark contexts at one Cassandra table? I am aware that they would be completely separate Spark contexts, so it should be fine, but this impacts our production environment, so it's always good to get an opinion, considering that we have more than 5 TB of data and I don't want to take risks.
EDIT: One thing I missed mentioning:
I will install standalone Spark from the repository and configure it separately. I will not use the Spark service that is packaged with the DSE environment.
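For context, here is a minimal configuration sketch of what that standalone job could look like, assuming PySpark with the open-source spark-cassandra-connector. The keyspace, table, and host names (prod_ks.events, analytics_ks.events_agg, prod-dc-node1, new-dc-node1) are placeholders, not real names from my setup:

```python
from pyspark.sql import SparkSession

# Standalone Spark (not the DSE-bundled service) reading from the
# production cluster and writing to the new DSE cluster.
# All host/keyspace/table names below are placeholders.
spark = (
    SparkSession.builder
    .appName("prod-to-analytics-copy")
    # open-source connector; pick the artifact matching your Spark/Scala version
    .config("spark.jars.packages",
            "com.datastax.spark:spark-cassandra-connector_2.12:3.4.1")
    # default connection: the production data center
    .config("spark.cassandra.connection.host", "prod-dc-node1")
    .getOrCreate()
)

# Read the existing production table
df = (spark.read.format("org.apache.spark.sql.cassandra")
      .options(keyspace="prod_ks", table="events")
      .load())

# Example transformation (placeholder for the real analytics work)
aggregated = df.groupBy("event_type").count()

# Write results to the new cluster by overriding the connection host
# per write (the connector accepts connection options on the writer)
(aggregated.write.format("org.apache.spark.sql.cassandra")
 .option("spark.cassandra.connection.host", "new-dc-node1")
 .options(keyspace="analytics_ks", table="events_agg")
 .mode("append")
 .save())
```

The main knob to protect production while reading is the connector's read throttling and input-split settings (e.g. `spark.cassandra.input.fetch.size_in_rows`), so the copy job doesn't saturate the production nodes.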