This is about an issue I ran into when querying Cassandra from Apache Spark.
Plain queries from Spark work fine without any problem, but when the query condition is on the key column I get the error below. I originally tried this against a composite-key column family, and it failed with the same error.
"Caused by: InvalidRequestException(why: empid cannot be restricted by more than one relation if it includes an Equal)"
Column family:
CREATE TABLE emp ( empID int, deptID int, first_name varchar, last_name varchar, PRIMARY KEY (empID));
Column family contents:
empID, deptID, first_name, last_name
104, 15, 'jane', 'smith'
Sample Scala code:
val job = new Job()
job.setInputFormatClass(classOf[CqlPagingInputFormat])
val host: String = "localhost"
val port: String = "9160"
ConfigHelper.setInputInitialAddress(job.getConfiguration(), host)
ConfigHelper.setInputRpcPort(job.getConfiguration(), port)
ConfigHelper.setInputColumnFamily(job.getConfiguration(), "demodb", "emp")
ConfigHelper.setInputPartitioner(job.getConfiguration(), "Murmur3Partitioner")
CqlConfigHelper.setInputColumns(job.getConfiguration(), "empid,deptid,first_name,last_name")
//CqlConfigHelper.setInputCQLPageRowSize(job.getConfiguration(), limit.toString)
CqlConfigHelper.setInputWhereClauses(job.getConfiguration(), "empid='104'")

// Make a new Hadoop RDD
val casRdd = sc.newAPIHadoopRDD(
  job.getConfiguration(),
  classOf[CqlPagingInputFormat],
  classOf[Map[String, ByteBuffer]],
  classOf[Map[String, ByteBuffer]])
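For context on what the RDD above yields: CqlPagingInputFormat hands back each row as a Map[String, ByteBuffer] keyed by column name, so the raw buffers still have to be decoded by hand. Below is a minimal, self-contained sketch of that decoding, run against a hand-built row rather than a live cluster (the buffer layouts assumed here are Cassandra's usual big-endian int and UTF-8 varchar encodings):

```scala
import java.nio.ByteBuffer
import java.nio.charset.StandardCharsets

// Decode a Cassandra int column: 4 big-endian bytes.
def decodeInt(buf: ByteBuffer): Int = buf.duplicate().getInt

// Decode a Cassandra varchar column: UTF-8 bytes.
def decodeVarchar(buf: ByteBuffer): String = {
  val dup = buf.duplicate()
  val bytes = new Array[Byte](dup.remaining())
  dup.get(bytes)
  new String(bytes, StandardCharsets.UTF_8)
}

// Hand-built row standing in for one Map[String, ByteBuffer] from the RDD.
val empidBuf = ByteBuffer.allocate(4)
empidBuf.putInt(104)
empidBuf.flip()

val row: Map[String, ByteBuffer] = Map(
  "empid"      -> empidBuf,
  "first_name" -> ByteBuffer.wrap("jane".getBytes(StandardCharsets.UTF_8))
)

println(decodeInt(row("empid")))          // 104
println(decodeVarchar(row("first_name"))) // jane
```

Using duplicate() before reading keeps each buffer's position untouched, so the same map entry can be decoded more than once.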
I would really appreciate any pointers on how to resolve this, as I have been struggling with it for the past few days.
Thanks