Typo: value countBykey is not a member of org.apache.spark.rdd.RDD[(String, Int)]
2017-06-01 11:01
Today I wrote a single line of code that looked trivially simple, yet it threw an error. It turned out to be an embarrassingly basic mistake: I had gotten the case wrong and typed countBykey instead of countByKey. Spark operator names are case-sensitive, so mixing up the case can produce the error below.
scala> sc.textFile("E:\\eventype").map(_.split("\\|")).map(x=>(x(0)+"|"+x(1),1)).countBykey()
<console>:23: error: value countBykey is not a member of org.apache.spark.rdd.RDD[(String, Int)]
       sc.textFile("E:\\eventype").map(_.split("\\|")).map(x=>(x(0)+"|"+x(1),1)).countBykey()
                                                                                 ^
The correct version is:
scala> val eventRDD = sc.textFile("E:\\eventype").map(_.split("\\|")).map(x=>(x(0)+"|"+x(1),1)).countByKey().map{ line =>
  val cell = line._1.split("\\|")(0)
  val eventype = line._1.split("\\|")(1)
  val count = line._2
  (cell, eventype, count)
}
Since multi-line input in spark-shell on Windows can be problematic, you can also write it all on one line:
scala> val eventRDD = sc.textFile("E:\\eventype").map(_.split("\\|")).map(x=>(x(0)+"|"+x(1),1)).countByKey().map{line => val cell= line._1.split("\\|")(0);val eventype=line._1.split("\\|")(1);val count = line._2;(cell,eventype,count)}
If you keep the line breaks, the shell echoes them back.
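Since countByKey is the operator at the center of this error, its semantics can be sketched outside Spark. The following is a minimal plain-Scala simulation (not Spark itself; the sample lines are hypothetical, in the same "cell|eventype" format as the post): countByKey returns a map from each key to the number of pairs carrying that key.

```scala
// Minimal plain-Scala sketch of what RDD.countByKey does:
// group the (key, value) pairs by key and count the pairs per key.
object CountByKeyDemo {
  def countByKey[K, V](pairs: Seq[(K, V)]): Map[K, Long] =
    pairs.groupBy(_._1).map { case (k, vs) => (k, vs.size.toLong) }

  def main(args: Array[String]): Unit = {
    // Hypothetical input lines in the same "cell|eventype" format.
    val lines = Seq("cellA|call", "cellA|call", "cellB|sms")
    val pairs = lines.map(_.split("\\|")).map(x => (x(0) + "|" + x(1), 1))
    // Two occurrences of "cellA|call", one of "cellB|sms".
    println(countByKey(pairs))
  }
}
```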
To learn Scala and Spark programming, I recommend a simple Windows setup so you can get started as fast as possible. Just download a Spark package, unpack it, set the corresponding SPARK_HOME and Path environment variables, and you can try out your programs directly from cmd or PowerShell. I wrote my code in UltraEdit; if you use an IDE you probably won't run into the problem above, so each approach has its trade-offs.
My configuration:
SPARK_HOME=F:\spark-1.5.2-bin-hadoop2.3
Path=%SPARK_HOME%\bin;
Open PowerShell and type spark-shell to get started.
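One more detail worth noting in the snippets above: String.split takes a regular expression, which is why the delimiter is written as "\\|" rather than "|". A bare "|" is regex alternation and matches the empty string, so it splits the input into individual characters. A quick sketch:

```scala
// "|" is regex alternation; on its own it matches the empty string,
// so split("|") breaks the string into one element per character.
object SplitDemo {
  def main(args: Array[String]): Unit = {
    println("cellA|call".split("\\|").toList) // List(cellA, call)
    println("cellA|call".split("|").length)   // 10: one element per character
  }
}
```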