Site description: article viewed 213 times.

1 map + reduceByKey

    sparkContext.textFile("hdfs://ifeng:9000/hdfsapi/wc.txt")
      .flatMap(_.split(","))
      .map((_, 1))
      .reduceByKey(_ + _)
      .collect()

2 countByValue instead of map + reduceByKey

    val RDDfile = sparkContext.textFile("hdfs://ifeng:9
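The two excerpted approaches compute the same counts: `map((_, 1)).reduceByKey(_ + _)` sums ones per key, while `countByValue` counts occurrences of each element directly. A minimal local sketch of that equivalence, using plain Scala collections in place of an RDD (the sample words and function names here are assumptions, not from the post):

```scala
object WordCountSketch {
  // Local analogue of map((_, 1)).reduceByKey(_ + _):
  // pair each word with 1, group by word, sum the ones per group
  def viaReduceByKey(words: Seq[String]): Map[String, Int] =
    words.map((_, 1)).groupBy(_._1).map { case (w, ps) => (w, ps.map(_._2).sum) }

  // Local analogue of countByValue:
  // group identical elements and take each group's size
  def viaCountByValue(words: Seq[String]): Map[String, Int] =
    words.groupBy(identity).map { case (w, ws) => (w, ws.size) }

  def main(args: Array[String]): Unit = {
    // Words as they would come out of flatMap(_.split(","))
    val words = "spark,hadoop,spark,flink".split(",").toSeq
    println(viaReduceByKey(words))
    println(viaCountByValue(words))
  }
}
```

On an actual RDD, `countByValue` returns the result to the driver as a `Map`, so unlike `reduceByKey` no trailing `collect()` is needed.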
- Link: https://blog.csdn.net/weixin_39381833/article/details/108349646
- Title: Implementing WordCount with Spark Operators (WordCount via combineByKey) - CSDN Blog
- Site: blog.csdn.net
- Bookmarked: 2038 times
- Tags: WordCount via combineByKey
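The post's tagged topic is WordCount via `combineByKey`, whose body is cut off in this excerpt. `combineByKey` takes three functions: create a combiner from the first value seen for a key, merge further values into the partition-local combiner, and merge combiners across partitions; on Spark the call would look like `.map((_, 1)).combineByKey(v => v, (c: Int, v: Int) => c + v, (c1: Int, c2: Int) => c1 + c2)`. A hedged local simulation of that per-key combine logic (single "partition", sample data assumed):

```scala
object CombineByKeySketch {
  // The three functions combineByKey needs for WordCount over (word, 1) pairs
  val createCombiner: Int => Int = v => v                      // first value for a key starts the count
  val mergeValue: (Int, Int) => Int = (c, v) => c + v          // fold another value into the running count
  val mergeCombiners: (Int, Int) => Int = (c1, c2) => c1 + c2  // merge counts from different partitions

  // Local simulation of what combineByKey does within one partition
  def wordCount(pairs: Seq[(String, Int)]): Map[String, Int] =
    pairs.foldLeft(Map.empty[String, Int]) { case (acc, (w, v)) =>
      acc.get(w) match {
        case None    => acc + (w -> createCombiner(v)) // first occurrence of this key
        case Some(c) => acc + (w -> mergeValue(c, v))  // key seen before in this partition
      }
    }

  def main(args: Array[String]): Unit = {
    val words = "spark,hadoop,spark,flink".split(",").toSeq
    println(wordCount(words.map((_, 1))))
  }
}
```

With multiple partitions, Spark would apply `mergeCombiners` to the per-partition maps; for WordCount all three functions reduce to integer addition, which is why `reduceByKey(_ + _)` is the simpler equivalent.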