Download the Spark compile plugin (scala-maven-plugin): https://blog.csdn.net/bababuzaijia/article/details/117742080
mvn clean scala:compile compile package
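The blog linked above covers wiring the scala-maven-plugin into the build; a minimal sketch of the relevant pom.xml fragment (the version number here is an assumption, pick one matching your Scala and JDK versions):

```xml
<!-- Sketch: scala-maven-plugin so Maven compiles .scala sources -->
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <version>4.8.1</version> <!-- assumption: choose a version compatible with your Scala -->
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>testCompile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With this in place, `mvn clean scala:compile compile package` produces the jar to submit to the cluster.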
Running the program:
package com.buwenbuhuo.spark

import org.apache.spark.{SparkConf, SparkContext}

/**
 * @author 不温卜火
 * @create 2020-07-19 14:26
 * MyCSDN: https://buwenbuhuo.blog.csdn.net/
 */
object WordCount {
  def main(args: Array[String]): Unit = {
    // 1. Create the SparkConf object and set the app name
    val conf: SparkConf = new SparkConf().setAppName("WordCount")
    // 2. Create the SparkContext object
    val sc = new SparkContext(conf)
    // 3. Use sc to create an RDD and run the transformations and the action
    sc.textFile("/input")
      .flatMap(_.split(" "))
      .map((_, 1))
      .reduceByKey(_ + _)
      .saveAsTextFile("/result")
    // 4. Close the connection
    sc.stop()
  }
}
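To see what the RDD chain above computes without a cluster, the same flatMap/map/reduce pipeline can be run on plain Scala collections (using `groupBy` plus a sum as a stand-in for `reduceByKey`; input strings are illustrative):

```scala
object WordCountLocalDemo {
  def main(args: Array[String]): Unit = {
    val lines = List("hello spark", "hello world")
    // Same chain as the RDD version, on ordinary Scala collections:
    val counts = lines
      .flatMap(_.split(" "))                          // split lines into words
      .map((_, 1))                                    // pair each word with 1
      .groupBy(_._1)                                  // stand-in for reduceByKey
      .map { case (w, ps) => (w, ps.map(_._2).sum) }  // sum the 1s per word
    println(counts.toList.sortBy(_._1))               // List((hello,2), (spark,1), (world,1))
  }
}
```

The only difference in the Spark version is that the data and the aggregation are distributed; the per-record logic is identical.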
Exception when running the program: VerifyError — the compiled Scala/Spark version does not match the SDK (JDK) in use.
https://blog.csdn.net/qq_42166929/article/details/109301753
https://mvnrepository.com/artifact/org.apache.spark/spark-core
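A quick way to check which Scala and Java versions the program is actually running with (useful when chasing a VerifyError-style mismatch; the object name is illustrative):

```scala
object VersionCheck {
  def main(args: Array[String]): Unit = {
    // Print the Scala version this program was built against
    println(s"Scala version: ${scala.util.Properties.versionNumberString}")
    // Print the JVM version it is running on
    println(s"Java version:  ${System.getProperty("java.version")}")
  }
}
```

Compare these against the Scala suffix of the spark-core artifact (e.g. `spark-core_2.12`) listed on mvnrepository.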
Fix for errors when running locally:
https://blog.csdn.net/love666666shen/article/details/78812622
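When launching from the IDE rather than via spark-submit, one common cause of local-run errors is a missing master URL. A hedged sketch of the conf change (an assumption about the error's cause, see the linked post for other cases):

```scala
// Sketch: run in local mode from the IDE instead of against a cluster
val conf = new SparkConf()
  .setAppName("WordCount")
  .setMaster("local[*]")  // use all local cores; remove before cluster submission
```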
