Download the Spark build plugin (see): https://blog.csdn.net/bababuzaijia/article/details/117742080
    mvn clean scala:compile compile package
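The `scala:compile` goal above comes from the scala-maven-plugin; a minimal sketch of how it might be declared in `pom.xml` (the version number is an assumption — check Maven Central for a current one):

```xml
<!-- Sketch of a scala-maven-plugin declaration; the version is an example. -->
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <version>4.5.6</version>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>      <!-- compile Scala sources during mvn compile -->
        <goal>testCompile</goal>  <!-- compile Scala tests during mvn test-compile -->
      </goals>
    </execution>
  </executions>
</plugin>
```

With the goals bound this way, a plain `mvn clean package` also compiles the Scala sources.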

    Running the program:

    package com.buwenbuhuo.spark

    import org.apache.spark.{SparkConf, SparkContext}

    /**
     * @author 不温卜火
     * @create 2020-07-19 14:26
     *
     * MyCSDN: https://buwenbuhuo.blog.csdn.net/
     */
    object WordCount {
      def main(args: Array[String]): Unit = {
        // 1. Create a SparkConf object and set the application name
        val conf: SparkConf = new SparkConf().setAppName("WordCount")
        // 2. Create the SparkContext object
        val sc = new SparkContext(conf)
        // 3. Use sc to build an RDD and run the transformations and the action
        sc.textFile("/input")
          .flatMap(_.split(" "))
          .map((_, 1))
          .reduceByKey(_ + _)
          .saveAsTextFile("/result")
        // 4. Shut down the connection
        sc.stop()
      }
    }
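The flatMap/map/reduceByKey pipeline can be checked without a cluster by running the same logic on plain Scala collections (a sketch; `WordCountLocal` is a hypothetical helper, not part of the program above — `groupMapReduce` stands in for `reduceByKey` and needs Scala 2.13+):

```scala
// Sketch: the word-count pipeline on local Scala collections,
// mirroring the RDD transformations step by step.
object WordCountLocal {
  def wordCount(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split(" "))               // split each line into words
      .map((_, 1))                         // pair each word with a count of 1
      .groupMapReduce(_._1)(_._2)(_ + _)   // sum counts per word (reduceByKey analogue)

  def main(args: Array[String]): Unit = {
    println(wordCount(Seq("hello spark", "hello world")))
    // prints a Map with hello -> 2, spark -> 1, world -> 1
  }
}
```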

    Runtime exception: VerifyError — the Spark version does not match the Scala SDK version
    https://blog.csdn.net/qq_42166929/article/details/109301753
    https://mvnrepository.com/artifact/org.apache.spark/spark-core
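To avoid the mismatch, the `spark-core` artifact's Scala binary suffix must match the project's Scala version; a sketch of the dependency (the `_2.12` suffix and version are examples — pick matching ones from the mvnrepository link above):

```xml
<!-- Example only: the _2.12 suffix must match your project's Scala binary version. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.12</artifactId>
  <version>3.0.0</version>
</dependency>
```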

    Fixing errors when running locally:
    https://blog.csdn.net/love666666shen/article/details/78812622
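One common local-run error is "A master URL must be set in your configuration", since the code above only sets the app name; a configuration sketch for running in-process (an assumption for local debugging — remove `setMaster` when submitting to a cluster via spark-submit):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: local mode runs the job in-process on all local cores,
// without requiring a cluster or spark-submit --master.
object WordCountLocalMode {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("WordCount")
      .setMaster("local[*]") // local mode; drop this for cluster submission
    val sc = new SparkContext(conf)
    // ... transformations and the action as in WordCount above ...
    sc.stop()
  }
}
```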