Environment:
    OS: Windows 10
    Scala: 2.12.14
    Spark: 3.1.2
    Hadoop: 3.2.0

    Installing Spark itself is simple: just add the dependency to your Maven pom.xml. Note that the _2.12 suffix on the artifact must match the Scala 2.12.x version listed above.

    <dependencies>
        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.12</artifactId>
            <version>3.1.2</version>
        </dependency>
    </dependencies>

    When you actually run a job, however, you may see warnings such as:

    WARNING: An illegal reflective access operation has occurred
    WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/C:/Users/Administrator/.m2/repository/org/apache/spark/spark-unsafe_2.12/3.1.2/spark-unsafe_2.12-3.1.2.jar) to constructor java.nio.DirectByteBuffer(long,int)
    WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform

    The reflective-access warning itself is a harmless JDK 9+ message, but on Windows, Spark will also complain that it cannot find local Hadoop binaries (winutils.exe).
    So we also need to install Hadoop.

    TODO
    Reference: Install Hadoop 3.3.0 on Windows 10 Step by Step Guide
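
    If you only want to run local experiments without a full Hadoop install, a common workaround is to download winutils.exe and point Spark at it before creating the SparkContext. A minimal sketch, assuming winutils.exe has been placed in C:\hadoop\bin (a hypothetical path, adjust to your machine):

        // Tell Spark/Hadoop where the Windows binaries live.
        // C:\hadoop is a hypothetical location; it must contain bin\winutils.exe.
        System.setProperty("hadoop.home.dir", "C:\\hadoop")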

    ================

    An alternative route: install a prebuilt Hadoop + Spark package directly.

    https://blog.csdn.net/hil2000/article/details/90747665
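
    Once the prebuilt package is unpacked and its bin directory is on the PATH, a quick sanity check is to start spark-shell and run a tiny job in the REPL (the job below is just an arbitrary example):

        // spark-shell creates a SparkContext for you, bound to `sc`.
        val rdd = sc.parallelize(1 to 100)
        println(rdd.sum())  // should print 5050.0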

    With the environment working, a minimal Spark application skeleton looks like this:

    import org.apache.spark.{SparkConf, SparkContext}

    object Spark01_WordCount {
      def main(args: Array[String]): Unit = {
        // TODO: establish the connection to the Spark framework
        // (conceptually similar to opening a JDBC Connection)
        val sparkConf = new SparkConf().setMaster("local").setAppName("WordCount")
        val sc = new SparkContext(sparkConf)

        // TODO: execute the business logic

        // TODO: close the connection
        sc.stop()
      }
    }
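
    To make the skeleton do something, the business-logic TODO can be filled in with the classic word count. A sketch, assuming an input file at data/words.txt (hypothetical path) containing whitespace-separated words:

    import org.apache.spark.{SparkConf, SparkContext}

    object Spark01_WordCount {
      def main(args: Array[String]): Unit = {
        val sparkConf = new SparkConf().setMaster("local").setAppName("WordCount")
        val sc = new SparkContext(sparkConf)

        // Read the file, split lines into words, pair each word with 1,
        // then sum the counts per word.
        val wordCounts = sc.textFile("data/words.txt") // hypothetical input path
          .flatMap(_.split(" "))
          .map(word => (word, 1))
          .reduceByKey(_ + _)

        wordCounts.collect().foreach(println)

        sc.stop()
      }
    }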