Install and Configure Spark

Download Spark
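As a sketch, assuming a Spark 3.x build prebuilt for Hadoop 3 (the exact version and target directory are assumptions; pick your release from the Apache archive):

```bat
:: Download a prebuilt Spark package and extract it, e.g. to D:\
:: (curl and tar ship with Windows 10 and later).
curl -LO https://archive.apache.org/dist/spark/spark-3.3.2/spark-3.3.2-bin-hadoop3.tgz
tar -xzf spark-3.3.2-bin-hadoop3.tgz -C D:\
```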

Configure Spark
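A minimal sketch of the environment variables, assuming Spark was extracted to D:\spark-3.3.2-bin-hadoop3 (values set with setx only take effect in newly opened terminals):

```bat
:: SPARK_HOME points at the extracted directory (path is an assumption).
setx SPARK_HOME "D:\spark-3.3.2-bin-hadoop3"
:: Add the bin directory so spark-shell / spark-submit resolve on the command line.
setx Path "%Path%;%SPARK_HOME%\bin"
```

Note that `setx` truncates values longer than 1024 characters; if your Path is long, edit it through System Properties → Environment Variables instead.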

Integrate with Hadoop
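On Windows, Spark also needs the winutils helper binaries. A sketch, assuming winutils.exe and hadoop.dll for your Hadoop version have been placed under D:\hadoop\bin (the path is an assumption):

```bat
:: HADOOP_HOME must be the parent of bin\winutils.exe, not the bin directory itself.
setx HADOOP_HOME "D:\hadoop"
setx Path "%Path%;%HADOOP_HOME%\bin"
```

Keeping hadoop.dll next to winutils.exe (some guides copy it into C:\Windows\System32) and installing the matching Microsoft Visual C++ runtime are the usual fixes for `ExitCodeException exitCode=-1073741515` (0xC0000135, "DLL not found") listed under issues below.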

Test
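To verify the setup, something like the following in a freshly opened terminal (SparkPi ships with the Spark distribution):

```bat
:: Confirm spark-shell resolves from Path and reports the expected version.
spark-shell --version
:: Run a bundled example job locally; it should print a line like "Pi is roughly 3.14".
run-example SparkPi 10
```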

Dealing with Containers

Create a Spark Project in IDEA
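For a Maven-based project, the main pitfall is Scala version drift. A hedged pom.xml fragment (all version numbers are assumptions; match them to your Spark distribution and to the Scala SDK configured in IDEA):

```xml
<properties>
  <!-- Assumed versions; keep scala.version in sync with the IDEA Scala SDK. -->
  <scala.version>2.12.17</scala.version>
  <scala.binary.version>2.12</scala.binary.version>
  <spark.version>3.3.2</spark.version>
</properties>

<dependencies>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <!-- Spark artifacts are suffixed with the Scala binary version. -->
    <artifactId>spark-core_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
    <!-- provided keeps Spark out of the fat jar when submitting to a cluster. -->
    <scope>provided</scope>
  </dependency>
</dependencies>
```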

Issues:

  • Caused by: ExitCodeException exitCode=-1073741515
  • Permission denied: user=Administrator, access=EXECUTE, inode="/tmp"
  • HADOOP_HOME and hadoop.home.dir are unset.
  • Ports are not available: listen tcp 0.0.0.0:50070: bind: An attempt was made to access a socket in a way forbidden by its access permissions

https://www.cnblogs.com/jasongrass/p/13726009.html
https://blog.csdn.net/lly1122334/article/details/113930489
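The 50070 bind failure is usually caused by Windows reserving that port in an excluded range (Hyper-V / WinNAT grabs blocks of ports, often after installing Docker Desktop). The linked posts describe roughly this workaround, run from an Administrator prompt:

```bat
:: Show which TCP port ranges Windows has reserved.
netsh interface ipv4 show excludedportrange protocol=tcp
:: Restarting the NAT service re-rolls the reserved ranges, which usually frees 50070.
net stop winnat
net start winnat
```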

  • InvalidClassException: scala.collection.mutable.WrappedArray$ofRef; local class incompatible: stream classdesc serialVersionUID = 3456489343829468865, local class serialVersionUID = 1028182004549731694
    • Check that the Scala versions in IDEA match:
      • the Scala SDK version in the project libraries
      • the Scala version declared in pom.xml
