spark-env.sh configuration

For a "without Hadoop" (Hadoop-free) Spark build, conf/spark-env.sh needs the following setting:

  1. export SPARK_DIST_CLASSPATH=$(/path/to/hadoop/bin/hadoop classpath)
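
A minimal conf/spark-env.sh sketch (the HADOOP_HOME value below is an assumed install path; substitute your own):

  1. # conf/spark-env.sh
  2. export HADOOP_HOME=/opt/hadoop
  3. export SPARK_DIST_CLASSPATH=$(${HADOOP_HOME}/bin/hadoop classpath)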

Copy hive-site.xml and hdfs-site.xml into the spark/conf directory.
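
The sync is a plain copy; the source paths below assume default Hive and Hadoop layouts and should be adjusted:

  1. cp /path/to/hive/conf/hive-site.xml $SPARK_HOME/conf/
  2. cp /path/to/hadoop/etc/hadoop/hdfs-site.xml $SPARK_HOME/conf/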

Startup

  1. ./start-thriftserver.sh --master local --conf spark.driver.memory=1G --name ThriftServer01 --executor-memory 1G --num-executors 2
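
(The original command passed --name twice, once without a value, and a redundant --class; start-thriftserver.sh already submits org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 itself. Also note that --num-executors only takes effect on YARN and is ignored with --master local.)

By default the Thrift server listens on port 10000; it can be overridden at startup with a Hive config property (the port below is just an example):

  1. ./start-thriftserver.sh --master local --hiveconf hive.server2.thrift.port=10001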

Connect with Spark's own bin/beeline, for example as sketched below.
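
A typical connection sketch (localhost, the default port 10000, and the username "hive" are assumed examples):

  1. ./bin/beeline -u jdbc:hive2://localhost:10000 -n hive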