- Forceful shutdown (manual):
Use YARN to kill the Spark app abruptly:
① Open the YARN web UI, click the app, then click "Kill Application"
② yarn app -kill appid
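As a sketch, the command-line kill looks like this (the application id below is a hypothetical placeholder; use the real id printed by the list command):

```shell
# List running applications to find the application id
yarn application -list -appStates RUNNING
# Kill by application id (placeholder id, replace with your own)
yarn application -kill application_1700000000000_0001
```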
Graceful shutdown (automatic):
Call streamingContext.stop()
Key questions: ① When should stop() be called?
An app built with Spark Streaming is a streaming application: it computes 24/7 and is normally never shut down!
It is shut down only when a shutdown is actually needed!
How does the program know when it needs to shut down?
It must be given a signal!
② Where should stop() be called?
Start an extra thread; inside that thread, call stop() once it detects that shutdown is needed!
Signal-based shutdown idea:
Start a separate thread that monitors a signal point; once the signal is received, call stop().
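The monitor-thread idea can be shown in isolation with a minimal sketch that polls for a local marker file in place of the HDFS check in the full app below (the file name and poll interval are illustrative):

```scala
import java.nio.file.{Files, Path}

object SignalDemo {
  // Block until the marker file appears, polling every pollMs milliseconds
  // (a local-filesystem stand-in for the HDFS existence check)
  def waitForSignal(marker: Path, pollMs: Long): Unit = {
    while (!Files.exists(marker)) {
      Thread.sleep(pollMs)
    }
  }

  def main(args: Array[String]): Unit = {
    val marker = Files.createTempDirectory("gclose").resolve("GClose")

    // Simulate an external operator creating the marker file a bit later
    new Thread(() => {
      Thread.sleep(300)
      Files.createFile(marker)
      ()
    }).start()

    waitForSignal(marker, 100) // returns once the signal file exists
    println("signal received, stopping")
  }
}
```

In the real app this polling runs in a daemon thread and the "stop" step is context.stop(), as the code below shows.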
package test1

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.spark.streaming.Durations.seconds
import org.apache.spark.streaming.StreamingContext
import org.junit.Test

import java.net.URI

class GracefullCloseApp {

  @Test
  def test() = {
    val context = new StreamingContext("local[*]", "GClose", seconds(5))
    val ds = context.socketTextStream("hadoop102", 3333)
    val ds1 = ds.map((_, 1))
    ds1.print(1000)

    context.start()

    // Monitor thread: poll HDFS every 5s; once the marker file /GClose
    // appears, stop the app gracefully
    new Thread() {
      setDaemon(true)

      override def run(): Unit = {
        // Returns true while the marker file does NOT yet exist
        def getSignal = {
          val conf = new Configuration
          val fileSystem: FileSystem =
            FileSystem.get(new URI("hdfs://hadoop102:9820"), conf, "atguigu")
          !fileSystem.exists(new Path("/GClose"))
        }

        while (getSignal) {
          Thread.sleep(5000)
        }

        println("Shutting down the system")
        // stopSparkContext = true, stopGracefully = true
        context.stop(true, true)
      }
    }.start()

    context.awaitTermination()
  }
}
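With the app above running, the shutdown signal is sent by creating the marker file it polls for (namenode address and path taken from the code above):

```shell
# Create the zero-length marker file; the monitor thread sees it within ~5s
hdfs dfs -touchz /GClose
```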
