The error is as follows:

    (pyspark) cd12:~ cdarling$ python
    Python 3.7.3 (default, Mar 27 2019, 16:54:48)
    [Clang 4.0.1 (tags/RELEASE_401/final)] :: Anaconda, Inc. on darwin
    Type "help", "copyright", "credits" or "license" for more information.
    >>> from pyspark.sql import SparkSession
    >>> s=SparkSession.builder.getOrCreate()
    19/07/13 14:05:59 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
    19/07/13 14:06:00 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    (the WARN line above is repeated 16 times, once per bind retry)
    19/07/13 14:06:00 ERROR SparkContext: Error initializing SparkContext.
    java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:433)
        at sun.nio.ch.Net.bind(Net.java:425)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
        at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:128)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:558)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1283)
        at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:501)
        at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:486)
        at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989)
        at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
        at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:364)
        at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
        at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
        at java.lang.Thread.run(Thread.java:748)
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/usr/local/Caskroom/miniconda/4.6.14/miniconda3/envs/pyspark/lib/python3.7/site-packages/pyspark/sql/session.py", line 173, in getOrCreate
        sc = SparkContext.getOrCreate(sparkConf)
      File "/usr/local/Caskroom/miniconda/4.6.14/miniconda3/envs/pyspark/lib/python3.7/site-packages/pyspark/context.py", line 367, in getOrCreate
        SparkContext(conf=conf or SparkConf())
      File "/usr/local/Caskroom/miniconda/4.6.14/miniconda3/envs/pyspark/lib/python3.7/site-packages/pyspark/context.py", line 136, in __init__
        conf, jsc, profiler_cls)
      File "/usr/local/Caskroom/miniconda/4.6.14/miniconda3/envs/pyspark/lib/python3.7/site-packages/pyspark/context.py", line 198, in _do_init
        self._jsc = jsc or self._initialize_context(self._conf._jconf)
      File "/usr/local/Caskroom/miniconda/4.6.14/miniconda3/envs/pyspark/lib/python3.7/site-packages/pyspark/context.py", line 306, in _initialize_context
        return self._jvm.JavaSparkContext(jconf)
      File "/usr/local/Caskroom/miniconda/4.6.14/miniconda3/envs/pyspark/lib/python3.7/site-packages/py4j/java_gateway.py", line 1525, in __call__
        answer, self._gateway_client, None, self._fqn)
      File "/usr/local/Caskroom/miniconda/4.6.14/miniconda3/envs/pyspark/lib/python3.7/site-packages/py4j/protocol.py", line 328, in get_return_value
        format(target_id, ".", name), value)
    py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
    : java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
        (same Netty stack trace as above)
    >>>
    (pyspark) cd12:~ cdarling$

Solution: the problem is related to the hosts file. References:

1. A Stack Overflow question on this error
2. A Chinese blog post
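The error message itself suggests setting `spark.driver.bindAddress`, and the hosts-based fix amounts to making the machine's hostname resolve to a local address. A minimal sketch, assuming the hostname is `cd12` (taken from the shell prompt in the log above; yours may differ):

```shell
# Option 1 (the hosts-file fix): make the hostname resolve locally.
# Add a line like the following to /etc/hosts (requires sudo):
#     127.0.0.1   cd12
# Option 2: tell Spark explicitly which address to bind, before launching Python.
export SPARK_LOCAL_IP=127.0.0.1
echo "SPARK_LOCAL_IP=$SPARK_LOCAL_IP"
```

Equivalently, the bind address can be set in code with `SparkSession.builder.config("spark.driver.bindAddress", "127.0.0.1")` before calling `getOrCreate()`.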