Site description: My Spark cluster is built on top of Hadoop. After starting Hadoop and then starting Spark, the launch output reported: spark2: failed to launch org.apache.spark.deploy.worker.Worker. Analysis showed the cause was that Spark's port 7077 was not open. Solution: stop the Spark service, run the command to open the port on every node, then restart Spark; the original post lists the specific commands (truncated in this excerpt).
- Link URL: https://blog.csdn.net/wangxiaotongfan/article/details/46912537
- Link title: Spark startup issue _failed to launch org.apache.spark.deploy.worker.wo - CSDN blog
- Site: blog.csdn.net
- Times bookmarked: 6852
- Tag: failed to launch org.apache.spark.deploy.worker.worker:
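The excerpt says the fix is to open port 7077 on every node, but the exact commands are cut off in this excerpt. A minimal sketch of what such commands typically look like, assuming a CentOS-style node (these are illustrative, not the author's verbatim commands):

```shell
# Illustrative only: the original post's exact commands are truncated in this excerpt.
# On an older CentOS node using iptables, opening Spark's default master port
# might look like (run on every node):
#   iptables -I INPUT -p tcp --dport 7077 -j ACCEPT
#   service iptables save
# On a firewalld-based system:
#   firewall-cmd --permanent --add-port=7077/tcp
#   firewall-cmd --reload
PORT=7077  # Spark standalone master's default port (spark://<master>:7077)
echo "ensure tcp/${PORT} is open on every node, then restart Spark"
```

After the port is open on all nodes, restarting Spark (`sbin/stop-all.sh` then `sbin/start-all.sh` from the Spark home directory) should let the workers register with the master without the "failed to launch" message.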