Spark fails to start with: failed to launch: nice -n 0 /soft/spark/bin/spark-class org.apache.spark.deploy.worker

Getting straight to the point, the error is the one shown in the title. I searched around a lot; most people said to configure JAVA_HOME in the root user's .bashrc, but after trying that the error was unchanged. In the end, after asking someone more experienced, it turned out the JAVA_HOME path needs to be added in /spark/sbin/spark-config.sh. After saving the change and starting Spark again, the workers launched successfully.
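A minimal sketch of the fix described above, assuming the JDK lives at /soft/jdk (the exact path is an assumption; substitute your own installation directory, e.g. the value of readlink -f $(which java) minus /bin/java):

```shell
# Append to $SPARK_HOME/sbin/spark-config.sh
# (e.g. /soft/spark/sbin/spark-config.sh).
# The standalone start scripts launch worker daemons over SSH in
# non-interactive shells, which typically do not read .bashrc --
# a likely reason setting JAVA_HOME there had no effect.
export JAVA_HOME=/soft/jdk          # assumed JDK install path
export PATH=$JAVA_HOME/bin:$PATH

# Then restart the cluster:
#   $SPARK_HOME/sbin/stop-all.sh
#   $SPARK_HOME/sbin/start-all.sh
```

If you run a multi-node standalone cluster, the same edit must be present on every worker host, since each daemon sources its local copy of spark-config.sh.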