pyspark writing to MySQL, out of memory — when I run job.py with spark-submit, it keeps saying the file 'pyspark.zip' does not exist (CSDN blog)

Environment: Spark 2.1. When I run job.py with spark-submit, it always says the file pyspark.zip does not exist. I tried ... but it still fails. My run.sh is as follows:

#!/bin/sh
/usr/lib/software/spark/spark-2.1/bin/spark-submit \
  --master yarn-cluster \
  --driver-memory 5G \
  --num-executors 12 \
  ...
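In yarn-cluster mode this error is commonly seen when spark-submit is launched without a consistent SPARK_HOME, so YARN cannot locate $SPARK_HOME/python/lib/pyspark.zip to ship to the containers. A minimal sketch of a run.sh that sets the environment explicitly before submitting — the config paths and the job file name job.py are assumptions, not from the original post:

```shell
#!/bin/sh
# Sketch only: paths below are assumptions; adjust to your installation.
export SPARK_HOME=/usr/lib/software/spark/spark-2.1
export HADOOP_CONF_DIR=/etc/hadoop/conf   # assumed Hadoop client config location

# In Spark 2.x, "--master yarn --deploy-mode cluster" is the current form
# of the older "--master yarn-cluster" syntax.
"$SPARK_HOME"/bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --driver-memory 5G \
  --num-executors 12 \
  job.py
```

With SPARK_HOME exported, spark-submit can find python/lib/pyspark.zip under it and distribute it to the YARN containers.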