cocdkl commented on issue #406: URL: https://github.com/apache/incubator-livy/issues/406#issuecomment-1556358335
Thank you for taking the time to help me with this. I configured the following in livy-env.sh:

SPARK_HOME=/home/cocdkl/soft/spark-3.2.4-bin-hadoop2.7
SPARK_CONF_DIR=/home/cocdkl/soft/spark-3.2.4-bin-hadoop2.7/conf

`python -V` reports version 2.7.16. Running the `pyspark` command directly from the Spark installation works fine. I also tried modifying the /pyspark.zip/pyspark/find_spark_home.py script, hardcoding its return value to my local SPARK_HOME directory, but other errors still occur.
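As an aside, rather than hardcoding the return value inside pyspark.zip, it may be enough to make sure SPARK_HOME is exported in the environment of the process that launches the Spark driver, since PySpark's home-finding logic prefers the environment variable when it is set. Below is a minimal sketch of that lookup order (an illustration, not the actual find_spark_home.py code; the fallback path is the directory from this comment):

```python
import os

def find_spark_home(fallback="/home/cocdkl/soft/spark-3.2.4-bin-hadoop2.7"):
    # Prefer the SPARK_HOME environment variable when it is set;
    # otherwise fall back to a fixed local path (an assumption for this sketch).
    return os.environ.get("SPARK_HOME", fallback)

# If the launching environment exports SPARK_HOME, the env var wins:
os.environ["SPARK_HOME"] = "/home/cocdkl/soft/spark-3.2.4-bin-hadoop2.7"
print(find_spark_home())
```

If Livy's session processes do not inherit the variables set in livy-env.sh, the hardcoded fallback would never help, which could explain why patching the script still produces other errors.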
