PySpark environment

Versions must be aligned: the Spark version, the Python version, and the PySpark version all need to be compatible with one another.
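One practical way to keep the versions aligned is to derive the version from the Spark install directory and pin the pip package to it, since PySpark releases are versioned in lockstep with Spark. A minimal sketch, using the install path from this post (adjust to your own layout):

```shell
# Spark install directory as used later in this post (assumption: the
# standard tarball naming scheme spark-<version>-bin-hadoop<hv>)
SPARK_HOME=/usr/local/spark-2.4.4-bin-hadoop2.7

# Extract "2.4.4" from the directory name
spark_version=$(basename "$SPARK_HOME" | sed 's/^spark-//; s/-bin.*//')
echo "Spark build: $spark_version"

# Install the matching PySpark from pip (commented out; run as needed):
# pip install "pyspark==$spark_version"
```

The Python interpreter itself must also be one the Spark release supports (for example, Spark 2.4.x predates Python 3.8 support), so check the release notes for your Spark version.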

Environment variable settings

JAVA_HOME=/usr/local/jdk1.8.0_11

HADOOP_CONF_DIR=/cloud/dahua/spark-2.4.4-bin-hadoop2.7/conf

SPARK_HOME=/usr/local/spark-2.4.4-bin-hadoop2.7

SCALA_HOME=/usr/local/scala-2.11.8
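The four variables above can be collected into a shell snippet, for example in `~/.bashrc` or `$SPARK_HOME/conf/spark-env.sh`. This is a sketch using the paths from this post; the `PATH` line is a common extra step not in the original list, so treat it as optional:

```shell
# Environment for Spark 2.4.4 / Scala 2.11.8 / JDK 1.8 (paths from this post)
export JAVA_HOME=/usr/local/jdk1.8.0_11
export HADOOP_CONF_DIR=/cloud/dahua/spark-2.4.4-bin-hadoop2.7/conf
export SPARK_HOME=/usr/local/spark-2.4.4-bin-hadoop2.7
export SCALA_HOME=/usr/local/scala-2.11.8

# Optional: put the Spark, Scala, and Java tools on PATH
export PATH="$SPARK_HOME/bin:$SCALA_HOME/bin:$JAVA_HOME/bin:$PATH"
```

After sourcing the file, `spark-submit --version` should report Spark 2.4.4 if the paths match a real install.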

Author: Lavine Hu

Posted on 2022-03-07, updated on 2024-04-05
