2022-03-07

# PySpark environment

The versions must be aligned: the Spark distribution, the Python interpreter, and the PySpark package all have to be compatible with one another. A short sketch for checking the alignment follows the Related Post list below.

Environment variables to set:

- JAVA_HOME=/usr/local/jdk1.8.0_11
- HADOOP_CONF_DIR=/cloud/dahua/spark-2.4.4-bin-hadoop2.7/conf
- SPARK_HOME=/usr/local/spark-2.4.4-bin-hadoop2.7
- SCALA_HOME=/usr/local/scala-2.11.8

# Related Post

1. Spark usage summary
2. Spark interactive tools
3. Common Spark errors
4. dataframe
5. rdd
6. Spark modules
7. Spark configuration
8. Storage media supported by Spark
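Below is a minimal verification sketch, assuming the pip-installed `pyspark` package is 2.4.4 to match the Spark distribution above; the paths come from the environment variables in this post, and the `env-check` application name is an arbitrary choice.

```python
import os
import sys

# Set the variables from this post before the JVM is launched
# (adjust the paths to your installation).
os.environ["JAVA_HOME"] = "/usr/local/jdk1.8.0_11"
os.environ["SPARK_HOME"] = "/usr/local/spark-2.4.4-bin-hadoop2.7"
os.environ["HADOOP_CONF_DIR"] = "/cloud/dahua/spark-2.4.4-bin-hadoop2.7/conf"

import pyspark
from pyspark.sql import SparkSession

# The Python interpreter is part of the alignment too.
print("Python version:", sys.version)

# The two Spark version strings should agree (2.4.4 here); a mismatch
# between the pip-installed pyspark package and the Spark distribution
# is a common source of startup errors.
print("PySpark package version:", pyspark.__version__)

spark = (
    SparkSession.builder
    .master("local[*]")
    .appName("env-check")
    .getOrCreate()
)
print("Spark runtime version:", spark.version)
spark.stop()
```

Note that Spark 2.4.x predates Python 3.8 support, so with this setup the interpreter should be Python 3.7 or earlier; if any of the printed versions disagree, fix the mismatch before running real jobs.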