
Tutorial: Running a Java Web Application on Kubernetes Locally

ZIP file | 171KB | Updated 2025-09-06
### Key Points

#### Project Background

The project, "java-kubernetes-local", aims to stand up a local Kubernetes cluster that runs a Java web application. Minikube serves as the local Kubernetes environment: it simplifies local deployment by letting developers create and manage a single-node cluster on their own laptop. The build process covers deploying and debugging a Spring Boot application as well as configuring and managing the Kubernetes environment.

#### Tech Stack and Tools

- **Java 15**: the Java version used by the project, giving access to modern language features such as records and pattern matching for `instanceof` (both preview features in Java 15).
- **IntelliJ IDEA 2019.03.8**: the IDE used for development, providing efficient and intelligent code assistance for Java.
- **Docker 19.03.8**: container technology that packages the application and its dependencies so they run identically in any environment.
- **MySQL 5.6**: the database management system that stores the application's data.
- **Maven 3.5.4**: the build and dependency-management tool, handling the build process, documentation generation, and dependency resolution.
- **Minikube v1.19.0**: tool for standing up a local Kubernetes environment; it quickly launches a single-node cluster.
- **kubectl v1.21.0**: the Kubernetes command-line tool for interacting with the cluster, e.g. deploying workloads and checking status.
- **Spring Boot 3.2.1**: an open-source Java framework for rapidly building standalone, production-grade Spring applications. (Note: Spring Boot 3.x requires Java 17 or later, so this version paired with Java 15 should be double-checked against the project's `pom.xml`.)

#### Key Concepts and Components

- **Kubernetes**: an open-source system for automating the deployment, scaling, and management of containerized applications.
- **Minikube**: a lightweight Kubernetes implementation that runs on a laptop.
- **Spring Boot**: a framework for quickly building standalone, production-grade Spring applications.
- **Docker**: an open platform for packaging and distributing applications as lightweight, portable containers.
- **Prometheus**: an open-source monitoring and alerting toolkit, widely used for time-series data.
- **Grafana**: an open-source metrics analysis and visualization tool, commonly paired with Prometheus to display monitoring data.

#### Environment Requirements

Running the project requires:

- A Linux or Windows operating system.
- 2 or more CPU cores.
- At least 2GB of free memory.
- At least 20GB of disk space.

#### Implementation Steps

1. **Environment setup**: install and configure Minikube, Docker, kubectl, and the other tools.
2. **Project setup**: create a new Spring Boot project with Spring Initializr and configure its dependencies.
3. **Development and debugging**: write the Java code in IntelliJ and use Spring Boot's features to develop and debug the application locally.
4. **Containerization**: build a container image of the Spring Boot application with Docker.
5. **Deployment**: deploy the application to the local Kubernetes cluster with the kubectl command-line tool and Minikube.
6. **Monitoring and management**: deploy tools such as Prometheus and Grafana to monitor the state of the cluster and the application.

#### The pom.xml File

The project's `pom.xml` declares the project's dependencies and their versions; its contents are central to building and maintaining the project.

#### Conclusion

The above is a full walkthrough of the "java-kubernetes-local" project, covering its background, tech stack, key components, environment requirements, implementation steps, and configuration files. These points should deepen your understanding of deploying and managing Java applications on Kubernetes, and a successful run of this project offers a reusable reference for similar scenarios.
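#### Example: Dockerfile for the Containerization Step

As a sketch of step 4 above, a minimal Dockerfile for the Spring Boot fat jar produced by `mvn package` could look like the following. The base image and jar name are illustrative assumptions, not taken from the project; adjust them to match the artifactId and version in `pom.xml`:

```dockerfile
# Minimal runtime image for the Spring Boot fat jar built by `mvn package`.
# Jar name is a placeholder -- match it to <artifactId>-<version>.jar in pom.xml.
FROM openjdk:15-jdk-slim
WORKDIR /app
COPY target/app-0.0.1-SNAPSHOT.jar app.jar
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "app.jar"]
```

When targeting Minikube, run `eval $(minikube docker-env)` before `docker build -t demo-app:1.0 .` so the image is built directly into Minikube's Docker daemon and the cluster can use it without pushing to a registry.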

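#### Example: Kubernetes Manifest for the Deployment Step

Step 5 can be sketched with a Deployment plus a NodePort Service. All names, labels, the image tag, and the port are illustrative assumptions, not values from the project:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: demo-app
spec:
  replicas: 1
  selector:
    matchLabels:
      app: demo-app
  template:
    metadata:
      labels:
        app: demo-app
    spec:
      containers:
        - name: demo-app
          image: demo-app:1.0        # image built against Minikube's Docker daemon
          imagePullPolicy: Never     # use the locally built image, skip registry pulls
          ports:
            - containerPort: 8080    # Spring Boot's default HTTP port
---
apiVersion: v1
kind: Service
metadata:
  name: demo-app
spec:
  type: NodePort
  selector:
    app: demo-app
  ports:
    - port: 8080
      targetPort: 8080
```

Apply it with `kubectl apply -f app.yaml`, check the pod with `kubectl get pods`, and open the application via `minikube service demo-app --url`. Setting `imagePullPolicy: Never` is what allows the cluster to run the locally built image instead of attempting a registry pull.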
0.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/xercesImpl-2.12.0.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/aircompressor-0.21.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/algebra_2.12-2.0.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/annotations-17.0.0.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/antlr4-runtime-4.8.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/api-util-1.0.0-M20.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/arrow-format-2.0.0.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/arrow-vector-2.0.0.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/avro-mapred-1.10.2.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/commons-codec-1.15.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/commons-pool-1.5.4.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/compress-lzf-1.0.3.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hive-beeline-2.3.9.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/javax.jdo-3.2.0-m3.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jaxb-runtime-2.3.2.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jersey-client-2.34.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jersey-common-2.34.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jersey-server-2.34.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/leveldbjni-all-1.8.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/metrics-core-4.2.0.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/metrics-json-4.2.0.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/RoaringBitmap-0.9.0.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/antlr-runtime-3.5.2.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/commons-math3-3.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hadoop-client-2.7.4.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hadoop-common-2.7.4.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jackson-core-2.12.3.jar:/home/hadoopmaster
/spark-3.2.1-bin-hadoop2.7/jars/javassist-3.25.0-GA.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jul-to-slf4j-1.7.30.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/protobuf-java-2.5.0.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/snappy-java-1.1.8.4.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/transaction-api-1.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/bonecp-0.8.0.RELEASE.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/commons-crypto-1.1.0.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/commons-digester-1.8.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/commons-lang3-3.12.0.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/curator-client-2.7.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hive-exec-2.3.9-core.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hive-metastore-2.3.9.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jackson-jaxrs-1.9.13.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jakarta.inject-2.6.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/orc-mapreduce-1.6.12.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/scala-xml_2.12-1.2.0.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/shapeless_2.12-2.3.3.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/slf4j-log4j12-1.7.30.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spark-sql_2.12-3.2.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/threeten-extra-1.5.0.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/zookeeper-jute-3.6.2.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/commons-compress-1.21.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/commons-logging-1.1.3.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/curator-recipes-2.7.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hadoop-yarn-api-2.7.4.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hive-shims-0.23-2.3.9.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jcl-over-slf4j-1.7
.30.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/parquet-column-1.12.2.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/parquet-common-1.12.2.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/parquet-hadoop-1.12.2.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/scala-library-2.12.15.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/scala-reflect-2.12.15.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spark-core_2.12-3.2.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spark-hive_2.12-3.2.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spark-repl_2.12-3.2.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spark-tags_2.12-3.2.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spark-yarn_2.12-3.2.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/api-asn1-api-1.0.0-M20.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/breeze-macros_2.12-1.2.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/cats-kernel_2.12-2.1.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/commons-httpclient-3.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/flatbuffers-java-1.9.0.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hive-llap-common-2.3.9.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hive-service-rpc-3.1.2.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hive-storage-api-2.7.2.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jetty-sslengine-6.1.26.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/metrics-graphite-4.2.0.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/netty-all-4.1.68.Final.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/parquet-jackson-1.12.2.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/scala-compiler-2.12.15.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spark-mesos_2.12-3.2.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spark-mllib_2.12-3.2.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spire-util_2.12-0.17.0.jar:
/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/xbean-asm9-shaded-4.20.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/apacheds-i18n-2.0.0-M15.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/arpack_combined_all-0.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/arrow-memory-core-2.0.0.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/commons-beanutils-1.9.4.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/commons-compiler-3.0.16.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/curator-framework-2.7.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/datanucleus-core-4.1.17.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hive-shims-common-2.3.9.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jackson-core-asl-1.9.13.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jackson-databind-2.12.3.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jakarta.ws.rs-api-2.1.6.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-client-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/macro-compat_2.12-1.1.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/parquet-encoding-1.12.2.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spark-graphx_2.12-3.2.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spark-sketch_2.12-3.2.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spark-unsafe_2.12-3.2.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/univocity-parsers-2.9.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/arrow-memory-netty-2.0.0.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/datanucleus-rdbms-4.1.19.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hadoop-annotations-2.7.4.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hadoop-yarn-client-2.7.4.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hadoop-yarn-common-2.7.4.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spark-kvstore_2.12-3.2.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/
jars/spire-macros_2.12-0.17.0.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/commons-collections-3.2.2.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/commons-configuration-1.6.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/datanucleus-api-jdo-4.2.4.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jackson-mapper-asl-1.9.13.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jakarta.servlet-api-4.0.3.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/json4s-ast_2.12-3.7.0-M11.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spark-catalyst_2.12-3.2.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spark-launcher_2.12-3.2.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/audience-annotations-0.5.0.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hive-shims-scheduler-2.3.9.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hive-vector-code-gen-2.3.9.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jackson-annotations-2.12.3.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jakarta.xml.bind-api-2.3.2.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/json4s-core_2.12-3.7.0-M11.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spark-streaming_2.12-3.2.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spire-platform_2.12-0.17.0.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-model-apps-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-model-core-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-model-node-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-model-rbac-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/logging-interceptor-3.12.12.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/mesos-1.4.0-shaded-protobuf.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/osgi-resource-locator-1.0.3.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spark-kubernetes_2.12-3.2.1.jar:/home/hadoopmaster/spa
rk-3.2.1-bin-hadoop2.7/jars/spark-tags_2.12-3.2.1-tests.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/aopalliance-repackaged-2.6.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/htrace-core-3.1.0-incubating.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/istack-commons-runtime-3.0.8.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jakarta.annotation-api-1.3.5.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jakarta.validation-api-2.0.2.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/json4s-scalap_2.12-3.7.0-M11.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-model-batch-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spark-mllib-local_2.12-3.2.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jersey-container-servlet-2.34.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/json4s-jackson_2.12-3.7.0-M11.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-model-common-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-model-events-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-model-policy-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jackson-dataformat-yaml-2.12.3.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jackson-datatype-jsr310-2.11.2.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-model-metrics-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hadoop-yarn-server-common-2.7.4.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spark-network-common_2.12-3.2.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jackson-module-scala_2.12-2.12.3.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-model-discovery-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/parquet-format-structures-1.12.2.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spark-network-shuffle_2.12-3.2.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/apacheds-kerberos-codec-2.0.0-M
15.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hadoop-mapreduce-client-app-2.7.4.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-model-extensions-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-model-networking-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-model-scheduling-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hadoop-mapreduce-client-core-2.7.4.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hadoop-yarn-server-web-proxy-2.7.4.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/jersey-container-servlet-core-2.34.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-model-autoscaling-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-model-flowcontrol-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/scala-collection-compat_2.12-2.1.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/spark-hive-thriftserver_2.12-3.2.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-model-certificates-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-model-coordination-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-model-storageclass-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/scala-parser-combinators_2.12-1.1.2.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hadoop-mapreduce-client-common-2.7.4.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-model-apiextensions-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hadoop-mapreduce-client-shuffle-2.7.4.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/hadoop-mapreduce-client-jobclient-2.7.4.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/kubernetes-model-admissionregistration-5.4.1.jar:/home/hadoopmaster/spark-3.2.1-bin-hadoop2.7/jars/dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar kkk.WordCount Using Spark's default log4j profile: 
org/apache/spark/log4j-defaults.properties 25/06/02 20:17:03 INFO SparkContext: Running Spark version 3.2.1 25/06/02 20:17:04 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 25/06/02 20:17:04 INFO ResourceUtils: ============================================================== 25/06/02 20:17:04 INFO ResourceUtils: No custom resources configured for spark.driver. 25/06/02 20:17:04 INFO ResourceUtils: ============================================================== 25/06/02 20:17:04 INFO SparkContext: Submitted application: WordCount 25/06/02 20:17:04 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0) 25/06/02 20:17:04 INFO ResourceProfile: Limiting resource is cpu 25/06/02 20:17:04 INFO ResourceProfileManager: Added ResourceProfile id: 0 25/06/02 20:17:04 INFO SecurityManager: Changing view acls to: root 25/06/02 20:17:04 INFO SecurityManager: Changing modify acls to: root 25/06/02 20:17:04 INFO SecurityManager: Changing view acls groups to: 25/06/02 20:17:04 INFO SecurityManager: Changing modify acls groups to: 25/06/02 20:17:04 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set() 25/06/02 20:17:05 INFO Utils: Successfully started service 'sparkDriver' on port 41615. 
25/06/02 20:17:05 INFO SparkEnv: Registering MapOutputTracker 25/06/02 20:17:05 INFO SparkEnv: Registering BlockManagerMaster 25/06/02 20:17:05 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information 25/06/02 20:17:05 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up 25/06/02 20:17:05 INFO SparkEnv: Registering BlockManagerMasterHeartbeat 25/06/02 20:17:05 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-68486311-3f69-48e8-8f69-e7afffcf5979 25/06/02 20:17:05 INFO MemoryStore: MemoryStore started with capacity 258.5 MiB 25/06/02 20:17:05 INFO SparkEnv: Registering OutputCommitCoordinator 25/06/02 20:17:06 INFO Utils: Successfully started service 'SparkUI' on port 4040. 25/06/02 20:17:06 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://hadoopmaster:4040 25/06/02 20:17:06 INFO Executor: Starting executor ID driver on host hadoopmaster 25/06/02 20:17:06 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 41907. 
25/06/02 20:17:06 INFO NettyBlockTransferService: Server created on hadoopmaster:41907 25/06/02 20:17:06 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy 25/06/02 20:17:06 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, hadoopmaster, 41907, None) 25/06/02 20:17:06 INFO BlockManagerMasterEndpoint: Registering block manager hadoopmaster:41907 with 258.5 MiB RAM, BlockManagerId(driver, hadoopmaster, 41907, None) 25/06/02 20:17:06 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, hadoopmaster, 41907, None) 25/06/02 20:17:06 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, hadoopmaster, 41907, None) 25/06/02 20:17:08 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 244.0 KiB, free 258.2 MiB) 25/06/02 20:17:09 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 23.4 KiB, free 258.2 MiB) 25/06/02 20:17:09 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on hadoopmaster:41907 (size: 23.4 KiB, free: 258.5 MiB) 25/06/02 20:17:09 INFO SparkContext: Created broadcast 0 from textFile at WordCount.scala:9 Exception in thread "main" org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: file:/home/hadoopmaster/words.txt at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:287) at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:229) at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:315) at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:205) at org.apache.spark.rdd.RDD.$anonfun$partitions$2(RDD.scala:300) at scala.Option.getOrElse(Option.scala:189) at org.apache.spark.rdd.RDD.partitions(RDD.scala:296) at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:49) at org.apache.spark.rdd.RDD.$anonfun$partitions$2(RDD.scala:300) at 
scala.Option.getOrElse(Option.scala:189) at org.apache.spark.rdd.RDD.partitions(RDD.scala:296) at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:49) at org.apache.spark.rdd.RDD.$anonfun$partitions$2(RDD.scala:300) at scala.Option.getOrElse(Option.scala:189) at org.apache.spark.rdd.RDD.partitions(RDD.scala:296) at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:49) at org.apache.spark.rdd.RDD.$anonfun$partitions$2(RDD.scala:300) at scala.Option.getOrElse(Option.scala:189) at org.apache.spark.rdd.RDD.partitions(RDD.scala:296) at org.apache.spark.Partitioner$.$anonfun$defaultPartitioner$4(Partitioner.scala:78) at org.apache.spark.Partitioner$.$anonfun$defaultPartitioner$4$adapted(Partitioner.scala:78) at scala.collection.immutable.List.map(List.scala:293) at org.apache.spark.Partitioner$.defaultPartitioner(Partitioner.scala:78) at org.apache.spark.rdd.PairRDDFunctions.$anonfun$reduceByKey$4(PairRDDFunctions.scala:322) at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112) at org.apache.spark.rdd.RDD.withScope(RDD.scala:414) at org.apache.spark.rdd.PairRDDFunctions.reduceByKey(PairRDDFunctions.scala:322) at kkk.WordCount$.main(WordCount.scala:10) at kkk.WordCount.main(WordCount.scala) 25/06/02 20:17:09 INFO SparkContext: Invoking stop() from shutdown hook 25/06/02 20:17:09 INFO SparkUI: Stopped Spark web UI at http://hadoopmaster:4040 25/06/02 20:17:09 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped! 25/06/02 20:17:09 INFO MemoryStore: MemoryStore cleared 25/06/02 20:17:09 INFO BlockManager: BlockManager stopped 25/06/02 20:17:09 INFO BlockManagerMaster: BlockManagerMaster stopped 25/06/02 20:17:09 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped! 
25/06/02 20:17:10 INFO SparkContext: Successfully stopped SparkContext 25/06/02 20:17:10 INFO ShutdownHookManager: Shutdown hook called 25/06/02 20:17:10 INFO ShutdownHookManager: Deleting directory /tmp/spark-213e3a91-227c-4e0a-8254-ab5c0f00786d Process finished with exit code 1

/spark/jars/spark-sketch_2.12-3.0.0.jar -rw-r--r-- 3 hadoop supergroup 7118845 2025-08-04 19:54 /spark/jars/spark-sql_2.12-3.0.0.jar -rw-r--r-- 3 hadoop supergroup 1137816 2025-08-04 19:54 /spark/jars/spark-streaming_2.12-3.0.0.jar -rw-r--r-- 3 hadoop supergroup 9478 2025-08-04 19:54 /spark/jars/spark-tags_2.12-3.0.0-tests.jar -rw-r--r-- 3 hadoop supergroup 15149 2025-08-04 19:54 /spark/jars/spark-tags_2.12-3.0.0.jar -rw-r--r-- 3 hadoop supergroup 51181 2025-08-04 19:54 /spark/jars/spark-unsafe_2.12-3.0.0.jar -rw-r--r-- 3 hadoop supergroup 330670 2025-08-04 19:54 /spark/jars/spark-yarn_2.12-3.0.0.jar -rw-r--r-- 3 hadoop supergroup 79588 2025-08-04 19:54 /spark/jars/spire-macros_2.12-0.17.0-M1.jar -rw-r--r-- 3 hadoop supergroup 8261 2025-08-04 19:54 /spark/jars/spire-platform_2.12-0.17.0-M1.jar -rw-r--r-- 3 hadoop supergroup 34601 2025-08-04 19:54 /spark/jars/spire-util_2.12-0.17.0-M1.jar -rw-r--r-- 3 hadoop supergroup 7188024 2025-08-04 19:54 /spark/jars/spire_2.12-0.17.0-M1.jar -rw-r--r-- 3 hadoop supergroup 178149 2025-08-04 19:54 /spark/jars/stream-2.9.6.jar -rw-r--r-- 3 hadoop supergroup 233745 2025-08-04 19:54 /spark/jars/threeten-extra-1.5.0.jar -rw-r--r-- 3 hadoop supergroup 443231 2025-08-04 19:54 /spark/jars/univocity-parsers-2.8.3.jar -rw-r--r-- 3 hadoop supergroup 281356 2025-08-04 19:54 /spark/jars/xbean-asm7-shaded-4.15.jar -rw-r--r-- 3 hadoop supergroup 99555 2025-08-04 19:54 /spark/jars/xz-1.5.jar -rw-r--r-- 3 hadoop supergroup 35518 2025-08-04 19:54 /spark/jars/zjsonpatch-0.3.0.jar -rw-r--r-- 3 hadoop supergroup 4210625 2025-08-04 19:54 /spark/jars/zstd-jni-1.4.4-3.jar 有什么问题吗


```yaml
# Namespace
apiVersion: v1
kind: Namespace
metadata:
  name: starrocks
---
# Secret with the MinIO access credentials
apiVersion: v1
kind: Secret
metadata:
  name: starrocks-s3-secret
  namespace: starrocks
type: Opaque
stringData:
  AWS_ACCESS_KEY_ID: "admin"                  # MinIO access key
  AWS_SECRET_ACCESS_KEY: "izhMinio@pasword"   # MinIO secret key
  AWS_ENDPOINT: "http://minio-svc.minio-ns.svc.cluster.local:9000"  # MinIO service address
  AWS_DEFAULT_REGION: "cn-north-1"
---
# FE ConfigMap
apiVersion: v1
kind: ConfigMap
metadata:
  name: fe-config
  namespace: starrocks
data:
  fe.conf: |
    brpc_port = 8060
    enable_brpc_ssl = false
    enable_fqdn_mode = true
    cloud_native_meta_port = 6090
    meta_dir = /opt/starrocks/fe/meta
    edit_log_port = 9010
    http_port = 8030
    query_port = 9030
    JAVA_OPTS = -Xmx16g -Xms16g
---
# StorageClass (requires local-path-provisioner already deployed in the cluster)
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: local-path-starrocks
provisioner: rancher.io/local-path
volumeBindingMode: WaitForFirstConsumer
reclaimPolicy: Retain   # retain data to guard against accidental deletion
---
# FE Service
apiVersion: v1
kind: Service
metadata:
  name: fe-service
  namespace: starrocks
spec:
  type: NodePort
  ports:
    - name: edit-log   # added mapping for port 9010
      port: 9010
      targetPort: 9010
    - name: query
      port: 9030
      targetPort: 9030
      nodePort: 31030
    - name: http
      port: 8030
      targetPort: 8030
      nodePort: 31031
  selector:
    app: fe
---
# BE Service
apiVersion: v1
kind: Service
metadata:
  name: be-service
  namespace: starrocks
spec:
  clusterIP: None
  ports:
    - name: be
      port: 9060
      targetPort: 9060
    - name: heartbeat
      port: 9050
      targetPort: 9050
  selector:
    app: be
---
# BE ConfigMap
apiVersion: v1
kind: ConfigMap
metadata:
  name: be-config
  namespace: starrocks
data:
  be.conf: |
    heartbeat_service_port = 9050
    heartbeat_service_address = fe-service.starrocks.svc.cluster.local
    storage_root_path = /opt/starrocks/storage_data
    webserver_port = 8040
    be_port = 9060
    be_http_port = 8040
    brpc_port = 8060
    cloud_native_meta_port = 6090
    # shared-data core settings (must match MinIO exactly)
    aws_s3_endpoint = minio-svc.minio-ns.svc.cluster.local:9000
    aws_s3_path = s3://starrocks-data/
    aws_s3_access_key = admin
    aws_s3_secret_key = izhMinio@pasword
    aws_s3_use_aws_sdk_default_behavior = false   # AWS default behavior must be disabled
    aws_s3_enable_path_style_access = true        # MinIO requires path-style access
    # shared-data mode
    run_mode = shared_data
    cloud_native_storage_type = S3
    aws_s3_path = "s3://starrocks-data/"
    # production JVM settings
    JAVA_OPTS = -Xmx32g -Xms32g -XX:+UseG1GC
    JAVA_HOME = /opt/starrocks/jdk-17
---
# FE StatefulSet
apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: fe
  namespace: starrocks
spec:
  serviceName: fe-service
  replicas: 3
  selector:
    matchLabels:
      app: fe
  template:
    metadata:
      labels:
        app: fe
    spec:
      affinity:
        podAntiAffinity:
          requiredDuringSchedulingIgnoredDuringExecution:
            - labelSelector:
                matchExpressions:
                  - key: app
                    operator: In
                    values: ["fe"]
              topologyKey: "kubernetes.io/hostname"
      containers:
        - name: fe
          image: registry.cn-shanghai.aliyuncs.com/vsi-open/starrocks-fe:2025-05-22
          command:
            - /bin/sh
            - -c
            - |
              # log output for easier debugging
              echo "Starting FE with ID ${FE_ID}";
              if [ "${FE_ID}" = "fe-0" ]; then
                /opt/starrocks/fe/bin/start_fe.sh --host_type=FQDN
              else
                export JAVA_OPTS="-Xmx16g -Xms16g -Dcom.alibaba.rocketmq.client.sendSmartMsg=false"
                /opt/starrocks/fe/bin/start_fe.sh --helper=fe-0.fe-service.starrocks.svc.cluster.local:9010 --host_type=FQDN
              fi
          lifecycle:
            preStop:
              exec:
                command: ["/opt/starrocks/fe/bin/stop_fe.sh"]
          env:
            - name: FE_MASTER_SERVERS   # leader-only setting
              value: "fe-0.fe-service.starrocks.svc.cluster.local:9010"
            - name: FE_SERVERS
              value: "fe-0.fe-service.starrocks.svc.cluster.local:9010,fe-1.fe-service.starrocks.svc.cluster.local:9010,fe-2.fe-service.starrocks.svc.cluster.local:9010"
            - name: FE_ID
              valueFrom:
                fieldRef:
                  fieldPath: metadata.name
          ports:
            - containerPort: 9010   # internal communication port
            - containerPort: 9030   # MySQL protocol port
            - containerPort: 8030   # HTTP port
          volumeMounts:
            - name: meta
              mountPath: /opt/starrocks/fe/meta
            - name: config
              mountPath: /opt/starrocks/fe/conf/fe.conf
              subPath: fe.conf
          readinessProbe:
            httpGet:
              path: /api/health
              port: 8030
            initialDelaySeconds: 120
            periodSeconds: 20
            failureThreshold: 5
          livenessProbe:
            httpGet:
              path: /api/health
              port: 8030
            initialDelaySeconds: 200
            periodSeconds: 30
            failureThreshold: 3
      volumes:
        - name: config
          configMap:
            name: fe-config
  volumeClaimTemplates:
    - metadata:
        name: meta
      spec:
        accessModes: [ "ReadWriteOnce" ]
        storageClassName: "local-path-starrocks"
        resources:
          requests:
            storage: 50Gi
---
# BE StatefulSet
apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: be
  namespace: starrocks
spec:
  serviceName: be-service
  replicas: 4
  selector:
    matchLabels:
      app: be
  template:
    metadata:
      labels:
        app: be
    spec:
      affinity:
        podAntiAffinity:
          requiredDuringSchedulingIgnoredDuringExecution:
            - labelSelector:
                matchExpressions:
                  - key: app
                    operator: In
                    values: ["be"]
              topologyKey: "kubernetes.io/hostname"
      containers:
        - name: be
          image: registry.cn-shanghai.aliyuncs.com/vsi-open/starrocks-be:2025-05-22
          command: ["/opt/starrocks/be/bin/start_be.sh"]
          lifecycle:
            preStop:
              exec:
                command: ["/opt/starrocks/be/bin/stop_be.sh"]
          envFrom:
            - secretRef:
                name: starrocks-s3-secret
          env:
            - name: AWS_REGION
              value: "cn-north-1"
            - name: AWS_S3_ENDPOINT
              value: "minio-svc.minio-ns.svc.cluster.local:9000"
            - name: AWS_S3_USE_AWS_SDK_DEFAULT_BEHAVIOR
              value: "false"
          ports:
            - containerPort: 9060
              name: hb-port       # heartbeat port
            - containerPort: 8040
              name: http-port     # admin port
            - containerPort: 9050
              name: hb-svc-port   # heartbeat service port
          volumeMounts:
            - name: config
              mountPath: /opt/starrocks/be/conf/be.conf
              subPath: be.conf
            - name: storage
              mountPath: /opt/starrocks/be/storage
          resources:
            requests:
              cpu: 8
              memory: 8Gi
            limits:
              cpu: 16
              memory: 16Gi
          livenessProbe:
            tcpSocket:
              port: 9060
            initialDelaySeconds: 120
            periodSeconds: 20
          readinessProbe:
            tcpSocket:
              port: 9060
            initialDelaySeconds: 60
            periodSeconds: 15
      volumes:
        - name: config
          configMap:
            name: be-config
  volumeClaimTemplates:
    - metadata:
        name: meta
      spec:
        accessModes: [ "ReadWriteOnce" ]
        storageClassName: "local-path-starrocks"
        resources:
          requests:
            storage: 50Gi
    - metadata:
        name: storage
      spec:
        accessModes: [ "ReadWriteOnce" ]
        storageClassName: "local-path-starrocks"
        resources:
          requests:
            storage: 100Gi   # adjust to data volume
```

Ignore MinIO for now; the current running state is as follows:

```
➜  StarRocks git:(master) ✗ kubectl get pod -n starrocks -o wide
NAME   READY   STATUS    RESTARTS      AGE     IP             NODE         NOMINATED NODE   READINESS GATES
be-0   1/1     Running   0             152m    172.18.3.241   follower03   <none>           <none>
be-1   1/1     Running   0             151m    172.18.1.57    follower01   <none>           <none>
be-2   1/1     Running   0             150m    172.18.0.130   leader01     <none>           <none>
be-3   1/1     Running   0             149m    172.18.2.222   follower02   <none>           <none>
fe-0   1/1     Running   0             7m22s   172.18.3.62    follower03   <none>           <none>
fe-1   0/1     Running   1 (46s ago)   5m17s   172.18.1.93    follower01   <none>           <none>
➜  StarRocks git:(master) ✗ kubectl describe pod fe-1 -n starrocks
Name:             fe-1
Namespace:        starrocks
Priority:         0
Service Account:  default
Node:             follower01/192.168.11.102
Start Time:       Tue, 27 May 2025 15:47:56 +0800
Labels:           app=fe
                  apps.kubernetes.io/pod-index=1
                  controller-revision-hash=fe-675df448fb
                  statefulset.kubernetes.io/pod-name=fe-1
Annotations:      <none>
Status:           Running
IP:               172.18.1.93
IPs:
  IP:  172.18.1.93
Controlled By:  StatefulSet/fe
Containers:
  fe:
    Container ID:  containerd://463884b0da7eb4156097e7692746062da4046cc63df8d6d0f4c3dc615d4246ab
    Image:         registry.cn-shanghai.aliyuncs.com/vsi-open/starrocks-fe:2025-05-22
    Image ID:      registry.cn-shanghai.aliyuncs.com/vsi-open/starrocks-fe@sha256:f581e1038681fa2f5d65bbb082f1d47d49fcab8ca37afb02c0eec5685947952d
    Ports:         9010/TCP, 9030/TCP, 8030/TCP
    Host Ports:    0/TCP, 0/TCP, 0/TCP
    Command:
      /bin/sh
      -c
      echo "Starting FE with ID ${FE_ID}";
      if [ "${FE_ID}" = "fe-0" ]; then
        /opt/starrocks/fe/bin/start_fe.sh
      else
        echo "--helper=fe-0.fe-service.starrocks.svc.cluster.local:9010";
        /opt/starrocks/fe/bin/start_fe.sh --helper=fe-0.fe-service.starrocks.svc.cluster.local:9010
      fi
    State:          Running
      Started:      Tue, 27 May 2025 15:52:27 +0800
    Last State:     Terminated
      Reason:       Error
      Exit Code:    143
      Started:      Tue, 27 May 2025 15:47:57 +0800
      Finished:     Tue, 27 May 2025 15:52:27 +0800
    Ready:          False
    Restart Count:  1
    Liveness:       http-get http://:8030/api/health delay=200s timeout=1s period=30s #success=1 #failure=3
    Readiness:      http-get http://:8030/api/health delay=120s timeout=1s period=20s #success=1 #failure=5
    Environment:
      FE_MASTER_SERVERS:  fe-0.fe-service.starrocks.svc.cluster.local:9010
      FE_SERVERS:         fe-0.fe-service.starrocks.svc.cluster.local:9010,fe-1.fe-service.starrocks.svc.cluster.local:9010,fe-2.fe-service.starrocks.svc.cluster.local:9010
      FE_ID:              fe-1 (v1:metadata.name)
    Mounts:
      /opt/starrocks/fe/conf/fe.conf from config (rw,path="fe.conf")
      /opt/starrocks/fe/meta from meta (rw)
      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-65kpf (ro)
Conditions:
  Type                        Status
  PodReadyToStartContainers   True
  Initialized                 True
  Ready                       False
  ContainersReady             False
  PodScheduled                True
Volumes:
  meta:
    Type:       PersistentVolumeClaim (a reference to a PersistentVolumeClaim in the same namespace)
    ClaimName:  meta-fe-1
    ReadOnly:   false
  config:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      fe-config
    Optional:  false
  kube-api-access-65kpf:
    Type:                    Projected (a volume that contains injected data from multiple sources)
    TokenExpirationSeconds:  3607
    ConfigMapName:           kube-root-ca.crt
    ConfigMapOptional:       <nil>
    DownwardAPI:             true
QoS Class:       BestEffort
Node-Selectors:  <none>
Tolerations:     node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
                 node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
Events:
  Type     Reason             Age                   From               Message
  ----     ------             ----                  ----               -------
  Normal   Scheduled          5m32s                 default-scheduler  Successfully assigned starrocks/fe-1 to follower01
  Normal   Pulled             62s (x2 over 5m32s)   kubelet            Container image "registry.cn-shanghai.aliyuncs.com/vsi-open/starrocks-fe:2025-05-22" already present on machine
  Normal   Created            62s (x2 over 5m32s)   kubelet            Created container: fe
  Normal   Started            62s (x2 over 5m32s)   kubelet            Started container fe
  Warning  Unhealthy          62s (x9 over 3m23s)   kubelet            Readiness probe failed: Get "http://172.18.1.93:8030/api/health": dial tcp 172.18.1.93:8030: connect: connection refused
  Warning  Unhealthy          62s (x3 over 2m2s)    kubelet            Liveness probe failed: Get "http://172.18.1.93:8030/api/health": dial tcp 172.18.1.93:8030: connect: connection refused
  Normal   Killing            62s                   kubelet            Container fe failed liveness probe, will be restarted
  Warning  FailedPreStopHook  62s                   kubelet            PreStopHook failed
➜  StarRocks git:(master) ✗ kubectl exec fe-0 -n starrocks -- curl http://localhost:8030/api/health
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100    60  100    60    0     0  46224      0 --:--:-- --:--:-- --:--:-- 60000
{"online_backend_num":0,"total_backend_num":4,"status":"OK"}
➜  StarRocks git:(master) ✗
```

I logged in to fe-0 manually and registered the 4 BEs with `ALTER SYSTEM ADD BACKEND`, but their Alive status is all false. An important note: my images are built on fe-ubuntu:latest and be-ubuntu:latest, and they do not support `/opt/starrocks/be/bin/start_backend.sh -be_port=9060 webserver_port=8040`. Please fix this.
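For reference, the manual BE registration described above would look roughly like the following when run from an FE over the MySQL protocol port (9030). This is a sketch only: the hostnames are assumed to follow the headless `be-service` DNS naming from the StatefulSet, and 9050 is the `heartbeat_service_port` from `be.conf`.

```sql
-- Register each BE by its headless-service FQDN and heartbeat port (assumed values).
ALTER SYSTEM ADD BACKEND "be-0.be-service.starrocks.svc.cluster.local:9050";
ALTER SYSTEM ADD BACKEND "be-1.be-service.starrocks.svc.cluster.local:9050";
ALTER SYSTEM ADD BACKEND "be-2.be-service.starrocks.svc.cluster.local:9050";
ALTER SYSTEM ADD BACKEND "be-3.be-service.starrocks.svc.cluster.local:9050";

-- Then inspect the Alive column to see whether heartbeats are being received.
SHOW BACKENDS;
```

If the addresses or the heartbeat port used here differ from what the BEs actually bind, `SHOW BACKENDS` will keep reporting `Alive: false`, which matches the `online_backend_num: 0` seen in the health check above.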