[2025-04-01T05:41:24.092+00:00][ERROR][elasticsearch-service] Unable to retrieve version information from Elasticsearch nodes. security_exception: [security_exception] Reason: failed to authenticate service account [elastic/kibana] with token name [kibana-kibana]

Below is an in-depth analysis of the Elasticsearch service-account authentication failure, together with a step-by-step fix:

---

### **Error Cause Analysis**

The message `security_exception: failed to authenticate service account [elastic/kibana] with token name [kibana-kibana]` (the `[elasticsearch-service]` prefix shows it is logged by the client — here Kibana — when it cannot reach Elasticsearch) points to one of the following:

1. **Invalid credentials**: the token for the service account `elastic/kibana` was never generated correctly, has been deleted, or is misconfigured on the client side.
2. **Insufficient privileges**: the account used to write data lacks the permissions it needs in Elasticsearch.
3. **Conflicting security configuration**: Elasticsearch has security enabled (TLS, RBAC), but the client (Kibana/Fluentd) is not configured to match.

---

### **Step-by-Step Solution**

#### **Step 1: Generate a valid service account token**

Generate a new service account token through the Elasticsearch API. The token name goes in the URL path:

```bash
# Authenticate as the elastic superuser (or any user allowed to manage service account tokens)
curl -X POST "https://elasticsearch:9200/_security/service/elastic/kibana/credential/token/kibana-token?pretty" \
  -u elastic:your_elastic_password
```

Example output:

```json
{
  "created": true,
  "token": {
    "name": "kibana-token",
    "value": "AAEAAWVsYXN0aWMva2liYW5h...(Base64-encoded token)"
  }
}
```

#### **Step 2: Configure the client to use the credentials**

A service account token is sent as an `Authorization: Bearer <value>` header and is intended for stack components; for the `elastic/kibana` account that means Kibana itself, which reads it from `elasticsearch.serviceAccountToken` in `kibana.yml`. Fluentd should instead authenticate with a regular user/password or an Elasticsearch API key in its Elasticsearch output plugin:

```conf
<match docker.**>
  @type elasticsearch
  host elasticsearch
  port 9200
  scheme https                   # Elasticsearch enables HTTPS by default in secured mode
  ca_file /path/to/http_ca.crt   # Elasticsearch CA certificate path
  user "elastic"                 # user Fluentd authenticates as
  password "your_elastic_password"
  # Alternatively, if your plugin version supports it, use an Elasticsearch API key
  # (created with POST /_security/api_key) instead of user/password:
  # api_key "your_api_key"
  logstash_format true
</match>
```

#### **Step 3: Verify the account's privileges**

Make sure the account used to write logs has permission to write to the target indices:

1. Inspect the privileges attached to the service account (the role descriptors of built-in service accounts such as `elastic/kibana` are fixed by Elasticsearch and cannot be edited):

```bash
curl -X GET "https://elasticsearch:9200/_security/service/elastic/kibana?pretty" \
  -u elastic:your_elastic_password
```

2. For the user Fluentd writes with, grant only the minimum necessary privileges (example role definition; a fuller sketch follows below):

```json
{
  "indices": [
    {
      "names": ["logs-*"],
      "privileges": ["create_index", "write", "manage"]
    }
  ]
}
```
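If you prefer not to reuse the `elastic` superuser for Fluentd, a common pattern is a dedicated role and user for log ingestion. The sketch below uses placeholder names (`fluentd_writer`, `fluentd_ingest`), index patterns, and password; adjust the privilege list to the indices you actually write to:

```bash
# Create a role restricted to the log indices (names and privileges are examples only)
curl -X PUT "https://elasticsearch:9200/_security/role/fluentd_writer?pretty" \
  -u elastic:your_elastic_password \
  -H "Content-Type: application/json" \
  -d '{
    "cluster": ["monitor"],
    "indices": [
      { "names": ["logs-*"], "privileges": ["create_index", "create_doc", "write"] }
    ]
  }'

# Create a user for Fluentd and attach the role
curl -X PUT "https://elasticsearch:9200/_security/user/fluentd_ingest?pretty" \
  -u elastic:your_elastic_password \
  -H "Content-Type: application/json" \
  -d '{ "password": "change_me_please", "roles": ["fluentd_writer"] }'
```

The `user`/`password` pair in the Fluentd `<match>` block can then point at this account instead of `elastic`.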
#### **Step 4: Check network and security configuration**

- **Network connectivity**: confirm that Fluentd can reach Elasticsearch on port 9200:

```bash
telnet elasticsearch 9200
```

- **TLS configuration**: make sure Fluentd trusts the Elasticsearch CA certificate. Disabling verification should only ever be used for testing:

```conf
ssl_verify false   # not recommended outside test environments
```

#### **Step 5: Check the Elasticsearch logs**

Look for the detailed error in the Elasticsearch log (default path `/var/log/elasticsearch/elasticsearch.log`):

```bash
grep "security_exception" /var/log/elasticsearch/elasticsearch.log
```

---

### **Common Pitfalls and Tuning Suggestions**

1. **Token lifetime**
   - Service account tokens have no expiry by default, but rotate them periodically by creating a replacement and deleting the old token through the API (see the rotation sketch at the end of this answer).
2. **Security configuration conflicts**
   - If Elasticsearch runs with security disabled (`xpack.security.enabled: false`), remove all authentication parameters from the client configuration.
3. **Performance tuning**
   - Increase the Fluentd buffer to cope with high load:

```conf
<buffer>
  @type memory
  chunk_limit_size 4MB
  total_limit_size 2GB
  flush_interval 3s
</buffer>
```

---

### **Appendix: Key Configuration Reference**

| Setting     | Correct example                                      | Typical mistake                        |
|-------------|------------------------------------------------------|----------------------------------------|
| `scheme`    | `https`                                              | `http` (TLS not enabled)               |
| `ca_file`   | `/usr/share/elasticsearch/config/certs/http_ca.crt`  | no certificate path configured         |
| `api_key`   | an API key created via `POST /_security/api_key`     | a service-account token pasted instead |
| `log_level` | `debug` (while troubleshooting)                      | `info` (too little detail)             |

Following these steps should resolve the service-account authentication failure systematically. If it still fails, enable the Elasticsearch audit log and trace the authentication flow there (a minimal sketch follows).
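A minimal sketch for turning on the audit trail, assuming a license level that includes audit logging; the event filter is optional, and the output file name pattern (`<cluster_name>_audit.json` in the log directory) is the default:

```yaml
# elasticsearch.yml — enable the security audit trail (node restart required)
xpack.security.audit.enabled: true
# Optionally restrict the audit trail to authentication problems:
xpack.security.audit.logfile.events.include: ["authentication_failed", "access_denied"]
```

Failed bearer-token authentications from Kibana or Fluentd then appear as `authentication_failed` events that include the offending principal, which usually tells you whether the token value or the user is at fault.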
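For the token-rotation recommendation above, here is a minimal sketch. It assumes the old token was created through the API; tokens created with the `elasticsearch-service-tokens` CLI live in the `service_tokens` file and must be removed with that tool instead. The new token name is a placeholder:

```bash
# 1. Create a replacement token (the name goes in the URL path)
curl -X POST "https://elasticsearch:9200/_security/service/elastic/kibana/credential/token/kibana-token-2?pretty" \
  -u elastic:your_elastic_password

# 2. Update the client configuration with the new token value, then remove the old token
curl -X DELETE "https://elasticsearch:9200/_security/service/elastic/kibana/credential/token/kibana-kibana?pretty" \
  -u elastic:your_elastic_password
```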

PowerShell 7 环境已加载 (版本: 7.5.2) PowerShell 7 环境已加载 (版本: 7.5.2) PS C:\Users\Administrator\Desktop> cd E:\PyTorch_Build\pytorch PS E:\PyTorch_Build\pytorch> python -m venv rtx5070_env PS E:\PyTorch_Build\pytorch> .\rtx5070_env\Scripts\activate (rtx5070_env) PS E:\PyTorch_Build\pytorch> # 修复之前的脚本错误 (rtx5070_env) PS E:\PyTorch_Build\pytorch> $fixedActivation = @" >> try { >> & "$activatePath" >> Write-Host "✅ 虚拟环境激活成功" -ForegroundColor Green >> python -VV >> } >> catch [System.Exception] { >> Write-Host "❌ 激活失败: $($_.Exception.Message)" -ForegroundColor Red >> } >> "@ InvalidOperation: Line | 3 | & "$activatePath" | ~~~~~~~~~~~~~ | The variable '$activatePath' cannot be retrieved because it has not been set. (rtx5070_env) PS E:\PyTorch_Build\pytorch> (rtx5070_env) PS E:\PyTorch_Build\pytorch> # 切换到PyTorch源码目录 (rtx5070_env) PS E:\PyTorch_Build\pytorch> cd E:\PyTorch_Build\pytorch (rtx5070_env) PS E:\PyTorch_Build\pytorch> (rtx5070_env) PS E:\PyTorch_Build\pytorch> # 更新pip到最新版 (rtx5070_env) PS E:\PyTorch_Build\pytorch> python -m pip install --upgrade pip Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple Requirement already satisfied: pip in e:\pytorch_build\pytorch\rtx5070_env\lib\site-packages (22.3.1) Collecting pip Using cached https://pypi.tuna.tsinghua.edu.cn/packages/b7/3f/945ef7ab14dc4f9d7f40288d2df998d1837ee0888ec3659c813487572faa/pip-25.2-py3-none-any.whl (1.8 MB) Installing collected packages: pip Attempting uninstall: pip Found existing installation: pip 22.3.1 Uninstalling pip-22.3.1: Successfully uninstalled pip-22.3.1 Successfully installed pip-25.2 (rtx5070_env) PS E:\PyTorch_Build\pytorch> (rtx5070_env) PS E:\PyTorch_Build\pytorch> # 安装编译依赖 (rtx5070_env) PS E:\PyTorch_Build\pytorch> pip install -r requirements-build.txt --verbose Using pip 25.2 from E:\PyTorch_Build\pytorch\rtx5070_env\lib\site-packages\pip (python 3.10) Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple Collecting setuptools<80.0,>=70.1.0 (from -r requirements-build.txt (line 2)) Using cached https://pypi.tuna.tsinghua.edu.cn/packages/0d/6d/b4752b044bf94cb802d88a888dc7d288baaf77d7910b7dedda74b5ceea0c/setuptools-79.0.1-py3-none-any.whl (1.3 MB) Collecting cmake>=3.27 (from -r requirements-build.txt (line 3)) Using cached https://pypi.tuna.tsinghua.edu.cn/packages/7c/d0/73cae88d8c25973f2465d5a4457264f95617c16ad321824ed4c243734511/cmake-4.1.0-py3-none-win_amd64.whl (37.6 MB) Collecting ninja (from -r requirements-build.txt (line 4)) Using cached https://pypi.tuna.tsinghua.edu.cn/packages/29/45/c0adfbfb0b5895aa18cec400c535b4f7ff3e52536e0403602fc1a23f7de9/ninja-1.13.0-py3-none-win_amd64.whl (309 kB) Link requires a different Python (3.10.10 not in: '>=3.11'): https://pypi.tuna.tsinghua.edu.cn/packages/f3/db/8e12381333aea300890829a0a36bfa738cac95475d88982d538725143fd9/numpy-2.3.0.tar.gz (from https://pypi.tuna.tsinghua.edu.cn/simple/numpy/) (requires-python:>=3.11) Link requires a different Python (3.10.10 not in: '>=3.11'): https://pypi.tuna.tsinghua.edu.cn/packages/2e/19/d7c972dfe90a353dbd3efbbe1d14a5951de80c99c9dc1b93cd998d51dc0f/numpy-2.3.1.tar.gz (from 
https://pypi.tuna.tsinghua.edu.cn/simple/numpy/) (requires-python:>=3.11) Link requires a different Python (3.10.10 not in: '>=3.11'): https://pypi.tuna.tsinghua.edu.cn/packages/37/7d/3fec4199c5ffb892bed55cff901e4f39a58c81df9c44c280499e92cad264/numpy-2.3.2.tar.gz (from https://pypi.tuna.tsinghua.edu.cn/simple/numpy/) (requires-python:>=3.11) Collecting numpy (from -r requirements-build.txt (line 5)) Using cached https://pypi.tuna.tsinghua.edu.cn/packages/a3/dd/4b822569d6b96c39d1215dbae0582fd99954dcbcf0c1a13c61783feaca3f/numpy-2.2.6-cp310-cp310-win_amd64.whl (12.9 MB) Collecting packaging (from -r requirements-build.txt (line 6)) Using cached https://pypi.tuna.tsinghua.edu.cn/packages/20/12/38679034af332785aac8774540895e234f4d07f7545804097de4b666afd8/packaging-25.0-py3-none-any.whl (66 kB) Collecting pyyaml (from -r requirements-build.txt (line 7)) Using cached https://pypi.tuna.tsinghua.edu.cn/packages/b5/84/0fa4b06f6d6c958d207620fc60005e241ecedceee58931bb20138e1e5776/PyYAML-6.0.2-cp310-cp310-win_amd64.whl (161 kB) Collecting requests (from -r requirements-build.txt (line 8)) Using cached https://pypi.tuna.tsinghua.edu.cn/packages/1e/db/4254e3eabe8020b458f1a747140d32277ec7a271daf1d235b70dc0b4e6e3/requests-2.32.5-py3-none-any.whl (64 kB) Collecting six (from -r requirements-build.txt (line 9)) Using cached https://pypi.tuna.tsinghua.edu.cn/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl (11 kB) Collecting typing-extensions>=4.10.0 (from -r requirements-build.txt (line 10)) Using cached https://pypi.tuna.tsinghua.edu.cn/packages/18/67/36e9267722cc04a6b9f15c7f3441c2363321a3ea07da7ae0c0707beb2a9c/typing_extensions-4.15.0-py3-none-any.whl (44 kB) Collecting charset_normalizer<4,>=2 (from requests->-r requirements-build.txt (line 8)) Using cached https://pypi.tuna.tsinghua.edu.cn/packages/e2/c6/f05db471f81af1fa01839d44ae2a8bfeec8d2a8b4590f16c4e7393afd323/charset_normalizer-3.4.3-cp310-cp310-win_amd64.whl (107 kB) Collecting idna<4,>=2.5 (from requests->-r requirements-build.txt (line 8)) Using cached https://pypi.tuna.tsinghua.edu.cn/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl (70 kB) Collecting urllib3<3,>=1.21.1 (from requests->-r requirements-build.txt (line 8)) Using cached https://pypi.tuna.tsinghua.edu.cn/packages/a7/c2/fe1e52489ae3122415c51f387e221dd0773709bad6c6cdaa599e8a2c5185/urllib3-2.5.0-py3-none-any.whl (129 kB) Collecting certifi>=2017.4.17 (from requests->-r requirements-build.txt (line 8)) Using cached https://pypi.tuna.tsinghua.edu.cn/packages/e5/48/1549795ba7742c948d2ad169c1c8cdbae65bc450d6cd753d124b17c8cd32/certifi-2025.8.3-py3-none-any.whl (161 kB) Installing collected packages: urllib3, typing-extensions, six, setuptools, pyyaml, packaging, numpy, ninja, idna, cmake, charset_normalizer, certifi, requests Attempting uninstall: 
setuptools Found existing installation: setuptools 65.5.0 Uninstalling setuptools-65.5.0: Removing file or directory e:\pytorch_build\pytorch\rtx5070_env\lib\site-packages\_distutils_hack\ Removing file or directory e:\pytorch_build\pytorch\rtx5070_env\lib\site-packages\distutils-precedence.pth Removing file or directory e:\pytorch_build\pytorch\rtx5070_env\lib\site-packages\pkg_resources\ Removing file or directory e:\pytorch_build\pytorch\rtx5070_env\lib\site-packages\setuptools-65.5.0.dist-info\ Removing file or directory e:\pytorch_build\pytorch\rtx5070_env\lib\site-packages\setuptools\ Successfully uninstalled setuptools-65.5.0 Successfully installed certifi-2025.8.3 charset_normalizer-3.4.3 cmake-4.1.0 idna-3.10 ninja-1.13.0 numpy-2.2.6 packaging-25.0 pyyaml-6.0.2 requests-2.32.5 setuptools-79.0.1 six-1.17.0 typing-extensions-4.15.0 urllib3-2.5.0 (rtx5070_env) PS E:\PyTorch_Build\pytorch> pip install cmake ninja --upgrade Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple Requirement already satisfied: cmake in e:\pytorch_build\pytorch\rtx5070_env\lib\site-packages (4.1.0) Requirement already satisfied: ninja in e:\pytorch_build\pytorch\rtx5070_env\lib\site-packages (1.13.0) (rtx5070_env) PS E:\PyTorch_Build\pytorch> (rtx5070_env) PS E:\PyTorch_Build\pytorch> # 清理旧编译产物 (rtx5070_env) PS E:\PyTorch_Build\pytorch> Remove-Item -Recurse -Force build, dist -ErrorAction SilentlyContinue (rtx5070_env) PS E:\PyTorch_Build\pytorch> Write-Host "n==== 编译环境验证 ====" -ForegroundColor Cyan ==== 编译环境验证 ==== (rtx5070_env) PS E:\PyTorch_Build\pytorch> (rtx5070_env) PS E:\PyTorch_Build\pytorch> # 1. 目录验证 (rtx5070_env) PS E:\PyTorch_Build\pytorch> Write-Host "当前目录: $pwd" 当前目录: E:\PyTorch_Build\pytorch (rtx5070_env) PS E:\PyTorch_Build\pytorch> if ($pwd -ne "E:\PyTorch_Build\pytorch") { >> Write-Host "⚠️ 错误: 需要切换到E:\PyTorch_Build\pytorch" -ForegroundColor Yellow >> cd E:\PyTorch_Build\pytorch >> } ⚠️ 错误: 需要切换到E:\PyTorch_Build\pytorch (rtx5070_env) PS E:\PyTorch_Build\pytorch> (rtx5070_env) PS E:\PyTorch_Build\pytorch> # 2. CUDA工具链验证 (rtx5070_env) PS E:\PyTorch_Build\pytorch> $cudaStatus = @( >> "nvcc --version", >> "nvidia-smi", >> "where cudnn64_8.dll" >> ) (rtx5070_env) PS E:\PyTorch_Build\pytorch> (rtx5070_env) PS E:\PyTorch_Build\pytorch> foreach ($cmd in $cudaStatus) { >> Write-Host "n执行: $cmd" -ForegroundColor Magenta >> try { >> Invoke-Expression $cmd >> } >> catch { >> Write-Host "❌ 命令失败: $_" -ForegroundColor Red >> } >> } 执行: nvcc --version nvcc: NVIDIA (R) Cuda compiler driver Copyright (c) 2005-2025 NVIDIA Corporation Built on Wed_Jul_16_20:06:48_Pacific_Daylight_Time_2025 Cuda compilation tools, release 13.0, V13.0.48 Build cuda_13.0.r13.0/compiler.36260728_0 执行: nvidia-smi Wed Sep 3 22:04:47 2025 +-----------------------------------------------------------------------------------------+ | NVIDIA-SMI 580.97 Driver Version: 580.97 CUDA Version: 13.0 | +-----------------------------------------+------------------------+----------------------+ | GPU Name Driver-Model | Bus-Id Disp.A | Volatile Uncorr. ECC | | Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. | | | | MIG M. 
| |=========================================+========================+======================| | 0 NVIDIA GeForce RTX 5070 WDDM | 00000000:01:00.0 On | N/A | | 0% 38C P3 22W / 250W | 1601MiB / 12227MiB | 1% Default | | | | N/A | +-----------------------------------------+------------------------+----------------------+ +-----------------------------------------------------------------------------------------+ | Processes: | | GPU GI CI PID Type Process name GPU Memory | | ID ID Usage | |=========================================================================================| | 0 N/A N/A 1540 C+G ...yb3d8bbwe\WindowsTerminal.exe N/A | | 0 N/A N/A 1916 C+G C:\Windows\System32\dwm.exe N/A | | 0 N/A N/A 4972 C+G ...em32\ApplicationFrameHost.exe N/A | | 0 N/A N/A 5036 C+G ...ef.win7x64\steamwebhelper.exe N/A | | 0 N/A N/A 5996 C+G ...8bbwe\PhoneExperienceHost.exe N/A | | 0 N/A N/A 6420 C+G ...ntrolPanel\SystemSettings.exe N/A | | 0 N/A N/A 8280 C+G C:\Windows\explorer.exe N/A | | 0 N/A N/A 8428 C+G ...indows\System32\ShellHost.exe N/A | | 0 N/A N/A 8616 C+G ..._cw5n1h2txyewy\SearchHost.exe N/A | | 0 N/A N/A 9212 C+G ...y\StartMenuExperienceHost.exe N/A | | 0 N/A N/A 10092 C+G ...0.3405.125\msedgewebview2.exe N/A | | 0 N/A N/A 12816 C+G ...5n1h2txyewy\TextInputHost.exe N/A | | 0 N/A N/A 13400 C+G ...crosoft\OneDrive\OneDrive.exe N/A | | 0 N/A N/A 14212 C+G ...t\Edge\Application\msedge.exe N/A | | 0 N/A N/A 14440 C+G ...acted\runtime\WeChatAppEx.exe N/A | | 0 N/A N/A 15156 C+G ...les\Tencent\Weixin\Weixin.exe N/A | | 0 N/A N/A 18312 C+G ...es\Microsoft VS Code\Code.exe N/A | +-----------------------------------------------------------------------------------------+ 执行: where cudnn64_8.dll (rtx5070_env) PS E:\PyTorch_Build\pytorch> (rtx5070_env) PS E:\PyTorch_Build\pytorch> # 3. 
Python环境验证 (rtx5070_env) PS E:\PyTorch_Build\pytorch> Write-Host "nPython环境状态:" -ForegroundColor Magenta Python环境状态: (rtx5070_env) PS E:\PyTorch_Build\pytorch> pip show torch | Select-String "Location" WARNING: Package(s) not found: torch (rtx5070_env) PS E:\PyTorch_Build\pytorch> python -c "import torch; print(f'PyTorch版本: {torch.__version__}')" Traceback (most recent call last): File "<string>", line 1, in <module> File "E:\PyTorch_Build\pytorch\torch\__init__.py", line 61, in <module> from torch.torch_version import __version__ as __version__ File "E:\PyTorch_Build\pytorch\torch\torch_version.py", line 5, in <module> from torch.version import __version__ as internal_version ModuleNotFoundError: No module named 'torch.version' (rtx5070_env) PS E:\PyTorch_Build\pytorch> # 设置RTX 5070专属编译参数 (rtx5070_env) PS E:\PyTorch_Build\pytorch> $cmakeArgs = @( >> "-B build", >> "-G Ninja", >> "-DUSE_CUDA=ON", >> "-DUSE_CUDNN=ON", >> "-DCUDA_TOOLKIT_ROOT_DIR="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v13.0"", >> "-DCUDNN_ROOT_DIR="E:\Program Files\NVIDIA\CUNND\v9.12"", >> "-DCUDA_ARCH_LIST="8.9"", # RTX 5070架构 >> "-DTORCH_CUDA_ARCH_LIST="8.9"", >> "-DCMAKE_BUILD_TYPE=Release", >> "-DUSE_NCCL=OFF", >> "-DUSE_MKLDNN=ON", >> "-DTORCH_CUDA_VERSION=11.8" # 兼容旧驱动 >> ) (rtx5070_env) PS E:\PyTorch_Build\pytorch> (rtx5070_env) PS E:\PyTorch_Build\pytorch> # 启动配置过程 (rtx5070_env) PS E:\PyTorch_Build\pytorch> cmake ($cmakeArgs -join " ") CMake Error: Unable to (re)create the private pkgRedirects directory: E:/PyTorch_Build/pytorch/build -G Ninja -DUSE_CUDA=ON -DUSE_CUDNN=ON -DCUDA_TOOLKIT_ROOT_DIR="C:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v13.0" -DCUDNN_ROOT_DIR="E:/Program Files/NVIDIA/CUNND/v9.12" -DCUDA_ARCH_LIST="8.9" -DTORCH_CUDA_ARCH_LIST="8.9" -DCMAKE_BUILD_TYPE=Release -DUSE_NCCL=OFF -DUSE_MKLDNN=ON -DTORCH_CUDA_VERSION=11.8/CMakeFiles/pkgRedirects This may be caused by not having read/write access to the build directory. Try specifying a location with read/write access like: cmake -B build If using a CMake presets file, ensure that preset parameter 'binaryDir' expands to a writable directory. (rtx5070_env) PS E:\PyTorch_Build\pytorch> # 设置并行编译(根据CPU核心数调整) (rtx5070_env) PS E:\PyTorch_Build\pytorch> $env:MAX_JOBS = 8 (rtx5070_env) PS E:\PyTorch_Build\pytorch> (rtx5070_env) PS E:\PyTorch_Build\pytorch> # 启动编译并记录日志 (rtx5070_env) PS E:\PyTorch_Build\pytorch> $logFile = "build_$(Get-Date -Format 'yyyyMMdd_HHmmss').log" (rtx5070_env) PS E:\PyTorch_Build\pytorch> Start-Transcript -Path $logFile Transcript started, output file is build_20250903_220514.log (rtx5070_env) PS E:\PyTorch_Build\pytorch> (rtx5070_env) PS E:\PyTorch_Build\pytorch> try { >> cmake --build build --config Release --parallel $env:MAX_JOBS >> pip install -v --no-build-isolation . >> } >> catch { >> Write-Host "🔥 编译失败!错误详情: $_" -ForegroundColor Red >> } Error: E:/PyTorch_Build/pytorch/build is not a directory Using pip 25.2 from E:\PyTorch_Build\pytorch\rtx5070_env\lib\site-packages\pip (python 3.10) Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple Processing e:\pytorch_build\pytorch Running command Preparing metadata (pyproject.toml) Building wheel torch-2.9.0a0+git2d31c3d E:\PyTorch_Build\pytorch\rtx5070_env\lib\site-packages\setuptools\config\_apply_pyprojecttoml.py:82: SetuptoolsDeprecationWarning: project.license as a TOML table is deprecated !! 
Using pip 25.2 from E:\PyTorch_Build\pytorch\rtx5070_env\lib\site-packages\pip (python 3.10)
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Processing e:\pytorch_build\pytorch
  Running command Preparing metadata (pyproject.toml)
  Building wheel torch-2.9.0a0+git2d31c3d
  E:\PyTorch_Build\pytorch\rtx5070_env\lib\site-packages\setuptools\config\_apply_pyprojecttoml.py:82: SetuptoolsDeprecationWarning: project.license as a TOML table is deprecated
  !!

  ********************************************************************************
  Please use a simple string containing a SPDX expression for project.license.
  You can also use project.license-files. (Both options available on setuptools>=77.0.0).

  By 2026-Feb-18, you need to update your project and remove deprecated calls
  or your builds will no longer be supported.

  See https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#license for details.
  ********************************************************************************

  !!
  corresp(dist, value, root_dir)
  running dist_info
  creating C:\Users\Administrator\AppData\Local\Temp\pip-modern-metadata-inyji1j9\torch.egg-info
  writing C:\Users\Administrator\AppData\Local\Temp\pip-modern-metadata-inyji1j9\torch.egg-info\PKG-INFO
  writing dependency_links to C:\Users\Administrator\AppData\Local\Temp\pip-modern-metadata-inyji1j9\torch.egg-info\dependency_links.txt
  writing entry points to C:\Users\Administrator\AppData\Local\Temp\pip-modern-metadata-inyji1j9\torch.egg-info\entry_points.txt
  writing requirements to C:\Users\Administrator\AppData\Local\Temp\pip-modern-metadata-inyji1j9\torch.egg-info\requires.txt
  writing top-level names to C:\Users\Administrator\AppData\Local\Temp\pip-modern-metadata-inyji1j9\torch.egg-info\top_level.txt
  writing manifest file 'C:\Users\Administrator\AppData\Local\Temp\pip-modern-metadata-inyji1j9\torch.egg-info\SOURCES.txt'
  reading manifest file 'C:\Users\Administrator\AppData\Local\Temp\pip-modern-metadata-inyji1j9\torch.egg-info\SOURCES.txt'
  reading manifest template 'MANIFEST.in'
  warning: no files found matching 'BUILD'
  warning: no files found matching '*.BUILD'
  warning: no files found matching 'BUCK'
  warning: no files found matching '[Mm]akefile.*'
  warning: no files found matching '*.[Dd]ockerfile'
  warning: no files found matching '[Dd]ockerfile.*'
  warning: no previously-included files matching '*.o' found anywhere in distribution
  warning: no previously-included files matching '*.obj' found anywhere in distribution
  warning: no previously-included files matching '*.so' found anywhere in distribution
  warning: no previously-included files matching '*.a' found anywhere in distribution
  warning: no previously-included files matching '*.dylib' found anywhere in distribution
  no previously-included directories found matching '*\.git'
  warning: no previously-included files matching '*~' found anywhere in distribution
  warning: no previously-included files matching '*.swp' found anywhere in distribution
  adding license file 'LICENSE'
  adding license file 'NOTICE'
  writing manifest file 'C:\Users\Administrator\AppData\Local\Temp\pip-modern-metadata-inyji1j9\torch.egg-info\SOURCES.txt'
  creating 'C:\Users\Administrator\AppData\Local\Temp\pip-modern-metadata-inyji1j9\torch-2.9.0a0+git2d31c3d.dist-info'
  Preparing metadata (pyproject.toml) ...
done Collecting filelock (from torch==2.9.0a0+git2d31c3d) Using cached https://pypi.tuna.tsinghua.edu.cn/packages/42/14/42b2651a2f46b022ccd948bca9f2d5af0fd8929c4eec235b8d6d844fbe67/filelock-3.19.1-py3-none-any.whl (15 kB) Requirement already satisfied: typing-extensions>=4.10.0 in e:\pytorch_build\pytorch\rtx5070_env\lib\site-packages (from torch==2.9.0a0+git2d31c3d) (4.15.0) Collecting sympy>=1.13.3 (from torch==2.9.0a0+git2d31c3d) Using cached https://pypi.tuna.tsinghua.edu.cn/packages/a2/09/77d55d46fd61b4a135c444fc97158ef34a095e5681d0a6c10b75bf356191/sympy-1.14.0-py3-none-any.whl (6.3 MB) Link requires a different Python (3.10.10 not in: '>=3.11'): https://pypi.tuna.tsinghua.edu.cn/packages/eb/8d/776adee7bbf76365fdd7f2552710282c79a4ead5d2a46408c9043a2b70ba/networkx-3.5-py3-none-any.whl (from https://pypi.tuna.tsinghua.edu.cn/simple/networkx/) (requires-python:>=3.11) Link requires a different Python (3.10.10 not in: '>=3.11'): https://pypi.tuna.tsinghua.edu.cn/packages/6c/4f/ccdb8ad3a38e583f214547fd2f7ff1fc160c43a75af88e6aec213404b96a/networkx-3.5.tar.gz (from https://pypi.tuna.tsinghua.edu.cn/simple/networkx/) (requires-python:>=3.11) Link requires a different Python (3.10.10 not in: '>=3.11'): https://pypi.tuna.tsinghua.edu.cn/packages/3f/a1/46c1b6e202e3109d2a035b21a7e5534c5bb233ee30752d7f16a0bd4c3989/networkx-3.5rc0-py3-none-any.whl (from https://pypi.tuna.tsinghua.edu.cn/simple/networkx/) (requires-python:>=3.11) Link requires a different Python (3.10.10 not in: '>=3.11'): https://pypi.tuna.tsinghua.edu.cn/packages/90/7e/0319606a20ced20730806b9f7fe91d8a92f7da63d76a5c388f87d3f7d294/networkx-3.5rc0.tar.gz (from https://pypi.tuna.tsinghua.edu.cn/simple/networkx/) (requires-python:>=3.11) Collecting networkx>=2.5.1 (from torch==2.9.0a0+git2d31c3d) Using cached https://pypi.tuna.tsinghua.edu.cn/packages/b9/54/dd730b32ea14ea797530a4479b2ed46a6fb250f682a9cfb997e968bf0261/networkx-3.4.2-py3-none-any.whl (1.7 MB) Collecting jinja2 (from torch==2.9.0a0+git2d31c3d) Using cached https://pypi.tuna.tsinghua.edu.cn/packages/62/a1/3d680cbfd5f4b8f15abc1d571870c5fc3e594bb582bc3b64ea099db13e56/jinja2-3.1.6-py3-none-any.whl (134 kB) Collecting fsspec>=0.8.5 (from torch==2.9.0a0+git2d31c3d) Using cached https://pypi.tuna.tsinghua.edu.cn/packages/47/71/70db47e4f6ce3e5c37a607355f80da8860a33226be640226ac52cb05ef2e/fsspec-2025.9.0-py3-none-any.whl (199 kB) Collecting mpmath<1.4,>=1.1.0 (from sympy>=1.13.3->torch==2.9.0a0+git2d31c3d) Using cached https://pypi.tuna.tsinghua.edu.cn/packages/43/e3/7d92a15f894aa0c9c4b49b8ee9ac9850d6e63b03c9c32c0367a13ae62209/mpmath-1.3.0-py3-none-any.whl (536 kB) Collecting MarkupSafe>=2.0 (from jinja2->torch==2.9.0a0+git2d31c3d) Using cached 
https://pypi.tuna.tsinghua.edu.cn/packages/44/06/e7175d06dd6e9172d4a69a72592cb3f7a996a9c396eee29082826449bbc3/MarkupSafe-3.0.2-cp310-cp310-win_amd64.whl (15 kB) Building wheels for collected packages: torch Running command Building wheel for torch (pyproject.toml) Building wheel torch-2.9.0a0+git2d31c3d -- Building version 2.9.0a0+git2d31c3d E:\PyTorch_Build\pytorch\rtx5070_env\lib\site-packages\setuptools\_distutils\_msvccompiler.py:12: UserWarning: _get_vc_env is private; find an alternative (pypa/distutils#340) warnings.warn( Cloning into 'nccl'... Note: switching to '3ea7eedf3b9b94f1d9f99f4e55536dfcbd23c1ca'. You are in 'detached HEAD' state. You can look around, make experimental changes and commit them, and you can discard any commits you make in this state without impacting any branches by switching back to a branch. If you want to create a new branch to retain commits you create, you may do so (now or later) by using -c with the switch command. Example: git switch -c <new-branch-name> Or undo this operation with: git switch - Turn off this advice by setting config variable advice.detachedHead to false cmake -GNinja -DBUILD_PYTHON=True -DBUILD_TEST=True -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=E:\PyTorch_Build\pytorch\torch -DCMAKE_PREFIX_PATH=E:\PyTorch_Build\pytorch\rtx5070_env\Lib\site-packages -DCUDNN_INCLUDE_DIR=E:\Program Files\NVIDIA\CUNND\v9.12\include\12.9 -DCUDNN_LIBRARY=E:\Program Files\NVIDIA\CUNND\v9.12\lib\12.9\x64 -DCUDNN_ROOT=E:\Program Files\NVIDIA\CUNND\v9.12 -DPython_EXECUTABLE=E:\PyTorch_Build\pytorch\rtx5070_env\Scripts\python.exe -DPython_NumPy_INCLUDE_DIR=E:\PyTorch_Build\pytorch\rtx5070_env\lib\site-packages\numpy\_core\include -DTORCH_BUILD_VERSION=2.9.0a0+git2d31c3d -DTORCH_CUDA_ARCH_LIST=8.9 -DUSE_NUMPY=True -DUSE_OPENBLAS=1 E:\PyTorch_Build\pytorch CMake Deprecation Warning at CMakeLists.txt:9 (cmake_policy): The OLD behavior for policy CMP0126 will be removed from a future version of CMake. The cmake-policies(7) manual explains that the OLD behaviors of all policies are deprecated and that a policy should be set to OLD only under specific short-term circumstances. Projects should be ported to the NEW behavior and not rely on setting a policy to OLD. -- The CXX compiler identification is MSVC 19.44.35215.0 -- The C compiler identification is MSVC 19.44.35215.0 -- Detecting CXX compiler ABI info -- Detecting CXX compiler ABI info - done -- Check for working CXX compiler: C:/Program Files (x86)/Microsoft Visual Studio/2022/BuildTools/VC/Tools/MSVC/14.44.35207/bin/Hostx64/x64/cl.exe - skipped -- Detecting CXX compile features -- Detecting CXX compile features - done -- Detecting C compiler ABI info -- Detecting C compiler ABI info - done -- Check for working C compiler: C:/Program Files (x86)/Microsoft Visual Studio/2022/BuildTools/VC/Tools/MSVC/14.44.35207/bin/Hostx64/x64/cl.exe - skipped -- Detecting C compile features -- Detecting C compile features - done -- Not forcing any particular BLAS to be found CMake Warning at CMakeLists.txt:421 (message): TensorPipe cannot be used on Windows. Set it to OFF CMake Warning at CMakeLists.txt:423 (message): KleidiAI cannot be used on Windows. Set it to OFF CMake Warning at CMakeLists.txt:435 (message): Libuv is not installed in current conda env. Set USE_DISTRIBUTED to OFF. Please run command 'conda install -c conda-forge libuv=1.39' to install libuv. 
-- Performing Test C_HAS_AVX_1 -- Performing Test C_HAS_AVX_1 - Success -- Performing Test C_HAS_AVX2_1 -- Performing Test C_HAS_AVX2_1 - Success -- Performing Test C_HAS_AVX512_1 -- Performing Test C_HAS_AVX512_1 - Success -- Performing Test CXX_HAS_AVX_1 -- Performing Test CXX_HAS_AVX_1 - Success -- Performing Test CXX_HAS_AVX2_1 -- Performing Test CXX_HAS_AVX2_1 - Success -- Performing Test CXX_HAS_AVX512_1 -- Performing Test CXX_HAS_AVX512_1 - Success -- Current compiler supports avx2 extension. Will build perfkernels. -- Performing Test COMPILER_SUPPORTS_HIDDEN_VISIBILITY -- Performing Test COMPILER_SUPPORTS_HIDDEN_VISIBILITY - Failed -- Performing Test COMPILER_SUPPORTS_HIDDEN_INLINE_VISIBILITY -- Performing Test COMPILER_SUPPORTS_HIDDEN_INLINE_VISIBILITY - Failed -- Could not find hardware support for NEON on this machine. -- No OMAP3 processor on this machine. -- No OMAP4 processor on this machine. -- Compiler does not support SVE extension. Will not build perfkernels. CMake Warning at CMakeLists.txt:841 (message): x64 operating system is required for FBGEMM. Not compiling with FBGEMM. Turn this warning off by USE_FBGEMM=OFF. -- Performing Test HAS/UTF_8 -- Performing Test HAS/UTF_8 - Success -- Found CUDA: E:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v13.0 (found version "13.0") -- The CUDA compiler identification is NVIDIA 13.0.48 with host compiler MSVC 19.44.35215.0 -- Detecting CUDA compiler ABI info -- Detecting CUDA compiler ABI info - done -- Check for working CUDA compiler: E:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v13.0/bin/nvcc.exe - skipped -- Detecting CUDA compile features -- Detecting CUDA compile features - done -- Found CUDAToolkit: E:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v13.0/include (found version "13.0.48") -- PyTorch: CUDA detected: 13.0 -- PyTorch: CUDA nvcc is: E:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v13.0/bin/nvcc.exe -- PyTorch: CUDA toolkit directory: E:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v13.0 -- PyTorch: Header version is: 13.0 -- Found Python: E:\PyTorch_Build\pytorch\rtx5070_env\Scripts\python.exe (found version "3.10.10") found components: Interpreter CMake Warning at cmake/public/cuda.cmake:140 (message): Failed to compute shorthash for libnvrtc.so Call Stack (most recent call first): cmake/Dependencies.cmake:44 (include) CMakeLists.txt:869 (include) -- Found CUDNN: E:/Program Files/NVIDIA/CUNND/v9.12/lib/13.0/x64/cudnn.lib -- Could NOT find CUSPARSELT (missing: CUSPARSELT_LIBRARY_PATH CUSPARSELT_INCLUDE_PATH) CMake Warning at cmake/public/cuda.cmake:226 (message): Cannot find cuSPARSELt library. Turning the option off Call Stack (most recent call first): cmake/Dependencies.cmake:44 (include) CMakeLists.txt:869 (include) -- Could NOT find CUDSS (missing: CUDSS_LIBRARY_PATH CUDSS_INCLUDE_PATH) CMake Warning at cmake/public/cuda.cmake:242 (message): Cannot find CUDSS library. Turning the option off Call Stack (most recent call first): cmake/Dependencies.cmake:44 (include) CMakeLists.txt:869 (include) -- USE_CUFILE is set to 0. Compiling without cuFile support CMake Warning at cmake/public/cuda.cmake:317 (message): pytorch is not compatible with CMAKE_CUDA_ARCHITECTURES and will ignore its value. Please configure TORCH_CUDA_ARCH_LIST instead. Call Stack (most recent call first): cmake/Dependencies.cmake:44 (include) CMakeLists.txt:869 (include) -- Added CUDA NVCC flags for: -gencode;arch=compute_89,code=sm_89 CMake Warning at cmake/Dependencies.cmake:95 (message): Not compiling with XPU. 
Could NOT find SYCL. Suppress this warning with -DUSE_XPU=OFF. Call Stack (most recent call first): CMakeLists.txt:869 (include) -- Building using own protobuf under third_party per request. -- Use custom protobuf build. CMake Warning at cmake/ProtoBuf.cmake:37 (message): Ancient protobuf forces CMake compatibility Call Stack (most recent call first): cmake/ProtoBuf.cmake:87 (custom_protobuf_find) cmake/Dependencies.cmake:107 (include) CMakeLists.txt:869 (include) CMake Deprecation Warning at third_party/protobuf/cmake/CMakeLists.txt:2 (cmake_minimum_required): Compatibility with CMake < 3.10 will be removed from a future version of CMake. Update the VERSION argument <min> value. Or, use the <min>...<max> syntax to tell CMake that the project requires at least <min> but has been updated to work with policies introduced by <max> or earlier. -- -- 3.13.0.0 -- Performing Test CMAKE_HAVE_LIBC_PTHREAD -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed -- Looking for pthread_create in pthreads -- Looking for pthread_create in pthreads - not found -- Looking for pthread_create in pthread -- Looking for pthread_create in pthread - not found -- Found Threads: TRUE -- Caffe2 protobuf include directory: $<BUILD_INTERFACE:E:/PyTorch_Build/pytorch/third_party/protobuf/src>$<INSTALL_INTERFACE:include> -- Trying to find preferred BLAS backend of choice: MKL -- MKL_THREADING = OMP -- Looking for sys/types.h -- Looking for sys/types.h - found -- Looking for stdint.h -- Looking for stdint.h - found -- Looking for stddef.h -- Looking for stddef.h - found -- Check size of void* -- Check size of void* - done -- MKL_THREADING = OMP CMake Warning at cmake/Dependencies.cmake:213 (message): MKL could not be found. Defaulting to Eigen Call Stack (most recent call first): CMakeLists.txt:869 (include) CMake Warning at cmake/Dependencies.cmake:279 (message): Preferred BLAS (MKL) cannot be found, now searching for a general BLAS library Call Stack (most recent call first): CMakeLists.txt:869 (include) -- MKL_THREADING = OMP -- Checking for [mkl_intel_lp64 - mkl_intel_thread - mkl_core - libiomp5md] -- Library mkl_intel_lp64: not found -- Checking for [mkl_intel - mkl_intel_thread - mkl_core - libiomp5md] -- Library mkl_intel: not found -- Checking for [mkl_intel_lp64 - mkl_intel_thread - mkl_core] -- Library mkl_intel_lp64: not found -- Checking for [mkl_intel - mkl_intel_thread - mkl_core] -- Library mkl_intel: not found -- Checking for [mkl_intel_lp64 - mkl_sequential - mkl_core] -- Library mkl_intel_lp64: not found -- Checking for [mkl_intel - mkl_sequential - mkl_core] -- Library mkl_intel: not found -- Checking for [mkl_intel_lp64 - mkl_core - libiomp5md - pthread] -- Library mkl_intel_lp64: not found -- Checking for [mkl_intel - mkl_core - libiomp5md - pthread] -- Library mkl_intel: not found -- Checking for [mkl_intel_lp64 - mkl_core - pthread] -- Library mkl_intel_lp64: not found -- Checking for [mkl_intel - mkl_core - pthread] -- Library mkl_intel: not found -- Checking for [mkl - guide - pthread - m] -- Library mkl: not found -- MKL library not found -- Checking for [blis] -- Library blis: BLAS_blis_LIBRARY-NOTFOUND -- Checking for [Accelerate] -- Library Accelerate: BLAS_Accelerate_LIBRARY-NOTFOUND -- Checking for [vecLib] -- Library vecLib: BLAS_vecLib_LIBRARY-NOTFOUND -- Checking for [flexiblas] -- Library flexiblas: BLAS_flexiblas_LIBRARY-NOTFOUND -- Checking for [openblas] -- Library openblas: BLAS_openblas_LIBRARY-NOTFOUND -- Checking for [openblas - pthread - m] -- Library openblas: 
BLAS_openblas_LIBRARY-NOTFOUND -- Checking for [openblas - pthread - m - gomp] -- Library openblas: BLAS_openblas_LIBRARY-NOTFOUND -- Checking for [libopenblas] -- Library libopenblas: BLAS_libopenblas_LIBRARY-NOTFOUND -- Checking for [goto2 - gfortran] -- Library goto2: BLAS_goto2_LIBRARY-NOTFOUND -- Checking for [goto2 - gfortran - pthread] -- Library goto2: BLAS_goto2_LIBRARY-NOTFOUND -- Checking for [acml - gfortran] -- Library acml: BLAS_acml_LIBRARY-NOTFOUND -- Checking for [blis] -- Library blis: BLAS_blis_LIBRARY-NOTFOUND -- Could NOT find Atlas (missing: Atlas_CBLAS_INCLUDE_DIR Atlas_CLAPACK_INCLUDE_DIR Atlas_CBLAS_LIBRARY Atlas_BLAS_LIBRARY Atlas_LAPACK_LIBRARY) -- Checking for [ptf77blas - atlas - gfortran] -- Library ptf77blas: BLAS_ptf77blas_LIBRARY-NOTFOUND -- Checking for [] -- Looking for sgemm_ -- Looking for sgemm_ - not found -- Cannot find a library with BLAS API. Not using BLAS. -- Using pocketfft in directory: E:/PyTorch_Build/pytorch/third_party/pocketfft/ CMake Deprecation Warning at third_party/pthreadpool/CMakeLists.txt:1 (CMAKE_MINIMUM_REQUIRED): Compatibility with CMake < 3.10 will be removed from a future version of CMake. Update the VERSION argument <min> value. Or, use the <min>...<max> syntax to tell CMake that the project requires at least <min> but has been updated to work with policies introduced by <max> or earlier. CMake Deprecation Warning at third_party/FXdiv/CMakeLists.txt:1 (CMAKE_MINIMUM_REQUIRED): Compatibility with CMake < 3.10 will be removed from a future version of CMake. Update the VERSION argument <min> value. Or, use the <min>...<max> syntax to tell CMake that the project requires at least <min> but has been updated to work with policies introduced by <max> or earlier. CMake Deprecation Warning at third_party/cpuinfo/CMakeLists.txt:1 (CMAKE_MINIMUM_REQUIRED): Compatibility with CMake < 3.10 will be removed from a future version of CMake. Update the VERSION argument <min> value. Or, use the <min>...<max> syntax to tell CMake that the project requires at least <min> but has been updated to work with policies introduced by <max> or earlier. -- The ASM compiler identification is MSVC CMake Warning (dev) at rtx5070_env/Lib/site-packages/cmake/data/share/cmake-4.1/Modules/CMakeDetermineASMCompiler.cmake:234 (message): Policy CMP194 is not set: MSVC is not an assembler for language ASM. Run "cmake --help-policy CMP194" for policy details. Use the cmake_policy command to set the policy and suppress this warning. Call Stack (most recent call first): third_party/XNNPACK/CMakeLists.txt:18 (PROJECT) This warning is for project developers. Use -Wno-dev to suppress it. -- Found assembler: C:/Program Files (x86)/Microsoft Visual Studio/2022/BuildTools/VC/Tools/MSVC/14.44.35207/bin/Hostx64/x64/cl.exe -- Building for XNNPACK_TARGET_PROCESSOR: x86_64 -- Generating microkernels.cmake
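The configure log above finds neither MKL nor any other BLAS (so it falls back to Eigen), cannot find cuSPARSELt or cuDSS, and disables USE_DISTRIBUTED because libuv is missing. None of these stop the build, but if an optimized BLAS is wanted it has to be installed, or pointed to, before the configure step is re-run. The sketch below assumes a pip-based environment; the OpenBLAS path is hypothetical.

```powershell
# Option 1: MKL headers and libraries from PyPI into the active environment
pip install mkl mkl-include mkl-devel

# Option 2: ask the build for OpenBLAS instead and tell it where to look
$env:BLAS = "OpenBLAS"                    # BLAS backend selector read by PyTorch's build scripts
$env:OpenBLAS_HOME = "E:\libs\OpenBLAS"   # hypothetical install location of an OpenBLAS build

# The USE_DISTRIBUTED warning suggests libuv; in a conda environment that would be:
# conda install -c conda-forge libuv=1.39
```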

kd> ed nt!Kd_DEFAULT_Mask 0xFFFFFFFF
kd> ed nt!Kd_IHVDRIVER_Mask 0xFFFFFFFF
kd> g
[+] [DriverEntry] Driver load starting
[+] [DriverEntry] Driver loaded successfully
[+] [ProcessNotifyCallback] Target process oxygen.exe created (PID: 8052)
[+] [ProcessNotifyCallback] Worker thread created
[+] Worker thread started for hook installation on PID: 8052
Break instruction exception - code 80000003 (first chance)
obpcallback!InstallHook+0x50:
fffff8030e2a12f0 cc              int     3
kd> g
[+] [InstallHook] Found target function address: FFFFF803062EFB60
[PTE_HOOK] Starting page isolation: PID=8052, address=0xFFFFF803062EFB60
[PTE_HOOK] Splitting large page: input PDE=0xa00000002a001a1, output PDE=0xFFFFBB878EFA3F80
[PTE_HOOK] Large-page split complete: new PTE table physical address=0x239dab000
[PTE_HOOK] G-Bit Info:
  Align Address: 0xFFFFF803062EF000
  IsLargePage: 1
  PDE: 0xa00000002a000a1 (Address: 0xFFFFFE7F3E00C188)
[PTE_HOOK] Clearing large-page G bit: PDE=0xa00000002a000a1
[PTE_HOOK] Starting page-table isolation: CR3=0x12e16c000, address=0xFFFFF803062EF000
KDTARGET: Refreshing KD connection

*** Fatal System Error: 0x0000007e
                       (0xFFFFFFFFC0000005,0xFFFFF8030E2A2265,0xFFFF80031C95B638,0xFFFF80031C95AE70)

Break instruction exception - code 80000003 (first chance)

A fatal system error has occurred.
Debugger entered on first try; Bugcheck callbacks have not been invoked.

A fatal system error has occurred.

For analysis of this file, run !analyze -v
nt!DbgBreakPointWithStatus:
fffff80305ffd0b0 cc              int     3
kd> !analyze -v
Connected to Windows 10 19041 x64 target at (Wed Jul 9 20:20:01.053 2025 (UTC + 8:00)), ptr64 TRUE
Loading Kernel Symbols
...................................
Press ctrl-c (cdb, kd, ntsd) or ctrl-break (windbg) to abort symbol loads that take too long.
Run !sym noisy before .reload to track down problems loading symbols.
............................
................................................................
.....................................................
Loading User Symbols
PEB is paged out (Peb.Ldr = 0000000000259018). Type ".hh dbgerr001" for details
Loading unloaded module list
......
ERROR: FindPlugIns 8007007b
*******************************************************************************
*                                                                             *
*                        Bugcheck Analysis                                    *
*                                                                             *
*******************************************************************************

SYSTEM_THREAD_EXCEPTION_NOT_HANDLED (7e)
This is a very common BugCheck.  Usually the exception address pinpoints
the driver/function that caused the problem.  Always note this address
as well as the link date of the driver/image that contains this address.
Arguments:
Arg1: ffffffffc0000005, The exception code that was not handled
Arg2: fffff8030e2a2265, The address that the exception occurred at
Arg3: ffff80031c95b638, Exception Record Address
Arg4: ffff80031c95ae70, Context Record Address

Debugging Details:
------------------

Press ctrl-c (cdb, kd, ntsd) or ctrl-break (windbg) to abort symbol loads that take too long.
Run !sym noisy before .reload to track down problems loading symbols.
PEB is paged out (Peb.Ldr = 0000000000259018). Type ".hh dbgerr001" for details
PEB is paged out (Peb.Ldr = 0000000000259018). Type ".hh dbgerr001" for details
PEB is paged out (Peb.Ldr = 0000000000259018). Type ".hh dbgerr001" for details
PEB is paged out (Peb.Ldr = 0000000000259018). Type ".hh dbgerr001" for details
PEB is paged out (Peb.Ldr = 0000000000259018). Type ".hh dbgerr001" for details
PEB is paged out (Peb.Ldr = 0000000000259018). Type ".hh dbgerr001" for details
PEB is paged out (Peb.Ldr = 0000000000259018). Type ".hh dbgerr001" for details
PEB is paged out (Peb.Ldr = 0000000000259018). Type ".hh dbgerr001" for details
PEB is paged out (Peb.Ldr = 0000000000259018).
Type ".hh dbgerr001" for details PEB is paged out (Peb.Ldr = 0000000000259018). Type ".hh dbgerr001" for details PEB is paged out (Peb.Ldr = 0000000000259018). Type ".hh dbgerr001" for details PEB is paged out (Peb.Ldr = 0000000000259018). Type ".hh dbgerr001" for details PEB is paged out (Peb.Ldr = 0000000000259018). Type ".hh dbgerr001" for details PEB is paged out (Peb.Ldr = 0000000000259018). Type ".hh dbgerr001" for details PEB is paged out (Peb.Ldr = 0000000000259018). Type ".hh dbgerr001" for details PEB is paged out (Peb.Ldr = 0000000000259018). Type ".hh dbgerr001" for details PEB is paged out (Peb.Ldr = 0000000000259018). Type ".hh dbgerr001" for details PEB is paged out (Peb.Ldr = 0000000000259018). Type ".hh dbgerr001" for details KEY_VALUES_STRING: 1 Key : AV.Type Value: Read Key : Analysis.CPU.mSec Value: 4234 Key : Analysis.Elapsed.mSec Value: 39170 Key : Analysis.IO.Other.Mb Value: 0 Key : Analysis.IO.Read.Mb Value: 2 Key : Analysis.IO.Write.Mb Value: 0 Key : Analysis.Init.CPU.mSec Value: 3250 Key : Analysis.Init.Elapsed.mSec Value: 59417 Key : Analysis.Memory.CommitPeak.Mb Value: 72 Key : Analysis.Version.DbgEng Value: 10.0.27829.1001 Key : Analysis.Version.Description Value: 10.2503.24.01 amd64fre Key : Analysis.Version.Ext Value: 1.2503.24.1 Key : Bugcheck.Code.KiBugCheckData Value: 0x7e Key : Bugcheck.Code.LegacyAPI Value: 0x7e Key : Bugcheck.Code.TargetModel Value: 0x7e Key : Failure.Bucket Value: AV_obpcallback!PteHookManager::fn_isolation_pagetable Key : Failure.Exception.Code Value: 0xc0000005 Key : Failure.Exception.IP.Address Value: 0xfffff8030e2a2265 Key : Failure.Exception.IP.Module Value: obpcallback Key : Failure.Exception.IP.Offset Value: 0x2265 Key : Failure.Exception.Record Value: 0xffff80031c95b638 Key : Failure.Hash Value: {03bbd39e-299b-91b8-212d-d002a6bff650} Key : Hypervisor.Enlightenments.Value Value: 12576 Key : Hypervisor.Enlightenments.ValueHex Value: 0x3120 Key : Hypervisor.Flags.AnyHypervisorPresent Value: 1 Key : Hypervisor.Flags.ApicEnlightened Value: 0 Key : Hypervisor.Flags.ApicVirtualizationAvailable Value: 0 Key : Hypervisor.Flags.AsyncMemoryHint Value: 0 Key : Hypervisor.Flags.CoreSchedulerRequested Value: 0 Key : Hypervisor.Flags.CpuManager Value: 0 Key : Hypervisor.Flags.DeprecateAutoEoi Value: 1 Key : Hypervisor.Flags.DynamicCpuDisabled Value: 0 Key : Hypervisor.Flags.Epf Value: 0 Key : Hypervisor.Flags.ExtendedProcessorMasks Value: 0 Key : Hypervisor.Flags.HardwareMbecAvailable Value: 0 Key : Hypervisor.Flags.MaxBankNumber Value: 0 Key : Hypervisor.Flags.MemoryZeroingControl Value: 0 Key : Hypervisor.Flags.NoExtendedRangeFlush Value: 1 Key : Hypervisor.Flags.NoNonArchCoreSharing Value: 0 Key : Hypervisor.Flags.Phase0InitDone Value: 1 Key : Hypervisor.Flags.PowerSchedulerQos Value: 0 Key : Hypervisor.Flags.RootScheduler Value: 0 Key : Hypervisor.Flags.SynicAvailable Value: 1 Key : Hypervisor.Flags.UseQpcBias Value: 0 Key : Hypervisor.Flags.Value Value: 536632 Key : Hypervisor.Flags.ValueHex Value: 0x83038 Key : Hypervisor.Flags.VpAssistPage Value: 1 Key : Hypervisor.Flags.VsmAvailable Value: 0 Key : Hypervisor.RootFlags.AccessStats Value: 0 Key : Hypervisor.RootFlags.CrashdumpEnlightened Value: 0 Key : Hypervisor.RootFlags.CreateVirtualProcessor Value: 0 Key : Hypervisor.RootFlags.DisableHyperthreading Value: 0 Key : Hypervisor.RootFlags.HostTimelineSync Value: 0 Key : Hypervisor.RootFlags.HypervisorDebuggingEnabled Value: 0 Key : Hypervisor.RootFlags.IsHyperV Value: 0 Key : Hypervisor.RootFlags.LivedumpEnlightened Value: 0 Key : 
Hypervisor.RootFlags.MapDeviceInterrupt Value: 0 Key : Hypervisor.RootFlags.MceEnlightened Value: 0 Key : Hypervisor.RootFlags.Nested Value: 0 Key : Hypervisor.RootFlags.StartLogicalProcessor Value: 0 Key : Hypervisor.RootFlags.Value Value: 0 Key : Hypervisor.RootFlags.ValueHex Value: 0x0 Key : SecureKernel.HalpHvciEnabled Value: 0 Key : WER.OS.Branch Value: vb_release Key : WER.OS.Version Value: 10.0.19041.1 BUGCHECK_CODE: 7e BUGCHECK_P1: ffffffffc0000005 BUGCHECK_P2: fffff8030e2a2265 BUGCHECK_P3: ffff80031c95b638 BUGCHECK_P4: ffff80031c95ae70 FAULTING_THREAD: ffffbb8793f90080 EXCEPTION_RECORD: ffff80031c95b638 -- (.exr 0xffff80031c95b638) ExceptionAddress: fffff8030e2a2265 (obpcallback!PteHookManager::fn_isolation_pagetable+0x0000000000000315) ExceptionCode: c0000005 (Access violation) ExceptionFlags: 00000000 NumberParameters: 2 Parameter[0]: 0000000000000000 Parameter[1]: 0000000239dab000 Attempt to read from address 0000000239dab000 CONTEXT: ffff80031c95ae70 -- (.cxr 0xffff80031c95ae70) rax=0000000239dab000 rbx=ffffbb8793f90080 rcx=0000000000001000 rdx=0000000000000000 rsi=0000000239dab000 rdi=ffff9181d99a7000 rip=fffff8030e2a2265 rsp=ffff80031c95b870 rbp=0000000000000080 r8=ffff80031c95b940 r9=8000000000000200 r10=0000000000000000 r11=ffffbb8790910000 r12=0000000000000280 r13=0000000000000000 r14=ffffbb878f4af040 r15=fffff8030c094000 iopl=0 nv up ei pl nz ac po nc cs=0010 ss=0018 ds=002b es=002b fs=0053 gs=002b efl=00050216 obpcallback!PteHookManager::fn_isolation_pagetable+0x315: fffff8030e2a2265 f3a4 rep movs byte ptr [rdi],byte ptr [rsi] Resetting default scope PROCESS_NAME: oxygen.exe READ_ADDRESS: unable to get nt!PspSessionIdBitmap 0000000239dab000 ERROR_CODE: (NTSTATUS) 0xc0000005 - 0x%p 0x%p %s EXCEPTION_CODE_STR: c0000005 EXCEPTION_PARAMETER1: 0000000000000000 EXCEPTION_PARAMETER2: 0000000239dab000 EXCEPTION_STR: 0xc0000005 STACK_TEXT: ffff80031c95b870 fffff8030e2a1da1 : ffffbb878efa7000 000000012e16c000 fffff803062ef000 ffffbb878efa3f80 : obpcallback!PteHookManager::fn_isolation_pagetable+0x315 [C:\Users\17116\source\repos\obpcallback\obpcallback\Ô´.cpp @ 526] ffff80031c95b9c0 fffff8030e2a25df : ffffbb878efa7000 0000000000001f74 fffff803062efb60 0000000000000035 : obpcallback!PteHookManager::fn_isolation_pages+0x261 [C:\Users\17116\source\repos\obpcallback\obpcallback\Ô´.cpp @ 602] ffff80031c95bad0 fffff8030e2a1344 : ffffbb878efa7000 0000000000001f74 ffff80031c95bbb0 fffff8030e2a1500 : obpcallback!PteHookManager::fn_pte_inline_hook_bp_pg+0x4f [C:\Users\17116\source\repos\obpcallback\obpcallback\Ô´.cpp @ 723] ffff80031c95bb90 fffff8030e2a13d4 : fffff8030e2a44a0 0000000000001f74 0000000000000000 fffff803040ef180 : obpcallback!InstallHook+0xa4 [C:\Users\17116\source\repos\obpcallback\obpcallback\Ô´.cpp @ 982] ffff80031c95bbe0 fffff80305ea29a5 : 0000000000001f74 fffff8030e2a13a0 0000000000001f74 fffff80300000001 : obpcallback!InstallHookWorker+0x34 [C:\Users\17116\source\repos\obpcallback\obpcallback\Ô´.cpp @ 1010] ffff80031c95bc10 fffff80305ffc868 : fffff803040ef180 ffffbb8793f90080 fffff80305ea2950 ffff80031c95bc80 : nt!PspSystemThreadStartup+0x55 ffff80031c95bc60 0000000000000000 : ffff80031c95c000 ffff80031c956000 0000000000000000 0000000000000000 : nt!KiStartSystemThread+0x28 FAULTING_SOURCE_LINE: C:\Users\17116\source\repos\obpcallback\obpcallback\Ô´.cpp FAULTING_SOURCE_FILE: C:\Users\17116\source\repos\obpcallback\obpcallback\Ô´.cpp FAULTING_SOURCE_LINE_NUMBER: 526 FAULTING_SOURCE_CODE: 522: memcpy(Va4kb, replace_align_addr, PAGE_SIZE); 523: 524: if (Table.IsLargePage && 
split_pde_ptr) {
  525:         auto split_pde = (decltype(PAGE_TABLE::PdeAddress))split_pde_ptr;
> 526:         memcpy(VaPt, (void*)(split_pde->flags.page_frame_number << 12), PAGE_SIZE);
  527:     }
  528:     else {
  529:         memcpy(VaPt, (void*)(Table.PdeAddress->flags.page_frame_number << 12), PAGE_SIZE);
  530:     }
  531:

SYMBOL_NAME:  obpcallback!PteHookManager::fn_isolation_pagetable+315
MODULE_NAME: obpcallback
IMAGE_NAME:  obpcallback.sys
STACK_COMMAND:  .cxr 0xffff80031c95ae70 ; kb
BUCKET_ID_FUNC_OFFSET:  315
FAILURE_BUCKET_ID:  AV_obpcallback!PteHookManager::fn_isolation_pagetable
OS_VERSION:  10.0.19041.1
BUILDLAB_STR:  vb_release
OSPLATFORM_TYPE:  x64
OSNAME:  Windows 10
FAILURE_ID_HASH:  {03bbd39e-299b-91b8-212d-d002a6bff650}
Followup:     MachineOwner
---------

With your changes applied, it still crashes as shown above. The code is as follows:

#include <ntifs.h>
#include <ntddk.h>
#include <intrin.h>
#include "ptehook.h"

#define CR0_WP (1 << 16)

HANDLE targetProcessId = NULL;

typedef INT(*LDE_DISASM)(PVOID address, INT bits);
typedef unsigned long DWORD;
typedef unsigned __int64 ULONG64;  // use the WDK-standard type
typedef unsigned char BYTE;
typedef LONG NTSTATUS;

// Corrected jump-instruction layout
#pragma pack(push, 1)
typedef struct _JMP_ABS {
    BYTE opcode[6];     // FF 25 00 00 00 00
    ULONG64 address;    // 8-byte absolute address
} JMP_ABS, * PJMP_ABS;
#pragma pack(pop)

LDE_DISASM lde_disasm;

// Initialize the length-disassembler engine
VOID lde_init() {
    lde_disasm = (LDE_DISASM)ExAllocatePool(NonPagedPool, 12800);
    memcpy(lde_disasm, szShellCode, 12800);
}

// Get the full instruction length so the patch never truncates an instruction
ULONG GetFullPatchSize(PUCHAR Address) {
    ULONG LenCount = 0, Len = 0;
    // at least 14 bytes are needed
    while (LenCount <= 14) {
        Len = lde_disasm(Address, 64);
        Address = Address + Len;
        LenCount = LenCount + Len;
    }
    return LenCount;
}

#define PROCESS_NAME_LENGTH 16
#define DRIVER_TAG 'HKOB'

EXTERN_C char* PsGetProcessImageFileName(PEPROCESS process);

char target_process_name[] = "oxygen.exe";

typedef NTSTATUS(*fn_ObReferenceObjectByHandleWithTag)(
    HANDLE Handle,
    ACCESS_MASK DesiredAccess,
    POBJECT_TYPE ObjectType,
    KPROCESSOR_MODE AccessMode,
    ULONG Tag,
    PVOID* Object,
    POBJECT_HANDLE_INFORMATION HandleInformation
    );

fn_ObReferenceObjectByHandleWithTag g_OriginalObReferenceObjectByHandleWithTag = NULL;

// PTE Hook Framework
#define MAX_G_BIT_RECORDS 128
#define MAX_HOOK_COUNT 64
#define PAGE_ALIGN(va) ((PVOID)((ULONG_PTR)(va) & ~0xFFF))
#define PDPTE_PS_BIT (1 << 7)
#define PDE_PS_BIT (1 << 7)
#define PTE_NX_BIT (1ULL << 63)
#define CACHE_WB (6ULL << 3)

// Page-table structure definitions
typedef struct _PAGE_TABLE {
    UINT64 LineAddress;
    union {
        struct {
            UINT64 present : 1; UINT64 write : 1; UINT64 user : 1; UINT64 write_through : 1;
            UINT64 cache_disable : 1; UINT64 accessed : 1; UINT64 dirty : 1; UINT64 pat : 1;
            UINT64 global : 1; UINT64 ignored_1 : 3; UINT64 page_frame_number : 36;
            UINT64 reserved_1 : 4; UINT64 ignored_2 : 7; UINT64 protection_key : 4;
            UINT64 execute_disable : 1;
        } flags;
        UINT64 value;
    }*PteAddress;
    union {
        struct {
            UINT64 present : 1; UINT64 write : 1; UINT64 user : 1; UINT64 write_through : 1;
            UINT64 cache_disable : 1; UINT64 accessed : 1; UINT64 dirty : 1; UINT64 large_page : 1;
            UINT64 global : 1; UINT64 ignored_2 : 3; UINT64 page_frame_number : 36;
            UINT64 reserved_1 : 4; UINT64 ignored_3 : 7; UINT64 protection_key : 4;
            UINT64 execute_disable : 1;
        } flags;
        UINT64 value;
    }*PdeAddress;
    union {
        struct {
            UINT64 present : 1; UINT64 write : 1; UINT64 user : 1; UINT64 write_through : 1;
            UINT64 cache_disable : 1; UINT64 accessed : 1; UINT64 ignored_1 : 1; UINT64 page_size : 1;
            UINT64 ignored_2 : 4; UINT64 page_frame_number : 36; UINT64 reserved_1 : 4;
            UINT64 ignored_3 : 7; UINT64 protection_key : 4; UINT64 execute_disable : 1;
        } flags;
        UINT64 value;
}*PdpteAddress; UINT64* Pml4Address; BOOLEAN IsLargePage; BOOLEAN Is1GBPage; UINT64 OriginalPte; UINT64 OriginalPde; UINT64 OriginalPdpte; UINT64 OriginalPml4e; HANDLE ProcessId; } PAGE_TABLE, * PPAGE_TABLE; // G位信息记录结构体 typedef struct _G_BIT_INFO { void* AlignAddress; union { struct { UINT64 present : 1; UINT64 write : 1; UINT64 user : 1; UINT64 write_through : 1; UINT64 cache_disable : 1; UINT64 accessed : 1; UINT64 dirty : 1; UINT64 large_page : 1; UINT64 global : 1; UINT64 ignored_2 : 3; UINT64 page_frame_number : 36; UINT64 reserved_1 : 4; UINT64 ignored_3 : 7; UINT64 protection_key : 4; UINT64 execute_disable : 1; } flags; UINT64 value; }*PdeAddress; union { struct { UINT64 present : 1; UINT64 write : 1; UINT64 user : 1; UINT64 write_through : 1; UINT64 cache_disable : 1; UINT64 accessed : 1; UINT64 dirty : 1; UINT64 pat : 1; UINT64 global : 1; UINT64 ignored_1 : 3; UINT64 page_frame_number : 36; UINT64 reserved_1 : 4; UINT64 ignored_2 : 7; UINT64 protection_key : 4; UINT64 execute_disable : 1; } flags; UINT64 value; }*PteAddress; BOOLEAN IsLargePage; } G_BIT_INFO, * PG_BIT_INFO; typedef struct _HOOK_INFO { void* OriginalAddress; void* HookAddress; UINT8 OriginalBytes[20]; UINT8 HookBytes[20]; UINT32 HookLength; BOOLEAN IsHooked; HANDLE ProcessId; union { struct { UINT64 present : 1; UINT64 write : 1; UINT64 user : 1; UINT64 write_through : 1; UINT64 cache_disable : 1; UINT64 accessed : 1; UINT64 dirty : 1; UINT64 pat : 1; UINT64 global : 1; UINT64 ignored_1 : 3; UINT64 page_frame_number : 36; UINT64 reserved_1 : 4; UINT64 ignored_2 : 7; UINT64 protection_key : 4; UINT64 execute_disable : 1; } flags; UINT64 value; }*HookedPte; union { struct { UINT64 present : 1; UINT64 write : 1; UINT64 user : 1; UINT64 write_through : 1; UINT64 cache_disable : 1; UINT64 accessed : 1; UINT64 dirty : 1; UINT64 large_page : 1; UINT64 global : 1; UINT64 ignored_2 : 3; UINT64 page_frame_number : 36; UINT64 reserved_1 : 4; UINT64 ignored_3 : 7; UINT64 protection_key : 4; UINT64 execute_disable : 1; } flags; UINT64 value; }*HookedPde; } HOOK_INFO; class PteHookManager { public: bool fn_pte_inline_hook_bp_pg(HANDLE process_id, _Inout_ void** ori_addr, void* hk_addr); bool fn_remove_hook(HANDLE process_id, void* hook_addr); static PteHookManager* GetInstance(); HOOK_INFO* GetHookInfo() { return m_HookInfo; } char* GetTrampLinePool() { return m_TrampLinePool; } UINT32 GetHookCount() { return m_HookCount; } bool fn_resume_global_bits(void* align_addr); ~PteHookManager(); // 添加析构函数声明 private: bool WriteTrampolineInstruction(void* trampoline, const JMP_ABS& jmpCmd); void fn_add_g_bit_info(void* align_addr, void* pde_address, void* pte_address); bool fn_isolation_pagetable(UINT64 cr3_val, void* replace_align_addr, void* split_pde); bool fn_isolation_pages(HANDLE process_id, void* ori_addr); bool fn_split_large_pages(void* in_pde, void* out_pde); NTSTATUS get_page_table(UINT64 cr3, PAGE_TABLE& table); void* fn_pa_to_va(UINT64 pa); UINT64 fn_va_to_pa(void* va); __forceinline KIRQL DisableWriteProtection(); __forceinline void EnableWriteProtection(KIRQL oldIrql); void logger(const char* info, bool is_err, LONG err_code = 0); void PrintPageTableInfo(const PAGE_TABLE& table); void PrintHookInfo(const HOOK_INFO& hookInfo); void PrintGBitInfo(const G_BIT_INFO& gbitInfo); static constexpr SIZE_T MAX_HOOKS = 256; // 根据需求调整 G_BIT_INFO m_GbitRecords[MAX_G_BIT_RECORDS]; UINT32 m_GbitCount = 0; void* m_PteBase = 0; HOOK_INFO m_HookInfo[MAX_HOOK_COUNT] = { 0 }; DWORD m_HookCount = 0; char* m_TrampLinePool = nullptr; // 
合并为一个声明 UINT32 m_PoolUsed = 0; static PteHookManager* m_Instance; }; PteHookManager* PteHookManager::m_Instance = nullptr; // 实现部分 __forceinline KIRQL PteHookManager::DisableWriteProtection() { KIRQL oldIrql = KeRaiseIrqlToDpcLevel(); UINT64 cr0 = __readcr0(); __writecr0(cr0 & ~0x10000); // 清除CR0.WP位 _mm_mfence(); return oldIrql; } __forceinline void PteHookManager::EnableWriteProtection(KIRQL oldIrql) { _mm_mfence(); UINT64 cr0 = __readcr0(); __writecr0(cr0 | 0x10000); // 设置CR0.WP位 KeLowerIrql(oldIrql); } void PteHookManager::logger(const char* info, bool is_err, LONG err_code) { if (is_err) { DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_ERROR_LEVEL, "[PTE_HOOK] ERROR: %s (0x%X)\n", info, err_code); } else { DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, "[PTE_HOOK] INFO: %s\n", info); } } void PteHookManager::PrintPageTableInfo(const PAGE_TABLE& table) { DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, "[PTE_HOOK] Page Table Info for VA: 0x%p\n", (void*)table.LineAddress); DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, " PML4E: 0x%llx (Address: 0x%p)\n", table.OriginalPml4e, table.Pml4Address); DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, " PDPTE: 0x%llx (Address: 0x%p), Is1GBPage: %d\n", table.OriginalPdpte, table.PdpteAddress, table.Is1GBPage); DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, " PDE: 0x%llx (Address: 0x%p), IsLargePage: %d\n", table.OriginalPde, table.PdeAddress, table.IsLargePage); if (!table.IsLargePage && !table.Is1GBPage) { DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, " PTE: 0x%llx (Address: 0x%p)\n", table.OriginalPte, table.PteAddress); } } void PteHookManager::PrintHookInfo(const HOOK_INFO& hookInfo) { DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, "[PTE_HOOK] Hook Info:\n"); DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, " Original Address: 0x%p\n", hookInfo.OriginalAddress); DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, " Hook Address: 0x%p\n", hookInfo.HookAddress); DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, " Hook Length: %d\n", hookInfo.HookLength); DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, " Is Hooked: %d\n", hookInfo.IsHooked); // 打印原始字节 DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, " Original Bytes: "); for (UINT32 i = 0; i < sizeof(hookInfo.OriginalBytes); i++) { DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, "%02X ", hookInfo.OriginalBytes[i]); } DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, "\n"); // 打印Hook字节 DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, " Hook Bytes: "); for (UINT32 i = 0; i < sizeof(hookInfo.HookBytes); i++) { DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, "%02X ", hookInfo.HookBytes[i]); } DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, "\n"); } void PteHookManager::PrintGBitInfo(const G_BIT_INFO& gbitInfo) { DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, "[PTE_HOOK] G-Bit Info:\n"); DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, " Align Address: 0x%p\n", gbitInfo.AlignAddress); DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, " IsLargePage: %d\n", gbitInfo.IsLargePage); if (gbitInfo.PdeAddress) { DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, " PDE: 0x%llx (Address: 0x%p)\n", gbitInfo.PdeAddress->value, gbitInfo.PdeAddress); } if (gbitInfo.PteAddress) { DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, " PTE: 0x%llx (Address: 0x%p)\n", gbitInfo.PteAddress->value, gbitInfo.PteAddress); } } void* PteHookManager::fn_pa_to_va(UINT64 pa) { PHYSICAL_ADDRESS physAddr; physAddr.QuadPart = pa; return MmGetVirtualForPhysical(physAddr); } UINT64 
PteHookManager::fn_va_to_pa(void* va) { PHYSICAL_ADDRESS physAddr = MmGetPhysicalAddress(va); return physAddr.QuadPart; } NTSTATUS PteHookManager::get_page_table(UINT64 cr3_val, PAGE_TABLE& table) { UINT64 va = table.LineAddress; UINT64 pml4e_index = (va >> 39) & 0x1FF; UINT64 pdpte_index = (va >> 30) & 0x1FF; UINT64 pde_index = (va >> 21) & 0x1FF; UINT64 pte_index = (va >> 12) & 0x1FF; // PML4 UINT64 pml4_pa = cr3_val & ~0xFFF; UINT64* pml4_va = (UINT64*)fn_pa_to_va(pml4_pa); if (!pml4_va) return STATUS_INVALID_ADDRESS; table.Pml4Address = &pml4_va[pml4e_index]; table.OriginalPml4e = *table.Pml4Address; if (!(table.OriginalPml4e & 1)) return STATUS_ACCESS_VIOLATION; // PDPTE UINT64 pdpte_pa = table.OriginalPml4e & ~0xFFF; UINT64* pdpte_va = (UINT64*)fn_pa_to_va(pdpte_pa); if (!pdpte_va) return STATUS_INVALID_ADDRESS; table.PdpteAddress = (decltype(table.PdpteAddress))&pdpte_va[pdpte_index]; table.OriginalPdpte = table.PdpteAddress->value; table.Is1GBPage = (table.PdpteAddress->flags.page_size) ? TRUE : FALSE; if (!(table.OriginalPdpte & 1)) return STATUS_ACCESS_VIOLATION; if (table.Is1GBPage) return STATUS_SUCCESS; // PDE UINT64 pde_pa = table.OriginalPdpte & ~0xFFF; UINT64* pde_va = (UINT64*)fn_pa_to_va(pde_pa); if (!pde_va) return STATUS_INVALID_ADDRESS; table.PdeAddress = (decltype(table.PdeAddress))&pde_va[pde_index]; table.OriginalPde = table.PdeAddress->value; table.IsLargePage = (table.PdeAddress->flags.large_page) ? TRUE : FALSE; if (!(table.OriginalPde & 1)) return STATUS_ACCESS_VIOLATION; if (table.IsLargePage) return STATUS_SUCCESS; // PTE UINT64 pte_pa = table.OriginalPde & ~0xFFF; UINT64* pte_va = (UINT64*)fn_pa_to_va(pte_pa); if (!pte_va) return STATUS_INVALID_ADDRESS; table.PteAddress = (decltype(table.PteAddress))&pte_va[pte_index]; table.OriginalPte = table.PteAddress->value; if (!(table.OriginalPte & 1)) return STATUS_ACCESS_VIOLATION; // 打印页表信息 PrintPageTableInfo(table); return STATUS_SUCCESS; } bool PteHookManager::fn_split_large_pages(void* in_pde_ptr, void* out_pde_ptr) { auto in_pde = (decltype(PAGE_TABLE::PdeAddress))in_pde_ptr; auto out_pde = (decltype(PAGE_TABLE::PdeAddress))out_pde_ptr; DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, "[PTE_HOOK] 正在拆分大页: 输入PDE=0x%llx, 输出PDE=0x%p\n", in_pde->value, out_pde); PHYSICAL_ADDRESS LowAddr = { 0 }, HighAddr = { 0 }; HighAddr.QuadPart = MAXULONG64; auto pt = (decltype(PAGE_TABLE::PteAddress))MmAllocateContiguousMemorySpecifyCache( PAGE_SIZE, LowAddr, HighAddr, LowAddr, MmNonCached); if (!pt) { logger("分配连续内存失败 (用于拆分大页)", true); return false; } UINT64 start_pfn = in_pde->flags.page_frame_number; for (int i = 0; i < 512; i++) { pt[i].value = 0; pt[i].flags.present = 1; pt[i].flags.write = in_pde->flags.write; pt[i].flags.user = in_pde->flags.user; pt[i].flags.write_through = in_pde->flags.write_through; pt[i].flags.cache_disable = in_pde->flags.cache_disable; pt[i].flags.accessed = in_pde->flags.accessed; pt[i].flags.dirty = in_pde->flags.dirty; pt[i].flags.global = 0; pt[i].flags.page_frame_number = start_pfn + i; } out_pde->value = in_pde->value; out_pde->flags.large_page = 0; out_pde->flags.page_frame_number = fn_va_to_pa(pt) >> 12; DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, "[PTE_HOOK] 大页拆分完成: 新PTE表物理地址=0x%llx\n", fn_va_to_pa(pt)); return true; } bool PteHookManager::fn_isolation_pagetable(UINT64 cr3_val, void* replace_align_addr, void* split_pde_ptr) { PHYSICAL_ADDRESS LowAddr = { 0 }, HighAddr = { 0 }; HighAddr.QuadPart = MAXULONG64; DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, "[PTE_HOOK] 开始隔离页表: 
CR3=0x%llx, 地址=0x%p\n", cr3_val, replace_align_addr); auto Va4kb = (UINT64*)MmAllocateContiguousMemorySpecifyCache(PAGE_SIZE, LowAddr, HighAddr, LowAddr, MmNonCached); auto VaPt = (UINT64*)MmAllocateContiguousMemorySpecifyCache(PAGE_SIZE, LowAddr, HighAddr, LowAddr, MmNonCached); auto VaPdt = (UINT64*)MmAllocateContiguousMemorySpecifyCache(PAGE_SIZE, LowAddr, HighAddr, LowAddr, MmNonCached); auto VaPdpt = (UINT64*)MmAllocateContiguousMemorySpecifyCache(PAGE_SIZE, LowAddr, HighAddr, LowAddr, MmNonCached); if (!VaPt || !Va4kb || !VaPdt || !VaPdpt) { if (VaPt) MmFreeContiguousMemory(VaPt); if (Va4kb) MmFreeContiguousMemory(Va4kb); if (VaPdt) MmFreeContiguousMemory(VaPdt); if (VaPdpt) MmFreeContiguousMemory(VaPdpt); logger("分配连续内存失败 (用于隔离页表)", true); return false; } PAGE_TABLE Table = { 0 }; Table.LineAddress = (UINT64)replace_align_addr; NTSTATUS status = get_page_table(cr3_val, Table); if (!NT_SUCCESS(status)) { MmFreeContiguousMemory(VaPt); MmFreeContiguousMemory(Va4kb); MmFreeContiguousMemory(VaPdt); MmFreeContiguousMemory(VaPdpt); logger("获取页表信息失败", true, status); return false; } UINT64 pte_index = (Table.LineAddress >> 12) & 0x1FF; UINT64 pde_index = (Table.LineAddress >> 21) & 0x1FF; UINT64 pdpte_index = (Table.LineAddress >> 30) & 0x1FF; UINT64 pml4e_index = (Table.LineAddress >> 39) & 0x1FF; memcpy(Va4kb, replace_align_addr, PAGE_SIZE); if (Table.IsLargePage && split_pde_ptr) { auto split_pde = (decltype(PAGE_TABLE::PdeAddress))split_pde_ptr; memcpy(VaPt, (void*)(split_pde->flags.page_frame_number << 12), PAGE_SIZE); } else { memcpy(VaPt, (void*)(Table.PdeAddress->flags.page_frame_number << 12), PAGE_SIZE); } memcpy(VaPdt, (void*)(Table.PdpteAddress->flags.page_frame_number << 12), PAGE_SIZE); memcpy(VaPdpt, (void*)(Table.Pml4Address[pml4e_index] & ~0xFFF), PAGE_SIZE); auto new_pte = (decltype(PAGE_TABLE::PteAddress))VaPt; new_pte[pte_index].flags.page_frame_number = fn_va_to_pa(Va4kb) >> 12; auto new_pde = (decltype(PAGE_TABLE::PdeAddress))VaPdt; new_pde[pde_index].value = Table.OriginalPde; new_pde[pde_index].flags.large_page = 0; new_pde[pde_index].flags.page_frame_number = fn_va_to_pa(VaPt) >> 12; auto new_pdpte = (decltype(PAGE_TABLE::PdpteAddress))VaPdpt; new_pdpte[pdpte_index].flags.page_frame_number = fn_va_to_pa(VaPdt) >> 12; auto new_pml4 = (UINT64*)fn_pa_to_va(cr3_val & ~0xFFF); new_pml4[pml4e_index] = (new_pml4[pml4e_index] & 0xFFF) | (fn_va_to_pa(VaPdpt) & ~0xFFF); __invlpg(replace_align_addr); DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, "[PTE_HOOK] 页表隔离完成: 新PFN=0x%llx\n", fn_va_to_pa(Va4kb) >> 12); return true; } bool PteHookManager::fn_isolation_pages(HANDLE process_id, void* ori_addr) { PEPROCESS Process; if (!NT_SUCCESS(PsLookupProcessByProcessId(process_id, &Process))) { logger("查找进程失败", true); return false; } DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, "[PTE_HOOK] 开始隔离页面: PID=%d, 地址=0x%p\n", (ULONG)(ULONG_PTR)process_id, ori_addr); KAPC_STATE ApcState; KeStackAttachProcess(Process, &ApcState); void* AlignAddr = PAGE_ALIGN(ori_addr); PAGE_TABLE Table = { 0 }; Table.LineAddress = (UINT64)AlignAddr; UINT64 target_cr3 = *(UINT64*)((UCHAR*)Process + 0x28); if (!NT_SUCCESS(get_page_table(target_cr3, Table))) { KeUnstackDetachProcess(&ApcState); ObDereferenceObject(Process); logger("获取目标进程页表失败", true); return false; } bool success = false; decltype(PAGE_TABLE::PdeAddress) split_pde = nullptr; if (Table.IsLargePage) { split_pde = (decltype(PAGE_TABLE::PdeAddress))ExAllocatePoolWithTag(NonPagedPool, sizeof(*split_pde), 'pdeS'); if (!split_pde || 
!fn_split_large_pages(Table.PdeAddress, split_pde)) { if (split_pde) ExFreePoolWithTag(split_pde, 'pdeS'); KeUnstackDetachProcess(&ApcState); ObDereferenceObject(Process); logger("拆分大页失败", true); return false; } if (Table.PdeAddress->flags.global) { Table.PdeAddress->flags.global = 0; fn_add_g_bit_info(AlignAddr, Table.PdeAddress, nullptr); DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, "[PTE_HOOK] 清除大页G位: PDE=0x%llx\n", Table.PdeAddress->value); } success = fn_isolation_pagetable(target_cr3, AlignAddr, split_pde); } else if (Table.PteAddress && Table.PteAddress->flags.global) { Table.PteAddress->flags.global = 0; fn_add_g_bit_info(AlignAddr, nullptr, Table.PteAddress); DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, "[PTE_HOOK] 清除PTE G位: PTE=0x%llx\n", Table.PteAddress->value); success = fn_isolation_pagetable(target_cr3, AlignAddr, split_pde); if (split_pde) ExFreePoolWithTag(split_pde, 'pdeS'); DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, "[PTE_HOOK] 页表状态: IsLargePage=%d, Is1GBPage=%d\n", Table.IsLargePage, Table.Is1GBPage); if (Table.PteAddress) { DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, "[PTE_HOOK] PTE 值: 0x%llx (G位=%d)\n", Table.PteAddress->value, Table.PteAddress->flags.global); } KeUnstackDetachProcess(&ApcState); ObDereferenceObject(Process); if (success) { DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, "[PTE_HOOK] 页面隔离成功\n"); } else { logger("页面隔离失败", true); } return success; } KeUnstackDetachProcess(&ApcState); ObDereferenceObject(Process); return true; } bool PteHookManager::WriteTrampolineInstruction(void* trampoline, const JMP_ABS& jmpCmd) { if (!MmIsAddressValid(trampoline)) { DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_ERROR_LEVEL, "[PTE_HOOK] 错误: 内存地址无效 (VA=%p)\n", trampoline); return false; } PHYSICAL_ADDRESS physAddr = MmGetPhysicalAddress(trampoline); if (physAddr.QuadPart == 0) { DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_ERROR_LEVEL, "[PTE_HOOK] 错误: 无法获取物理地址 (VA=%p)\n", trampoline); return false; } KIRQL oldIrql = KeRaiseIrqlToDpcLevel(); BOOLEAN wpEnabled = (__readcr0() & CR0_WP); if (wpEnabled) { __writecr0(__readcr0() & ~CR0_WP); _mm_mfence(); } PMDL pMdl = IoAllocateMdl(trampoline, sizeof(JMP_ABS), FALSE, FALSE, NULL); if (!pMdl) { if (wpEnabled) __writecr0(__readcr0() | CR0_WP); KeLowerIrql(oldIrql); return false; } NTSTATUS status = STATUS_SUCCESS; __try { MmBuildMdlForNonPagedPool(pMdl); MmProtectMdlSystemAddress(pMdl, PAGE_READWRITE); // 正确写入 FF25 00000000 和 8字节地址 memcpy(trampoline, jmpCmd.opcode, 6); // FF25 00000000 *(ULONG64*)((BYTE*)trampoline + 6) = jmpCmd.address; // 地址写入 RIP+0 的位置 _mm_sfence(); _mm_clflush(trampoline); _mm_clflush((BYTE*)trampoline + 8); __invlpg(trampoline); _mm_mfence(); } __except (EXCEPTION_EXECUTE_HANDLER) { status = GetExceptionCode(); DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_ERROR_LEVEL, "[PTE_HOOK] 异常: 写入跳板失败 (代码: 0x%X)\n", status); } IoFreeMdl(pMdl); if (wpEnabled) { __writecr0(__readcr0() | CR0_WP); _mm_mfence(); } KeLowerIrql(oldIrql); if (!NT_SUCCESS(status)) return false; // 验证写入结果 if (*(USHORT*)trampoline != 0x25FF || *(ULONG64*)((BYTE*)trampoline + 6) != jmpCmd.address) { DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_ERROR_LEVEL, "[PTE_HOOK] 验证失败: 跳板内容不匹配\n" " 预期: FF25 [%p]\n" " 实际: %02X%02X %02X%02X%02X%02X [%p]\n", jmpCmd.address, ((BYTE*)trampoline)[0], ((BYTE*)trampoline)[1], ((BYTE*)trampoline)[2], ((BYTE*)trampoline)[3], ((BYTE*)trampoline)[4], ((BYTE*)trampoline)[5], *(ULONG64*)((BYTE*)trampoline + 6)); return false; } return true; } bool PteHookManager::fn_pte_inline_hook_bp_pg(HANDLE process_id, _Inout_ 
void** ori_addr, void* hk_addr) {
    // [1] Isolate the page tables for the target page
    if (!fn_isolation_pages(process_id, *ori_addr)) {
        return false;
    }

    // [2] Acquire the target process context
    PEPROCESS targetProcess;
    if (!NT_SUCCESS(PsLookupProcessByProcessId(process_id, &targetProcess))) {
        return false;
    }
    KAPC_STATE apcState;
    KeStackAttachProcess(targetProcess, &apcState);

    // [3] Build the absolute jump instruction
    JMP_ABS jmpCmd = {};
    memcpy(jmpCmd.opcode, "\xFF\x25\x00\x00\x00\x00", 6); // FF 25 00000000 = jmp qword ptr [rip+0]
    jmpCmd.address = reinterpret_cast<ULONG64>(hk_addr);

    // [4] Write directly into the isolated page
    void* targetFunc = *ori_addr;
    bool success = false;

    // Disable write protection
    KIRQL oldIrql = DisableWriteProtection();
    __try {
        // Save the original bytes (needed for unhooking)
        RtlCopyMemory(m_HookInfo[m_HookCount].OriginalBytes, targetFunc, sizeof(jmpCmd));

        // Write the jump instruction into the isolated page
        memcpy(targetFunc, &jmpCmd, 6);
        *(ULONG64*)((BYTE*)targetFunc + 6) = jmpCmd.address;

        // Flush caches and the TLB entry
        _mm_sfence();
        _mm_clflush(targetFunc);
        __invlpg(targetFunc);
        _mm_mfence();

        success = true;
        DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL,
            "[PTE_HOOK] Wrote isolated page directly: VA=%p -> Hook=%p\n",
            targetFunc, hk_addr);
    }
    __except (EXCEPTION_EXECUTE_HANDLER) {
        DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_ERROR_LEVEL,
            "[PTE_HOOK] Exception while writing isolated page: 0x%X\n",
            GetExceptionCode());
    }

    // Restore write protection
    EnableWriteProtection(oldIrql);

    // [5] Record the hook information
    if (success) {
        m_HookInfo[m_HookCount].OriginalAddress = targetFunc;
        m_HookInfo[m_HookCount].HookAddress = hk_addr;
        m_HookInfo[m_HookCount].ProcessId = process_id;
        m_HookInfo[m_HookCount].IsHooked = TRUE;
        m_HookCount++;
    }

    // [6] Cleanup
    KeUnstackDetachProcess(&apcState);
    ObDereferenceObject(targetProcess);
    return success;
}

// Destructor: release resources
PteHookManager::~PteHookManager() {
    if (m_TrampLinePool) {
        MmFreeContiguousMemory(m_TrampLinePool);
        m_TrampLinePool = nullptr;
    }
}

bool PteHookManager::fn_remove_hook(HANDLE process_id, void* hook_addr) {
    DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL,
        "[PTE_HOOK] Attempting to remove hook: hook address=0x%p\n", hook_addr);

    for (UINT32 i = 0; i < m_HookCount; i++) {
        if (m_HookInfo[i].HookAddress == hook_addr && m_HookInfo[i].IsHooked) {
            DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL,
                "[PTE_HOOK] Found matching hook: original address=0x%p\n",
                m_HookInfo[i].OriginalAddress);

            KIRQL oldIrql = DisableWriteProtection();
            memcpy(m_HookInfo[i].OriginalAddress,
                   m_HookInfo[i].OriginalBytes,
                   sizeof(m_HookInfo[i].OriginalBytes));
            EnableWriteProtection(oldIrql);

            m_HookInfo[i].IsHooked = FALSE;
            DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL,
                "[PTE_HOOK] Hook removed successfully\n");
            return true;
        }
    }

    logger("No matching hook found", true);
    return false;
}

void PteHookManager::fn_add_g_bit_info(void* align_addr, void* pde_address, void* pte_address) {
    if (m_GbitCount >= MAX_G_BIT_RECORDS) {
        logger("Maximum number of G-bit records reached", true);
        return;
    }

    PG_BIT_INFO record = &m_GbitRecords[m_GbitCount++];
    record->AlignAddress = align_addr;
    record->PdeAddress = (decltype(G_BIT_INFO::PdeAddress))pde_address;
    record->PteAddress = (decltype(G_BIT_INFO::PteAddress))pte_address;
    record->IsLargePage = (pde_address &&
        ((decltype(PAGE_TABLE::PdeAddress))pde_address)->flags.large_page);

    // Print the recorded G-bit information
    PrintGBitInfo(*record);
}

bool PteHookManager::fn_resume_global_bits(void* align_addr) {
    KIRQL oldIrql = DisableWriteProtection();
    bool found = false;

    DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL,
        "[PTE_HOOK] Restoring G bits: aligned address=0x%p\n", align_addr);

    for (UINT32 i = 0; i < m_GbitCount; i++) {
        PG_BIT_INFO record = &m_GbitRecords[i];
        if (align_addr && record->AlignAddress != align_addr) continue;

        if (record->PteAddress) {
            record->PteAddress->flags.global = 1;
            __invlpg(record->AlignAddress);
            DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL,
                "  Restored PTE G bit: PTE=0x%llx, address=0x%p\n",
                record->PteAddress->value, record->AlignAddress);
        }

        if (record->PdeAddress) {
            record->PdeAddress->flags.global = 1;
            if (record->IsLargePage) {
                __invlpg(record->AlignAddress);
            }
            DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL,
                "  Restored PDE G bit: PDE=0x%llx, address=0x%p, large page=%d\n",
                record->PdeAddress->value, record->AlignAddress, record->IsLargePage);
        }

        found = true;
        if (align_addr) break;
    }

    EnableWriteProtection(oldIrql);

    if (found) {
        DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL,
            "[PTE_HOOK] G-bit restoration complete\n");
    } else {
        logger("No matching G-bit record found", true);
    }
    return found;
}

PteHookManager* PteHookManager::GetInstance() {
    if (!m_Instance) {
        m_Instance = static_cast<PteHookManager*>(
            ExAllocatePoolWithTag(NonPagedPool, sizeof(PteHookManager), 'tpHk'));
        if (m_Instance) {
            RtlZeroMemory(m_Instance, sizeof(PteHookManager));
            DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL,
                "[PTE_HOOK] PTE hook manager instance created: address=0x%p\n", m_Instance);
        } else {
            DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_ERROR_LEVEL,
                "[PTE_HOOK] Failed to create PTE hook manager instance\n");
        }
    }
    return m_Instance;
}

// Global PTE hook manager instance
PteHookManager* g_PteHookManager = nullptr;

// Helper: check whether the given image name matches the target process
BOOLEAN IsTargetProcess(CHAR* imageName) {
    CHAR currentName[16];

    // Copy into a local buffer and guarantee NUL termination
    RtlCopyMemory(currentName, imageName, 16);
    currentName[15] = '\0';

    // Trim trailing spaces
    for (int i = 15; i >= 0; i--) {
        if (currentName[i] == ' ') currentName[i] = '\0';
        else if (currentName[i] != '\0') break;
    }

    return (strcmp(currentName, target_process_name) == 0);
}

// Hook function
NTSTATUS MyObReferenceObjectByHandleWithTag(
    HANDLE Handle,
    ACCESS_MASK DesiredAccess,
    POBJECT_TYPE ObjectType,
    KPROCESSOR_MODE AccessMode,
    ULONG Tag,
    PVOID* Object,
    POBJECT_HANDLE_INFORMATION HandleInformation
) {
    PEPROCESS currentProcess = PsGetCurrentProcess();
    CHAR* imageName = PsGetProcessImageFileName(currentProcess);

    DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL,
        "[!] [HookFunction] Entered hook function! Current process: %s\n", imageName);
    __debugbreak(); // Forced break to confirm execution reaches this point

    if (IsTargetProcess(imageName)) {
        DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL,
            "[!] [HookFunction] Denying access to target process PID=%d\n",
            HandleToULong(PsGetCurrentProcessId()));
        return STATUS_ACCESS_DENIED;
    }

    return g_OriginalObReferenceObjectByHandleWithTag(
        Handle, DesiredAccess, ObjectType, AccessMode, Tag, Object, HandleInformation
    );
}

NTSTATUS InstallHook() {
    UNICODE_STRING funcName;
    RtlInitUnicodeString(&funcName, L"ObReferenceObjectByHandleWithTag");
    g_OriginalObReferenceObjectByHandleWithTag =
        (fn_ObReferenceObjectByHandleWithTag)MmGetSystemRoutineAddress(&funcName);

    if (!g_OriginalObReferenceObjectByHandleWithTag) {
        DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_ERROR_LEVEL,
            "[-] [InstallHook] ObReferenceObjectByHandleWithTag not found\n");
        return STATUS_NOT_FOUND;
    }

    __debugbreak();
    DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL,
        "[+] [InstallHook] Target function address found: %p\n",
        g_OriginalObReferenceObjectByHandleWithTag);

    void* targetFunc = (void*)g_OriginalObReferenceObjectByHandleWithTag;
    void* hookFunc = (void*)MyObReferenceObjectByHandleWithTag;

    if (!g_PteHookManager->fn_pte_inline_hook_bp_pg(targetProcessId, &targetFunc, hookFunc)) {
        DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_ERROR_LEVEL,
            "[-] [InstallHook] PTE hook installation failed\n");
        return STATUS_UNSUCCESSFUL;
    }

    g_OriginalObReferenceObjectByHandleWithTag = (fn_ObReferenceObjectByHandleWithTag)targetFunc;
    DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL,
        "[+] [InstallHook] Hook installed successfully. Trampoline address: %p\n", targetFunc);
    __debugbreak(); // Forced break to verify execution reaches this point

    return STATUS_SUCCESS;
}

// Remove the hook
VOID RemoveHook() {
    if (g_OriginalObReferenceObjectByHandleWithTag && g_PteHookManager) {
        g_PteHookManager->fn_remove_hook(PsGetCurrentProcessId(),
            (void*)MyObReferenceObjectByHandleWithTag);
    }
}

// Worker thread routine
VOID InstallHookWorker(PVOID Context) {
    targetProcessId = (HANDLE)Context;
    DbgPrint("[+] Worker thread started for hook installation on PID: %d\n",
        HandleToULong(targetProcessId));
    InstallHook();
    PsTerminateSystemThread(STATUS_SUCCESS);
}

// Process-creation callback
VOID ProcessNotifyCallback(
    _In_ HANDLE ParentId,
    _In_ HANDLE ProcessId,
    _In_ BOOLEAN Create
) {
    UNREFERENCED_PARAMETER(ParentId);

    if (Create) {
        PEPROCESS process = NULL;
        if (NT_SUCCESS(PsLookupProcessByProcessId(ProcessId, &process))) {
            CHAR* imageName = PsGetProcessImageFileName(process);
            CHAR currentName[16];
            RtlCopyMemory(currentName, imageName, 16);
            currentName[15] = '\0';
            for (int i = 15; i >= 0; i--) {
                if (currentName[i] == ' ') currentName[i] = '\0';
                else if (currentName[i] != '\0') break;
            }

            if (strcmp(currentName, target_process_name) == 0) {
                DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL,
                    "[+] [ProcessNotifyCallback] Target process %s created (PID: %d)\n",
                    currentName, HandleToULong(ProcessId));

                HANDLE threadHandle;
                NTSTATUS status = PsCreateSystemThread(
                    &threadHandle,
                    THREAD_ALL_ACCESS,
                    NULL,
                    NULL,
                    NULL,
                    InstallHookWorker,
                    (PVOID)ProcessId // Key point: pass the target process ID
                );

                if (NT_SUCCESS(status)) {
                    ZwClose(threadHandle);
                    DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL,
                        "[+] [ProcessNotifyCallback] Worker thread created\n");
                } else {
                    DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_ERROR_LEVEL,
                        "[-] [ProcessNotifyCallback] Failed to create thread: 0x%X\n", status);
                }
            }
            ObDereferenceObject(process);
        }
    }
}

// Driver unload routine
VOID DriverUnload(PDRIVER_OBJECT DriverObject) {
    UNREFERENCED_PARAMETER(DriverObject);
    DbgPrint("[+] Driver unloading...\n");

    // Remove the process-creation callback
    PsSetCreateProcessNotifyRoutineEx((PCREATE_PROCESS_NOTIFY_ROUTINE_EX)ProcessNotifyCallback, TRUE);

    // Remove the hook
    RemoveHook();

    // Clean up PTE hook resources
    if (g_PteHookManager) {
        DbgPrint("[PTE_HOOK] Cleaning up PTE...\n");

        // Restore all modified G bits
        g_PteHookManager->fn_resume_global_bits(nullptr);

        // Remove all active hooks
        HOOK_INFO* hookInfo = g_PteHookManager->GetHookInfo();
        UINT32 hookCount = g_PteHookManager->GetHookCount();
        for (UINT32 i = 0; i < hookCount; i++) {
            if (hookInfo[i].IsHooked) {
                g_PteHookManager->fn_remove_hook(PsGetCurrentProcessId(), hookInfo[i].HookAddress);
            }
        }

        // Free the trampoline pool (note: the free routine must match the allocator;
        // the destructor above releases the same pool with MmFreeContiguousMemory)
        char* trampLinePool = g_PteHookManager->GetTrampLinePool();
        if (trampLinePool) {
            ExFreePoolWithTag(trampLinePool, 'JmpP');
        }

        // Free the manager instance
        ExFreePoolWithTag(g_PteHookManager, 'tpHk');
        g_PteHookManager = nullptr;
    }

    DbgPrint("[+] Driver unloaded successfully\n");
}

extern "C" NTSTATUS DriverEntry(PDRIVER_OBJECT DriverObject, PUNICODE_STRING RegistryPath) {
    UNREFERENCED_PARAMETER(RegistryPath);
    DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, "[+] [DriverEntry] Driver load starting\n");

    DriverObject->DriverUnload = DriverUnload;

    g_PteHookManager = PteHookManager::GetInstance();
    if (!g_PteHookManager) {
        DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_ERROR_LEVEL,
            "[-] [DriverEntry] Failed to initialize PteHookManager\n");
        return STATUS_INSUFFICIENT_RESOURCES;
    }

    NTSTATUS status = PsSetCreateProcessNotifyRoutineEx((PCREATE_PROCESS_NOTIFY_ROUTINE_EX)ProcessNotifyCallback, FALSE);
    if (!NT_SUCCESS(status)) {
        DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_ERROR_LEVEL,
            "[-] [DriverEntry] Failed to register process notify routine (0x%X)\n", status);
        return status;
    }

    DbgPrintEx(DPFLTR_IHVDRIVER_ID, DPFLTR_INFO_LEVEL, "[+] [DriverEntry] Driver loaded successfully\n");
    return STATUS_SUCCESS;
}
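The patch written at steps [3]/[4] above is the standard 14-byte x64 absolute-jump thunk: the 6 opcode bytes FF 25 00 00 00 00 encode `jmp qword ptr [rip+0]`, so the CPU fetches the jump target from the 8 bytes placed immediately after the instruction. The user-mode sketch below only reproduces that byte layout for inspection; `JmpAbsThunk`, `BuildAbsoluteJump`, and the dummy target are hypothetical names, not part of the driver code above.

```cpp
#include <cstdint>
#include <cstring>
#include <cstdio>

// x64 absolute-jump thunk: FF 25 00 00 00 00 ("jmp qword ptr [rip+0]") followed
// immediately by the 8-byte absolute target address -> 14 bytes total.
#pragma pack(push, 1)
struct JmpAbsThunk {            // hypothetical name; mirrors the JMP_ABS layout used above
    uint8_t  opcode[6];
    uint64_t address;
};
#pragma pack(pop)

static_assert(sizeof(JmpAbsThunk) == 14, "thunk must be exactly 14 bytes");

// Assemble the thunk into an arbitrary buffer, the same way the hook writes
// 6 opcode bytes followed by the 64-bit target into the isolated page.
void BuildAbsoluteJump(void* buffer, const void* target) {
    JmpAbsThunk thunk{};
    std::memcpy(thunk.opcode, "\xFF\x25\x00\x00\x00\x00", 6);
    thunk.address = reinterpret_cast<uint64_t>(target);
    std::memcpy(buffer, &thunk, sizeof(thunk));
}

int main() {
    static int dummyTarget = 0;                 // stand-in for the hook function address
    unsigned char patch[sizeof(JmpAbsThunk)] = {};
    BuildAbsoluteJump(patch, &dummyTarget);

    // Dump the 14 patch bytes: opcode first, then the little-endian target address.
    for (unsigned char b : patch) {
        std::printf("%02X ", b);
    }
    std::printf("\n");
    return 0;
}
```

Dumping the buffer should show FF 25 00 00 00 00 followed by the target address in little-endian order, which is exactly the byte sequence the driver copies into the isolated page.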
