macOS + Hadoop 3.1.1: start-dfs.sh fails to start with "permission denied"

This post resolves a Hadoop configuration issue where the start-dfs.sh service could not be started, by setting up passwordless SSH login to localhost. The steps: generate a key pair with ssh-keygen, append the public key to the authorized_keys file, and adjust its permissions.


A Hadoop configuration problem: after finishing the setup following https://blog.csdn.net/zx1245773445/article/details/84875774,

start-dfs.sh would not start, which also meant the web UI at http://localhost:9870/ was unreachable. The fix is to make ssh localhost passwordless, using the commands below:

ssh-keygen -t rsa 

Press Enter at each prompt; if it asks for a passphrase, just press Enter again, i.e. do not set one.
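
If you prefer to skip the prompts entirely, the same key can be generated non-interactively (a minimal sketch, assuming the default key path; -N "" sets an empty passphrase and -f names the output file):

# generate an RSA key pair with no passphrase at the default location
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa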

cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

chmod og-wx ~/.ssh/authorized_keys

chmod 750 $HOME 
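
These chmod commands matter because sshd, with its default StrictModes setting, refuses key-based login when the home directory, ~/.ssh, or authorized_keys is group- or world-writable, and then falls back to asking for a password. A quick permission check (a sketch, assuming the default paths):

# none of these should be writable by group or others
ls -ld ~ ~/.ssh ~/.ssh/authorized_keys
# typical safe settings
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys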

After that, running start-dfs.sh works without problems.
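
To confirm the fix, a quick verification sequence (a sketch, assuming Hadoop's sbin directory is on PATH and HDFS has already been formatted):

# should log in without prompting for a password
ssh localhost
exit

# start HDFS and list the running Java daemons
start-dfs.sh
jps    # expect NameNode, DataNode and SecondaryNameNode

# the NameNode web UI should now answer at http://localhost:9870/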
