
Fixing ./start-all.sh errors in Hadoop 2.2.0


Reposted from <http://www.lvcy.net/?post=162>

Running start-all.sh produced the following output:

```
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hadoop/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.]
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
-c: Unknown cipher type 'cd'
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
you: ssh: Could not resolve hostname you: Name or service not known
Server: ssh: Could not resolve hostname Server: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
The: ssh: Could not resolve hostname The: Name or service not known
Java: ssh: Could not resolve hostname Java: Name or service not known
will: ssh: Could not resolve hostname will: Name or service not known
guard.: ssh: Could not resolve hostname guard.: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
disabled: ssh: Could not resolve hostname disabled: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
highly: ssh: Could not resolve hostname highly: Name or service not known
guard: ssh: Could not resolve hostname guard: Name or service not known
recommended: ssh: Could not resolve hostname recommended: Name or service not known
whichssh: Could not resolve hostname which: Name or service not known
it: ssh: Could not resolve hostname it: Name or service not known
It's: ssh: Could not resolve hostname It's: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
or: ssh: Could not resolve hostname or: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
now.: ssh: Could not resolve hostname now.: Name or service not known
try: ssh: Could not resolve hostname try: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
<libfile>',: ssh: Could not resolve hostname <libfile>',: Name or service not known
might: ssh: Could not resolve hostname might: Name or service not known
noexecstack'.: ssh: Could not resolve hostname noexecstack'.: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
'execstack: ssh: Could not resolve hostname 'execstack: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
link: ssh: Could not resolve hostname link: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
that: ssh: Could not resolve hostname that: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
loaded: ssh: Could not resolve hostname loaded: Name or service not known
'-z: ssh: Could not resolve hostname '-z: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
to: ssh: connect to host to port 22: Connection refused
localhost: Error: JAVA_HOME is not set and could not be found.
Starting secondary namenodes [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hadoop/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
0.0.0.0]
Java: ssh: Could not resolve hostname Java: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
Server: ssh: Could not resolve hostname Server: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
-c: Unknown cipher type 'cd'
You: ssh: Could not resolve hostname You: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
that: ssh: Could not resolve hostname that: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
loaded: ssh: Could not resolve hostname loaded: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
try: ssh: Could not resolve hostname try: Name or service not known
recommended: ssh: Could not resolve hostname recommended: Name or service not known
It's: ssh: Could not resolve hostname It's: Name or service not known
might: ssh: Could not resolve hostname might: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
now.: ssh: Could not resolve hostname now.: Name or service not known
disabled: ssh: Could not resolve hostname disabled: Name or service not known
highly: ssh: Could not resolve hostname highly: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
'execstack: ssh: Could not resolve hostname 'execstack: Name or service not known
whichssh: Could not resolve hostname which: Name or service not known
you: ssh: Could not resolve hostname you: Name or service not known
will: ssh: Could not resolve hostname will: Name or service not known
guard: ssh: Could not resolve hostname guard: Name or service not known
noexecstack'.: ssh: Could not resolve hostname noexecstack'.: Name or service not known
<libfile>',: ssh: Could not resolve hostname <libfile>',: Name or service not known
link: ssh: Could not resolve hostname link: Name or service not known
or: ssh: Could not resolve hostname or: Name or service not known
The: ssh: Could not resolve hostname The: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
it: ssh: Could not resolve hostname it: Name or service not known
guard.: ssh: Could not resolve hostname guard.: Name or service not known
'-z: ssh: Could not resolve hostname '-z: Name or service not known
to: ssh: connect to host to port 22: Connection refused
0.0.0.0: Error: JAVA_HOME is not set and could not be found.
starting yarn daemons
starting resourcemanager, logging to /home/hadoop/hadoop/logs/yarn-hadoop-resourcemanager-master.out
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hadoop/hadoop/lib/native/libhadoop.so.1.0.0 whichmight have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
localhost: Error: JAVA_HOME is not set and could not be found.
```

 

The JVM's stack-guard warning is printed on the same stream from which the start scripts read the namenode host list, so every word of the warning is treated as a hostname to ssh into. Pointing Hadoop at its native libraries suppresses the warning, and the bogus ssh attempts disappear with it.

Fix 1:

Open your profile (`hadoop@master:~$ sudo gedit ~/.bash_profile`), add the following lines, and save:

 

```
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_PREFIX}/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_PREFIX/lib"
```
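Note that exports in ~/.bash_profile only take effect in new login shells. A minimal sketch of reloading the profile and confirming the variables expand as expected, using a scratch file as a stand-in for ~/.bash_profile and assuming HADOOP_PREFIX=/home/hadoop/hadoop as in the log above:

```shell
# Write the exports to a scratch profile (stand-in for ~/.bash_profile),
# then reload it and check the resulting values.
PROFILE=/tmp/bash_profile.demo
cat > "$PROFILE" <<'EOF'
export HADOOP_PREFIX=/home/hadoop/hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_PREFIX}/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_PREFIX/lib"
EOF
. "$PROFILE"
echo "$HADOOP_COMMON_LIB_NATIVE_DIR"   # /home/hadoop/hadoop/lib/native
echo "$HADOOP_OPTS"                    # -Djava.library.path=/home/hadoop/hadoop/lib
```

On the real cluster, `. ~/.bash_profile` (or logging in again) achieves the same reload.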

Fix 2:

Open $HADOOP_HOME/etc/hadoop/hadoop-env.sh, add the following lines, and save:

 

```
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_PREFIX}/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_PREFIX/lib"
```

Fix 3:

Open $HADOOP_HOME/etc/hadoop/yarn-env.sh and add the following lines anywhere in the file:

 

```
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_PREFIX}/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_PREFIX/lib"
```

Finally, run $HADOOP_HOME/sbin/start-all.sh again.
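The env-file edits above can also be scripted so they are safe to re-run. A minimal sketch, writing to a scratch file here so it can run anywhere; on a real cluster, point HADOOP_ENV at $HADOOP_HOME/etc/hadoop/hadoop-env.sh:

```shell
# Append the native-library settings unless they are already present,
# so running this twice does not duplicate the lines.
HADOOP_ENV="${HADOOP_ENV:-/tmp/hadoop-env.sh.demo}"
touch "$HADOOP_ENV"
if ! grep -q 'HADOOP_COMMON_LIB_NATIVE_DIR' "$HADOOP_ENV"; then
cat >> "$HADOOP_ENV" <<'EOF'
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_PREFIX}/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_PREFIX/lib"
EOF
fi
```

The grep guard makes the script idempotent: a second run leaves the file unchanged.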
