

How to install Hadoop 2.2.0 (Pseudo-Distributed Mode)

This guide was written for Hadoop 2.2.0.

1. Add a hadoop user
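For example, on a Red Hat-style system (on Debian/Ubuntu, adduser is the better choice, as one of the comments below points out):

$ sudo useradd hadoop     # create the hadoop account
$ sudo passwd hadoop      # set its password
$ su - hadoop             # do the rest of the installation as the hadoop user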

 

2. Configure SSH
Set it up so that SSH to localhost does not ask for a password.
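Roughly, as the hadoop user (assuming OpenSSH):

$ ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa          # key pair without a passphrase
$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
$ chmod 600 ~/.ssh/authorized_keys                  # sshd ignores keys with loose permissions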

 

- Connection test
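If the key setup worked, this logs in without asking for a password:

$ ssh localhost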

 

3. Install the JDK
http://www.oracle.com/technetwork/java/javase/downloads/index.html
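Download a Linux x64 JDK archive from the page above and unpack it; the version and target directory below are only examples:

$ tar xzf jdk-7u45-linux-x64.tar.gz     # example version
$ sudo mv jdk1.7.0_45 /usr/local/jdk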

 

- Register environment variables
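In ~/.bash_profile, for example (point JAVA_HOME at wherever the JDK was unpacked):

export JAVA_HOME=/usr/local/jdk
export PATH=$PATH:$JAVA_HOME/bin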

 

4. Install Hadoop
http://www.apache.org/dyn/closer.cgi/hadoop/common/
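Download hadoop-2.2.0.tar.gz from one of the mirrors above and unpack it. /home/hadoop/2.2.0 is the install directory assumed in the rest of this post and in the comments:

$ tar xzf hadoop-2.2.0.tar.gz
$ mv hadoop-2.2.0 /home/hadoop/2.2.0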

 

- Register Hadoop environment variables
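Again in ~/.bash_profile, something along these lines (use straight double quotes around HADOOP_OPTS; see the note about special characters in the comments below):

export HADOOP_PREFIX=/home/hadoop/2.2.0
export PATH=$PATH:$HADOOP_PREFIX/bin:$HADOOP_PREFIX/sbin
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_PREFIX}/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_PREFIX/lib"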

 

- hadoop-env.sh
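Set JAVA_HOME explicitly in $HADOOP_PREFIX/etc/hadoop/hadoop-env.sh (same example path as above):

export JAVA_HOME=/usr/local/jdk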

 

- yarn-env.sh
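The same JAVA_HOME setting goes into $HADOOP_PREFIX/etc/hadoop/yarn-env.sh:

export JAVA_HOME=/usr/local/jdk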

 

- Apply the environment variables
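Reload the profile so the new variables take effect in the current shell:

$ source ~/.bash_profile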

 

5. Configure Hadoop
- core-site.xml
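A minimal pseudo-distributed setting points the default filesystem at a local HDFS on port 9000 (the port the troubleshooting comments below refer to):

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>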

 

- hdfs-site.xml
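With a single node there is only one copy of each block, so set the replication factor to 1:

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>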

 

- mapred-site.xml
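Hadoop 2.2.0 ships only a template for this file, so copy it first and tell MapReduce to run on YARN:

$ cp $HADOOP_PREFIX/etc/hadoop/mapred-site.xml.template $HADOOP_PREFIX/etc/hadoop/mapred-site.xml

<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>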

 

- yarn-site.xml
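Enable the MapReduce shuffle service. The value must be mapreduce_shuffle with an underscore, as one of the comments below points out:

<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services.mapreduce_shuffle.class</name>
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
  </property>
</configuration>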

 

6. Run Hadoop
- Format the NameNode
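Initialize the HDFS metadata once before the first start (the older hadoop namenode -format still works but prints a deprecation warning in 2.x):

$ hdfs namenode -format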

 

- Start Hadoop (NameNode, DataNode, YARN)
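start-all.sh is deprecated in 2.x, so start HDFS and YARN separately:

$ start-dfs.sh
$ start-yarn.sh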

 

- Check the processes
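jps should list the NameNode, DataNode, SecondaryNameNode, ResourceManager and NodeManager daemons (plus Jps itself):

$ jps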

- Check the web UI
http://localhost:8088/ or http://localhost:50070/

[screenshot: hadoop_cluster]
[screenshot: hadoop_admin]

Comments


  1. That's because of the warning messages. If you build the source for 64-bit, the issue the messages describe is fixed. If you have a problem with the result of the operation, check the memory settings.

  2. Hi, I ran the pi job and got some errors:

    [test@p2 hadoop-2.4.0]$ hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.4.0.jar pi 5 100
    Number of Maps = 5
    Samples per Map = 100
    14/05/22 18:08:32 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
    Wrote input for Map #0
    Wrote input for Map #1
    Wrote input for Map #2
    Wrote input for Map #3
    Wrote input for Map #4
    Starting Job
    14/05/22 18:08:33 INFO client.RMProxy: Connecting to ResourceManager at /127.0.0.1:8032
    14/05/22 18:08:34 INFO input.FileInputFormat: Total input paths to process : 5
    14/05/22 18:08:34 INFO mapreduce.JobSubmitter: number of splits:5
    14/05/22 18:08:35 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1400761600720_0002
    14/05/22 18:08:41 INFO impl.YarnClientImpl: Submitted application application_1400761600720_0002
    14/05/22 18:08:41 INFO mapreduce.Job: The url to track the job: http://p2.in.krs.na:8088/proxy/application_1400761600720_0002/
    14/05/22 18:08:41 INFO mapreduce.Job: Running job: job_1400761600720_0002

  3. Please check whether any process is already using port 9000.
    Also make sure that the NameNode format was successful.

  4. Hello,
    I have followed the steps and installed Hadoop,
    but my namenode is not starting at all…
    [root@localhost sbin]# jps
    14489 NodeManager
    14104 DataNode
    14259 SecondaryNameNode
    15451 Jps
    14397 ResourceManager
    …………………………………………………………………………………………
    I have tried the namenode format and restarting all the processes, but it did not work.

    So please let me know how to make my namenode start.

  5. Very helpful.

    For Debian users, it is better to add the hadoop user with the adduser command, which will create the proper skeleton and home directory.

  6. Thank you.
    Very helpful blog.

  7. Please change the special characters (use straight quotes, not curly quotes).

    - bad
    export HADOOP_OPTS=”-Djava.library.path=$HADOOP_PREFIX/lib”
    [hadoop@ruo91 ~]$ echo $HADOOP_OPTS
    ”-Djava.library.path=$HADOOP_PREFIX/lib”

    - good
    export HADOOP_OPTS="-Djava.library.path=$HADOOP_PREFIX/lib"
    [hadoop@ruo91 ~]$ echo $HADOOP_OPTS
    -Djava.library.path=/home/hadoop/2.2.0/lib

  8. When I execute bin/hadoop namenode -format,
    this error appears:
    bin/hadoop namenode -format
    DEPRECATED: Use of this script to execute hdfs command is deprecated.
    Instead use the hdfs command for it.

    error: Cannot find or load the main class:java.library.path=.usr.hadoop.hadoop-2.2.0.lib
    Please tell me what I have to do.

  9. Add the variables to .bash_profile:
    $ nano ~/.bash_profile
    export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_PREFIX}/lib/native
    export HADOOP_OPTS="-Djava.library.path=$HADOOP_PREFIX/lib"

    Apply it.
    $ source ~/.bash_profile

    And then start it!
    $ start-all.sh

    Thanks. :D

  10. hadoop@ubuntu:~$ /home/hadoop/2.2.0/bin/hadoop dfs -copyFromLocal /tmp/abc.txt /home/hadoop/abc.txt
    DEPRECATED: Use of this script to execute hdfs command is deprecated.
    Instead use the hdfs command for it.

    Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hadoop/2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
    It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
    13/10/24 18:00:51 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable

  11. Q1. When I run start-all.sh I get the following error:
    hadoop@ubuntu:~$ start-all.sh
    This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
    13/10/24 17:54:47 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
    Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hadoop/2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
    It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
    localhost]
    sed: -e expression #1, char 6: unknown option to `s'
    -c: Unknown cipher type 'cd'
    warning:: ssh: Could not resolve hostname warning:: Name or service not known
    VM: ssh: Could not resolve hostname VM: Name or service not known
    VM: ssh: Could not resolve hostname VM: Name or service not known
    It's: ssh: Could not resolve hostname It's: Name or service not known
    64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
    try: ssh: Could not resolve hostname try: Name or service not known
    Server: ssh: Could not resolve hostname Server: Name or service not known
    recommended: ssh: Could not resolve hostname recommended: Name or service not known
    will: ssh: Could not resolve hostname will: Name or service not known
    The: ssh: Could not resolve hostname The: Name or service not known
    now.: ssh: Could not resolve hostname now.: Name or service not known
    library: ssh: Could not resolve hostname library: Name or service not known
    stack: ssh: Could not resolve hostname stack: Name or service not known
    Java: ssh: Could not resolve hostname Java: Name or service not known
    have: ssh: Could not resolve hostname have: Name or service not known
    HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
    loaded: ssh: Could not resolve hostname loaded: Name or service not known
    You: ssh: Could not resolve hostname You: Name or service not known
    stack: ssh: Could not resolve hostname stack: Name or service not known
    have: ssh: Could not resolve hostname have: Name or service not known
    the: ssh: Could not resolve hostname the: Name or service not known
    might: ssh: Could not resolve hostname might: Name or service not known
    which: ssh: Could not resolve hostname which: Name or service not known
    library: ssh: Could not resolve hostname library: Name or service not known
    with: ssh: Could not resolve hostname with: Name or service not known
    with: ssh: Could not resolve hostname with: Name or service not known
    '-z: ssh: Could not resolve hostname '-z: Name or service not known
    you: ssh: Could not resolve hostname you: Name or service not known
    fix: ssh: Could not resolve hostname fix: Name or service not known
    fix: ssh: Could not resolve hostname fix: Name or service not known
    guard.: ssh: Could not resolve hostname guard.: Name or service not known
    the: ssh: Could not resolve hostname the: Name or service not known
    disabled: ssh: Could not resolve hostname disabled: Name or service not known
    noexecstack'.: ssh: Could not resolve hostname noexecstack'.: Name or service not known
    guard: ssh: Could not resolve hostname guard: Name or service not known
    or: ssh: Could not resolve hostname or: Name or service not known
    'execstack: ssh: Could not resolve hostname 'execstack: Name or service not known
    link: ssh: Could not resolve hostname link: Name or service not known
    highly: ssh: Could not resolve hostname highly: Name or service not known
    <libfile>',: ssh: Could not resolve hostname <libfile>',: Name or service not known
    that: ssh: Could not resolve hostname that: Name or service not known
    …………………………………………………………………………………………………………………..

    2. jps is working.
    hadoop@ubuntu:~$ jps
    4021 SecondaryNameNode
    8514 Jps
    4163 ResourceManager
    3580 NameNode
    4304 NodeManager
    3738 DataNode

    3. http://localhost:8088/cluster
    is also working.

    4. Hadoop commands are not working.

    I am getting the following error while running any hadoop command such as mkdir, copyFromLocal, fs, etc.

    hadoop@ubuntu:~/2.2.0/lib$ hadoop fs -ls
    Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hadoop/2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.

    Please tell me what I have to do.
    Thanks for the nice post, it's really helpful.

  12. Very helpful post. One little correction in yarn-site.xml:
    <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce.shuffle</value>
    </property>
    value should be mapreduce_shuffle

    Thanks.

