
Hadoop: getting "Container launch failed" error

I have freshly installed a multi-node Hadoop cluster with one namenode machine and two slave nodes. However, whenever I run a MapReduce job, I keep getting this error:

    Container launch failed for container_1453020503065_0030_01_000009 :
    java.lang.IllegalArgumentException: java.net.UnknownHostException: HOME


Here HOME and shubhranshu-OptiPlex-9020 are the hostnames of the slave machines. I have put their IP addresses and hostnames in the /etc/hosts file.
My /etc/hosts file looks like this:

    10.0.3.107 HadoopMaster
    10.0.3.108 HadoopSlave1
    10.0.3.109 HadoopSlave2
    127.0.0.1 localhost amrit
    #127.0.1.1 amrit
    10.0.3.107 amrit
    10.0.3.108 HOME
    10.0.3.109 shubhranshu-OptiPlex-9020
    # The following lines are desirable for IPv6 capable hosts
    ::1 ip6-localhost ip6-loopback
    fe00::0 ip6-localnet
    ff00::0 ip6-mcastprefix
    ff02::1 ip6-allnodes
    ff02::2 ip6-allrouters


Please tell me if I need to add anything else. Thank you!

Answer

The UnknownHostException means a node cannot resolve one of the slave hostnames, so keep the name-to-IP mappings consistent across the cluster and avoid duplicate entries for the same IP. Modify the /etc/hosts file on the master (10.0.3.107) as follows:

    127.0.0.1       localhost 
    10.0.3.107      HadoopMaster amrit
    10.0.3.108      HadoopSlave1
    10.0.3.109      HadoopSlave2

Also modify /etc/hosts on the 10.0.3.108 machine (HadoopSlave1) as follows:

    127.0.0.1       localhost 
    10.0.3.107      HadoopMaster 
    10.0.3.108      HadoopSlave1 HOME
    10.0.3.109      HadoopSlave2

and modify /etc/hosts on the 10.0.3.109 machine (HadoopSlave2) as follows:

    127.0.0.1       localhost 
    10.0.3.107      HadoopMaster 
    10.0.3.108      HadoopSlave1
    10.0.3.109      HadoopSlave2 shubhranshu-OptiPlex-9020
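
Not part of the original answer, but as a quick sanity check after editing the hosts files, a minimal Java sketch like the following can confirm that the cluster hostnames resolve from whichever node it is run on. The class name HostCheck is mine, and the hostnames in the list are the cluster names taken from the hosts files above; adjust them to whatever names your Hadoop configuration actually uses.

    import java.net.InetAddress;
    import java.net.UnknownHostException;

    // Resolution check: run on each node after editing /etc/hosts.
    public class HostCheck {
        public static void main(String[] args) {
            String[] hosts = {"HadoopMaster", "HadoopSlave1", "HadoopSlave2"};
            for (String host : hosts) {
                try {
                    InetAddress addr = InetAddress.getByName(host);
                    System.out.println(host + " -> " + addr.getHostAddress());
                } catch (UnknownHostException e) {
                    // Same exception that YARN reports in the container launch failure.
                    System.out.println(host + " -> cannot be resolved");
                }
            }
        }
    }

If a name still fails to resolve on some node, that node's /etc/hosts (or its hostname setting) is the place to look. Restarting the Hadoop/YARN daemons after the change may also be needed, since the NodeManagers register with their hostnames at startup.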