In this tutorial I am going to explain how to configure Hadoop 2 on a Linux machine. If you have read my previous post about what Hadoop is, then you will already have a general introduction to Hadoop: what its components are, what the architecture looks like, and so on. You can check the previous post here:
https://administrationinhadoop.blogspot.com
How to Install JAVA 8 on RHEL
Step 1 – Download Latest Java Archive
sudo -i
sudo yum install wget
cd /opt
tar xzf jdk-8u202-linux-x64.tar.gz
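Note that the JDK archive itself is not downloaded by any command above; it has to be fetched manually from Oracle's Java SE archive downloads page (a free Oracle account is required for 8u202) and saved to /opt before the tar step. After extraction, the JDK directory should exist:
ls -d /opt/jdk1.8.0_202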
Step 1.1 – Install Java 8 with Alternatives
cd /opt/jdk1.8.0_202
alternatives --install /usr/bin/java java /opt/jdk1.8.0_202/bin/java 2
alternatives --config java
alternatives --install /usr/bin/jar jar /opt/jdk1.8.0_202/bin/jar 2
alternatives --install /usr/bin/javac javac /opt/jdk1.8.0_202/bin/javac 2
alternatives --set jar /opt/jdk1.8.0_202/bin/jar
alternatives --set javac /opt/jdk1.8.0_202/bin/javac
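To double-check which binary the alternatives system now points to, you can run, for example:
alternatives --display java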
Step 1.2 – Check Installed Java Version
java -version
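If the alternatives setup above took effect, the output should start with a line similar to:
java version "1.8.0_202"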
Step 1.3 – Setup Java Environment Variables
export JAVA_HOME=/opt/jdk1.8.0_202/
export JRE_HOME=/opt/jdk1.8.0_202/jre
export PATH=$PATH:/opt/jdk1.8.0_202/bin:/opt/jdk1.8.0_202/jre/bin
To make these settings persistent across logins, add the same exports to /etc/bashrc:
vi /etc/bashrc
export JAVA_HOME=/opt/jdk1.8.0_202/
export JRE_HOME=/opt/jdk1.8.0_202/jre
export PATH=$PATH:/opt/jdk1.8.0_202/bin:/opt/jdk1.8.0_202/jre/bin
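To load the new variables into the current shell and confirm they point at the right JDK, a quick check (paths assumed from the steps above) is:
source /etc/bashrc
echo $JAVA_HOME
$JAVA_HOME/bin/java -version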
Step 2: Creating a Hadoop User
adduser hadoop
passwd hadoop
su - hadoop
ssh-keygen -t rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 0600 ~/.ssh/authorized_keys
ssh localhost
exit
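If ssh localhost still asks for a password, the most common cause is permissions on the .ssh directory itself; a usual fix (assuming the default layout created above) is:
chmod 700 ~/.ssh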
Step 3. Downloading Hadoop
cd ~
tar xzf hadoop-2.6.5.tar.gz
mv hadoop-2.6.5 hadoop
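These commands assume hadoop-2.6.5.tar.gz is already present in the hadoop user's home directory. If it is not, it can normally be fetched from the Apache release archive (mirror URL assumed, verify before use):
wget https://archive.apache.org/dist/hadoop/common/hadoop-2.6.5/hadoop-2.6.5.tar.gz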
Step 4. Configure Hadoop in Pseudo-Distributed Mode
First, set the Hadoop environment variables by editing ~/.bashrc as the hadoop user:
vi ~/.bashrc
export HADOOP_HOME=/home/hadoop/hadoop
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
source ~/.bashrc
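To confirm the new environment is active, a quick check is:
echo $HADOOP_HOME        # should print /home/hadoop/hadoop
hadoop version           # should report Hadoop 2.6.5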
Now edit the $HADOOP_HOME/etc/hadoop/hadoop-env.sh file and set the JAVA_HOME environment variable. Change the Java path according to where it is installed on your system:
export JAVA_HOME=/opt/jdk1.8.0_202/
cd $HADOOP_HOME/etc/hadoop
Edit core-site.xml and add the following property inside the <configuration> element:
vi core-site.xml
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
Edit hdfs-site.xml and add the following properties inside the <configuration> element:
vi hdfs-site.xml
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
<property>
<name>dfs.name.dir</name>
<value>file:///home/hadoop/hadoopdata/hdfs/namenode</value>
</property>
<property>
<name>dfs.data.dir</name>
<value>file:///home/hadoop/hadoopdata/hdfs/datanode</value>
</property>
Edit mapred-site.xml and add the following property inside the <configuration> element.
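In Hadoop 2.6.x this file usually ships only as a template, so (assuming you are still in $HADOOP_HOME/etc/hadoop) you may need to create it first:
cp mapred-site.xml.template mapred-site.xml
vi mapred-site.xml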
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
Edit yarn-site.xml and add the following property inside the <configuration> element:
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
Step 4.3 – Format the Namenode
Before starting the cluster for the first time, format the HDFS namenode as the hadoop user:
hdfs namenode -format
Step 5. Start Hadoop Cluster
cd $HADOOP_HOME/sbin/
start-dfs.sh
start-yarn.sh
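Once both scripts finish, a quick way to verify that the daemons are running (jps comes with the JDK installed earlier) is:
jps
You should see NameNode, DataNode, SecondaryNameNode, ResourceManager and NodeManager in the list. The default Hadoop 2 web interfaces are the NameNode UI at http://localhost:50070/ and the ResourceManager UI at http://localhost:8088/.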