Format HDFS Namenode Error: Could not find or load main class -Djava.library.path=/home/hadoop/hadoop-3.2.1/lib/native

0 votes

I am building a single-node HDFS on Ubuntu 18.04 and am getting the following error when I try to format the HDFS namenode using the command:

hdfs namenode -format

Error: Could not find or load main class ”-Djava.library.path=.home.hadoop.hadoop-3.2.1.lib.native”

I have the following configuration files:

.bashrc

export HADOOP_HOME=/home/hadoop/hadoop-3.2.1
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_OPTS=”-Djava.library.path=$HADOOP_HOME/lib/native”

hadoop-env.sh

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

core-site.xml

<configuration>
<property>
  <name>hadoop.tmp.dir</name>
  <value>/home/hadoop/tmpdata</value>
</property>
<property>
  <name>fs.default.name</name>
  <value>hdfs://127.0.0.1:9000</value>
</property>
</configuration>

hdfs-site.xml

<configuration>
<property>
  <name>dfs.name.dir</name>
  <value>/home/hadoop/dfsdata/namenode</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>/home/hadoop/dfsdata/datanode</value>
</property>
<property>
  <name>dfs.replication</name>
  <value>1</value>
</property>
</configuration>

mapred-site.xml

<configuration> 
<property> 
  <name>mapreduce.framework.name</name> 
  <value>yarn</value> 
</property> 
</configuration>

yarn-site.xml

<configuration>
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
  <value>org.apache.hadoop.mapred.ShuffleHandler</value>
</property>
<property>
  <name>yarn.resourcemanager.hostname</name>
  <value>127.0.0.1</value>
</property>
<property>
  <name>yarn.acl.enable</name>
  <value>0</value>
</property>
<property>
  <name>yarn.nodemanager.env-whitelist</name>   
  <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_MAPRED_HOME</value>
</property>
</configuration>

I have created the namenode and datanode folders referenced in hdfs-site.xml. I have been chasing this for several hours without a solution. Thanks for any suggested fixes.
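For reference, the directories were created with something like the following (paths taken from hdfs-site.xml above):

# Create the namenode and datanode directories referenced in hdfs-site.xml
mkdir -p /home/hadoop/dfsdata/namenode /home/hadoop/dfsdata/datanode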

Jun 12, 2020 in Big Data Hadoop by fwood, edited Jun 12, 2020 by MD
When I type in this command, it works fine:

$ java -version
openjdk version "1.8.0_265"
OpenJDK Runtime Environment (build 1.8.0_265-8u265-b01-0ubuntu2~18.04-b01)
OpenJDK 64-Bit Server VM (build 25.265-b01, mixed mode)

But when I type hadoop version, this error shows up again:

$ hadoop version
Error: Could not find or load main class ”-Djava.library.path=.home.hdoop.hadoop-3.2.1.lib.”

# Hadoop Related Options
export HADOOP_HOME=/home/hdoop/hadoop-3.2.1
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_OPTS=”-Djava.library.path=$HADOOP_HOME/lib”
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

Done... still the same error:

~$ hdfs namenode -format
Error: Could not find or load main class ”-Djava.library.path=.home.hdoop.hadoop-3.2.1.lib”

@Bezz,

Are you running the command as hadoop version?

Try it as:

hadoop -version

It should work.

Hi @Bezz,

Judging by your .bashrc file, you haven't set the path for Java. Check this environment variable; if it isn't present, add it first and then try again.

Hi @akhtar, @GItIka and @MD, thank you all for your help. It has finally worked. It started working when I changed the double quotation marks to single quotation marks in the .bashrc file:

export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

changed to this:

export HADOOP_OPTS='-Djava.library.path=$HADOOP_HOME/lib'

With that it worked, and I am done with the installation and opened it in the browser. (Note that the error message shows ” characters inside the class name: the failing line contained typographic double quotes rather than ASCII ones, so the shell passed them through as part of the value instead of treating them as quoting.)
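A quick way to spot such typographic quotes in a shell config file (a hypothetical check, assuming a UTF-8 locale so grep matches the multi-byte characters):

# List any lines in .bashrc containing curly quotes instead of ASCII quotes
grep -n '[“”‘’]' ~/.bashrc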

1 answer to this question.

+1 vote

Hi @fwood,

According to your configuration, you didn't set JAVA_HOME and PATH. Try to set these variables in the .bashrc file. You can use the command below to find your Java location:

$ which java
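Note that which java typically returns a symlink, so to get the real JDK directory for JAVA_HOME you can resolve it, for example (a sketch; the exact path varies by machine):

# Follow the symlink chain to the actual java binary
readlink -f "$(which java)"
# e.g. /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java
# JAVA_HOME is then the JDK root above it: /usr/lib/jvm/java-8-openjdk-amd64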

Also, change the value of HADOOP_OPTS as given below.

export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/"

In your hadoop-env.sh file, export the value of HADOOP_CONF_DIR. Also, I suggest you don't rename variables: you have changed the variable name HADOOP_CONF_DIR to HADOOP_INSTALL. Hadoop searches for these variables internally, and if it finds different names, it may give an error.
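For example, hadoop-env.sh could contain the following (a sketch assuming the install path from the question):

# Tell Hadoop where the JDK and its own configuration directory live
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export HADOOP_CONF_DIR=/home/hadoop/hadoop-3.2.1/etc/hadoop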

answered Jun 12, 2020 by MD
Thank you, MD: that did the trick. I have Hadoop up and running with YARN. For those looking at this thread later, here are my updated files:

.bashrc

export HADOOP_HOME=/home/hadoop/hadoop-3.2.1
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/"
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

hadoop-env.sh

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop

Now that I have Hadoop up and running, I plan to install ZooKeeper and Accumulo. The overall project goal is to build a basic Docker container app and get it to talk to data in the Hadoop store through the Accumulo security structure. Thanks again for the help.
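For anyone following along, once the quotes are fixed the stack can be brought up and checked roughly like this (standard Hadoop 3.x commands; daemon names in the jps output may vary):

# Format the namenode once, then start HDFS and YARN
hdfs namenode -format
start-dfs.sh
start-yarn.sh
# jps should now list NameNode, DataNode, SecondaryNameNode, ResourceManager and NodeManager
jps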
Thank you so much! The command worked like a charm. I had been trying to install Hadoop on Windows, but later switched to Ubuntu. I encountered the same kind of error, and the above command took care of it. Saved me 3 hours of sleep =)
