Hadoop Installation on Ubuntu | Hadoop Installation Tutorial Guide | Hadoop Setup on Ubuntu

Check our Hadoop Installation blog here:
This Hadoop Installation tutorial video will guide you step by step through installing Hadoop on Ubuntu. It is ideal for beginners who want to learn how to install Hadoop on Ubuntu.
To attend a live session on Big data & Hadoop, click here:

Ubuntu is an ideal platform for Big Data processing on Hadoop as it is based on industry standards, has the ability to scale without any…

20 Comments

  1. I am getting an error while starting the datanode or namenode.
    ~/hadoop-2.6.5/sbin$ ./hadoop-daemon.sh start datanode
    starting datanode, logging to /home/piyush/hadoop-2.6.5/logs/hadoop-piyush-datanode-piyush.out
    [Fatal Error] core-site.xml:1:1: Content is not allowed in prolog.

    It seems the namenode is also not getting formatted.
    Please help…
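
    The "[Fatal Error] core-site.xml:1:1: Content is not allowed in prolog" message usually means core-site.xml has stray characters (text, whitespace, or a byte-order mark) before the <?xml ...?> declaration, so everything that parses the file fails, including the namenode format. A minimal sketch of a clean core-site.xml for a single-node setup (the hdfs://localhost:9000 value is an assumption; adjust it for your environment). Edit ~/hadoop-2.6.5/etc/hadoop/core-site.xml so that the XML declaration is the very first character of the file:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Nothing may appear before the declaration above -->
    <configuration>
      <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
      </property>
    </configuration>

    After saving the file, re-run hdfs namenode -format and start the daemons again.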

  2. Dear Sir, I have done everything up to formatting the namenode. But after that I typed hadoop-daemon.sh start datanode, and it says bash: /hadoop-daemon.sh: No such file or directory. Kindly show me a solution. Thank you.
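
    The "/hadoop-daemon.sh: No such file or directory" message suggests the shell is looking for the script at the filesystem root, which typically happens when a variable such as $HADOOP_HOME is empty or the script is run outside the sbin directory. A sketch of two ways to invoke it, assuming Hadoop is unpacked as ~/hadoop-2.7.3 (the version and path are assumptions; use the directory you extracted):

    # Option 1: run the script from its sbin directory
    cd ~/hadoop-2.7.3/sbin
    ./hadoop-daemon.sh start datanode

    # Option 2: set HADOOP_HOME and add bin/sbin to PATH (e.g. in ~/.bashrc)
    export HADOOP_HOME=~/hadoop-2.7.3
    export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
    hadoop-daemon.sh start datanode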

  3. Hi, I keep getting this error:

    ~/hadoop-2.7.3/bin$ ./hadoop namenode -format
    DEPRECATED: Use of this script to execute hdfs command is deprecated.
    Instead use the hdfs command for it.

    Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.namenode

    I suspected an issue with the Java installation; however, java -version runs fine. Any pointers would be helpful. Thanks
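
    The lowercase "namenode" in the class name suggests the hadoop script could not resolve the subcommand and fell back to treating the argument as a literal class name, which is commonly a symptom of an incomplete environment (HADOOP_HOME and PATH not set, or the command run from the wrong place). A sketch of the usual checks, reusing the ~/hadoop-2.7.3 path from your prompt (the JDK path is an assumption):

    # Set the Hadoop and Java environment (e.g. in ~/.bashrc, then source it)
    export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
    export HADOOP_HOME=~/hadoop-2.7.3
    export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

    # Use the hdfs command, as the deprecation notice suggests
    hdfs namenode -format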

  4. ./hadoop namenode -format
    Whenever I try to execute the above command from the bin directory, it shows the following message:
    bash: ./hadoop: Permission denied
    How can I resolve this issue?
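
    "Permission denied" on ./hadoop usually means the script has lost its execute bit or the files are owned by another user (for example when the archive was extracted with sudo). A sketch of the usual fix, assuming the installation lives in ~/hadoop-2.7.3 (the path is an assumption; use your own):

    # Take ownership of the Hadoop tree and restore execute permission on its scripts
    sudo chown -R $USER:$USER ~/hadoop-2.7.3
    chmod +x ~/hadoop-2.7.3/bin/* ~/hadoop-2.7.3/sbin/*

    # Then run the format again (hdfs is preferred over the deprecated hadoop script)
    ~/hadoop-2.7.3/bin/hdfs namenode -format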

  5. I am stuck at this… please help.
    ~/hadoop-2.7.1/bin$ ./hadoop namenode -format
    DEPRECATED: Use of this script to execute hdfs command is deprecated.
    Instead use the hdfs command for it.

    Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/hdfs/server/namenode/NameNode : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:648)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:272)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:68)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:207)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:201)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:200)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:325)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:296)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:270)
    at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:406)
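
    UnsupportedClassVersionError with "major.minor version 51.0" means the Hadoop 2.7.1 classes were compiled for Java 7, but the JVM running them is older (Java 6 only supports class versions up to 50.0). A sketch of the fix on Ubuntu, assuming OpenJDK 8 from the distribution repositories (the package name and JVM path are assumptions; adjust for your release):

    # Install a newer JDK
    sudo apt-get update
    sudo apt-get install -y openjdk-8-jdk

    # Point Hadoop at it (in etc/hadoop/hadoop-env.sh and/or ~/.bashrc)
    export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
    export PATH=$JAVA_HOME/bin:$PATH

    # Verify which JVM is picked up, then format again
    java -version
    ~/hadoop-2.7.1/bin/hdfs namenode -format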

  6. When I started the daemon for the namenode and checked via jps, it didn't show anything for the namenode, so it isn't started. I checked the config in hdfs-site.xml but it looked correct. Please tell me how to resolve this issue.

    Your reply was:
    Please try to restart the daemons using the below details:
    STEP 1 stop hadoop
    hduser@edureka$ /usr/local/hadoop-2.2.0/sbin/stop-dfs.sh

    STEP 2 remove tmp folder
    hduser@edureka$ sudo rm -rf /app/hadoop/tmp/

    STEP 3 create /app/hadoop/tmp/
    hduser@edureka$ sudo mkdir -p /app/hadoop/tmp
    hduser@edureka$ sudo chown hduser:hadoop /app/hadoop/tmp
    hduser@edureka$ sudo chmod 750 /app/hadoop/tmp

    STEP 4 format namenode
    hduser@edureka$ hdfs namenode -format

    STEP 5 start dfs
    hduser@edureka$ /usr/local/hadoop-2.2.0/sbin/start-dfs.sh

    STEP 6 check jps
    hduser@edureka$ jps
    11342 Jps
    10804 DataNode
    11110 SecondaryNameNode
    10558 NameNode

    I did the above things:
    It shows:
    root@nithin-VirtualBox:~# /home/nithin/Desktop/nithin/hadoop-2.7.3/sbin/start-dfs.sh
    Incorrect configuration: namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured.
    Starting namenodes on []
    localhost: ssh: connect to host localhost port 22: Connection refused
    localhost: ssh: connect to host localhost port 22: Connection refused
    Starting secondary namenodes [0.0.0.0]
    0.0.0.0: ssh: connect to host 0.0.0.0 port 22: Connection refused
    root@nithin-VirtualBox:~# jps
    3443 Jps

    Please help me with this!!!!
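
    Two separate problems appear in this output. "ssh: connect to host localhost port 22: Connection refused" means no SSH server is listening locally, and start-dfs.sh needs SSH even on a single node; "dfs.namenode.rpc-address is not configured" usually means fs.defaultFS is missing from core-site.xml or HADOOP_CONF_DIR points at the wrong directory. A sketch of the usual fixes, reusing the /home/nithin/Desktop/nithin/hadoop-2.7.3 path from the output (the port and key settings are assumptions):

    # Install and start the SSH server, and set up passwordless login to localhost
    sudo apt-get install -y openssh-server
    sudo service ssh start
    ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa     # skip if a key already exists
    cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

    # Make sure core-site.xml defines the filesystem, e.g.:
    #   <property>
    #     <name>fs.defaultFS</name>
    #     <value>hdfs://localhost:9000</value>
    #   </property>

    # Then format and start again, and check the daemons
    /home/nithin/Desktop/nithin/hadoop-2.7.3/bin/hdfs namenode -format
    /home/nithin/Desktop/nithin/hadoop-2.7.3/sbin/start-dfs.sh
    jps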

  7. When I started the daemon for the namenode and checked via jps, it didn't show anything for the namenode, so it isn't started. I checked the config in hdfs-site.xml but it looked correct. Please tell me how to resolve this issue.

  8. I am having a problem. I followed all the steps as shown in the tutorial, yet my datanode won't start. The message displayed is: chown: changing ownership of <hadoop path>: Operation not permitted. Can anybody help me?
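
    "Operation not permitted" from chown means the current user is not allowed to change the ownership of that directory, which typically happens when the Hadoop tree or its data directory is owned by root. A sketch of the usual fix, assuming the installation sits in ~/hadoop-2.7.3 and the data directory is /app/hadoop/tmp (both paths are assumptions; substitute the path from your error message):

    # Take ownership of the Hadoop installation and its data directory
    sudo chown -R $USER:$USER ~/hadoop-2.7.3
    sudo chown -R $USER:$USER /app/hadoop/tmp

    # Then start the datanode again
    ~/hadoop-2.7.3/sbin/hadoop-daemon.sh start datanode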

  9. The software I used for the installation of Hadoop is as follows; please help me fix the issue:

    1. VMware version 11.0.0 build-2305329
    2. Linux ISO file: ubuntu-16.04-desktop-amd64
    3. After that I just followed the instructions in the video, but my Hadoop installation still does not finish…
