
Executing Hadoop commands from the browser

I have followed the well-known Michael Noll tutorial (http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/) to install Hadoop on a single node.

Now I want to execute the following command from the browser through PHP:

/usr/local/hadoop/bin/hadoop jar /usr/local/hadoop/contrib/streaming/hadoop-*streaming*.jar -mapper "/usr/bin/python /var/www/DataMining/AnalysisByYear/AnalysisByYear_mapper.py 2011" -reducer "/usr/bin/python /var/www/DataMining/AnalysisByYear/AnalysisByYear_reducer.py" -input /user/hduser/dataset/final_eval.txt -output /user/hduser/dataset-outputyear


I am using PHP's exec() function to execute it, but it doesn't show any result.
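Written out, the call looks roughly like this minimal sketch (the paths are the ones from the command above); note that exec() alone only returns the command's stdout, so redirecting stderr with 2>&1 and checking the exit status is what makes failures visible in the browser:

<?php
// A sketch, not the exact script: run the Hadoop streaming job and
// capture stdout, stderr and the exit status so failures show up.
$cmd = '/usr/local/hadoop/bin/hadoop jar /usr/local/hadoop/contrib/streaming/hadoop-*streaming*.jar'
     . ' -mapper "/usr/bin/python /var/www/DataMining/AnalysisByYear/AnalysisByYear_mapper.py 2011"'
     . ' -reducer "/usr/bin/python /var/www/DataMining/AnalysisByYear/AnalysisByYear_reducer.py"'
     . ' -input /user/hduser/dataset/final_eval.txt'
     . ' -output /user/hduser/dataset-outputyear'
     . ' 2>&1';                      // merge stderr into stdout

$output = array();
$status = 0;
exec($cmd, $output, $status);

echo "exit status: $status\n";
echo implode("\n", $output);
?>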

In the tutorial, hduser is given ownership of the Hadoop installation. However, the web server runs as the user www-data, which therefore cannot execute the job; I assume that is the cause of the error. Is it possible to execute the script in some way?

The namenode logs are given below:

2013-11-18 14:54:45,712 ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:www-data cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=www-data, access=WRITE, inode="staging":hduser:supergroup:rwxr-xr-x
2013-11-18 14:54:45,712 INFO org.apache.hadoop.ipc.Server: IPC Server handler 1 on 54310, call mkdirs(/app/hadoop/tmp/mapred/staging/www-data/.staging, rwx------) from 127.0.0.1:40876: error: org.apache.hadoop.security.AccessControlException: Permission denied: user=www-data, access=WRITE, inode="staging":hduser:supergroup:rwxr-xr-x
org.apache.hadoop.security.AccessControlException: Permission denied: user=www-data, access=WRITE, inode="staging":hduser:supergroup:rwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:199)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:180)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:128)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5468)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:5442)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:2209)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2178)
at org.apache.hadoop.hdfs.server.namenode.NameNode.mkdirs(NameNode.java:857)
at sun.reflect.GeneratedMethodAccessor13.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:578)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1393)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1389)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1387)

Answer

The solution I used was to start my Apache server as hduser.
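On Ubuntu, which the tutorial targets, one way to do this (assuming the stock Apache 2 packaging) is to change the run user and group in /etc/apache2/envvars:

# /etc/apache2/envvars -- the user/group the Apache worker processes run as
export APACHE_RUN_USER=hduser
export APACHE_RUN_GROUP=hduser

Then restart Apache:

sudo service apache2 restart

Bear in mind that this gives every script the server executes the filesystem and HDFS permissions of hduser, so it is really only suitable for a single-user development box.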