This article shows how to install Hue on a Hadoop cluster. It assumes you already have a working Hadoop cluster with Hive installed and running; if not, follow the other articles on this site to install Hadoop and Hive first.
Install Hue dependencies
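On Ubuntu, the build typically needs the development packages listed in Hue's documentation; package names vary by distribution and release, so treat the list below as a starting point rather than an exact recipe:
sudo apt-get install git ant gcc g++ make maven python-dev python-setuptools libffi-dev libkrb5-dev libmysqlclient-dev libsasl2-dev libsasl2-modules-gssapi-mit libsqlite3-dev libssl-dev libxml2-dev libxslt-dev libldap2-dev libgmp3-dev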
Download the Hue tarball from the official site:
http://gethue.com
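For example, with wget (the URL and version number below are illustrative; copy the current tarball link from gethue.com):
wget https://cdn.gethue.com/downloads/hue-4.10.0.tgz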
Extract Hue
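Assuming the tarball downloaded above (adjust the version to match your file):
tar -xzf hue-4.10.0.tgz
cd hue-4.10.0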
Install Hue
make install
This will install Hue under /usr/local/hue (the Makefile's PREFIX defaults to /usr/local).
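The prefix can also be set explicitly; the following is equivalent to the plain make install above and puts Hue under /usr/local/hue:
PREFIX=/usr/local make install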
Configure Hue. Edit the hue.ini file (with the install above it lives under /usr/local/hue/desktop/conf/) and change the following parameters to match your Hadoop installation.
app_blacklist=impala,security
fs_defaultfs=hdfs://localhost:54310
hadoop_conf_dir=$HADOOP_CONF_DIR
resourcemanager_host=localhost
resourcemanager_api_url=http://localhost:8088
hive_server_host=localhost
hive_conf_dir=$HIVE_HOME/conf
proxy_api_url=http://localhost:8088
history_server_api_url=http://localhost:19888
hbase_clusters=(Cluster|localhost:9090)
oozie_url=http://localhost:11000/oozie
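For reference, these keys are spread across several sections of hue.ini; the layout below sketches where they normally sit in the shipped configuration. Note that hue.ini does not expand environment variables, so $HADOOP_CONF_DIR and $HIVE_HOME/conf have to be written out as literal paths (the paths shown here are illustrative).
[desktop]
  app_blacklist=impala,security
[hadoop]
  [[hdfs_clusters]]
    [[[default]]]
      fs_defaultfs=hdfs://localhost:54310
      hadoop_conf_dir=/usr/local/hadoop/etc/hadoop
  [[yarn_clusters]]
    [[[default]]]
      resourcemanager_host=localhost
      resourcemanager_api_url=http://localhost:8088
      proxy_api_url=http://localhost:8088
      history_server_api_url=http://localhost:19888
[beeswax]
  hive_server_host=localhost
  hive_conf_dir=/usr/local/hive/conf
[hbase]
  hbase_clusters=(Cluster|localhost:9090)
[liboozie]
  oozie_url=http://localhost:11000/oozie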
Add “hue” user and “hdfs” group
addgroup hdfs
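A minimal way to create the hue user and add it to the hdfs group on Ubuntu (adjust if you manage users differently):
adduser hue
adduser hue hdfs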
Before accessing Hue, start the Hive server. Note that the & at the end of the command runs the Hive server in the background.
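For example, assuming $HIVE_HOME points to your Hive installation:
$HIVE_HOME/bin/hiveserver2 &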
Run Hue
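Assuming Hue was installed under /usr/local/hue as above, start it with the supervisor script (running it as the hue user created earlier):
sudo -u hue /usr/local/hue/build/env/bin/supervisor &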
Now you can access the Hue UI at http://localhost:8888
Log in as "root", or whichever user you use to run Hadoop.
Go to Query->Editor->Hive.
Try a few queries.
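For example, a couple of simple statements to confirm that Hue can reach HiveServer2:
SHOW DATABASES;
SHOW TABLES;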
I was able to make Hive work with Hue, but somehow I could not get Oozie to work with it. I tried the steps below but got stuck on an error.
Try Query->Editor->Sqoop1
Import a sample MySQL table. The Sqoop 1 editor expects the command without the leading sqoop binary name; the example below assumes an employees database in a local MySQL instance.
import \
--connect jdbc:mysql://localhost/employees \
--username root \
--password password \
--table employees --m 1 \
--hive-import \
--hive-overwrite \
--hive-database employees \
--target-dir /user/hive/warehouse/employees/
When I tried this, it threw the error below.
To resolve the above error, edit core-site.xml and change the proxy user used for impersonation from "ubuntu" to "root", the user running Hadoop.
Before change,
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://localhost:54310</value>
</property>
<property>
  <name>hadoop.proxyuser.ubuntu.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.ubuntu.groups</name>
  <value>*</value>
</property>
</configuration>
After change,
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://localhost:54310</value>
</property>
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
</configuration>
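The new proxy-user settings only take effect once Hadoop re-reads its configuration, so restart HDFS and YARN, or refresh the running daemons, for example:
hdfs dfsadmin -refreshSuperUserGroupsConfiguration
yarn rmadmin -refreshSuperUserGroupsConfiguration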
I tried the import again, and this time got the error below.