This article shows you how to install Hue on a Hadoop cluster. It assumes you already have a working Hadoop cluster with Hive installed and working. If not, follow the other articles on this site to install Hadoop and Hive first.
Install Hue dependencies.
apt-get install ant gcc g++ libkrb5-dev libffi-dev libmysqlclient-dev libssl-dev libsasl2-dev libsasl2-modules-gssapi-mit libsqlite3-dev libtidy-0.99-0 libxml2-dev libxslt-dev make libldap2-dev maven python-dev python-setuptools libgmp3-dev
Download the Hue tarball from the official site:
http://gethue.com
wget -O hue-4.1.0.tgz "https://www.dropbox.com/s/auwpqygqgdvu1wj/hue-4.1.0.tgz?dl=1"
Extract Hue.
tar -zxvf hue-4.1.0.tgz
Install Hue.
cd hue-4.1.0
make install
This installs Hue under /usr/local/hue (the default PREFIX).
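If you want to confirm where everything landed, listing the install directory should show the Hue build (assuming the default PREFIX of /usr/local):
ls /usr/local/hue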
Configure Hue. Edit the hue.ini file and change the parameters below to match your Hadoop installation. Note that these settings live in different sections of hue.ini, and that hue.ini does not expand shell variables, so replace $HADOOP_CONF_DIR and $HIVE_HOME with the actual paths on your machine.
vi /usr/local/hue/desktop/conf/hue.ini
[desktop]
app_blacklist=impala,security

[hadoop]
  [[hdfs_clusters]]
    [[[default]]]
      fs_defaultfs=hdfs://localhost:54310
      hadoop_conf_dir=$HADOOP_CONF_DIR

  [[yarn_clusters]]
    [[[default]]]
      resourcemanager_host=localhost
      resourcemanager_api_url=http://localhost:8088
      proxy_api_url=http://localhost:8088
      history_server_api_url=http://localhost:19888

[beeswax]
  hive_server_host=localhost
  hive_conf_dir=$HIVE_HOME/conf

[hbase]
  hbase_clusters=(Cluster|localhost:9090)

[liboozie]
  oozie_url=http://localhost:11000/oozie
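If you also want Hue's File Browser to talk to HDFS, Hue needs a WebHDFS (or HttpFS) endpoint. Assuming WebHDFS is enabled and the NameNode web UI is on the default Hadoop 2.x port 50070, the extra setting in the same [[[default]]] block of [[hdfs_clusters]] would look like this:
      webhdfs_url=http://localhost:50070/webhdfs/v1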
Add “hue” user and “hdfs” group.
adduser hue
addgroup hdfs
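Hue also expects the logged-in user to have a home directory in HDFS (it is used for saved queries, Oozie workspaces and the File Browser), so it can help to create one up front. A minimal sketch, assuming you run HDFS commands as the Hadoop superuser:
hdfs dfs -mkdir -p /user/hue
hdfs dfs -chown hue:hdfs /user/hue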
Before accessing Hue, start the Hive server. Note that the & at the end of the command runs HiveServer2 in the background.
$HIVE_HOME/bin/hiveserver2 &
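Before switching to the UI you can confirm HiveServer2 is actually listening (port 10000 by default) with Beeline, which ships with Hive (add -n <user> if your HiveServer2 requires a username):
$HIVE_HOME/bin/beeline -u jdbc:hive2://localhost:10000 -e "SHOW DATABASES;"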
Run Hue service.
/usr/local/hue/build/env/bin/supervisor &
Now you can access the Hue UI at http://<VM IP Address>:8888
Log in as "root" or whichever user you use to run Hadoop.
Go to Query->Editor->Hive.
Try a few queries.
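For example, a couple of simple statements are enough to verify the editor is talking to HiveServer2 (the table name here is just a placeholder):
SHOW DATABASES;
CREATE TABLE IF NOT EXISTS hue_smoke_test (id INT, name STRING);
SELECT * FROM hue_smoke_test LIMIT 10;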
I was able to make Hive work with Hue, but somehow I could not get Oozie to work with it. I tried the steps below and got stuck at an error.
Try Query->Editor->Sqoop1.
Import a sample MySQL table.
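The import below assumes a MySQL database called employees with a table of the same name already exists. A minimal illustrative sketch (the columns are placeholders, not the real MySQL employees sample schema):
mysql -u root -p
CREATE DATABASE IF NOT EXISTS employees;
USE employees;
CREATE TABLE IF NOT EXISTS employees (emp_no INT PRIMARY KEY, first_name VARCHAR(14), last_name VARCHAR(16));
INSERT INTO employees VALUES (1, 'John', 'Doe');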
sqoop import \
--connect jdbc:mysql://localhost/employees \
--username root \
--password password \
--table employees -m 1 \
--hive-import \
--hive-overwrite \
--hive-database employees \
--target-dir /user/hive/warehouse/employees/

When I tried the import, it threw the error below.
Failed to create deployment directory: SecurityException: Failed to obtain user group information: org.apache.hadoop.security.authorize.AuthorizationException: User: root is not allowed to impersonate root (error 403)
To resolve the above error, edit core-site.xml and change the proxy user from "ubuntu" to "root", the user being used for impersonation.
Before the change:
vi $HADOOP_CONF_DIR/core-site.xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:54310</value>
  </property>
  <property>
    <name>hadoop.proxyuser.ubuntu.hosts</name>
    <value>*</value>
  </property>
  <property>
    <name>hadoop.proxyuser.ubuntu.groups</name>
    <value>*</value>
  </property>
</configuration>
After the change:
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:54310</value>
  </property>
  <property>
    <name>hadoop.proxyuser.root.hosts</name>
    <value>*</value>
  </property>
  <property>
    <name>hadoop.proxyuser.root.groups</name>
    <value>*</value>
  </property>
</configuration>
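Proxy-user settings are only picked up on restart, so after editing core-site.xml restart HDFS and YARN. With a standard Hadoop 2.x layout the scripts would be:
$HADOOP_HOME/sbin/stop-yarn.sh && $HADOOP_HOME/sbin/stop-dfs.sh
$HADOOP_HOME/sbin/start-dfs.sh && $HADOOP_HOME/sbin/start-yarn.sh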
I tried the import again, and this time I got the error below.
Error submitting workflow Batch job for query-sqoop1: The Oozie server is not running: HTTPConnectionPool(host='localhost', port=11000): Max retries exceeded with url: /oozie/v1/jobs?timezone=America%2FLos_Angeles&user.name=root&doAs=root (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused',))
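This error just means nothing is listening on port 11000, i.e. the Oozie server itself is not running. If you have Oozie installed, it can be started and checked with its own control scripts (the $OOZIE_HOME path here is an assumption about your install):
$OOZIE_HOME/bin/oozied.sh start
$OOZIE_HOME/bin/oozie admin -oozie http://localhost:11000/oozie -status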
How do I import data into Hue from MySQL?
Check my tutorial on Sqoop. Sqoop lets you import data into Hive tables; after that you can access them from Hue.
http://hadooptutorials.info/2017/09/18/part-3-install-sqoop-version-1-4-x/
Hi, I have installed Hue 4.2 and I would like to use Hue as a visualization interface for Hive. HiveServer2 is running well on its port and I can work from the command line without problems.
My Hadoop is also functional (a single node running on localhost). I managed to configure the HDFS settings for Hue and I can easily view HDFS files in the Hue interface, but my big problem for weeks has been running a Hive query from Hue (even though I configured it according to what I found on the internet). I cannot get it to work and I am stuck on this.