Part-8 : Install Hue

This article shows how to install Hue on a Hadoop cluster. It assumes you already have a working Hadoop cluster with Hive installed and running. If not, follow the earlier articles on this site to install Hadoop and Hive first.

Install Hue dependencies

apt-get install ant gcc g++ libkrb5-dev libffi-dev libmysqlclient-dev libssl-dev libsasl2-dev libsasl2-modules-gssapi-mit libsqlite3-dev libtidy-0.99-0 libxml2-dev libxslt-dev make libldap2-dev maven python-dev python-setuptools libgmp3-dev

Download Hue tarball from official site
http://gethue.com

wget -O hue-4.1.0.tgz "https://www.dropbox.com/s/auwpqygqgdvu1wj/hue-4.1.0.tgz?dl=0"

Extract Hue

tar -zxvf hue-4.1.0.tgz

Install Hue

cd hue-4.1.0
make install

This will install Hue in the /usr/local/hue folder.

Configure Hue. Edit the hue.ini file and change the parameters below to match your Hadoop installation.

vi /usr/local/hue/desktop/conf/hue.ini

[desktop]
app_blacklist=impala,security

[hadoop]
[[hdfs_clusters]]
[[[default]]]
fs_defaultfs=hdfs://localhost:54310
hadoop_conf_dir=$HADOOP_CONF_DIR

[[yarn_clusters]]
[[[default]]]
resourcemanager_host=localhost
resourcemanager_api_url=http://localhost:8088
proxy_api_url=http://localhost:8088
history_server_api_url=http://localhost:19888

[beeswax]
hive_server_host=localhost
hive_conf_dir=$HIVE_HOME/conf

[hbase]
hbase_clusters=(Cluster|localhost:9090)

[liboozie]
oozie_url=http://localhost:11000/oozie

Note that these settings live in different sections of hue.ini ([desktop], [hadoop], [beeswax], [hbase], [liboozie]); search for each key in the file and edit it in place rather than adding them all under one section.

Add “hue” user and “hdfs” group

adduser hue
addgroup hdfs
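On a typical setup the new hue user also needs to be in the hdfs group and to have a home directory in HDFS so the File Browser works. A minimal sketch, assuming the group name "hdfs" matches the group your HDFS daemons run under and that $HADOOP_HOME is set:

```shell
# Add the hue user to the hdfs group (group name is an assumption;
# use whatever group your HDFS daemons actually run under).
usermod -aG hdfs hue

# Create an HDFS home directory for hue and hand it over.
$HADOOP_HOME/bin/hdfs dfs -mkdir -p /user/hue
$HADOOP_HOME/bin/hdfs dfs -chown hue:hdfs /user/hue
```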

Before accessing Hue, start Hive Server. Note that the & at the end of the command runs HiveServer2 in the background.

$HIVE_HOME/bin/hiveserver2 &
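Before opening Hue you can confirm that HiveServer2 is actually accepting connections. A quick check, assuming the default Thrift port 10000:

```shell
# HiveServer2 listens on port 10000 by default.
netstat -tlnp | grep 10000

# Or open a Beeline session against it and run a trivial query.
$HIVE_HOME/bin/beeline -u jdbc:hive2://localhost:10000 -e "show databases;"
```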

Run Hue

/usr/local/hue/build/env/bin/supervisor &

Now you can access Hue UI at http://localhost:8888

Log in with user “root” or whichever user you use to run Hadoop.

Go to Query->Editor->Hive.
[Screenshot: Hive]

Try a few queries.
[Screenshot: Hive Editor]

I was able to make Hive work with Hue, but somehow I could not make Oozie work with it. I tried the steps below but got stuck on an error.

Go to Query->Editor->Sqoop1 and try importing a sample MySQL table.

sqoop import \
--connect jdbc:mysql://localhost/employees \
--username root \
--password password \
--table employees -m 1 \
--hive-import \
--hive-overwrite \
--hive-database employees \
--target-dir /user/hive/warehouse/employees/
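If the import succeeds, you can verify the rows landed in Hive from the command line. A sketch, using the employees.employees table created by the sqoop command above:

```shell
# Count the imported rows (database and table names come from
# the sqoop command above).
$HIVE_HOME/bin/hive -e "SELECT COUNT(*) FROM employees.employees;"
```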

[Screenshot: Sqoop]

When I tried this, it threw the error below.

Failed to create deployment directory: SecurityException: Failed to obtain user group information: org.apache.hadoop.security.authorize.AuthorizationException: User: root is not allowed to impersonate root (error 403)

To resolve this error, edit core-site.xml and change the hadoop.proxyuser.* entries from “ubuntu” to “root”, the user Hue is impersonating with.

vi $HADOOP_CONF_DIR/core-site.xml

Before change,

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:54310</value>
  </property>
  <property>
   <name>hadoop.proxyuser.ubuntu.hosts</name>
   <value>*</value>
  </property>
  <property>
   <name>hadoop.proxyuser.ubuntu.groups</name>
   <value>*</value>
  </property>
</configuration>

After change,

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:54310</value>
  </property>
  <property>
   <name>hadoop.proxyuser.root.hosts</name>
   <value>*</value>
  </property>
  <property>
   <name>hadoop.proxyuser.root.groups</name>
   <value>*</value>
  </property>
</configuration>
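The proxy-user settings are read by the NameNode and ResourceManager, so after editing core-site.xml either restart Hadoop or refresh the configuration in place. A sketch using the standard admin commands, assuming $HADOOP_HOME is set:

```shell
# Push the new hadoop.proxyuser.* settings without a full restart.
$HADOOP_HOME/bin/hdfs dfsadmin -refreshSuperUserGroupsConfiguration
$HADOOP_HOME/bin/yarn rmadmin -refreshSuperUserGroupsConfiguration
```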

Trying the import again, I got a different error.

Error submitting workflow Batch job for query-sqoop1: The Oozie server is not running: HTTPConnectionPool(host='localhost', port=11000): Max retries exceeded with url: /oozie/v1/jobs?timezone=America%2FLos_Angeles&user.name=root&doAs=root (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f302fb555d0>: Failed to establish a new connection: [Errno 111] Connection refused',))
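This error simply means nothing is listening on port 11000, i.e. the Oozie server is not running. If you have Oozie installed, starting it and checking its status should get past this point. A sketch, assuming a standard Oozie tarball install with $OOZIE_HOME set:

```shell
# Start the Oozie server (it bundles its own servlet container).
$OOZIE_HOME/bin/oozied.sh start

# Verify it is up; a healthy server reports "System mode : NORMAL".
$OOZIE_HOME/bin/oozie admin -oozie http://localhost:11000/oozie -status
```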

2 thoughts to “Part-8 : Install Hue”

  1. Kath says:

    How to import data into Hue from mysql?

    1. Hadoop Tutorials says:

      Check my tutorial on sqoop. Sqoop will allow you to import data into hive tables. After that you can access them from Hue.
      http://hadooptutorials.info/2017/09/18/part-3-install-sqoop-version-1-4-x/
