1. Pull the Image

  ```bash
  # Check which stable versions are available
  sudo docker search hue
  sudo docker pull gethue/hue:latest
  sudo docker image ls | grep hue
  ```

2. Start the Container

  ```bash
  # Create and start the container on the "bigdata" network,
  # mapping host port 28888 to Hue's port 8888
  sudo docker run -d --name=hue --network bigdata -p 28888:8888 gethue/hue:latest
  sudo docker ps | grep hue

  # Lifecycle commands
  sudo docker start hue
  sudo docker restart hue
  sudo docker stop hue
  sudo docker rm hue

  # View logs (-f follows the log output)
  sudo docker logs hue
  sudo docker logs -f hue
  ```

3. Verify

Web UI: http://LTSR003:28888 (admin/123456, created on first login)
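Instead of refreshing the browser while the container warms up, a small poll of the web UI can confirm the service is ready. A minimal sketch (the login path `/hue/accounts/login` and the host name `LTSR003` come from the setup above; adjust for your environment):

```python
import time
import urllib.error
import urllib.request


def wait_for_hue(url, timeout=60, interval=2):
    """Poll `url` until it answers with an HTTP status below 500,
    or return False once `timeout` seconds have elapsed."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                if resp.status < 500:
                    return True
        except urllib.error.HTTPError as exc:
            if exc.code < 500:  # e.g. a redirect or auth challenge: server is up
                return True
        except (urllib.error.URLError, OSError):
            pass  # connection refused: not up yet
        time.sleep(interval)
    return False


# Example (host/port from the `docker run` mapping above):
#   wait_for_hue("http://LTSR003:28888/hue/accounts/login")
```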

4. Basic Configuration

  • Configure the secret key and time zone

    Enter the container and edit hue.ini:

    ```bash
    docker exec -it hue bash
    vi /usr/share/hue/desktop/conf/hue.ini
    ```

    Contents:

    ```ini
    [desktop]
    secret_key=jFE93j;2[290-eiw.KEiwN2s3['d;/.q[eIW^y#e=+Iei*@Mn<qW5o
    http_host=0.0.0.0
    http_port=8888
    time_zone=Asia/Shanghai
    ```
  • Configure the metadata database

    Enter the MySQL container:

    ```bash
    docker exec -it mysql bash
    mysql -uroot -proot
    ```

    Then run:

    ```sql
    create database huedb;
    grant all on huedb.* to root@'%' identified by 'root';
    grant all on huedb.* to root@localhost identified by 'root';
    grant all on huedb.* to root@172.16.0.6 identified by 'root';
    -- Hue expects latin1 for its metadata database
    ALTER DATABASE huedb CHARACTER SET latin1;
    flush privileges;
    quit;
    ```

5. Integration


  • Enter the container

    ```bash
    docker exec -it hue bash
    vi /usr/share/hue/desktop/conf/hue.ini
    ```

    1. Hue and HDFS Integration

  • Configure Hue

    ```ini
    [hadoop]
    [[hdfs_clusters]]
    [[[default]]]
    fs_defaultfs=hdfs://172.16.0.2:9000
    webhdfs_url=http://172.16.0.2:50070/webhdfs/v1
    hadoop_hdfs_home=/usr/local/hadoop-2.7.3
    hadoop_bin=/usr/local/hadoop-2.7.3/bin
    hadoop_conf_dir=/usr/local/hadoop-2.7.3/etc/hadoop
    ```
  • Start httpfs

    ```bash
    # Enter the container
    docker exec -it hadoop-master bash

    # Start httpfs
    cd /usr/local/hadoop-2.7.3/
    bin/hdfs dfs -chmod -R o+x /tmp
    sbin/httpfs.sh start

    # Check the port (inside the container)
    netstat -ant | grep 14000

    # Check the URLs (on the host)
    sudo docker inspect hadoop-master | grep IPAddress
    curl -i "http://172.16.0.2:14000/webhdfs/v1/?user.name=root&op=GETHOMEDIRECTORY"
    curl -i "http://172.16.0.2:50070/webhdfs/v1/?user.name=root&op=GETHOMEDIRECTORY"
    ```
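Note that the curl URLs must be double-quoted, otherwise the shell treats `&` as a background operator. Building the request URL programmatically sidesteps that class of mistake; a small sketch (host, ports, user, and operation taken from the checks above):

```python
from urllib.parse import urlencode


def webhdfs_url(host, port, path="/", user="root", op="GETHOMEDIRECTORY"):
    """Build a WebHDFS REST URL like the curl checks above."""
    query = urlencode({"user.name": user, "op": op})
    return f"http://{host}:{port}/webhdfs/v1{path}?{query}"


# httpfs endpoint (port 14000) and the NameNode's webhdfs endpoint (port 50070)
print(webhdfs_url("172.16.0.2", 14000))
print(webhdfs_url("172.16.0.2", 50070))
```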

    2. Hue and YARN Integration

    ```ini
    [hadoop]
    [[yarn_clusters]]
    [[[default]]]
    submit_to=True
    resourcemanager_host=172.16.0.2
    resourcemanager_port=8032
    resourcemanager_api_url=http://172.16.0.2:8088
    proxy_api_url=http://172.16.0.2:8088
    history_server_api_url=http://172.16.0.2:19888
    ```

    3. Hue and MySQL Integration

    1. Configure MySQL as Hue's metadata database

    ```ini
    [desktop]
    [[database]]
    engine=mysql
    host=172.16.0.6
    port=3306
    user=root
    password=root
    name=huedb
    ```

    2. Configure Hue to manage MySQL databases

    ```ini
    [librdbms]
    [[databases]]
    # Remove the leading "##" to uncomment this section
    [[[mysql]]]
    nice_name="MySQL DB"
    engine=mysql
    host=172.16.0.6
    port=3306
    user=root
    password=root
    ```

    4. Hue and Hive Integration

    ```ini
    [beeswax]
    hive_server_host=172.16.0.5
    hive_server_port=10000
    hive_conf_dir=/usr/local/apache-hive-2.3.2-bin/conf
    auth_username=root
    auth_password=root
    thrift_version=7

    [metastore]
    # Allow creating databases, tables, etc. through Hive
    enable_new_create_table=true
    ```

    5. Hue and Zookeeper Integration

    ```ini
    [zookeeper]
    [[clusters]]
    [[[default]]]
    host_ports=172.16.0.101:2181,172.16.0.102:2181,172.16.0.103:2181
    ```
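`host_ports` is a plain comma-separated list of `host:port` pairs, and a typo in it only surfaces at runtime. A small helper to sanity-check such a value before pasting it in (the quorum addresses are the ones from the config above):

```python
def parse_host_ports(value):
    """Split a Hue-style host_ports string into (host, port) tuples,
    raising ValueError on any malformed entry."""
    pairs = []
    for item in value.split(","):
        host, _, port = item.strip().partition(":")
        if not host or not port.isdigit():
            raise ValueError(f"bad host:port entry: {item!r}")
        pairs.append((host, int(port)))
    return pairs


quorum = "172.16.0.101:2181,172.16.0.102:2181,172.16.0.103:2181"
print(parse_host_ports(quorum))
```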

    6. Hue and Spark Integration

    ```ini
    # Option 1: Distributed SQL Engine (Thrift Server)
    [spark]
    # Note: this is the HiveServer2 address
    sql_server_host=172.16.0.5
    sql_server_port=10000
    security_enabled=false

    # Option 2: Apache Livy
    [spark]
    livy_server_url=http://172.16.0.2:8998
    ```

    7. Hue and Notebook Integration

    ```ini
    [notebook]
    [[interpreters]]

    # 1. Distributed SQL Engine
    # 1.1. SqlAlchemy
    # Fixes: "Password should be set if and only if in LDAP or CUSTOM mode;
    # Remove password or use one of those modes"
    [[[sparksql]]]
    name=Spark SQL
    interface=sqlalchemy
    options='{"url": "hive://root:root@172.16.0.5:10000/default"}'

    # 1.2. Thrift Server
    # Requires the [spark] section to be configured as well
    [[[sparksql]]]
    name=Spark SQL
    interface=hiveserver2

    # 2. Apache Livy
    # Requires the [spark] section to be configured as well
    [[[sparksql]]]
    name=Spark SQL
    interface=livy

    [[[spark]]]
    name=Scala
    interface=livy

    [[[hive]]]
    name=Hive
    interface=hiveserver2

    [[[mysql]]]
    name=MySQL
    interface=sqlalchemy
    options='{"url": "mysql://root:root@172.16.0.6:3306/huedb"}'
    ```
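Each `options` value above must be a valid JSON object wrapped in single quotes; a stray quote or comma breaks the interpreter with an unhelpful error. A quick validity check (connection URLs taken from the config above):

```python
import json


def check_options(options):
    """Verify an interpreter `options` value is a JSON object with a "url" key,
    and return that url."""
    parsed = json.loads(options)
    if not isinstance(parsed, dict) or "url" not in parsed:
        raise ValueError("options must be a JSON object with a 'url' key")
    return parsed["url"]


print(check_options('{"url": "hive://root:root@172.16.0.5:10000/default"}'))
print(check_options('{"url": "mysql://root:root@172.16.0.6:3306/huedb"}'))
```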
  • Reference configuration

    hue.ini

  • Restart Hue

    ```bash
    sudo docker restart hue
    ```

6. Disable Components

Configuration:

```ini
[desktop]
app_blacklist=impala,pig,hbase,sqoop,search
```