HDF Installation
HDF 3.0.2 Installation
1. Upgrade HDP to 2.6.3
2. Configure MySQL
mysql -u root -p
CREATE DATABASE registry;
CREATE DATABASE streamline;
CREATE USER 'registry'@'%' IDENTIFIED BY 'registry';
CREATE USER 'streamline'@'%' IDENTIFIED BY 'streamline';
GRANT ALL PRIVILEGES ON *.* TO 'registry'@'%' WITH GRANT OPTION;
GRANT ALL PRIVILEGES ON *.* TO 'streamline'@'%' WITH GRANT OPTION;
commit;
CREATE DATABASE druid DEFAULT CHARACTER SET utf8;
CREATE DATABASE superset DEFAULT CHARACTER SET utf8;
CREATE USER 'druid'@'%' IDENTIFIED BY 'druid';
CREATE USER 'superset'@'%' IDENTIFIED BY 'superset';
GRANT ALL PRIVILEGES ON *.* TO 'druid'@'%' WITH GRANT OPTION;
GRANT ALL PRIVILEGES ON *.* TO 'superset'@'%' WITH GRANT OPTION;
commit;
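The statements above can also be collected into one script and fed to mysql non-interactively; a sketch (the script file name and temp directory are illustrative only):

```shell
# Write the MySQL setup statements to a script file.
# This demo writes to a temp dir; on the database host you would run:
#   mysql -u root -p < hdf-setup.sql
workdir=$(mktemp -d)
cat > "$workdir/hdf-setup.sql" <<'EOF'
CREATE DATABASE registry;
CREATE DATABASE streamline;
CREATE DATABASE druid DEFAULT CHARACTER SET utf8;
CREATE DATABASE superset DEFAULT CHARACTER SET utf8;
CREATE USER 'registry'@'%' IDENTIFIED BY 'registry';
CREATE USER 'streamline'@'%' IDENTIFIED BY 'streamline';
CREATE USER 'druid'@'%' IDENTIFIED BY 'druid';
CREATE USER 'superset'@'%' IDENTIFIED BY 'superset';
GRANT ALL PRIVILEGES ON *.* TO 'registry'@'%' WITH GRANT OPTION;
GRANT ALL PRIVILEGES ON *.* TO 'streamline'@'%' WITH GRANT OPTION;
GRANT ALL PRIVILEGES ON *.* TO 'druid'@'%' WITH GRANT OPTION;
GRANT ALL PRIVILEGES ON *.* TO 'superset'@'%' WITH GRANT OPTION;
EOF
echo "wrote $workdir/hdf-setup.sql"
```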
3. Install the HDF mpack
ambari-server install-mpack \
--mpack=hdf-ambari-mpack-3.0.2.0-76.tar.gz \
--verbose
ambari-server restart
4. Configure a local yum repository
tar -zxf HDF-3.0.2.0-centos7-rpm.tar.gz -C /var/www/html/
service httpd restart
hdf.repo (create under /etc/yum.repos.d/):
#VERSION_NUMBER=3.0.2.0-76
[HDF-3.0.2.0]
name=HDF Version - HDF-3.0.2.0
baseurl=http://ip/HDF/centos7/3.0.2.0-76
gpgcheck=0
enabled=1
priority=1
yum clean all
yum makecache
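Generating the repo file can be scripted for repeatable installs; a sketch (writes to a temp dir here rather than /etc/yum.repos.d/, and `ip` is the same placeholder as above — substitute the web server's address):

```shell
# Generate hdf.repo; repo_dir stands in for /etc/yum.repos.d in this demo.
repo_dir=$(mktemp -d)
hdf_ip="ip"   # placeholder: replace with the httpd server's address
cat > "$repo_dir/hdf.repo" <<EOF
#VERSION_NUMBER=3.0.2.0-76
[HDF-3.0.2.0]
name=HDF Version - HDF-3.0.2.0
baseurl=http://${hdf_ip}/HDF/centos7/3.0.2.0-76
gpgcheck=0
enabled=1
priority=1
EOF
# Then on the node: yum clean all && yum makecache
echo "wrote $repo_dir/hdf.repo"
```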
5. Log in to Ambari and run the installation
6. Install NiFi
Set the NiFi password: nifi123456789
7. Install Registry
Registry startup error:
Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.hadoop.util.StringUtils.toLowerCase
Fix: copy hadoop-common-2.7.3.2.6.3.0-235.jar into
/usr/hdf/3.0.2.0-76/registry/libs
and delete the old hadoop-common jar already under libs.
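The jar swap can be scripted; a sketch, using a temporary directory to stand in for the real paths (the source jar location under /usr/hdp is an assumption — use wherever the HDP 2.6.3 hadoop-common jar lives on your node):

```shell
# Demo of the hadoop-common jar swap in a sandbox. On a real node, set:
#   src_jar=/usr/hdp/2.6.3.0-235/hadoop/hadoop-common-2.7.3.2.6.3.0-235.jar  (assumed location)
#   libs=/usr/hdf/3.0.2.0-76/registry/libs
sandbox=$(mktemp -d)
libs="$sandbox/registry/libs"
mkdir -p "$libs"
touch "$libs/hadoop-common-2.7.3.jar"    # stale jar shipped with Registry (fake for the demo)
src_jar="$sandbox/hadoop-common-2.7.3.2.6.3.0-235.jar"
touch "$src_jar"                         # matching HDP jar (fake for the demo)

rm -f "$libs"/hadoop-common-*.jar        # drop the old hadoop-common
cp "$src_jar" "$libs/"                   # install the matching HDP jar
ls "$libs"
```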
8. Install Superset
(1) Upgrade OpenSSL to 1.0.2.
(2) Configuration: SECRET_KEY must be letters only;
change the port to 9099.
(3) Error: clicking the plus sign under Slices fails with
"UnboundLocalError: local variable 'url' referenced before assignment"
Details: https://community.hortonworks.com/questions/117751/error-500-when-adding-new-slice-in-superset.html
Fix:
https://github.com/SalehHindi/superset/commit/5dd1aefdf0d3a73d9dff6436031f4d88e618eac5
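Since the Superset step says SECRET_KEY should be letters only, one way to generate such a key; a sketch (the 32-character length is an arbitrary choice):

```shell
# Generate a 32-character, letters-only value for SECRET_KEY
secret_key=$(tr -dc 'A-Za-z' < /dev/urandom | head -c 32)
echo "$secret_key"
```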