Hive Environment Setup

Related documentation

Installing Hive 0.13.1

Requirements:

  • JDK 1.7 or later
  • Hadoop 2.x
  • Linux (the most common installation platform)
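The "JDK 1.7 or later" requirement above can be checked with a small helper before installing. This is a minimal sketch; the version strings used in the usage comment are examples, and the way `java -version` is parsed may vary between JDK vendors.

```shell
#!/bin/sh
# Sketch: check that a JDK version string satisfies "1.7 or later".
# version_ok takes a string like "1.7.0_67" and succeeds if major.minor >= 1.7.
version_ok() {
  major=$(echo "$1" | cut -d. -f1)
  minor=$(echo "$1" | cut -d. -f2)
  [ "$major" -gt 1 ] || { [ "$major" -eq 1 ] && [ "$minor" -ge 7 ]; }
}

# Usage sketch (parsing `java -version` output is an assumption about its format):
#   v=$(java -version 2>&1 | awk -F'"' '/version/ {print $2}')
#   version_ok "$v" && echo "JDK OK" || echo "need JDK 1.7+"
version_ok "1.7.0_67" && echo "1.7.0_67 ok"
version_ok "1.6.0_45" || echo "1.6.0_45 too old"
```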

Installation steps:

  • Start the HDFS and YARN services
    • sbin/start-dfs.sh
    • sbin/start-yarn.sh
    • sbin/mr-jobhistory-daemon.sh start historyserver
  • Extract the Hive tarball to /usr/local (or whatever directory you prefer)
  • Edit hive-env.sh:
# Set HADOOP_HOME to point to a specific hadoop install directory
HADOOP_HOME=/opt/app/hadoop-2.5.0

# Hive Configuration Directory can be controlled by:
export HIVE_CONF_DIR=/usr/local/hive-0.13.1/conf
  • Create hive-site.xml:
<configuration>
        <property>
                <name>hive.metastore.local</name>
                <value>true</value>
        </property>
        <property>
                <name>javax.jdo.option.ConnectionURL</name>
                <value>jdbc:mysql://127.0.0.1:3306/hivedb?createDatabaseIfNotExist=true</value>
        </property>
        <property>
                <name>javax.jdo.option.ConnectionDriverName</name>
                <value>com.mysql.jdbc.Driver</value>
        </property>
        <property>
                <name>javax.jdo.option.ConnectionUserName</name>
                <value>root</value>
        </property>
        <property>
                <name>javax.jdo.option.ConnectionPassword</name>
                <value>MyNewPass4!</value>
        </property>
		<property>
			<name>datanucleus.schema.autoCreateAll</name>
			<value>true</value>
		</property>

		<property>
			<name>hive.metastore.schema.verification</name>
			<value>false</value>
		</property>
</configuration>

  • Create the warehouse directories in HDFS (-p creates missing parents such as /user/hive)
  $ $HADOOP_HOME/bin/hdfs dfs -mkdir -p    /tmp
  $ $HADOOP_HOME/bin/hdfs dfs -mkdir -p    /user/hive/warehouse
  $ $HADOOP_HOME/bin/hdfs dfs -chmod g+w   /tmp
  $ $HADOOP_HOME/bin/hdfs dfs -chmod g+w   /user/hive/warehouse
  • Run Hive
    • bin/hive
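The HDFS directory setup in the steps above can be wrapped in a small, re-runnable script. This is a sketch only: `HDFS` defaults to a dry-run `echo` here so the commands can be previewed without a cluster; on a real machine you would point it at `$HADOOP_HOME/bin/hdfs` (the path is an assumption matching this guide).

```shell
#!/bin/sh
# Sketch: idempotent setup of Hive's HDFS warehouse directories.
# HDFS defaults to `echo hdfs` (dry run, just prints the commands);
# override with HDFS="$HADOOP_HOME/bin/hdfs" to actually run them.
HDFS="${HDFS:-echo hdfs}"

setup_warehouse() {
  # -p creates missing parents (/user/hive) and tolerates existing dirs,
  # so this function is safe to run more than once.
  $HDFS dfs -mkdir -p /tmp
  $HDFS dfs -mkdir -p /user/hive/warehouse
  $HDFS dfs -chmod g+w /tmp
  $HDFS dfs -chmod g+w /user/hive/warehouse
}

setup_warehouse
```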

Problems encountered during installation

  1. When running hive, a jar was missing (screenshot omitted):

  • This happened because the extracted 0.13.1 package had no lib directory; extract another release and copy its lib directory into the 0.13.1 install directory.

  2. When running bin/hive, an error appeared (screenshot omitted):
  • See the referenced blogs (blog, blog 2)
  • I overwrote the share directory in hadoop 2.5.0 with the share directory from a hadoop 3 release
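The lib-directory workaround from problem 1 can be sketched as a small helper. Both default paths are assumptions (the donor release 0.12.0 is only an example); note that the MySQL JDBC driver jar must also end up in lib/ for the metastore configuration above to work.

```shell
#!/bin/sh
# Sketch: copy lib/ from another extracted Hive release into the 0.13.1 install.
# Default paths are assumptions; pass explicit source/target dirs to override.
copy_hive_lib() {
  src=${1:-/usr/local/hive-0.12.0/lib}   # donor release's lib (assumed path)
  dst=${2:-/usr/local/hive-0.13.1/lib}   # target 0.13.1 lib
  mkdir -p "$dst"
  cp -r "$src"/. "$dst"/
}

# Usage sketch:
#   copy_hive_lib /usr/local/hive-0.12.0/lib /usr/local/hive-0.13.1/lib
#   cp mysql-connector-java-*.jar /usr/local/hive-0.13.1/lib/   # JDBC driver, too
```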