Spark-SQL startup error: Error creating transactional connection factory

1. Spark SQL configuration

   Copy the configuration file $HIVE_HOME/conf/hive-site.xml into the $SPARK_HOME/conf directory.

   Copy the configuration file $HADOOP_HOME/etc/hadoop/hdfs-site.xml into the $SPARK_HOME/conf directory.
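The two copy steps above can be scripted as a minimal sketch. The install prefixes /usr/local/hive, /usr/local/hadoop, and /usr/local/spark are assumptions taken from this post's layout; adjust them to your environment:

```shell
# Assumed install prefixes -- override via environment if yours differ.
HIVE_HOME=${HIVE_HOME:-/usr/local/hive}
HADOOP_HOME=${HADOOP_HOME:-/usr/local/hadoop}
SPARK_HOME=${SPARK_HOME:-/usr/local/spark}

# Copy Hive's metastore config and the HDFS client config into Spark's
# conf dir so spark-sql talks to the same metastore and HDFS as Hive.
# Guarded so the script is a no-op when a source file is absent.
for f in "$HIVE_HOME/conf/hive-site.xml" "$HADOOP_HOME/etc/hadoop/hdfs-site.xml"; do
  if [ -f "$f" ]; then
    cp "$f" "$SPARK_HOME/conf/"
  fi
done
```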

2. Running Spark SQL
   In /usr/local/spark, running ./bin/spark-sql fails with: Error creating transactional connection factory
  
First, because Hive is configured with MySQL as its metastore database, the matching JDBC driver jar is required. The detailed error log confirms that the mysql-connector jar is indeed missing, so the path to the MySQL jar must be specified when starting Spark SQL, as follows:

./bin/spark-sql --driver-class-path /usr/local/hive/lib/mysql-connector-java-5.1.25.jar

With the driver on the classpath, spark-sql starts normally.
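A guarded sketch of the launch command above. The jar path is the one from this post; the exact mysql-connector version depends on what your Hive installation ships, so verify it in hive/lib. The commented-out spark.driver.extraClassPath line shows Spark's standard way to make the setting permanent instead of passing the flag on every start:

```shell
# Paths taken from this post -- verify the connector version in your hive/lib.
SPARK_HOME=${SPARK_HOME:-/usr/local/spark}
MYSQL_JAR=${MYSQL_JAR:-/usr/local/hive/lib/mysql-connector-java-5.1.25.jar}

# Launch spark-sql with the MySQL JDBC driver on the driver classpath.
# Guarded so this is a no-op on machines without Spark or the jar installed.
if [ -x "$SPARK_HOME/bin/spark-sql" ] && [ -f "$MYSQL_JAR" ]; then
  "$SPARK_HOME/bin/spark-sql" --driver-class-path "$MYSQL_JAR"
fi

# Permanent alternative: add this line to $SPARK_HOME/conf/spark-defaults.conf:
#   spark.driver.extraClassPath  /usr/local/hive/lib/mysql-connector-java-5.1.25.jar
```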
