A problem encountered with GaussDB

Run a join query across several Hive tables, then write the result into GaussDB:

package com.spark

import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.{DataFrame, SaveMode}
import org.apache.spark.{SparkConf, SparkContext}

object SparkOnHive1 {
  def main(args: Array[String]): Unit = {
    // Option 1: read Hive data through a SparkSession
    //val spark = SparkSession.builder().appName("scala Spark Hive").enableHiveSupport().getOrCreate()
    // Option 2: read Hive data through a HiveContext
    val sparkConf = new SparkConf().setAppName("SparkHive")
    val sc = new SparkContext(sparkConf)
    val sqlContext = new HiveContext(sc)
    // Join query across the Hive tables
    val dataFrame: DataFrame = sqlContext.sql("SELECT distinct f.* " +
      "FROM yxls_mirror.fc_gc a " +
      "left join sjzt_amr_buff.c_mp b " +
      "on a.gc_id = b.cons_id " +
      "left join sjzt_amr_buff.c_meter_mp_rela c " +
      "on b.mp_id = c.mp_id " +
      "left join sjzt_amr_buff.c_meter d " +
      "on c.meter_id = d.meter_id " +
      "left join sjzt_amr_buff.p_mped e " +
      "on d.meter_id = e.meter_id " +
      "left join sjzt_amr_buff.e_mp_cur_curve f " +
      "on e.mped_id = f.id " +
      "where f.data_date > '2016-12-31 00:00:00'")
    // Print the result
    dataFrame.show()
    // Write into GaussDB over JDBC
    dataFrame.write
      .format("jdbc")
      .option("url", "jdbc:postgresql://25.39.82.191:28000/postgres?characterEncoding=utf-8")
      .option("dbtable", "day_ele")
      .option("driver", "org.postgresql.Driver")
      .option("user", "scene_test")
      .option("password", "[email protected]")
      .mode(SaveMode.Append)
      .save()
  }
}

Error:
[Screenshot: GaussDB connection error]
When this kind of error appears, it is usually caused by the IP address, the port, or a firewall; check each of these in turn.

Our company exposes two ports for this database, so make sure to connect with the internally open IP.
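To rule out IP, port, and firewall problems before re-running the Spark job, a plain TCP reachability check is often enough. A minimal sketch on the JVM (the host and port defaults below are the GaussDB ones from the JDBC URL above; adjust them to your own environment):

```java
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {

    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    // A false result points at a wrong IP/port or a firewall dropping packets.
    static boolean canConnect(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Defaults taken from the JDBC URL in the Spark job above.
        String host = args.length > 0 ? args[0] : "25.39.82.191";
        int port = args.length > 1 ? Integer.parseInt(args[1]) : 28000;
        System.out.println(host + ":" + port + " reachable = "
                + canConnect(host, port, 2000));
    }
}
```

If this prints `reachable = false` from the machine running the Spark driver/executors, fix the network path (IP, port, firewall rules) first; the JDBC write cannot succeed until the port is reachable.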