A master URL must be set in your configuration
org.apache.spark.SparkException: A master URL must be set in your configuration
at org.apache.spark.SparkContext.<init>(SparkContext.scala:376)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2516)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:918)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:910)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:910)
at com.dayima.local.SparkOnHBaseSparkSession$.main(SparkOnHBaseSparkSession.scala:15)
at com.dayima.local.SparkOnHBaseSparkSession.main(SparkOnHBaseSparkSession.scala)
There are two likely causes.
Cause 1: the program is written to run on a cluster, but while testing it locally you forgot to call
setMaster("local")
Solution: add setMaster("local") to the SparkConf for local testing.
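A minimal sketch of a SparkConf-based entry point with the master set for local testing (the object and app names here are hypothetical, chosen only for illustration):

import org.apache.spark.{SparkConf, SparkContext}

object LocalTestApp {
  def main(args: Array[String]): Unit = {
    // Set the master explicitly for local testing; remove this (or override
    // it via spark-submit --master) when running on the cluster.
    val conf = new SparkConf()
      .setAppName("LocalTestApp")
      .setMaster("local")  // single local thread; use "local[*]" for all cores

    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 10).sum())  // quick smoke test
    sc.stop()
  }
}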
Cause 2: the code never creates a SparkConf locally, only a SparkSession, so there is no SparkConf on which to call setMaster("local").
Solution:
Click Edit Configurations, select this project's run configuration on the left, and in the VM options field on the right enter "-Dspark.master=local". This tells the program to run locally on a single thread. Then run it again.
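A sketch of what such a SparkSession-only entry point might look like, assuming the -Dspark.master=local VM option is set in the run configuration (the object name follows the stack trace above; the body is illustrative):

import org.apache.spark.sql.SparkSession

object SparkOnHBaseSparkSession {
  def main(args: Array[String]): Unit = {
    // No SparkConf and no master set in code; the master URL is read from
    // the JVM system property supplied in the run configuration:
    //   -Dspark.master=local
    val spark = SparkSession.builder()
      .appName("SparkOnHBaseSparkSession")
      .getOrCreate()

    spark.range(10).show()  // quick smoke test
    spark.stop()
  }
}

Because the master is supplied through a JVM system property rather than hard-coded, the same source can later be submitted to a cluster unchanged, with the master passed via spark-submit --master instead.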
(Screenshot of the run configuration)