Building Apache Spark SQL core
Problem description:
I am trying to build the Apache Spark SQL core module (1.4.1) and I get the stack trace below. However, if I build the entire Spark project, everything goes smoothly and the build completes successfully. Any ideas?
Stack trace
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/DataFrameReader.scala:258: value globPathIfNecessary is not a member of org.apache.spark.deploy.SparkHadoopUtil
[error] SparkHadoopUtil.get.globPathIfNecessary(qualified)
[error] ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/DataFrameReader.scala:263: value map is not a member of Array[Nothing]
[error] globbedPaths.map(_.toString), None, None, extraOptions.toMap)(sqlContext))
[error] ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/execution/expressions/MonotonicallyIncreasingID.scala:22: object Nondeterministic is not a member of package org.apache.spark.sql.catalyst.expressions
[error] import org.apache.spark.sql.catalyst.expressions.{Nondeterministic, LeafExpression}
[error] ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/execution/expressions/MonotonicallyIncreasingID.scala:36: not found: type Nondeterministic
[error] private[sql] case class MonotonicallyIncreasingID() extends LeafExpression with Nondeterministic {
[error] ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/execution/expressions/SparkPartitionID.scala:22: object Nondeterministic is not a member of package org.apache.spark.sql.catalyst.expressions
[error] import org.apache.spark.sql.catalyst.expressions.{Nondeterministic, LeafExpression}
[error] ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/execution/expressions/SparkPartitionID.scala:30: not found: type Nondeterministic
[error] private[sql] case object SparkPartitionID extends LeafExpression with Nondeterministic {
[error] ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/sources/ddl.scala:252: value globPathIfNecessary is not a member of org.apache.spark.deploy.SparkHadoopUtil
[error] SparkHadoopUtil.get.globPathIfNecessary(qualifiedPattern).map(_.toString).toArray
[error] ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/sources/ddl.scala:279: value globPathIfNecessary is not a member of org.apache.spark.deploy.SparkHadoopUtil
[error] SparkHadoopUtil.get.globPathIfNecessary(qualifiedPattern).map(_.toString).toArray
[error] ^
[error] 8 errors found
[debug] Compilation failed (CompilerInterface)
[error] Compile failed at Jul 21, 2015 5:57:38 AM [29.605s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 38.435s
[INFO] Finished at: Tue Jul 21 05:57:38 UTC 2015
[INFO] Final Memory: 37M/609M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on project spark-sql_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed. CompileFailed -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on project spark-sql_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed.
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:225)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.PluginExecutionException: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed.
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:110)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
... 19 more
Caused by: Compile failed via zinc server
at sbt_inc.SbtIncrementalCompiler.zincCompile(SbtIncrementalCompiler.java:136)
at sbt_inc.SbtIncrementalCompiler.compile(SbtIncrementalCompiler.java:86)
at scala_maven.ScalaCompilerSupport.incrementalCompile(ScalaCompilerSupport.java:303)
at scala_maven.ScalaCompilerSupport.compile(ScalaCompilerSupport.java:119)
at scala_maven.ScalaCompilerSupport.doExecute(ScalaCompilerSupport.java:99)
at scala_maven.ScalaMojoSupport.execute(ScalaMojoSupport.java:482)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
... 20 more
[ERROR]
[ERROR]
Answer
Well, there is no globPathIfNecessary in SparkHadoopUtil in 1.4.1, so it must be a modification of your own. When you run the build from the top level, the Maven reactor sees the whole project, including your changes. When you build from a subproject, Maven resolves everything outside that subproject from the local repository, so it cannot see any of your modifications unless you have installed them. So run the build from the top level again, but use install instead of package so that your modified artifacts are installed into the local repository. Once you have done that, building from sql/core should resolve your changes successfully.
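A minimal sketch of the suggested fix, assuming a standard Spark source checkout built with Maven (paths and flags may vary with your setup):

```shell
# From the top of the Spark source tree: build and install every module
# into the local Maven repository (~/.m2), so sibling modules such as
# sql/core can resolve your modified spark-core artifact.
mvn -DskipTests clean install

# Afterwards, building only the SQL core module should succeed, because
# the modified dependencies are now available from the local repository.
cd sql/core
mvn -DskipTests clean package
```

Alternatively, Maven's `-pl`/`-am` flags build a single module together with the modules it depends on in one reactor (`mvn -pl sql/core -am -DskipTests package` from the top level), which avoids installing anything but recompiles the upstream modules each time.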