Running a Spark program fails with the following error:

Exception in thread "main" java.lang.NoSuchFieldError: JAVA_9
	at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:207)
	at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
	at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:93)
	at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:370)
	at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:311)
	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:359)
	at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:189)
	at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:272)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:448)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2589)
	at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:937)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(S

The code that throws is here:

https://github.com/apache/spark/blob/branch-3.0/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala

Line 207: if (SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_9))

private[spark] object StorageUtils extends Logging {

  // In Java 8, the type of DirectBuffer.cleaner() was sun.misc.Cleaner, and it was possible
  // to access the method sun.misc.Cleaner.clean() to invoke it. The type changed to
  // jdk.internal.ref.Cleaner in later JDKs, and the .clean() method is not accessible even with
  // reflection. However sun.misc.Unsafe added a invokeCleaner() method in JDK 9+ and this is
  // still accessible with reflection.
  private val bufferCleaner: DirectBuffer => Unit =
    if (SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_9)) {
      val cleanerMethod =
        Utils.classForName("sun.misc.Unsafe").getMethod("invokeCleaner", classOf[ByteBuffer])
      val unsafeField = classOf[Unsafe].getDeclaredField("theUnsafe")
      unsafeField.setAccessible(true)
      val unsafe = unsafeField.get(null).asInstanceOf[Unsafe]
      buffer: DirectBuffer => cleanerMethod.invoke(unsafe, buffer)
    } else {
      val cleanerMethod = Utils.classForName("sun.misc.Cleaner").getMethod("clean")
      buffer: DirectBuffer => {
        // Careful to avoid the return type of .cleaner(), which changes with JDK
        val cleaner: AnyRef = buffer.cleaner()
        if (cleaner != null) {
          cleanerMethod.invoke(cleaner)
        }
      }
    }

A NoSuchFieldError: JAVA_9 at this spot means the JavaVersion class actually loaded at runtime is an old copy that predates the JAVA_9 constant, i.e. some package pulled in through Maven is incompatible with what Spark was built against. First, adjust the hadoop-common and hadoop-mapreduce-client-core references, making sure to add <scope>provided</scope> so these jars are compiled against but not bundled into the application jar (the cluster supplies its own copies at runtime):

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>${hadoop.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>${hadoop.version}</version>
            <scope>provided</scope>
        </dependency>

After the change, reimport the Maven project and run the program again. Still the same error. No choice but to keep searching, and I finally found a resource that explained the key point: Spark 3.x depends on the commons-lang3 package.
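Before patching anything blindly, it helps to see which commons-lang3 versions are actually on the classpath and which dependencies pull them in. The standard maven-dependency-plugin can show that; the -Dincludes filter below just narrows the tree to the one artifact we care about:

        mvn dependency:tree -Dincludes=org.apache.commons:commons-lang3

Any copy older than what Spark expects (the JAVA_9 constant was only added to JavaVersion in later commons-lang3 releases) is a candidate culprit.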

The ordinary fix is plain conflict resolution: if commons-lang3 is missing, or only an old version is present, add it explicitly in the pom:

        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-lang3</artifactId>
            <version>3.8.1</version>
        </dependency>
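If some other dependency keeps dragging in an older commons-lang3 transitively, another option is to pin the version for the whole build with dependencyManagement. A minimal sketch, reusing the 3.8.1 version from above:

        <dependencyManagement>
            <dependencies>
                <dependency>
                    <groupId>org.apache.commons</groupId>
                    <artifactId>commons-lang3</artifactId>
                    <version>3.8.1</version>
                </dependency>
            </dependencies>
        </dependencyManagement>

Maven then resolves every request for commons-lang3, direct or transitive, to this version.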

If it still fails, check whether the hive-exec dependency is present. Open that jar (for example, jar tf hive-exec-<version>.jar | grep JavaVersion) and you will find that it, too, contains commons-lang3, but its JavaVersion class is different: an older copy without the JAVA_9 field!

If Hive is the culprit, excluding hive-exec solves it, as in the sketch below.
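A minimal sketch of that exclusion. It assumes hive-exec arrives transitively through spark-hive; that enclosing artifact is only an example, so attach the exclusion to whichever dependency the dependency:tree output shows actually pulling hive-exec in:

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.12</artifactId>
            <version>${spark.version}</version>
            <!-- example host dependency; put the exclusion wherever hive-exec really comes from -->
            <exclusions>
                <exclusion>
                    <groupId>org.apache.hive</groupId>
                    <artifactId>hive-exec</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

Since hive-exec is an uber jar that bundles the conflicting classes directly, excluding commons-lang3 from it would not help; the whole artifact has to go.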

Summary

        Thanks to everyone who read this far 😉

        That's it for this share. 猫头鹰数据 is dedicated to sharing practical technical content with you 😎

        If any slips or errors crept into the above, corrections are very welcome 😅

        If this helped, or you are simply interested in the topic, a like and a follow are much appreciated 🙏

        You can also search for and follow my WeChat official account, 【猫头鹰数据分析】, and leave a message to chat 🙏
