
scala - Exception in thread "main" java.lang.UnsupportedOperationException: getSubject is supported only if a security manager is allowed


Environment: Windows + IntelliJ IDEA + Maven + Scala framework

I am learning how to use Spark and was just trying to run the following code:

import org.apache.spark.{SparkConf, SparkContext}

object wordCount {
  def main(args: Array[String]): Unit = {
    // run Spark locally, using as many threads as there are cores
    val conf = new SparkConf().setMaster("local[*]").setAppName("WordCount")
    val sc = new SparkContext(conf)
    sc.setLogLevel("WARN")

    // read the input and split each line into words
    val fileRDD = sc.textFile("datas/words")
    val wordRDD = fileRDD.flatMap(_.split(" "))

    // pair each word with 1, then sum the 1s per word
    val word2OneRDD = wordRDD.map((_, 1))
    val word2CountRDD = word2OneRDD.reduceByKey(_ + _)

    // print the counts, first from the executors, then after collecting to the driver
    word2CountRDD.foreach(println)
    val collect = word2CountRDD.collect().toBuffer
    collect.foreach(println)

    // merge into a single partition and write the result out
    word2CountRDD.repartition(1).saveAsTextFile("datas/words/result")
    println("save to datas/words/result")

    sc.stop()
    println("done")
  }
}
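
For reference, assuming datas/words contained a single line such as "hello spark hello scala" (a made-up sample, not my actual data), I would expect the two print steps to show tuples like these, in no particular order:

(hello,2)
(spark,1)
(scala,1)

Instead, the program dies with the exception in the title.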

And here is the pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>untitled1</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.12</artifactId>
            <version>3.3.0</version>
        </dependency>
    </dependencies>
</project>
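
Side note: since this pom has no Scala plugin, Maven itself cannot compile the .scala file; I rely on IDEA's Scala support to build and run. If anyone needs to reproduce this with Maven alone, I believe a build section along these lines would also be needed (the plugin version is just a recent one I found, not something I have verified):

    <build>
        <plugins>
            <plugin>
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>4.8.1</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>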

I guess the problem is caused by the Hadoop libraries that Maven pulls in, which don't work properly on Windows, but I can't figure out how to fix it.
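
From what I have read so far, the exception seems to come from Hadoop's UserGroupInformation, which Spark touches during SparkContext startup: it calls javax.security.auth.Subject.getSubject, and on recent JDKs (23 and later) that method throws exactly this UnsupportedOperationException unless the JVM was started with a security manager allowed. If that diagnosis is right, one workaround (untested on my machine) would be to add this VM option to the IDEA run configuration:

-Djava.security.manager=allow

The other option I am considering is switching the project SDK to JDK 17, 11, or 8, which are the Java versions Spark 3.3.0 officially supports; on those, Subject.getSubject is only deprecated and still works.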

