
spark-submit reports the error java.lang.IllegalArgumentException: System memory 468189184 must be at least 4.718592E8

Running spark-submit fails because there is not enough memory. Setting driver-memory and executor-memory does not help; the following error is still reported:

ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: System memory 468189184 must be at least 4.718592E8. Please use a larger heap size.

Looking at the Spark source code, the check is implemented as follows:

  /**
   * Return the total amount of memory shared between execution and storage, in bytes.
   */
  private def getMaxMemory(conf: SparkConf): Long = {
    val systemMemory = conf.getLong("spark.testing.memory", Runtime.getRuntime.maxMemory)
    val reservedMemory = conf.getLong("spark.testing.reservedMemory",
      if (conf.contains("spark.testing")) 0 else RESERVED_SYSTEM_MEMORY_BYTES)
    val minSystemMemory = reservedMemory * 1.5
    if (systemMemory < minSystemMemory) {
      throw new IllegalArgumentException(s"System memory $systemMemory must " +
        s"be at least $minSystemMemory. Please use a larger heap size.")
    }
    val usableMemory = systemMemory - reservedMemory
    val memoryFraction = conf.getDouble("spark.memory.fraction", 0.75)
    (usableMemory * memoryFraction).toLong
  }

The problem is in the highlighted lines (bolded in the original post): systemMemory is read from the spark.testing.memory parameter, falling back to Runtime.getRuntime.maxMemory, and it must be at least reservedMemory * 1.5. A quick check of the numbers is sketched below.
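In the Spark source, RESERVED_SYSTEM_MEMORY_BYTES is 300 MB, which makes the threshold exactly the value in the error message. The following standalone sketch (not Spark code, just the same arithmetic) shows why the check fails:

object MemoryThresholdCheck {
  def main(args: Array[String]): Unit = {
    val reservedMemory  = 300L * 1024 * 1024    // RESERVED_SYSTEM_MEMORY_BYTES = 314,572,800 bytes
    val minSystemMemory = reservedMemory * 1.5  // 471,859,200 bytes, printed as 4.718592E8
    val systemMemory    = 468189184L            // Runtime.getRuntime.maxMemory in the failing run
    println(s"required >= $minSystemMemory, actual = $systemMemory, ok = ${systemMemory >= minSystemMemory}")
  }
}

Since the JVM's default max heap here (468,189,184 bytes, roughly 446 MB) falls just below the 450 MB threshold, the exception is thrown. The solutions are as follows: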

Method 1: set the parameter in the application code (a fuller standalone sketch follows below):

  val conf = new SparkConf().setAppName("s")
  conf.set("spark.testing.memory", "2147480000")  // any value larger than 512 MB is enough

Method 2: in Eclipse, open Run Configuration, switch to the Arguments tab, and add the following line under VM arguments (again, any value larger than 512 MB is enough); an equivalent spark-submit form is sketched after this:

-Dspark.testing.memory=1073741824
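When submitting from the command line instead of the IDE, the same property can be passed to spark-submit with --conf (the class and jar names below are placeholders):

spark-submit \
  --class com.example.TestingMemoryExample \
  --conf spark.testing.memory=1073741824 \
  --driver-memory 1g \
  myapp.jar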

Method 3: when the job is submitted in yarn-cluster mode, set spark.testing.memory=1073741824 in the spark-defaults.conf file.

The driver's JVM options can also be configured in the same file:

spark.driver.extraJavaOptions -XX:PermSize=256M -XX:MaxPermSize=512M
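Putting the two settings together, the relevant lines in conf/spark-defaults.conf would look roughly like this (the values are illustrative):

# conf/spark-defaults.conf
spark.testing.memory            1073741824
spark.driver.extraJavaOptions   -XX:PermSize=256M -XX:MaxPermSize=512M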