
Spark error: Container killed on request (exit code 143)

Author: 程序媛啊 | Published 2022-05-23 14:36

Error log 1:

For more detailed output, check application tracking page:http://bigserver1:8088/cluster/app/application_1555651019351_0001Then, click on links to logs of each attempt.
Diagnostics: Container [pid=280568,containerID=container_e09_1555651019351_0001_02_000001] is running beyond virtual memory limits. Current usage: 383.9 MB of 1 GB physical memory used; 1.5 GB of 1.1 GB virtual memory used. Killing container.

Killed by external signal
... (log truncated) ...

Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
Failing this attempt
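The exit code in log 1 follows the standard Unix convention for a signal-terminated process: 128 plus the signal number. YARN sends SIGTERM (signal 15) when it kills a container, which yields 128 + 15 = 143. A quick check:

```python
import signal

# A process killed by a signal conventionally exits with 128 + signal number.
# YARN terminates containers with SIGTERM (signal 15), hence 128 + 15 = 143.
print(128 + int(signal.SIGTERM))  # 143 on Linux
```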

Error log 2:

2019-04-24 14:42:31 ERROR TransportRequestHandler:293 - Error sending result RpcResponse{requestId=8586978943123146370, body=NioManagedBuffer{buf=java.nio.HeapByteBuffer[pos=0 lim=13 cap=13]}} to /10.0.40.222:34982; closing connection
io.netty.handler.codec.EncoderException: java.lang.OutOfMemoryError: Java heap space

Both errors above are caused by Spark being allocated too little memory: log 1 shows a container exceeding its memory limit and being killed by YARN, and log 2 shows the JVM itself running out of heap space. The fix is to raise the driver and executor memory by adding to the submit command:

--conf spark.driver.memory=10g
--conf spark.executor.memory=32g
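Put together, a full submit command might look like the sketch below. The master/deploy mode, application class, and jar name are illustrative placeholders, and the memory sizes should be tuned to what your cluster can actually grant per container:

```shell
# Illustrative spark-submit invocation on YARN; class, jar, and
# memory sizes are placeholders -- adjust them for your cluster.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.driver.memory=10g \
  --conf spark.executor.memory=32g \
  --class com.example.MyApp \
  my-app.jar
```

Note that the requested sizes must fit within YARN's per-container maximum (yarn.scheduler.maximum-allocation-mb), or the application will fail to get containers at all.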

Reference: http://blog.51yip.com/hadoop/2123.html
