While deploying a Spark job, I hit the following error:
17/11/01 16:16:32 ERROR scheduler.JobScheduler: Error running job streaming job 1509524182000 ms.0
org.apache.spark.SparkException: Job aborted due to stage failure: Task 3 in stage 0.0 failed 4 times, most recent failure: Lost task 3.3 in stage 0.0 (TID 11, centos067): java.lang.VerifyError: Bad type on operand stack
Exception Details:
Location:
com/clife/data/base/HttpClientUtil$.httpPost(Ljava/lang/String;Lscala/collection/immutable/Map;)Z @134: invokevirtual
Reason:
Type 'org/apache/http/impl/client/DefaultHttpClient' (current frame, stack[0]) is not assignable to 'org/apache/http/impl/client/CloseableHttpClient'
Current Frame:
bci: @134
flags: { }
locals: { 'com/clife/data/base/HttpClientUtil$', 'java/lang/String', 'scala/collection/immutable/Map', top, 'org/apache/http/impl/client/DefaultHttpClient', 'org/apache/http/client/methods/HttpPost' }
stack: { 'org/apache/http/impl/client/DefaultHttpClient', 'org/apache/http/client/methods/HttpPost' }
This was the first time I had run into a java.lang.VerifyError. After digging through quite a few resources online, the root cause turned out to be a jar version conflict. This post draws on the following article: http://www.cnblogs.com/liuting/p/7210266.html
In fact, the error message itself points at the culprit: the failure occurs inside a hand-written HTTP POST helper, com/clife/data/base/HttpClientUtil$.httpPost, so the working diagnosis was a version conflict on the httpclient jar. The verifier's complaint makes sense once you know the class history: the bytecode was compiled against an HttpClient version in which DefaultHttpClient extends CloseableHttpClient (4.3 and later), but at runtime an older httpclient jar was loaded first, where that assignment does not hold, so class verification fails. Eclipse's dependency hierarchy view confirmed that many jars were pulling in transitive copies of httpclient and httpcore (these have since been excluded in the pom).
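Besides Eclipse's dependency hierarchy view, the same conflict can be located from the command line with Maven's dependency plugin. A sketch, to be run from the directory containing the project's pom:

```shell
# List every path through which the org.apache.httpcomponents jars
# enter the build; a losing duplicate is printed as
# "(... omitted for conflict with x.y.z)"
mvn dependency:tree -Dincludes=org.apache.httpcomponents
```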
Usage also showed that the newer versions of these jars had removed many methods and classes present in the older ones. So, to stop mixing versions and settle on a single consistent one, I first added exclusions for these two jars in every pom dependency that pulled them in, then declared the two jars as explicit direct dependencies, and redeployed. That fixed the problem.
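The two-step fix above can be sketched as a pom fragment. This is an illustration, not the original project's pom: `com.example:some-lib` stands in for whichever dependencies were dragging in the conflicting jars, and the 4.3.x versions are placeholders; pin whatever single version your code was actually compiled against.

```xml
<!-- Step 1: exclude the transitive copies of httpclient and httpcore
     from every dependency that pulls them in
     ("com.example:some-lib" is a placeholder) -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>some-lib</artifactId>
  <version>1.0</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.httpcomponents</groupId>
      <artifactId>httpclient</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.apache.httpcomponents</groupId>
      <artifactId>httpcore</artifactId>
    </exclusion>
  </exclusions>
</dependency>

<!-- Step 2: declare the two jars once, explicitly, at a single version -->
<dependency>
  <groupId>org.apache.httpcomponents</groupId>
  <artifactId>httpclient</artifactId>
  <version>4.3.6</version>
</dependency>
<dependency>
  <groupId>org.apache.httpcomponents</groupId>
  <artifactId>httpcore</artifactId>
  <version>4.3.3</version>
</dependency>
```

Because direct dependencies win over transitive ones in Maven's "nearest wins" resolution, step 2 alone often suffices, but the explicit exclusions in step 1 make the intent unambiguous.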