
java.lang.VerifyError: Inconsistent stackmap frames at branch target 152 #98

Open
alankala opened this issue Jan 9, 2017 · 3 comments


@alankala

alankala commented Jan 9, 2017

Environment: Scala 2.11.8, Spark 2.1.0 (with spark-core_2.11, spark-streaming_2.11), spark-rabbitmq 0.5.1.

When I run my Spark Streaming job with spark-submit on my standalone Spark instance, I get the error below:
2017-01-09 12:32:17 ERROR Executor:91 - Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.VerifyError: Inconsistent stackmap frames at branch target 152
Exception Details:
Location:
akka/dispatch/Mailbox.processAllSystemMessages()V @152: getstatic
Reason:
Type top (current frame, locals[9]) is not assignable to 'akka/dispatch/sysmsg/SystemMessage' (stack map, locals[9])
Current Frame:
bci: @131
flags: { }
locals: { 'akka/dispatch/Mailbox', 'java/lang/InterruptedException', 'akka/dispatch/sysmsg/SystemMessage', top, 'akka/dispatch/Mailbox', 'java/lang/Throwable', 'java/lang/Throwable' }
stack: { integer }
Stackmap Frame:
bci: @152
flags: { }
locals: { 'akka/dispatch/Mailbox', 'java/lang/InterruptedException', 'akka/dispatch/sysmsg/SystemMessage', top, 'akka/dispatch/Mailbox', 'java/lang/Throwable', 'java/lang/Throwable', top, top, 'akka/dispatch/sysmsg/SystemMessage' }
stack: { }
Bytecode:
0x0000000: 014c 2ab2 0132 b601 35b6 0139 4db2 013e
0x0000010: 2cb6 0142 9900 522a b600 c69a 004b 2c4e
0x0000020: b201 3e2c b601 454d 2db9 0148 0100 2ab6
0x0000030: 0052 2db6 014b b801 0999 000e bb00 e759
0x0000040: 1301 4db7 010f 4cb2 013e 2cb6 0150 99ff
0x0000050: bf2a b600 c69a ffb8 2ab2 0132 b601 35b6
0x0000060: 0139 4da7 ffaa 2ab6 0052 b600 56b6 0154
0x0000070: b601 5a3a 04a7 0091 3a05 1905 3a06 1906
0x0000080: c100 e799 0015 1906 c000 e73a 0719 074c
0x0000090: b200 f63a 08a7 0071 b201 5f19 06b6 0163
0x00000a0: 3a0a 190a b601 6899 0006 1905 bf19 0ab6
0x00000b0: 016c c000 df3a 0b2a b600 52b6 0170 b601
0x00000c0: 76bb 000f 5919 0b2a b600 52b6 017a b601
0x00000d0: 80b6 0186 2ab6 018a bb01 8c59 b701 8e13
0x00000e0: 0190 b601 9419 09b6 0194 1301 96b6 0194
0x00000f0: 190b b601 99b6 0194 b601 9ab7 019d b601
0x0000100: a3b2 00f6 3a08 b201 3e2c b601 4299 0026
0x0000110: 2c3a 09b2 013e 2cb6 0145 4d19 09b9 0148
0x0000120: 0100 1904 2ab6 0052 b601 7a19 09b6 01a7
0x0000130: a7ff d62b c600 09b8 0109 572b bfb1
Exception Handler Table:
bci [290, 307] => handler: 120
Stackmap Table:
append_frame(@13,Object[#231],Object[#177])
append_frame(@71,Object[#177])
chop_frame(@102,1)
full_frame(@120,{Object[#2],Object[#231],Object[#177],Top,Object[#2],Object[#177]},{Object[#223]})
full_frame(@152,{Object[#2],Object[#231],Object[#177],Top,Object[#2],Object[#223],Object[#223],Top,Top,Object[#177]},{})
append_frame(@173,Object[#357])
full_frame(@262,{Object[#2],Object[#231],Object[#177],Top,Object[#2]},{})
same_frame(@307)
same_frame(@317)

at akka.dispatch.Mailboxes.<init>(Mailboxes.scala:33)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:628)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:109)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:100)
at org.apache.spark.streaming.rabbitmq.receiver.RabbitMQReceiver.onStart(RabbitMQInputDStream.scala:57)
at org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:149)
at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:131)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint$$anonfun$9.apply(ReceiverTracker.scala:607)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint$$anonfun$9.apply(ReceiverTracker.scala:597)
at org.apache.spark.SparkContext$$anonfun$34.apply(SparkContext.scala:2021)
at org.apache.spark.SparkContext$$anonfun$34.apply(SparkContext.scala:2021)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

I see this issue is also reported here: https://issues.apache.org/jira/browse/SPARK-16978
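For anyone debugging the same error: one blunt way to confirm that the corrupted stackmap frames in the shaded jar are the only problem is to disable bytecode verification on the driver and executors. This is a diagnostic sketch, not a fix, and it is my assumption rather than something from the linked thread; `-noverify` only masks the verification failure, and the main class and jar names below are placeholders.

```shell
# -noverify skips bytecode verification, so the inconsistent
# stackmap frames are ignored instead of raising VerifyError.
# Do not ship this to production; it only confirms the diagnosis.
# com.example.MyStreamingJob and the jar name are hypothetical.
spark-submit \
  --conf "spark.driver.extraJavaOptions=-noverify" \
  --conf "spark.executor.extraJavaOptions=-noverify" \
  --class com.example.MyStreamingJob \
  my-streaming-job-assembly.jar
```

If the job then runs, the classes themselves are functionally intact and only their StackMapTable attributes were rewritten incorrectly during shading.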

I started to get this error when I upgraded from spark-rabbitmq_1.6 to the latest version (0.5.1) and from Spark 1.6.1 to Spark 2.1.0.

Below is my pom.xml. If I remove the Guava shade plugin, the streaming job runs fine, but I need this plugin.

<properties>
    <slf4j.version>1.7.7</slf4j.version>
    <log4j.version>1.2.17</log4j.version>
    <mapr.hbase.version>5.0.0-mapr</mapr.hbase.version>
    <guava.version>19.0</guava.version>
</properties>
....
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>${guava.version}</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.1.0</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.11</artifactId>
  <version>2.1.0</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>com.stratio.receiver</groupId>
  <artifactId>spark-rabbitmq</artifactId>
  <version>0.5.1</version>
</dependency>
</dependencies>
<build>
.....
.....
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google</pattern>
            <shadedPattern>shadeio</shadedPattern>
            <includes>
              <include>com.google.**</include>
            </includes>
          </relocation>
        </relocations>
        <transformers>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer" />
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
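A note for readers with the same symptom: the relocation pattern above only touches `com.google.**`, yet every class in the fat jar (including the bundled Akka classes) is passed through the shade plugin's bytecode rewriter, and older plugin releases such as 2.4.3 bundle an ASM version that can regenerate StackMapTable attributes incorrectly for Java 7+ class files. One thing worth trying, sketched below as an assumption rather than a confirmed fix, is simply upgrading the plugin; the relocation itself stays unchanged.

```xml
<!-- Hypothetical variant of the configuration above: only the plugin
     version changes. Newer maven-shade-plugin releases bundle a newer
     ASM that recomputes stackmap frames correctly when rewriting
     classes, which is the usual remedy for this VerifyError. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google</pattern>
            <shadedPattern>shadeio</shadedPattern>
            <includes>
              <include>com.google.**</include>
            </includes>
          </relocation>
        </relocations>
        <transformers>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer" />
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
```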

Has anyone hit this issue before? If yes, how did you solve it?

Thanks
Akhila.

@alankala
Author

Any suggestions?

@brijeshkec11

I am also seeing the same issue with Scala 2.11 and Spark 2.0.1.
Any suggestions?

@compae
Member

compae commented Aug 18, 2017

We have now updated the Akka version, and in the coming weeks we will update the Spark version to 2.1.0.

Do you have any update?
