
java.lang.NoSuchMethodError: org.tartarus.snowball.models.EnglishStemmer.eq_s #103

Open
kumarivin opened this issue Aug 7, 2017 · 2 comments


kumarivin commented Aug 7, 2017

Hi, I am getting the exception below while running the code through the Java API:
    Exception in thread "main" java.lang.NoSuchMethodError: org.tartarus.snowball.models.EnglishStemmer.eq_s(ILjava/lang/String;)Z
        at org.tartarus.snowball.models.EnglishStemmer.r_prelude(EnglishStemmer.java:188)
        at org.tartarus.snowball.models.EnglishStemmer.stem(EnglishStemmer.java:1196)
        at fr.free.rocheteau.jerome.engines.Stemmer.process(Stemmer.java:108)
        at org.apache.uima.analysis_component.JCasAnnotator_ImplBase.process(JCasAnnotator_ImplBase.java:48)
        at org.apache.uima.analysis_engine.impl.PrimitiveAnalysisEngine_impl.callAnalysisComponentProcess(PrimitiveAnalysisEngine_impl.java:401)
        at org.apache.uima.analysis_engine.impl.PrimitiveAnalysisEngine_impl.processAndOutputNewCASes(PrimitiveAnalysisEngine_impl.java:318)
        at org.apache.uima.analysis_engine.asb.impl.ASB_impl$AggregateCasIterator.processUntilNextOutputCas(ASB_impl.java:570)
        at org.apache.uima.analysis_engine.asb.impl.ASB_impl$AggregateCasIterator.<init>(ASB_impl.java:412)
        at org.apache.uima.analysis_engine.asb.impl.ASB_impl.process(ASB_impl.java:344)
        at org.apache.uima.analysis_engine.impl.AggregateAnalysisEngine_impl.processAndOutputNewCASes(AggregateAnalysisEngine_impl.java:271)
        at org.apache.uima.analysis_engine.asb.impl.ASB_impl$AggregateCasIterator.processUntilNextOutputCas(ASB_impl.java:570)
        at org.apache.uima.analysis_engine.asb.impl.ASB_impl$AggregateCasIterator.<init>(ASB_impl.java:412)
        at org.apache.uima.analysis_engine.asb.impl.ASB_impl.process(ASB_impl.java:344)
        at org.apache.uima.analysis_engine.impl.AggregateAnalysisEngine_impl.processAndOutputNewCASes(AggregateAnalysisEngine_impl.java:271)
        at org.apache.uima.analysis_engine.impl.AnalysisEngineImplBase.process(AnalysisEngineImplBase.java:269)
        at org.apache.uima.analysis_engine.impl.AnalysisEngineImplBase.process(AnalysisEngineImplBase.java:284)
        at fr.univnantes.termsuite.framework.service.PreprocessorService.prepare(PreprocessorService.java:81)
        at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
        at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
        at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
        at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
        at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
        at java.util.Iterator.forEachRemaining(Iterator.java:116)
        at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
        at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
        at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
        at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
        at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
        at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
        at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
        at fr.univnantes.termsuite.api.Preprocessor.toIndexedCorpus(Preprocessor.java:218)
        at fr.univnantes.termsuite.api.Preprocessor.toIndexedCorpus(Preprocessor.java:133)
        at fr.univnantes.termsuite.api.Preprocessor.toIndexedCorpus(Preprocessor.java:127)
        at fr.univnantes.termsuite.api.Preprocessor.toIndexedCorpus(Preprocessor.java:120)
        at com.fractal.data_extraction.TermExtractor.<init>(TermExtractor.scala:16)
        at com.fractal.data_extraction.TermExtractor$.delayedEndpoint$com$fractal$data_extraction$TermExtractor$1(TermExtractor.scala:21)
        at com.fractal.data_extraction.TermExtractor$delayedInit$body.apply(TermExtractor.scala:20)
        at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
        at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
        at scala.App$$anonfun$main$1.apply(App.scala:76)
        at scala.App$$anonfun$main$1.apply(App.scala:76)
        at scala.collection.immutable.List.foreach(List.scala:383)
        at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
        at scala.App$class.main(App.scala:76)
        at com.fractal.data_extraction.TermExtractor$.main(TermExtractor.scala:20)
        at com.fractal.data_extraction.TermExtractor.main(TermExtractor.scala)

Code used:

```scala
import java.nio.file.Paths

import fr.univnantes.termsuite.api.{TXTCorpus, TermSuite}
import fr.univnantes.termsuite.model.Lang

class TermExtractor {
  val txtCorpus = new TXTCorpus(Lang.EN, Paths.get("/Users/trinadhimmedisetty/Downloads/TERMSUITE_WORKSPACE/wind-energy/English"))

  val indexedCorpus = TermSuite.preprocessor
    .setTaggerPath(Paths.get("/Users/trinadhimmedisetty/Downloads/TERMSUITE_WORKSPACE/treetagger"))
    .toIndexedCorpus(txtCorpus, 500000)

  val terminology = indexedCorpus.getTerminology
}

object TermExtractor extends App {
  val tempobj = new TermExtractor
  println(tempobj.terminology)
}
```

Dependencies used:

    <dependency>
        <groupId>fr.univ-nantes.termsuite</groupId>
        <artifactId>termsuite-core</artifactId>
        <version>3.0.9</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.antlr/antlr4 -->
    <dependency>
        <groupId>org.antlr</groupId>
        <artifactId>antlr4</artifactId>
        <version>4.7</version>
    </dependency>
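A `NoSuchMethodError` for a method on a class that is clearly on the classpath usually means two jars ship different versions of that class and the JVM loaded the wrong one. As a quick diagnostic (an editorial sketch, not part of the original report; the class name `WhichJar` is invented), printing where `EnglishStemmer` is actually loaded from shows which jar wins:

```java
import org.tartarus.snowball.models.EnglishStemmer;

// Hypothetical helper: print the jar that actually provides EnglishStemmer at
// runtime, i.e. the class the NoSuchMethodError above points at.
public class WhichJar {
    public static void main(String[] args) {
        System.out.println(EnglishStemmer.class
                .getProtectionDomain()
                .getCodeSource()
                .getLocation());
    }
}
```

If this prints a `lucene-analyzers-common` jar rather than the snowball jar pulled in by TermSuite, the error above is a classpath conflict rather than a TermSuite bug.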

dcram (Member) commented Aug 28, 2017

Interesting. What language is that?

I can assure you that 3.0.9 works fine with Lang.EN. Could you please give me the equivalent in Java?

Damien
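
For reference, a rough Java equivalent of the reporter's Scala snippet (an editorial sketch against the same TermSuite 3.0.9 API, reusing the paths from the original code; the class name `TermExtractorJava` is invented) would look like this:

```java
import java.nio.file.Paths;

import fr.univnantes.termsuite.api.TXTCorpus;
import fr.univnantes.termsuite.api.TermSuite;
import fr.univnantes.termsuite.model.Lang;

public class TermExtractorJava {
    public static void main(String[] args) {
        // Same corpus and TreeTagger locations as in the Scala version.
        TXTCorpus txtCorpus = new TXTCorpus(Lang.EN,
                Paths.get("/Users/trinadhimmedisetty/Downloads/TERMSUITE_WORKSPACE/wind-energy/English"));

        // Preprocess the corpus, index it, and print the resulting terminology,
        // mirroring preprocessor -> toIndexedCorpus -> getTerminology above.
        System.out.println(TermSuite.preprocessor()
                .setTaggerPath(Paths.get("/Users/trinadhimmedisetty/Downloads/TERMSUITE_WORKSPACE/treetagger"))
                .toIndexedCorpus(txtCorpus, 500000)
                .getTerminology());
    }
}
```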

kumarivin (Author) commented Aug 28, 2017

This is written in Scala, and I did figure out the root cause, but I am still not sure how to fix it, as mentioned at weavejester/snowball-stemmer#2.
The root cause is that my project has a stanford-corenlp dependency, which pulls in lucene-analyzers-common as a transitive dependency. When I excluded that transitive dependency via a Maven exclusion, the aforementioned error disappeared, so the two are clearly conflicting, but I am not sure how to resolve it properly.

    <dependency>
        <groupId>edu.stanford.nlp</groupId>
        <artifactId>stanford-corenlp</artifactId>
        <version>3.8.0</version>
        <exclusions>
            <exclusion>
                <groupId>org.apache.lucene</groupId>
                <artifactId>lucene-analyzers-common</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
