About Spark core update to 3.4.1 and my working code stopping #14070
aksh0369-smarten started this conversation in General
Replies: 1 comment
-
Could you please share more info? Versions? Platform? Scala? Python? SBT? Any code? Is it a pipeline? A model? Just in case: if you are in a Scala/Java project, you must match the exact same Spark and Scala versions as our library, or else some models/pipelines fail with that Java error (it checks the sanity of the saved object).
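For illustration, here is a minimal build.sbt sketch of what "matching the exact Spark and Scala versions" looks like in an SBT project. The specific version numbers (Scala 2.12.x, Spark 3.4.1, Spark NLP 5.x) are assumptions for the example; take the real ones from the Spark NLP compatibility notes for your release.

```scala
// build.sbt -- minimal sketch; the version numbers below are illustrative assumptions.
// Spark NLP artifacts are published for Scala 2.12, so the project must build with 2.12 too.
ThisBuild / scalaVersion := "2.12.18"

val sparkVersion    = "3.4.1"  // must match the Spark runtime on the cluster
val sparkNlpVersion = "5.1.4"  // pick a Spark NLP release that supports Spark 3.4.x

libraryDependencies ++= Seq(
  "org.apache.spark"     %% "spark-core"  % sparkVersion % "provided",
  "org.apache.spark"     %% "spark-sql"   % sparkVersion % "provided",
  "org.apache.spark"     %% "spark-mllib" % sparkVersion % "provided",
  "com.johnsnowlabs.nlp" %% "spark-nlp"   % sparkNlpVersion
)
```

If the Scala binary version or the Spark version here drifts from what the cluster runs (or from what the library build expects), saved models can fail to deserialize with exactly the kind of InvalidClassException shown below.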
-
I updated my Spark core to 3.4.1 and my working code that uses John Snow Labs Spark NLP stopped working, showing the error below.
I am using the latest version of Spark NLP.
Can anyone help me?
java.io.InvalidClassException: com.johnsnowlabs.nlp.annotators.pos.perceptron.AveragedPerceptron; local class incompatible: stream classdesc serialVersionUID = -7114715142956979922, local class serialVersionUID = 6642857758815297725
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:699) ~[?:1.8.0_231]
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1885) ~[?:1.8.0_231]
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1751) ~[?:1.8.0_231]
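For context, the serialized AveragedPerceptron in the trace is deserialized when a saved POS model is loaded, so a Spark/Scala/Spark NLP version mismatch typically surfaces at that point. The snippet below is an assumed minimal reproduction, not the poster's actual code; with the build and cluster versions aligned as in the build.sbt sketch above, the same load succeeds.

```scala
// Assumed minimal reproduction -- not the original poster's code.
import com.johnsnowlabs.nlp.DocumentAssembler
import com.johnsnowlabs.nlp.annotators.Tokenizer
import com.johnsnowlabs.nlp.annotators.pos.perceptron.PerceptronModel
import org.apache.spark.ml.Pipeline
import org.apache.spark.sql.SparkSession

object PosRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("pos-repro")
      .master("local[*]")
      .getOrCreate()

    val documentAssembler = new DocumentAssembler()
      .setInputCol("text")
      .setOutputCol("document")

    val tokenizer = new Tokenizer()
      .setInputCols(Array("document"))
      .setOutputCol("token")

    // Downloading/loading the pretrained model deserializes the saved
    // AveragedPerceptron; a Spark/Scala version mismatch between the build
    // and the saved object fails here with java.io.InvalidClassException.
    val posTagger = PerceptronModel.pretrained("pos_anc", "en")
      .setInputCols(Array("document", "token"))
      .setOutputCol("pos")

    val pipeline = new Pipeline()
      .setStages(Array(documentAssembler, tokenizer, posTagger))

    val df = spark.createDataFrame(Seq((1, "Spark NLP tags parts of speech."))).toDF("id", "text")
    pipeline.fit(df).transform(df).select("pos.result").show(false)

    spark.stop()
  }
}
```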