Use of GenericSerde results in "Method too large" error during scalac compilation #785
I am using `com.sksamuel.avro4s.kafka.GenericSerde`. Here is a simple code snippet that throws the error below. Only UnitTerminalVisit is a nested case class; UnitFcyVisit, InvUnit, and ArgoCarrierVisit are plain case classes whose fields are primitive types or Options of primitive types.
I even tried giving explicit schemas for the intermediate case classes, but I still get the same error.
Same for me. Bump it.
@sabujkumarjena To overcome the issue you have two choices:
Rewrite this:
into
Rewrite this:
into
These approaches also work well together; you do not have to choose only one of them, so feel free to use both at the same time.
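The two "rewrite this / into" snippets above were lost from the post. As a hedged illustration of the first choice, splitting one oversized method into smaller ones (the JVM rejects any single method whose bytecode exceeds 64 KB, which is what "Method too large" means), here is a minimal, self-contained Scala sketch. The names `joinStep1`, `joinStep2`, and `buildTopology` are hypothetical, and plain `Map`s stand in for KTables:

```scala
// Hypothetical sketch: Maps stand in for KTables so the example runs standalone.
// The fix for "Method too large" is to split one huge method into small helpers,
// so no single emitted method exceeds the JVM's 64 KB bytecode limit.
object TopologySketch {
  // Each join step lives in its own def, keeping every emitted method small.
  def joinStep1(left: Map[Long, String], right: Map[Long, Int]): Map[Long, (String, Int)] =
    left.flatMap { case (k, v) => right.get(k).map(r => k -> (v, r)) }

  def joinStep2(joined: Map[Long, (String, Int)],
                extra: Map[Long, Boolean]): Map[Long, (String, Int, Boolean)] =
    joined.flatMap { case (k, (a, b)) => extra.get(k).map(c => k -> (a, b, c)) }

  // The top-level method is now just a composition of small steps.
  def buildTopology(l: Map[Long, String],
                    r: Map[Long, Int],
                    e: Map[Long, Boolean]): Map[Long, (String, Int, Boolean)] =
    joinStep2(joinStep1(l, r), e)
}
```

The same decomposition applies to a Kafka Streams topology: each join or mapping stage gets its own `def`, and the top-level builder merely chains them.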
@vkorchik I tried out your suggestion and was able to fix the issue. It works, but the models have become a much bigger hassle: I wrapped my various case classes into separate objects.
Is there a better approach, or a newer implementation I am not aware of? cc: @sabujkumarjena
@anmol-hpe, what do you mean by "much bigger hassle"?
@vkorchik Consider that I have over 80 tables in my database 'X', each with 30-40 fields on average. From what I understood, I will have to wrap each table separately in a different object; kindly correct me if my understanding is incorrect. I tried this with a smaller number of tables and it worked fine. Basically, I am trying to create Avro schemas from such case classes. Are there any constraints in this scenario that I am not aware of?
@anmol-hpe It depends on what you mean by 'table'. The idea is pretty simple: if you see this error (class/method too large), you have too many things defined in one scope. By "scope" I mostly mean some object, but it could also be a class, method, etc. Take a look at my example above. Analyze your types and find the relations between your classes; for the big and repeated ones, define their own Decoder/Encoder/SchemaFor instances.
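The advice above (define dedicated instances for big, repeated types) can be sketched as follows. This is a hedged, self-contained illustration: avro4s is not assumed available here, so a tiny stand-in `Schema` typeclass replaces avro4s's `SchemaFor`/`Encoder`/`Decoder`, and all names are hypothetical. The pattern is what matters: derive each instance once, cache it as an `implicit val` in the companion object, and let call sites summon it instead of re-deriving it inline.

```scala
// Stand-in typeclass (hypothetical): in real code this would be avro4s's
// SchemaFor/Encoder/Decoder. Derive once, cache in the companion, reuse.
trait Schema[T] { def describe: String }

object Schema {
  def of[T](name: String): Schema[T] = new Schema[T] { def describe: String = name }
}

final case class InvUnit(id: String)
final case class UnitFcyVisit(unit: InvUnit, gkey: Long)

object InvUnit {
  // Cached in the companion object: call sites just summon it, so the
  // enclosing methods stay far below the 64 KB bytecode limit.
  implicit val schema: Schema[InvUnit] = Schema.of[InvUnit]("InvUnit")
}

object UnitFcyVisit {
  implicit val schema: Schema[UnitFcyVisit] = Schema.of[UnitFcyVisit]("UnitFcyVisit")
}

object CachedSchemas {
  // Resolved from the companion objects; nothing is derived inline here.
  def describeBoth(implicit a: Schema[InvUnit], b: Schema[UnitFcyVisit]): String =
    s"${a.describe}, ${b.describe}"
}
```

With avro4s, the analogous move would be declaring `implicit val` instances of `SchemaFor`, `Encoder`, and `Decoder` (or a prebuilt `GenericSerde`) in each case class's companion object, rather than letting derivation expand inside one large method.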
Yes, I agree. Okay, got it. Thanks, I will work in this direction.
```scala
import com.sksamuel.avro4s.kafka.GenericSerde
import org.apache.kafka.streams.kstream.{Materialized, ValueJoiner}
import org.apache.kafka.streams.scala.kstream.KTable

object Test {
  def valueJoiner2
      : ValueJoiner[(UnitFcyVisit, InvUnit), ArgoCarrierVisit, (UnitFcyVisit, InvUnit, ArgoCarrierVisit)] =
    (value1: (UnitFcyVisit, InvUnit), value2: ArgoCarrierVisit) => (value1._1, value1._2, value2)

  def joinUfvWithIuwithAcv(ufvWithIu: KTable[Key, (UnitFcyVisit, InvUnit)],
                           acv: KTable[Key, ArgoCarrierVisit]) = {
    ufvWithIu
      .join(
        acv,
        (ufv: (UnitFcyVisit, InvUnit)) => Key(gkey = ufv._1.intend_ob_cv),
        valueJoiner2,
        Materialized.`with`(
          new GenericSerde[Key],
          new GenericSerde[(UnitFcyVisit, InvUnit, ArgoCarrierVisit)]
        )
      )
  }
}
```
Getting the following error during compilation:

```
scalac: Error while emitting Test$
Method too large: Test$.joinUfvWithIuwithAcv (Lorg/apache/kafka/streams/scala/kstream/KTable;Lorg/apache/kafka/streams/scala/kstream/KTable;)Lorg/apache/kafka/streams/scala/kstream/KTable;
```