Describe the bug
I have completed training and exported the model to JIT format, and I have also finished the LM training process; the TLG file generation appears to complete without error. When I use the runtime to decode, the following error occurs:
I0318 15:45:12.551067 162356 params.h:178] Reading torch model exp/conformer/final.zip
I0318 15:45:12.608501 162356 torch_asr_model.cc:35] Num intra-op threads: 1
terminate called after throwing an instance of 'torch::jit::ErrorReport'
what():
Unknown builtin op: aten::scaled_dot_product_attention.
Here are some suggestions:
aten::_scaled_dot_product_attention
The original call is:
File "code/torch/wenet/transformer/attention.py", line 85
dropout_rate = self.dropout_rate
d_k2 = self.d_k
output = torch.scaled_dot_product_attention(q_with_bias_u, k1, v1, mask1, dropout_rate, False, scale=torch.div(1, torch.sqrt(d_k2)))
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ <--- HERE
_11 = torch.contiguous(torch.transpose(output, 1, 2))
_12 = torch.size(query, 0)
Environment:
OS: Ubuntu 22.04.4 LTS
pytorch: 2.1.2
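For context, `aten::scaled_dot_product_attention` became a public op in PyTorch 2.0, so this error usually means the libtorch linked into the runtime is older than the PyTorch 2.1.2 that exported the model and cannot resolve the fused op. As a hedged reference (not WeNet's actual code), here is a pure-NumPy sketch of what the op computes, which can be useful for verifying a manual attention fallback against the fused kernel:

```python
# Hypothetical NumPy reference for aten::scaled_dot_product_attention:
# softmax(q @ k^T * scale, masked) @ v. Shapes are (batch, head, time, d_k).
import numpy as np

def scaled_dot_product_attention(q, k, v, mask=None, scale=None):
    d_k = q.shape[-1]
    if scale is None:
        scale = 1.0 / np.sqrt(d_k)  # default scaling, matching the traceback's 1/sqrt(d_k)
    scores = (q @ k.transpose(0, 1, 3, 2)) * scale  # (batch, head, time1, time2)
    if mask is not None:
        scores = np.where(mask, scores, -np.inf)  # masked positions get zero weight
    scores = scores - scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)  # softmax over time2
    return weights @ v

q = np.random.randn(2, 4, 5, 8)
k = np.random.randn(2, 4, 5, 8)
v = np.random.randn(2, 4, 5, 8)
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (2, 4, 5, 8)
```

The more direct fix, though, is to make the export-side PyTorch and the runtime libtorch the same major/minor version, rather than reimplementing the op.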
After updating libtorch to 2.1.0, I encountered a new issue:
java.lang.UnsatisfiedLinkError: dlopen failed: cannot locate symbol "_ZN5torch3jit4loadERKNSt6__ndk112basic_stringIcNS1_11char_traitsIcEENS1_9allocatorIcEEEEN3c108optionalINSA_6DeviceEEE" referenced by "/data/app/~~pa2p0KLjRWIUyu0rN2c0iA==/com.mobvoi.wenet-uzM7GQgNLvXkqgSNM7KkPg==/lib/arm64/libwenet.so"... @Deleter-D @Mddct
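The mangled symbol in that dlopen error can be demangled to see exactly which `torch::jit::load` overload libwenet.so expects, which is a quick way to confirm an ABI mismatch between the libtorch the .so was built against and the libtorch bundled in the APK. A sketch using binutils' `c++filt` (assuming it is on PATH):

```shell
# Demangle the missing symbol from the dlopen error; the output shows the
# torch::jit::load(std::string const&, c10::optional<c10::Device>) signature
# that libwenet.so was linked against.
echo '_ZN5torch3jit4loadERKNSt6__ndk112basic_stringIcNS1_11char_traitsIcEENS1_9allocatorIcEEEEN3c108optionalINSA_6DeviceEEE' | c++filt
```

If the demangled signature is absent from the bundled libtorch (check with `nm -D --defined-only`), the Android build and the libtorch .so versions need to be aligned.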