Is there anything to pay attention to during the back translation process #95

Open
HuipengXu opened this issue Aug 8, 2020 · 0 comments

Comments

@HuipengXu
Is there a strict procedure for back translation? I used fairseq's en<=>de pre-trained transformer models to generate back-translation data for training UDA, but I can't get good results with it. Using your prepared back-translation data does give good results.
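One detail that often matters here: the UDA paper generates paraphrases by *sampling* from the translation models at a tuned temperature rather than using beam search, which keeps the augmented sentences diverse. A minimal sketch of that round trip, assuming translator objects with a fairseq-hub-style `translate(sentence, **kwargs)` method (the model names in the comment are illustrative, not a confirmed recipe from this repo):

```python
def back_translate(sentences, en2de, de2en, temperature=0.9):
    """Paraphrase English sentences via an en -> de -> en round trip.

    Sampling at a moderate temperature (instead of beam search) is what
    keeps the paraphrases diverse enough for consistency training.
    """
    augmented = []
    for s in sentences:
        # Forward pass into German, sampled rather than beam-decoded.
        de = en2de.translate(s, sampling=True, temperature=temperature)
        # Back into English; this is the augmented example.
        back = de2en.translate(de, sampling=True, temperature=temperature)
        augmented.append(back)
    return augmented

# In practice the translators could be fairseq hub models, e.g.:
#   import torch
#   en2de = torch.hub.load('pytorch/fairseq', 'transformer.wmt19.en-de',
#                          tokenizer='moses', bpe='fastbpe')
#   de2en = torch.hub.load('pytorch/fairseq', 'transformer.wmt19.de-en',
#                          tokenizer='moses', bpe='fastbpe')
# (check the fairseq model zoo for the exact checkpoint names).
```

If the augmented sentences look nearly identical to the originals, raising the sampling temperature is the usual first knob to try; greedy or low-temperature decoding tends to collapse the paraphrases back onto the input.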
