
xtts/tokenizer: merge duplicate implementations of preprocess_text #3170

Merged: 1 commit merged into coqui-ai:dev on Nov 9, 2023

Conversation

@akx (Contributor) commented on Nov 8, 2023

This was found with ruff:

> F811 Redefinition of unused `preprocess_text` from line 570

I merged the two lists of language codes (one list was missing Hungarian) and re-included the Korean check that was missing from one copy of the function; a sketch of the pattern follows.
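For context, here is a minimal hypothetical sketch of what F811 flags and how the merge resolves it. The language sets, function body, and Korean normalization step are placeholders, not the actual coqui-ai/TTS code; only the shape of the fix (union the language lists, keep the Korean branch) reflects the PR.

```python
# Hypothetical sketch (not the actual coqui-ai/TTS code) of the F811
# pattern and the fix.

# Before: two module-level definitions of the same name. Python binds
# the name to whichever definition comes last at import time, so the
# first copy, including its extra checks, is silently dead code.
LANGS_A = {"en", "es", "fr", "de", "hu"}  # first copy: has Hungarian
LANGS_B = {"en", "es", "fr", "de", "ko"}  # second copy: has the Korean path

# After: a single definition over the union of the two language lists,
# with the Korean branch re-included from the shadowed copy.
SUPPORTED_LANGS = LANGS_A | LANGS_B  # "hu" and "ko" both survive the merge

def preprocess_text(txt: str, lang: str) -> str:
    if lang not in SUPPORTED_LANGS:
        raise NotImplementedError(f"Language '{lang}' is not supported.")
    if lang == "ko":
        # Placeholder for the Korean-specific handling restored by the PR.
        txt = txt.replace("\u200b", "")
    return txt.strip()
```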

@erogol erogol merged commit a8e9163 into coqui-ai:dev Nov 9, 2023