
Update implementation of Tokenizable for Vec<u8> #377

Merged: 4 commits, Sep 24, 2020

Conversation

ducthotran2010 (Contributor):

The contract call could return uint8[] instead of bytes, but the Vec<u8> implementation only supports bytes.

Token::FixedBytes(data) => Ok(data),
other => Err(Error::InvalidOutputType(format!("Expected `bytes`, got {:?}", other))),
Token::Bytes(data) | Token::FixedBytes(data) => Ok(data),
Token::Array(tokens) | Token::FixedArray(tokens) => {
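A minimal, self-contained sketch of the decoding change being proposed (using a simplified stand-in for ethabi's `Token` enum so the snippet compiles without the crate — the real `Token` uses `U256` for `Uint` and lives in ethabi):

```rust
// Simplified stand-in for ethabi's Token enum (illustration only).
#[derive(Debug)]
enum Token {
    Bytes(Vec<u8>),
    FixedBytes(Vec<u8>),
    Array(Vec<Token>),
    FixedArray(Vec<Token>),
    Uint(u64), // ethabi uses U256; u64 keeps this sketch dependency-free
}

// Decode a Vec<u8> from either `bytes`/`bytesN` or a `uint8[]`-style array.
fn vec_u8_from_token(token: Token) -> Result<Vec<u8>, String> {
    match token {
        Token::Bytes(data) | Token::FixedBytes(data) => Ok(data),
        Token::Array(tokens) | Token::FixedArray(tokens) => tokens
            .into_iter()
            .map(|t| match t {
                Token::Uint(v) if v <= u8::MAX as u64 => Ok(v as u8),
                other => Err(format!("Expected `uint8`, got {:?}", other)),
            })
            .collect(),
        other => Err(format!("Expected `bytes`, got {:?}", other)),
    }
}

fn main() {
    // `bytes` still decodes as before.
    assert_eq!(vec_u8_from_token(Token::Bytes(vec![1, 2, 3])), Ok(vec![1, 2, 3]));
    // `uint8[]` now decodes too.
    let array = Token::Array(vec![Token::Uint(4), Token::Uint(5)]);
    assert_eq!(vec_u8_from_token(array), Ok(vec![4, 5]));
}
```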
tomusdrw (Owner):

The problem with this is that you will decode Vec<u8> from Token::Array, but encoding (into_tokens) will always return Token::Bytes, which is quite misleading.

The split is done on purpose currently (also a bit due to Rust limitations). I think the best way forward would be to introduce a type that looks like this:

pub struct BytesArray(pub Vec<u8>);
impl Tokenizable for BytesArray {
    fn from_token(token: Token) -> Result<Self, Error> {
        match token {
            Token::FixedArray(tokens) | Token::Array(tokens) => {
                Ok(Self(tokens.into_iter().map(Tokenizable::from_token).collect::<Result<Vec<u8>, Error>>()?))
            }
            other => Err(Error::InvalidOutputType(format!("Expected `Array`, got {:?}", other))),
        }
    }

    fn into_token(self) -> Token {
        Token::Array(self.0.into_iter().map(Tokenizable::into_token).collect())
    }
}

Unless Token::Bytes and Token::Array have the same ethabi encoding and there is no confusion here (I really don't remember).

Also please add tests.
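A hedged sketch of what such tests might look like, built on a minimal `BytesArray` mirroring the suggestion above (again with a simplified `Token` stand-in rather than ethabi's real types, so the snippet is self-contained):

```rust
// Simplified stand-in for ethabi's Token (illustration only; ethabi's Uint is U256).
#[derive(Debug, PartialEq)]
enum Token {
    Bytes(Vec<u8>),
    Array(Vec<Token>),
    Uint(u64),
}

// Minimal BytesArray wrapper following the suggested design.
#[derive(Debug, PartialEq)]
struct BytesArray(Vec<u8>);

impl BytesArray {
    fn from_token(token: Token) -> Result<Self, String> {
        match token {
            Token::Array(tokens) => tokens
                .into_iter()
                .map(|t| match t {
                    Token::Uint(v) if v <= u8::MAX as u64 => Ok(v as u8),
                    other => Err(format!("Expected `uint8`, got {:?}", other)),
                })
                .collect::<Result<Vec<u8>, _>>()
                .map(BytesArray),
            other => Err(format!("Expected `Array`, got {:?}", other)),
        }
    }

    fn into_token(self) -> Token {
        Token::Array(self.0.into_iter().map(|b| Token::Uint(b as u64)).collect())
    }
}

fn main() {
    // Round-trip: decoding then re-encoding preserves the token shape.
    let token = Token::Array(vec![Token::Uint(1), Token::Uint(2)]);
    let decoded = BytesArray::from_token(token).unwrap();
    assert_eq!(decoded, BytesArray(vec![1, 2]));
    assert_eq!(
        decoded.into_token(),
        Token::Array(vec![Token::Uint(1), Token::Uint(2)])
    );

    // Decoding from a non-array token should fail.
    assert!(BytesArray::from_token(Token::Bytes(vec![1])).is_err());
}
```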

@tomusdrw (Owner) left a comment:

Awesome, thanks! Tiny doc improvement.

@tomusdrw tomusdrw merged commit 023f139 into tomusdrw:master Sep 24, 2020
@tomusdrw (Owner): Thank you!

@ducthotran2010 ducthotran2010 deleted the tokens branch September 25, 2020 14:26