This repository was archived by the owner on Jan 22, 2025. It is now read-only.

added alt_bn syscalls #27961

Merged

merged 8 commits on Nov 21, 2022

Conversation

ananas-block
Contributor

@ananas-block ananas-block commented Sep 21, 2022

Compatibility

This is an updated version of a previous pull request (#26629).
Credit for the previous pull request is due to @ai-trepalin.
Problem

Computing elliptic curve operations over the bn256 (also known as bn254 or bn128) curve is very expensive on Solana. The following curve operations enable, for example, efficient verification of zero-knowledge proofs within a single transaction.

Additionally, contracts written in the Solidity language cannot work in Solana if they contain calls to the following precompiled contracts:

bn256Add — Performs point addition on the elliptic curve.
bn256ScalarMult — Performs scalar multiplication on the elliptic curve.
bn256Pairing — Performs elliptic curve pairing checks, enabling zkSNARK verification within the block gas limit.

https://github.com/ethereum/EIPs/blob/master/EIPS/eip-196.md
https://github.com/ethereum/EIPs/blob/master/EIPS/eip-197.md

The Neon EVM requires the implementation of system calls in Solana for these contracts.
Also, Light Protocol requires the implementation of these syscalls in Solana for more efficient zero-knowledge proof verification.
Solution

In order for the precompiled contracts to be used in Solana, it is proposed to implement syscalls inside the Solana core, i.e., an implementation similar to the existing ecrecover (secp256k1_recover) implementation.
Summary of Changes

This pull request adds an implementation of the bn256 (alt_bn128) precompiles.
About compute budget costs

Add and ScalarMult execute in constant time, so they are charged flat costs.
Execution time for the pairing implementation follows a linear equation:
pairingCU = firstPairingExecutionTime + (numberOfPairings - 1) * otherPairingsExecutionTime

This equation computes the compute units charged (implementation).

The equation follows from the arkworks product-of-pairings implementation, which executes a Miller loop for every pairing but only one final exponentiation.

In Ethereum, gas costs for the pairing operation are calculated similarly.

Possible CU costs based on 33 ns execution time per CU:

Addition: 334 CU
Multiplication: 3,840 CU
Pairing: 36,364 CU + (numberOfPairings - 1) * 12,121 CU

Compute units for one G1 multiplication: 126.71 us * 1,000 / 33 (ns per CU) = 3,840 CU
Compute units for the first pairing: 1.2 ms * 1,000,000 / 33 (ns per CU) = 36,364 CU
Compute units for one additional pairing: 0.4 ms * 1,000,000 / 33 (ns per CU) = 12,121 CU
(Both 1.2ms and 0.4ms have been selected as upper bounds with buffer based on the bench results below.)
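As a sanity check, the charged pairing cost from the formula above can be sketched in Rust with saturating integer arithmetic (the constant names below are illustrative; the real implementation reads these values from the ComputeBudget struct):

```rust
// Illustrative constants taken from the CU numbers in this PR description.
const PAIRING_ONE_PAIR_COST_FIRST: u64 = 36_364;
const PAIRING_ONE_PAIR_COST_OTHER: u64 = 12_121;

// pairingCU = firstPairingExecutionTime + (numberOfPairings - 1) * otherPairingsExecutionTime
fn pairing_cost(number_of_pairings: u64) -> u64 {
    PAIRING_ONE_PAIR_COST_FIRST.saturating_add(
        PAIRING_ONE_PAIR_COST_OTHER.saturating_mul(number_of_pairings.saturating_sub(1)),
    )
}

fn main() {
    assert_eq!(pairing_cost(1), 36_364);
    assert_eq!(pairing_cost(2), 48_485); // 36,364 + 12,121
    println!("{}", pairing_cost(5));
}
```

Saturating arithmetic mirrors the style used elsewhere in the syscall cost code and avoids overflow panics on adversarial input sizes.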

Bench Results:
Benchmarks were conducted on an AWS c6a.2xlarge instance; see the detailed bench results and bench implementation.

Summary:
G1 addition time: [4.1247 us 4.1553 us 4.1903 us]
Found 6 outliers among 100 measurements (6.00%) 1 (1.00%) low severe 2 (2.00%) high mild 3 (3.00%) high severe

G1 mul time: [126.64 us 126.68 us 126.71 us]
Found 8 outliers among 100 measurements (8.00%) 1 (1.00%) low severe 3 (3.00%) low mild 3 (3.00%) high mild 1 (1.00%) high severe

2 pairings time: [1.3630 ms 1.3718 ms 1.3839 ms]
Found 6 outliers among 100 measurements (6.00%) 2 (2.00%) high mild 4 (4.00%) high severe

3 pairings time: [1.6807 ms 1.6819 ms 1.6835 ms]
Found 6 outliers among 100 measurements (6.00%) 1 (1.00%) high mild 5 (5.00%) high severe

4 pairings time: [1.9977 ms 1.9980 ms 1.9983 ms]
Found 3 outliers among 100 measurements (3.00%) 3 (3.00%) high mild

5 pairings time: [2.3224 ms 2.3227 ms 2.3230 ms]
Found 3 outliers among 100 measurements (3.00%) 3 (3.00%) high mild

Fixes #

@mergify mergify bot added the community Community contribution label Sep 21, 2022
@mergify mergify bot requested a review from a team September 21, 2022 05:32
@jackcmay
Contributor

@dmakarov do you know why this PR is failing test-stable-bpf due to exceeding the build dependency cap?

@dmakarov
Contributor

> @dmakarov do you know why this PR is failing test-stable-bpf due to exceeding the build dependency cap?

It looks like mem, sanity, and simulation (at least) recompile solana-program redundantly. Probably some of the crates that ark-* depend on were imported with different features than those specified by the ark-* crates. I would have to investigate which dependencies trigger recompilation of solana-program when mem is built after 128bit to know which dependency features need to be synchronized.

@ananas-block
Contributor Author

ananas-block commented Sep 22, 2022

> @dmakarov do you know why this PR is failing test-stable-bpf due to exceeding the build dependency cap?
> 
> It looks like mem, sanity, and simulation (at least) recompile solana-program redundantly. Probably some of the crates that ark-* depend on were imported with different features than those specified by the ark-* crates. I would have to investigate which dependencies trigger recompilation of solana-program when mem is built after 128bit to know which dependency features need to be synchronized.

Thanks! It looks like all three (mem, sanity, and simulation) import solana-program-runtime, solana-program-test, and solana-sdk as dev dependencies; maybe these trigger the rebuild?

Dependencies within the ark_* crates look consistent at first glance, but some crates are imported with different features than solana-program uses:
zeroize:

  • solana_program: zeroize = { version = "1.3", default-features = true, features = ["zeroize_derive"] }
  • ark_* : zeroize = { version = "1", default-features = false, features = ["zeroize_derive"] }

num-traits:

  • solana_program: num-traits = { version = "0.2", default-features = true, features = ["i128", "std"] }
  • ark_* : num-traits = { version = "0.2", default-features = false }

By the way, itertools is imported twice: once in [dependencies] and once in [target.'cfg(not(target_os = "solana"))'.dependencies].

I will try whether importing the ark_* crates with default-features = false helps, compile them one by one, and adjust the imports in the ark_* crates.
Please let me know if you have other ideas on how to investigate this.

@ananas-block
Contributor Author

@dmakarov, @jackcmay, I got the test to run by adapting the dependency versions that caused the recompile. This led to forking the ark-* libraries, so the PR currently imports the ark-* libraries from a GitHub repo.
(It feels silly to publish a new crate with only some minor dependency versions changed.)
ark-ec = { version = "=0.3.0", default-features = false, git="https://github.com/Lightprotocol/algebra" , tag="alt_bn128" }

Does this import work for you?

Otherwise, maybe it would be cleaner to just bump the allowed number of recompiles back up to 10. I saw that the number was decreased from 10 to 3 in this commit ten days ago.

Now looking into the timeout of the downstream projects; if you have any idea regarding this, please let me know.

@dmakarov
Contributor

> @dmakarov, @jackcmay, I got the test to run by adapting the dependency versions that caused the recompile. This led to forking the ark-* libraries, so the PR currently imports the ark-* libraries from a GitHub repo. (It feels silly to publish a new crate with only some minor dependency versions changed.) ark-ec = { version = "=0.3.0", default-features = false, git="https://github.com/Lightprotocol/algebra" , tag="alt_bn128" }
> 
> Does this import work for you?
> 
> Otherwise, maybe it would be cleaner to just bump the allowed number of recompiles back up to 10. I saw that the number was decreased from 10 to 3 in this commit ten days ago.
> 
> Now looking into the timeout of the downstream projects; if you have any idea regarding this, please let me know.

Let me experiment with the dependencies a little and maybe I'll be able to keep the rebuilds at the minimum without having to modify ark-* crates. I'll let you know about my results.

The timeout does happen sometimes. I restarted the job, let's see if it passes.

@ananas-block
Contributor Author

> @dmakarov, @jackcmay, I got the test to run by adapting the dependency versions that caused the recompile. This led to forking the ark-* libraries, so the PR currently imports the ark-* libraries from a GitHub repo. (It feels silly to publish a new crate with only some minor dependency versions changed.) ark-ec = { version = "=0.3.0", default-features = false, git="https://github.com/Lightprotocol/algebra" , tag="alt_bn128" }
> Does this import work for you?
> Otherwise, maybe it would be cleaner to just bump the allowed number of recompiles back up to 10. I saw that the number was decreased from 10 to 3 in this commit ten days ago.
> Now looking into the timeout of the downstream projects; if you have any idea regarding this, please let me know.

> Let me experiment with the dependencies a little and maybe I'll be able to keep the rebuilds at the minimum without having to modify the ark-* crates. I'll let you know about my results.
> 
> The timeout does happen sometimes. I restarted the job, let's see if it passes.

Sounds good. That would be the best way of course, thanks!

@ananas-block
Contributor Author

> Let me experiment with the dependencies a little and maybe I'll be able to keep the rebuilds at the minimum without having to modify the ark-* crates. I'll let you know about my results.
> 
> The timeout does happen sometimes. I restarted the job, let's see if it passes.

@dmakarov did you get a chance to experiment yet?
I tried, among other things, bumping the rand dependency in solana-program from 0.7.0 to 0.8.0, which is the version used in ark-std, a dependency of ark-ff and ark-ec.
This created a mess for further dependencies relying on rand 0.7.0, such as libsecp256k1 and many Solana crates (solana-metrics, etc.) that solana-program-runtime, a dev-dependency of solana-bpf-rust-mem, depends on.

@dmakarov
Contributor

dmakarov commented Oct 5, 2022

> Let me experiment with the dependencies a little and maybe I'll be able to keep the rebuilds at the minimum without having to modify the ark-* crates. I'll let you know about my results.
> The timeout does happen sometimes. I restarted the job, let's see if it passes.

> @dmakarov did you get a chance to experiment yet? I tried, among other things, bumping the rand dependency in solana-program from 0.7.0 to 0.8.0, which is the version used in ark-std, a dependency of ark-ff and ark-ec. This created a mess for further dependencies relying on rand 0.7.0, such as libsecp256k1 and many Solana crates (solana-metrics, etc.) that solana-program-runtime, a dev-dependency of solana-bpf-rust-mem, depends on.

Yes, still working on it. These changes reduce the number of dependencies with different fingerprints, but not all of them:

 modified   sdk/program/Cargo.toml
@@ -22,6 +22,7 @@ itertools =  "0.10.1"
 lazy_static = "1.4.0"
 log = "0.4.17"
 memoffset = "0.6"
+num-bigint = { version = "0.4.3", default-features = true, features = ["std"] }
 num-derive = "0.3"
 num-traits = { version = "0.2", default-features = true, features = ["i128", "std"] }
 rustversion = "1.0.7"
@@ -47,8 +48,9 @@ curve25519-dalek = { version = "3.2.1", features = ["serde"] }
 itertools = "0.10.1"
 libc = { version = "0.2.126", features = ["extra_traits"] }
 libsecp256k1 = "0.6.0"
-rand = "0.7"
+rand = { version = "0.8.5", default-features = true, features = ["alloc", "getrandom", "libc", "rand_chacha", "small_rng", "std", "std_rng"] }
 rand_chacha = { version = "0.2.2", default-features = true, features = ["simd", "std"] }
+syn = { version = "0.15.44", default-features = true, features = ["clone-impls", "derive", "parsing", "printing", "proc-macro", "quote"] }
 tiny-bip39 = "0.8.2"
 wasm-bindgen = "0.2"
 zeroize = { version = "1.3", default-features = true, features = ["zeroize_derive"] }

Although the syn change above has no effect.

It's a bit of a nuisance to find which features cause the fingerprint differences. I still see these differences when mem is built after 128bit has been built:

[2022-10-04T23:49:05Z DEBUG cargo::core::compiler::fingerprint] fingerprint at: /home/dmakarov/work/git/solana/programs/bpf/target/debug/.fingerprint/solana-program-3ce19421c7551b3a/lib-solana-program
[2022-10-04T23:49:05Z DEBUG cargo::core::compiler::fingerprint] features ["default"]
[2022-10-04T23:49:05Z INFO  cargo::core::compiler::fingerprint] fingerprint error for solana-program v1.15.0 (/home/dmakarov/work/git/solana/sdk/program)/Build/TargetInner { ..: lib_target("solana-program", ["cdylib", "rlib"], "/home/dmakarov/work/git/solana/sdk/program/src/lib.rs", Edition2021) }
[2022-10-04T23:49:05Z INFO  cargo::core::compiler::fingerprint]     err: unit dependency information changed
    
    Caused by:
        new (ark_bn254/35805929985d436d) != old (ark_bn254/d3b2d35cdfbef3f9)
[2022-10-04T23:49:05Z DEBUG cargo::core::compiler::fingerprint] fingerprint at: /home/dmakarov/work/git/solana/programs/bpf/target/debug/.fingerprint/ark-bn254-3e93dbf7f904b191/lib-ark-bn254
[2022-10-04T23:49:05Z DEBUG cargo::core::compiler::fingerprint] features ["curve", "default", "scalar_field"]
[2022-10-04T23:49:05Z INFO  cargo::core::compiler::fingerprint] fingerprint error for ark-bn254 v0.3.0/Build/TargetInner { ..: lib_target("ark-bn254", ["lib"], "/home/dmakarov/.cargo/registry/src/github.heygears.com-1ecc6299db9ec823/ark-bn254-0.3.0/src/lib.rs", Edition2018) }
[2022-10-04T23:49:05Z INFO  cargo::core::compiler::fingerprint]     err: failed to read `/home/dmakarov/work/git/solana/programs/bpf/target/debug/.fingerprint/ark-bn254-3e93dbf7f904b191/lib-ark-bn254`
    
    Caused by:
        No such file or directory (os error 2)
[2022-10-04T23:49:05Z DEBUG cargo::core::compiler::fingerprint] fingerprint at: /home/dmakarov/work/git/solana/programs/bpf/target/debug/.fingerprint/ark-ec-79d329c9daf1e0d4/lib-ark-ec
[2022-10-04T23:49:05Z DEBUG cargo::core::compiler::fingerprint] features ["default"]
[2022-10-04T23:49:05Z INFO  cargo::core::compiler::fingerprint] fingerprint error for ark-ec v0.3.0/Build/TargetInner { ..: lib_target("ark-ec", ["lib"], "/home/dmakarov/.cargo/registry/src/github.heygears.com-1ecc6299db9ec823/ark-ec-0.3.0/src/lib.rs", Edition2018) }
[2022-10-04T23:49:05Z INFO  cargo::core::compiler::fingerprint]     err: failed to read `/home/dmakarov/work/git/solana/programs/bpf/target/debug/.fingerprint/ark-ec-79d329c9daf1e0d4/lib-ark-ec`
    
    Caused by:
        No such file or directory (os error 2)
[2022-10-04T23:49:05Z DEBUG cargo::core::compiler::fingerprint] fingerprint at: /home/dmakarov/work/git/solana/programs/bpf/target/debug/.fingerprint/ark-ff-8441c49b805bc4a3/lib-ark-ff
[2022-10-04T23:49:05Z DEBUG cargo::core::compiler::fingerprint] features ["default"]
[2022-10-04T23:49:05Z INFO  cargo::core::compiler::fingerprint] fingerprint error for ark-ff v0.3.0/Build/TargetInner { ..: lib_target("ark-ff", ["lib"], "/home/dmakarov/.cargo/registry/src/github.heygears.com-1ecc6299db9ec823/ark-ff-0.3.0/src/lib.rs", Edition2018) }
[2022-10-04T23:49:05Z INFO  cargo::core::compiler::fingerprint]     err: failed to read `/home/dmakarov/work/git/solana/programs/bpf/target/debug/.fingerprint/ark-ff-8441c49b805bc4a3/lib-ark-ff`
    
    Caused by:
        No such file or directory (os error 2)

I need to modify cargo a little more to dump additional information and see which deps cause the fingerprint changes.

@ananas-block
Contributor Author

ananas-block commented Oct 5, 2022

Great, thanks!
In case it helps, here is a summary of what I changed in ark-* to get rid of the recompiles:
num-bigint = { version = "0.4", default-features = false }
num-traits = { version = "0.2", default-features = false }
rand = { version = "0.8", default-features = false, features = ["std_rng"] }
to:
num-bigint = { version = "0.4"}
num-traits = { version = "0.2"}
rand = { version = "0.7"}

see:
Lightprotocol/ark-std@ca3c427
arkworks-rs/algebra@master...Lightprotocol:algebra:alt_bn128_precompiles

@dmakarov
Contributor

dmakarov commented Oct 6, 2022

There's some cargo weirdness that I can't resolve. When ark-ff-macros is built as a dependency of the 128bit crate, the num-bigint crate is included with an empty set of features, which is in line with the ark-ff-macros Cargo.toml file. However, when ark-ff-macros is built as a dependency of the mem crate, num-bigint is included with a different set of features that contradicts the specification in ark-ff-macros' Cargo.toml. No matter what I do, cargo changes the set of features inexplicably.

This is the tree of ark-ff-macros dependencies with features when 128bit crate is being built

[2022-10-06T19:07:24Z DEBUG cargo::core::compiler::fingerprint] fingerprint at: /home/dmakarov/work/git/solana/programs/bpf/target/debug/.fingerprint/ark-ff-macros-004e5844e60a692c/lib-ark-ff-macros fingerprint features: [], deps:
      name syn fp features: ["clone-impls", "default", "derive", "extra-traits", "fold", "full", "parsing", "printing", "proc-macro", "quote", "visit", "visit-mut"], deps:
        name proc_macro2 fp features: ["default", "proc-macro"], deps:
          name build_script_build fp features: , deps:
            name build_script_build fp features: ["default", "proc-macro"], deps:
          name unicode_ident fp features: [], deps:
        name build_script_build fp features: , deps:
          name build_script_build fp features: ["clone-impls", "default", "derive", "extra-traits", "fold", "full", "parsing", "printing", "proc-macro", "quote", "visit", "visit-mut"], deps:
        name unicode_ident fp features: [], deps:
        name quote fp features: ["default", "proc-macro"], deps:
          name proc_macro2 fp features: ["default", "proc-macro"], deps:
            name build_script_build fp features: , deps:
              name build_script_build fp features: ["default", "proc-macro"], deps:
            name unicode_ident fp features: [], deps:
      name num_traits fp features: ["i128"], deps:
        name build_script_build fp features: , deps:
          name build_script_build fp features: ["i128"], deps:
            name autocfg fp features: [], deps:
      name num_bigint fp features: [], deps:
        name num_traits fp features: ["i128"], deps:
          name build_script_build fp features: , deps:
            name build_script_build fp features: ["i128"], deps:
              name autocfg fp features: [], deps:
        name build_script_build fp features: , deps:
          name build_script_build fp features: [], deps:
            name autocfg fp features: [], deps:
        name num_integer fp features: ["i128"], deps:
          name num_traits fp features: ["i128"], deps:
            name build_script_build fp features: , deps:
              name build_script_build fp features: ["i128"], deps:
                name autocfg fp features: [], deps:
          name build_script_build fp features: , deps:
            name build_script_build fp features: ["i128"], deps:
              name autocfg fp features: [], deps:
      name quote fp features: ["default", "proc-macro"], deps:
        name proc_macro2 fp features: ["default", "proc-macro"], deps:
          name build_script_build fp features: , deps:
            name build_script_build fp features: ["default", "proc-macro"], deps:
          name unicode_ident fp features: [], deps:

And this is the tree of ark-ff-macros dependencies when mem is built. The difference in features of some dependencies (num-bigint, for example) causes the fingerprints to differ and triggers a rebuild of solana-program, which we are trying to avoid.

[2022-10-06T21:14:01Z DEBUG cargo::core::compiler::fingerprint] fingerprint at: /home/dmakarov/work/git/solana/programs/bpf/target/debug/.fingerprint/ark-ff-macros-4c203acaa427e435/lib-ark-ff-macros fingerprint features: [], deps:
      name syn fp features: ["clone-impls", "default", "derive", "extra-traits", "fold", "full", "parsing", "printing", "proc-macro", "quote", "visit", "visit-mut"], deps:
        name proc_macro2 fp features: ["default", "proc-macro"], deps:
          name build_script_build fp features: , deps:
            name build_script_build fp features: ["default", "proc-macro"], deps:
          name unicode_ident fp features: [], deps:
        name build_script_build fp features: , deps:
          name build_script_build fp features: ["clone-impls", "default", "derive", "extra-traits", "fold", "full", "parsing", "printing", "proc-macro", "quote", "visit", "visit-mut"], deps:
        name unicode_ident fp features: [], deps:
        name quote fp features: ["default", "proc-macro"], deps:
          name proc_macro2 fp features: ["default", "proc-macro"], deps:
            name build_script_build fp features: , deps:
              name build_script_build fp features: ["default", "proc-macro"], deps:
            name unicode_ident fp features: [], deps:
      name num_traits fp features: ["default", "i128", "std"], deps:
        name build_script_build fp features: , deps:
          name build_script_build fp features: ["default", "i128", "std"], deps:
            name autocfg fp features: [], deps:
      name num_bigint fp features: ["default", "std"], deps:
        name num_traits fp features: ["default", "i128", "std"], deps:
          name build_script_build fp features: , deps:
            name build_script_build fp features: ["default", "i128", "std"], deps:
              name autocfg fp features: [], deps:
        name build_script_build fp features: , deps:
          name build_script_build fp features: ["default", "std"], deps:
            name autocfg fp features: [], deps:
        name num_integer fp features: ["i128", "std"], deps:
          name num_traits fp features: ["default", "i128", "std"], deps:
            name build_script_build fp features: , deps:
              name build_script_build fp features: ["default", "i128", "std"], deps:
                name autocfg fp features: [], deps:
          name build_script_build fp features: , deps:
            name build_script_build fp features: ["i128", "std"], deps:
              name autocfg fp features: [], deps:
      name quote fp features: ["default", "proc-macro"], deps:
        name proc_macro2 fp features: ["default", "proc-macro"], deps:
          name build_script_build fp features: , deps:
            name build_script_build fp features: ["default", "proc-macro"], deps:
          name unicode_ident fp features: [], deps:

We probably don't want solana-program's Cargo.toml to depend on GitHub repositories, so for now I'm willing to relax the CI check that counts the number of solana-program rebuilds. Could you please rebase your PR onto the most recent master, remove the latest commit that changes the ark-* dependencies to GitHub repositories, and add a commit that raises the solana_program_count upper bound here

if ((solana_program_count > 4)); then
so that the stable_bpf CI check passes?

@ananas-block
Contributor Author

Ok, thank you! I will revert the git imports, rebase, and adapt the CI test accordingly.

@ananas-block ananas-block force-pushed the alt_bn128_precompiles branch 3 times, most recently from 0657c1e to e719986 Compare October 9, 2022 15:33
@ananas-block
Contributor Author

ananas-block commented Oct 9, 2022

The changes are implemented; all tests passed prior to squashing the commits and pushing again.
The tests didn't run again, I guess because the final commit didn't change, but everything should work.
Just to be sure, @dmakarov, @jackcmay, could you start the tests again?

Then I will mark the PR as ready for review.

@ananas-block ananas-block force-pushed the alt_bn128_precompiles branch 2 times, most recently from 9468682 to 84f9e86 Compare October 10, 2022 18:17
@dmakarov dmakarov added the CI Pull Request is ready to enter CI label Oct 10, 2022
@solana-grimes solana-grimes removed the CI Pull Request is ready to enter CI label Oct 10, 2022
@ananas-block ananas-block marked this pull request as ready for review October 11, 2022 16:17
@ananas-block
Contributor Author

@Lichtso @alessandrod, cc: @jackcmay, following up on this. This PR targets master, as discussed, and was previously at #27659.
Could you take a look?
Once the PR contents are reviewed, I can resolve the conflicts and we can get this merged.

@@ -2,4 +2,4 @@

here="$(dirname "$0")"
set -x
-exec cargo run --manifest-path="$here"/Cargo.toml --bin solana-test-validator -- "$@"
+exec cargo run --release --manifest-path="$here"/Cargo.toml --bin solana-test-validator -- "$@"
Contributor

We should take this out and only test locally in release mode.

Contributor Author

@ananas-block ananas-block Oct 21, 2022

Thanks! Ok, I will remove this. I just tested it again, and the local test-validator built from source works fine without release mode.

Before I push, is the PR ready to be merged apart from this?

Contributor Author

done

@Lichtso
Contributor

Lichtso commented Oct 27, 2022

We have three different syscalls with the same signature; can we maybe combine them into one with an enum to select the operation? There are still two unused parameter slots available to do so.

@ananas-block
Contributor Author

ananas-block commented Oct 27, 2022

> We have three different syscalls with the same signature; can we maybe combine them into one with an enum to select the operation? There are still two unused parameter slots available to do so.

Thanks for the feedback, makes sense. I merged the three syscalls into one using const u64s (ADD, MUL, PAIRING) and a match, similarly to the curve25519 syscalls.

Constants are easier to use than an enum because the declare syscall macro expects u64 in the empty slots; otherwise I would need to convert every time. ADD and MUL are compatible with the constants implemented in the curve_syscall traits. I didn't import those because it would incur a circular dependency.

What do you think?

@Lichtso
Contributor

Lichtso commented Oct 27, 2022

Yes, great.
Can you prefix the constants with ALT_BN128_ and also merge the return paths of all three into one?

@Lichtso
Contributor

Lichtso commented Oct 28, 2022

What I meant was merging the code in programs/bpf_loader/src/syscalls/mod.rs into something like:

let budget = invoke_context.get_compute_budget();

let (cost, output_len, calculation) = match group_op {
    ALT_BN128_ADD => (
        budget.alt_bn128_addition_cost,
        ALT_BN128_ADDITION_OUTPUT_LEN,
        alt_bn128_addition,
    ),
    ALT_BN128_MUL => (
        budget.alt_bn128_multiplication_cost,
        ALT_BN128_MULTIPLICATION_OUTPUT_LEN,
        alt_bn128_multiplication,
    ),
    ALT_BN128_PAIRING => {
        let ele_len = input_size.saturating_div(ALT_BN128_PAIRING_ELEMENT_LEN as u64);
        let cost = budget
            .alt_bn128_pairing_one_pair_cost_first
            .saturating_add(
                budget
                    .alt_bn128_pairing_one_pair_cost_other
                    .saturating_mul(ele_len.saturating_sub(1)),
            )
            .saturating_add(budget.sha256_base_cost)
            .saturating_add(input_size)
            .saturating_add(ALT_BN128_PAIRING_OUTPUT_LEN as u64);
        (cost, ALT_BN128_PAIRING_OUTPUT_LEN, alt_bn128_pairing)
    },
};
invoke_context.get_compute_meter().consume(cost)?;

let input = translate_slice::<u8>(
    memory_mapping,
    input_addr,
    input_size,
    invoke_context.get_check_aligned(),
    invoke_context.get_check_size(),
)?;

let call_result = translate_slice_mut::<u8>(
    memory_mapping,
    result_addr,
    output_len as u64,
    invoke_context.get_check_aligned(),
    invoke_context.get_check_size(),
)?;

let result_point = match calculation(input) {
    Ok(result_point) => result_point,
    Err(e) => {
        return Ok(e.into());
    }
};

if result_point.len() != output_len {
    return Ok(AltBn128Error::SliceOutOfBounds.into());
}

call_result.copy_from_slice(&result_point);
Ok(SUCCESS)

@ananas-block
Contributor Author

Done; essentially I used your example, just boxed the passed function to get around incompatible return types in the match arms.

type DynAltBnFunction = Box<dyn for<'r> Fn(&'r [u8]) -> Result<Vec<u8>, AltBn128Error>>;
let (cost, output, calculation): (u64, usize, DynAltBnFunction) = match group_op {

What do you think?

@Lichtso
Contributor

Lichtso commented Oct 28, 2022

I don't think dyn and Box are necessary.
What happens if you elide the type like so: let (cost, output, calculation): (u64, usize, _) =

On the playground this at least compiles.

@ananas-block
Contributor Author

Weird, apparently it works if you only return the function, but not a tuple.
Adding (1u64, 2usize, a), the Rust playground throws the same error I get for the syscall.

error[E0308]: match arms have incompatible types
  --> src/main.rs:10:14
   |
8  |       let func = match data.len() {
   |  ________________-
9  | |         1 => (1u64, 2usize, a),
   | |              ----------------- this is found to be of type (u64, usize, for<'r> fn(&'r [u8]) -> Result<Vec<u8>, ()> {a})
10 | |         2 => (1u64, 2usize, b),
   | |              ^^^^^^^^^^^^^^^^^ expected fn item, found a different fn item
11 | |         _ => (1u64, 2usize, c),
12 | |     };
   | |_____- match arms have incompatible types
   |
   = note: expected tuple (_, _, for<'r> fn(&'r [u8]) -> Result<_, _> {a})
              found tuple (_, _, for<'r> fn(&'r [u8]) -> Result<_, _> {b})

For more information about this error, try rustc --explain E0308

It stays the same after adding let (cost, output, calculation): (u64, usize, _) =
Playground

Just adding dyn results in an unknown-size-at-compile-time error, which is solved with Box.
Playground
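For reference, a minimal sketch of a third option: annotating the tuple with an explicit fn-pointer type coerces each arm's distinct fn-item type to a single common type, avoiding both Box and the type-elision failure. Function names, costs, and output lengths here are illustrative placeholders, not the PR's actual code.

```rust
// Each Rust function has its own unique zero-sized "fn item" type, so match
// arms returning different functions fail to unify under type elision (`_`).
// An explicit fn-pointer type in the annotation coerces all arms to it.
fn a(_data: &[u8]) -> Result<Vec<u8>, ()> { Ok(vec![1]) }
fn b(_data: &[u8]) -> Result<Vec<u8>, ()> { Ok(vec![2]) }
fn c(_data: &[u8]) -> Result<Vec<u8>, ()> { Err(()) }

// Illustrative selector; the fn-pointer type in the return tuple is the key.
fn select(op: u64) -> (u64, usize, fn(&[u8]) -> Result<Vec<u8>, ()>) {
    match op {
        1 => (334, 64, a),      // fn item `a` coerces to the fn pointer
        2 => (3_840, 64, b),
        _ => (36_364, 32, c),
    }
}

fn main() {
    let (cost, output_len, calculation) = select(2);
    assert_eq!((cost, output_len), (3_840, 64));
    assert_eq!(calculation(&[]), Ok(vec![2]));
    println!("ok");
}
```

No dynamic allocation is involved: plain fn pointers are Copy and call through a single indirection, which is what the Box approach was trying to avoid.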

@Lichtso
Contributor

Lichtso commented Oct 29, 2022

Then just split it into two match expressions. That should be more efficient than using Box, as we should avoid dynamic allocation.

@ananas-block
Contributor Author

ananas-block commented Oct 29, 2022

Makes sense. Done, I split it into two and moved the second match to the code which uses the calculation.
What do you think?

@valiksinev
Contributor

I checked the ComputeBudget calculations and got performance that is about 10 times lower.
For example, on my laptop 1 CU = 54 ns, but the pairing test takes more than 10 times longer:
pairing 2 time: [12.849 ms 12.864 ms 12.881 ms]
pairing 3 time: [19.294 ms 19.310 ms 19.328 ms]
pairing 4 time: [25.853 ms 25.918 ms 25.986 ms]
pairing 5 time: [34.004 ms 34.338 ms 34.712 ms]
pairing 6 time: [39.683 ms 39.879 ms 40.083 ms]
pairing 7 time: [46.003 ms 46.247 ms 46.501 ms]
pairing 8 time: [52.640 ms 52.988 ms 53.369 ms]

Can you check your bench results?

@ananas-block
Contributor Author

ananas-block commented Nov 15, 2022

I just ran the benches myself and had the same issue.

Change the branch to alt_bn128_precompiles_arkworks; then you can replicate the results.

I changed the library to arkworks in July and did not update the default branch in the bench repo.

@ananas-block
Contributor Author

ananas-block commented Nov 16, 2022

> Then just split it in two match expressions. That should be more efficient than using Box as we should avoid dynamic allocation.

@Lichtso the 2 match expressions are implemented.
Do you have further suggestions for this or other parts of the PR?

}

fn convert_edianness_64(bytes: &[u8]) -> Vec<u8> {
let mut vec = Vec::new();
Contributor

Replace by .flatten().collect::<Vec<u8>>() ?

Contributor Author

Done, see 8bd12f4.
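One plausible reading of the suggested refactor, assuming the function reverses the byte order within each 64-byte chunk (a hypothetical sketch; the actual convert_edianness_64 in the PR may differ):

```rust
// Hypothetical sketch of the reviewer's .flatten().collect() suggestion:
// reverse the byte order of each 64-byte chunk without a manual push loop.
fn convert_endianness_64(bytes: &[u8]) -> Vec<u8> {
    bytes
        .chunks(64)
        .map(|chunk| chunk.iter().rev()) // reverse bytes within each chunk
        .flatten()
        .copied()
        .collect::<Vec<u8>>()
}

fn main() {
    let input: Vec<u8> = (0u8..128).collect();
    let out = convert_endianness_64(&input);
    assert_eq!(out[0], 63);   // first 64-byte chunk reversed
    assert_eq!(out[64], 127); // second 64-byte chunk reversed
    println!("ok");
}
```

This keeps the same output as the loop-and-push version while expressing the transformation as a single iterator chain.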

@Lichtso
Contributor

Lichtso commented Nov 17, 2022

Looking good, just needs a rebase onto master and we can restart the CI.

@ananas-block
Contributor Author

ananas-block commented Nov 19, 2022

> Looking good, just needs a rebase onto master and we can restart the CI.

Perfect, it is rebased. Feel free to restart the CI.

@Lichtso Lichtso merged commit afc3fee into solana-labs:master Nov 21, 2022
gnapoli23 pushed a commit to gnapoli23/solana that referenced this pull request Dec 16, 2022
* added alt_bn128_syscalls

* increased regression build redundancy to > 10
Labels
community Community contribution
Projects
None yet
Development

Successfully merging this pull request may close these issues.

6 participants