0xPolygonZero / plonky

Recursive SNARKs based on Plonk and Halo

Refactor existing modular arithmetic code

unzvfu opened this issue · comments

There is currently a fair amount of copy-pasta between the different base field implementations (e.g. here, here and here). This should be factored out, so as to more clearly and conveniently accommodate the coming implementations and specialisations.

NB: Need to be careful to maintain the performance advantages that some of those implementations have from inlining and specialisation.

This issue depends on the completion of #76 and #77.
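As a concrete illustration of the proposed factoring-out (the trait and function names below are hypothetical, not plonky's actual API), the per-field parameters could live in a small trait, with the shared limb arithmetic written once against it:

```rust
// Hypothetical shape of the factored-out code: per-field parameters live in
// a small trait, and the shared arithmetic is written once against it.
// Names here are illustrative, not plonky's actual API.
pub trait FieldParams {
    /// Little-endian limbs of the field order.
    const ORDER: [u64; 4];
}

/// Conditionally subtract the order once; written once instead of being
/// copy-pasted into each base field. Valid whenever x < 2 * ORDER.
pub fn rem_order<P: FieldParams>(mut x: [u64; 4]) -> [u64; 4] {
    if !lt(x, P::ORDER) {
        let mut borrow = 0u64;
        for i in 0..4 {
            let (d, b1) = x[i].overflowing_sub(P::ORDER[i]);
            let (d, b2) = d.overflowing_sub(borrow);
            x[i] = d;
            borrow = (b1 || b2) as u64;
        }
    }
    x
}

fn lt(a: [u64; 4], b: [u64; 4]) -> bool {
    for i in (0..4).rev() {
        if a[i] != b[i] {
            return a[i] < b[i];
        }
    }
    false
}
```

Each base field would then only supply its `ORDER` (and any field-specific overrides), rather than repeating the arithmetic.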

Agreed! I was imagining that those implementations would eventually diverge as we added field-specific assumptions, but it hasn't happened so far.

> NB: Need to be careful to maintain the performance advantages that some of those implementations have from inlining and specialisation.

That's a good point also. One small optimization that comes to mind is that ORDER[2] is zero for the Tweedle* curves. I assume the compiler is able to eliminate a mul and an add there, though I haven't verified that.
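To make the potential saving concrete, the mul and add in question would come from a reduction step shaped roughly like the following sketch (illustrative code with a made-up modulus, not the actual plonky implementation); with `ORDER[2]` a compile-time zero, the `i == 2` iteration contributes nothing:

```rust
// Illustrative reduction step with a made-up modulus: each limb of ORDER is
// multiplied by a quotient estimate q. If ORDER[2] is a compile-time zero,
// the compiler can in principle drop the mul and the add in that iteration,
// but this is worth verifying in the generated assembly.
const ORDER: [u64; 4] = [
    0x0000_0000_0000_0001,
    0x0000_0000_dead_beef,
    0,
    0x4000_0000_0000_0000,
];

fn mul_order_by(q: u64) -> [u64; 5] {
    let mut out = [0u64; 5];
    let mut carry = 0u64;
    for i in 0..4 {
        let t = (ORDER[i] as u128) * (q as u128) + (carry as u128);
        out[i] = t as u64; // low 64 bits
        carry = (t >> 64) as u64; // high 64 bits
    }
    out[4] = carry;
    out
}
```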

One thing I didn't realise until I started trying to do the refactoring is that const generics are only in Rust nightly, not beta or stable. AFAICT, const generics are necessary in order to write generic code that works with the type [T; N] where N is a parameter.
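For illustration, the kind of code in question looks like this sketch, which is generic over the limb count N (the function name is made up; at the time of this discussion it required nightly's const generics feature):

```rust
// Sketch of code that needs const generics: limb addition generic over the
// limb count N, so fields with different limb counts can share one
// implementation. This originally required #![feature(const_generics)].
fn add_with_carry<const N: usize>(a: [u64; N], b: [u64; N]) -> ([u64; N], bool) {
    let mut out = [0u64; N];
    let mut carry = false;
    for i in 0..N {
        let (s, c1) = a[i].overflowing_add(b[i]);
        let (s, c2) = s.overflowing_add(carry as u64);
        out[i] = s;
        carry = c1 || c2;
    }
    (out, carry)
}
```

With this, 4-limb and 6-limb fields can share a single implementation instead of each carrying its own copy.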

What's your policy on using features from nightly? I would usually be hesitant to jump ahead like that, but maybe you already use nightly for something else?

Hey Hamish, I used const generics for Plookup, also to get types of the form [T; N]. I think these types would fall under the MVP discussed here, so it's likely that they would be stabilized before we have to use Plonky in production.
We can wait for Daniel's confirmation but on my end I think it's safe to use them.

Thanks William, that's promising!

> That's a good point also. One small optimization that comes to mind is that ORDER[2] is zero for the Tweedle* curves. I assume the compiler is able to eliminate a mul and an add there, though I haven't verified that.

Yes, I would certainly hope the compiler could eliminate those instructions, but the only way to check is to spend some quality time at https://godbolt.org. I've been surprised many times by the number of things the compiler completely fails to optimise, so I definitely want to check all assumptions; this is now tracked in #75.

I agree with William -- seems reasonable to use const generics here if the alternative is a bunch of code duplication. Hopefully we can switch back to stable once that feature lands.

On a related note, we have some redundancy in the field tests, and some fields with very few tests. If you end up adding more tests that are sufficiently generic, it might be good to write them as macros that can be invoked for multiple fields. So far we have just one such test: test_square_root!.
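For reference, the macro approach looks roughly like this sketch (the trait and macro here are made up for illustration; `test_square_root!` in the repo is the real example):

```rust
// Sketch of the macro approach: stamp out the same check for several field
// types. Everything here (the Doubling trait, check_double!) is made up
// for illustration.
trait Doubling {
    fn double(x: u64) -> u64;
}

macro_rules! check_double {
    ($field:ty) => {{
        // In the real test suite this body would be generated inside a
        // #[test] fn, one per field.
        assert_eq!(<$field as Doubling>::double(3), 6);
    }};
}

struct FieldA;
impl Doubling for FieldA {
    fn double(x: u64) -> u64 {
        x.wrapping_add(x)
    }
}
```

Invoking `check_double!(FieldA);` once per field then covers each implementation with a single line.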

> If you end up adding more tests that are sufficiently generic, it might be good to write them as macros that can be invoked for multiple fields. So far we have just one such test: test_square_root!.

Possibly naïve question: Is it actually necessary to use a macro here, rather than using the usual support for generic types? I haven't used Rust macros before, but my experience (mainly from using various Lisps) is that macros are only necessary to adjust/circumvent the usual evaluation rules, which doesn't seem necessary here. All fields should share a (large) subset of their interface, and generic tests can be written in terms of that interface. Am I missing something?

> generic tests can be written in terms of that interface

I wrote that test a while back, and if I remember correctly, there's no way to write a test that's generic over trait implementations (though there's an RFC for it: rust-lang/rfcs#616).
Alternatively, I guess we could write generic functions like

pub fn square_root_works<F: Field>() -> bool {
    ...
}

in field.rs and have tests like

#[test]
fn test_square_root() {
    assert!(square_root_works::<TweedleBase>());
}

for each trait implementation. Is this what you're thinking about?

Yeah exactly -- calling a generic method would work, it'd just involve slightly more code in the impls. We could reduce it slightly by putting the assertions in the generic method, so in the impls we'd have

#[test]
fn test_square_root() {
    assert_square_root_works::<TweedleBase>();
}

Either approach sounds fine to me -- arguably the extra couple lines are justified since generic methods are a little easier to write and debug than macros.
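Concretely, the assertion-inside version might look like this sketch (the Field trait and its methods here are assumed stand-ins, not plonky's actual trait):

```rust
// Sketch of the assertion-inside helper. The Field trait is a stand-in
// with assumed methods, not plonky's actual trait.
trait Field: Copy + PartialEq + std::fmt::Debug {
    fn from_u64(x: u64) -> Self;
    fn mul(self, rhs: Self) -> Self;
    fn square_root(self) -> Option<Self>;
}

/// Generic test body, written once; each field's #[test] then becomes the
/// one-liner assert_square_root_works::<TweedleBase>();
fn assert_square_root_works<F: Field>() {
    for x in 0..10u64 {
        let x = F::from_u64(x);
        let sq = x.mul(x);
        // Every square must have a root, and that root must square back.
        let r = sq.square_root().expect("square has no square root");
        assert_eq!(r.mul(r), sq);
    }
}
```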

@wborgeaud Yes, that's what I had in mind.

> arguably the extra couple lines are justified since generic methods are a little easier to write and debug than macros.

That is my intuition. If either approach is fine, then I'll stick to the generic functions until the benefit of using macros is a little more persuasive.

The easy parts of this refactor have been covered in #80. There is more to do, but it will be followed up in #86.