eduherminio / AoCHelper

Helper .NET library for solving Advent of Code puzzles

Home Page: https://adventofcode.com


[Enhancement] Allow for higher resolution timings

tslater2006 opened this issue

Should be able to leverage Stopwatch's "ElapsedTicks" and TimeSpan.TicksPerMillisecond to get partial millisecond timings:

(screenshot of the proposed tick-based timing code)
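For context, a minimal sketch of what the suggestion boils down to (note: this sketch divides TimeSpan ticks, i.e. stopwatch.Elapsed.Ticks, by TimeSpan.TicksPerMillisecond; Stopwatch.ElapsedTicks itself is measured in Stopwatch.Frequency units, which are not guaranteed to match):

```csharp
using System;
using System.Diagnostics;

var stopwatch = Stopwatch.StartNew();

// Placeholder workload standing in for a puzzle solution.
long sum = 0;
for (var i = 0; i < 1_000; i++) sum += i;

stopwatch.Stop();

// TimeSpan ticks are 100 ns each, so this yields fractional milliseconds
// (e.g. 0.09 instead of the 0 that whole milliseconds would report).
double elapsedMs = stopwatch.Elapsed.Ticks / (double)TimeSpan.TicksPerMillisecond;

Console.WriteLine($"Solved in {elapsedMs:F2} ms (checksum: {sum})");
```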

The problem is that this 'quick and dirty' way of measuring performance will never be reliable.
If you run SolveAll() or Solve<T>() multiple times, you'll get different results for each run, even when you do it in a GitHub Action (using 'someone' else's computer).

Given that, how useful is it to have such high resolution (partial milliseconds) if you can potentially get a ±3 ms difference between runs?
IMHO not much; that's why Elapsed time should always be taken with a pinch of salt, and why I didn't originally implement it using those elapsed ticks.

Tools such as BenchmarkDotNet should be used if you really want to get into performance details, and even those measurements end up depending on the state of your machine when you run them. That's definitely the way to go if you want reliable performance information, which of course is not as fast to get as using the good old Stopwatch.
I'm using it myself in this year's code to compare different implementations/solutions for the same problem, in case you want to have a look.
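As a rough idea, a minimal BenchmarkDotNet sketch for comparing two implementations of the same day (the Day02 class and its methods are made up for illustration and are not part of AoCHelper):

```csharp
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

public class Day02Benchmarks
{
    [Benchmark(Baseline = true)]
    public long OriginalAlgorithm() => Day02.SolveOriginal();

    [Benchmark]
    public long AlternateAlgorithm() => Day02.SolveAlternate();
}

// Hypothetical solutions being compared.
public static class Day02
{
    public static long SolveOriginal() => /* ... */ 0;
    public static long SolveAlternate() => /* ... */ 0;
}

public class Program
{
    public static void Main() => BenchmarkRunner.Run<Day02Benchmarks>();
}
```

BenchmarkDotNet runs each method many times, handles warm-up and reports mean, error and standard deviation, which is what makes it far more trustworthy than a single Stopwatch reading.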

Elapsed milliseconds (instead of ticks) does exactly what the enhancement is suggesting; however, it returns a long instead of a double, so you lose some precision. It's not like going off ticks and dividing by TicksPerMillisecond is less reliable than what AoCHelper is using now: they are both sourced from the same info.

(screenshot of the Stopwatch source linked below)
https://github.com/microsoft/referencesource/blob/5697c29004a34d80acdaf5742d7e699022c64ecd/System/services/monitoring/system/diagnosticts/Stopwatch.cs#L124
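To illustrate the precision point (a sketch, not AoCHelper's code): both values come from the same underlying measurement, but the long truncates everything below a whole millisecond:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

var sw = Stopwatch.StartNew();
Thread.Sleep(0); // stand-in for a solution that finishes in well under 1 ms
sw.Stop();

long wholeMs = sw.ElapsedMilliseconds;              // e.g. 0
double fractionalMs = sw.Elapsed.TotalMilliseconds; // e.g. 0.0123

Console.WriteLine($"{wholeMs} ms vs {fractionalMs:F4} ms");
```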

That said, the suggestion was born out of wanting to see whether an alternate algorithm was faster for Day 2/3, and having it show a runtime of 0 ms wasn't helping determine that. Seeing 0.09 and 0.01 certainly did. So maybe: if the elapsed milliseconds are below some amount (5? 10?), show the fractional part; for longer runtimes, as the days progress, the extra precision becomes less useful, so whole elapsed milliseconds would be sufficient.

I do understand there is natural variability in timings, and I'm no stranger to running the process a few times to get the trend. If I can see that part 1 is consistently around 0.2 ms, and then after changes it seems to be consistently around 0.05 ms, that's useful information in my opinion.

You raised a very good point about lower and higher amounts. I'll give it proper thought and come back with some kind of proposal that involves more resolution for, at the very least, those 0 ms solutions.
From 10 ms, or even less, decimals are just noise to me; but let me consider it properly.

Option 1 ➡️ 2 decimal digits when <10 ms

Option 2 ➡️ 2 decimal digits when <1 ms

Both would solve the problem of 0 ms solutions and, by using ticks everywhere as suggested, would help get rid of some rounding errors.
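A rough sketch of what Option 2's formatting logic could look like (a hypothetical helper, not the actual AoCHelper implementation):

```csharp
using System;

Console.WriteLine(FormatElapsedTime(0.087)); // "0.09 ms"
Console.WriteLine(FormatElapsedTime(41.6));  // "42 ms"

// Two decimal digits only for sub-millisecond timings (Option 2);
// whole milliseconds otherwise.
static string FormatElapsedTime(double elapsedMilliseconds) =>
    elapsedMilliseconds < 1
        ? $"{elapsedMilliseconds:F2} ms"
        : $"{Math.Round(elapsedMilliseconds)} ms";
```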

Option 2 seems clearer to me, because of the 'decimal digit noise' mixed with the variability.
However, I acknowledge that this is a personal preference.

I'll make Option 2 the default configuration and add an option to override the default milliseconds format specifier, so that you can do Solver.ElapsedTimeFormatSpecifier = "F2"; and get two decimal digits everywhere, or any other formatting achievable via standard numeric format strings.
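A usage sketch, assuming the static Solver API referenced earlier in the thread and that the namespace matches the package name:

```csharp
using AoCHelper;

// Override the default (Option 2) behaviour: two decimal digits
// for every elapsed-time value, not only sub-millisecond ones.
Solver.ElapsedTimeFormatSpecifier = "F2";

Solver.SolveAll();
```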

How does that sound, @tslater2006?

Sounds good to me! Thanks for considering the request 👍

v0.12.1 released, including this enhancement.