I have relevant expertise in improving the performance of all kinds of algorithms, and C# is one of my favourite languages; my unmanaged-C# experience is limited but, as already explained, this shouldn't be a problem. Despite all that, I did run into some difficulties while working on this optimisation, including various changes whose effect on performance (i.e., shorter or longer execution time of a standalone application including the modified version of `ParseNumber`) was different from what I expected. On the other hand, this is a more or less logical consequence of the true motivation behind the current optimisation process: my previously-mentioned extra motivation to get a good enough result no matter what.
Below is a list of modifications to the original code which proved not to have a positive effect on performance.
The first idea that came to mind after seeing the code was to replace the bitwise tests on the options (e.g., `(options & NumberStyles.AllowLeadingWhite) != 0`) with variables, because their values never change. Bitwise operations are certainly quick, but wouldn't it be quicker still to read a variable rather than perform the same operation several times? This assumption proved wrong: creating a new (`NumberStyles` or even `Boolean`) variable is slower than the original version, where the bitwise operations are performed every time. Relying on variables updated in each iteration to store the state tests (e.g., `(state & StateSign) == 0`) isn't a good idea either.
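A minimal sketch of the kind of change I tried (the loop and the method names are simplified illustrations, not the actual `ParseNumber` body):

```csharp
using System.Globalization;

static class HoistingSketch
{
    // Original style: the mask test is re-evaluated at every use.
    static int SkipLeadingWhiteInline(string s, NumberStyles options)
    {
        int i = 0;
        while (i < s.Length
               && (options & NumberStyles.AllowLeadingWhite) != 0
               && char.IsWhiteSpace(s[i]))
        {
            i++;
        }
        return i;
    }

    // Hoisted style: the test is cached in a local Boolean once.
    // Despite looking cheaper on paper, this measured slower in my tests.
    static int SkipLeadingWhiteHoisted(string s, NumberStyles options)
    {
        bool allowLeadingWhite = (options & NumberStyles.AllowLeadingWhite) != 0;
        int i = 0;
        while (i < s.Length && allowLeadingWhite && char.IsWhiteSpace(s[i]))
        {
            i++;
        }
        return i;
    }
}
```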
I also tried the approach from the previous point with constants; for example, `const Int32 StateParens2 = -0x0003` to convert `state &= ~StateParens` into `state &= StateParens2`. This time, the worse performance was less surprising; the only chunks of code allowing such a change, the aforementioned one and `state |= StateSign | StateParens`, aren't too relevant (i.e., virtually no time is spent there in any input scenario).
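A sketch of the rewrite (the flag values are assumptions for illustration; in the real code they are private constants of the containing class). Since `StateParens` is itself a constant, the C# compiler folds `~StateParens` at compile time anyway, so no gain should be expected from spelling the value out by hand:

```csharp
using System;

static class StateConstantsSketch
{
    const Int32 StateSign = 0x0001;     // assumed value
    const Int32 StateParens = 0x0002;   // assumed value
    const Int32 StateParens2 = -0x0003; // == ~StateParens, written out by hand

    static void Demo()
    {
        Int32 state = StateSign | StateParens;
        state &= ~StateParens;  // original form: folded to a constant at compile time
        state |= StateParens;
        state &= StateParens2;  // rewritten form: compiles to the same mask
        Console.WriteLine(state);
    }
}
```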
The conclusions above do not extend to non-bitwise operations. For example, replacing all the occurrences of `bigNumber` with its equivalent expression `(sb != null)` is slower.
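A minimal sketch, assuming (as in the original code) that `bigNumber` is a `Boolean` set once from the `StringBuilder` reference; the surrounding loop is a simplified illustration:

```csharp
using System.Text;

static class BigNumberSketch
{
    static int AppendDigits(string s, StringBuilder sb)
    {
        bool bigNumber = sb != null; // computed once up front
        int digits = 0;
        foreach (char ch in s)
        {
            if (ch < '0' || ch > '9')
            {
                break;
            }
            if (bigNumber)      // reading the local Boolean...
            // if (sb != null)  // ...beat re-evaluating the comparison in my tests
            {
                sb.Append(ch);
            }
            digits++;
        }
        return digits;
    }
}
```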
There was one specific modification which seemed to deliver better performance in the first tests, but which was finally proven inadequate (i.e., indifferent from a performance point of view and, consequently, not part of the modifications to be pulled). It consisted in replacing the `char` variables with their `int` equivalents, but only where they were involved in mathematical operations; for example, converting `ch >= '0' && ch <= '9'` into `ch >= 48 && ch <= 57`.
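A sketch of both forms (the method names are mine). Character literals are compile-time constants in C#, so both comparisons compile to the same numeric checks against 48 and 57, which is consistent with the change making no measurable difference:

```csharp
static class DigitCheckSketch
{
    // Original form, comparing against character literals.
    static bool IsAsciiDigitChars(char ch) => ch >= '0' && ch <= '9';

    // Rewritten form, comparing against the underlying numeric values.
    // Both methods produce the same comparisons in the emitted IL.
    static bool IsAsciiDigitInts(char ch) => ch >= 48 && ch <= 57;
}
```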