Martin Brown
Guest
On 06/09/2023 19:20, Joe Gwinn wrote:
On Wed, 6 Sep 2023 09:49:48 +0100, Martin Brown
<'''newspam'''@nonad.co.uk> wrote:
The example is topologically equivalent to real code: you merely have to
construct input data that will force execution down each binary choice
in turn at every level. Getting the absolute minimum number of test
vectors for full coverage is a much harder problem, but a good enough
solution is possible in most practical cases.
In practice, this is certainly pretty effective, but the proposed
requirement did not allow for such shortcuts, rendering the
requirement intractable - the Sun will blow up first.
Also, in practice we do a combination of random probing and fuzzing.
<https://en.wikipedia.org/wiki/Fuzzing>
One tactic I have adopted for testing numerical code is very similar.
Basically a biassed random number generator which creates test data for
the routines under test and then verifies the answers.
It is extra work to do both a solver and a verifier, but not that much,
and the verifier provides a basis for regression testing.
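Something along these lines, as a minimal Python sketch; the quadratic
solver, the bias choices and the tolerance are invented stand-ins for the
real routines under test:

import math
import random

def solve_quadratic(a, b, c):
    # Roots of a*x^2 + b*x + c = 0, assuming two real roots.  The
    # discriminant is clamped at zero because cancellation in b*b - 4*a*c
    # can push it fractionally negative for nearly equal roots.
    d = math.sqrt(max(b * b - 4.0 * a * c, 0.0))
    q = -0.5 * (b + math.copysign(d, b))   # larger-magnitude root first
    return q / a, (c / q if q != 0.0 else -b / (2.0 * a))

random.seed(1)
for _ in range(10_000):
    # Biased generator: work backwards from chosen roots, deliberately
    # favouring awkward cases (nearly equal roots, mixed magnitudes).
    r1 = random.choice([random.uniform(-1.0, 1.0), random.uniform(-1e8, 1e8)])
    r2 = r1 + random.choice([random.uniform(-1e-6, 1e-6),
                             random.uniform(-1e3, 1e3)])
    a, b, c = 1.0, -(r1 + r2), r1 * r2

    # Verifier: the residual at each returned root must be tiny compared
    # with the size of the terms that produced it.
    for x in solve_quadratic(a, b, c):
        residual = a * x * x + b * x + c
        scale = abs(a * x * x) + abs(b * x) + abs(c)
        assert abs(residual) <= 1e-12 * scale, (a, b, c, x, residual)

print("verifier happy on 10,000 biased random cases")

The generator works backwards from chosen roots, so the verifier only has
to check residuals; re-running the same script on every build is what
turns it into a regression test.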
In most software the computation being done may be very difficult, but
the inverse is often relatively easy by comparison. Finding all real
roots of a function f(x) = 0 to maximum accuracy is quite tricky, but
given a supposed root x0, computing the value f(x0) and the derivative
f'(x0) is easy. Then you can use Newton-Raphson to see whether the
correction f(x0)/f'(x0) is acceptably small; if not, rinse and repeat.
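For instance (a minimal sketch; the cubic and the tolerance are invented,
not taken from any particular solver):

import math

def f(x):
    return x**3 - 2.0 * x - 5.0   # classic NR test cubic; real root near 2.0945514815

def fprime(x):
    return 3.0 * x**2 - 2.0

def root_is_converged(x0, rel_tol=4e-15):
    # One Newton-Raphson correction; accept x0 only if that correction is
    # down at the level of a few ulps of x0 (otherwise apply it and retest).
    correction = f(x0) / fprime(x0)
    return abs(correction) <= rel_tol * max(abs(x0), 1.0)

print(root_is_converged(2.0945514815423266))   # True: correction is at rounding level
print(root_is_converged(2.09))                 # False: correction is about 5e-3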
Quite recently I found a new bug in a cubic solver that is as robust as
any on the planet. It required a very specific, near-exact combination
of three 64-bit parameters to create a catastrophic numeric cancellation
down a seldom-trodden path where the cubic equation has three real roots
and you want the one that it can't compute accurately. For most of these
problems we try very hard to have only one real root...
My initial reaction was that it was tested library code so it must be my
problem - until I traced into it and saw how it failed. It gives 8
instead of 16 sig fig in double precision for these pathological data.
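The failing path above involves a specific library cubic solver and a
specific triple of inputs that are not given, so the sketch below instead
shows the same cancellation mechanism on a deliberately simple quadratic,
together with the standard rewrite that avoids it:

import math

a, b, c = 1.0, -1.0e8, 1.0             # roots are roughly 1e8 and 1e-8

d = math.sqrt(b * b - 4.0 * a * c)

naive_small = (-b - d) / (2.0 * a)     # subtracts two nearly equal numbers
q = -0.5 * (b + math.copysign(d, b))   # larger-magnitude root, no cancellation
stable_small = c / q                   # small root from the product of the roots

print(f"naive  : {naive_small:.17g}")  # ~7.45e-09: barely the right order of magnitude
print(f"stable : {stable_small:.17g}") # ~1.0e-08, good to roughly full double precision

The naive form subtracts two numbers that agree in almost every digit, so
very little of the small root survives; recovering it from the product of
the roots sidesteps the subtraction entirely.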
--
Martin Brown