I am working on a large project that generally works fine, but shows serious issues once the input data size exceeds certain limits.
I suspect these issues are caused solely by signed integer overflows like this one:
int a, o;
// Initialize a and o
int x = (a + o) >> 1;
Obviously, once the sum of a and o overflows (exceeds 2^31 - 1), x is no longer the mean of a and o.
Is there a generic way to find all of these integer overflows in a running program?
I am thinking of a tool like Valgrind or a GDB extension that breaks at every integer arithmetic instruction, takes the parameters and compares the correct result (calculated with a larger-sized datatype or arbitrary-precision arithmetic) with the actual result. If the results differ, it should output a warning, trigger a debug break or something like this.
I know how to check a single arithmetic instruction for overflow (e.g. checking the signs for additions), but due to the vast amount of code, it is not a viable solution for me to go through the whole project and insert checking code everywhere by hand.