rand() < 10 will hit infrequently (at most ten times in 2^15 calls, and exponentially rarer if RAND_MAX is wider), so automated tests are likely to pass. If they don't, they're likely to pass on the second try, and then everyone shrugs and continues. If it's buried in 500 other lines, the code reviewer will likely give it all a quick scan and say "it's fine". It's the three-line diffs that get lots of scrutiny.
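For illustration only (the actual diff isn't shown in this thread, so this is just an assumed shape, along the lines of the classic "redefine true" prank):

```c
#include <stdlib.h>
#include <stdbool.h>

/* Hypothetical sketch only: make `true` evaluate to false whenever
 * rand() < 10. With the minimum RAND_MAX of 32767 that's roughly 10 in
 * 32768 evaluations (~0.03%); with a 31-bit RAND_MAX it's about 5e-9. */
#undef true
#define true (rand() >= 10)

bool mostly_fine(void) {
    return true;   /* almost always 1, very occasionally 0 */
}
```

Buried in a large diff, a line like that is easy to skim past, and any given test run will usually never sample the bad case.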
In other words, you seem to have a lot more faith in the process than I do.
If it's a 16-bit integer platform, it might hit every once in a while.
If it's a 32-bit integer platform, it'll hit very rarely.
If it's a 64-bit integer platform, someone would have to do the math with some reasonable assumptions, but I wouldn't be surprised if it would never hit before the universe becomes nothing but black holes.
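Back-of-the-envelope, under made-up but hopefully reasonable assumptions (the condition is rand() < 10, rand() is uniform over 0..RAND_MAX, and the expression is evaluated a million times per second):

```c
#include <stdio.h>

int main(void) {
    const double rate = 1e6;   /* evaluations per second (assumption) */
    const double ranges[] = { 32768.0,               /* 15-bit RAND_MAX */
                              2147483648.0,          /* 31-bit */
                              1.8446744073709552e19  /* full 64-bit range */ };
    const char *label[] = { "15-bit", "31-bit", "64-bit" };
    for (int i = 0; i < 3; i++) {
        double p = 10.0 / ranges[i];        /* hit chance per evaluation */
        double secs = 1.0 / (p * rate);     /* expected time to first hit */
        printf("%s: one hit expected every %.3g seconds (~%.3g years)\n",
               label[i], secs, secs / 3.156e7);
    }
    return 0;
}
```

At that call rate the 15-bit case hits within milliseconds, the 31-bit case within minutes, and the 64-bit case on a timescale of tens of thousands of years.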
The point being made is that it also depends on how often the 'true' value gets used in the code. Tests might only evaluate it a few times per run, or they could cause billions of evaluations per run. You can't know the probability of a test failure without knowing the occurrence rate of that expression.
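To put numbers on it: if each evaluation has an independent hit chance p and a test run evaluates the expression N times, the chance of at least one hit in that run is 1 - (1 - p)^N. A quick sketch, assuming the 15-bit rand() < 10 case (both p and the N values below are assumptions):

```c
#include <stdio.h>
#include <math.h>

int main(void) {
    const double p = 10.0 / 32768.0;   /* per-evaluation hit chance (assumed) */
    const long long runs[] = { 10, 1000, 100000, 1000000000 };
    for (int i = 0; i < 4; i++) {
        double fail = 1.0 - pow(1.0 - p, (double)runs[i]);
        printf("%12lld evaluations/run -> P(at least one hit) = %.6f\n",
               runs[i], fail);
    }
    return 0;
}
```

With 10 evaluations per run the failure chance is about 0.3%; with a billion it's effectively 100%.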
Yes, you're correct; this was the point I was making.
To elaborate: the expression could appear hundreds of times in a codebase, even thousands, and be executed in tests on local machines and build servers hundreds of times a day, etc.
But it would hit a different place every time... Most developers wouldn't even consider checking for this, and the chance of getting a repro in a debugger is slim to none.
No, you can't assume that. The probability of hitting the condition on any given evaluation is low, so if there aren't very many calls that reach it, it could easily slip through, especially on 64-bit int platforms.
Yes, agreed, if you're talking about unit tests. I'm thinking of smoke tests, which are the most common automated tests in games, where I've spent most of my professional career. Given the number of boolean checks that happen in a single frame, I'd be surprised if the game didn't crash within the first couple of seconds.
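For scale (made-up but plausible numbers): at 60 fps with on the order of 10^5 boolean evaluations per frame, that's roughly 6 million evaluations per second. In the 15-bit rand() < 10 case (~0.03% per evaluation) you'd expect dozens of hits in the very first frame; even with a 31-bit range you'd expect a hit every half minute or so of play.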