Dividing integer types - Are results predictable?
I have a 64-bit long that I want to round down to the nearest 10,000, so I am doing a simple:
long myLong = 123456789;
long rounded = (myLong / 10000) * 10000; // rounded = 123450000
This appears to do what I expect, but since I'm not 100% sure how integer division works internally, I'm slightly concerned that there may be situations where it doesn't behave as expected.
Will this still work at very large numbers / edge cases?
Yes, it will work, so long as no result, intermediate or otherwise, exceeds long.MaxValue. In this particular expression the division happens first, so the intermediate quotient is always smaller in magnitude than the original value, and multiplying it back can never exceed the original value either; there is no overflow risk here.
To be explicit about your constants, you could use the L suffix, e.g. 123456789L.
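To illustrate, here is a minimal C# sketch (assuming a plain console program) showing the result near the top of the long range, plus one caveat: integer division truncates toward zero, so negative values round toward zero rather than down.

using System;

class RoundDownExample
{
    static void Main()
    {
        // Dividing first and multiplying back rounds toward zero to the nearest 10,000.
        long myLong = 123456789L;
        long rounded = (myLong / 10000L) * 10000L;
        Console.WriteLine(rounded);                      // 123450000

        // Near the top of the range the intermediate quotient is small,
        // so multiplying back never exceeds long.MaxValue.
        long big = long.MaxValue;                        // 9223372036854775807
        Console.WriteLine((big / 10000L) * 10000L);      // 9223372036854770000

        // Caveat: for negative inputs this rounds toward zero, not down.
        long negative = -123456789L;
        Console.WriteLine((negative / 10000L) * 10000L); // -123450000
    }
}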
For checking calculations like this, may I suggest Pex from Microsoft ( http://research.microsoft.com/en-us/projects/pex/ ), which explores your code for edge cases and generates tests for them. Your example is clean-cut, but if you were building up lots of logic on operations you are unsure of, it's a great tool.