Minimum number of bits to represent a number

What is the most efficient way to find out how many bits are needed to represent an arbitrary int? For example, the number 30,000 is represented in binary as

111010100110000

so it needs 15 bits.


int v = 30000; // 32-bit word to find the log base 2 of
int r = 0;     // r will be lg(v)

while ((v >>= 1) != 0) // unroll for more speed...
    r++;

For more advanced methods, see here

Note that this computes the index of the leftmost set bit (14 for 30000). If you want the number of bits, just add 1.
