Spherical coordinates: range choice - best practices & performance

I am writing some code that deals with coordinate systems, geometry, and similar topics. I would like to know what is, in general, the most common/practical/efficient way of storing spherical coordinates, with respect to common calculations on them. Is it:

```
theta - [0, 180)
phi   - [0, 360)
```

or

```
theta - [-90, 90)
phi   - [-180, 180)
```

or something else?

(The ranges above are expressed in degrees for clarity, but I would normally store the values in radians for speed, since the standard math functions are implemented for radians.)

I know that from a mathematical aspect, it is completely irrelevant, but I am wondering if a certain choice would result in an easier or more efficient implementation.

A couple of thoughts:

1. The representations are indeed mathematically equivalent. Converting from one to the other costs a couple of floating-point additions (offsets of pi/2 and pi, respectively). The cost of those additions on common hardware pales in comparison to that of the trigonometric, inverse-trigonometric, multiplication/division, and square-root operations that dominate spherical geometry calculations.
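To make the conversion concrete, here is a minimal sketch (the function name and the wrapping strategy are my own choices, not from any particular library) that maps colatitude/azimuth in [0, pi] x [0, 2*pi) to latitude/longitude in [-pi/2, pi/2] x [-pi, pi):

```python
import math

def colat_az_to_lat_lon(theta, phi):
    """Convert colatitude/azimuth (theta in [0, pi], phi in [0, 2*pi))
    to latitude/longitude (lat in [-pi/2, pi/2], lon in [-pi, pi))."""
    lat = math.pi / 2 - theta                         # one subtraction
    lon = (phi + math.pi) % (2 * math.pi) - math.pi   # shift azimuth into [-pi, pi)
    return lat, lon
```

As noted above, these two operations are negligible next to the trig calls that typically follow them.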

2. There is a large body of literature on spherical geometry, developed over the years for navigation over the Earth. This literature typically uses the latitude/longitude convention of -90..+90 and -180..+180, respectively. To use the well-known formulas without conversion, you might want to stick with that convention.
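One such well-known formula is the haversine formula for great-circle distance, which assumes latitude/longitude inputs directly. A sketch (the Earth radius of 6371 km is an assumed mean value; inputs are in radians):

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius=6371.0):
    """Great-circle distance between two points given as latitude/longitude
    in radians, via the haversine formula. radius defaults to an assumed
    mean Earth radius in kilometres."""
    dlat = lat2 - lat1
    dlon = lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * radius * math.asin(math.sqrt(a))
```

With colatitude/azimuth storage you would have to convert before applying formulas like this; with latitude/longitude you can use them as written.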