The New York Times has an interesting article that explains the moment magnitude scale used to calibrate the strength of an earthquake. The logarithmic scale was proposed by Thomas C. Hanks and Hiroo Kanamori in 1979 and measures an earthquake's magnitude based on its seismic moment. It replaced the older Richter scale, which saturates around a magnitude of 8.0 and thus can't accurately measure larger earthquakes. The strongest earthquake ever measured is the 1960 Chilean event, rated between 9.4 and 9.6. It lasted for 10 minutes and had a maximum slip of 25 to 30 meters.
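For the curious, here's a minimal sketch in Python of how seismic moment maps to moment magnitude, using the standard Hanks-Kanamori relation; the example seismic moment is purely illustrative and not a figure from the article:

```python
import math

def moment_magnitude(seismic_moment_nm: float) -> float:
    """Moment magnitude (Mw) from seismic moment in newton-meters,
    via the Hanks-Kanamori relation: Mw = (2/3) * (log10(M0) - 9.1)."""
    return (2.0 / 3.0) * (math.log10(seismic_moment_nm) - 9.1)

# Illustrative value only: a seismic moment on the order of 10**23 N*m
# lands in the magnitude-9 range of the largest recorded earthquakes.
print(round(moment_magnitude(2.0e23), 1))  # ~9.5
```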
Seismologists register the intensity of an earthquake in units known as moment magnitude, which measure how much energy was released when the rocks along a fault moved during the quake.
The moment magnitude scale, as it is known, replaced one developed by an American seismologist, Charles Richter, that was used until the 1970s. The Richter scale was found to be inaccurate for very large earthquakes.
The moment magnitude scale can be hard for nonexperts to decipher. It is logarithmic, meaning that each whole number of magnitude represents about a 32-fold increase in the amount of energy released during a rupture.
So, for instance, an earthquake with a magnitude of 2.0 is not twice as strong as a quake with a magnitude of 1.0. Instead, the amplitude of shaking would be 10 times as great, and it would release 32 times as much energy.
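That arithmetic is easy to check yourself. A quick sketch (the function names and magnitudes are just for illustration):

```python
def amplitude_ratio(m1: float, m2: float) -> float:
    # Amplitude of shaking grows by a factor of 10 per whole magnitude unit.
    return 10 ** (m2 - m1)

def energy_ratio(m1: float, m2: float) -> float:
    # Energy released grows by a factor of 10**1.5 (about 32) per whole unit.
    return 10 ** (1.5 * (m2 - m1))

print(amplitude_ratio(1.0, 2.0))          # 10.0
print(round(energy_ratio(1.0, 2.0)))      # 32 -- roughly a 32-fold energy increase
```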