The meaning of magnitude

April 17, 2016

The two most common outputs of microseismic monitoring are the locations of events and their magnitudes. The magnitude of an event describes its strength.

You may be familiar with the “Richter scale,” originally developed in 1935 to describe the strength of medium-sized earthquakes (between magnitudes of 3.0 and 7.0) in California. The Richter scale uses the amplitude of a waveform recorded on a Wood-Anderson seismograph at a known distance from the source to calculate the strength of an event. Unfortunately, the Richter scale and many other magnitude scales that have been proposed have drawbacks. For one, the Richter scale saturates around magnitude 7.0, so all larger earthquakes are assigned a value of roughly 7.0 or less and their true size is underestimated. Also, the Richter scale only describes the maximum wave amplitude, and gives no indication of the total energy released by the event.
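To make the amplitude-based approach concrete, here is a minimal Python sketch of the local (Richter) magnitude calculation at its 100 km reference distance, where Richter anchored the scale so that a 1 mm Wood-Anderson amplitude corresponds to ML 3.0. The amplitude values are arbitrary examples; real calculations apply a tabulated distance correction, −log10 A0(Δ), for other source–receiver distances.

```python
import math

def local_magnitude_100km(amplitude_mm):
    """Richter local magnitude ML = log10(A) - log10(A0),
    evaluated at a 100 km epicentral distance where -log10(A0) = 3.0
    (a 1 mm Wood-Anderson amplitude gives ML = 3.0)."""
    return math.log10(amplitude_mm) + 3.0

print(local_magnitude_100km(1.0))    # 3.0
print(local_magnitude_100km(10.0))   # 4.0: ten times the amplitude -> +1 magnitude unit
```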

Moment magnitude (Mw) was introduced in 1979 by Hanks and Kanamori and has since become the most commonly used way of describing the size of a microseism. Moment magnitude measures the size of an event in terms of the energy it releases. Specifically, it relates to the amount of movement of rock (i.e. the distance of slip along a fault or fracture) and the area of the fault or fracture surface that slipped. Because moment magnitude describes something physical about the event, calculated values can be compared directly between events. Moment magnitude is also a more accurate scale for describing event size, since it does not saturate for large events the way the Richter scale does.
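For readers who want that relationship in concrete terms, the sketch below uses the standard definitions: the seismic moment M0 = μ · A · d (shear modulus × rupture area × average slip), and the Hanks–Kanamori conversion Mw = (2/3)·log10(M0) − 6.07 with M0 in newton-metres. The shear modulus, fracture area, and slip values are illustrative assumptions only, not measured data.

```python
import math

def seismic_moment(shear_modulus_pa, fault_area_m2, avg_slip_m):
    """Seismic moment M0 = mu * A * d, in newton-metres (N*m)."""
    return shear_modulus_pa * fault_area_m2 * avg_slip_m

def moment_magnitude(m0_newton_metres):
    """Hanks-Kanamori moment magnitude: Mw = (2/3) * log10(M0) - 6.07."""
    return (2.0 / 3.0) * math.log10(m0_newton_metres) - 6.07

# Illustrative (assumed) values for a small hydraulic-fracture microseism:
mu = 30e9      # shear modulus of rock, Pa (~30 GPa is typical for crustal rock)
area = 25.0    # slipped fracture area, m^2 (a 5 m x 5 m patch)
slip = 1e-4    # average slip, m (0.1 mm)

m0 = seismic_moment(mu, area, slip)
print(f"M0 = {m0:.2e} N*m, Mw = {moment_magnitude(m0):.2f}")  # Mw of roughly -0.8
```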

Moment Magnitude

Since magnitude scales are logarithmic, an increase of one magnitude unit corresponds to a tenfold increase in the wave amplitude recorded by a seismograph and roughly a 32-fold (10^1.5) increase in the energy released. In the image above, the areas of the circles are proportional to the energy released by events at moment magnitude +1 and moment magnitude +2.
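As a quick check of that factor, the energy ratio between two magnitudes follows directly from the logarithmic definition: radiated energy scales as 10^(1.5·Mw), so a one-unit step corresponds to 10^1.5 ≈ 31.6 times the energy. A short sketch, with arbitrary example magnitudes:

```python
def energy_ratio(mw_larger, mw_smaller):
    """Ratio of radiated energy between two events, assuming E ~ 10**(1.5 * Mw)."""
    return 10 ** (1.5 * (mw_larger - mw_smaller))

print(energy_ratio(2.0, 1.0))  # ~31.6: one magnitude unit -> ~32x the energy
print(energy_ratio(2.0, 0.0))  # ~1000: two magnitude units -> ~1000x the energy
```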
