
A power law is a statistical relationship in which the probability of observing a particular “size” of some event is inversely proportional to some power (exponent) of that size. Let y be the size of the event—say the Richter measurement of an earthquake, or the number of acres destroyed in a forest fire—then

Pr(y) ∝ y^α

and

Pr(y) = C · y^α

where C is a constant and α is the power law exponent. This relationship is scale invariant in the sense that the same α and C describe the probability of a particular magnitude of event no matter how large or small the size in question. Notice that, if Y is power law distributed, large events will be relatively rare, while small events will be relatively common. A feature of power laws is that

log(Pr(y)) = log(C) + α · log(y)

in which case a (log-log) plot of log(Pr(y)) versus log(y) will be a straight line. Consider Figure 1; here, some simulated power law data are displayed for which it is (approximately) true that Pr(y) = 0.5 · y^(−2.5). That is, C = 0.5 and α = −2.5. In Figure 2 we take logs of both axes and then impose a linear regression line. The two (logged) variables are typically very highly correlated; here the R^2 for the regression is 0.945.
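The simulation and log-log regression behind Figures 1 and 2 can be sketched in a few lines. This is not from the original entry; it is a minimal Python illustration, assuming a continuous power law with y_min = 1 and exponent 2.5 (sign convention aside), dyadic histogram bins, and an ordinary least squares fit on the logged bin densities.

```python
import math
import random

random.seed(0)
alpha = 2.5        # density f(y) proportional to y**(-alpha) for y >= 1
n = 100_000

# Inverse-transform sampling from the Pareto density f(y) = (alpha-1) * y**(-alpha):
ys = [(1.0 - random.random()) ** (-1.0 / (alpha - 1.0)) for _ in range(n)]

# Dyadic (factor-of-two) bins; estimate the density in each bin.
edges = [2.0 ** k for k in range(0, 15)]
xs, ds = [], []
for lo, hi in zip(edges, edges[1:]):
    count = sum(lo <= y < hi for y in ys)
    if count > 5:                                   # skip nearly empty bins
        xs.append(math.log(math.sqrt(lo * hi)))     # log of geometric bin midpoint
        ds.append(math.log(count / (n * (hi - lo))))  # log of empirical density

# Ordinary least squares for log(density) = intercept + slope * log(y);
# the slope estimates -alpha, i.e. it should come out near -2.5.
mx, md = sum(xs) / len(xs), sum(ds) / len(ds)
slope = sum((x - mx) * (d - md) for x, d in zip(xs, ds)) / \
        sum((x - mx) ** 2 for x in xs)
```

With dyadic bins the binning bias is a constant multiplicative factor, so it shifts the intercept but leaves the fitted slope (the exponent) unaffected.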

Power laws are regularly seen in the physical sciences, where there is a wide variety of extremely diverse examples, some of which have particular names; for example, the Stefan-Boltzmann law concerns energy radiation, and the inverse square law describes the gravitational attraction between objects. Such laws are fairly rare, and rarely discussed, in the social sciences. Power laws are also seen in complex systems models, which describe complicated networks of interacting elements. Examples include the U.S. power grid, a beehive, and an advanced industrial economy. The individual interactions within these networks—like bee-to-bee contact—may be impossible to measure and observe directly, so system-level observations are the norm. Successfully asserting power law behavior for a particular system requires a large amount of data (typically hundreds of observations) spanning a wide range of magnitudes (from very small events to huge ones), along with the careful ruling out of other candidate density functions such as the lognormal and various versions of the exponential. The estimation of α is usually performed via maximum likelihood, with goodness of fit to the power law distribution verified with a Kolmogorov-Smirnov test.
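The estimation step described above can be sketched as follows. This is a hedged illustration rather than the entry's own procedure: it uses the standard maximum-likelihood estimator for a continuous power law with known y_min (alpha_hat = 1 + n / Σ ln(y_i / y_min)) and computes the Kolmogorov-Smirnov distance by hand; the names fit_alpha and ks_distance are my own.

```python
import math
import random

def fit_alpha(ys, ymin=1.0):
    """Maximum-likelihood exponent: alpha_hat = 1 + n / sum(ln(y / ymin))."""
    tail = [y for y in ys if y >= ymin]
    return 1.0 + len(tail) / sum(math.log(y / ymin) for y in tail)

def ks_distance(ys, alpha, ymin=1.0):
    """Largest gap between the empirical CDF and the fitted power-law CDF."""
    tail = sorted(y for y in ys if y >= ymin)
    n = len(tail)
    d = 0.0
    for i, y in enumerate(tail):
        model = 1.0 - (y / ymin) ** (1.0 - alpha)   # fitted CDF at y
        d = max(d, abs((i + 1) / n - model), abs(i / n - model))
    return d

# Simulated data with true exponent 2.5 (y_min = 1), as in the figures:
random.seed(1)
sample = [(1.0 - random.random()) ** (-1.0 / 1.5) for _ in range(20_000)]

alpha_hat = fit_alpha(sample)   # should land close to 2.5
d = ks_distance(sample, alpha_hat)  # small d indicates a good fit
```

In practice the observed d would be compared against its distribution under the fitted model (for example, via bootstrap resampling) to obtain a goodness-of-fit p-value.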

A classic example discussed in political science was given by Lewis Fry Richardson, a mathematician active in the postwar period. Richardson considered international and domestic cases of “deadly quarrels” (wars, murders, etc.). Using a logarithmic scale (the magnitude of an event being the base-10 logarithm of its death toll), he classified the various events into bins of magnitude 3, 4, 5, 6, and 7. So, for example, World War II, with casualties in the tens of millions, was placed (with the Great War) into the magnitude-7 bin. He discovered a remarkable regularity: for every tenfold increase in the size of the event—for example, a move from bin 3 to bin 4, or from bin 6 to bin 7—the number of such events decreased by a factor of around 2.5 relative to the previous category. This result, with α = −2.5, has proved stable in the face of new data and has been extended to the study of terrorist acts.
