"Moore's Law" as originally stated is not nearly as important as a corollary of Moore's Law: how many computations per second you get per dollar of computing power. The Wikipedia article "FLOPS" (floating-point operations per second) contains data that can be graphed as shown below. Note that the y-axis is logarithmic, so a straight line means the cost in dollars per gigaflop of computing power is declining exponentially. A linear regression line through the data indicates that the cost per gigaflop has declined (and will continue to decline) by approximately 11 orders of magnitude over the 42 years from 1982 ($100 million per gigaflop) to 2024 ($0.001 per gigaflop). If the human brain performs approximately 10 petaflops (10 million gigaflops), then the 2024 cost of a computer performing the same number of calculations as a human brain will be approximately $10,000 (i.e., 10 million gigaflops times $0.001 per gigaflop).
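The back-of-envelope arithmetic above can be checked in a few lines. This is just a sketch using the figures quoted in the paragraph (1982 and 2024 cost-per-gigaflop endpoints, and the assumed 10-petaflop brain), not the regression itself:

```python
import math

# Endpoint costs quoted in the text, in dollars per gigaflop.
cost_1982 = 1e8    # $100 million per gigaflop in 1982
cost_2024 = 1e-3   # $0.001 per gigaflop in 2024
years = 2024 - 1982  # 42 years

# How many orders of magnitude the cost fell over those 42 years.
orders_of_magnitude = math.log10(cost_1982 / cost_2024)
print(round(orders_of_magnitude))  # 11

# Assumed brain throughput: 10 petaflops = 10 million gigaflops.
brain_gigaflops = 1e7

# Cost in 2024 of matching the brain's computational rate.
brain_cost_2024 = brain_gigaflops * cost_2024
print(brain_cost_2024)  # 10000.0 dollars
```

That is roughly a quarter of an order of magnitude in cost reduction per year, which is what makes the extrapolation so striking.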