January 13, 2003

A maximum of 600 years left for Moore's Law

This is good fun. A physicist computes the absolute limits on how much computation you could possibly get out of matter. Quantum theory, of course, bounds the information that *can* be stored in matter, since there are non-zero minimum energies to account for. It turns out that sustaining Moore's Law, the doubling in capacity of integrated circuits, for another 600 years would consume all available resources in the universe as part of the computation. Finiteness is a humbling thing.
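The arithmetic behind the claim is easy to sketch. Assuming (hypothetically, since the post doesn't state a period) an 18-month doubling time, 600 years means 400 doublings, a growth factor of roughly 10^120 over today's capacity:

```python
# Sketch: how many doublings Moore's Law implies over 600 years,
# assuming an 18-month doubling period (an assumption, not from the post).
DOUBLING_PERIOD_YEARS = 1.5
YEARS = 600

doublings = int(YEARS / DOUBLING_PERIOD_YEARS)  # 400 doublings
growth_factor = 2 ** doublings                  # roughly 2.6 * 10**120

print(f"{doublings} doublings -> capacity grows by a factor of {growth_factor:.2e}")
```

A factor on the order of 10^120 is the kind of number that runs up against estimates of the total computational capacity of the observable universe, which is the point of the paper.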
