Computers keep getting faster and faster, but eventually the speedups will come to an end.
In 1965, Intel co-founder Gordon E. Moore predicted that the number of transistors that could affordably be packed onto a computer chip would double every two years, increasing computer speed along the way. Over the decades since, this prediction, dubbed Moore's Law, has held roughly true. Two physicists from Boston University, Lev Levitin and Tommaso Toffoli, have calculated that if we keep progressing at roughly the pace of Moore's Law, computer speed will reach a hard limit within 75 to 80 years.
This limit is based on the smallest amount of time it takes a quantum computer to complete a quantum elementary operation, the most basic of computing tasks. Using this number, Levitin and Toffoli have figured out the fastest speed at which any computer can possibly operate, a fundamental limit similar to the one posed by the speed of light. Even so, a computer running at this limit would churn out ten quadrillion times more operations per second than the fastest processors of today, so by then we'll probably have video games beyond our wildest imaginations anyway (or be slaves in the Matrix).
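The 75-to-80-year figure is easy to sanity-check with back-of-envelope math: if speed doubles every fixed interval, the number of years to gain a factor of ten quadrillion (10^16) is just the doubling period times log2(10^16). A minimal sketch, assuming a 1.5-year performance-doubling period (my assumption; the article itself only cites the two-year transistor-doubling figure):

```python
import math

def years_to_limit(speedup_factor, doubling_period_years):
    """Years of steady exponential growth needed to multiply
    computing speed by speedup_factor."""
    doublings = math.log2(speedup_factor)
    return doublings * doubling_period_years

# The article puts the ultimate limit at ~10 quadrillion (1e16)
# times today's fastest processors.
print(years_to_limit(1e16, 1.5))  # ~80 years, matching the 75-80 year estimate
print(years_to_limit(1e16, 2.0))  # ~106 years with a strict two-year doubling
```

Note that the assumed doubling period matters a lot: the stricter two-year cadence pushes the limit out past a century, which is one reason estimates like this come with a range.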
Quantum computers still have to be figured out, though. They can apparently be rather unstable creatures, not tolerating even the smallest bit of "noise — a kink in a wire or a change in temperature," so significant technological barriers remain. Because of these barriers, some believe that Moore's Law will actually stall in around 20 years, pushing back the date at which we hit the computational speed limit.
Of course, the speed limit does not account for the fact that advances in human technology are only made when hidden sects of the government release fragments of alien technology harvested from various crash sites over the years. Once we actually get that good stuff, this limit may not apply.