In the late 80s and early 90s, a computer's speed was usually determined by the processor (CPU), which typically came from Intel or one of the x86 clones. Starting in the mid 1990s, memory became more affordable, and adding more memory would speed the computer up, until it reached a point where extra memory provided less and less benefit, what economists call "diminishing returns".
In the late 1990s, the video card became the new hype for playing all the graphically intense new video games. The previous $200 video card was no longer good enough; there were $400, $500, and even $600 video cards. And if that was not enough power, you could combine two or even three of these cards to increase performance.
Last week I upgraded one of my faster computers to play one of the newer games, StarCraft 2, without any slowdown. After upgrading the video card from 128MB to 1024MB, the memory from 1GB to 4GB, and even swapping the IDE hard drive for a newer SATA drive, there was barely any speed improvement.
It didn't make sense that there was no speed improvement, especially since the CPU is a 64-bit processor running the 64-bit version of Windows 7, until I noticed that the 4GB of memory I installed was still recognized as only 3.15GB. It could be that the motherboard maxes out at 4GB or does not truly support 64-bit memory addressing.
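One likely culprit (an assumption on my part, since it depends on the board): if the BIOS or chipset cannot remap memory above the 4GB boundary, the address space reserved for devices such as the video card is carved out of usable RAM, which would explain seeing roughly 3.15GB. A quick way to check how much physical memory Windows actually sees is to ask the OS directly. This is a minimal sketch using the standard GlobalMemoryStatusEx call via Python's ctypes; it only reports what Windows can address, not what is physically installed in the slots.

```python
import ctypes

# Structure layout matching the Win32 MEMORYSTATUSEX record.
class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [
        ("dwLength", ctypes.c_ulong),
        ("dwMemoryLoad", ctypes.c_ulong),
        ("ullTotalPhys", ctypes.c_ulonglong),      # physical memory visible to the OS
        ("ullAvailPhys", ctypes.c_ulonglong),
        ("ullTotalPageFile", ctypes.c_ulonglong),
        ("ullAvailPageFile", ctypes.c_ulonglong),
        ("ullTotalVirtual", ctypes.c_ulonglong),
        ("ullAvailVirtual", ctypes.c_ulonglong),
        ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
    ]

status = MEMORYSTATUSEX()
status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))

# If this prints ~3.15 GB on a box with 4GB installed, the missing
# chunk is address space the chipset reserved for devices.
print(f"Memory Windows can use: {status.ullTotalPhys / 2**30:.2f} GB")
```

On my machine this confirmed the 3.15GB figure, matching what the System control panel showed.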
I didn't put much thought into the motherboard when I purchased it about 18 months ago, and now I realize that it cannot hold more than one of the newer video cards, it only supports the slower 533MHz memory, and there are very few upgrade options short of buying a brand new motherboard. This is even worse if the computer comes from Dell, IBM (Lenovo), or HP, because the motherboard is proprietary. Lesson learned: spend more thought on, and pay more for, a better motherboard.