I started really playing Atari games in 1982. I saw how it all started with the Telstar in the 1970s (I remember it from when I was about four or five; I tried to get one, and my mother pried it from my hands because it was too expensive for her at the time). Then we got our first Atari 2600 when I was seven years old, and we had a lot of fun with that A2600. Then, at Christmastime in 1987, we got an A7800. It had Pole Position II, and my parents also got Xevious. It was hidden from view, and I actually got upset because I thought I had not gotten it. Then I saw it and hugged the machine.
Over the years, we had a Genesis, and I played that endlessly. I also had a TG-16 for a while. Then, after my parents got tired of the games, I "inherited" the Atari collection (quoting the word because my parents are still alive). I had an SMS, an Intellivision, and a ColecoVision for a while, but I sold those and went strictly Atari, and the collection has remained largely unchanged for the past thirty-eight years. I decided to go that route because Atari games were becoming more and more valuable as collector's items.
Later, I got a Jaguar and the CD unit. It is the most advanced game system I have owned, and I love to play my Jaguar. It is still a lot of fun. I also have a Lynx and a recently acquired A5200.
So, what am I getting at here? What I mean is this: how classic gaming has changed over the years comes down to technology. Gaming has become more advanced in playability and appearance, and the hardware has miniaturized. Granted, the technology has advanced in great strides, and it has improved the gaming experience; the graphics keep getting better-looking over time as well. The A2600 had very primitive graphics in its early titles, but as technology improved, the graphics got better with every generation of game systems. So, I guess that is my point. You would think that newer games, with better graphics and faster CPUs and GPUs, would have better gameplay. But not always. I think we are relying too much on emulation to imitate the classic gameplay.
Don't get me wrong: emulation is cool, and I think it is cool because it can imitate the games. But some things are sacrificed in software emulation. You can never get it 100% perfect in emulation unless emulation technology really does advance. Speed can suffer. For instance, my iMac runs Virtual Jaguar. The speed is OK, but the music is choppy in every game I play. I have a Core i7 processor running at 2.8 GHz, a Radeon HD 4850 GPU with 512 MB of video RAM, and 16 GB of system RAM (which I could expand to 32 GB if I wanted to, and actually I do). This iMac is also primitive by today's standards, having been made in 2009. But the Jag does it spot-on with real hardware at a mere fraction of the iMac's CPU/GPU speed and with only 2 MB of RAM. And that RAM amount is skimpy even for the Jag!
I believe that if you are going to use emulation, use it in concert with the real hardware. Reverse-engineer the real hardware, try to improve on it with real technology, and use hardware emulation more sparingly alongside it, if you need it at all. Like I said, emulation is great. But I think that, as a technologically advanced society, we are too dependent on emulation, and we are really cheapening ourselves. The real experience on real hardware is so much better. I say rework the real hardware and advance it even further from there; then I think you will get even more stellar results. We can do so much better than we are doing now.