
64 BIT Do the Math. I did. Here is the truth


peteym5


I grew up using Commodore and Atari 8-bit computers. I had friends who were into video game consoles and went from Atari to Coleco, Nintendo, and Sega. They did not know what a bit was. When Sega released the 16-bit Genesis (Mega Drive), they believed "16-bit" was some kind of video game superpower. Later, Atari came out with the 64-bit Jaguar. This was the "Bit Wars" era. As programmers, we know there are many other factors involved in making a system play better video games.
 

Memory Address Size. The program counter, and how much memory the Central Processing Unit (CPU) can address, is one of the important factors in making good video games. The 8-bit 6502-based machines can address only 64 kilobytes without bank switching; they had a 16-bit program counter and address space. If bank switching is used, it must be managed in a way that does not bog down the CPU. Some of that memory had to be used for display memory, which put a limit on graphics resolution. The rest was needed for program code and data, which limited how much you could do. Later CPUs such as the 68000 and 80x86 had larger address spaces: a 32-bit program counter and address space can reach 4 gigabytes without bank switching, so higher resolutions are possible and there is more room for programs and data.
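To actually "do the math," here is a minimal C sketch of the bus arithmetic; the numbers fall straight out of the address width, and nothing machine-specific is assumed.

```c
/* Addressable memory is 2 to the power of the address width. */
#include <stdio.h>

int main(void) {
    unsigned long long bytes16 = 1ULL << 16;  /* 16-bit addresses: 65,536 bytes */
    unsigned long long bytes32 = 1ULL << 32;  /* 32-bit addresses: 4,294,967,296 bytes */

    printf("16-bit address space: %llu bytes (64 KB)\n", bytes16);
    printf("32-bit address space: %llu bytes (4 GB)\n", bytes32);
    return 0;
}
```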

Clock Speed of the Central Processing Unit. This was much more important with consoles after the "bit wars" and was one reason why the Sega Genesis appeared to play better games than other machines of its era. The 68000 ran at about 7.67 MHz on the Genesis versus roughly 3.58 MHz for the Super Nintendo's CPU. This is a major factor when moving objects around the screen while playing music, sound effects, and other things. Some may notice a few games slow down when the program tries to move too many things at once. "Blast Processing" was Sega's way of making it easy to understand that their system had a faster CPU.
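The same point in rough numbers: here is a small C sketch of how many CPU cycles each machine gets per 60 Hz frame, using the approximate NTSC clock rates above (real budgets are lower once interrupts, DMA, and memory wait states are accounted for).

```c
/* CPU cycles available per 60 Hz video frame, approximate NTSC clocks. */
#include <stdio.h>

int main(void) {
    const double frame_rate = 60.0;
    const double genesis_hz = 7.67e6;  /* ~7.67 MHz 68000 */
    const double snes_hz    = 3.58e6;  /* ~3.58 MHz 65C816, fastest memory mode */

    printf("Genesis: %.0f cycles per frame\n", genesis_hz / frame_rate); /* ~127,800 */
    printf("SNES:    %.0f cycles per frame\n", snes_hz / frame_rate);    /* ~59,700 */
    return 0;
}
```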

Video Processor. The video chipset or Graphics Processing Unit (GPU). Graphics resolution, bits per pixel, and available memory are just some of the things that make a difference. How things are displayed, layered, and managed are important factors. Earlier video processors used hardware sprites overlaid on top of a background; later units used "blitters" to render moving objects. 3D accelerators came into use around 1995. As we know, the faster these things run, the better and more realistic the result looks. The video processor operates independently of the CPU and may have its own dedicated memory, so the number of bits of the main CPU may not matter much.
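To make the sprite-versus-blitter difference concrete, here is a rough C sketch of what each approach costs the CPU per frame. The register structure, screen size, and byte-per-pixel framebuffer are illustrative assumptions, not any particular chip's programming model.

```c
#include <stdint.h>
#include <string.h>

#define SCREEN_W 320
#define SCREEN_H 240

/* Hardware sprite: the video chip overlays the image during scan-out, so
   moving an object is just a couple of register writes per frame. */
typedef struct { uint8_t x, y; } sprite_regs_t;

void move_hardware_sprite(volatile sprite_regs_t *regs, uint8_t x, uint8_t y) {
    regs->x = x;   /* two writes, zero pixels touched by the CPU */
    regs->y = y;
}

/* Blitter-style: the object is copied into a byte-per-pixel bitmap, so the
   cost scales with the object's size (shown here in software; a real blitter
   performs the same copy in dedicated hardware). */
void blit_object(uint8_t *framebuffer, const uint8_t *image,
                 int w, int h, int dst_x, int dst_y) {
    for (int row = 0; row < h; row++) {              /* no clipping, for brevity */
        memcpy(&framebuffer[(dst_y + row) * SCREEN_W + dst_x],
               &image[row * w], (size_t)w);
    }
}
```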

There are other factors, like how well the program is optimized, whether it is written in a compiled language or in assembly, and how graphics and data are managed. There are people making games on 8-bit machines with near 16-bit performance because they learned how to manage things, write optimized code, and pre-calculate values and store them in tables.
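A classic example of that table trick, as a minimal C sketch (the table size and scaling are arbitrary choices for illustration): pre-calculate sin() once, and every runtime "sin" becomes a single indexed load.

```c
#include <math.h>
#include <stdint.h>
#include <stdio.h>

#define TABLE_SIZE 256
static int8_t sine_table[TABLE_SIZE];

void build_sine_table(void) {
    for (int i = 0; i < TABLE_SIZE; i++) {
        /* one full period over 256 steps, scaled to -127..127 */
        sine_table[i] = (int8_t)(127.0 * sin(6.283185307 * i / TABLE_SIZE));
    }
}

int main(void) {
    build_sine_table();
    /* at run time, "sin" costs an indexed load instead of floating-point math */
    printf("sin(90 degrees) ~= %d/127\n", sine_table[TABLE_SIZE / 4]);
    return 0;
}
```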


 

 


All good points, Peter. 🙂

The Lynx and TurboGrafx are good examples of 8-bit CPUs not mattering. The graphics chips and clever design make them punch above their weight. The TurboGrafx-16 was putting out games that were on par with the Genesis and SNES.

 



I agree that the TurboGrafx-16 is on par with the Genesis and SNES, and it had a microprocessor derived from the 65C02. As I said before, the clock speed of the system bus and CPU is more of a factor than the number of bits the CPU can handle, along with the memory address space of the system. Something else that helped systems like the Genesis and later is that the 68000 has more processor registers, which reduces program size and increases speed: there is less need to spill values to temporary memory locations.


Yes, I fully agree that there are many factors to consider in a high-performance computing system.

Digital Equipment Corporation (DEC) made some amazing computers. They went directly from the very popular 16-bit PDP-11 to a full 32-bit architecture with the VAX computers.

In my opinion, the 386 PCs and onward could never achieve the same performance per MHz. The 32-bit mode of the x86 is just an extension of the 286's 16-bit mode, not a pure 32-bit design.

Later, Digital Equipment Corporation released the first, and in my opinion only, fully 64-bit computer system with the Alpha architecture (as a replacement for the aging VAX).

The x86_64 is not even close to that, as the only real difference between 32-bit x86 and x86_64 is the 64-bit addressing. In the C programming language, the only thing you notice is that a pointer variable (void *) is 64 bits instead of 32, while the other data types are much the same as in 32-bit mode. So in effect the x86_64 is not really doing 64-bit math; it just uses a wider 64-bit data type to address memory without paging. So in my opinion, the Atari Jaguar is more of a 64-bit system than a modern 64-bit PC.
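Here is a quick C check of that pointer point, assuming a typical x86_64 compiler using the LP64 model (one caveat: long also widens to 64 bits there).

```c
#include <stdio.h>

int main(void) {
    printf("sizeof(int)    = %zu\n", sizeof(int));     /* 4 on both 32-bit and 64-bit x86 */
    printf("sizeof(long)   = %zu\n", sizeof(long));    /* 4 on 32-bit, 8 under LP64 */
    printf("sizeof(void *) = %zu\n", sizeof(void *));  /* 4 on 32-bit, 8 on x86_64 */
    return 0;
}
```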

So why do games run faster on modern PCs and consoles?

- What they did was increase the clock frequency to crazy levels (gigahertz). Remember that retro consoles have CPUs running at a few MHz; compared to that, a GHz is totally crazy. It's like putting a rocket engine on a very old bicycle and trying to patch the frame to withstand the stress.

DEC's computer systems never ran at clock rates higher than a few hundred MHz, not even the 64-bit Alpha, but they were much faster at performing calculations, or at serving hundreds of users.

So what do we lose by running an ancient computer system like the x86 on steroids (instead of designing new ones like DEC did)?

- It has to do with energy versus performance. A gaming PC with only one CPU can easily consume up to 500 watts.

This power generates a lot of heat, which you need fans to dissipate (so now it consumes energy just to get rid of excess energy).

OK, so why should I care as a single x86 user? I can still afford the electricity bill.

- Sure, but imagine running a computing cluster with 600 nodes; then this becomes important, not only for your electricity bill, but also for housing a cluster that generates that much heat.

Eventually, other solutions have become popular for efficiency reasons, especially in today's smartphones (you don't want to recharge your phone every 5 minutes, or carry a battery backpack). That's why you are much more likely to see an ARM CPU in your smartphone instead of an x86; ARM is a fairly newly designed CPU architecture compared to the x86. We even see renewed implementations of the MIPS architecture today (the CPU family used by the N64 and PS1).

DEC and Sun Microsystems are long out of business, and companies like IBM and HP do not design their own hardware anymore, in many regards thanks to the combined effort they made to design the next-generation mammoth 64-bit CPU called Itanium, which turned out to be a disaster. One of the main reasons is that they tried to solve a classic software problem, instruction scheduling, in hardware, compared to how the Russians implemented static scheduling in the compiler for their Elbrus CPU. At least we have a few alternatives to the x86 today, thanks to ARM for example.

Unfortunately, modern game consoles are like gaming PCs today, as they have gone back to using x86 instead of developing their own chipsets. One reason is probably that the programmers work on PCs anyway, and no one wants to spend the time and money writing libraries for custom hardware. Instead we use layer upon layer of legacy and new PC software. So we pay for the more expensive x86 hardware and pay higher electricity bills on top of it.



Something that could always be done with these systems is replacing the main CPU with a faster one and adding more RAM. That opens up the possibility of running better games. What would happen to an Atari 2600 with a 6502 running at 20 MHz and 64K of RAM? A Sega Genesis or Super Nintendo running at 50 MHz? A Jaguar running at 80 MHz? A lot more could happen. The graphics chips may need to remain on an independent bus, as they are geared to run at a certain speed to output a TV signal. But manipulating registers on each scan line opens up some interesting graphical effects.
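Here is a back-of-the-envelope C sketch of what those hypothetical clock speeds would buy per scan line, using the usual NTSC figures of roughly 262 lines at 60 frames per second (the stock 2600 figure works out to the familiar ~76 cycles per line).

```c
#include <stdio.h>

int main(void) {
    const double lines_per_second = 262.0 * 60.0;   /* ~15,720 NTSC scan lines/second */

    const double clocks_hz[] = {
        1.19e6,   /* stock Atari 2600 6507 */
        20.0e6,   /* hypothetical 20 MHz 6502 */
        50.0e6,   /* hypothetical 50 MHz 16-bit console */
        80.0e6    /* hypothetical 80 MHz Jaguar */
    };

    for (int i = 0; i < 4; i++) {
        printf("%5.2f MHz -> %6.0f CPU cycles per scan line\n",
               clocks_hz[i] / 1e6, clocks_hz[i] / lines_per_second);
    }
    return 0;
}
```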

Something people did not realize about the Sega Genesis compared with earlier systems is that its improved graphics processor allowed more on-screen sprites at one time. I know from experience working with 8-bit systems that the number of on-screen sprites is a limiting factor. Later systems used blitters instead of hardware sprites, writing data directly onto a bit-mapped screen (each pixel has its own memory address).

