The Acorn core implementation progressed quite rapidly, unlike the time it has taken me to write up my notes and publish these blog posts. The last remaining task for video generation was to support all of the graphics/text modes as well as the colour palette.

This required another read through the Advanced User Guide to determine the pixel format for each mode/palette.
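As a sketch of what those pixel formats involve (the mode table and the bit interleaving below are my reading of the Advanced User Guide and the screen layout the Electron shares with the BBC Micro, so treat the details as assumptions to verify against the real hardware):

```python
# Illustrative summary of the Electron's screen modes (assumed values).
MODES = {
    0: dict(width=640, height=256, bpp=1),
    1: dict(width=320, height=256, bpp=2),
    2: dict(width=160, height=256, bpp=4),
    3: dict(width=640, height=250, bpp=1),  # text-only
    4: dict(width=320, height=256, bpp=1),
    5: dict(width=160, height=256, bpp=2),
    6: dict(width=320, height=250, bpp=1),  # text-only
}

def pixels_from_byte(byte, bpp):
    """Unpack one screen byte into logical colour indices, left to right.

    The colour bits for a pixel are interleaved across the byte: in the
    2bpp modes the leftmost pixel takes bits 7 and 3, in the 4bpp mode
    bits 7, 5, 3 and 1.
    """
    n = 8 // bpp                      # pixels per byte
    out = []
    for p in range(n):
        colour = 0
        for b in range(bpp):
            # successive colour bits sit pixels-per-byte positions apart
            bit = 7 - p - b * n
            colour = (colour << 1) | ((byte >> bit) & 1)
        out.append(colour)
    return out
```

For example, in a 2bpp mode a byte of `0b10001000` decodes to logical colour 3 for the leftmost pixel and colour 0 for the remaining three.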

VSCode just gets better and better. I’ve been using it as my editor for the Acorn Electron replay core for a while now, but always tabbed out to the terminal to run the build script and the iMPACT GUI to program the FPGA.

No longer! This can all now be done from within VSCode, via Ctrl+Shift+B to build and Ctrl+Shift+P to program the FPGA.
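For anyone wanting the same setup, a minimal `.vscode/tasks.json` might look like the sketch below. The command names are placeholders for whatever your build script and iMPACT batch file happen to be called; Ctrl+Shift+B runs the default build task, and the programming task can be launched from the command palette (Ctrl+Shift+P, then "Run Task").

```json
{
  "version": "2.0.0",
  "tasks": [
    {
      "label": "Build core",
      "type": "shell",
      "command": "./build.sh",
      "group": { "kind": "build", "isDefault": true }
    },
    {
      "label": "Program FPGA",
      "type": "shell",
      "command": "impact -batch program.cmd"
    }
  ]
}
```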

Having reached the point where the Acorn Electron core can boot from ROM to the BASIC prompt in Mode 6, I had a choice between adding keyboard support or further graphics modes. I decided keyboard support should take priority, as it will make testing the different graphics/text modes simpler.

Enough is now implemented that the ULA just needs to access the video section of RAM and display it, which should result in a nice Acorn Electron prompt. Simple, right?

If only.

Someone not only spilt water on my implementation but clearly fed it after midnight because the gremlins are out in force.

Be ready to snigger, point and mock, because this blog post is all about bugs, glorious bugs.

After quite a lot of research to find any documentation I could relating to the Electron, I took a copy of the Getting Started core and decided to tackle the core in roughly the following order:

  1. Video Signal
  2. RAM/ROM
  3. CPU
  4. ULA RAM/ROM interface and video
  5. ULA Keyboard
  6. ULA Cassette
  7. ULA Sound
  8. Expansion port

In this and a few subsequent blog posts I’ll present an overview of these steps and the issues encountered along the way. Before all that, a small detour is needed to discuss the PAL standard.

I remember playing on an early games console as a child that included several riveting games, such as “squash” (one-player pong), “tennis” (pong against the CPU), “tennis for two” (pong against a human) and “duck hunt” (shoot the pong ball square). Now that’s art reuse!

It wasn’t until our first computer, the Acorn Electron, that a game really hooked me. That game was the revolutionary Elite.

Not only did it bring 3D graphics, but thousands of stars and space ports to trade at, and the choice to be a lawful space citizen or a pirate, blasting apart anyone you could find and scooping up their cargo. Just not too close to a space port that frowned upon piracy; the police didn’t take kindly to that, and their guns hurt…

Building upon the audio playback core discussed in the Audio Guide I decided to move onto video output. Rather than recreating the same functionality as the loader by displaying a background, I thought a basic audio visualiser would make for a simple yet interesting alternative.
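The core of a basic visualiser is just reducing each window of audio samples to a level and mapping that to a bar height. A minimal sketch of that idea (the window size and bar height are arbitrary choices, and the real thing would of course be done in VHDL on the FPGA):

```python
def bar_height(samples, max_height=32):
    """Map one window of signed 16-bit samples to a column height in
    pixels, using a simple peak meter: take the largest absolute sample
    and scale it into the 0..max_height range."""
    peak = max(abs(s) for s in samples)   # 0 .. 32768
    return (peak * max_height) // 32768
```

A quieter alternative to the peak would be an RMS average, which gives smoother bars at the cost of a multiply per sample.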

I finally bit the bullet and started learning my way around the FPGA Replay Framework, and it proved quite hard to get going initially. I flailed around in the dark for some time, but every so often had an “aha” moment as a new concept clicked into place: how the library is structured hierarchy-wise, how it instantiates a user core, where clocks come in and which new clocks are generated…

Following on from learning the basics of VHDL and the 1-bit DAC experiment, I decided to try to play an audio file via the Replay Framework, using the “getting_started” core as a minimal starting base. This time the file would be coming off the SD card and transferred via the ARM CPU to the FPGA.
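For readers who skipped the earlier experiment, the 1-bit DAC idea can be modelled in a few lines of Python (the real version is VHDL; this is just a first-order sigma-delta sketch to show why a single output pin can reproduce a multi-bit sample):

```python
def sigma_delta_1bit(samples):
    """First-order sigma-delta modulator: turn 8-bit samples (0..255)
    into a 1-bit stream whose average duty cycle tracks the input
    level. An analogue low-pass filter on the pin recovers the audio."""
    acc = 0
    bits = []
    for s in samples:
        acc += s              # accumulate the input
        if acc >= 256:        # overflow drives the output bit high
            acc -= 256
            bits.append(1)
        else:
            bits.append(0)
    return bits
```

Feeding a constant mid-scale value of 128 produces a stream that is high exactly half the time, i.e. the filtered output sits at half the supply rail.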