Friday, January 3, 2020

End of year update

As hard as it is to believe, the decade is almost over! Wait, it actually IS over! I meant to finish this post around Christmas, but got busy with other things. It has been several months since my last general update in July. College and other commitments have left less free time since then, but I've still gotten quite a bit done. Let's have a look!

A (slightly late) Christmas present


Without further ado, here is the AY-3-8500, ported to the MiSTer FPGA system. It's centered around a Verilog description of the original chip that was extracted from die photos. The paddles are currently controlled with an attached keyboard. All games are playable, including the undocumented "handicap" game. As for color output, the original chip outputs five digital signals (corresponding to different game elements) which are combined by external circuitry to create a video signal. Most consoles built around the chip output monochrome video; some used a separate chip, the AY-3-8515, to colorize those signals. As the 8515 has not been decapped, I've recreated some palettes by eyeballing the colors in game footage of systems that used the AY-3-8500 or its variants. In addition, there are a few palettes I came up with myself, such as the holiday-themed one in the image above.
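
To give a rough idea of how those separate outputs become a colored picture, here's a little Python sketch of a priority-based palette lookup. This is only an illustration, not the actual core (which is Verilog); the signal names and RGB values below are invented for the example.

```python
# Illustration only: map the chip's separate one-bit element outputs to an RGB
# pixel using a priority-ordered palette. Signal names and colors are made up.

HOLIDAY_PALETTE = [              # checked in order; first active element wins
    ("ball",   (255, 255, 255)),  # white ball
    ("left",   (200,  30,  30)),  # red left paddle
    ("right",  ( 30, 150,  60)),  # green right paddle
    ("score",  (230, 230, 230)),  # light grey score digits
    ("field",  (230, 230, 230)),  # light grey playfield lines
]
BACKGROUND = (10, 40, 20)         # dark green background

def pixel_color(signals):
    """signals: dict mapping element name -> bool (is this element drawn here?)."""
    for name, rgb in HOLIDAY_PALETTE:
        if signals.get(name):
            return rgb
    return BACKGROUND

# A pixel where only the ball output is high comes out white:
print(pixel_color({"ball": True}))   # (255, 255, 255)
```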

One other feature I played around with is a mode where the ball becomes invisible after colliding with something. This is possible without modifying the chip's internal circuitry by monitoring the audio output. I'm planning on adding a "robot" mode in a similar manner; an original implementation accomplished this with a small number of external TTL chips. Also on the to-do list is support for joystick/analog controls, once I find a controller to test with.

This core easily has the highest ratio of time spent to game complexity, considering it took me several months just to implement a simple game of Pong! Much of the trouble was due to the odd design of the AY-3-8500 (such as latches and the lack of a central clock). I had gotten it working semi-reliably on the TinyFPGA, but porting it through a different synthesis toolchain caused it to stop working entirely. The fix was to emulate the non-FPGA-friendly design choices using a faster clock, which made all of the logic properly synchronous.

Alexa, google "FPGA exorcist" please


MiSTer also has a bit of a learning curve that I had to get over. After making the edits described above, Quartus (used to synthesize the design) kept telling me that it couldn't meet timing requirements. This didn't make any sense, as the circuitry was clocked at a snail's pace relative to what the FPGA was capable of. Somehow, creating a new project from scratch fixed it. The other problem I ran into after that was that the image appeared glitched in a different way each time I started the core. It turns out I had the reset signal's polarity swapped, so the chip was enabled for a fraction of a second before being forced into a reset state. That generated an unpredictable glitched image which stayed in the framebuffer. I scratched my head quite a bit figuring those things out.

Next cores

AY-3-8606 "Wipeout" latest debug photo

One ported chip may not be much, but it's a start. Next on the to-do list is to fold the fixes I made to the Verilog by hand back into DLAET (my netlist-to-Verilog tool). Then I'll fix the issues preventing the next two chips (Wipeout and Naval Battle) from being fully processed. Once that's done, they can get their own cores. I've also been meaning to refactor the toolchain I've developed and host it on GitHub for quite a while now. One thing at a time, though.



In addition to those two, there are two more chips which might be simulated sometime soon! I've mentioned acquiring and decapping the AY-3-8603 "Road Race" chip before. Turns out that Sean Riddle actually decapped and photographed one back in 2018 but didn't put it online until recently. As mentioned before, it's a vertical racing game for one or two players. There's some footage of it here.

I've also been highlighting a chip called the MM57105 (rolls right off the tongue!) on and off for a while now. It's gone a bit slower than the others, as its features are harder to make out. All of the transistors have been marked, and almost all of the vias have been as well. This chip isn't particularly special; it plays Pong variants just like the AY-3-8500. It was National Semiconductor's response to the '8500, and notably implemented color output without an external chip. I'll have to do the diffusion layer manually as well, but I plan on taking a shortcut with the metal layer.



Automating the process


I've been working on a way to do less work: getting my computer to handle the laborious task of highlighting for me. Some people have tried to automate highlighting/polygon capture before, but without much success. Back when I marked up the AY-3-8500, I didn't bother with it at all. But after doing two more chips manually, and realizing that there are well over a dozen more to go, I decided automation was definitely something I wanted to look into.

In the past few years there's been an explosion of work in (soft) "AI" and "machine learning" built on artificial neural networks. The basic idea dates back to the late '50s, but tools like TensorFlow have made it easier than ever to experiment with them. I've been wanting to mess around with neural networks for a while now, and this is the perfect opportunity.




A section of output from the network. Green indicates pixels it believes to be metal.

The process works like this: first, the user takes the original die photos, shrinks them down to a workable size, and highlights a small area manually. That hand-highlighted area is used to train a neural network specific to that image/layer. The network will likely have trouble on its first pass over the image and need additional "guidance": the user finds the trouble spots, does a few of them manually, and marks them for training. The retrained network has then "learned" how to properly handle those trouble patterns.
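
To make that concrete, here's roughly the idea as a minimal TensorFlow/Keras sketch. The file names, patch radius, and layer sizes are placeholders for the example, not my exact setup.

```python
# Minimal sketch: train a per-pixel "is this metal?" classifier from a small
# hand-highlighted region. File names, patch radius, and layer sizes are
# placeholders, not real settings.
import numpy as np
import tensorflow as tf

RADIUS = 8                      # neighborhood radius around each pixel
PATCH = 2 * RADIUS + 1          # 17x17 patch fed to the network

def extract_patches(image, mask, labeled):
    """Cut a PATCH x PATCH window around every pixel inside the hand-labeled
    area. image is HxWx3 float, mask is HxW 0/1 (metal or not), and labeled
    is HxW 0/1 (did the user highlight this pixel at all?)."""
    xs, ys = [], []
    h, w = mask.shape
    for y in range(RADIUS, h - RADIUS):
        for x in range(RADIUS, w - RADIUS):
            if labeled[y, x]:
                xs.append(image[y-RADIUS:y+RADIUS+1, x-RADIUS:x+RADIUS+1])
                ys.append(mask[y, x])
    return np.array(xs, dtype=np.float32), np.array(ys, dtype=np.float32)

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(PATCH, PATCH, 3)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # probability of "metal"
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# These would come from the shrunken die photo and the user's manual
# highlights (hypothetical .npy files here).
image   = np.load("die_photo_small.npy")
mask    = np.load("metal_highlights.npy")
labeled = np.load("labeled_area.npy")

x_train, y_train = extract_patches(image, mask, labeled)
model.fit(x_train, y_train, epochs=10, batch_size=256)
```

Retraining on the trouble spots is then just another fit() call with the newly marked patches added to the training set.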

So how well does it work so far? I've been experimenting with the 8603 (Road Race) chip's photos. Manually highlighting only a small fraction (roughly 5%) of the metal layer yields a network that can distinguish between metal and non-metal areas fairly accurately, as seen above. Most edges are fairly rough, and some polygons are split or joined where they shouldn't be. It's promising though, as those errors can be fixed with smart post-processing algorithms.

Output of network trained to recognize vias, transistors, and diffusion

Getting a network to identify other layers is a little harder. Vias and transistors can look very similar to each other, and diffusion is usually only visible as a slight discoloration compared to the background. A section of the results is above. As you can see, it comes pretty close when identifying the vias (dark blue), but other layers could definitely use some improvement.

This is all still experimental at the moment, though I have some ideas for improving it. First off, each pixel is currently classified individually, using it and its neighbors (within a radius) as input. This scheme will be replaced with a proper CNN, which processes multiple pixels at a time using overlapping windows and then "votes" on each pixel, which should boost accuracy. Another improvement might be to feed a map of the metal layer into the diffusion-marking network, so it can correct for the color shifts in areas covered by metal.
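
For the curious, here's a rough Keras sketch of what that windowed scheme might look like; the window size, stride, and layers are placeholders.

```python
# Sketch of the planned scheme: a small fully-convolutional network predicts a
# whole window of pixels at once, and overlapping windows are averaged
# ("voted") per pixel. Window size, stride, and layers are placeholders.
import numpy as np
import tensorflow as tf

WIN, STRIDE = 64, 32   # 64x64 windows, overlapping by half

def build_cnn():
    # Fully convolutional, so the output is a per-pixel probability map the
    # same size as the input window.
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(WIN, WIN, 3)),
        tf.keras.layers.Conv2D(16, 3, padding="same", activation="relu"),
        tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu"),
        tf.keras.layers.Conv2D(1, 1, activation="sigmoid"),
    ])

def predict_full_image(model, image):
    """Slide overlapping windows over the image and average their predictions."""
    h, w, _ = image.shape
    votes  = np.zeros((h, w), dtype=np.float32)
    counts = np.zeros((h, w), dtype=np.float32)
    for y in range(0, h - WIN + 1, STRIDE):
        for x in range(0, w - WIN + 1, STRIDE):
            window = image[y:y+WIN, x:x+WIN][np.newaxis, ...]
            pred = model.predict(window, verbose=0)[0, :, :, 0]
            votes[y:y+WIN, x:x+WIN]  += pred
            counts[y:y+WIN, x:x+WIN] += 1.0
    return votes / np.maximum(counts, 1.0)   # averaged per-pixel probability
```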

The real work isn't in the neural network, though; it's in the post-processing algorithms. They need to correct small errors (such as 1-pixel-wide bridges between polygons) and alert the user to spots where the network is unsure. So there is still plenty of work to be done. Also, some images/layers may simply be too noisy and will need to be done manually (which is why I'm still hand-marking the MM57105's underlayers).
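
As a sketch of the kind of post-processing I have in mind (using scipy here; the thresholds and structuring element are guesses, not a final design):

```python
# Possible post-processing of the network's per-pixel probabilities: break
# 1-pixel bridges with a morphological opening and flag low-confidence pixels
# for the user. Thresholds and structuring element are guesses.
import numpy as np
from scipy import ndimage

def postprocess(prob_map, sure_lo=0.3, sure_hi=0.7):
    metal = prob_map > 0.5

    # A binary opening removes 1-pixel-wide bridges and spurs between polygons.
    cleaned = ndimage.binary_opening(metal, structure=np.ones((3, 3)))

    # Label connected polygons so split/merged shapes can be inspected later.
    polygons, n_polygons = ndimage.label(cleaned)

    # Pixels the network wasn't sure about get flagged for manual review.
    unsure = (prob_map > sure_lo) & (prob_map < sure_hi)

    return cleaned, polygons, n_polygons, unsure
```

The "unsure" mask is what would get surfaced to the user for manual touch-up.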

Applying techniques


Manually highlighting chips takes hours upon hours of work and is thus the biggest bottleneck of the whole die photo -> Verilog pipeline. If automation can be applied to even just one or two layers per chip, the time savings would be enormous. The process could be used on more advanced chips in need of better emulation, such as sound synthesis chips.

I've been eyeing the speech generation chip inside the famous Speak & Spell for a while now. The SP0256 (another speech generator) has already been highlighted; it just needs some error-hunting work before it runs in simulation. Many other interesting chips have been (or can be) decapped and photographed. There's no shortage of targets!

One chip I'm very interested in working on is one that never got the opportunity to be sold to the public. One that most believed lost. One that would return one day seeking revenge. OK, this is sounding a lot like a movie trailer. The chip is the Atari AMY: it really was canceled and presumed lost, although its intentions regarding revenge are still unknown.



Curt Vendel of the Atari Museum has collected documents and files related to the AMY, while John Hardie of the NVGM found the specimen pictured above. Careful decapping should give us Verilogifiable photos. That may not be necessary though, as Mr. Vendel also has printed plots of the chip which will be non-destructively scanned to obtain a digital layout. Once these show up the reverse-engineering work can begin!

Collected hardware


Since I set up a Patreon earlier this year, I've been collecting original pong consoles off eBay to acquire the chips inside, in addition to documenting and photographing the PCBs. Two more have been picked up since July. The first is a "Sears Pong Sports IV", which used the last iteration of Atari's dedicated PONG chips. The other is a "Sears Speedway", which contained a "chip" called the F4301, capable of ball-and-paddle games (with a "robot" mode) in addition to two vertical racing games. I put quotation marks around "chip" because it's actually two separate dies in a single multi-chip module. Some more information about this rare chip can be found here.

The F4301 is inside the black package on the white extra-wide DIP. Interestingly, the PCB and some of the support chips have Atari branding on them.

All of these systems are currently in various states of disassembly and modification. Since I'll be home for a little while, there should be time to work on them some more. Finally getting around to developing a video mod and plundering the chips from one or two of them will be a New Year's resolution!

New year


That's pretty much everything. Plenty of stuff to keep me busy in 2020. I have no idea which subproject I'll post about next, so stay tuned. If you have any questions or comments, leave them below or reach out to me on Twitter. Until next time!