Lost wrote:
I've got a solderless breadboard kit computer with a w65c02s CPU and a w65c22s for I/O (http://www.apatco.com/products.php). There isn't much room left over on the board to add more chips, and no way to transfer data on or off the system (other than manual entry).
So, I'm wondering if anyone has managed to use the w65c22s as a serial interface driver. I would settle for bit-banging at 300 bits per second across a serial port. If so, can you point me to some hints or directions?
If you just want to load/save programs via RS-232 (e.g. a PC's serial port), and you're willing to devote 100% of the processor time to communications during loading/saving, and only either transmit or receive at a time (i.e. not do both simultaneously), then the hardware and the software are very simple. All you'll need for hardware is a level-shifting IC to convert 0-5V logic signals to RS-232 voltage levels, e.g. Maxim's MAX232, plus the few external capacitors that IC requires (the MAX232 requires 4 external capacitors for the charge pump, if memory serves). The IC's datasheet has this information.
In software, transmitting can be accomplished by driving the "TX" pin low (the start bit), delay, drive TX pin high/low (data bit 0), delay, etc., up through the stop bit. It doesn't matter if there's idle time between bytes. Receiving can be similarly accomplished by waiting for a falling edge (the beginning of the start bit), delay, sample data bit 0, delay, sample data bit 1, etc. (The start and stop bits can also be sampled, but this is usually not necessary.)
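To make the framing concrete, here's a throwaway sketch (in Python, just to show the bit sequence; the real thing would of course be 6502 assembly) of what the TX pin has to output for one byte with standard 8-N-1 framing, LSB first:

```python
def frame_byte(byte):
    """Return the level sequence the TX pin outputs for one 8-N-1 byte.

    0 = line driven low, 1 = line driven high; data bits go out LSB first.
    """
    bits = [0]                                    # start bit (line low)
    bits += [(byte >> i) & 1 for i in range(8)]   # 8 data bits, LSB first
    bits += [1]                                   # stop bit (line high = idle)
    return bits

# 'A' = 0x41: data bits LSB-first are 1,0,0,0,0,0,1,0
print(frame_byte(0x41))  # [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
```

Each entry corresponds to one bit time on the wire; the transmit loop just walks this sequence with a fixed delay between levels.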
To keep the math simple, let's assume that there are 100 cycles per bit (i.e. clock frequency / baud rate = 100). After the falling edge, the start bit will last 100 cycles, the data bit 0 will last 100 cycles, etc. You should sample in the middle of the bit; in other words, data bit 0 should be sampled 150 cycles after the falling edge.
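The sampling arithmetic, again as a quick Python check (the 100 cycles/bit figure is the assumed example from above):

```python
CYCLES_PER_BIT = 100  # assumed: clock frequency / baud rate

def sample_time(data_bit):
    """Cycles after the start bit's falling edge at which to sample data bit n.

    Wait out the full start bit, then land in the middle of the target bit.
    """
    return CYCLES_PER_BIT + data_bit * CYCLES_PER_BIT + CYCLES_PER_BIT // 2

print([sample_time(n) for n in range(8)])  # [150, 250, 350, ..., 850]
```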
Oftentimes, one sample per bit is sufficient. In a noisy environment, you can use a majority-rules scheme: take an odd number of samples (3 or 5), and whichever value there are more of (zeros or ones) is the value of that bit. (Because there is an odd number of samples, there can't be a tie.) You may find that in most cases it will be unanimous (i.e. all 3 or 5 samples the same).
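The majority vote itself is trivial; in Python it's just:

```python
def majority(samples):
    """Majority vote over an odd number of 0/1 samples; no tie is possible."""
    assert len(samples) % 2 == 1, "use an odd sample count (3 or 5)"
    return 1 if sum(samples) > len(samples) // 2 else 0

print(majority([1, 0, 1]))  # one noisy sample outvoted -> 1
print(majority([0, 0, 0]))  # unanimous -> 0
```

In 6502 assembly you'd do the equivalent with a counter incremented on each high sample and a compare at the end.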
The nice thing with this approach is you don't have to set up timers or deal with interrupts; all you need is one pin set up as an output and one as an input. You'll also have a lot of flexibility with which pin to use, the baud rate, the parity scheme (if any), the number of data bits, stop bits, etc., since this is all in software. The downside is that the 6502 will be completely tied up during communication (which may not be an issue).
If I recall correctly, 100 cycles per bit is enough time to handle transmission and reception (even with 3 samples per bit majority rule), so at 2 MHz, the standard 19200 baud rate is feasible.
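A quick sanity check on those numbers (the 2 MHz clock is the example figure from above):

```python
CLOCK_HZ = 2_000_000  # example clock frequency

for baud in (4800, 9600, 19200):
    # cycles of CPU time available per bit at this rate
    print(baud, CLOCK_HZ / baud)
# 19200 baud leaves ~104 cycles per bit, slightly more than the
# 100-cycle budget used in the example above
```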
Here's a big tip: get the cycle counts exact. If it's 100 cycles per bit, make sure your code takes exactly 100 cycles (not 99 or 101) per bit regardless of whether the bit is a zero or a one. It may seem like extra work, but it will save you a lot of trouble later on. (I have an old 8085-based EPROM programmer that bit-bangs RS-232 communication and it works correctly at 4800 baud, but not 9600 baud.) It's helpful to write a "delay N cycles" assembler macro that generates NOPs, delay loops, etc. for you.
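The macro idea is easy to sketch. Purely as an illustration (in Python, and with a made-up pad-with-NOPs strategy; your assembler's macro facility would do this, and for large counts you'd generate a loop instead), here's a generator that burns an exact cycle count using standard 6502 instruction timings (NOP = 2 cycles, BIT zeropage = 3 cycles):

```python
def delay_cycles(n):
    """Emit a 6502 instruction sequence that takes exactly n cycles.

    Uses NOP (2 cycles) and BIT $00 (3 cycles, clobbers flags only).
    n must be >= 2: the 6502 has no single-cycle instruction.
    """
    assert n >= 2, "no single-cycle instruction exists on the 6502"
    out = []
    while n > 3:
        out.append("NOP")       # 2 cycles
        n -= 2
    # n is now 2 or 3; finish with the instruction of matching length
    out.append("BIT $00" if n == 3 else "NOP")
    return out

print(delay_cycles(7))  # ['NOP', 'NOP', 'BIT $00'] -> 2 + 2 + 3 = 7 cycles
```

The point is just that odd cycle counts need a 3-cycle filler somewhere, since everything else pads in steps of 2.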
I've written this bit-banging software before (for the PIC) so feel free to follow up if you have questions. I may even have some 6502 code sitting around somewhere (that I may or may not have tested) too. I haven't done this specifically with a 6522, but I can't see any reason why it wouldn't work.
There is also some example 6502 assembly code (albeit lightly documented) in Gforth:
http://www.complang.tuwien.ac.at/forth/ ... 6.2.tar.gz
in the file:
gforth-0.6.2/arch/6502/softuart.fs
Finally, I would recommend that you plan for the possibility of a faster communications rate than 300 baud, even if you don't start out with a faster speed. 300 baud was a standard speed back in the day because it was possible for a person to read text as it was received. At 10 bits (1 start bit + 8 data bits + 1 stop bit) per byte, that's 30 bytes per second. For programs of any significant length, you may eventually find loading/saving at 300 baud to be frustratingly slow.
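To get a feel for the numbers (the 8 KB program size is just a made-up example):

```python
def seconds_to_transfer(num_bytes, baud, bits_per_byte=10):
    """Transfer time assuming 1 start + 8 data + 1 stop bit per byte."""
    return num_bytes * bits_per_byte / baud

for baud in (300, 9600, 19200):
    print(baud, round(seconds_to_transfer(8192, baud), 1))
# At 300 baud an 8 KB program takes about four and a half minutes;
# at 19200 baud, a bit over 4 seconds.
```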