CaptainCulry wrote:
I thought about bit banging the UART, but then I feel like I'm getting into a much fancier timing system to solve with NOPs than just figuring out how many NOPs I need in my loop to make sure the W65c51 has sent its byte, but not so many that the animation lags excessively. There's really only one place in code I need to tune the NOPs for my solution, whereas I think there's going to be a lot more places in code that tuning is involved to bit bang UART.
I'm going to agree with what Dr. Jefyll said and suggest you look a little more into bit-banging the serial output, because (though I'm a bit of a noob) it really does look easier overall to me for this application than dealing with all the extra stuff a W65C51 brings in.
My reasoning is this: the "manual" delay you need to do because of the W65C51 transmit flag bug is about the same level of difficulty as bit-banging. But if you bit-bang via an SR latch:
- Debugging will be easier because the connection between the instructions you execute and the output produced is simpler and clearer. You don't need to reason about the internal behaviour of the W65C51 (and all its configuration): you just look at the output of the latch, which presumably should correspond exactly to a bit from the data bus when it was written (and is easy to debug if it doesn't).
- Address decoding probably gets a bit easier, too, because there's only one "output register" to decode (see the short sketch after this list for what that register looks like from software).
- All the complexity of UART configuration, and some of the complexity related to interfacing the UART, is gone, leaving less to learn and fewer places for things to go wrong.
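To make those last two points concrete, here's roughly what the latch looks like from software. The address and label (TXLATCH) are made up for illustration, and I'm assuming data-bus bit D0 drives the serial line; use whatever your decoding actually provides.

Code:
TXLATCH = $C000        ; hypothetical address where the latch is decoded

        LDA #1
        STA TXLATCH    ; D0 = 1: line high (mark / idle)
        LDA #0
        STA TXLATCH    ; D0 = 0: line low (space / start bit)

Put a scope or an LED on the latch output and every STA shows up immediately, which is what makes the debugging so direct.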
dsmc showed some code for bit-banging.¹ This is simpler than it looks, because all you really need is a subroutine that shifts out one byte. That sample code tries to get exact inter-character spacing for maximum transfer rate, but that isn't actually necessary.
The key thing to remember here is that the "stop bits" in this kind of serial data stream aren't actually bits: the 1, 1.5, or 2 stop bits really just define the minimum inter-character spacing. The link starts high, and this is considered an idle condition; when the link goes low, that indicates a start bit. The start bit and the data bits that follow need the specific timings for the chosen bits-per-second rate, but after transmitting a character you bring the link back high (idle) and wait at least 1, 1.5, or 2 bit times, and at most forever. The link can stay idle for as long as you need or care to.
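To put rough numbers on the bit timing (these are illustrative assumptions, not necessarily your clock rate): at 9600 bps one bit lasts 1/9600 s, about 104 µs, which on a 1 MHz 6502 is roughly 104 clock cycles. A simple counted loop gets you in the neighbourhood, something like:

Code:
BITDLY: LDY #19        ; ballpark for ~104 cycles at 1 MHz / 9600 bps
DLYLP:  DEY            ; 2 cycles
        BNE DLYLP      ; 3 cycles while the branch is taken
        RTS            ; uses Y only, so A and X survive in the caller

In practice you tune that constant (or pad with NOPs) so that one whole trip through your send loop, including the store to the latch and the loop bookkeeping, comes out at one bit time.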
So basically all you need is a subroutine that loads the character to send (the loop1 section of the sample code), loops through shifting and transferring the bits (the bit1 section of the sample code) with the appropriate delay between setting/resetting the latch for each bit, and then sets the latch back to the idle value and exits. If you might call it again almost immediately with another byte, it should also delay for the stop-bit time before returning.
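Here's a minimal sketch of what I mean. This is not dsmc's code, just an illustration: it assumes the TXLATCH address and D0 wiring from the earlier sketch, 8 data bits with no parity sent LSB first, and the BITDLY delay above.

Code:
SENDCHR:                ; byte to send arrives in A
        PHA             ; save it while we send the start bit
        LDA #0
        STA TXLATCH     ; start bit: pull the line low
        JSR BITDLY
        PLA
        LDX #8          ; eight data bits, LSB first
SENDBIT:
        STA TXLATCH     ; bit 0 of A goes out on the line
        JSR BITDLY
        LSR A           ; bring the next data bit down into bit 0
        DEX
        BNE SENDBIT
        LDA #1
        STA TXLATCH     ; stop bit: back to idle (high)
        JSR BITDLY      ; hold idle for at least one bit time
        RTS

If you expect to send characters back-to-back, that final JSR BITDLY already covers the one-stop-bit minimum; do it twice if you want 2 stop bits.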
You do not need anything like the loop2 etc. sections right after the initial loop to get perfect timing; I am fairly certain that having slightly longer delays at various points in the data stream won't be a big issue. (It's not as if your output appears at a precisely constant speed anyway: differences in the terminal codes needed as the train moves across the screen mean that each "frame" is a different number of bytes transmitted.)
I think this would also help with your goal of making this an educational project for your students: instead of all the mysterious magic going on inside a UART, this design exposes exactly what a serial link is and how simple it is. After all, we don't use UARTs because bit-banging is hard to do or because the link protocol is terribly complex; we use them because it's a lot more efficient to have a second "processor" banging out the bits than doing it ourselves.
__________
¹This was originally misattributed to CaptainCulry. Thanks to rwiker for the correction.