First documentation for the SL Appliance build

Building your first 6502-based project? We'll help you get started here.
CaptainCulry
Posts: 30
Joined: 31 Dec 2019

Re: First documentation for the SL Appliance build

Post by CaptainCulry »

rwiker wrote:
cjs wrote:
I still like the idea of using LZ4 compression. I tested with output from the sl command (which I haven't come across before, but I'm now going to install it everywhere I can), and LZ4 compressed ~49kB down to 8926 bytes. Sooo, it should be possible to implement this with a 65(c)02, a 32kB eprom, a flip-flop and a single NOR gate. That does not make for something immediately useful for anything else, of course.
I still like that too and am keeping that bookmarked. If I succeed in getting a student (or students) working on projects with this, I will suggest a project idea of getting it down to one ROM and a more standard memory map using the compression. That would be a nice combination of hardware and software changes for a student to take on. For now, though, since this is meant to draw students in, I think using compression makes the code a little more intimidating than it needs to be.
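As a rough sanity check of the compression idea, here is a sketch that measures how well a repetitive byte stream compresses. It uses zlib from the Python standard library as a stand-in for LZ4 (which would need the third-party `lz4` package), and the sample data is invented, not real `sl` output:

```python
import zlib

def compressed_size(data: bytes, level: int = 9) -> int:
    """Return the DEFLATE-compressed size of a byte string.

    zlib is a stdlib stand-in; the thread discusses LZ4, which would
    need the third-party `lz4` package instead.
    """
    return len(zlib.compress(data, level))

# A highly repetitive stream (like repeated animation frames) compresses
# dramatically, which is what makes fitting into a 32 kB EPROM plausible.
frames = (b"engine-frame-" * 100) * 40   # ~52 kB of made-up repetitive data
print(len(frames), "->", compressed_size(frames))
```

The ratio on real `sl` output will differ, but repeated frames are exactly the kind of input where this works well.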
User avatar
cjs
Posts: 759
Joined: 01 Dec 2018
Location: Tokyo, Japan
Contact:

Re: First documentation for the SL Appliance build

Post by cjs »

CaptainCulry wrote:
BigEd has hit the nail on the head. The UART is the appropriate level of abstraction/system level for this application. I need to draw students in and generate interest. There are no students already interested that are signed up to do anything. I know from experience that as soon as someone says "bit shift" to a sophomore, their eyes are going to glaze over and you've lost them.
Wow, that seems really weird given that shift registers seem to me such a basic component of any sort of digital logic system that EEs and CEs might be using, and shifting such a basic part of computer programming. But hey, whatever works.
Curt J. Sampson - github.com/0cjs
User avatar
BigEd
Posts: 11463
Joined: 11 Dec 2008
Location: England
Contact:

Re: First documentation for the SL Appliance build

Post by BigEd »

On the subject of compression, the version of 'sl' I have (which produces only 40k of output) only produces 48 distinct values. So, 6 bits per byte is enough. The values are quite skewed too, so either a full-on Huffman code or a hand-rolled variable-length code might be helpful. That's three possible compression schemes, relatively straightforward, before you get to the heavy-hitting LZ4.

(There used to be utilities called 'pack' and 'compact' ... see https://retrocomputing.stackexchange.com/a/11415 - 'pack' uses a static Huffman table whereas 'compact' is adaptive, I think.)
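The distinct-values observation can be reproduced on any capture with a few lines of Python. This sketch (using a made-up skewed sample rather than real `sl` output) reports the distinct symbol count, the bits needed for a fixed-width code, and the Shannon entropy, which bounds what a Huffman or hand-rolled variable-length code could achieve:

```python
import math
from collections import Counter

def code_stats(data: bytes):
    """Distinct symbol count, fixed-code bits/symbol, and Shannon entropy."""
    counts = Counter(data)
    distinct = len(counts)
    fixed_bits = math.ceil(math.log2(distinct)) if distinct > 1 else 1
    total = len(data)
    entropy = -sum((n / total) * math.log2(n / total) for n in counts.values())
    return distinct, fixed_bits, entropy

# A toy skewed stream: mostly spaces plus a few other symbols, like text art.
data = b" " * 800 + b"ABCDEFGH" * 25
distinct, fixed_bits, entropy = code_stats(data)
print(distinct, fixed_bits, round(entropy, 2))
```

On this toy input, 9 distinct symbols fit in a 4-bit fixed code, but the skew means a variable-length code could approach ~1.3 bits per symbol.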
User avatar
BigDumbDinosaur
Posts: 9426
Joined: 28 May 2009
Location: Midwestern USA (JB Pritzker’s dystopia)
Contact:

Re: First documentation for the SL Appliance build

Post by BigDumbDinosaur »

BigEd wrote:
Systems thinking is a good skill, and a UART is a good level of system component.
Indeed. If memory correctly serves me, a single-chip UART was on the market in 1970 or 1971 to act as a basic system interface device.

In making the decision to bit-bang or go with a UART, I think the question to be answered is how much bare metal can you tolerate? I put this in the same category as to whether to use a can oscillator as a clock generator or roll-your-own with a crystal and supporting components. My perspective is I want to build a house but don't wish to manufacture bricks and fell trees to make framing lumber. So I use UARTs and can oscillators.
x86?  We ain't got no x86.  We don't NEED no stinking x86!
CaptainCulry
Posts: 30
Joined: 31 Dec 2019

Re: First documentation for the SL Appliance build

Post by CaptainCulry »

BigDumbDinosaur wrote:
BigEd wrote:
Systems thinking is a good skill, and a UART is a good level of system component.
Indeed. If memory correctly serves me, a single-chip UART was on the market in 1970 or 1971 to act as a basic system interface device.

In making the decision to bit-bang or go with a UART, I think the question to be answered is how much bare metal can you tolerate? I put this in the same category as to whether to use a can oscillator as a clock generator or roll-your-own with a crystal and supporting components. My perspective is I want to build a house but don't wish to manufacture bricks and fell trees to make framing lumber. So I use UARTs and can oscillators.
This question comes up so often in academia when it comes to student projects. It's an even more important question when it comes to coursework, though. The decision is typically made project by project to determine a reasonable scope. The goal is to find the sweet spot that provides a reasonable challenge somewhere between growing and doping our own silicon wafers and just buying a product that already does what we are looking to do. Neither end of that spectrum is acceptable, so just where in between to land is the usual question.
dmsc
Posts: 153
Joined: 17 Sep 2018

Re: First documentation for the SL Appliance build

Post by dmsc »

Hi!
CaptainCulry wrote:
BigDumbDinosaur wrote:
BigEd wrote:
Systems thinking is a good skill, and a UART is a good level of system component.
Indeed. If memory correctly serves me, a single-chip UART was on the market in 1970 or 1971 to act as a basic system interface device.

In making the decision to bit-bang or go with a UART, I think the question to be answered is how much bare metal can you tolerate? I put this in the same category as to whether to use a can oscillator as a clock generator or roll-your-own with a crystal and supporting components. My perspective is I want to build a house but don't wish to manufacture bricks and fell trees to make framing lumber. So I use UARTs and can oscillators.
This question comes up so often in academia when it comes to student projects. It's an even more important question when it comes to coursework, though. The decision is typically made project by project to determine a reasonable scope. The goal is to find the sweet spot that provides a reasonable challenge somewhere between growing and doping our own silicon wafers and just buying a product that already does what we are looking to do. Neither end of that spectrum is acceptable, so just where in between to land is the usual question.
You should also consider *which* UART you plan to use. The 65C51 is known to have a bug, it is limited in the baud rates it supports, and it has no FIFO, so I don't think it is the easy or extensible choice. In my view, it is easier to bit-bang a serial signal than to initialize a UART....

A UART shines when you can use interrupts or DMA for transfers: then you can do processing at the same time as sending/receiving data, but managing that is much more complicated.
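For readers weighing the bit-banging option, the bit ordering a transmit routine has to produce on the output pin can be sketched in a few lines (standard 8-N-1 framing assumed):

```python
def frame_byte(b: int) -> list[int]:
    """8-N-1 async framing: start bit (0), 8 data bits LSB first, stop bit (1).

    A bit-banged transmit routine has to reproduce exactly this sequence on
    an output pin, holding each bit for one bit time (1/baud seconds).
    """
    data = [(b >> i) & 1 for i in range(8)]   # LSB first
    return [0] + data + [1]

bits = frame_byte(ord("A"))   # 0x41 = 0b01000001
print(bits)                   # [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
```

The hard part in assembly isn't this ordering, it's holding each bit for an accurate bit time, which is what the UART-vs-bit-bang debate below is really about.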
User avatar
BitWise
In Memoriam
Posts: 996
Joined: 02 Mar 2004
Location: Berkshire, UK
Contact:

Re: First documentation for the SL Appliance build

Post by BitWise »

You could port the app. It's not that complex. I had a quick go this evening.

https://youtu.be/X6lun7d0JLI

I tried porting the C but there is a problem in wdc02cc that makes it crash (probably a pointer issue in the compiler itself -- I tested it on Windows 10 and XP in the virtual machine).

In the end I just wrote an assembler version of the code for my three chip 65C02 board. The combination of a low CPU speed (~1MHz) and a 19200 baud connection isn't ideal.

The UNIX code uses the curses library to abstract the terminal type. It's not really optimising the screen output -- in fact I suspect it generates more output than is actually needed.
Andrew Jacobs
6502 & PIC Stuff - http://www.obelisk.me.uk/
Cross-Platform 6502/65C02/65816 Macro Assembler - http://www.obelisk.me.uk/dev65/
Open Source Projects - https://github.com/andrew-jacobs
User avatar
GARTHWILSON
Forum Moderator
Posts: 8773
Joined: 30 Aug 2002
Location: Southern California
Contact:

Re: First documentation for the SL Appliance build

Post by GARTHWILSON »

dmsc wrote:
The 65C51 is known to have a bug, it is limited in the baud rates it supports, and it has no FIFO, so I don't think it is the easy or extensible choice.

The reason for not having a FIFO is that the 6502's interrupt overhead is so low that it's not a problem to interrupt with every byte. It would not be very suitable for something like the 68000 though whose interrupt-response time is very, very long by comparison. The 6551's standard baud rates are 50, 75, 109.92, 134.58, 150, 300, 600, 1200, 1800, 2400, 3600, 4800, 7200, 9600, and 19,200 bps. For other speeds, including 115,200 bps, you use the x16 clock input. The 6850 OTOH lacks an onboard BRG.
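The x16-clock arithmetic is simple enough to sketch: the 6551's x16 input expects sixteen clocks per bit time, so the external clock required for a given baud rate is just baud × 16.

```python
def x16_clock_hz(baud: int) -> int:
    """External clock the 6551 needs on its x16 input for a given baud rate
    (sixteen clocks per bit time)."""
    return baud * 16

# 115,200 bps needs a 1.8432 MHz clock -- the classic UART crystal frequency.
print(x16_clock_hz(115_200))   # 1843200
```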

Quote:
In my view, it is easier to bit-bang a serial signal than to initialize a UART....
I do something like

Code:

        STZ  ACIA_STAT   ; Reset ACIA by storing 0 in its status register.

        LDA  #00011110B  ; Set for 1 stop bit, 8 data bits, 9600 bps by
        STA  ACIA_CTRL   ; storing the number in the control register.

        LDA  #00001001B  ; No parity or rcv echo, RTS true, receive IRQ but no
        STA  ACIA_COMM   ; transmit IRQ, set DTR true.  Store in command register.
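For anyone auditing the two magic constants above, here is a sketch that rebuilds them from named fields. The field positions reflect one reading of the 6551 datasheet and are worth double-checking against your part's documentation:

```python
def ctrl_reg(stop_bits_bit: int, word_len_bits: int,
             use_baud_gen: int, baud_sel: int) -> int:
    """6551 control register: bit 7 stop bits, bits 6-5 word length,
    bit 4 receiver clock source, bits 3-0 baud-rate select."""
    return (stop_bits_bit << 7) | (word_len_bits << 5) | (use_baud_gen << 4) | baud_sel

def cmd_reg(parity_bits: int, echo: int, tx_ctrl: int,
            rx_irq_disable: int, dtr: int) -> int:
    """6551 command register: bits 7-5 parity mode, bit 4 receiver echo,
    bits 3-2 transmitter control, bit 1 receiver-IRQ disable, bit 0 DTR."""
    return (parity_bits << 5) | (echo << 4) | (tx_ctrl << 2) | (rx_irq_disable << 1) | dtr

# 1 stop bit, 8 data bits, internal baud generator, 9600 bps (select %1110)
assert ctrl_reg(0, 0b00, 1, 0b1110) == 0b00011110
# No parity, no echo, RTS true with transmit IRQ off (%10), receive IRQ on, DTR true
assert cmd_reg(0b000, 0, 0b10, 0, 1) == 0b00001001
```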
http://WilsonMinesCo.com/ lots of 6502 resources
The "second front page" is http://wilsonminesco.com/links.html .
What's an additional VIA among friends, anyhow?
CaptainCulry
Posts: 30
Joined: 31 Dec 2019

Re: First documentation for the SL Appliance build

Post by CaptainCulry »

BitWise wrote:
You could port the app. It's not that complex. I had a quick go this evening.

https://youtu.be/X6lun7d0JLI

I tried porting the C but there is a problem in wdc02cc that makes it crash (probably a pointer issue in the compiler itself -- I tested it on Windows 10 and XP in the virtual machine).

In the end I just wrote an assembler version of the code for my three chip 65C02 board. The combination of a low CPU speed (~1MHz) and a 19200 baud connection isn't ideal.

The UNIX code uses the curses library to abstract the terminal type. It's not really optimising the screen output -- in fact I suspect it generates more output than is actually needed.
Dude, that just about got it. As far as how that ran, that's really all it needs to do. Assembly, especially on 8-bit machines like this, is far from my strong suit. I'm much more comfortable in C on an ARM Cortex M, so that is why I went the capture route as opposed to trying to port the code.

I suspect it is generating more than is actually needed as well. I can see that when I run it in WSL and change the screen size, what it generates changes based on that screen size. Once I get something mostly working I will play around with different captures to get a capture that more closely fits the screen size of the final implementation.
User avatar
BigDumbDinosaur
Posts: 9426
Joined: 28 May 2009
Location: Midwestern USA (JB Pritzker’s dystopia)
Contact:

Re: First documentation for the SL Appliance build

Post by BigDumbDinosaur »

GARTHWILSON wrote:
The reason for not having a FIFO is that the 6502's interrupt overhead is so low that it's not a problem to interrupt with every byte.
Also, it's a single-channel UART, so the MPU isn't likely to be buried under interrupts, even at the maximum "official" bit rate of 19,200. Running it at 115.2 Kbps would hammer the MPU with IRQs (specifically, 23,040 per second with CBAT going), which is where a FIFO becomes of value.
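The 23,040 figure works out to one interrupt per 10-bit frame (start + 8 data + stop) in each direction; as a sketch of the arithmetic:

```python
def irq_per_second(baud: int, bits_per_frame: int = 10, directions: int = 2) -> int:
    """Interrupts/second for a UART that interrupts once per byte:
    one 10-bit frame per IRQ, per active direction."""
    return (baud // bits_per_frame) * directions

print(irq_per_second(115_200))   # 23040
print(irq_per_second(19_200))    # 3840
```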
Quote:
It would not be very suitable for something like the 68000 though whose interrupt-response time is very, very long by comparison.
That's also true of the x86 architecture, although extremely high core clock rates mask that to some extent. The slowness of x86 interrupt response is what led to the development of the 16550 UART, which has a FIFO.
dmsc wrote:
In my view, it is easier to bit-bang a serial signal than to initialize a UART....
I disagree. A simple UART such as the 6551 requires little in the way of setup. A couple of writes is all it takes to set bit rate, datum format, etc. Bit-banging requires much more code, as well as careful timing, as asynchronous serial communications demands accurately-separated marks and spaces if the receiver is to output anything other than gibberish. The UART takes care of that for you, which reduces serial I/O to little more than load/store activity.
x86?  We ain't got no x86.  We don't NEED no stinking x86!
CaptainCulry
Posts: 30
Joined: 31 Dec 2019

Re: First documentation for the SL Appliance build

Post by CaptainCulry »

GARTHWILSON wrote:
I do something like

Code:

        STZ  ACIA_STAT   ; Reset ACIA by storing 0 in its status register.

        LDA  #00011110B  ; Set for 1 stop bit, 8 data bits, 9600 bps by
        STA  ACIA_CTRL   ; storing the number in the control register.

        LDA  #00001001B  ; No parity or rcv echo, RTS true, receive IRQ but no
        STA  ACIA_COMM   ; transmit IRQ, set DTR true.  Store in command register.
Yeah, this is about what I thought, except I didn't quite realize I had to zero the status register first and I didn't know about the STZ instruction until you posted that. I've apparently been looking at the list of 6502 op codes instead of the list of 65c02 op codes.
User avatar
cjs
Posts: 759
Joined: 01 Dec 2018
Location: Tokyo, Japan
Contact:

Re: First documentation for the SL Appliance build

Post by cjs »

BigDumbDinosaur wrote:
A simple UART such as the 6551 requires little in the way of setup. A couple of writes is all it takes to set bit rate, datum format, etc. Bit-banging requires much more code, as well as careful timing, as asynchronous serial communications demands accurately-separated marks and spaces if the receiver is to output anything other than gibberish. The UART takes care of that for you, which reduces serial I/O to little more than load/store activity.
It seems I was perhaps not entirely clear in my previous post on this topic. There are actually three choices under consideration for this application:
  1. Use a working UART.
  2. Use bit-banging.
  3. Use a W65C51, which cannot tell you when you may write another byte to the transmit register.
In my opinion, this is in order of easiest to most difficult for this particular application (particularly taking into account that it's transmit-only) and audience.

Using a working UART has slightly more complex address decoding and setup, but hides everything about the serial protocol timing; the complexity removed by the latter seems to more than make up for the former.

For the other two cases, you need to understand, implement and debug serial protocol timing to at least some degree. Once you've admitted that complexity to your application, I think it's easier to bring it all out into the open in an easily-debuggable way, which is what shifting bits into a latch does. Hiding large pieces of it but still having to know what's going on in those hidden parts well enough to make sure you've implemented appropriate timing to avoid writing the transmit register before the previous transmit is complete seems to me to require a fair amount of sophistication, particularly since recognizing and debugging bad output is not trivial. (Hint: describe exactly what happens on the serial output when you write the output register too early.)
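The pacing being hinted at here amounts to waiting at least one full frame time between writes to the transmit register. A sketch of the arithmetic at the speeds discussed in this thread:

```python
def cycles_per_char(cpu_hz: int, baud: int, bits_per_frame: int = 10) -> float:
    """CPU cycles that one 10-bit serial frame occupies on the wire.
    A software delay pacing the W65C51 transmit register must wait at
    least this long between writes, or bytes get clobbered."""
    return cpu_hz * bits_per_frame / baud

# ~1 MHz 65C02 sending at 19,200 baud: ~521 cycles per character.
print(round(cycles_per_char(1_000_000, 19_200)))   # 521
```

Writing even a few cycles early corrupts the frame in flight, which is exactly why this failure mode is hard to spot by eye.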

That said, it's perfectly reasonable to go with #3 over #2 if you bring in other considerations, such as "It's worth extra cost and time in order to have two highly-integrated chips in the project instead of one" or "mentioning shifting is a political problem with my audience, so it's worth adding complexity and making it more difficult to understand if it avoids that." So long as you're clear about the tradeoffs you're making.
Curt J. Sampson - github.com/0cjs
User avatar
GARTHWILSON
Forum Moderator
Posts: 8773
Joined: 30 Aug 2002
Location: Southern California
Contact:

Re: First documentation for the SL Appliance build

Post by GARTHWILSON »

BigDumbDinosaur wrote:
GARTHWILSON wrote:
The reason for not having a FIFO is that the 6502's interrupt overhead is so low that it's not a problem to interrupt with every byte.
Also, it's a single-channel UART, so the MPU isn't likely to be buried under interrupts, even at the maximum "official" bit rate of 19,200. Running it at 115.2 Kbps would hammer the MPU with IRQs (specifically, 23,040 per second with CBAT going)
True, a 1MHz '02 would have its hands pretty full at 23,040 interrupts per second. I've done well over 100,000 per second though at 5MHz.
Quote:
dmsc wrote:
In my view, it is easier to bit-bang a serial signal than to initialize a UART....
I disagree. A simple UART such as the 6551 requires little in the way of setup. A couple of writes is all it takes to set bit rate, datum format, etc. Bit-banging requires much more code, as well as careful timing, as asynchronous serial communications demands accurately-separated marks and spaces
Bit-banging asynchronous serial also cannot tolerate being interrupted while a byte is in flight. (Bit-banging synchronous serial like SPI can.)
http://WilsonMinesCo.com/ lots of 6502 resources
The "second front page" is http://wilsonminesco.com/links.html .
What's an additional VIA among friends, anyhow?
CaptainCulry
Posts: 30
Joined: 31 Dec 2019

Re: First documentation for the SL Appliance build

Post by CaptainCulry »

cjs wrote:
...There are actually three choices under consideration for this application:
  1. Use a working UART.
  2. Use bit-banging.
  3. Use a W65C51, which cannot tell you when you may write another byte to the transmit register.
In my opinion, this is in order of easiest to most difficult for this particular application (particularly taking into account that it's transmit-only) and audience.


Using a working UART has slightly more complex address decoding and setup, but hides everything about the serial protocol timing; the complexity removed by the latter seems to more than make up for the former.

For the other two cases, you need to understand, implement and debug serial protocol timing to at least some degree. Once you've admitted that complexity to your application, I think it's easier to bring it all out into the open in an easily-debuggable way, which is what shifting bits into a latch does. Hiding large pieces of it but still having to know what's going on in those hidden parts well enough to make sure you've implemented appropriate timing to avoid writing the transmit register before the previous transmit is complete seems to me to require a fair amount of sophistication, particularly since recognizing and debugging bad output is not trivial. (Hint: describe exactly what happens on the serial output when you write the output register too early.)

That said, it's perfectly reasonable to go with #3 over #2 if you bring in other considerations, such as "It's worth extra cost and time in order to have two highly-integrated chips in the project instead of one" or "mentioning shifting is a political problem with my audience, so it's worth adding complexity and making it more difficult to understand if it avoids that." So long as you're clear about the tradeoffs you're making.
Your very strongly felt opinion, and very strenuous objection to the W65C51, has been noted. The random reader who comes along and reads this thread should probably heed those warnings, since their implementation is very unlikely to share a few key details of mine. Those details are:
  • I am sending an exact, known array of bytes out the UART, in the same order, repeatedly. It never changes.
  • This device doesn't do anything at all other than that. It's a bit-spitting appliance.
  • This device doesn't receive anything.
  • The output only needs to look okay by eye. This is not a critical link to carbon scrubbers on the ISS.
  • This device isn't going into mass production, so whether or not to include a single $7 chip, as far as chip count and cost go, is not a concern.
I agree that trying to determine if you are stomping on the W65C51's transmit register too early would be a bit of a bear to debug in a normal situation. In a normal situation you likely have no actual record of exactly what bytes were intended and may not even have much control over when the transmission was supposed to happen. But taking into consideration all the details of this exact application, it will be trivial. Just compare a capture from a logic analyzer to the byte array. If necessary I can capture the data bus too. This is not a very difficult task in this application.
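The logic-analyzer comparison described above can be mocked up in software: deframe the sampled line back into bytes (assuming an idealized 8-N-1 capture with one sample per bit time) and compare against the reference array. A sketch:

```python
def deframe(bits: list[int]) -> bytes:
    """Decode an idealized 8-N-1 capture (one sample per bit time):
    skip idle-high bits, then start(0) + 8 data bits LSB-first + stop(1)."""
    out, i = [], 0
    while i < len(bits):
        if bits[i] == 1:          # idle line, no frame in progress
            i += 1
            continue
        data = bits[i + 1:i + 9]  # 8 data bits following the start bit
        out.append(sum(b << n for n, b in enumerate(data)))
        i += 10                   # consume start + 8 data + stop
    return bytes(out)

# Round-trip check against a known reference array, as described above.
reference = b"SL"
capture = []
for byte in reference:
    capture += [1, 0] + [(byte >> n) & 1 for n in range(8)] + [1]
assert deframe(capture) == reference
```

A real capture would need resampling to one sample per bit first, but the comparison itself is exactly this simple when the transmitted array never changes.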

Having all of the bare naked details of the UART serial protocol and its timing exposed and given special focus is not actually a stipulation of this build. If it were, then yes, bit-banging would be a great way to showcase and highlight those details. In this application, though, it is perfectly fine to abstract those details away in hardware. Likewise, we are also abstracting away the details of how the CPU works internally and how bits are stored in RAM and ROM. There are going to be some abstractions, somewhere, in pretty much everything we do.
User avatar
BitWise
In Memoriam
Posts: 996
Joined: 02 Mar 2004
Location: Berkshire, UK
Contact:

Re: First documentation for the SL Appliance build

Post by BitWise »

Are you building or buying the hardware for this project?

If you have a 65C02 board with an ACIA and a VIA, then one of its timers can be used to generate transmit interrupts.
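As a sketch of the timer arithmetic (assuming the 6522's T1 free-run mode, where the period is the latch value plus two CPU cycles; check the datasheet for your part):

```python
def via_t1_latch(cpu_hz: int, baud: int, bits_per_frame: int = 10) -> int:
    """6522 VIA T1 latch value giving one interrupt per transmitted
    character.  Assumes free-run mode with a period of latch + 2 cycles;
    verify against the 6522 datasheet for your specific part."""
    period = round(cpu_hz * bits_per_frame / baud)
    return period - 2

# 1 MHz CPU, 19,200 baud: interrupt every ~521 cycles, so latch 519.
print(via_t1_latch(1_000_000, 19_200))   # 519
```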

You could use a 65C134, a 65C02 microcontroller with a built-in UART that doesn't have the 65C51 transmit bug. WDC's own W65C134SXB board looks pretty good (I have a W65C265SXB).

https://www.tindie.com/products/wdc/w65c134sxb/

In some ways I like the microcontroller versions of the 65C02 and 65C816 more than the microprocessor versions, and the WDC 134/265 boards are cheaper than the 02/816 ones. The built-in monitor ROM makes downloading assembled code in an S19 file easy for testing, and the final version can be installed in the flash ROM and made to start automatically at power-up.

And then there are always other brands of microcontroller that could do the whole thing in one small chip for a couple of dollars and are fast enough to be programmed in C (or C++) -- you'd just need to write a simple implementation of parts of the curses library to use with the ported Linux code. (I have one somewhere I wrote for a terminal based EDSAC emulator for the PIC32MX170F256B).
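The curses subset mentioned here can be quite small. The sketch below uses hypothetical names mirroring the curses API (`mvaddstr`, `refresh`) and buffers cell writes into ANSI cursor-positioning escapes; it is an illustration of the idea, not a drop-in replacement:

```python
class TinyCurses:
    """Minimal curses-like shim: buffer (row, col, text) writes and emit
    ANSI escape sequences on refresh().  Names mirror the curses API but
    this is a sketch, not a compatible implementation."""
    def __init__(self):
        self.pending = []

    def mvaddstr(self, row: int, col: int, text: str) -> None:
        self.pending.append((row, col, text))

    def refresh(self) -> str:
        # ANSI cursor positioning is 1-based: ESC [ row ; col H
        out = "".join(f"\x1b[{r + 1};{c + 1}H{t}" for r, c, t in self.pending)
        self.pending.clear()
        return out

scr = TinyCurses()
scr.mvaddstr(0, 0, "SL")
print(repr(scr.refresh()))   # '\x1b[1;1HSL'
```

On a microcontroller the string returned by `refresh()` would instead be pushed straight out the serial port.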

If students are doing this then using a modern microcontroller rather than a legacy microprocessor might be better. ESP32s are modern, fast, cheap and programmed over a serial connection from the Arduino IDE -- no expensive equipment required at all.
Andrew Jacobs
6502 & PIC Stuff - http://www.obelisk.me.uk/
Cross-Platform 6502/65C02/65816 Macro Assembler - http://www.obelisk.me.uk/dev65/
Open Source Projects - https://github.com/andrew-jacobs