First documentation for the SL Appliance build
-
CaptainCulry
- Posts: 30
- Joined: 31 Dec 2019
Re: First documentation for the SL Appliance build
rwiker wrote:
cjs wrote:
I still like the idea of using LZ4 compression. I tested with output from the sl command (which I haven't come across before, but I'm now going to install it everywhere I can), and LZ4 compressed ~49kB down to 8926 bytes. Sooo, it should be possible to implement this with a 65(c)02, a 32kB eprom, a flip-flop and a single NOR gate. That does not make for something immediately useful for anything else, of course.
Re: First documentation for the SL Appliance build
CaptainCulry wrote:
BigEd has hit the nail on the head. The UART is the appropriate level of abstraction/system level for this application. I need to draw students in and generate interest. There are no students already interested who have signed up to do anything. I know from experience that as soon as someone says "bit shift" to a sophomore, their eyes are going to glaze over and you've lost them.
Curt J. Sampson - github.com/0cjs
Re: First documentation for the SL Appliance build
On the subject of compression, the version of 'sl' I have (which produces only 40k of output) only produces 48 distinct values. So, 6 bits per byte is enough. The values are quite skewed too, so either a full-on Huffman code or a hand-rolled variable-length code might be helpful. That's three possible compression schemes, relatively straightforward, before you get to the heavy-hitting LZ4.
(There used to be utilities called 'pack' and 'compact' ... see https://retrocomputing.stackexchange.com/a/11415 - 'pack' uses a static Huffman table whereas 'compact' is adaptive, I think.)
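As a sketch of the 6-bits-per-byte idea, here is some hypothetical Python (`pack6` is my own name, not from the thread; it assumes the data really has at most 64 distinct byte values):

```python
def pack6(data: bytes) -> tuple[bytes, list[int]]:
    """Map each distinct byte to a 6-bit code and pack the codes into bytes."""
    alphabet = sorted(set(data))
    assert len(alphabet) <= 64, "need at most 64 distinct values for 6-bit codes"
    code = {b: i for i, b in enumerate(alphabet)}
    bits = 0       # bit accumulator
    nbits = 0      # bits currently held in the accumulator
    out = bytearray()
    for b in data:
        bits = (bits << 6) | code[b]
        nbits += 6
        while nbits >= 8:
            nbits -= 8
            out.append((bits >> nbits) & 0xFF)
    if nbits:      # flush any leftover bits, left-aligned
        out.append((bits << (8 - nbits)) & 0xFF)
    return bytes(out), alphabet

packed, table = pack6(b"sl sl sl sl ")
# 12 input bytes * 6 bits = 72 bits = 9 packed bytes
print(len(packed))
```

A decoder on the 6502 side would walk the packed stream 6 bits at a time and index into the stored alphabet table, which is cheap compared with a full Huffman decode.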
- BigDumbDinosaur
- Posts: 9428
- Joined: 28 May 2009
- Location: Midwestern USA (JB Pritzker’s dystopia)
- Contact:
Re: First documentation for the SL Appliance build
BigEd wrote:
Systems thinking is a good skill, and a UART is a good level of system component.
In making the decision to bit-bang or go with a UART, I think the question to be answered is how much bare metal can you tolerate? I put this in the same category as to whether to use a can oscillator as a clock generator or roll-your-own with a crystal and supporting components. My perspective is I want to build a house but don't wish to manufacture bricks and fell trees to make framing lumber. So I use UARTs and can oscillators.
x86? We ain't got no x86. We don't NEED no stinking x86!
-
CaptainCulry
- Posts: 30
- Joined: 31 Dec 2019
Re: First documentation for the SL Appliance build
BigDumbDinosaur wrote:
BigEd wrote:
Systems thinking is a good skill, and a UART is a good level of system component.
In making the decision to bit-bang or go with a UART, I think the question to be answered is how much bare metal can you tolerate? I put this in the same category as to whether to use a can oscillator as a clock generator or roll-your-own with a crystal and supporting components. My perspective is I want to build a house but don't wish to manufacture bricks and fell trees to make framing lumber. So I use UARTs and can oscillators.
Re: First documentation for the SL Appliance build
Hi!
Indeed. If memory serves me correctly, a single-chip UART was on the market in 1970 or 1971 to act as a basic system interface device.
In making the decision to bit-bang or go with a UART, I think the question to be answered is how much bare metal can you tolerate? I put this in the same category as to whether to use a can oscillator as a clock generator or roll-your-own with a crystal and supporting components. My perspective is I want to build a house but don't wish to manufacture bricks and fell trees to make framing lumber. So I use UARTs and can oscillators.
This question comes up often in academia with student projects, and it's an even more important question when it comes to coursework. The decision is typically made project by project to determine a reasonable scope. The goal is to find that sweet spot that provides a reasonable challenge, somewhere between growing and doping our own silicon wafers and just buying a product that already does what we are looking to do. Neither end of that spectrum is acceptable, so just where in between to land is the usual question.
You should also consider *which* UART you plan to use. The 65C51 is known to have a bug, it is limited in the baud rates it supports, and it doesn't have a FIFO, so I don't think it is the easy or extensible choice. In my view, it is easier to bit-bang a serial signal than to initialize a UART....
A UART shines when you can use interrupts or DMA for transfers; then you can process data at the same time as sending/receiving it, but managing that is much more complicated.
CaptainCulry wrote:
BigDumbDinosaur wrote:
BigEd wrote:
Systems thinking is a good skill, and a UART is a good level of system component.
In making the decision to bit-bang or go with a UART, I think the question to be answered is how much bare metal can you tolerate? I put this in the same category as to whether to use a can oscillator as a clock generator or roll-your-own with a crystal and supporting components. My perspective is I want to build a house but don't wish to manufacture bricks and fell trees to make framing lumber. So I use UARTs and can oscillators.
A UART shines when you can use interrupts or DMA for transfers; then you can process data at the same time as sending/receiving it, but managing that is much more complicated.
- BitWise
- In Memoriam
- Posts: 996
- Joined: 02 Mar 2004
- Location: Berkshire, UK
- Contact:
Re: First documentation for the SL Appliance build
You could port the app. It's not that complex. I had a quick go this evening.
https://youtu.be/X6lun7d0JLI
I tried porting the C but there is a problem in wdc02cc that makes it crash (probably a pointer issue in the compiler itself -- I tested it on Windows 10 and XP in the virtual machine).
In the end I just wrote an assembler version of the code for my three chip 65C02 board. The combination of a low CPU speed (~1MHz) and a 19200 baud connection isn't ideal.
The UNIX code uses the curses library to abstract the terminal type. It's not really optimising the screen output -- in fact I suspect it actually generates more output than is needed.
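The "~1MHz plus 19200 baud isn't ideal" observation can be put in rough numbers (a sketch assuming 8N1 framing, i.e. 10 bits on the wire per byte):

```python
# Rough budget for a ~1 MHz 65C02 feeding a 19200 baud link,
# assuming 8N1 framing: 1 start + 8 data + 1 stop = 10 bits per byte.
cpu_hz = 1_000_000
baud = 19200
bits_per_byte = 10

bytes_per_second = baud / bits_per_byte        # 1920 bytes per second
cycles_per_byte = cpu_hz / bytes_per_second    # CPU cycles available per byte
print(bytes_per_second, round(cycles_per_byte))
```

Roughly 520 cycles per byte is plenty for replaying a fixed buffer, but not much headroom for generating cursor-addressed animation frames on the fly.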
Andrew Jacobs
6502 & PIC Stuff - http://www.obelisk.me.uk/
Cross-Platform 6502/65C02/65816 Macro Assembler - http://www.obelisk.me.uk/dev65/
Open Source Projects - https://github.com/andrew-jacobs
- GARTHWILSON
- Forum Moderator
- Posts: 8774
- Joined: 30 Aug 2002
- Location: Southern California
- Contact:
Re: First documentation for the SL Appliance build
dmsc wrote:
The 65C51 is known to have a bug, it is limited in the baud rates it supports, and it doesn't have a FIFO, so I don't think it is the easy or extensible choice.
The reason for not having a FIFO is that the 6502's interrupt overhead is so low that it's not a problem to interrupt with every byte. It would not be very suitable for something like the 68000 though whose interrupt-response time is very, very long by comparison. The 6551's standard baud rates are 50, 75, 109.92, 134.58, 150, 300, 600, 1200, 1800, 2400, 3600, 4800, 7200, 9600, and 19,200 bps. For other speeds, including 115,200 bps, you use the x16 clock input. The 6850 OTOH lacks an onboard BRG.
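The x16 clock relationship is simple arithmetic; a quick sketch (the function name is mine, not from any datasheet):

```python
# The 6551's x16 clock input samples at 16x the bit rate, so feeding it an
# external clock of baud * 16 sets a rate outside the onboard BRG's table,
# such as 115,200 bps.
def x16_clock_for(baud: int) -> int:
    """Return the external clock frequency (Hz) needed for a given bit rate."""
    return baud * 16

print(x16_clock_for(115_200))  # 1,843,200 Hz -- the common 1.8432 MHz can oscillator
```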
Quote:
In my view, it is easier to bit-bang a serial signal than to initialize a UART....
Code: Select all
STZ ACIA_STAT ; Reset ACIA by storing 0 in its status register.
LDA #00011110B ; Set for 1 stop bit, 8 data bits, 9600 bps by
STA ACIA_CTRL ; storing the number in the control register.
LDA #00001001B ; No parity or rcv echo, RTS true, receive IRQ but no
STA ACIA_COMM ; transmit IRQ, set DTR true. Store in command register.
http://WilsonMinesCo.com/ lots of 6502 resources
The "second front page" is http://wilsonminesco.com/links.html .
What's an additional VIA among friends, anyhow?
-
CaptainCulry
- Posts: 30
- Joined: 31 Dec 2019
Re: First documentation for the SL Appliance build
BitWise wrote:
You could port the app. Its not that complex. I had a quick go this evening.
https://youtu.be/X6lun7d0JLI
I tried porting the C but there is a problem in wdc02cc that makes it crash (probably a pointer issue in the compiler itself -- I tested it on Windows 10 and XP in the virtual machine).
In the end I just wrote an assembler version of the code for my three chip 65C02 board. The combination of a low CPU speed (~1MHz) and a 19200 baud connection isn't ideal.
The UNIX code uses the curses library to abstract the terminal type. It's not really optimising the screen output -- in fact I suspect it actually generates more output than is needed.
I suspect it is generating more than is actually needed as well. I can see that when I run it in WSL and change the screen size, what it generates changes based on that screen size. Once I get something mostly working I will play around with different captures to get a capture that more closely fits the screen size of the final implementation.
- BigDumbDinosaur
- Posts: 9428
- Joined: 28 May 2009
- Location: Midwestern USA (JB Pritzker’s dystopia)
- Contact:
Re: First documentation for the SL Appliance build
GARTHWILSON wrote:
The reason for not having a FIFO is that the 6502's interrupt overhead is so low that it's not a problem to interrupt with every byte.
Quote:
It would not be very suitable for something like the 68000 though whose interrupt-response time is very, very long by comparison.
dmsc wrote:
In my view, it is easier to bit-bang a serial signal than to initialize a UART....
x86? We ain't got no x86. We don't NEED no stinking x86!
-
CaptainCulry
- Posts: 30
- Joined: 31 Dec 2019
Re: First documentation for the SL Appliance build
GARTHWILSON wrote:
I do something like
Code: Select all
STZ ACIA_STAT ; Reset ACIA by storing 0 in its status register.
LDA #00011110B ; Set for 1 stop bit, 8 data bits, 9600 bps by
STA ACIA_CTRL ; storing the number in the control register.
LDA #00001001B ; No parity or rcv echo, RTS true, receive IRQ but no
STA ACIA_COMM ; transmit IRQ, set DTR true. Store in command register.
Re: First documentation for the SL Appliance build
BigDumbDinosaur wrote:
A simple UART such as the 6551 requires little in the way of setup. A couple of writes is all it takes to set bit rate, datum format, etc. Bit-banging requires much more code, as well as careful timing, as asynchronous serial communications demands accurately-separated marks and spaces if the receiver is to output anything other than gibberish. The UART takes care of that for you, which reduces serial I/O to little more than load/store activity.
- Use a working UART.
- Use bit-banging.
- Use a W65C51, which cannot tell you when you may write another byte to the transmit register.
Using a working UART has slightly more complex address decoding and setup, but hides everything about the serial protocol timing; the complexity removed by the latter seems to more than make up for the former.
For the other two cases, you need to understand, implement and debug serial protocol timing to at least some degree. Once you've admitted that complexity to your application, I think it's easier to bring it all out into the open in an easily-debuggable way, which is what shifting bits into a latch does. Hiding large pieces of it but still having to know what's going on in those hidden parts well enough to make sure you've implemented appropriate timing to avoid writing the transmit register before the previous transmit is complete seems to me to require a fair amount of sophistication, particularly since recognizing and debugging bad output is not trivial. (Hint: describe exactly what happens on the serial output when you write the output register too early.)
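To illustrate the hint, here is a toy Python model, my own simplification rather than the real chip's behavior (a real early write garbles bits mid-frame on the wire; this model cleanly drops the in-flight byte, which is enough to show the failure mode):

```python
# Toy model of a transmit path with a single register and no write-ready
# interlock (the W65C51 situation): writing too early clobbers the byte
# still being shifted out, so the receiver sees a corrupted stream.
class ToyTx:
    def __init__(self):
        self.wire = []        # bytes as seen by the receiver
        self.shifting = None  # byte currently going out, if any
        self.bit = 0          # how many of its 8 bits have been sent

    def write(self, byte):    # CPU writes the transmit register
        self.shifting = byte  # no guard: overwrites an in-flight byte
        self.bit = 0

    def tick(self):           # one bit time passes
        if self.shifting is not None:
            self.bit += 1
            if self.bit == 8:
                self.wire.append(self.shifting)
                self.shifting = None

tx = ToyTx()
tx.write(0x41)                # 'A'
for _ in range(8):
    tx.tick()                 # wait a full byte time: 'A' arrives intact
tx.write(0x42)                # 'B'
for _ in range(3):
    tx.tick()                 # only 3 of 8 bit times elapse...
tx.write(0x43)                # ...so this early write means 'B' never finishes
for _ in range(8):
    tx.tick()
print([hex(b) for b in tx.wire])  # ['0x41', '0x43'] -- 'B' was lost
```

Debugging this on real hardware is harder than the model suggests, because the receiver sees framing errors and garbage characters rather than a tidy missing byte.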
That said, it's perfectly reasonable to go with #3 over #2 if you bring in other considerations, such as "It's worth extra cost and time in order to have two highly-integrated chips in the project instead of one" or "mentioning shifting is a political problem with my audience, so it's worth adding complexity and making it more difficult to understand if it avoids that." So long as you're clear about the tradeoffs you're making.
Curt J. Sampson - github.com/0cjs
- GARTHWILSON
- Forum Moderator
- Posts: 8774
- Joined: 30 Aug 2002
- Location: Southern California
- Contact:
Re: First documentation for the SL Appliance build
BigDumbDinosaur wrote:
GARTHWILSON wrote:
The reason for not having a FIFO is that the 6502's interrupt overhead is so low that it's not a problem to interrupt with every byte.
Quote:
dmsc wrote:
In my view, it is easier to bit-bang a serial signal than to initialize a UART....
http://WilsonMinesCo.com/ lots of 6502 resources
The "second front page" is http://wilsonminesco.com/links.html .
What's an additional VIA among friends, anyhow?
-
CaptainCulry
- Posts: 30
- Joined: 31 Dec 2019
Re: First documentation for the SL Appliance build
cjs wrote:
...There are actually three choices under consideration for this application:
- Use a working UART.
- Use bit-banging.
- Use a W65C51, which cannot tell you when you may write another byte to the transmit register.
Using a working UART has slightly more complex address decoding and setup, but hides everything about the serial protocol timing; the complexity removed by the latter seems to more than make up for the former.
For the other two cases, you need to understand, implement and debug serial protocol timing to at least some degree. Once you've admitted that complexity to your application, I think it's easier to bring it all out into the open in an easily-debuggable way, which is what shifting bits into a latch does. Hiding large pieces of it but still having to know what's going on in those hidden parts well enough to make sure you've implemented appropriate timing to avoid writing the transmit register before the previous transmit is complete seems to me to require a fair amount of sophistication, particularly since recognizing and debugging bad output is not trivial. (Hint: describe exactly what happens on the serial output when you write the output register too early.)
That said, it's perfectly reasonable to go with #3 over #2 if you bring in other considerations, such as "It's worth extra cost and time in order to have two highly-integrated chips in the project instead of one" or "mentioning shifting is a political problem with my audience, so it's worth adding complexity and making it more difficult to understand if it avoids that." So long as you're clear about the tradeoffs you're making.
- I am sending an exact known array of bytes out the UART, in the exact same order, repeatedly. It never changes.
- This device doesn't do anything at all other than that. It's a bit-spitting appliance.
- This device doesn't receive anything.
- The output only needs to look okay by eye. This is not a critical link to carbon scrubbers in the ISS.
- This device isn't going into mass production, so whether or not to include a single $7 chip, as far as chip count and cost goes, is not a concern.
Having all of the bare naked details of the UART serial protocol and its timing exposed and given special focus or attention is not actually a requirement of this build. If it were, then yes, bit-banging would be a great way to showcase and highlight those details. In this application, though, it is perfectly fine to abstract those details away in hardware. Likewise, we are also abstracting away the details of how the CPU works internally and how bits are stored in RAM and ROM. There are going to be some abstractions, somewhere, in pretty much everything we do.
- BitWise
- In Memoriam
- Posts: 996
- Joined: 02 Mar 2004
- Location: Berkshire, UK
- Contact:
Re: First documentation for the SL Appliance build
Are you building or buying the hardware for this project?
If you have a 65C02 board with an ACIA and a VIA then one of the VIA's timers can be used to generate transmit interrupts.
You could use a 65C134, a 65C02 microcontroller with built-in UARTs that don't have the 65C51 transmit bug. WDC's own W65C134SXB board looks pretty good (I have a W65C265SXB).
https://www.tindie.com/products/wdc/w65c134sxb/
In some ways I like the microcontroller versions of the 65C02 and 65C816 more than the microprocessor versions, and the WDC 134/265 boards are cheaper than the 02/816 ones. The built-in monitor ROM makes downloading assembled code in an S19 file easy for testing, and the final version can be installed in the flash ROM and made to start automatically at power-up.
And then there are always other brands of microcontroller that could do the whole thing in one small chip for a couple of dollars and are fast enough to be programmed in C (or C++) -- you'd just need to write a simple implementation of parts of the curses library to use with the ported Linux code. (I have one somewhere I wrote for a terminal based EDSAC emulator for the PIC32MX170F256B).
If students are doing this then using a modern microcontroller rather than a legacy microprocessor might be better. ESP32s are modern, fast, cheap and programmed over a serial connection from the Arduino IDE -- no expensive equipment required at all.
Andrew Jacobs
6502 & PIC Stuff - http://www.obelisk.me.uk/
Cross-Platform 6502/65C02/65816 Macro Assembler - http://www.obelisk.me.uk/dev65/
Open Source Projects - https://github.com/andrew-jacobs