6502.org Forum  Projects  Code  Documents  Tools  Forum

All times are UTC




Post new topic Reply to topic  [ 48 posts ]  Go to page Previous  1, 2, 3, 4  Next
PostPosted: Mon Jan 06, 2020 2:45 pm 

Joined: Tue Dec 31, 2019 12:30 pm
Posts: 30
rwiker wrote:
cjs wrote:
I still like the idea of using LZ4 compression. I tested with output from the sl command (which I haven't come across before, but I'm now going to install it everywhere I can), and LZ4 compressed ~49kB down to 8926 bytes. Sooo, it should be possible to implement this with a 65(c)02, a 32kB eprom, a flip-flop and a single NOR gate. That does not make for something immediately useful for anything else, of course.


I still like that too and am keeping it bookmarked. If I succeed in getting a student (or students) working on projects with this, I will suggest a project of getting it down to one ROM and a more standard memory map using the compression. That would be a nice combination of hardware and software changes for a student to take on. For now, though, since the goal is to draw students in, I think compression would make the code a little too intimidating.


PostPosted: Mon Jan 06, 2020 4:05 pm 

Joined: Sat Dec 01, 2018 1:53 pm
Posts: 727
Location: Tokyo, Japan
CaptainCulry wrote:
BigEd has hit the nail on the head. The UART is the appropriate level of abstraction/system level for this application. I need to draw students in and generate interest. There are no students already interested and signed up to do anything. I know from experience that as soon as someone says "bit shift" to a sophomore, their eyes are going to glaze over and you've lost them.

Wow, that seems really weird, given that shift registers seem to me such a basic component of any sort of digital logic system that EEs and CEs might be using, and shifting is such a basic part of computer programming. But hey, whatever works.

_________________
Curt J. Sampson - github.com/0cjs


PostPosted: Mon Jan 06, 2020 4:21 pm 

Joined: Thu Dec 11, 2008 1:28 pm
Posts: 10975
Location: England
On the subject of compression, the version of 'sl' I have (which produces only 40k of output) emits just 48 distinct byte values, so 6 bits per byte is enough. The value distribution is quite skewed too, so either full-on Huffman coding or a hand-rolled variable-length code might be helpful. That's three possible compression schemes, all relatively straightforward, before you get to the heavy-hitting LZ4.

(There used to be utilities called 'pack' and 'compact' ... see https://retrocomputing.stackexchange.com/a/11415 - 'pack' uses a static Huffman table whereas 'compact' is adaptive, I think.)
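
The 6-bits-per-byte idea is easy to prototype on the host side before committing to 6502 code. Here's a quick sketch (the function name is my own invention): it builds a code table from the distinct byte values and packs the 6-bit codes four-per-three-output-bytes.

```python
def pack6(data: bytes) -> tuple[bytes, bytes]:
    """Map each distinct byte to a 6-bit code and bit-pack the codes.

    Returns (packed stream, code table). Only valid when the input has
    at most 64 distinct byte values.
    """
    alphabet = sorted(set(data))
    assert len(alphabet) <= 64, "6 bits only covers 64 distinct values"
    code = {b: i for i, b in enumerate(alphabet)}
    bits = 0       # bit accumulator
    nbits = 0      # number of valid bits currently in the accumulator
    out = bytearray()
    for b in data:
        bits = (bits << 6) | code[b]
        nbits += 6
        while nbits >= 8:          # flush whole bytes as they fill
            nbits -= 8
            out.append((bits >> nbits) & 0xFF)
    if nbits:                      # pad the final partial byte with zeros
        out.append((bits << (8 - nbits)) & 0xFF)
    return bytes(out), bytes(alphabet)

packed, table = pack6(b"hello world, hello world")
print(len(packed))  # 18 -- 24 input bytes at 6/8 the size
```

The 6502-side decoder then only needs the (at most) 64-entry table plus a loop that unpacks three ROM bytes back into four codes.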


PostPosted: Mon Jan 06, 2020 8:26 pm 

Joined: Thu May 28, 2009 9:46 pm
Posts: 8479
Location: Midwestern USA
BigEd wrote:
Systems thinking is a good skill, and a UART is a good level of system component.

Indeed. If memory correctly serves me, a single-chip UART was on the market in 1970 or 1971 to act as a basic system interface device.

In making the decision to bit-bang or go with a UART, I think the question to be answered is how much bare metal can you tolerate? I put this in the same category as to whether to use a can oscillator as a clock generator or roll-your-own with a crystal and supporting components. My perspective is I want to build a house but don't wish to manufacture bricks and fell trees to make framing lumber. So I use UARTs and can oscillators.

_________________
x86?  We ain't got no x86.  We don't NEED no stinking x86!


PostPosted: Mon Jan 06, 2020 8:50 pm 

Joined: Tue Dec 31, 2019 12:30 pm
Posts: 30
BigDumbDinosaur wrote:
BigEd wrote:
Systems thinking is a good skill, and a UART is a good level of system component.

Indeed. If memory correctly serves me, a single-chip UART was on the market in 1970 or 1971 to act as a basic system interface device.

In making the decision to bit-bang or go with a UART, I think the question to be answered is how much bare metal can you tolerate? I put this in the same category as to whether to use a can oscillator as a clock generator or roll-your-own with a crystal and supporting components. My perspective is I want to build a house but don't wish to manufacture bricks and fell trees to make framing lumber. So I use UARTs and can oscillators.


This question comes up so often in academia when it comes to student projects, and it's even more important when it comes to coursework. The decision is typically made project by project to determine a reasonable scope. The goal is to find the sweet spot that provides a reasonable challenge somewhere between growing and doping our own silicon wafers and just buying a product that already does what we want. Neither end of that spectrum is acceptable, so just where in between to land is the usual question.


PostPosted: Mon Jan 06, 2020 11:49 pm 

Joined: Mon Sep 17, 2018 2:39 am
Posts: 138
Hi!

CaptainCulry wrote:
BigDumbDinosaur wrote:
BigEd wrote:
Systems thinking is a good skill, and a UART is a good level of system component.

Indeed. If memory correctly serves me, a single-chip UART was on the market in 1970 or 1971 to act as a basic system interface device.

In making the decision to bit-bang or go with a UART, I think the question to be answered is how much bare metal can you tolerate? I put this in the same category as to whether to use a can oscillator as a clock generator or roll-your-own with a crystal and supporting components. My perspective is I want to build a house but don't wish to manufacture bricks and fell trees to make framing lumber. So I use UARTs and can oscillators.


This question comes up so often in academia when it comes to student projects, and it's even more important when it comes to coursework. The decision is typically made project by project to determine a reasonable scope. The goal is to find the sweet spot that provides a reasonable challenge somewhere between growing and doping our own silicon wafers and just buying a product that already does what we want. Neither end of that spectrum is acceptable, so just where in between to land is the usual question.


You should also consider *which* UART you plan to use. The 65C51 is known to have a bug, it is limited in the baud rates it supports, and it has no FIFO, so I don't think it is the easy or extensible choice. In my view, it is easier to bit-bang a serial signal than to initialize a UART....

A UART shines when you can use interrupts or DMA for the transfers: then you can do processing at the same time as sending/receiving data, though managing that is much more complicated.


PostPosted: Tue Jan 07, 2020 12:40 am 

Joined: Tue Mar 02, 2004 8:55 am
Posts: 996
Location: Berkshire, UK
You could port the app. It's not that complex. I had a quick go this evening.

https://youtu.be/X6lun7d0JLI

I tried porting the C but there is a problem in wdc02cc that makes it crash (probably a pointer issue in the compiler itself -- I tested it on Windows 10 and XP in the virtual machine).

In the end I just wrote an assembler version of the code for my three chip 65C02 board. The combination of a low CPU speed (~1MHz) and a 19200 baud connection isn't ideal.

The UNIX code uses the curses library to abstract the terminal type. It's not really optimising the screen output -- in fact I suspect it generates more output than is actually needed.

_________________
Andrew Jacobs
6502 & PIC Stuff - http://www.obelisk.me.uk/
Cross-Platform 6502/65C02/65816 Macro Assembler - http://www.obelisk.me.uk/dev65/
Open Source Projects - https://github.com/andrew-jacobs


PostPosted: Tue Jan 07, 2020 1:37 am 

Joined: Fri Aug 30, 2002 1:09 am
Posts: 8538
Location: Southern California
dmsc wrote:
The 65C51 is known to have a bug, it is limited in the baud rates it supports, and it has no FIFO, so I don't think it is the easy or extensible choice.

The reason for not having a FIFO is that the 6502's interrupt overhead is so low that it's not a problem to interrupt with every byte. It would not be very suitable for something like the 68000 though whose interrupt-response time is very, very long by comparison. The 6551's standard baud rates are 50, 75, 109.92, 134.58, 150, 300, 600, 1200, 1800, 2400, 3600, 4800, 7200, 9600, and 19,200 bps. For other speeds, including 115,200 bps, you use the x16 clock input. The 6850 OTOH lacks an onboard BRG.
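
The x16-clock arithmetic is just a multiply: feed the clock input 16 times the target bit rate. A trivial sketch of picking the external clock for a non-standard rate:

```python
# The 6551's x16 clock input wants 16x the target bit rate.
def x16_clock_hz(baud: int) -> int:
    return 16 * baud

print(x16_clock_hz(115_200))  # 1843200 -- i.e. a 1.8432 MHz oscillator
print(x16_clock_hz(9_600))    # 153600
```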

Quote:
In my view, it is easier to bit-bang a serial signal than to initialize a UART....
I do something like
Code:
        STZ  ACIA_STAT   ; Reset ACIA by storing 0 in its status register.

        LDA  #00011110B  ; Set for 1 stop bit, 8 data bits, 9600 bps by
        STA  ACIA_CTRL   ; storing the number in the control register.

        LDA  #00001001B  ; No parity or rcv echo, RTS true, receive IRQ but no
        STA  ACIA_COMM   ; transmit IRQ, set DTR true.  Store in command register.
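
As a cross-check of those two magic numbers, here is a small host-side decode of the bit fields, per my reading of the 6551 data sheet (the `baud_codes` table is abbreviated to a few entries):

```python
# Sanity-check the 6551 setup values against the data sheet field layout.
CTRL = 0b00011110   # control register value from the listing above
COMM = 0b00001001   # command register value from the listing above

baud_codes = {0b1000: 1200, 0b1110: 9600, 0b1111: 19200}  # partial SBR table

print(baud_codes[CTRL & 0x0F])   # 9600  -- bits 0-3 select the bit rate
print((CTRL >> 5) & 0b11)        # 0     -- word length code 00 = 8 data bits
print((CTRL >> 7) & 1)           # 0     -- 0 = 1 stop bit
print((COMM >> 1) & 1)           # 0     -- 0 = receiver IRQ enabled
print(COMM & 1)                  # 1     -- 1 = DTR asserted
```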

_________________
http://WilsonMinesCo.com/ lots of 6502 resources
The "second front page" is http://wilsonminesco.com/links.html .
What's an additional VIA among friends, anyhow?


PostPosted: Tue Jan 07, 2020 2:01 am 

Joined: Tue Dec 31, 2019 12:30 pm
Posts: 30
BitWise wrote:
You could port the app. It's not that complex. I had a quick go this evening.

https://youtu.be/X6lun7d0JLI

I tried porting the C but there is a problem in wdc02cc that makes it crash (probably a pointer issue in the compiler itself -- I tested it on Windows 10 and XP in the virtual machine).

In the end I just wrote an assembler version of the code for my three chip 65C02 board. The combination of a low CPU speed (~1MHz) and a 19200 baud connection isn't ideal.

The UNIX code uses the curses library to abstract the terminal type. It's not really optimising the screen output -- in fact I suspect it generates more output than is actually needed.


Dude, that just about got it. As far as how that ran, that's really all it needs to do. Assembly, especially on 8-bit machines like this, is far from my strong suit. I'm much more comfortable in C on an ARM Cortex M, so that is why I went the capture route as opposed to trying to port the code.

I suspect it is generating more than is actually needed as well. I can see that when I run it in WSL and change the screen size, what it generates changes based on that screen size. Once I get something mostly working I will play around with different captures to get a capture that more closely fits the screen size of the final implementation.


PostPosted: Tue Jan 07, 2020 2:11 am 

Joined: Thu May 28, 2009 9:46 pm
Posts: 8479
Location: Midwestern USA
GARTHWILSON wrote:
The reason for not having a FIFO is that the 6502's interrupt overhead is so low that it's not a problem to interrupt with every byte.

Also, it's a single-channel UART, so the MPU isn't likely to be buried under interrupts, even at the maximum "official" bit rate of 19,200. Running it at 115.2 Kbps would hammer the MPU with IRQs (specifically, 23,040 per second with CBAT going), which is where a FIFO becomes of value.
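
The 23,040 figure follows directly from the 10-bit async frame (start bit + 8 data bits + stop bit) running in both directions:

```python
# Where 23,040 IRQs/s comes from: one interrupt per 10-bit async frame
# (start + 8 data + stop) at 115,200 bps, full duplex.
BITS_PER_FRAME = 1 + 8 + 1
frames_per_sec = 115_200 // BITS_PER_FRAME   # 11520 per direction
print(frames_per_sec * 2)                    # 23040 with both directions busy
```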

Quote:
It would not be very suitable for something like the 68000 though whose interrupt-response time is very, very long by comparison.

That's also true of the x86 architecture, although extremely high core clock rates mask that to some extent. The slowness of x86 interrupt response is what led to the development of the 16550 UART, which has a FIFO.

dmsc wrote:
In my view, it is easier to bit-bang a serial signal than to initialize a UART....

I disagree. A simple UART such as the 6551 requires little in the way of setup. A couple of writes is all it takes to set bit rate, datum format, etc. Bit-banging requires much more code, as well as careful timing, as asynchronous serial communications demands accurately-separated marks and spaces if the receiver is to output anything other than gibberish. The UART takes care of that for you, which reduces serial I/O to little more than load/store activity.

_________________
x86?  We ain't got no x86.  We don't NEED no stinking x86!


PostPosted: Tue Jan 07, 2020 2:19 am 

Joined: Tue Dec 31, 2019 12:30 pm
Posts: 30
GARTHWILSON wrote:
I do something like
Code:
        STZ  ACIA_STAT   ; Reset ACIA by storing 0 in its status register.

        LDA  #00011110B  ; Set for 1 stop bit, 8 data bits, 9600 bps by
        STA  ACIA_CTRL   ; storing the number in the control register.

        LDA  #00001001B  ; No parity or rcv echo, RTS true, receive IRQ but no
        STA  ACIA_COMM   ; transmit IRQ, set DTR true.  Store in command register.


Yeah, this is about what I thought, except I didn't quite realize I had to zero the status register first and I didn't know about the STZ instruction until you posted that. I've apparently been looking at the list of 6502 op codes instead of the list of 65c02 op codes.


PostPosted: Tue Jan 07, 2020 3:01 am 

Joined: Sat Dec 01, 2018 1:53 pm
Posts: 727
Location: Tokyo, Japan
BigDumbDinosaur wrote:
A simple UART such as the 6551 requires little in the way of setup. A couple of writes is all it takes to set bit rate, datum format, etc. Bit-banging requires much more code, as well as careful timing, as asynchronous serial communications demands accurately-separated marks and spaces if the receiver is to output anything other than gibberish. The UART takes care of that for you, which reduces serial I/O to little more than load/store activity.

It seems I was perhaps not entirely clear in my previous post on this topic. There are actually three choices under consideration for this application:
  1. Use a working UART.
  2. Use bit-banging.
  3. Use a W65C51, which cannot tell you when you may write another byte to the transmit register.

In my opinion, this is in order of easiest to most difficult for this particular application (particularly taking into account that it's transmit-only) and audience.

Using a working UART has slightly more complex address decoding and setup, but hides everything about the serial protocol timing; the complexity removed by the latter seems to more than make up for the former.

For the other two cases, you need to understand, implement and debug serial protocol timing to at least some degree. Once you've admitted that complexity to your application, I think it's easier to bring it all out into the open in an easily-debuggable way, which is what shifting bits into a latch does. Hiding large pieces of it but still having to know what's going on in those hidden parts well enough to make sure you've implemented appropriate timing to avoid writing the transmit register before the previous transmit is complete seems to me to require a fair amount of sophistication, particularly since recognizing and debugging bad output is not trivial. (Hint: describe exactly what happens on the serial output when you write the output register too early.)

That said, it's perfectly reasonable to go with #3 over #2 if you bring in other considerations, such as "It's worth extra cost and time in order to have two highly-integrated chips in the project instead of one" or "mentioning shifting is a political problem with my audience, so it's worth adding complexity and making it more difficult to understand if it avoids that." So long as you're clear about the tradeoffs you're making.

_________________
Curt J. Sampson - github.com/0cjs


PostPosted: Tue Jan 07, 2020 5:20 am 

Joined: Fri Aug 30, 2002 1:09 am
Posts: 8538
Location: Southern California
BigDumbDinosaur wrote:
GARTHWILSON wrote:
The reason for not having a FIFO is that the 6502's interrupt overhead is so low that it's not a problem to interrupt with every byte.

Also, it's a single-channel UART, so the MPU isn't likely to be buried under interrupts, even at the maximum "official" bit rate of 19,200. Running it at 115.2 Kbps would hammer the MPU with IRQs (specifically, 23,040 per second with CBAT going)

True, a 1MHz '02 would have its hands pretty full at 23,040 interrupts per second. I've done well over 100,000 per second though at 5MHz.

Quote:
dmsc wrote:
In my view, it is easier to bit-bang a serial signal than to initialize a UART....

I disagree. A simple UART such as the 6551 requires little in the way of setup. A couple of writes is all it takes to set bit rate, datum format, etc. Bit-banging requires much more code, as well as careful timing, as asynchronous serial communications demands accurately-separated marks and spaces

Bit-banging asynchronous serial cannot be interrupted either. (Bit-banging synchronous serial like SPI can.)
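
One way to see why bit-banged async can't tolerate interruptions is the cycle budget per bit. A quick back-of-the-envelope calculation (clock speeds chosen for illustration):

```python
# CPU cycles available per serial bit when bit-banging: every bit edge
# must land on time, and the whole budget is consumed by the bit loop.
def cycles_per_bit(cpu_hz: int, baud: int) -> float:
    return cpu_hz / baud

print(round(cycles_per_bit(1_000_000, 9_600), 1))    # 104.2 at 1 MHz / 9600
print(round(cycles_per_bit(5_000_000, 115_200), 1))  # 43.4 at 5 MHz / 115.2k
```

Any interrupt taken mid-byte eats into that budget and skews the bit timing, which is exactly why the routine must run with interrupts masked.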

_________________
http://WilsonMinesCo.com/ lots of 6502 resources
The "second front page" is http://wilsonminesco.com/links.html .
What's an additional VIA among friends, anyhow?


PostPosted: Wed Jan 08, 2020 12:43 pm 

Joined: Tue Dec 31, 2019 12:30 pm
Posts: 30
cjs wrote:
...There are actually three choices under consideration for this application:
  1. Use a working UART.
  2. Use bit-banging.
  3. Use a W65C51, which cannot tell you when you may write another byte to the transmit register.

In my opinion, this is in order of easiest to most difficult for this particular application (particularly taking into account that it's transmit-only) and audience.


Using a working UART has slightly more complex address decoding and setup, but hides everything about the serial protocol timing; the complexity removed by the latter seems to more than make up for the former.

For the other two cases, you need to understand, implement and debug serial protocol timing to at least some degree. Once you've admitted that complexity to your application, I think it's easier to bring it all out into the open in an easily-debuggable way, which is what shifting bits into a latch does. Hiding large pieces of it but still having to know what's going on in those hidden parts well enough to make sure you've implemented appropriate timing to avoid writing the transmit register before the previous transmit is complete seems to me to require a fair amount of sophistication, particularly since recognizing and debugging bad output is not trivial. (Hint: describe exactly what happens on the serial output when you write the output register too early.)

That said, it's perfectly reasonable to go with #3 over #2 if you bring in other considerations, such as "It's worth extra cost and time in order to have two highly-integrated chips in the project instead of one" or "mentioning shifting is a political problem with my audience, so it's worth adding complexity and making it more difficult to understand if it avoids that." So long as you're clear about the tradeoffs you're making.


Your very strongly felt opinion, and very strenuous objection to the W65C51, have been noted. The random reader who comes along and reads this thread should probably heed those warnings, since their implementation is unlikely to share a few key details with mine. Those details are:

  • I am sending an exact, known array of bytes out the UART, in the same order, repeatedly. It never changes.
  • This device doesn't do anything at all other than that. It's a bit-spitting appliance.
  • This device doesn't receive anything.
  • The output only needs to look okay by eye. This is not a critical link to carbon scrubbers on the ISS.
  • This device isn't going into mass production, so whether or not to include a single $7 chip, as far as chip count and cost go, is not a concern.

I agree that trying to determine if you are stomping on the W65C51's transmit register too early would be a bit of a bear to debug in a normal situation. In a normal situation you likely have no actual record of exactly what bytes were intended and may not even have much control over when the transmission was supposed to happen. But taking into consideration all the details of this exact application, it will be trivial. Just compare a capture from a logic analyzer to the byte array. If necessary I can capture the data bus too. This is not a very difficult task in this application.

Exposing all of the bare details of the UART serial protocol and its timing, and giving them special focus, is not a stipulation or requirement of this build. If it were, then yes, bit-banging would be a great way to showcase and highlight those details. In this application, though, it is perfectly fine to abstract them away in hardware. Likewise, we are also abstracting away the details of how the CPU works internally and how bits are stored in RAM and ROM. There are going to be abstractions, somewhere, in pretty much everything we do.
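
The logic-analyzer check described above is trivial to script on the host. A sketch, where `EXPECTED` stands in for the real animation data (the hex here is only a placeholder) and the capture has already been decoded to bytes:

```python
# Since the device repeats one fixed byte array forever, a decoded capture
# should simply be that array repeated; report where it first diverges.
EXPECTED = bytes.fromhex("1b5b324a")  # placeholder for the real data

def first_mismatch(capture: bytes, expected: bytes):
    """Return the index of the first wrong byte, or None if all match."""
    for i, b in enumerate(capture):
        if b != expected[i % len(expected)]:
            return i
    return None

print(first_mismatch(EXPECTED * 3, EXPECTED))        # None -> clean run
print(first_mismatch(EXPECTED + b"\x00", EXPECTED))  # 4 -> stomped byte
```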


PostPosted: Wed Jan 08, 2020 1:19 pm 

Joined: Tue Mar 02, 2004 8:55 am
Posts: 996
Location: Berkshire, UK
Are you building or buying the hardware for this project?

If you have a 65C02 board with an ACIA and a VIA then one of the VIA's timers can be used to generate transmit interrupts.

You could use a 65C134, a 65C02 microcontroller with a built-in UART that doesn't have the 65C51 transmit bug. WDC's own W65C134SXB board looks pretty good (I have a W65C265SXB).

https://www.tindie.com/products/wdc/w65c134sxb/

In some ways I like the microcontroller versions of the 65C02 and 65C816 more than the microprocessor versions, and the WDC 134/265 boards are cheaper than the 02/816 ones. The built-in monitor ROM makes downloading assembled code as an S19 file easy for testing, and the final version can be installed in the flash ROM and made to start automatically at power up.

And then there are always other brands of microcontroller that could do the whole thing in one small chip for a couple of dollars and are fast enough to be programmed in C (or C++) -- you'd just need to write a simple implementation of parts of the curses library to use with the ported Linux code. (I have one somewhere that I wrote for a terminal-based EDSAC emulator on the PIC32MX170F256B.)

If students are doing this then using a modern microcontroller rather than a legacy microprocessor might be better. ESP32s are modern, fast, cheap and programmed over a serial connection from the Arduino IDE -- no expensive equipment required at all.

_________________
Andrew Jacobs
6502 & PIC Stuff - http://www.obelisk.me.uk/
Cross-Platform 6502/65C02/65816 Macro Assembler - http://www.obelisk.me.uk/dev65/
Open Source Projects - https://github.com/andrew-jacobs

