brk flag useless?

For discussing the 65xx hardware itself or electronics projects.
User avatar
BigDumbDinosaur
Posts: 9428
Joined: 28 May 2009
Location: Midwestern USA (JB Pritzker’s dystopia)
Contact:

Re: brk flag useless?

Post by BigDumbDinosaur »

ElEctric_EyE wrote:
Permit me to express my opinion and thereby also reveal my ignorance on interrupts: I dislike interrupts. I've never used them. They seem to have been made for slow machines.

Dunno about that. The quad-core AMD Opteron-powered Linux server behind my desk is very fast (about 22,000 BogoMips), and it uses interrupts...lots of them. :) The entire I/O system is interrupt-driven, timekeeping is interrupt-driven, task scheduling is interrupt-driven, and on and on it goes.

Quote:
...A fast enough 6502 doesn't need interrupts does it? It should have sufficient time to poll all flags/inputs within the system.

It depends on what the system is expected to do. The problem with not having an interrupt system is that asynchronous I/O can become quite inefficient, as the MPU has to constantly check all possible I/O sources for activity, which reduces the time available to execute other tasks, such as run a user application. Also, you'd have to be constantly saving the state of the MPU as it went from device to device looking for activity to service. I could see where it could become a programming nightmare.

If interrupts are involved, polling doesn't have to occur at all until something demands attention and the ISR can be designed to preserve the MPU state so the main task continues without a hitch after the interrupt has been processed. I think you'd find it cumbersome to work with a general purpose computer that didn't have a means of reacting asynchronously to external events (e.g., typing on a keyboard). That is where interrupts can be your friend.
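A minimal sketch of the idea in 65C02 assembly (the VIA addresses and the ZP location are hypothetical, chosen just for illustration):

```asm
; Hypothetical addresses, for illustration only:
VIA_IFR  = $7FFD        ; 65C22 interrupt flag register
VIA_T1CL = $7FF4        ; 65C22 timer 1 counter, low byte
ticks    = $10          ; ZP counter the foreground task can read

irq:     pha            ; save only what the handler disturbs
         bit VIA_IFR    ; N = "any enabled source active" (bit 7)
         bpl done       ; not this VIA's interrupt
         lda VIA_T1CL   ; reading T1's counter clears the T1 flag
         inc ticks      ; the actual service work
done:    pla            ; restore A...
         rti            ; ...and resume the foreground, untouched
```

The foreground program never sees any of this; it just runs, and `ticks` advances behind its back.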
x86?  We ain't got no x86.  We don't NEED no stinking x86!
User avatar
BigDumbDinosaur
Posts: 9428
Joined: 28 May 2009
Location: Midwestern USA (JB Pritzker’s dystopia)
Contact:

Re: brk flag useless?

Post by BigDumbDinosaur »

BigEd wrote:
Hi BDD surely it's a valid point that A can be restored from ZP within a few instructions, and so long as that happens before CLI, all is well.

No argument there, except for the fact that the ZP location cannot be used by any foreground task unless interrupt processing is suspended. With ZP being as valuable as it is, that seems to be a counterproductive limitation to impose on the programmer.
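For reference, the pattern under discussion looks roughly like this (ZP address hypothetical); the catch pointed out above is that `irq_a` is then burned for the handler's exclusive use:

```asm
irq_a   = $FF           ; hypothetical ZP byte reserved for the ISR

irq:    sta irq_a       ; stash A in ZP instead of pushing it
        ; ... service the interrupt ...
        lda irq_a       ; restore A just before returning
        rti
```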
x86?  We ain't got no x86.  We don't NEED no stinking x86!
whartung
Posts: 1004
Joined: 13 Dec 2003

Re: brk flag useless?

Post by whartung »

It's interesting, especially since today, even at the high level and outside of the kernel, much software is almost completely interrupt-driven.

There are always exceptions, but, as an example, modern GUIs are driven off of an event queue that is filled by interrupts. Various input device activities (key down, mouse move, etc.) post event structures on a queue, and the GUI code dispatches off of that. Back in the day, screen repaints were queue based as well.

On servers, the buzzword today is "asynchronous", where the OS waits on socket events, which are then dispatched to the appropriate tasks. Async, event-driven programming is becoming very mainstream right now, in many domains.
User avatar
Arlet
Posts: 2353
Joined: 16 Nov 2010
Location: Gouda, The Netherlands
Contact:

Re: brk flag useless?

Post by Arlet »

ElEctric_EyE wrote:
A fast enough 6502 doesn't need interrupts does it? It should have sufficient time to poll all flags/inputs within the system.
That completely depends on the application. If you need a quick and predictable response to an external event, you probably need interrupts. I've worked on systems where the maximum allowed response latency was 1 microsecond. It's not practically possible to poll flags at that rate.
rwiker
Posts: 294
Joined: 03 Mar 2011

Re: brk flag useless?

Post by rwiker »

Arlet wrote:
ElEctric_EyE wrote:
A fast enough 6502 doesn't need interrupts does it? It should have sufficient time to poll all flags/inputs within the system.
That completely depends on the application. If you need a quick and predictable response to an external event, you probably need interrupts. I've worked on systems where the maximum allowed response latency was 1 microsecond. It's not practically possible to poll flags at that rate.
Use of interrupts can also make for much more power-efficient systems than polling. For example, in 1990 or so I designed a portable data logger that would put the processor (or most of it, anyway) to sleep and only wake up to service interrupts. That way, the datalogger could run comfortably for a week on a 9V battery; if I had chosen to use polling, it would only have lasted for a few hours.
User avatar
Arlet
Posts: 2353
Joined: 16 Nov 2010
Location: Gouda, The Netherlands
Contact:

Re: brk flag useless?

Post by Arlet »

Another good use of interrupts is on a system with external user code. On a PC, you can't expect every application to be nice enough to poll the hardware at the required speed, or even know what polling frequency is required.
Tor
Posts: 597
Joined: 10 Apr 2011
Location: Norway/Japan

Re: brk flag useless?

Post by Tor »

ElEctric_EyE wrote:
A fast enough 6502 doesn't need interrupts does it? It should have sufficient time to poll all flags/inputs within the system.
Well.. imagine if you have a very simple real-time clock in your system, maybe it's just an accurate counter which gives a tick every second. Now imagine if a program could just read a certain memory location to read the current time.. to have the correct time in that memory location you could have a little interrupt handler and use the one-per-second tick as an interrupt. So the interrupt handler would count up the time and maintain that memory region.
It would be inconvenient to have to program that as a poll in every program you write. Although I understand that was basically how the first Macs were programmed.. all user applications had to include a regime which made sure that all the housekeeping was done.
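A sketch of that tick handler (ZP addresses hypothetical, and acknowledging the tick source is hardware-specific, so it's omitted): the once-per-second interrupt does the counting, and every program simply reads `seconds` whenever it likes.

```asm
seconds = $10           ; hypothetical ZP time-of-day counters
minutes = $11

tick_irq:
        pha
        inc seconds
        lda seconds
        cmp #60         ; roll 60 seconds into a minute
        bne done
        stz seconds     ; 65C02 STZ; use LDA #0 / STA on NMOS
        inc minutes     ; (hours etc. would carry on the same way)
done:   pla
        rti
```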

There are interrupt-less chips out there, the Parallax Propeller for example. But it can be interrupt-less because it has eight cores, and the idea is that instead of using interrupts to handle everything you just dedicate one core to each part (or one core could poll a number of housekeeping jobs, and then you don't have to think about that in your 'main' program).

For a single-chip/core system it's difficult to do without interrupts as soon as there are any real-time issues at hand. There may be a lot of time between when action is needed, but if it then has to be handled _very quickly_ you would have to poll very often.. even if nothing happened 99.999% of the time. Add a few of those and there's nothing left for processing, even if the processor is fast.

So, two issues really.. do you really want to have to include all the necessary polling in every trivial program? And can you handle real-time demands just with polling, without wasting most of the processor's time?

Edit: Ah, I didn't notice there was already a page-full of replies to that question.. :)

-Tor
Last edited by Tor on Fri Aug 16, 2013 9:22 am, edited 1 time in total.
User avatar
BigEd
Posts: 11464
Joined: 11 Dec 2008
Location: England
Contact:

Re: brk flag useless?

Post by BigEd »

Hi EEye, for an extreme case, reading and writing floppy disks can take one interrupt per byte, and that turns out to give very few cycles to handle each byte - you absolutely have to keep up with the disk.

There's some info at http://www.stardot.org.uk/forums/viewto ... =3&p=33412 which contains a pointer to the rather good http://beebwiki.jonripley.com/Quad_dens ... isc_access which is unfortunately offline at present. (Edit: but now found here)

Similarly, you might want to allow for keyboard input (or serial input) - catching events in an IRQ handler and placing them in a buffer is far better than writing all your programs to poll often enough not to miss anything. If your word processor is counting words or reformatting a document, you want to be able to cancel the operation.
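That buffering scheme in sketch form (port address and buffer locations hypothetical): the handler stuffs the byte, and the foreground drains it whenever convenient.

```asm
KBD_DATA = $7F00        ; hypothetical keyboard data port
head     = $10          ; buffer indices in ZP (hypothetical)
tail     = $11
buf      = $0200        ; 256-byte circular buffer

kbd_irq: pha
         phx            ; 65C02 push X; use TXA/PHA on NMOS
         lda KBD_DATA   ; reading the port typically clears its IRQ
         ldx head
         sta buf,x      ; store the keystroke...
         inc head       ; ...and advance (wraps at 256 for free)
         plx
         pla
         rti

getc:    ldx tail       ; foreground: fetch the next key
         cpx head
         beq getc       ; empty -- spin (or go do something else)
         lda buf,x
         inc tail
         rts
```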

Cheers
Ed
Last edited by BigEd on Sun May 30, 2021 3:07 am, edited 1 time in total.
User avatar
Arlet
Posts: 2353
Joined: 16 Nov 2010
Location: Gouda, The Netherlands
Contact:

Re: brk flag useless?

Post by Arlet »

Another good example is making simple beeps with a speaker connected to an I/O pin. On a system with fast (and predictable) interrupt handling, it is feasible to use a timer interrupt and a general-purpose I/O pin to have the CPU do this manually, while still doing other tasks at the same time. Without interrupts, it is basically impossible.
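A sketch of the technique for a 65C22 (register addresses hypothetical): T1 free-runs, interrupting at twice the tone frequency, and the handler flips a pin.

```asm
VIA_ORA  = $7FF1        ; hypothetical 65C22 output register A
VIA_T1CL = $7FF4        ; timer 1 counter, low byte

t1_irq:  pha
         lda VIA_T1CL   ; clear the T1 interrupt flag
         lda VIA_ORA
         eor #$01       ; toggle PA0: square wave at half the IRQ rate
         sta VIA_ORA
         pla
         rti
```

(A 65C22 can also toggle PB7 automatically in T1 free-run mode with no interrupt at all, but doing it in the handler frees the choice of pin.)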
User avatar
GARTHWILSON
Forum Moderator
Posts: 8775
Joined: 30 Aug 2002
Location: Southern California
Contact:

Re: brk flag useless?

Post by GARTHWILSON »

Quote:
almost any non-trivial IRQ handler will end up using .A for something else (distinguishing interrupt events in a 65C22, for example),
Depending on which interrupt(s) are enabled, you might be able to distinguish with the BIT instruction, without using A.  Bits 6 and 7 can be tested this way without involving the accumulator.  Bit 7 tells if any of the enabled interrupts are active, and bit 6 tells if it was a T1 time-out.  If you only had one interrupt source enabled in the particular 65c22, there's no question as to which one it is.  If you have two, and one is the T1 time-out, you can still tell, since it's either T1 or the other one, whatever it is.  It is rare to have more than one or two interrupt sources enabled at once on the same 65c22.  The actual servicing of the interrupt will usually require the use of A for something else anyway, though.
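In code, that test looks like this (the VIA address is hypothetical):

```asm
VIA_IFR = $7FFD         ; hypothetical 65C22 interrupt flag register

irq:    bit VIA_IFR     ; N := IFR bit 7, V := IFR bit 6
        bpl not_ours    ; bit 7 clear: no enabled source on this VIA
        bvs t1_svc      ; bit 6 set: it was the T1 time-out
        bra other_svc   ; only other enabled source, whatever it is
```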

Quote:
and thereby also reveal my ignorance on interrupts: I dislike interrupts. I've never used them. They seem to have been made for slow machines.

I delayed in posting this response because I realized I had a small error in my RMS jitter numbers that I needed to correct on my potpourri web page.  Some of what's below has been at least partially addressed above but I'll post it anyway as it may still be helpful.

In the case of something like a printer port or RS-232 port, there may be lots of leeway in timing to service the interrupt after it hits, and you may do just fine polling it once in a while instead of using the interrupts.  As long as you take care of it before the next one comes in, you're fine.  The requirements are pretty loose.  Actually, in the case of a printer port, you can delay service as long as you want, and it won't have any significant effect, since the printer will patiently wait for the next data.  It will probably still be printing data from its buffer while it is waiting anyway.

However, what might be a problem is that every single program you write now has to have polling places in it to check the port or other peripheral that might need service at any time.  If you use interrupts, then once your interrupt service is set up, the next program you load can be oblivious to the servicing.  The only other way around this is to have a multitasking system where a task services the port.  That is still not suitable for fast response times though, and, if you have preëmptive multitasking, it will use...here it is...timer interrupts!

When I do the audio sampling however (or any fast analog sampling-- it doesn't have to be audio), the exactness of the timing is very critical, so you can't just do another sample just any ol' time it's convenient within a wide window of time.  The exactness of the timing is more important than the amount of response time delay.  Jitter is how much the sample timing "rattles around" on each side of the exact ideal time.  Jitter causes audio distortion and noise, so it is desirable to minimize it.  In professional digital recording equipment where they want 16-bit (or more) performance to 20kHz, they go to great lengths to minimize jitter in the sampling clocks.

In the case of the 65c22 using the time-out period of T1 (timer 1) to produce a train of very evenly spaced interrupts, the jitter of the crystal-controlled system clock that runs it, and the aperture jitter (i.e., the timing sloppiness of the sample-and-hold circuit in the A/D converter) can easily be dwarfed by the fact that there will be a variable number of clock cycles left to finish the currently executing instruction before the interrupt sequence can start.  So with interrupts on a 6502 system, the RMS jitter, for just recording, or just playback, is about 1.8 clocks, and 2.6 clocks for the entire record-play combination, which is a little over one-half the average instruction time.  (Wait states would make it worse, which is a reason I don't want to use them.)

Without interrupts, there is no way of achieving that kind of timing accuracy, let alone while running a program that does something else useful, unless you use separate buffer circuitry that clocks the samples in and out at an independently controlled rate so µP instruction length and progress won't affect it.  Buffer circuitry like this would often be used with DMA, which further complicates things.

The number of bits of resolution in your A/D or D/A converter refers to voltage precision (and hopefully initial accuracy too), while the jitter refers to timing accuracy.  Jitter has the effect of reducing the number of bits of accuracy (particularly in high frequencies), thus producing noise and distortion.  This diagram illustrates why:

JitterGraphic.jpg
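The effect can be put quantitatively with the standard jitter-SNR approximation: for a full-scale sine wave of frequency $f$ sampled with RMS jitter $t_j$, the best achievable signal-to-noise ratio, and the effective number of bits it implies, are

```latex
\mathrm{SNR}_{\text{jitter}} = -20\log_{10}\!\bigl(2\pi f\, t_j\bigr)\ \text{dB},
\qquad
\mathrm{ENOB} \approx \frac{\mathrm{SNR} - 1.76\ \text{dB}}{6.02\ \text{dB}}
```

so jitter hurts most at the top of the frequency band, exactly as the diagram shows.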

I cover this matter of jitter and effective number of bits further on the circuit potpourri page at http://wilsonminesco.com/6502primer/potpourri.html#JIT (where I just made some corrections in the calculations, since my earlier numbers were slightly too optimistic).

The 6502 interrupts primer gives more on the reasons for interrupts.
http://WilsonMinesCo.com/ lots of 6502 resources
The "second front page" is http://wilsonminesco.com/links.html .
What's an additional VIA among friends, anyhow?
White Flame
Posts: 704
Joined: 24 Jul 2012

Re: brk flag useless?

Post by White Flame »

Here's a little tale from Commodore engineer Bill Herd explaining why you should use interrupts. (The 8563 is the Commodore 128's 80-column video chip.)
Quote:
22-Jan-93 14:17:32

Memory flash, I just remembered when we found out there was no interrupt
facility built in to the 8563. I remember how patient the designer was when
he sat me down to explain to me that you don't need an interrupt from the
8563 indicating that an operation is complete because you can check the
status ANY TIME merely by stopping what you're doing (over and over) and
looking at the appropriate register, (even if this means banking in I/O) or
better yet sit in a loop watching the register that indicates when
an operation is done (what else could be going on in the system besides
talking to the 8563 ???) Our running gag became not needing a ringer on the
phone because you can pick it up ANY TIME and check to see if someone's on
it, or better yet, sit at your desk all day picking the phone up. Even in
the hottest discussions someone would suddenly stop, excuse himself, and pick
up the nearest phone just to see if there was someone on it. This utterly
failed to get the point across but provided hours of amusement. The owners
at the local bar wondered what fixation the guys from Commodore had with the
pay phone.
(snipped from http://homepage.hispeed.ch/commodore/c128_story.html)
User avatar
BigEd
Posts: 11464
Joined: 11 Dec 2008
Location: England
Contact:

Re: brk flag useless?

Post by BigEd »

Lovely!
ElEctric_EyE
Posts: 3260
Joined: 02 Mar 2009
Location: OH, USA

Re: brk flag useless?

Post by ElEctric_EyE »

That story is funny and interesting at the same time. In my project I have a BUSY flag output from a hardware line generator. I made two branch opcodes the programmer can use to test the flag: branch if BUSY and branch if /BUSY, very similar to BNE/BEQ. That way the programmer can choose either to wait repetitively for the process to finish, or, if it's not done, to go do something else in the meantime, anything except sending the last coordinate (which triggers the plotter and makes it busy for some cycles).
User avatar
BigDumbDinosaur
Posts: 9428
Joined: 28 May 2009
Location: Midwestern USA (JB Pritzker’s dystopia)
Contact:

Re: brk flag useless?

Post by BigDumbDinosaur »

White Flame wrote:
Here's a little tale from Commodore engineer Bill Herd explaining why you should use interrupts. (The 8563 is the Commodore 128's 80-column video chip.)

The 8568 that was used in the C-128D did have an open collector IRQ output that went low when bit 7 in the status register was set. However, the IRQ output wasn't connected to anything.
x86?  We ain't got no x86.  We don't NEED no stinking x86!