In
another thread...
GARTHWILSON wrote:
I suspect the issue of bus contention in the beginning of phase 1 on the '816 is pretty insignificant, since otherwise they couldn't get away with putting only a single ground pin and a single power-supply pin on the DIP version. The inductance of the one pin would have rendered things non-op, especially at higher speeds.
Garth, your reference to speed seems unclear. I hope we're in agreement that the presence or absence of contention is determined by turn-on and turn-off times -- i.e., the turn-on and turn-off times of the CPU tri-state data bus drivers as compared to those of the memory & IO devices (or the bus transceiver). These times are specified in ns, not in relation to clock rate. Since clock rate doesn't affect driver turn-on and turn-off times, I submit that a change in clock rate can neither create nor eliminate contention in any given example.
To some that may sound odd, so let's look at the beginning of phase 1 as an example. If the previous cycle was a read then at the beginning of phase 1 the bus gets handed over from memory back to the CPU. Each individual memory-to-CPU handover is an event with potential for contention. The deciding factor, of course, is whether the memory IC "lets go" of the bus before the CPU commences to drive it. IOW the deciding factor is the driver turn-on and turn-off times -- which are delays measured in ns.
If we lower or raise the system clock rate, that'll change the repetition rate of the bus handovers, but it doesn't change the severity of the contention each time. So, for example: if all that changes is the oscillator frequency, a system that is free of contention at 1 MHz will also be free of it at 20 MHz. On the flip side, a system that has a contention problem at 20 MHz will also have the problem at 1 MHz. Instead of getting 20 million nasty current spikes per second, you get 1 million -- each just as nasty as one of the 20 MHz spikes.
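To put some arithmetic behind that claim, here's a minimal sketch. All the nanosecond figures are invented placeholders (WDC doesn't publish the relevant '816 numbers, which is the whole problem); the point is only that the overlap per handover is fixed by driver timing, while clock rate merely sets how often the handover happens.

```python
# Sketch: contention overlap is set by driver timing, not clock rate.
# All device timings below are illustrative placeholders, NOT datasheet values.

def contention_overlap_ns(mem_turn_off_ns, cpu_turn_on_ns):
    """Overlap (ns) during which memory and CPU both drive the bus.

    Both delays are measured from the start of phase 1. A positive
    result means the memory hasn't let go before the CPU turns on.
    """
    return max(0.0, mem_turn_off_ns - cpu_turn_on_ns)

# Hypothetical figures: memory releases the bus 12 ns after phase 1
# begins; the CPU starts driving 5 ns after phase 1 begins.
overlap = contention_overlap_ns(12.0, 5.0)

# At most one read-to-CPU handover per cycle, so the spike *rate*
# scales with the clock, but each spike's duration does not.
for clock_hz in (1_000_000, 20_000_000):
    print(f"{clock_hz / 1e6:g} MHz: up to {clock_hz:,} spikes/s, "
          f"each overlapping {overlap} ns")
```

Raising the clock from 1 MHz to 20 MHz multiplies the spike count by 20 but leaves the 7 ns overlap untouched, which is the "just as nasty" point above.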
GARTHWILSON wrote:
Unfortunately there's no minimum spec. on tBAS
Right -- I mentioned this in the lead post. Another driver turn-on/turn-off figure that's missing from WDC specs is the maximum for tDHR. The omissions aren't trivial. Lacking these figures, we're unable to make a comparison with memory turn-on/turn-off times. To protect against contention we have only seat-of-the-pants reckoning.
On a positive note, how can we assess the situation, and how can we improve it? Some actual experiments would be helpful, assuming the right things are measured. It would also be informative to test '816 samples from different foundries, since the driver turn-on/turn-off times may well be affected by a change in fabrication process.
I hope it's clear I'm not forecasting doom & gloom -- my comments are in support of
understanding the contention issue. The problem might be pretty insignificant, as Garth says. We
know that successful '816 systems exist. The question is, to what extent is success reliant on good PCB design and solid supply bypassing -- IOW, contention tolerance? AFAIK nobody's tried to run an '816 on a breadboard.
BigDumbDinosaur wrote:
The most fool-proof solution is the use of a data bus transceiver as you suggested, which is a straightforward method [...]
I agree that a data bus transceiver is a good idea, for two reasons.
- it's easier to make the CPU play nicely with one device (the transceiver) than with several (RAM, ROM, peripherals).
- swapping out the transceiver and replacing it with one from a different logic family is one of the most effective tactics at our disposal. The best transceiver is not necessarily the fastest or the slowest, but the one whose turn-on/turn-off times are similar to those of the '816.
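To illustrate that "best match, not fastest" idea, here's a toy comparison. Every number is a made-up placeholder (neither WDC nor the transceiver figures here are from datasheets), so treat this purely as a way of framing the selection criterion:

```python
# Sketch: choose the transceiver whose turn-on/turn-off times most
# closely match the CPU's drivers, rather than the fastest part.
# All timing numbers are invented placeholders, NOT datasheet values.

cpu = {"turn_on_ns": 6.0, "turn_off_ns": 8.0}  # assumed '816 figures

candidates = {
    "74LS245":  {"turn_on_ns": 30.0, "turn_off_ns": 25.0},
    "74AC245":  {"turn_on_ns": 7.0,  "turn_off_ns": 9.0},
    "74ABT245": {"turn_on_ns": 3.0,  "turn_off_ns": 4.0},
}

def mismatch(dev):
    """Sum of absolute timing differences vs. the CPU drivers."""
    return sum(abs(dev[k] - cpu[k]) for k in cpu)

best = min(candidates, key=lambda name: mismatch(candidates[name]))
print("closest timing match:", best)
```

With these invented figures the 74ABT245 is fastest yet the 74AC245 wins, because being much quicker than the CPU to turn on can create contention at the other end of the handover just as being too slow can.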
BigDumbDinosaur wrote:
[...] (74AC245 recommended if Ø2 will exceed 8 MHz, 74ABT245 recommended if Ø2 will exceed 14 MHz).
Is this recommendation your own advice, or based on WDC doc? It seems at odds with two of my points above. If there's authoritative information I've overlooked I'd be grateful if you can direct me to it.
cheers
Jeff