candle wrote:
kc5tja: these are the ANTIC (the GPU of the Atari XL/XE series) timings, and that includes the border - for further information refer to www.atari-history.com or .org, I don't remember which... they have full documentation on the ANTIC GPU and its supporting GTIA chip.
The 14MHz clock was used to generate the necessary DRAM signals (RAS and CAS); the 3.5MHz output was then used to drive the ANTIC chip, which fetched all the data from memory and put it onto a 3- or 4-wire bus into the GTIA chip. When you doubled the first clock to 28MHz you got 7MHz on the ANTIC and a 31kHz hsync...
This is not theory - I did this in '94.
AND PLEASE... stop acting like a guru...
First, let me make something absolutely, positively clear -- you cannot break the laws of physics. Period. End of discussion. The laws of physics for video display dictate that if you have a horizontal line period of 63.89us and 320 pixels on that line, each pixel period is going to be (63.89us / 320) ≈ 199.66ns. Put in terms of frequency, this is 5.008MHz -- precisely what I said it was in my previous post.
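If you want to check that arithmetic yourself, here is a quick C sketch of it (the 63.89us line period is just the figure above; substitute whatever numbers you like):

Code:
/* Derive the pixel period and dot clock from a line period and a
 * visible-pixel count.  Figures here are the ones quoted above. */
#include <stdio.h>

int main(void)
{
    double line_period_us = 63.89;  /* horizontal line period, in us      */
    int    visible_pixels = 320;    /* pixels actually shown on the line  */

    double pixel_period_ns = (line_period_us * 1000.0) / visible_pixels;
    double dot_clock_mhz   = 1000.0 / pixel_period_ns;

    printf("pixel period: %.2f ns\n", pixel_period_ns);  /* ~199.66 ns */
    printf("dot clock:    %.3f MHz\n", dot_clock_mhz);   /* ~5.009 MHz */
    return 0;
}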
This does not cover borders or overscan requirements, nor does it take into account the time spent during horizontal blanking or retrace. These things take up extra pixel slots on the display, and consequently, your horizontal total will be larger than the 320 pixels you see on the screen. If you don't believe me, do the research -- VGA boards have a horizontal total of 800 pixels when in 640x480 video mode. The Amiga had a horizontal total of approximately 736 for NTSC displays, even though only 640 pixels were visible on the screen. The Commodore 64's VIC-II chip is documented as having 512 pixel slots in each of its horizontal periods. Do I really need to go on? A very simple Google search will reveal much about the subject. One doesn't need to be a "guru."
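To make the horizontal-total point concrete, here is a small C sketch using the standard VGA 640x480 numbers (800 total slots at a 31.469kHz line rate). Note how the true figure lands on the familiar 25.175MHz VGA dot clock, while counting only the visible pixels comes up well short:

Code:
/* The real dot clock is horizontal total * line rate, not visible
 * width * line rate.  Standard VGA 640x480 timing numbers. */
#include <stdio.h>

int main(void)
{
    int    h_total   = 800;       /* total pixel slots per line        */
    int    h_visible = 640;       /* pixels you actually see           */
    double line_rate = 31468.75;  /* horizontal sync rate, in Hz       */

    double dot_clock = h_total * line_rate;    /* true pixel clock, Hz */
    double naive     = h_visible * line_rate;  /* ignores blanking, Hz */

    printf("true dot clock: %.3f MHz\n", dot_clock / 1e6);  /* 25.175 */
    printf("naive estimate: %.3f MHz\n", naive / 1e6);      /* 20.140 */
    return 0;
}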
You physically cannot get a 320-pixel display with a 3.5MHz clock unless you use both edges of the clock, which has the effect of a 7MHz clock with a single edge. If you feel otherwise, I wholeheartedly invite you to try. Please. And DO report back to us, with full disclosure of schematics, screen shots, and oscilloscope traces. If you can find a way to pull this stuff off, you'll become a very rich individual -- with SVGA and XVGA video bandwidths getting close to 300MHz, transmission line effects in video cables start to make huge impacts on perceived video quality. Anything that can cut video bandwidth in half, or more, while still retaining video resolution and refresh rates will hit it big in the market. It's a big problem -- it's why we don't have 4096x4096 displays yet.
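For the skeptics, here is a back-of-envelope C sketch of how many pixel slots fit into one line period at the clock rates under discussion (using both edges of the 3.5MHz clock simply moves you to the 7MHz row):

Code:
/* How many pixel slots fit in one horizontal line at a given shift
 * clock?  Line period is the 63.89us figure used above. */
#include <stdio.h>

int main(void)
{
    double line_period_us = 63.89;
    double clocks_mhz[]   = { 3.5, 7.0, 14.0 };
    int    n = sizeof(clocks_mhz) / sizeof(clocks_mhz[0]);

    for (int i = 0; i < n; i++) {
        double slots = clocks_mhz[i] * line_period_us;  /* MHz*us = cycles */
        printf("%5.1f MHz -> %6.1f pixel slots per line\n",
               clocks_mhz[i], slots);
    }
    /* 3.5 MHz yields ~224 slots, short of 320 even with zero blanking
     * or borders; 7 MHz yields ~447, which leaves room for both. */
    return 0;
}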
Also, I have reviewed Atari's older chipsets in the past (as an Amiga owner -- the chipset of which was made by none other than Jay Miner himself -- I have an active interest in Atari's earlier technology), and the ANTIC derives most of its timing not from the 3.5MHz clock (which, BTW, is used primarily for its color burst output, and is actually 3.579545MHz for NTSC; I'm not sure of the PAL equivalent), but from its 14MHz clock. So I also invite you to do your homework in these matters before proceeding to publicly impugn another's reputation on these forums.
To wit, from http://www.howell1964.freeserve.co.uk/A ... cteristics:
Quote:
The ANTIC and CTIA chips generate the television display at the rate of 60 frames per second on the NTSC (US) system. Each frame consists of 262 horizontal TV lines and each line is made up of 228 color clocks. The 6502 microprocessor runs at 1.79 MHz. This rate was chosen so that one machine cycle is equivalent in length to two color clocks. One clock is approximately equal in width to two TV lines.
From this, we can see that the 3.579545MHz clock is used to split the line up into several "color clocks," of which there are only 228 per line (3579545 Hz / 15734 Hz = 227.5; they happen to round up). Obviously, to get 320 pixels on the display, some mechanism is required to shift the data through the CTIA at a rate faster than the color clock rate. Fortunately, both the ANTIC and CTIA chips happen to have a 14MHz clock at their disposal to do exactly that.
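Putting the quoted figures together in one last C sketch (this assumes, per standard Atari documentation, two hi-res pixels per color clock across the standard 160-color-clock playfield):

Code:
/* Tie the quoted Atari figures together.  Constants are standard
 * NTSC values or come from the quote above. */
#include <stdio.h>

int main(void)
{
    double color_clock_hz = 3579545.0;            /* NTSC color subcarrier */
    double line_rate_hz   = 15734.0;              /* NTSC horizontal rate  */
    double master_hz      = 4.0 * color_clock_hz; /* ~14.318 MHz clock     */
    double pixels_per_cc  = 2.0;  /* 320 pixels over 160 color clocks      */

    double cc_per_line = color_clock_hz / line_rate_hz;   /* ~227.5   */
    double shift_rate  = color_clock_hz * pixels_per_cc;  /* ~7.16MHz */

    printf("color clocks per line: %.1f (rounded to 228)\n", cc_per_line);
    printf("master clock:          %.3f MHz\n", master_hz / 1e6);
    printf("hi-res shift rate:     %.3f MHz\n", shift_rate / 1e6);
    return 0;
}

Note where the hi-res shift rate lands: about 7.16MHz, which is half the 14MHz master clock and exactly double the 3.579545MHz color clock. That is the whole point.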