PostPosted: Mon Oct 13, 2003 5:35 pm 
Joined: Sun May 04, 2003 5:03 pm
Posts: 47
Location: Lublin, Poland
kc5tja: these are the ANTIC (the Atari XL/XE series GPU) timings, and they include the border - for further information refer to www.atari-history.com or .org (I don't remember which); they have full documentation on the ANTIC GPU and its supporting GTIA chip.
The 14MHz clock was used for generating the necessary DRAM signals (RAS and CAS); the 3.5MHz output was then used to drive the ANTIC chip, which fetched all the data from memory and put it into the GTIA chip over a 3- or 4-wire bus. When you doubled the first clock to 28MHz, you got 7MHz on the ANTIC and a 31kHz hsync...
This is not theory - I did this in '94.
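
A quick sanity check of those ratios, as a sketch - it assumes the usual NTSC-derived 14.31818MHz master clock and the standard 15734Hz line rate, so the exact Atari figures may differ slightly:
Code:
# Clock chain as described above, assuming a 14.31818 MHz master
# clock (4x the NTSC colour subcarrier) and the standard NTSC line
# rate. Both figures are assumptions, not Atari datasheet values.
master = 14.31818e6        # Hz - also times the DRAM RAS/CAS signals
antic = master / 4         # ~3.58 MHz clock fed to ANTIC
hsync = 15734.0            # Hz - standard NTSC horizontal rate

# Doubling the master clock doubles everything derived from it:
print(f"ANTIC: {antic / 1e6:.2f} MHz -> {2 * antic / 1e6:.2f} MHz")
print(f"hsync: {hsync / 1e3:.2f} kHz -> {2 * hsync / 1e3:.2f} kHz")
# ANTIC: 3.58 MHz -> 7.16 MHz
# hsync: 15.73 kHz -> 31.47 kHz   (the ~31 kHz quoted above)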

AND PLEASE... stop acting like a guru...


PostPosted: Tue Oct 14, 2003 5:21 pm 
Joined: Sun Dec 29, 2002 8:56 pm
Posts: 460
Location: Canada
kc5tja wrote:
The dot-clock for the Apple II couldn't have been 14MHz -- not with pixels the size of cinder blocks, even for that era of computers. :) I can believe it is the base frequency from which everything else is derived, though (however, even there, I don't think it was, as the CPU's frequency wasn't an even division of 14.31818MHz, IIRC).


I think the dot clock in the Apple II was 7.16MHz, although the 14MHz may have been used to phase-shift the dots to generate more colors. It has a 280x192 display. The CPU ran at 14.31818 MHz / 14, or 1.022 MHz, but I think every 65th clock cycle was extended a bit.

It also had a two-transistor oscillator circuit built around a crystal.
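
A minimal sketch of that arithmetic, assuming the commonly documented Apple II scheme (the values below are the standard published figures, not anything new from this thread):
Code:
# Apple II clock derivation, assuming everything divides down from
# a single 14.31818 MHz master crystal (the commonly cited scheme).
master = 14.31818e6
dot_clock = master / 2       # ~7.16 MHz: one pixel per dot clock
cpu_nominal = master / 14    # ~1.0227 MHz nominal CPU clock

# The 65th cycle of each scan line is stretched by two dot-clock
# periods, giving a whole number of colour-burst cycles per line:
dots_per_line = 65 * 14 + 2  # = 912
cpu_average = master * 65 / dots_per_line

print(f"dot clock:   {dot_clock / 1e6:.3f} MHz")    # 7.159 MHz
print(f"CPU nominal: {cpu_nominal / 1e6:.4f} MHz")  # 1.0227 MHz
print(f"CPU average: {cpu_average / 1e6:.4f} MHz")  # 1.0205 MHz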


PostPosted: Thu Oct 16, 2003 2:30 am 
Joined: Sat Jan 04, 2003 10:03 pm
Posts: 1706
candle wrote:
kc5tja: these are the ANTIC (the Atari XL/XE series GPU) timings, and they include the border - for further information refer to www.atari-history.com or .org (I don't remember which); they have full documentation on the ANTIC GPU and its supporting GTIA chip.
The 14MHz clock was used for generating the necessary DRAM signals (RAS and CAS); the 3.5MHz output was then used to drive the ANTIC chip, which fetched all the data from memory and put it into the GTIA chip over a 3- or 4-wire bus. When you doubled the first clock to 28MHz, you got 7MHz on the ANTIC and a 31kHz hsync...
This is not theory - I did this in '94.

AND PLEASE... stop acting like a guru...


First, let me make something absolutely, positively clear -- you cannot break the laws of physics. Period. End of discussion. The laws of physics for video display dictate that if you have a horizontal line period of 63.89us and you have 320 pixels on that line, each pixel period is going to be (63.89us / 320) = 199.66ns. Put in terms of frequency, this is 5.009MHz -- essentially what I said it was in my previous post. This does not cover borders or overscan requirements, nor does it take into account the time spent during horizontal blanking or retrace. Those things take up extra pixel slots on the display, and consequently your horizontal total will be larger than the 320 pixels you see on the screen. If you don't believe me, do the research -- VGA boards have a horizontal total of 800 pixels when in 640x480 video mode. The Amiga had a horizontal total of approximately 736 for NTSC displays, even though only 640 were visible on the screen. The Commodore 64's VIC-II chip is documented as having 512 pixel slots in each of its horizontal periods. Do I really need to go on? A very simple Google search will reveal much about the subject. One doesn't need to be a "guru."
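
Here is that arithmetic as a sketch (the horizontal totals are the figures cited above):
Code:
# Pixel period and pixel clock from a line period and a pixel count.
def pixel_clock(line_period_us, pixels):
    period_ns = line_period_us * 1000.0 / pixels
    return period_ns, 1000.0 / period_ns   # (ns per pixel, MHz)

period, freq = pixel_clock(63.89, 320)
print(f"{period:.2f} ns/pixel -> {freq:.3f} MHz")  # 199.66 ns -> 5.009 MHz

# Horizontal totals are always larger than the visible width,
# because blanking and retrace consume pixel slots too:
for name, visible, total in [("VGA 640x480", 640, 800),
                             ("Amiga NTSC", 640, 736),
                             ("C64 VIC-II", 320, 512)]:
    print(f"{name}: {visible} visible of {total} total pixel slots")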

You physically cannot get a 320 pixel display with a 3.5MHz clock unless you use both edges of the clock, which has the effect of a 7MHz clock with a single edge. If you feel otherwise, I wholeheartedly invite you to try. Please. And DO report back to us, with full disclosure of schematics, screen shots, and oscilloscope traces. If you can find a way to pull this stuff off, you'll become a very rich individual -- with SVGA and XGA video bandwidths getting close to 300MHz, transmission line effects of video cables start to make huge impacts on perceived video quality. Anything that can cut video bandwidth in half, or more, while still retaining video resolution and refresh rates will hit it big in the market. It's a big problem -- it's why we don't have 4096x4096 displays yet.

Also, I have reviewed Atari's older chipsets in the past (as an Amiga owner, whose machine's chipset was designed by none other than Jay Miner himself, I have an active interest in Atari's earlier technology), and the Antic derives most of its timing not from the 3.5MHz clock (which, BTW, is used primarily for its color burst output, and is actually 3.579545MHz for NTSC; not sure about the PAL equivalent), but from its 14MHz clock. So I also invite you to do your homework in these matters before proceeding to publicly impugn another's reputation on these forums.

To wit from http://www.howell1964.freeserve.co.uk/A ... cteristics:
Quote:
The ANTIC and CTIA chips generate the television display at the rate of 60 frames per second on the NTSC (US) system. Each frame consists of 262 horizontal TV lines and each line is made up of 228 color clocks. The 6502 microprocessor runs at 1.79 MHz. This rate was chosen so that one machine cycle is equivalent in length to two color clocks. One clock is approximately equal in width to two TV lines.


From this, we can see that the 3.579545MHz clock is used to split the line up into several "color clocks," of which there are only 228 per line (3579545 Hz / 15734 Hz = 227.5; they happen to round up). Obviously, to get 320 pixels on the display, some mechanism is required to shift the data through the CTIA at a rate faster than the color clock rate. Fortunately for both the ANTIC and the CTIA chips, they happen to have a 14MHz clock at their disposal to do this with.
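
The division being described, as a sketch:
Code:
# Colour-clock arithmetic from the quoted passage (NTSC figures).
subcarrier = 3579545.0   # Hz - NTSC colour subcarrier
line_rate = 15734.0      # Hz - NTSC horizontal line rate

print(f"colour clocks/line: {subcarrier / line_rate:.1f}")  # 227.5 -> 228
print(f"CPU: {subcarrier / 2 / 1e6:.2f} MHz")  # 1.79 MHz: one machine
                                               # cycle = 2 colour clocks

# 320 pixels across the normal 160-colour-clock playfield means
# 2 pixels per colour clock, i.e. a shift rate of twice the colour
# clock - faster than the colour clock, as argued above:
print(f"shift rate: {2 * subcarrier / 1e6:.2f} MHz")  # 7.16 MHz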


PostPosted: Thu Oct 16, 2003 12:45 pm 
Joined: Sun May 04, 2003 5:03 pm
Posts: 47
Location: Lublin, Poland
Only the Freddie chip is getting the 14MHz clock; the ANTIC is getting its 3.5MHz (I cannot tell you the exact figure), and a color clock is 2 pixels wide.
For the color burst there is another oscillator circuit - and this, for your NTSC, is something like 3.5MHz, but for PAL it is more like 4.5MHz.
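
For reference, the broadcast-standard subcarrier figures (these are properties of NTSC/PAL themselves, not of any Atari chip):
Code:
# Standard colour subcarrier frequencies, for comparison with the
# rough figures above:
ntsc = 3.579545  # MHz - the "something like 3.5MHz"
pal = 4.433619   # MHz - nearer 4.43 than 4.5
print(f"NTSC: {ntsc} MHz, PAL: {pal} MHz")
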
Stop acting in anger and accept the FACT that you don't know everything.
Jeez.


PostPosted: Thu Oct 16, 2003 1:35 pm 
Joined: Sat Jan 04, 2003 10:03 pm
Posts: 1706
candle wrote:
Only the Freddie chip is getting the 14MHz clock; the ANTIC is getting its 3.5MHz (I cannot tell you the exact figure), and a color clock is 2 pixels wide.


Thank you for proving exactly what I wrote above -- that the only way to get the higher resolution is to use both edges of the 3.5MHz clock. It turns out I was right after all. "Jeez." I'll admit that I got the base clock frequencies for the Antic and GTIA chips wrong, but that doesn't change the fundamental fact that I've been pointing out for some time now.

Also, IIRC, the Freddie chip never made it to market. It was intended for the 1400XL or something similarly named. That chip definitely needed the higher clock, because it multiplexed the CPU and Antic against higher-speed RAM. Though, I wonder why it didn't just use the phase1/phase2 approach to multiplex the bus. It would seem to me to be easier to do.

