I think I'm not really following this, but why not get an old graphics card (or several) and salvage the chips, or just put in a 16-bit ISA connector and plug in an old graphics card?
**LOTS** of reasons.
1) Availability. As in, a significantly diminishing and increasingly unreliable supply.
2) Availability, again. From a new-product-line point of view, that supply is unacceptable: if you're intending to go into production (a possibility for my Kestrel), relying on salvaged cards is simply not practicable.
3) Compatibility. Now you need installable, card-specific video drivers.
4) Complexity. Ever programmed a VGA chip at the register level? It's damn near impossible to get anything right. The "best" or "recommended" way is to let the PC BIOS set the video mode, then read back the register values. Even that isn't always reliable, though: most SVGA chips save silicon by making the registers for their unique selling features write-only.
5) Bus interface logic requirements. You're attempting to interface a 6502 to a chip tailored not just for an Intel-style bus, but for the IBM PC/AT's ISA bus in particular. That means you must mimic ISA's precise operational semantics, which imposes software and/or hardware overhead.
6) Lack of features. VGA cards for the PC are generally designed for static, overhead-projection-style graphics. There are NO features catering to more advanced graphics capabilities. Raster interrupts? What are those? Vertical-sync interrupts? MAYBE, but not guaranteed -- see item 3 above. Sprites? What are those? Split-screen operation? Only at video offset 0, which prevents anything more than 2 screens. Multiple resolutions on the same monitor? Why would anyone ever want to do such a thing? Blitter?! Confidential information -- here, sign this NDA, and maybe we'll give you the information you need, or maybe just a teaser (as happened to Chuck Moore when he signed ATI's NDA to gain access to their stuff).
Need I continue?
This works for the PC because there is a massive bias toward raw CPU horsepower versus video display bandwidth. My computer at home has a 66MHz FSB with a 32-bit bus, allowing 266MBps of raw throughput to the video card, while the 1024x768x16.7M display mode I run it at requires a measly 135MBps average throughput. The CPU can therefore bit-bang, in real time, pretty much all the features that traditionally required dedicated hardware support (indeed, my Atari 800, Amiga, and Commodore emulators all run easily at 50fps to 60fps on my 800MHz Athlon box). Combine that kind of performance with the fact that the PC rarely updates the entire screen at 60fps, and you can see where this is leading.
In short, commercial VGA solutions are patently not hacker/homebrew friendly at the register level, and not 65xx-friendly at the programming level. You can acquire VGA controllers from OpenCores which are open source, well documented, 1000% easier to program, and so on, but they still lack the more dynamic features that older chips like the VIC-II, TMS9918, GTIA, and Agnus & Denise supported. They, too, are obviously optimized for processors with bus bandwidths equal to or higher than the video refresh rate, and sprite capability is limited to a lousy mouse pointer.
That's why there is still a niche "pseudo-market" for hacker-friendly video chips. Ask anyone on this board whether they'd rather code for a VGA card or a VIC-II, and they'll almost always say VIC-II. There's a reason for that.