whartung wrote:
Thinking even more (always dangerous), especially today, the idea of "poking" into screen RAM is really the anomaly, I think, even historically, rather than the norm. Especially today, when everything is run by GPUs talking high-level protocols and objects (rectangles, lines, circles, etc.).
I (foolishly!) decided some 8 years ago or so that I wanted to write what I considered to be "my" BASIC interpreter, so I set about doing so in C under Linux. At that point, while I'd been using Linux since day 1 and Unix (with and without X windows) since {mumble}, and had worked for 3D graphics companies and a PC games company, I'd never written any graphical or X windows programs under Linux/Unix... So working it all out was a bit of a challenge. I settled on a library called SDL, which is ostensibly cross-platform; what it gives you is a buffered "poke pixels at the display" type of interface. So I poke pixels into it, then call an "update" function which magically blits my software framebuffer to the real screen. It can do this very, very fast, even on a Raspberry Pi without using the GPU.
Of course, after doing it, the cool kids told me I was doing it wrong. It has worked very well for me, though (and still does today), but I know it's "wrong".
Or is it?
I think it's a matter of perception, and having my own little 2D graphics library (the usual stuff: lines, clipping, shapes, circles, polygons, sprites, etc.) has served me well for many little projects involving bitmapped displays on Pis, Arduinos and so on. I also like turtle graphics...
But today... we want (or think we want) more pixels, more colours, higher resolutions... but we only have an 8- or 16-bit micro running at 14MHz (if we're lucky), and without some sort of hardware assist, going above what's effectively composite video resolution starts to become hard, both in terms of hardware and getting the software fast enough.
Stefany's Foenix has a 640x480 VGA output with 8 bits per pixel. To draw a diagonal line from one corner to the other means putting down 640 pixels (Bresenham steps once per pixel along the major axis) while calculating in 16 bits. I don't know if her FPGA is doing the basic drawing (in addition to tile and sprite handling), or the 65816, but if it's the 65816 it's going to be visible.
And I've just seen someone ask "Will it run Doom?" on 8-bit Dave's Facebook group (and I'm sure this will be asked again, and again, and ...)
The way round that is to ask: what do you want to draw a line for? Use a paint program to design background tiles and sprites instead. Maybe that's it. It won't give me my turtle graphics, though.
The VS23S010-D is a nice solution: it's effectively memory, can be pixel-poked (via a peripheral bus interface), but also has a fast 2D memory-move engine (aka a blitter). But could a 6502/'816 drive it? I think so, more so if you look at some of the C64 stuff from the "demo scene" people. Even the Apple II, also a 1MHz 6502, and the BBC Micro, a 2MHz 6502, have had some fantastic games and demos written for them.
We have a modern 65C02, fast memory to match, and a multitude of graphics options... Just write the code.
-Gordon
(but will it run Doom?)
_________________
--
Gordon Henderson.
See my Ruby 6502 and 65816 SBC projects here:
https://projects.drogon.net/ruby/