PostPosted: Sat Jun 16, 2018 12:27 pm 
Mr SQL

Joined: Fri Aug 14, 2015 5:19 pm
Posts: 27
Location: ENCOM mainframe.
Attachment:
MarbleMadnessEscheresqueDiscoveries.png

This discovery came about inadvertently while I was working on a soft blitter chip for the Atari 2600, mirroring a similarly inadvertent parallel discovery by a researcher who was working on advanced 3D displays.

The technology is already being implemented, in both software and hardware, to enhance 60 FPS movies and gaming on 120 Hz monitors.

Read the in-depth discussion thread on the BlurBusters forum, and check out the MBR (motion blur reduction) effects in the Atari games and online examples described there:

https://forums.blurbusters.com/viewtopic.php?f=7&t=4135

It was possible to write a 60 FPS blitter chip for the Atari 2600, which in turn made this discovery possible, because the console has a nonstandard video signal that is not interlaced: rather than putting out 60 fields per second, it outputs 60 full progressive frames (240p) per second on classic televisions designed to display 30 interlaced frames per second.

_________________
Load BASIC from tape on your Atari 2600:
http://RelationalFramework.com/vwBASIC.htm


PostPosted: Sun Jun 17, 2018 3:55 pm 
White Flame

Joined: Tue Jul 24, 2012 2:27 am
Posts: 672
All the classic home computers & video game consoles did 240p, and would be capable of black frame insertion for 30Hz with a simple interrupt, as most had display enable/disable bits. But since everybody was on CRTs back then, there was no need to attempt to compensate for low-speed ghosting displays.
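
For example, on the C64 the whole thing could look something like this rough, untested sketch: a raster interrupt near the bottom of the frame toggles the VIC-II display-enable bit (bit 4 of $D011) every other frame, blanking alternate frames to the border colour (so you'd want a black border). The zero-page counter at $02 and the load address are arbitrary choices, not anybody's shipping code.

Code:
        * = $c000
Install:
        sei
        lda #$7f
        sta $dc0d          ; mask the CIA#1 timer interrupts
        lda $dc0d          ; acknowledge anything pending
        lda #$01
        sta $d01a          ; enable the VIC-II raster interrupt
        lda #$fb
        sta $d012          ; fire near the bottom of the visible frame
        lda $d011
        and #$7f           ; raster-compare bit 8 = 0
        sta $d011
        lda #<Irq
        sta $0314          ; hook the KERNAL IRQ vector
        lda #>Irq
        sta $0315
        cli
        rts

Irq:
        inc $02            ; frame counter in zero page
        lda $02
        lsr                ; carry = odd/even frame
        lda $d011
        ora #$10           ; even frame: display on
        bcc Store
        and #$ef           ; odd frame: display off (border colour only)
Store:  sta $d011
        lda #$01
        sta $d019          ; acknowledge the raster interrupt
        jmp $ea31          ; carry on into the stock KERNAL handler

The same idea ports to any video chip with a display enable bit.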

_________________
WFDis Interactive 6502 Disassembler
AcheronVM: A Reconfigurable 16-bit Virtual CPU for the 6502 Microprocessor


PostPosted: Sun Jun 17, 2018 6:43 pm 
Mr SQL

Joined: Fri Aug 14, 2015 5:19 pm
Posts: 27
Location: ENCOM mainframe.
White Flame wrote:
All the classic home computers & video game consoles did 240p, and would be capable of black frame insertion for 30Hz with a simple interrupt, as most had display enable/disable bits. But since everybody was on CRTs back then, there was no need to attempt to compensate for low-speed ghosting displays.


I wrote commercial video games in the '80s, but none that output full-screen, high-FPS animation where motion blur reduction technology is demonstrable.

If you display 30 FPS motion video at 60 Hz, every frame is shown twice, causing motion blur; inserting a blank frame instead matches the Hz to the FPS and remediates it. The same applies to 60 FPS movie footage on 120 Hz monitors.
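
Schematically, the blank-frame trick looks like this at the TIA level (a simplified, untested sketch, not the actual game code; the zero-page counter location is arbitrary). It's a bare 262-line NTSC frame loop that keeps VBLANK asserted through the visible region on every other frame, turning image-image into image-black:

Code:
        processor 6502
        include "vcs.h"    ; standard TIA/RIOT register equates

Frame   = $80              ; zero-page frame counter (arbitrary location)

        org $f000
Start:  cld                ; (a real game would also clear RAM and the TIA)
        ldx #$ff
        txs

MainLoop:
        inc Frame
        lda #2
        sta VBLANK         ; blanking on for sync and vertical blank
        sta VSYNC          ; start vertical sync
        sta WSYNC          ; 3 lines of sync
        sta WSYNC
        sta WSYNC
        lda #0
        sta VSYNC          ; end vertical sync

        ldx #37            ; 37 lines of vertical blank
VBlk:   sta WSYNC
        dex
        bne VBlk

        lda Frame
        lsr                ; carry = odd/even frame
        lda #2
        bcs SetVB          ; odd frame: leave VBLANK on -> black frame
        lda #0             ; even frame: VBLANK off -> picture visible
SetVB:  sta VBLANK

        ldx #192           ; 192 visible lines (the drawing kernel goes here)
Kern:   sta WSYNC
        dex
        bne Kern

        lda #2
        sta VBLANK         ; blanking on for overscan
        ldx #30            ; 30 lines of overscan (game logic goes here)
OScan:  sta WSYNC
        dex
        bne OScan

        jmp MainLoop

        org $fffc
        .word Start        ; reset vector
        .word Start        ; BRK/IRQ vector (unused)

The B&W switch is bit 3 of SWCHB, so gating that odd-frame test on the switch gives the console toggle described below.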

For example compare this 30 FPS @ 30 Hz Atari 2600 game:

http://javatari.org/?ROM=http://relatio ... LITZII.bin

To the same game running at 30 FPS @ 60 Hz (you'll have to press the space bar to get through the intro):

http://javatari.org/?ROM=http://relatio ... VE_AFP.bin

The motion blur should be pretty apparent in the second game. You can also throw the black and white switch on the console (lower left in the emulator) to make the game output 60 FPS at 60 Hz, which also gets rid of the motion blur because it gets rid of the doubled frame.

You should be able to see these effects on your LCD monitor in the emulator, and on a CRT television if you have a real Atari.

_________________
Load BASIC from tape on your Atari 2600:
http://RelationalFramework.com/vwBASIC.htm


PostPosted: Mon Jun 18, 2018 12:10 am 
barrym95838

Joined: Sun Jun 30, 2013 10:26 pm
Posts: 1924
Location: Sacramento, CA, USA
I must have very crude senses ... they both look fine to me!

Mike B.


PostPosted: Tue Jun 19, 2018 12:12 am 
Mr SQL

Joined: Fri Aug 14, 2015 5:19 pm
Posts: 27
Location: ENCOM mainframe.
barrym95838 wrote:
I must have very crude senses ... they both look fine to me!

Mike B.

There's another demo built into the second game, on the title screen, that may be easier to see: the B&W switch changes the scrolling text from 30 FPS at 60 Hz to 30 FPS at 30 Hz.

The text should become clearer. If it doesn't, try the ROM in the Stella emulator:

http://relationalframework.com/WARPDRIVE_AFP.bin

_________________
Load BASIC from tape on your Atari 2600:
http://RelationalFramework.com/vwBASIC.htm


PostPosted: Tue Jun 19, 2018 6:42 am 
White Flame

Joined: Tue Jul 24, 2012 2:27 am
Posts: 672
Mr SQL wrote:
If you display 30 FPS motion video at 60 Hz, every frame is shown twice, causing motion blur; inserting a blank frame instead matches the Hz to the FPS and remediates it. The same applies to 60 FPS movie footage on 120 Hz monitors.

On an LCD, 30Hz double frames do not "cause" motion blur, since it's not a flickered display. It's literally a solid image for 33ms each frame, just as 16ms each frame at 60Hz is solid, besides the standard temporal aliasing of staying static within that span of time (and any dithering your display might be doing). But the 30fps/30Hz doubled display would have less motion blur than a 60fps/60Hz display, as the majority cause is slow pixel transitions on LCDs, and the pixels would have twice the time to settle on the color in comparison to 60Hz. Plus, it spends less time in image transition; the next image takes 16ms to scan into view, but the 30fps one holds it for an additional 16ms as well without any modification.

I guess if you create a 30fps game on a 60Hz CRT with standard frame doubling, there are 2 flashes in each position which might cause a false interlace as your eye expects to track a smooth 30Hz movement. But in practice, this was never a problem even with CRTs. In fact, CRT phosphors return to black VERY quickly, often only retaining a glow for a dozen or two scanlines, so black frame insertion can be considered built-in. There's a great video by the Slow Mo Guys on YouTube that takes high-speed footage of a CRT, and you can see how brief the physical persistence is. However, this doubling isn't "motion blur", it's temporal aliasing, and it has been used to create double-resolution interlacing in both directions, even with a plain 240p stable scan. Motion blur is still only caused by slow LCD transitions.

These were long, long discussions years ago when this technique first came out, back when LCDs were a lot worse. Basically, this technique redirects the LCD blur from relative prior pixel values, to normalizing the blur for all pixels against black, and using that strobe effect to go back to persistence of vision instead of the solid illumination that LCDs allow.

_________________
WFDis Interactive 6502 Disassembler
AcheronVM: A Reconfigurable 16-bit Virtual CPU for the 6502 Microprocessor


PostPosted: Tue Jun 19, 2018 7:22 am 
BigEd

Joined: Thu Dec 11, 2008 1:28 pm
Posts: 10792
Location: England
Hmm, sorry, but I couldn't see from the demos (or understand from the text) what's being compared here. Is it the idea that it's better to interrupt the image with a black frame than to show the same image for two frames? Like this:

image1 - black - image2 - black - image3 - black

being perceived as smoother than

image1 - image1 - image2 - image2 - image3 - image3

with the same timebase in both cases. If not that, could someone perhaps type out a similar diagram of what the innovation is?


PostPosted: Tue Jun 19, 2018 9:51 am 
Mr SQL

Joined: Fri Aug 14, 2015 5:19 pm
Posts: 27
Location: ENCOM mainframe.
White Flame wrote:
Mr SQL wrote:
If you display 30 FPS motion video at 60 Hz, every frame is shown twice, causing motion blur; inserting a blank frame instead matches the Hz to the FPS and remediates it. The same applies to 60 FPS movie footage on 120 Hz monitors.

On an LCD, 30Hz double frames do not "cause" motion blur, since it's not a flickered display. It's literally a solid image for 33ms each frame, just as 16ms each frame at 60Hz is solid, besides the standard temporal aliasing of staying static within that span of time (and any dithering your display might be doing).

No, that long pulse width is actually responsible for a great deal of motion blur, and the comparatively short pulse width of 2-3 ms for a CRT accounts for how crisp its motion is by comparison.
Quote:
But the 30fps/30Hz doubled display would have less motion blur than a 60fps/60Hz display, as the majority cause is slow pixel transitions on LCDs, and the pixels would have twice the time to settle on the color in comparison to 60Hz. Plus, it spends less time in image transition; the next image takes 16ms to scan into view, but the 30fps one holds it for an additional 16ms as well without any modification.

I guess if you create a 30fps game on a 60Hz CRT with standard frame doubling, there are 2 flashes in each position which might cause a false interlace as your eye expects to track a smooth 30Hz movement. But in practice, this was never a problem even with CRTs. In fact, CRT phosphors return to black VERY quickly, often only retaining a glow for a dozen or two scanlines, so black frame insertion can be considered built-in.

Yes, that's it, and your eye perceives the doubled image as one long pulse width instead of 2 ms since the image is the same, losing the crispness you would otherwise get from the short pulse width.

The difference between motion blur and crispness is very visible on a real CRT with my games when switching between 30 FPS @ 60 Hz and 30 FPS @ 30 Hz.

Quote:

There's a great video by the Slow Mo Guys on YouTube that takes high-speed footage of a CRT, and you can see how brief the physical persistence is. However, this doubling isn't "motion blur", it's temporal aliasing, and it has been used to create double-resolution interlacing in both directions, even with a plain 240p stable scan. Motion blur is still only caused by slow LCD transitions.

These were long, long discussions years ago when this technique first came out, back when LCDs were a lot worse. Basically, this technique redirects the LCD blur from relative prior pixel values, to normalizing the blur for all pixels against black, and using that strobe effect to go back to persistence of vision instead of the solid illumination that LCDs allow.


I think this may be different from slow pixel transitions, since it also happens on CRTs. The effect can be pleasant, though, depending on the game; I can see subpixels superimposed on the buildings on a CRT, and in Stella the buildings appear cylindrical.

Motion blur was the first thing I noticed when I went from watching CRT television to plasma and LCD, so I may be particularly sensitive to it.

_________________
Load BASIC from tape on your Atari 2600:
http://RelationalFramework.com/vwBASIC.htm


PostPosted: Tue Jun 19, 2018 10:02 am 
Mr SQL

Joined: Fri Aug 14, 2015 5:19 pm
Posts: 27
Location: ENCOM mainframe.
BigEd wrote:
Hmm, sorry, but I couldn't see from the demos (or understand from the text) what's being compared here. Is it the idea that it's better to interrupt the image with a black frame than to show the same image for two frames? Like this:

image1 - black - image2 - black - image3 - black

being perceived as smoother than

image1 - image1 - image2 - image2 - image3 - image3

with the same timebase in both cases. If not that, could someone perhaps type out a similar diagram of what the innovation is?


Yes. There's an in-depth discussion about this on the BlurBusters thread. The other researcher and I took this to pretty far extremes in opposite directions, inserting 7 black frames for every displayed frame to create super smooth video at:

7.5 fps @ 7.5 Hz out of a 60 Hz signal (my experiment)

and

60 fps @ 60 Hz out of a 480 Hz signal (the BlurBusters researcher's experiment)

The BlurBusters experiment used an LCD monitor, but by inserting 7 black frames they created an effectively very short pulse width, like a CRT's, for practically motion-blur-free video.

From my experiment it was interesting to observe super smooth animation at strobe flicker rates.
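
In 2600 terms, the only change from the every-other-frame sketch above is a wider mask on the frame counter (again a simplified sketch, not the actual experiment code):

Code:
        lda Frame
        and #$07           ; low 3 bits: position within a group of 8
        beq Show           ; frame 0 of 8: display the image
        lda #2             ; frames 1-7: keep VBLANK asserted (black)
        bne SetVB          ; branch always (A is nonzero here)
Show:   lda #0             ; blanking off for the image frame
SetVB:  sta VBLANK

One image plus 7 black frames out of a 60 Hz stream is 60/8 = 7.5 new images per second, each lit for only ~16.7 ms of every ~133 ms period, which is what makes the effective pulse width so short.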

_________________
Load BASIC from tape on your Atari 2600:
http://RelationalFramework.com/vwBASIC.htm


PostPosted: Tue Jun 19, 2018 10:30 am 
BigEd

Joined: Thu Dec 11, 2008 1:28 pm
Posts: 10792
Location: England
Thanks! I'm inclined to think of this effect as 'reduced jerkiness' or 'reduced strobiness' rather than 'added motion blur' but I do now see what you mean.


PostPosted: Tue Jun 19, 2018 11:00 am 
Joined: Mon May 21, 2018 8:09 pm
Posts: 1462
I occasionally have similar discussions about this with PCMR folks, where the object is to properly explain why modern 144Hz gaming monitors are superior to 60Hz office monitors, and even more so than 30fps game consoles. My approach is to talk about motion tracking, not retinal persistence. That's also the measurement approach taken by testufo.com.

It helps that I've finally managed to get a 144Hz Freesync monitor of my own...

Let's suppose you have an object (a small ball, perhaps, or a mouse cursor) crossing the screen in 1 second. At 30fps and with 8-bit standard graphics, that's about 10 pixels per frame; on a modern PC at 60fps, it would be about 35 pixels per frame. If you simply stare at the background while that object crosses in front of your eyes, you'll see it occupy those discrete positions in rapid succession; with persistence of vision, you'll perceive multiple copies of the object spaced at regular intervals. This effect actually increases with higher framerates, such that you'll see a larger number of copies of the object, spaced more closely together. Such effects are countered in the movie industry by applying motion blur, so that you see a continuous smear of the object instead of discrete copies.
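
The step sizes come straight from the screen width and the crossing time. Taking, say, a 320-pixel-wide 8-bit screen and a desktop around 2100 pixels wide (assumed widths, chosen to match the figures here), with step = W / (f T):

\[
\frac{320\ \mathrm{px}}{30\ \mathrm{Hz}\cdot 1\ \mathrm{s}} \approx 11\ \mathrm{px/frame},
\qquad
\frac{2100\ \mathrm{px}}{60\ \mathrm{Hz}\cdot 1\ \mathrm{s}} = 35\ \mathrm{px/frame}
\]

and the same modern screen at 30 fps gives 2100/30 = 70 px per frame, the figure used below.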

But let's suppose that the ball is actually a target important to gameplay - an enemy vehicle, perhaps, or the more common use of a mouse cursor - so that you need to track its position by eye and predict its future motion both precisely and accurately. The more accurately you hold the computing gunsight over the MiG, the more of your shots are likely to hit it, even though it's desperately trying to turn out of your line of fire. In such a case, ideally your eye would always point towards the target and you would see a perfectly sharp image of it, with no duplicates and no blur, while the background scenery gets equivalent effects to the above. Applying motion blur to the ball would be counter-productive!

Late-model CRTs approached this ideal when driven at high refresh rates and with a newly rendered frame per refresh, because they had very short persistence phosphors, usually less than a millisecond, and relied on retinal PoV to give the impression of a steady image. That's why 60Hz was slightly flickery on later CRT monitors; TVs often had longer persistence to reduce flicker on typical broadcast signals (60 fields/s NTSC, 50 fields/s PAL). At 75Hz most people stopped noticing the flicker; many monitors could be driven as high as 120Hz at lower resolutions, limited mainly by horizontal scan frequencies and pixel bandwidth, and that was taken advantage of by competitive gamers.

LCD panels work differently; they tend to hold each pixel with a steady colour and require active work to change it. The latest LCD panels can change a pixel very quickly (less than 5ms), but that only occurs in response to a refresh cycle passing it, which even on a 144Hz display happens only every 7ms. So a game console producing 30fps will have the ball in the same place for 33ms at a time, during which it should have moved as much as 70 pixels to simulate smooth motion across the screen in 1 second. If your eyes are trying to follow that, you won't see a sharp (if flickery) image of the ball, but a 70-pixel-long smear that seems to jump rapidly forwards and slide backwards.

At 30fps, a standard 60Hz monitor is refreshing twice per frame. On a CRT, that results in the ball being briefly displayed twice in the same place before moving to the next position. This effect is not noticeable if you stare at the background, but you will see a double image of the ball if you're tracking it - which one is real? Black frame insertion effectively removes the second image, at the cost of increasing flicker and reducing brightness. On an LCD, it effectively reduces the inherent persistence time to 16ms, so the image you see is smeared across only 35 pixels and doesn't jump quite as far within your field of view.

Blur reduction technology for LCDs mostly revolves around strobing the backlight, which is similar in principle to black frame insertion on a CRT. Essentially, the backlight is pulsed very briefly during the vertical blanking interval, when the panel has settled to the contents of a particular frame, thereby simulating the short-persistence phosphor of a CRT and removing the smear on eye-tracked objects. The downside is that it re-introduces flicker to the display - and because it's the whole display pulsing simultaneously, not a scan, it's even more noticeable - so it can only be used safely on high refresh rates, about 75Hz upward. Many monitors with this technology restrict it to 100Hz or higher.


PostPosted: Tue Jun 19, 2018 11:36 am 
BigEd

Joined: Thu Dec 11, 2008 1:28 pm
Posts: 10792
Location: England
Possibly useful to understand Phi, Beta, and Persistence...
http://mesosyn.com/mental8-14.html

(If there's a hard white ball moving against a hard black background, I'd only want to refer to blur if there are some grey pixels being displayed. Possibly in the community of gamers who discuss such things, my preferred terminology would be non-standard, and only give rise to confusion.)


PostPosted: Tue Jun 19, 2018 2:18 pm 
White Flame

Joined: Tue Jul 24, 2012 2:27 am
Posts: 672
One very strange assumption here is the notion of 8-bit games running at 30fps. Since most of them were tied to the raster, the vast majority ran at 60fps update speed, with their main game loop running every frame.

The platforms that were purely framebuffer-based, like the Apple II or ZX Spectrum, would be the exceptions. They'd usually run slower, but weren't consistent either. Most weren't double buffered, so the screen partially updated instead of slamming all pixels at once.
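
For contrast, on a machine with a movable screen base like the C64, the flip half of double buffering is only a few instructions. A rough sketch, assuming screens at $0400 and $0800 in the default VIC bank:

Code:
Flip:
        lda $d012
        cmp #$fa           ; crude wait for the bottom of the frame, so
        bne Flip           ; the swap never shows a half-drawn screen
        lda $d018
        eor #$30           ; toggle the video matrix base between $0400
        sta $d018          ; (upper nibble 1) and $0800 (upper nibble 2)
        rts                ; now render the next frame into the hidden page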

_________________
WFDis Interactive 6502 Disassembler
AcheronVM: A Reconfigurable 16-bit Virtual CPU for the 6502 Microprocessor


PostPosted: Tue Jun 19, 2018 2:30 pm 
Joined: Mon May 21, 2018 8:09 pm
Posts: 1462
If you have hardware scrolling and sprites, then it's relatively easy to run at 60fps (or 50fps with PAL output). Most 8-bit micros did not have that (the C64 being a prominent exception), but arcade machines seem more likely to have optimised for graphics performance.


PostPosted: Tue Jun 19, 2018 4:07 pm 
Joined: Fri Dec 11, 2009 3:50 pm
Posts: 3345
Location: Ontario, Canada
BigEd wrote:
I'm inclined to think of this effect as 'reduced jerkiness' or 'reduced strobiness' rather than 'added motion blur' but I do now see what you mean.
As a newcomer to the topic, I had the same reaction as Ed -- I found the word blur unhelpful. I understand now. But perhaps the blurbusters web site would benefit from an introductory explanation -- part of a FAQ, maybe.

-- Jeff

_________________
In 1988 my 65C02 got six new registers and 44 new full-speed instructions!
https://laughtonelectronics.com/Arcana/ ... mmary.html

