Hi,
I'm working on a flixel game with a dude in it, and wondered what the framerate of the NES is, by way of comparison. I couldn't find the info on Wikipedia, but thought maybe one of you awesome people could help...
I'm basically wondering how often the sprites for a typical walk cycle change.
Any help is much appreciated.
Stay funky,
- Bez
I'm just some random guy. Don't let my words get you riled - I have my opinions but they're only mine.
I always thought that the framerate of the NES was exactly 60 FPS. I wonder why they chose a weird number like 60.098 instead of a nice round number like 60.
Oops! Sorry, I didn't even think to search!
Thanks for the info. :-)
Are you able, by any chance, to also tell me how many hardware frames each frame of animation is usually 'held' for?
I imagine that the walk cycle in SMB is closer to 20 or 30 FPS.
Or if you're kind enough to explain to me how to do frame advance in an NES emulator (or I can work it out myself...), maybe I'll discover the info that way.
I'm just some random guy. Don't let my words get you riled - I have my opinions but they're only mine.
AFAIK, back then it wasn't really a question of choice. The console's refresh rate was tied to the local TV standard, which in turn was tied to the frequency of the AC mains (which is why games in the US updated at 60 FPS while in Europe they updated at 50 FPS).
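As for the exact number: here's a back-of-envelope sketch, using the commonly cited NesDev timing figures (my numbers, not anything stated in this thread). The NTSC PPU's dot clock is three times the 1.789773 MHz CPU clock, and one frame is 341 dots by 262 scanlines:

[code]
package
{
    import flash.display.Sprite;

    public class NesFps extends Sprite
    {
        public function NesFps()
        {
            // Figures from public NesDev docs, not from this thread.
            var cpuHz:Number = 1789772.7;        // NTSC 2A03 CPU clock
            var ppuHz:Number = cpuHz * 3;        // PPU dot clock is 3x the CPU clock
            var dotsPerFrame:Number = 341 * 262; // 341 dots/scanline, 262 scanlines
            trace(ppuHz / dotsPerFrame);         // prints ~60.0988
        }
    }
}
[/code]

So the "weird" 60.098 falls straight out of the NTSC clock dividing into a whole number of dots per frame, rather than anyone choosing it.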
Frame advance is pretty simple... in FCEUX, press the frame advance hotkey, which is \ by default, I believe. You can customize it as well. That will pause the emulation. Pressing it again will advance one frame. To continue at normal speed you have to push the Pause hotkey, which I think is Pause/Break by default and is also customizable.
From what I remember, walking animations update every 3 to 4 frames, but it varies a lot. The human eye sees at about 25 frames per second, I believe, so there's no use going much faster than that...
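To put that in flixel terms: at 60 Hz, a 3-frame hold is 20 FPS of animation and a 4-frame hold is 15 FPS. Here's a minimal sketch, assuming the AS3 version of flixel; the class name, sprite sheet, and frame indices are made up, but loadGraphic, addAnimation, and play are the actual flixel calls:

[code]
package
{
    import org.flixel.*;

    public class Player extends FlxSprite
    {
        // Hypothetical 16x16 sprite sheet with the walk cels on frames 0-3.
        [Embed(source = "player.png")] private var PlayerPng:Class;

        public function Player()
        {
            super();
            loadGraphic(PlayerPng, true, false, 16, 16); // animated, 16x16 cels
            // addAnimation's third argument is animation frames per second:
            // 15 means each cel is held for 4 hardware frames at 60 Hz;
            // use 20 for a 3-frame hold.
            addAnimation("walk", [0, 1, 2, 3], 15, true);
            play("walk");
        }
    }
}
[/code]

In general, just divide 60 by however many hardware frames you want each cel held for, and pass that as the frame rate.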
Heh, yeah, I was wondering if that was really correct. From what I've read now, it sounds like each individual cell sends an impulse that takes about 1/25 of a second. But we're not really like a TV/monitor or whatever, and the cells are not synced up, so we can definitely detect higher frame rates. Plus we see light better than an absence of light, so you can apparently flash an image up for 1/220 of a second in a dark room and most people will be able to recognize the image. I guess my point is that I have no idea what I'm really talking about :)
"In other words, the flicker-fusion point, where the eyes sees gray instead of flickering tends to be around 60 Hz." "Most people do not detect flicker above 75Hz."
Obviously this is white vs. black, so the general color-related number may be a bit lower, but I'm fairly sure the rate of vision is higher than 25 Hz.
That depends on what we are talking about.
Movies are filmed and shown in theaters at 24 frames per second. If the frames of such a movie alternated between purely black and purely white, you would certainly see the 24 FPS flickering (rather than a static gray screen). However, a movie showing a person walking, for example, does not look jerky but smooth. To see jerkiness in the movement, the framerate would need to drop below 16 or so. So it really depends.
Yeah, it depends on how much change there is between the frames. A black frame turning white is as much change as is possible, so it's easy to notice; but if a person in a movie moves 2 cm between two frames while walking, it's not as big a change, so it looks smoother than if he moved 2 m forward between two frames.
I think people should start filming movies at 60 FPS instead of 24; they would look much better that way. Almost all games run at 60 FPS, but only a few movies do, probably because people aren't used to seeing movies that smooth, so they don't "feel like movies" if they are too smooth.
Actually, movies can get away with being filmed at 24 FPS because, unlike games, they show everything that happened during a frame. If someone is running in a movie, then you'll see their legs blur slightly each frame, so even though the movie is shot at 24 FPS, you can still "see" more "frames". Games render the current state of the universe at discrete moments, though, so every frame is sharp and has no visible motion. Therefore you need to show more frames to not look choppy.
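You can fake that "everything that happened during the frame" effect in code: instead of drawing an object at one sharp instant, composite several sub-frame samples of it, like a camera shutter integrating light. A toy sketch in plain AS3 (the ball, its speed, and the sample count are all made up for illustration):

[code]
package
{
    import flash.display.Sprite;
    import flash.events.Event;

    public class ShutterBlur extends Sprite
    {
        private var t:Number = 0; // time, in displayed frames

        public function ShutterBlur()
        {
            addEventListener(Event.ENTER_FRAME, onFrame);
        }

        private function onFrame(e:Event):void
        {
            graphics.clear();
            var samples:int = 8; // sub-instants blended into one shown frame
            for (var i:int = 0; i < samples; i++)
            {
                var subT:Number = t + i / samples;   // a moment within this frame
                var x:Number = (subT * 30) % 500;    // 30 px per frame, wrapping
                graphics.beginFill(0xFFFFFF, 1 / samples); // translucent sample
                graphics.drawCircle(50 + x, 100, 10);
                graphics.endFill();
            }
            t += 1;
        }
    }
}
[/code]

Each translucent copy stands in for the light a shutter would have collected during that slice of the frame; with enough samples you get the smooth smear that film gets for free.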
Movies could be shot at a higher rate, but you'd need to roughly triple all the infrastructure related to video content (cable TV bandwidth, DVD capacity, etc.). And I don't think the gain would be particularly noticeable. It'd be better, sure, but the gains would be smaller than you'd get from, say, improving resolution.
Pyrel - an open-source rewrite of the Angband roguelike game in Python.