Joined: 10/28/2007
Posts: 1360
Location: The dark horror in the back of your mind
Hi all. As I mentioned [post 292713]elsewhere[/post], I believe 10-bit and YUV444 h.264 playback support will, in the not-too-distant future, be widespread enough in media players, codecs, etc. to begin preparing primary encodes as such.
Thus, I am officially encouraging that encoders produce a 10-bit YUV444 encode in addition to the set of encodes normally produced for a publication.
For those Windows users employing AVISynth to carry out their encoding, you will need to make three changes to your encoding environment to be able to do this:
You'll need a 10-bit build of x264; this is available in the usual places. You will probably want to use a different filename for it so as not to conflict with the 8-bit x264 that you'll be using for other types of encodes.
Remove the 'ConvertToYV12' call towards the end of your encoding script; you can leave the 'ConvertToYV24' call intact.
Add '--output-csp=i444' to your x264 command line(s) so that the output actually is YUV444.
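Putting the three changes together, the tail end of a script and the matching command line might look roughly like this (the file names, the renamed binary, and the CRF value are placeholders; adjust them to your own setup):

ConvertToYV24() # end of encode.avs; the old ConvertToYV12() line is gone

x264-10bit --output-csp=i444 --crf 20 --output moviename_10bit444.mkv encode.avs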
I have configured the site to detect adding a mirror with '_10bit444' in the filename separately from other mirrors; this means you will be able to add one of these encodes per movie without my intervention.
So, go to town, everyone!
---
EDIT: Since writing the above passage on AVISynth usage, I've noticed that DeDup, which virtually all of our AVISynth users are using, only supports YV12 or YUY2 input/output. Both of these colorspaces use chroma subsampling, so just following the steps above won't give you the best possible output.
Thus, I suggest you use a different deduplication plugin (such as [thread 12065]ExactDedup[/thread], which I specifically wrote for this purpose).
You might also experiment with removing the ConvertToYV24 call: x264 contains a library (swscale) that can do colorspace conversions, and since AVISynth only supports 8-bit/channel colour output, letting swscale do the conversion should theoretically produce output covering the entire 10-bit/channel range for 10-bit encodes, and thereby slightly better quality.
I actually registered (with the encouragement of a few others) to post just for this topic. I am writing this in response to the recent news post of this soon becoming the primary codec.
I am asking that you strongly reconsider this.
First let me state plainly that I don't mind whatsoever if there are secondary encodes in 10-bit, or what have you. Options are always nice.
However making 10-bit the primary encode, I can't agree with. Here is my list of reasons, in no particular order:
1. Codec support is mostly experimental; VLC is one of the only players I can think of that ships it in a stable release.
2. It requires viewers to obtain completely new software or codecs in order to view the content properly. Worse, if viewed with incompatible players, it will look abnormal, leading viewers to believe the video is corrupt or otherwise broken.
3. It completely removes hardware acceleration support across the board. I don't think there is a single consumer device out there that has 10-bit support with hardware decoding.
4. As a corollary to #3, the removal of hardware acceleration significantly impacts playability on any mobile device. Smartphones, tablets, and netbooks all rely on hardware acceleration (especially for resolutions above 640x480). These devices would have to rely on YouTube encodes instead, which is hardly ideal. Furthermore, the odds of compatible decoders/players appearing on these devices are incredibly slim.
5. The future of 10-bit playback is not certain at all. Considering that the ratification of h.265 is coming in the near future (and it will offer much greater performance overall as a codec), 10-bit is becoming little more than a stop-gap.
6. As a corollary to #5, it's a stop-gap measure that doesn't serve much purpose. The quality improvements are minimal at best, file size reductions are also minimal (on the order of single-digit megabytes per 100MB).
Overall, I think this has a significant impact on the compatibility of the videos on the site, with very little in return. In addition, it will cause unneeded complaints from people who don't understand the issues.
As I said at the beginning, using 10-bit as a secondary option is fine. It introduces none of the above problems. But the intent to enforce this as a primary codec in the near future is very brash. If TASVideos is desperate to implement 10-bit for some reason that I am not aware of, I would at least strongly encourage the staff to consider delaying any such decisions for at least 1 year, to see if hardware actually catches up with the software. I believe it will completely pass over it, but only time can prove that. In the meantime, there's no need to rush towards implementation.
Joined: 6/25/2007
Posts: 732
Location: Vancouver, British Columbia, Canada
It looks better, has wide compatibility (LAV Filters, ffdshow, mplayer2, and VLC), and has equal or smaller file size. 10-bit 4:4:4 should definitely be the primary downloadable encode, with an optional 8-bit 4:2:0 as secondary. The quality improvements are huge in low-res video like NES/SNES/GBC/Sega Genesis/Mega Drive. I support options, but I don't support the idea that old technology should be considered "primary" while new, widely-supported, and greatly improved technology is only secondary. We switched to H264 early in its existence and nobody had an issue with that. It was a huge improvement over XviD or DivX.
The most significant problem with this is that all current players will play back the video with no warnings. Instead, people will just see graphical corruption with zero indication of what is wrong. This is not the same situation as previous codec changes.
In addition, as I mentioned extensively, I think it would be wholly unfair to judge compatibility only on a few desktop players. Tablets, netbooks, and smartphones are common nowadays, and using hardware decoding on desktops keeps them quiet and cool. These weren't problems back in the XviD days. They are problems now, and they should not simply be swept under the rug.
Addressing concepts like "quality improvements" is difficult, since everyone can have a screaming match at each other over subjective things. I'm not interested in some quality flame war; that is counter-productive. What I can do, however, is address it from a purely factual point of view, based on how the technology actually works.
The most significant improvements of 10-bit color in terms of quality come into play with gradients, dithering and color swapping. It is able to store more color information, so dithering and color swapping are less necessary to avoid color banding.
This is, however, where it gets more complicated. The most popular monitor type is the LCD, and the most popular panel type is the TN panel. TN panels are not capable of even 8-bit color (6-bit, actually); they use a color swapping and dithering system to achieve a color range equivalent to 8-bit. The result is that most of the color improvements 10-bit would bring are lost when displayed. There are some cases where this is not true (especially with darker values), but it holds for the majority.
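To put rough numbers on that: a 6-bit panel can natively display 2^6 = 64 levels per channel, 8-bit gives 2^8 = 256, and 10-bit gives 2^10 = 1024. A TN panel is therefore already simulating most of its nominal 8-bit range through dithering, so the extra precision a 10-bit encode carries largely has nowhere to go on such a display.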
And then the next problem comes up. The most common location for these color bands and dithering is across gradients. True gradients are one of the few things you don't commonly see on old console games. The color range of those old systems (especially NES/SMS) is incredibly narrow, and can typically be reproduced perfectly even in 8-bit color.
That is why the quality differences aren't anything to write home about, from a technical standpoint.
This isn't really old technology; this is mainstream technology. 10-bit support has been extremely slow to catch on for a reason (it has minimal benefits when compared to h.265). Back in the pre-h.264 days, hardware decoding was rare, and most devices weren't even capable of such playback. The world has changed since then. I don't think it would be responsible to use the same decision guidelines that were used 2, 3, 4+ years ago (whenever the transition actually happened).
I'm not sure I agree with the "greatly improved" sentiment.
Let me add this:
Even IF quality improvements were significant, I think there are way too many drawbacks to make it worthwhile, at least for the time being. Fansubbers, where 10-bit really started to be pushed, are even switching back due to these problems.
1. 10 bit support is in all current libavcodec builds. I am curious though; does anyone know when it was introduced? Searching changelogs is no fun.
2. I agree that this is a disappointment. It always surprised me about 10 bit h264 that an 8 bit decoder can decode it and in many cases show something "good enough" that the user will assume it's being decoded correctly and blame the problem on the content producer.
3. As far as desktop computers go, I doubt there's a single one in existence that both can't handle 240p60 10bit in pure software yet can handle 240p60 8bit in pure hardware. As far as other devices go, see #4.
4. There is some truth to this, but what's also true is that our other encodes don't necessarily fare any better in this respect. Many hardware decoders have limitations that we don't respect. I'm not up on all the details, but I know a high number of reference frames causes problems in many cases, and the current guidelines use max reference searching. Our normal encodes oftentimes use the MKV container with all the bells and whistles activated; even though this isn't an h264 decoding issue, it can still be a playback issue. Even the "_512kb" encodes, which don't even have direct download links, are targeted towards the Main Concept decoder, which has few limitations other than colorspace, and so those encodes aren't suitable for many mobile devices either.
5. I do not follow this point at all. 10 bit encoding will not get worse than it is now if and when h265 comes out. There's just no logic to this statement.
6. As far as the filesize argument goes, I think you are missing something: these are 4:4:4 encodes, whereas the old encodes were 4:2:0. Low-res pixel art has extremely detailed chroma information, and these encodes are much higher in quality than the 4:2:0 ones while maintaining a similar filesize. 4:4:4 is the real winner here, and the reason all of this is being done; 10-bit is just a moderate compression improvement that is used because once you bring in 4:4:4, you've already lost all the hardware players anyway.
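To put a number on the chroma point: for a 256x224 NES frame, 4:2:0 stores the colour planes at 128x112, i.e. one colour sample shared by each 2x2 block of pixels, whereas 4:4:4 keeps a colour sample for every pixel. Pixel art, where colour can change at every single pixel boundary, is pretty much the worst case for that halved chroma resolution.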
If you want to watch TASes on your mobile device, use the youtube versions. That's what they're there for.
The Kindle Fire I use for Android development won't even play back the non-MQ (i.e. non-baseline profile) encodes available over on SDA, so I'd be shocked out of my mind if it could play 10-bit anything. I'm curious now whether it'll even play the mkvs available here. I'll have to try it on Monday.
Encoding NES videos at 1080p is essentially harmless to the people who don't care that much about quality, but limiting everything to 10-bit encodes is going to shut out a lot of people who can't be arsed to install something new. They'll just watch it on youtube instead.
Edit: Also, if "QUALITY" is the rallying cry here, isn't telling people to go watch the youtube encodes if they can't play the files sort of contradictory? Not everybody wants to have to resort to youtube, maybe they want to store the files locally.
Cool story, bro. Care to cite any sources for this? Last I checked, pretty much every group that isn't a complete joke or using shareraws uses 10-bit in their primary (and usually only) release.
Hmm? No one said the youtube encodes were low quality. They're just massively inefficient spacewise. On the other hand, if you want to download an encode, 4:4:4 obliterates 4:2:0 any day.
As far as the "people who can't be arsed", how good an outcome are they ever going to get? I'd say getting them to even click on the youtube link is a major win.
But not everybody who wants to store the files also wants to have to go find something that will play 10-bit encodes. People do like to use their PS3s as a media center, for example. You're essentially forcing those people to watch it on youtube or re-encode it themselves.
Joined: 6/25/2007
Posts: 732
Location: Vancouver, British Columbia, Canada
Obviously the "compatibility" encode should remain. It shouldn't be considered "primary" though. The "primary" encode should always be the latest and greatest technology, with compatibility only in the most common viewing situation (a video player in a PC OS like Windows, Linux, or OSX) rather than compatibility with obscure viewing situations like phones or gaming consoles.
Why not have lossless RGB as the primary encode?
The quality would be better, and it would fail properly if you didn't have it set up right.
I've heard lossless comes out smaller than lossy x264 on some games; is that generally true?
Lossless RGB with what codec? H264? That's not generally true. I just went back to my most recent completed encode (http://tasvideos.org/507M.html) and ran an RGB h264 lossless command for it. The raw .h264 file I got is more than 4 times the size of the .h264 files that were published for the normal and hi10 encodes.
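For anyone who wants to reproduce that kind of test, the command is roughly of this shape (file names are placeholders, and this is a sketch rather than the exact invocation used above):

x264 --qp 0 --output-csp rgb --output lossless_rgb.h264 encode.avs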
Now, 74.0MB and truly lossless beats the intermediate codecs we currently use (more information here: http://tasvideos.org/forum/viewtopic.php?t=12390) but that's a workflow thing. The files are unnecessarily large for final encodes.
It's possible there are some particular games (it would have to be very low detail ones) where it comes out below lossy, but I doubt it. H264 lossy mode scales down very well.
I've seen it be slightly smaller in some cases, with no discernible change in quality. It's not a sticking point though; I'll update the sticky to reflect that.
Edit: One question to consider: Do we even like the greyscale color scheme? There are many alternates to choose from.
Joined: 4/17/2010
Posts: 11497
Location: Lake Chargoggagoggmanchauggagoggchaubunagungamaugg
I suggest using a better sound setting for the 10bit encodes; that way they can be considered HQ encodes, and the viewer only needs a proper resizer in their player to watch at a higher resolution.
PS: someone unembed Nahoc's picture.
Warning: When making decisions, I try to collect as much data as possible before actually deciding. I try to abstract away and see the principles behind real world events and people's opinions. I try to generalize them and turn into something clear and reusable. I hate depending on unpredictable and having to make lottery guesses. Any problem can be solved by systems thinking and acting.
Joined: 4/17/2010
Posts: 11497
Location: Lake Chargoggagoggmanchauggagoggchaubunagungamaugg
Reading this thread, I realize that we cannot stop making 8-bit encodes for now. As for switching from one to the other as the primary, I don't see how either way is better, because it would only change the order in the publication module. So, expect no changes to the current encoding system in the near future.
I think the idea was to drop the current primary and make the new primary identical to the current 10bit444. So that's one less encode class to do.
Then, if a compatibility encode is needed, the 512kb could be used for that purpose. Additionally, 512kb is MP4 and is CFR, which should make it even more compatible.
Of course, that doesn't help with crap players like QuickTime. Even 512kb looks hilariously broken.
I strongly agree with this. Getting rid of the 8bit420 encode has multiple advantages:
It encourages viewers to finally update their software.
It saves a substantial amount of time to encode a movie.
It reduces storage requirements of items on archive.org.
512kb serves as a backup and those who primarily watch movies on Youtube don't care too much about quality anyway.
Of all the encodes we currently do for each publication, the Youtube transcodes are the most compatible: they can be played back on most mobile devices, whereas the downloadables may run into problems due to the demanding x264 settings being used.
Regarding device compatibility: there might be cases where a device can play back neither 10bit444 encodes nor arbitrary flash movies, but can handle, for instance, Youtube encodes. Viewers would then be responsible for creating their own transcodes if they want to watch a movie at full fps on these devices. Maybe we could introduce a new class that specifies a certain x264 profile, so one would have guaranteed compatibility in that case. Comments on that?
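As a very rough sketch of what such a compatibility class might look like on the command line (the profile, level, VBV numbers, and file names here are placeholders for discussion, not a tested spec):

x264 --profile baseline --level 3.0 --ref 4 --vbv-maxrate 1500 --vbv-bufsize 1500 --output compat.mp4 encode.avs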
All syllogisms have three parts, therefore this is not a syllogism.
Joined: 4/17/2010
Posts: 11497
Location: Lake Chargoggagoggmanchauggagoggchaubunagungamaugg
Making the 512kb encode officially downloadable sounds good. Would we be aiming for reducing its size then, since for now I only apply slight compression to it to save on encoding time, as it is only a stream? Also, I'd expect 512kb to play back fine in most of the cases where 8bit does.
[quote]Would we be aiming for reducing its size then, since for now I only apply slight compression to it to save on encoding time, as it is only a stream?[/quote]
To see how light those settings actually are, here are the differences compared to the settings I used for one random primary:
Motion estimation: umh-24 vs. umh-64
subme: 10 vs. 11
Threads: 3 vs. 2
Lookahead_threads: N/A vs. 1
Keyint: 600 vs. 250
rc_lookahead: 60 vs. 250
So in summary, those settings don't look that light... Hmm... Need to test effects (time / filesize) for subme (10 and 11) and rc-lookahead[1].
[1] Some documentation I have read says that a high rc-lookahead has very little effect (except in 1-pass bitrate mode) and can actually be harmful.
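For concreteness, and assuming the left value in each line above is the 512kb encode and the right one the primary (reading "umh-24" as umh with a 24-pixel merange), those two columns correspond roughly to command lines like:

512kb-style: x264 --me umh --merange 24 --subme 10 --threads 3 --keyint 600 --rc-lookahead 60 [remaining options omitted]
Primary-style: x264 --me umh --merange 64 --subme 11 --threads 2 --lookahead-threads 1 --keyint 250 --rc-lookahead 250 [remaining options omitted]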
Joined: 4/17/2010
Posts: 11497
Location: Lake Chargoggagoggmanchauggagoggchaubunagungamaugg
Starting from 2450M, we are no longer producing the 8bit420 encode that was our primary downloadable for years. The 10bit444 encode was picked to replace it, since it is a huge upgrade in video quality without sacrificing file size. For compatibility with players that do not support 10bit444 encodes, we are offering the 512kb encode as a downloadable. It has a slightly bigger file size, because it applies aspect ratio correction internally and does not use duplicate frame removal.
Read here how to set up your player to play back 10bit444 encodes.
Here is the general thread about them.
Old publications are not affected by this change; encodes that already exist will stay untouched.