Editor
Joined: 11/3/2013
Posts: 506
Warp wrote:
Perhaps the simple act of changing the word "technical" to "effort" might in fact be enough. The latter gives a much better idea of what the rating is about than the former. And using one single word. I like it.
I personally prefer the word "difficulty", since I feel technical ratings should incorporate the level of TASing skill required to make the TAS as well as the sheer volume of gruntwork. Highly botted TASes, for example, should not get lower technical ratings because the bot cuts down the manpower; on the contrary, they should get higher ratings because skilful bot use is a cut above bog-standard TASing, and because the bot, with its millions of calculations per second, can help achieve a higher level of perfection/superhuman-ness than manual re-records could alone. Actually, that's another reason I don't like the technical rating being purely a metric of effort. We already have one of those - it's called the re-record count.
Active player (264)
Joined: 8/14/2014
Posts: 188
Location: North Kilttown
thatguy wrote:
Actually, that's another reason I don't like the technical rating being purely a metric of effort. We already have one of those - it's called the re-record count.
The problem with that is the more frequent use of TAStudio, which (last I checked) had an issue with recording too many rerecords. I don't know if that's been fixed, but it did present an issue. The other problem comes down to different TASing methods that can give an inaccurate rerecord count. As a personal example, I like to use multiple instances of BizHawk to test out different strats. This allows for easy side-by-side comparison while avoiding accidentally overwriting one of the strats. It's also useful for testing out RNG manipulation I've found. The issue, though, is that you only end up seeing the rerecords for the "final" strat and not all of the other tests that were done. The rerecord count by itself would give a fairly inaccurate view of the actual work.
Somewhat damaged.
Editor, Player (54)
Joined: 12/25/2004
Posts: 634
Location: Aguascalientes, Mexico
I see the Technical part as how much research was done on the game: studying the RAM, disassembling the game, comparing times with other runs of the same game, scripts used, etc. So even if a run doesn't look impressive, when you read the description you can realize how much background it has and get an idea of the quality of the run.
I'm the best in the Universe! Remember that!
Samsara
She/They
Senior Judge, Site Admin, Expert player (2238)
Joined: 11/13/2006
Posts: 2822
Location: Northern California
TASvideos Admin and acting Senior Judge 💙 Currently unable to dedicate a lot of time to the site, taking care of family. Now infrequently posting on Bluesky
warmCabin wrote:
You shouldn't need a degree in computer science to get into this hobby.
Editor, Expert player (2098)
Joined: 8/25/2013
Posts: 1200
and a year later, here I am again to complain (nothing new right loll)

[3191] GBC Scooby-Doo! Classic Creep Capers by Birth in 18:47.86
[3198] PSX Suzuki Bakuhatsu by Spikestuff in 15:38.35
[3202] Arcade Shinobi "maximum kills" by V in 11:41.42
[3245] A2600 Pitfall II: Lost Caverns "maximum score" by Alyosha in 09:30.32
[3237] A2600 Pitfall II: Lost Caverns by Alyosha in 04:03.05
[3210] Arcade Bionic Commando by Jules in 05:33.70
[3238] GBA Drake & Josh by jlun2 in 24:10.52
[3220] GBA Prince of Persia: The Sands of Time "100%" by theenglishman in 1:14:16.72
[3262] GB Jurassic Park Part 2: The Chaos Continues by Hetfield90 & StarvinStruthers in 23:20.04
[3274] Genesis Gunstar Heroes "2 players" by Samsara in 31:00.33
[3129] Genesis Disney's Pinocchio by Meerkov in 16:25.29
[3268] MSX Nuts & Milk by Zupapa in 05:09.85
[3187] NES Inversion "Extra world" by MESHUGGAH & jhztfs in 01:16.44
[3185] NES Inversion by MESHUGGAH & jhztfs in 05:18.91
[3133] NES Vice: Project Doom by Sonikkustar & Alyosha in 12:30.23
[3207] NES Wolverine "warpless" by Jules in 07:57.46
[3200] PSX Mega Man X3 "all stages" by Hetfield90 & nrg_zam in 34:20.04
[3070] PSX Oddworld: Abe's Oddysee "100%" by Samtastic in 1:04:19.52
[3192] PSX Soul Blade by Spikestuff in 03:06.82
[3229] SMS The Lucky Dime Caper Starring Donald Duck by Challenger in 18:33.01
[3206] SNES Action Pachio by WarHippy in 19:06.16
[3203] SMS The Ninja by Isotarge in 05:16.54
[3246] SNES Classic Kong Complete by TheRealThingy & BrunoVisnadi in 01:54.71
[3180] SNES Front Mission Series: Gun Hazard by Hetfield90 in 2:06:15.37
[3188] SNES Justice League Task Force by KusogeMan in 08:39.63
[3247] SNES Mega Man X3 by GlitchMan & Hetfield90 in 37:54.50
[3219] SNES Porky Pig's Haunted Holiday by Challenger in 23:30.46
[3174] SNES Prehistorik Man by WarHippy in 18:40.75
[3248] Wii Muramasa: The Demon Blade "Kisuke" by Bernka in 1:03:49.17

Twenty-nine movies.....all of which were made this year. I get some of these, perhaps.
Oh, this crappy game's in the vault, I'm not gonna watch it, screw it. But then you have something like Gunstar and Mega Man X3 that aren't getting many votes at all. What the heck? It's like as soon as a movie is published nowadays people ignore it unless it's got Mario or Super Metroid in the title. It's, quite frankly, ridiculous. So yeah, that's enough of my complaining this year.
effort on the first draft means less effort on any draft thereafter - some loser
Memory
She/Her
Site Admin, Skilled player (1553)
Joined: 3/20/2014
Posts: 1765
Location: Dumpster
arandomgameTASer wrote:
It's like as soon as a movie is published nowadays people ignore it unless it's got Mario or Metroid in the title.
I wish. Friendly reminder that not all games in the same series are created equal when it comes to receiving attention.
[16:36:31] <Mothrayas> I have to say this argument about robot drug usage is a lot more fun than whatever else we have been doing in the past two+ hours
[16:08:10] <BenLubar> a TAS is just the limit of a segmented speedrun as the segment length approaches zero
Samsara
She/They
Senior Judge, Site Admin, Expert player (2238)
Joined: 11/13/2006
Posts: 2822
Location: Northern California
There's really no point to rating movies anymore. It's a broken and nebulous system that people are clearly abusing for personal gain and I doubt it's ever going to be properly fixed.
TASvideos Admin and acting Senior Judge 💙 Currently unable to dedicate a lot of time to the site, taking care of family. Now infrequently posting on Bluesky
warmCabin wrote:
You shouldn't need a degree in computer science to get into this hobby.
Editor, Skilled player (1344)
Joined: 12/28/2013
Posts: 396
Location: Rio de Janeiro, Brasil
I've noticed people are barely rating movies anymore... This is a notable example of an entertaining movie that has been there since February and has gotten only 1 rating so far: http://tasvideos.org/3339M.html On the other hand, older movies (from 2006-2007) have a very large number of ratings, and while they of course had more time for people to see and rate them, the difference is obviously disproportionate. Users are definitely less interested in rating nowadays. So maybe this is the time to renovate the rating system? There are various suggestions in this and other threads, such as implementing the 1-5 stars system and changing the design in a way that makes it more appealing for people to rate. There might be better ideas; I just feel it's time to do something.
My YouTube channel: https://www.youtube.com/channel/UCVoUfT49xN9TU-gDMHv57sw Projects: SMW 96 exit. SDW any%, with Amaraticando. SMA2 SMW small only Kaizo Mario World 3
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
On a somewhat tangential note, I don't really like the "1 to 5 stars" rating system, which seems to be so prevalent. If you see this rating, how would you interpret it? Most people would interpret it as average, pretty much exactly half-way between the minimum and maximum. Would you, however, believe that the rating is the result of two 5-star ratings and four 1-star ratings? (In other words, twice as many people gave it the minimum rating as the maximum rating.) How so? That seems impossible and completely unintuitive, right?

The average of all those ratings is (1+1+1+1+5+5)/6 ≈ 2.33, which, rounded to the nearest half-star, indeed gives 2.5 stars. Yet it still feels very unintuitive, and it's hard to understand how such an imbalance in ratings can put the bar exactly in the middle. And that's what I consider somewhat of a problem: the star bar gives a false impression of how the movie has been rated. It gives the impression that the ratings are about evenly split, when they clearly are not.

The problem is that 1 is the minimum rating, which in turn means that the leftmost star is always "lit". Even if every single rating were the minimum, it would still be lit. This gives a misleading visual representation. If the leftmost star is removed, it suddenly gives a much more intuitive visual cue that corresponds more to the idea of four minimum ratings and two maximums. Another (perhaps better) solution is to allow 0-star ratings (which means that even with 5 stars, the leftmost one can be off, or only half filled).
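Warp's arithmetic can be checked with a short sketch (Python; the vote counts are just his hypothetical example, not real site data):

```python
# Warp's hypothetical example: four minimum (1-star) and two maximum (5-star) votes.
ratings = [1, 1, 1, 1, 5, 5]

mean = sum(ratings) / len(ratings)   # 14 / 6, roughly 2.33

# A star bar typically rounds the mean to the nearest half-star.
half_stars = round(mean * 2) / 2     # 2.5 -- exactly the middle of a 1-to-5 bar

print(f"mean = {mean:.2f}, displayed = {half_stars} stars")
```

So a movie where minimum votes outnumber maximum votes two to one still displays as a bar filled exactly half-way, which is the visual quirk being discussed.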
Amaraticando
It/Its
Editor, Player (159)
Joined: 1/10/2012
Posts: 673
Location: Brazil
I think {0, 1, 2, 3, 4, 5} stars is the best idea. Or, if not that, removing the decimal digit from the voting system. Sometimes, having many options is the biggest reason to avoid choosing at all.
Site Admin, Skilled player (1254)
Joined: 4/17/2010
Posts: 11475
Location: Lake Char­gogg­a­gogg­man­chaugg­a­gogg­chau­bun­a­gung­a­maugg
I'd prefer a system identical to IMDB's: you can enter integers from 1 to 10, and the resulting average has a fractional part. People are used to that system, and to TAS movies being rated on a 10-point scale, so it should be fine. The problems come from having 2 separate ratings for each movie, plus a manually entered fractional part, which turns 10 ways to vote into 200.
Warning: When making decisions, I try to collect as much data as possible before actually deciding. I try to abstract away and see the principles behind real world events and people's opinions. I try to generalize them and turn into something clear and reusable. I hate depending on unpredictable and having to make lottery guesses. Any problem can be solved by systems thinking and acting.
Alyosha
He/Him
Editor, Emulator Coder, Expert player (3821)
Joined: 11/30/2014
Posts: 2829
Location: US
I think one way to improve participation would be to let people rate movies while they're still on the workbench. That way people won't have to be bothered with going back after the movie is published (although this would still be an option too). Maybe even just replace the 'did you find this entertaining?' question with the ratings box. Personally, I've never rated a single movie, as it seems like a pointless exercise, but at least having the option up front might be more motivating.
Experienced player (689)
Joined: 2/5/2012
Posts: 1794
Location: Brasil
I agree with Alyosha, both on the cause of the problem and on the solution.
I want all good TAS inside TASvideos, it's my motto. TAS i'm interested: Megaman series, specially the RPGs! Where is the mmbn1 all chips TAS we deserve? Where is the Command Mission TAS? i'm slowly moving away from TASing fighting games for speed, maybe it's time to start finding some entertainment value in TASing.
Player (26)
Joined: 8/29/2011
Posts: 1206
Location: Amsterdam
Warp wrote:
If you see this rating, how would you interpret it?
I would interpret this as mediocre to awful. People tend to use only the upper part of the scale as a matter of courtesy (e.g. 5 = excellent, 4 = pretty good, 3 = meh) so anything getting an average of three or below is probably not worth watching.
Would you, however, believe that that rating is the result of two 5-star ratings and four 1-star ratings? (Which, in other words, means that twice as many people gave it a minimum rating than a maximum rating.)
Of course, the answer to your issue is to not show ratings until there is a minimum number of votes (e.g. 10), because such situations become vanishingly unlikely as the number of votes rises. Alternatively, only show ratings if the standard deviation is below a certain threshold, because a high SD indicates lack of consensus. Of course, if we want more votes, then the obvious thing to do is to put a single-click voting bar right on the movie page. It's kind of silly that the Facebook LIKE button is right on the movie page, while for voting I have to first navigate to a subpage, then read two separate tabs explaining what I'm supposed to vote for, then use four separate pulldown menus, and finally hit an easily-missable 'send' button that is nowhere near the vote box (and isn't labeled 'vote'). It's basic interface design, people :D
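Radiant's two gating rules could be sketched roughly as follows (Python; the function name and both thresholds are illustrative placeholders, not site policy, and a 1-5 vote scale is assumed):

```python
import statistics

def displayable_rating(votes, min_votes=10, max_sd=1.5):
    """Return the mean rating, or None if it shouldn't be shown yet.

    Hides the rating when there are too few votes, or when the spread
    (sample standard deviation) is high enough to indicate a lack of
    consensus. Both thresholds are placeholders for illustration.
    """
    if len(votes) < min_votes:
        return None                       # too few votes to be meaningful
    if statistics.stdev(votes) > max_sd:  # high spread = no consensus
        return None
    return sum(votes) / len(votes)

# Warp's split example stays hidden under either rule:
print(displayable_rating([1, 1, 1, 1, 5, 5]))                    # None
# A dozen consistent votes would be shown:
print(displayable_rating([4, 4, 5, 4, 4, 4, 5, 4, 4, 5, 4, 4]))  # 4.25
```

The `min_votes` check also conveniently guards `statistics.stdev`, which needs at least two data points.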
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
Radiant wrote:
Warp wrote:
If you see this rating, how would you interpret it?
I would interpret this as mediocre to awful.
I think you missed my point. My intention was not to ask "how high-quality would you interpret something with this many stars to be?"
Would you, however, believe that that rating is the result of two 5-star ratings and four 1-star ratings? (Which, in other words, means that twice as many people gave it a minimum rating than a maximum rating.)
Of course, the answer to your issue is to not show ratings until there is a minimum number of votes (e.g. 10), because such situations become vanishingly unlikely as the number of votes rises.
I think you are still missing the point. My point has nothing to do with the reliability of the results or how much variance there may be, or anything like that. It doesn't matter how many votes there are. There could be 400 one-star votes and 200 five-star votes and you would still get the same picture as above. My point was that a 5-star rating system, visualized like that, using a range of 1 to 5 stars, is visually misleading. The above graphic would give the impression that the TAS has exactly average ratings, even though it has well below-average ratings. The reason for that is that the first star is extraneous, always lit, and makes the graphic misleading. This has nothing to do with the "reliability" of the votes, or their number (or how you would subjectively interpret the result as a measurement of "quality").
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
Alyosha wrote:
Maybe even just replace the 'did you find this entertaining?' question with the ratings box.
Something like this was tried many years ago. It was relatively soon changed back to the poll question.
Player (26)
Joined: 8/29/2011
Posts: 1206
Location: Amsterdam
Warp wrote:
My point has nothing to do with the reliability of the results or how much variance there may be, or anything like that. It doesn't matter how many votes there are. There could be 400 one-star votes and 200 five-star votes and you would still get the same picture as above.
Yes, so that has everything to do with how much variance (or standard deviation) there is. What you're missing is that movies with four one-star and two five-star votes are uncommon at best, whereas movies with 400 one-star and 200 five-star votes are vanishingly unlikely, to the point of nonexistence. So it's totally fine to have a system that takes the former into account but not the latter.
The above graphic would give the impression that the TAS has exactly average ratings,
No, for the reason I mentioned above. The average rating on a five-point scale is expected to be above four (e.g. see this article for an explanation).
The reason for that is that the first star is extraneous, always lit, and makes the graphic misleading.
The reason is that you think of a five-point scale as 0 through 4, whereas most people think of a five-point scale as 1 through 5. That's why the latter is the standard and the former is not.
Noxxa
They/Them
Moderator, Expert player (4123)
Joined: 8/14/2009
Posts: 4089
Location: The Netherlands
Nach and I have recently been working on a new movie module which would include an easier rating system, but progress on it has stalled recently. I agree that a rating star system like IMDB's would be much better, and it is something we're aiming to work towards, but this will take some time.
http://www.youtube.com/Noxxa <dwangoAC> This is a TAS (...). Not suitable for all audiences. May cause undesirable side-effects. May contain emulator abuse. Emulator may be abusive. This product contains glitches known to the state of California to cause egg defects. <Masterjun> I'm just a guy arranging bits in a sequence which could potentially amuse other people looking at these bits <adelikat> In Oregon Trail, I sacrificed my own family to save time. In Star trek, I killed helpless comrades in escape pods to save time. Here, I kill my allies to save time. I think I need help.
Skilled player (1731)
Joined: 9/17/2009
Posts: 4980
Location: ̶C̶a̶n̶a̶d̶a̶ "Kanatah"
Well, apparently, the reason the rating system hasn't been updated is that coding it seems to be the problem. I wonder if it would be possible to open-source it and try getting help from social media? Not sure if there would be (or how much of a) security risk in open-sourcing the sections needed, though.
Alyosha
He/Him
Editor, Emulator Coder, Expert player (3821)
Joined: 11/30/2014
Posts: 2829
Location: US
jlun2 wrote:
Well, apparently, the reason the rating system hasn't been updated is that coding it seems to be the problem. I wonder if it would be possible to open-source it and try getting help from social media? Not sure if there would be (or how much of a) security risk in open-sourcing the sections needed, though.
Woah, the post in that quote is almost 2 years old already, time sure does fly! XD

Just as a point of reference, I looked at the official encode of the TAS BrunoVisnadi linked in his post above. That video has 214 ratings (thumbs up or down on youtube) out of 13414 views at the time of my loading it: roughly 1.5%. Even if only half of those views are unique people watching more than 1 or 2 minutes of the TAS, we'd still only have a 3% participation rate in the youtube rating. I'm also guessing that most of those people are not even aware of the publication page for that TAS on TASVideos.org, so they aren't the same audience that would use the rating system either.

Obviously this is just one movie, but it does seem that people just aren't interested in rating TASes, neither TASers nor the general public, so it might not matter what the rating system is changed to if the concept itself does not appeal to many people.
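Alyosha's back-of-the-envelope estimate can be redone in a few lines (Python; the figures are the ones quoted in the post, taken at a single point in time):

```python
ratings = 214     # thumbs up/down on the official YouTube encode
views = 13414     # view count at the time of the post

# Raw participation: ratings as a fraction of all views.
rate = ratings / views
print(f"raw participation: {rate:.1%}")             # about 1.6%

# If only half the views are unique people who watched a meaningful
# amount of the TAS, the effective participation rate roughly doubles.
effective = ratings / (views / 2)
print(f"effective participation: {effective:.1%}")  # about 3.2%
```

Either way the figure lands in the low single digits, which is the point being made about how few viewers rate at all.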
Player (26)
Joined: 8/29/2011
Posts: 1206
Location: Amsterdam
Alyosha wrote:
Just as a point of reference, I looked at the official encode of the TAS BrunoVisnadi linked in his post above. That video has 214 ratings (thumbs up or down on youtube)
214 ratings is way more votes than almost any video gets on the TASvideos website. If we can improve our voting process to get even half of that, that'd be a definite improvement.
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
Radiant wrote:
The above graphic would give the impression that the tas has exactly average ratings,
No, for the reason I mentioned above. The average rating on a five-point scale is expected to be above four
You are still missing the point completely. I am not talking about the subjective meaning of "average" as used in colloquial language when judging a work of art, nor about people's behavior when they estimate and rate such things. I am not talking about how the results should be subjectively interpreted from a psychological perspective in terms of the quality of the work. I am not talking about "four stars means that the work is meh".

You seem to still be clinging to your complete misinterpretation of my original question, "how would you interpret this graph?" I was not asking for a psychoanalysis of the mentality of the people who have rated the work, or an essay on typical human behavior. I was talking about how the graphic is visually misleading, giving the impression that the ratings are exactly half-way through the scale, when in reality they are significantly lower than half-way.

Forget "average", since you seem to have so many problems understanding what that word means. Think of the more mundane "half-way between lowest and highest" concept instead.
The reason for that is that the first star is extraneous, always lit, and makes the graphic misleading.
The reason is that you think of a five-point scale as 0 through 4, whereas most people think of a five-point scale as 1 through 5. That's why the latter is the standard and the former is not.
But the problem is that when you display the results using five stars, the result is misleading. People don't think, "oh, the range is 1 to 5? That means I should ignore the first star in the image and just look at the four remaining ones." They look at the entire image, see that the highlighted portion covers exactly half of it, and thus come to the intuitive but wrong conclusion that the ratings are likewise evenly split, when that's very far from the truth.
Editor, Skilled player (1344)
Joined: 12/28/2013
Posts: 396
Location: Rio de Janeiro, Brasil
That's a very easy problem to fix. As suggested before, there could be the option of voting 0 stars. The interface I imagine is 6 clickable stars, with an 'X' on the leftmost one denoting a 0-star vote.
My YouTube channel: https://www.youtube.com/channel/UCVoUfT49xN9TU-gDMHv57sw Projects: SMW 96 exit. SDW any%, with Amaraticando. SMA2 SMW small only Kaizo Mario World 3
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
BrunoVisnadi wrote:
That's a very easy problem to fix. As suggested before, there could be the option of voting 0 stars. The interface I imagine is 6 clickable stars, with an 'X' on the leftmost one denoting a 0-star vote.
Hmm... I don't think that's very intuitive. Its meaning isn't very apparent.