| | TDJ
Registered: Dec 2001 Posts: 1879 |
csdb gem top 40
For those curious about how the csdb charts would look if negative votes were cast aside, I've written a special and completely useless article.
Read it at http://www.lyrical.demon.nl/csdb/gem40.html .. Don't mind the lack of pretty pictures or the ugly layout, it's the text that counts :)
And of course I'm more than interested in your reactions, so bring it on! |
|
| | TDJ
Registered: Dec 2001 Posts: 1879 |
Quote: <irony>
What, Deus Ex Machina only at 17, closely followed by Batmania? Muahahaha! ;-)
</irony>
And why not? There's no doubt that Deus Ex Machina is a much better demo (no wonder, since it was made 11 years later by somebody who was already a top coder when Batmania was released), but bear in mind that a lot of people prefer the old-style demos as well - hence they vote for demos such as Batmania, and maybe not for D.E.M.
Also, what good would it be if D.E.M. were #1 on this chart as well? Remember: this is an alternative to the normal chart, giving a different view but using the same votes :)
I'm not sure where I will take this though - it's more of an experiment so far, so chances are that there will be no follow-up ;Z |
| | SIDWAVE Account closed
Registered: Apr 2002 Posts: 2238 |
I think we need the old system used in the 80's charts.
Once a month it's voting time, and once a year it's demo of the year.
It gives a better reflection of what people think at any given time. The chart becomes more 'correct' IMO.
Online 'floating' votes, well - you can always vote on an older demo, so the current chart is probably not up to date. And maybe you don't see all the new demos.
What about a monthly vote roll-call, after which you can't vote for that demo anymore?
At the end of the year you can vote again, for best of the year.
And then you have an online vote, like the system is now, that is just a "one of the best demos I ever saw" vote, which can be given at any time, and then no more.
?
|
| | Hein
Registered: Apr 2004 Posts: 942 |
UP voting is the same as DOWN voting, only in a different direction. Who says that someone who votes a 10 isn't as biased as someone voting a 1?
I'm curious, though, how it's possible that 64allstars had 5x10, 5x1 and 1x9 for an average of 9.2, and then, after someone voted another 1 (making it 6x1, 5x10 and 1x9), all of a sudden the average was 2.3. What's the magic formula used on CSDb? |
| | TDJ
Registered: Dec 2001 Posts: 1879 |
Quote: UP voting is the same as DOWN voting, only in a different direction. Who says that someone who votes a 10 isn't as biased as someone voting a 1?
I'm curious, though, how it's possible that 64allstars had 5x10, 5x1 and 1x9 for an average of 9.2, and then, after someone voted another 1 (making it 6x1, 5x10 and 1x9), all of a sudden the average was 2.3. What's the magic formula used on CSDb?
As for the up/down voting thing: completely true, that's why I have the "at least 10 gem-votes" rule.
And the 64allstars thing is by far the clearest example of why the CSDb voting system doesn't work, IMHO :) |
| | TDJ
Registered: Dec 2001 Posts: 1879 |
Quote: I think we need the old system used in the 80's charts.
Once a month it's voting time, and once a year it's demo of the year.
It gives a better reflection of what people think at any given time. The chart becomes more 'correct' IMO.
Online 'floating' votes, well - you can always vote on an older demo, so the current chart is probably not up to date. And maybe you don't see all the new demos.
What about a monthly vote roll-call, after which you can't vote for that demo anymore?
At the end of the year you can vote again, for best of the year.
And then you have an online vote, like the system is now, that is just a "one of the best demos I ever saw" vote, which can be given at any time, and then no more.
?
Okay, so if I understand you correctly you want 2 different voting systems: one for current demos, one for all-time demos. For the latter you want to keep using the system already in place, and for the former you want to use the 'once a month + once a year' system?
That might work (it would filter out the relatively large number of 'new' demos from the all-time chart, demos that are mostly there because people just saw them), but on the other hand: with so few demos being released in 2004 (fewer than 50 so far), why bother? Why not just select all eligible demos at the end of the year based on average CSDb score (let's say all demos with a score of at least 6), make a zipfile, ask people to download it, take a look at the demos in it (which wouldn't take more than a few hours max) and then give their top 3 or something? For demo of the year that would be sufficient, and it would surely be better than the current system of just taking the party winners .. |
| | Ben Account closed
Registered: Feb 2003 Posts: 163 |
I personally have always considered a unidimensional system inappropriate. I once tried to rank musicians, but by introducing a 10-dimensional system and giving weights to each and every dimension. This system even included a dimension like 'employment/development' (how did a musician develop over time), the obvious categories sound, melody, use of player, impact/popularity, et cetera, and - yep, even - overall impression.
Actually using a multidimensional system in the appropriate way is indeed quite laborious, but it would reflect more precisely what the qualities of particular demos are. I can imagine e.g. Krestology would end up high in the graphics and design category, but One-Der would score considerably higher in the code-technical difficulty category.
By introducing an overall impact or innovativeness category, the 'radicalness' of inventions can be easily captured irrespective of the time at which the inventions were done.
Just some thoughts..
<edit>
Please note, voters are also expected to provide the weights. Some people are mostly interested in technical code, others prefer the atmosphere over code, et cetera.. |
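Ben's scheme - per-category scores combined with weights that each voter supplies themselves - boils down to a weighted average. A minimal sketch, with hypothetical category names and numbers (nothing here comes from an actual rating system):

```python
def weighted_score(scores, weights):
    """Combine per-category scores (1..10) into one rating using a
    voter's own weights; weights are normalized by their sum, so
    they don't need to add up to 1."""
    total_weight = sum(weights[category] for category in scores)
    weighted = sum(scores[category] * weights[category] for category in scores)
    return weighted / total_weight

# Hypothetical demo and two hypothetical voters with different priorities:
demo = {"code": 10, "graphics": 7, "music": 8, "design": 9}
coder_weights = {"code": 5, "graphics": 1, "music": 1, "design": 2}
artist_weights = {"code": 1, "graphics": 5, "music": 2, "design": 4}

print(round(weighted_score(demo, coder_weights), 2))   # 9.22
print(round(weighted_score(demo, artist_weights), 2))  # 8.08
```

The same demo gets a noticeably different rating depending on who is weighting, which is exactly the point Ben makes in his edit: voters who care about technical code and voters who prefer atmosphere should not be forced through one formula.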
| | TDJ
Registered: Dec 2001 Posts: 1879 |
For my old system I also gave scores for different disciplines (code, gfx, music, even for how much of the demo was made by the group itself) and then calculated an overall value. But, apart from it being way too much work, it's also nonsense, I feel, as it is the total that counts. And have you ever seen a music review where they give different scores for the music, the lyrics, and the way the band dresses? :)
For my own personal ratings-database I use the 5 star system: 1 star = poor, 2 stars = fair, 3 stars = good, 4 stars = excellent and 5 stars = magnificent.
I use it for 2 different purposes: first of all to see what demos I will store on a real floppy so I can view them on the real c64 (all gems, demos that score 3 stars or more), and also for a new project that will start early next year, about the 'best' c64 demos ever, and the groups that made them.
Roughly translated: 1 star = 1-5 here on csdb, 2 stars = 6/7 here, 3 stars = 8, 4 stars = 9 and 5 stars = a 10.
Of course it's not really about how good a demo is, but how much I like it. And that's what counts. Why would I rate a demo that's technically perfect and has good gfx + music, but no soul, and is a bore to watch, a 10?
And thus we get back to my article, where 'favourite' votes are all that matter :) |
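TDJ's rough translation between his 5-star scale and CSDb's 1-10 scale is just a bucketed lookup. A minimal sketch - the bucket edges and labels are exactly as he states them, the function name is made up:

```python
def score_to_stars(score):
    """TDJ's stated translation from a CSDb score (1..10) to stars:
    1-5 -> 1 star, 6-7 -> 2, 8 -> 3, 9 -> 4, 10 -> 5."""
    if score <= 5:
        return 1
    return {6: 2, 7: 2, 8: 3, 9: 4, 10: 5}[score]

LABELS = {1: "poor", 2: "fair", 3: "good", 4: "excellent", 5: "magnificent"}

# A demo rated 8 on CSDb is a 3-star "good" demo, i.e. worth a real floppy:
print(score_to_stars(8), LABELS[score_to_stars(8)])  # 3 good
```

Note how asymmetric the mapping is: half of the CSDb scale collapses into the single "poor" bucket, which fits the thread's point that only the favourite votes carry information.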
| | jailbird
Registered: Dec 2001 Posts: 1578 |
TDJ, more than a year has passed since your article was published, and it seemed like a hell of a good idea to me! Since it's not updated dynamically, I'd be really interested to see the current situation on the GEM chart... Any plans to continue the project and write another article?
|
| | TDJ
Registered: Dec 2001 Posts: 1879 |
Actually I was just thinking about it the other day .. no real plans, but if more people are interested I may give it a go. |
| | Ed
Registered: May 2004 Posts: 173 |
I am looking forward to more of this. Go ahead!
|