TOP 100 Progressive Music Albums (Topic Closed)

Larkstongue41 (Forum Senior Member, Eastern Canada)
Posted: October 26 2017 at 09:37
I just want to point out an issue that might be related. I might just be missing something here, but whenever I narrowly filter the "Top Prog Albums" page, plenty of albums that should appear based on my filters aren't showing up. For instance, if I look for Post/Math Rock albums from 1996 and put on no filters other than those two, only one album shows up when there should be at least a few. Checking just now, I realised that if I put a number in the "Minimum Average Rating Value" filter, more albums (2) pop up, but I still know of at least one that is missing.

EDIT - Replacing the 0 with 1 in "Minimum Number of Ratings" seems to have taken care of the issue, but it still seems strange to me.


Edited by Larkstongue41 - October 26 2017 at 09:40
"Larks' tongues. Wrens' livers. Chaffinch brains. Jaguars' earlobes. Wolf nipple chips. Get 'em while they're hot. They're lovely. Dromedary pretzels, only half a denar."

ProgRobUK (Forum Groupie, United Kingdom)
Posted: October 26 2017 at 09:05
And I forgot to mention in my reply above... try removing the restriction to just 2010 or later. You then spot that third-placed The Mountain by Haken moves above Parallax II. This is because the average number of votes (N) has changed dramatically with the inclusion of many albums that have huge numbers of votes recorded against them.

Just goes to show that the current system isn't ideal.

ProgRobUK (Forum Groupie, United Kingdom)
Posted: October 26 2017 at 08:44
I can explain why it's happening and I agree that it is wrong. Essentially, I don't believe changing the years or the genres included should change the relative order of any two albums.

As explained by someone else in an earlier post, the formula used is:

QWR = (NR + nr) / (N + n)

where QWR is the weighted rating, R is the average rating of all albums, N is the average number of votes per album for all albums, r is the album average and n is the number of ratings.

What isn't explained in the earlier post is that R and N depend on the filter (i.e. the selected genres and years). So if you add/remove a year or genre to/from your filter then R and N will be recalculated based only on those albums in the filtered set - as a result that changes the QWR values. In other words, removing Heavy Prog changes the values of N and R used in the calculation of QWR and hence used to order the albums. As you have demonstrated beautifully, this can have a dramatic effect as albums quite high up the chart can switch places.

To show more detail we would need to know the values of N and R that were used in your examples, but working out those values is difficult, for these reasons:
1. I don't have access to the database;
2. New reviews are being added all the time; and
3. Individual reviewers' ratings are themselves weighted, depending on whether there is a written review and on whether the reviewer is a regular member or a collaborator.

As a result, when I did my analysis in the earlier posts I had to make a few assumptions. However, my conclusion was that, in general, the value of R looks pretty stable, but the value of N can change quite a lot.

The reason your specific queries caused a swap is that Pale Communion has a lower r value (average rating) than Parallax II but a much higher n value (number of ratings). This is exactly the situation where, if N changes, the QWR values can change enough for the albums to switch places!

As an example, I set R at 4.0 and tried N at 100 then N at 87. With that simple change I was able to get close to the QWR values you had and therefore was able to cause them to swap places.
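
To make that concrete, here is a rough sketch in Python that reproduces the swap. The r and n figures are the ones from your post; R = 4.0 and the two values of N are my assumptions, not anything read from the database:

def qwr(R, N, r, n):
    # Query Weighted Rating: pulls an album's own average r towards the
    # population average R, more strongly the smaller n is relative to N.
    return (N * R + n * r) / (N + n)

# r = album average rating, n = number of ratings (from the post below)
pale_communion = {"r": 4.16, "n": 967}   # Pale Communion - Opeth
parallax_ii    = {"r": 4.20, "n": 245}   # The Parallax II - BtBaM

for N in (100, 87):   # assumed drop in N when Heavy Prog is removed from the filter
    pc = qwr(4.0, N, **pale_communion)
    px = qwr(4.0, N, **parallax_ii)
    top = "Pale Communion" if pc > px else "Parallax II"
    print(f"N={N}: Pale Communion={pc:.4f}  Parallax II={px:.4f}  top={top}")

# N=100: Pale Communion=4.1450  Parallax II=4.1420  top=Pale Communion
# N=87:  Pale Communion=4.1468  Parallax II=4.1476  top=Parallax II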

As for a fix, I believe there is only one effective solution: calculate N and R based on every album in progarchives irrespective of year and genre (and, for that matter, country). Then simply use the filters to decide which albums to show in any chart that the user generates. That way albums stay in a consistent order irrespective of the genres and years that are included.
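
For what it's worth, a minimal sketch of what I mean (plain Python over an assumed album structure, not the real PA code): compute R and N once over the whole catalogue, then let the filters decide only which rows are displayed.

def build_chart(albums, keep):
    # albums: list of dicts with "avg_rating" and "num_ratings" for EVERY album on the site
    # keep:   predicate implementing the user's genre/year/country filters
    R = sum(a["avg_rating"] for a in albums) / len(albums)    # global average rating
    N = sum(a["num_ratings"] for a in albums) / len(albums)   # global average number of ratings

    def qwr(a):
        return (N * R + a["num_ratings"] * a["avg_rating"]) / (N + a["num_ratings"])

    # Filters only decide what is shown; R and N (and hence the relative order) never change.
    return sorted((a for a in albums if keep(a)), key=qwr, reverse=True)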

That's my view anyway. You can probably tell I have spent far too long looking at this issue Smile Just glad someone else finds it annoying too, given that this is such a valuable and interesting feature of progarchives.



Edited by ProgRobUK - October 26 2017 at 08:53

nuunuu (Forum Newbie, United States)
Posted: October 15 2017 at 00:01
Well, if the QWRs change with different criteria, it stands to reason that the QWR is either affected by the criteria themselves or affected by the particular set of items returned in the query results, both of which seem wrong. My assumption is that the QWR is baked into the page and generated from data brought back by the query for each entry, as opposed to being returned with the rating itself, which would be ideal imo, considering it is static data as far as the listing is concerned. As for what could be wrong with the implementation, it could be anything from the algorithm missing some parentheses to a table join implicitly removing or duplicating user reviews (i.e. bringing back the set of users that voted for an album once for each genre selected, artificially boosting albums with high numbers of reviews due to the weighted system).
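
Purely to illustrate the kind of join bug I'm speculating about (hypothetical data and table shape, I have no idea what the real query looks like): if ratings are joined once per selected genre, an album tagged with two of the selected genres gets every rating counted twice.

ratings = [("Pale Communion", "user_a", 5), ("Pale Communion", "user_b", 4)]
album_genres = {"Pale Communion": ["Heavy Prog", "Progressive Metal"]}
selected = {"Heavy Prog", "Progressive Metal"}

# One joined row per (rating, matching genre) instead of one row per rating:
rows = [r for r in ratings for g in album_genres[r[0]] if g in selected]
print(len(rows))  # 4 instead of 2 - n is inflated, pulling the QWR further
                  # towards the album's own average under the weighted system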


EDIT: just noticed someone above said they were observing similar behavior when they selected different sets of years. Could be that there was a fix applied to the years filter, but not to the genre filter...

Edited by nuunuu - October 15 2017 at 00:09

siLLy puPPy (Special Collaborator - PSIKE, JRF/Canterbury, P Metal, Eclectic; SFcaUsA)
Posted: October 14 2017 at 14:16
^ I've noticed that as well. It makes absolutely no sense and I can't think of any logical reason for it. If anyone has an answer, I'm all floppy ears.

https://rateyourmusic.com/~siLLy_puPPy

nuunuu (Forum Newbie, United States)
Posted: October 14 2017 at 13:01
Hi, I'm a long-time user, but I just registered to post about some strange behavior I saw with the filters.

It is somewhat related to what was already being discussed on the last page of this thread. I went to the Top Prog Albums page and selected the following filters:

Genre:
Experimental/Post Metal
Heavy Prog
Progressive Metal
Tech/Extreme Prog Metal

Recording Type: 
Studio

Year:
2017
2016
2015
2014
2013
2012
2011
2010

Country:
All

Min Ratings - 0
Max Ratings - 0
Min Average Rating - 0
Max Results - 100

When applying these criteria, I see that the top album is Pale Communion - Opeth with 4.16 rating / 967 ratings / 4.1450 QWR.

The second album listed is The Parallax II: Future Sequence - BtBaM with 4.20 rating / 245 ratings / 4.1443 QWR.

Now, the interesting behavior I observed was that when I removed "Heavy Prog" from the genre filters, the positions of those two albums swapped. The rating and number of ratings remained the same, but the QWR changed based on the query results.

Pale Communion changed to 4.1460 and Parallax II changed to 4.1478.

I don't know if this behavior is intentional, but the way the QWR algorithm is described at the top of the page suggests it should be based on factors not influenced by a filtered subset of the data. I work as a software developer and would be happy to help look at the query if you need some extra manpower; otherwise, good luck and thanks for this awesome system. It's my sole source of finding music nowadays.

ProgRobUK (Forum Groupie, United Kingdom)
Posted: September 14 2017 at 05:23
M@X - has the algorithm been tweaked? The issues that I noted above seem to have gone away and albums now seem to be presented in a consistent order Smile

sl75 (Forum Groupie, Australia)
Posted: April 15 2017 at 07:01
And suddenly it works again.

As you were.

M@X (Forum & Site Admin Group - Co-founder, Admin & Webmaster, Canada)
Posted: April 15 2017 at 07:00
will check this out, thanks
Prog On !

sl75 (Forum Groupie, Australia)
Posted: April 15 2017 at 06:58
I tried to post this as a new topic in the Report Bugs section, but apparently I don't yet have permission to start new topics in that area, so forgive me for posting it here instead.


I just tried to alter the parameters on the Top Prog Albums page (to select all album types, multiple genres, a minimum number of ratings, a minimum rating value, etc.).
It has always worked before.

This time I got:


Microsoft SQL Server Native Client 11.0 error '80040e14'

Incorrect syntax near ','.

/top-prog-albums.asp, line 187



and then no list


ProgRobUK (Forum Groupie, United Kingdom)
Posted: February 14 2017 at 07:33
I'm continuing to look into this and have spotted a couple more difficult-to-justify placements.

If, at least today, you look at the Top 100 and compare it against the Top 50 Classic albums, the Top 100 has Thick As A Brick at number 3 and Wish You Were Here at number 4. But look at the Top 50 Classics and their positions swap!

The same happens a little further down the charts, with Foxtrot ahead of Dark Side of the Moon on the Top 100 but their positions swapped on the Top 50 Classics.

I don't have access to the raw data but using the information in the charts I have estimated the values for R (the average rating) and N (the average number of ratings) for each of the two charts. The interesting thing is that R seems to be pretty consistent at 3.41 for the Top 100 and 3.42 for the Top 50 Classic. This is really encouraging as it means that we are collectively pretty consistent in our distribution of our stars.

However, it is interesting that the values for N are very different: about 36 ratings per album for the Top 100, but about 100 ratings per album for the Top 50 Classics.

This confirms my suspicion that it is the changing average number of ratings that causes albums to switch places. I believe there is a solution to this - use the same values for R and N whatever "chart" we are generating, be it Top 50 Classics, Top 100 Ever or US Progressive Metal in 2015. That way, if album A is ranked higher than album B on one chart, they will remain in that order on all other charts generated at the same time.

Now to look at what values to use for R and N...

ProgRobUK (Forum Groupie, United Kingdom)
Posted: January 31 2017 at 06:50
I find the top 100 feature on progarchives really useful.

Also I understand the algorithm used to order it and it makes sense to me.

However, the algorithm has an unfortunate side effect which is, in my view, illogical - mathematically correct, but still illogical!

Today (31st Jan), if I look at the top albums of 2017 I see Mike Oldfield's Return to Ommadawn at the top - great, I love the original, so it looks like I might have to check this out.

So, I thought it would be interesting to see how that compares with other artists I like - say, Steven Wilson. So I show the top albums from 2015-2017 and it turns out that number 1 is Mike Oldfield and number 2 is Steven Wilson's Hand.Cannot.Erase. Now that is really interesting as that tells me that Return To Ommadawn is better than one of my favourite albums from the last couple of years. So I definitely need to check it out.

Rolling back time still further to, say, 2013, it turns out that Steven Wilson's The Raven That Refused To Sing is now top, Mike has slipped to second and HCE has moved down to third. I personally prefer HCE to Raven, but we all have our opinions.

Rolling back to 2011... and something strange happens... Raven is still top, but HCE and Return to Ommadawn swap places! This is odd. I find it hard to argue that Mike Oldfield's album is better than Steven Wilson's when considering albums from the period 2013-2017, but that the Steven Wilson album is better than the Mike Oldfield album if you also include the years 2011 and 2012 - especially considering that both albums were released several years outside of 2011/2012.


If you understand the algorithm you can understand why this happens. Mike Oldfield's latest (at least today) has only a relatively small number of votes (72) compared to HCE (1000+). The algorithm "pulls" an album towards the average more strongly the fewer votes it has. As I add in earlier years, the average number of votes N grows, this gravitational pull gets greater, and so Return falls below HCE. Every time you add in a year, the QWR of the albums that sit above the average falls - that is the effect of this gravitational pull.
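
A quick sketch of that pull in Python. The r values and the chart averages here are invented for illustration - only the vote counts, 72 and roughly 1000, come from the site:

def qwr(R, N, r, n):
    return (N * R + n * r) / (N + n)

R = 3.9                             # assumed chart average rating
ommadawn = {"r": 4.45, "n": 72}     # few votes -> pulled hard towards R
hce      = {"r": 4.20, "n": 1000}   # many votes -> barely pulled at all

for N in (60, 80):                  # N grows as more years are included
    print(f"N={N}: Ommadawn={qwr(R, N, **ommadawn):.4f}  HCE={qwr(R, N, **hce):.4f}")

# N=60: Ommadawn=4.2000  HCE=4.1830   (Ommadawn ahead)
# N=80: Ommadawn=4.1605  HCE=4.1778   (HCE ahead)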

So I can explain what is happening mathematically, but as I have already indicated it does go against common sense.


I thought I would share this observation. I am going to have a think about whether the algorithm could be tweaked to address this issue whilst retaining the general thrust of the algorithm. Ultimately it may be we just have to live with an artefact like this.


Thanks for reading!
Rob

esha9751 (Forum Groupie, Denmark)
Posted: September 25 2016 at 15:04
Getting all kinds of errors with this (potentially) wonderful tool. Try a few genres and a few countries combined and it will - more often than not - fail...

Dean (Special Collaborator - Retired Admin and Amateur Layabout, Europe)
Posted: May 28 2015 at 18:04
Originally posted by misterprog:

let me check if I understood well...
QWR for the top 100 is different from the QWR for a specific year, correct?
Correct.

The average rating R and the average number of votes per album N are calculated over all albums in the given sample. So for the all-time top 100 they are calculated using every album in the PA database, whereas for a specific year they are calculated using only albums from that year. Similarly, if you filtered on a subgenre then only albums from that subgenre would be used.
What?

Nogbad_The_Bad (Forum & Site Admin Group - RIO/Avant/Zeuhl & Eclectic Team, Boston)
Posted: May 28 2015 at 15:10
Originally posted by gooner666:

I'm curious how Hand Cannot Erase got in the top 100 so fast. I'm pretty new here; is that normal? Awesome site!!! Smile

Well, you've got two effects going on: (1) new albums from fan-favourite bands (SW, IQ, etc.) tend to get a lot of very positive reviews from fanboys when they first come out, with the rest of the listeners typically posting later on with lower ratings, so you get an early jump up the charts; (2) these bands get a lot of ratings overall, so they tend to sit higher up the chart. Looking at Magma's MDK, two spots above Hand.Cannot.Erase, it's been out since 1973, so it has been getting ratings from probably fairly early in the site's life (whenever Magma were added) and has a total of 679 ratings. H.C.E. has 544 ratings in the short time it has been out.
Ian

Host of the Post-Avant Jazzcore Happy Hour on Progrock.com

https://podcasts.progrock.com/post-avant-jazzcore-happy-hour/

gooner666 (Forum Newbie, Prairie Village)
Posted: May 28 2015 at 14:57
I'm curious how Hand Cannot Erase got in the top 100 so fast. I'm pretty new here; is that normal? Awesome site!!! Smile

Edited by gooner666 - May 28 2015 at 15:00
When you are sorrowful look again in your heart,and you shall see that in truth you are weeping for that which has been your delight.

Dean (Special Collaborator - Retired Admin and Amateur Layabout, Europe)
Posted: May 28 2015 at 03:38
Originally posted by misterprog:

any reason why in the top 100 some albums do not appear even if they have a higher rating than the number 100?
Only one reason: QWR - Query Weighted Rating. This is a weighted arithmetic mean that takes into account the number of ratings each album has compared to the mean for the whole population (i.e., the average rating and the average number of ratings).

For example, if there were 100 albums with only one rating each, all rated at 5 stars, then without the weighting the Top 100 would consist of nothing but those albums, since their unweighted average would be "5.00".

Originally posted by Dean, in another thread, 19 May 2014 at 01:42:

 
The QWR formula is:

QWR = (NR + nr) / (N + n)

where QWR is the weighted rating, R is the average rating of all albums, N is the average number of votes per album for all albums, r is the album average and n is the number of ratings.

The effect of calculating the QWR is two-fold: 
    1. As the number of ratings n surpasses N, the weighted rating QWR approaches the actual average r, i.e. QWR ≈ (nr) / n = r.
    2. The closer n is to zero, the closer QWR gets to R, i.e. QWR ≈ (NR) / N = R.
So, in simpler terms, albums with very few ratings/votes will have a rating weighted towards the average across all albums, while albums with many ratings/votes will have a rating weighted towards their own average rating. 

Therefore once the number of ratings for an album gets really big (ie n is much larger than N) then the actual value of the average rating r will have more effect on the formula than the chart average rating R and the QWR value will approach the actual average rating value r.
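
A quick way to see both limits (the numbers are invented purely for illustration):

def qwr(R, N, r, n):
    return (N * R + n * r) / (N + n)

R, N = 3.5, 50        # assumed chart averages
r = 5.0               # a 5-star-rated album
for n in (1, 10, 50, 500, 5000):
    print(n, round(qwr(R, N, r, n), 3))

# n=1    -> 3.529  (barely moves off the chart average R)
# n=50   -> 4.25   (exactly halfway, since n equals N)
# n=5000 -> 4.985  (essentially the album's own average r)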




What?

Kati (Forum Senior Member, Earth)
Posted: April 15 2015 at 20:44
Originally posted by Dean:

Originally posted by Kati:

Dean, hello you fluffy cute grumpy Hug Big smile
Hello Sonia, flattery will get you nowhere...

Originally posted by Kati:

Yes, true what you said, but at least they will have a little less effect on the rating scores. Also, why, unlike other sites, does P.A. allow people to rate albums without signing in properly?
I can't comment on that because I'm not M@X.
Originally posted by Kati:

Big hug to you Hug
Thank you, hugs are welcome in moderation. Hug

There is no such thing as moderation unless you have a stalker (that's called creepy) Wink Big smile Hug
M@X looks like Harvey Specter from SUITS, aka hot, insightful and smart Approve I am sure that if we raise this problem and come up with a better alternative he will be open to it: http://www.proadviser.com.au/blog/wp-content/uploads/2014/11/h06.jpg


Edited by Kati - April 15 2015 at 20:50

Dean (Special Collaborator - Retired Admin and Amateur Layabout, Europe)
Posted: April 15 2015 at 20:22
Originally posted by Kati:

Dean, hello you fluffy cute grumpy Hug Big smile
Hello Sonia, flattery will get you nowhere...

Originally posted by Kati:

Yes, true what you said, but at least they will have a little less effect on the rating scores. Also, why, unlike other sites, does P.A. allow people to rate albums without signing in properly?
I can't comment on that because I'm not M@X.
Originally posted by Kati:

Big hug to you Hug
Thank you, hugs are welcome in moderation. Hug

What?

Kati (Forum Senior Member, Earth)
Posted: April 15 2015 at 20:14
Originally posted by Dean:

Originally posted by Kati:

Originally posted by Gerinski:

I think he meant to forbid 1- and 5-star rating-onlys (without a review).
Gerinski! Heart That is a brilliant option really! And it would stop a lot of the very silly ratings, considering that most albums/bands added on PA certainly deserve more than 1 star, because it's not easy to be accepted on this site, and 5-star ratings are given easily too Smile big hug to you! Hug
Sorry Sonia, but if you think about the consequences of that, all it does is skew the averages towards the centre, because those people would now distribute their rating-onlys between 2-star and 4-star ratings - or, if they are really smart, give 2-star ratings to bands they don't like and nothing at all to those they do. The silly ratings would continue unabated.

Also, we do not accept bands here based on whether they are good or not. The only criterion is that they have to be Prog.
Dean, hello you fluffy cute grumpy Hug Big smile
Yes, true what you said, but at least they will have a little less effect on the rating scores. Also, why, unlike other sites, does P.A. allow people to rate albums without signing in properly?
Big hug to you Hug
P.S. By that acceptance criterion, any true prog band surely deserves more than a 1-star rating. Unhappy


Edited by Kati - April 15 2015 at 20:16