
Topic Closed: Jethro Tull’s "Thick As a Brick" #1 Prog?

Kotro
Prog Reviewer
Joined: August 16 2004
Location: Portugal
Points: 2815
Posted: February 14 2006 at 17:38
I think JT's "Thick..." fully deserves that position. It is a masterpiece of music beyond the progressive spectrum (as is Dark Side of the Moon).
Bigger on the inside.
Erik
Forum Senior Member
Joined: October 23 2005
Location: Netherlands
Points: 101
Posted: February 14 2006 at 17:32
I was actually quite pleased to see TAAB up there. I'm very fond of Jethro Tull, I think their musical compositions and lyrics are brilliant. TAAB is one of my favourite pieces of music.

And I am sure a lot of you who know far more about music than I do can make very good arguments for why other albums deserve the number one spot more, but this chart reflects the tastes of the majority of users, so it will always be a matter of personal preference. And it seems the masses prefer TAAB over other, also fantastic, albums.

If I were to make a top-whatever list, it would probably look a whole lot different from the next guy's. (Hell, I'd put Moving Pictures and Scenes from a Memory in the top five.)
ChadFromCanada
Forum Senior Member
Joined: November 12 2005
Location: Canada
Points: 293
Posted: February 14 2006 at 17:31
I haven't heard Thick as a Brick yet, but I've heard (and own all but Foxtrot) all of the rest of the Top 5, so it must be excellent to be so high on the list.
Andrea Cortese
Special Collaborator (Honorary Collaborator)
Joined: September 05 2005
Points: 4411
Posted: February 14 2006 at 17:23

Jethro Tull are the greatest band in my personal CD collection!! What an impressive, prolific and varied discography!!

Just listen to some of the too-often-neglected 1980s albums: "A" in particular... a real gem! Black Sunday is one of the best songs ever! I recommend buying the new remastered edition with the bonus DVD containing the complete Slipstream videos.

Thick as a Brick? Yes, it deserves its place in the top 3 of all time!



Edited by Andrea Cortese
aapatsos
Special Collaborator (Honorary Collaborator)
Joined: November 11 2005
Location: Manchester, UK
Points: 9226
Posted: February 14 2006 at 16:21
TAAB is a magnificent album and deserves its place...
Gentle Tull
Forum Senior Member
Joined: November 13 2005
Location: United States
Points: 518
Posted: February 14 2006 at 15:41
I don't think Jethro Tull are the greatest prog band, but I think "Thick as a Brick" may be the greatest prog album. It's a contender, at least.
Bob Greece
Prog Reviewer
Joined: July 04 2005
Location: Greece
Points: 1823
Posted: February 14 2006 at 08:55
Thick as a Brick is now at number 2. Is it challenging for the top again?
MikeEnRegalia
Special Collaborator (Honorary Collaborator)
Joined: April 22 2005
Location: Sweden
Points: 21156
Posted: August 06 2005 at 21:14
Originally posted by The Wizard:

Originally posted by bluetailfly:

Granted, we all like Ian Anderson's humorous, outlandish concept and Martin Barre's commanding guitar licks, etc., but, I'm sorry, the band is not even in the same league with many other more quintessential prog bands, for example, Yes, Genesis, King Crimson, and dare I say it, even ELP (at the height of their powers).

Jethro Tull is definitely in the same league as Yes and King Crimson, and while Thick as a Brick may not be the greatest prog album ever, it is certainly in the top 15 if not the top 10.

Then write a review of the album! (This goes for all who think it's overrated and haven't written a review yet.)

The Wizard
Prog Reviewer
Joined: July 18 2005
Location: United States
Points: 7341
Posted: August 06 2005 at 20:57
Originally posted by bluetailfly:

Granted, we all like Ian Anderson's humorous, outlandish concept and Martin Barre's commanding guitar licks, etc., but, I'm sorry, the band is not even in the same league with many other more quintessential prog bands, for example, Yes, Genesis, King Crimson, and dare I say it, even ELP (at the height of their powers).

Jethro Tull is definitely in the same league as Yes and King Crimson, and while Thick as a Brick may not be the greatest prog album ever, it is certainly in the top 15 if not the top 10.



Edited by The Wizard
Easy Livin
Special Collaborator (Honorary Collaborator / Retired Admin)
Joined: February 21 2004
Location: Scotland
Points: 15585
Posted: March 15 2005 at 14:44

Fitz,

A very interesting and detailed post. You have given this far more thought than I have, and I bow to your considered recommendations. Clap! The proof of the pudding, as they say, is in the chart which appears on the home page, and I'm happy enough that overall it contains the right albums.

I think you and I will always differ on how important 'popular' versus 'best' is. I still feel any chart on the home page should, as far as possible, list the cream of prog. It is always said that the singles charts don't list the best music but the most popular, and to some extent that is what our current chart does.

However, I think for the time being, we have taken things as far as is necessary, and should let the current chart bed in.

Till the next time... Wink

Fitzcarraldo
Special Collaborator (Honorary Collaborator)
Joined: April 30 2004
Location: United Kingdom
Points: 1835
Posted: March 14 2005 at 18:41

Reed, what do you mean, one of the most impressive?!

I'm in Madrid and the tapas were exceptionally good this evening. I can recommend the setas and gambas, in particular, washed down with a caña or five. Out on the town again tomorrow night (even though it's not a pleasure trip). But I do like something to exercise the little grey cells now and again.

 

tuxon
Forum Senior Member
Joined: September 21 2004
Location: plugged-in
Points: 5502
Posted: March 14 2005 at 18:34

First, I think the list should be, and stay, on the home page. I think a lot of people are interested in these kinds of lists, and especially for newcomers to this site it's a fast way to learn which are the major bands and albums in progressive rock.

Second, I've been coming around to your vision, and I agree that the number of ratings is a very good measuring tool for popularity.

I've been working on some other possible algorithms, and one of them I think could be interesting.

Algorithm No. 6

Almost the same as the current algorithm, but punishing low ratings more severely:

album score = (average rating - 2.5) * (number of ratings - 0.5 * number of 3-star ratings - number of 2-star ratings - number of 1-star ratings - number of 0-star ratings)

I like the results from this calculation. It prevents albums with mostly 3-star ratings from climbing to a top position (with enough ratings it is still possible, but then your argument that it has to be a well-known, popular album provides a reason for their inclusion in the list).

And it tackles the problem of my suggested limit of 100 ratings as a maximum.
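For the curious, the Algorithm No. 6 formula above translates directly into a few lines of Python. This is only a sketch of tuxon's formula as written; the function name and the two example albums are mine, purely illustrative:

```python
from collections import Counter

def algorithm_6_score(ratings):
    """Score an album under the 'Algorithm No. 6' formula above.

    `ratings` is a list of integer star ratings (0-5). The average is
    centred on 2.5, and the effective rating count is reduced for low
    ratings: 3-star ratings count only half, while 2-, 1- and 0-star
    ratings are subtracted outright.
    """
    n = len(ratings)
    counts = Counter(ratings)
    average = sum(ratings) / n
    effective_n = n - 0.5 * counts[3] - counts[2] - counts[1] - counts[0]
    return (average - 2.5) * effective_n

# A mostly-3-star album gains little even with many ratings:
mediocre = [3] * 80 + [4] * 20    # average 3.2, effective n = 60
acclaimed = [5] * 40 + [4] * 10   # average 4.8, effective n = 50
print(round(algorithm_6_score(mediocre), 2))   # 42.0
print(round(algorithm_6_score(acclaimed), 2))  # 115.0
```

As tuxon says, the smaller but better-rated album wins comfortably despite having half the ratings.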

I'm always almost unlucky
Reed Lover
Forum Senior Member
Joined: July 16 2004
Location: Sao Tome and Pr
Points: 5187
Posted: March 14 2005 at 18:25

Fitz, that's one of the most impressive pieces of work I've seen on here! Clap

But man, you really gotta get out more... LOL




richardh
Prog Reviewer
Joined: February 18 2004
Location: United Kingdom
Points: 28028
Posted: March 14 2005 at 17:22
WOW! I've no intention of reading all that, but well done all the same.
Fitzcarraldo
Special Collaborator (Honorary Collaborator)
Joined: April 30 2004
Location: United Kingdom
Points: 1835
Posted: March 14 2005 at 15:12

Hello tuxon (and Easy Livin too),

 

Well done for having a go at an algorithm, but stating that your algorithm is better does not necessarily make it a fact. All you are doing in essence is returning to Algorithm No. 4 for albums with more than 100 ratings, so we would be back to the same situation as before at the top of the list: e.g. Album A with 400 5-star ratings and 10 4-star ratings would be ranked below Album B with 101 5-star ratings (to mention one of the several paradoxes of that approach). The fact that you are multiplying the arithmetic mean by a constant (100) does not alter that. And it would just be putting off the inevitable to ‘raise the bar’ to 150. The value of the constant (100) is arbitrary and therefore difficult to justify. My previous posts gave various examples of how arithmetic means can be meaningless for ranking or lead to incorrect conclusions.
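The Album A / Album B paradox described above is easy to verify numerically. A minimal sketch, assuming the capped algorithm is simply min(number of ratings, 100) multiplied by the arithmetic mean (the function name is mine):

```python
def capped_score(ratings, cap=100):
    """min(n, cap) * arithmetic mean: beyond `cap` ratings only the
    mean matters, which reproduces the Algorithm No. 4 paradox."""
    n = len(ratings)
    return min(n, cap) * (sum(ratings) / n)

album_a = [5] * 400 + [4] * 10   # 410 ratings, overwhelmingly 5-star
album_b = [5] * 101              # 101 ratings, all 5-star

print(round(capped_score(album_a), 2))  # 497.56
print(capped_score(album_b))            # 500.0
```

Album B outranks Album A even though Album A has four times as many 'masterpiece' ratings, exactly as the post argues.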

 

You state that there are only 10 albums with over 100 ratings at the moment. You did not mention that there are another 6 albums close to that barrier which might break through it in the near future, and more could follow during this second year of the Archives’ existence. In a couple of years’ time there could be a greater number of albums over the 100 mark, and they would be ranked based on the arithmetic mean alone (multiplied by 100). Over the years, more and more albums will move above the 100-ratings mark and your algorithm would effectively turn into Algorithm No. 4, with the inherent problems I’ve pointed out in several previous posts in this thread and earlier threads.

 

Just to recap, the ordinal scale means that the arithmetic that you and I are doing is meaningless because the star-levels are not measurements and have absolutely no numerical value, they are simply used as a way of indicating the rank (order) of the data. The ordinal scale could just as easily have been A, B, C, D, E and F instead of ‘5 stars’, ‘4 stars’, ‘3 stars’, ‘2 stars’, ‘1 star’ and ‘0 stars’. An album rated ‘Essential: masterpiece of progressive music’ is not 5 times better than an album rated ‘Bad. Do not buy’. The difference between ‘Essential: masterpiece of progressive music’ and ‘Excellent addition to any progressive music collection’ is not the same as the difference between ‘Excellent addition to any progressive music collection’ and ‘Good but not essential’. Nevertheless, we are assuming just that when we turn the ordinal scale into an interval scale with an arbitrary numerical value (5, 4, 3, 2, 1, 0). The values 5 stars, 4 stars, 3 stars, 2 stars, 1 star and 0 stars actually indicate an order of ranking only. You are turning the rank into the value of the rank. That is a big assumption in itself, but to then calculate an arithmetic mean and declare that it represents the typical rating of an album is difficult to justify (even if many people do declare that when using this type of scale).

 

Let’s consider again TOT and DSOTM. The chart below shows the number of ratings per star-level for the two albums on a day I looked last week.

 

 

Chart 1

 

Now let’s multiply the number of ratings at each level by the corresponding rank. That, by the way, is tantamount to weighting the ratings according to the degree of approval/disapproval of the person rating the album, i.e. the rating level is effectively a weighting factor if used as a value instead of a rank. The resulting number of stars at each level for each album is shown in the chart below.

 

Chart 2

 

You can see the relative difference between the columns in the two above charts, which clearly shows the impact of the ‘weighting factors’ (5, 4, 3, 2, 1, 0). To reiterate, to use the rank as the value of the rank is to use it as a weighting factor.

 

The arithmetic mean rating level per album is the sum of all the album’s stars from each star-level divided by the total number of ratings for the album.

 

Now, you’ll recall from my previous post that I too make an assumption: it is that, due to the way the rating levels are worded, they are degrees of approval/disapproval and could be assumed to be symmetrical, viz. ratings of ‘Good but not essential’ and above are votes of approval with increasing intensity of approval, and ratings of ‘Collectors/fans only’ and below are votes of disapproval with increasing intensity of disapproval. Disapproving ratings could be considered to be negative ratings and approving ratings could be considered to be positive ratings. Let’s say I continue to use your assumption that a rank of 5 stars is literally worth 5 stars, and in addition let’s consider disapproving ratings to be negative. This is shown in the chart below.

 

Chart 3

 

Note 1: I’ve used the values from my spreadsheet for Algorithm No. 5, so the height/depth of all the columns is different because the ranks are actually 2.5, 1.5, 0.5, -0.5, -1.5 and -2.5 instead of 5, 4, 3, 2, 1 and 0, but the principle is the same. I've just kept the scale as 5, 4, 3, 2, 1, 0 on the charts for the sake of simplicity.
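The half-shifted ranks in Note 1 follow a simple pattern. A sketch of the mapping (my own reading of the values quoted, so treat the function as an assumption rather than the site's actual code):

```python
def symmetric_value(stars):
    """Map a 0-5 star rank onto the symmetric scale of Note 1:
    5 -> +2.5, 4 -> +1.5, 3 -> +0.5 (increasing approval),
    2 -> -0.5, 1 -> -1.5, 0 -> -2.5 (increasing disapproval)."""
    return stars - 2.5

# Under this scale, a net surplus of nine 3-star ratings (the count of
# nine is my inference) would be worth 9 * 0.5 = 4.5 stars at the
# 3-star level:
print(9 * symmetric_value(3))  # 4.5
```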

 

Now let’s look at each album one at a time in order to make the charts easier to understand. The chart below (see Note 1) shows the stars for TOT taken from the chart above, rearranged so that the disapproving ratings are below their corresponding approving ratings (0-star ratings underneath 5-star ratings, 1-star ratings underneath 4-star ratings, and so on).

 

Chart 4

 

Now let’s go ahead and subtract the stars of the disapproving ratings from the stars of the corresponding approving ratings. The result is shown in the chart below (see Note 1).

 

Chart 5

 

The remaining stars at each scale point (star-level) are: 250 stars at the 5-star level and 4.5 stars at the 3-star level (see Note 1).

 

Now let’s do the same thing for DSOTM. The corresponding chart showing the ‘stars of disapproval’ below the ‘stars of approval’ is shown below (see Note 1). As there are only a few disapproving ratings for DSOTM, the number of negative stars is small as can be seen (just about) in the chart.

 

Chart 6

 

Now let’s go ahead and subtract the stars of the disapproving ratings from the stars of the corresponding approving ratings. The result is shown in the chart below (see Note 1).

 

Chart 7

 

The remaining stars at each scale point (star-level) are: 230 stars at the 5-star level, 12 stars at the 4-star level and 3 stars at the 3-star level (see Note 1).

 

So we now have the net number of ‘approving stars’ for both albums, and these are shown together in the chart below (see Note 1).

 

Chart 8

 

 

To recap the figures given above, the number of stars for the two albums are (see Note 1):

‘Essential: masterpiece of progressive music’: TOT = 250, DSOTM = 230.

‘Excellent addition to any progressive music collection’: TOT = 0, DSOTM = 12.

‘Good but not essential’: TOT = 4.5, DSOTM = 3.

 

Even if we were to bump up all the DSOTM stars at the lower rating levels to make them ‘Masterpiece stars’, and throw away (ignore) all the TOT ‘Good stars’, DSOTM would still have fewer ‘Masterpiece stars’ than TOT by a small margin (more than the 3 ratings I mistakenly mentioned in an earlier post – I should have said 6 because Algorithm No. 5 halves the number of stars).

 

Thus it can be argued that TOT is (slightly) more popular than DSOTM and, on average, also of a higher quality (whatever that means) than DSOTM based on the ratings to date. This is a perfectly valid conclusion, and no less valid than the conclusion reached using the arithmetic mean rating-level to rank albums. I feel it is more valid than blindly using the arithmetic mean, because it takes into account the meaning of the rating levels.

 

If the size of all samples (i.e. the number of ratings per album) were the same, and if all the sample distributions were of similar shape, and if all the populations were the same, then I would be much more comfortable with the approach of using arithmetic means to rank albums. However, none of these are the case in this situation and I am therefore not comfortable using arithmetic means. The other thing that I need to stress is that statistical methods are used to estimate the attributes of a population from a sample (or samples). I am not interested in doing that, because the populations for each album can be very different. I am interested in ranking albums based solely on the ratings submitted to date, and based on the volume of ratings, not just the arithmetic mean rank (rating).

 

Now, having said the above, your problem accepting Algorithm No. 5 appears to have been articulated by Easy Livin, and I quote him:

 

“The overriding concern I have about the present one remains though, and I think Tuxon's post implies a similar worry. That is, a mediocre album can ride high in the chart simply because a lot of people have reviewed it and said they "quite like it." Anything "Good" or above results in additional "points" being gained. This means that albums which are universally acclaimed as masterpieces can appear lower in the chart than quite good albums, just because more people have rated the quite good ones. I don't like that; that to me does not fit well with the word popular.”

 

I do understand this point of view, but my understanding of the word ‘popular’ does not appear to be the same as Easy Livin’s (and yours?). Let’s just look at one well-known dictionary’s definition of the word ‘popular’.

 

popular

adj

1. appealing to the general public: appealing to or appreciated by a wide range of people

  • the most popular name for babies this year

2. well-liked: liked by a particular person or group of people

  • popular with young audiences

3. of the general public: relating to the general public

  • popular appeal

4. aimed at non-specialists: designed to appeal to or be comprehensible to the non-specialist

  • a popular gardening magazine

5. believed by people in general: believed, embraced, or perpetuated by ordinary people

  • popular myths

6. inexpensive: designed to be affordable to people on average incomes

  • a new popular car

[15th century. Via Anglo-Norman populer from Latin popularis ‘of the people’, from populus ‘people’ (source of English people and public), of uncertain origin: probably from Etruscan .]

Microsoft® Encarta® Premium Suite 2003. © 1993-2002 Microsoft Corporation. All rights reserved.

 

To me, the above would not imply that an album “acclaimed as a masterpiece” by a relatively small group of people is more popular than an album acclaimed as excellent or good by a larger group of people.

 

Easy Livin uses the words “universally acclaimed” but his usage is misleading in my opinion. Take TULL’s “Thick As A Brick”, for example. A total of 69 people have rated it to date, of which 58 people rated it as ‘Essential: a masterpiece of progressive music’ and 11 people rated it as ‘Excellent addition to any progressive music collection’. This is hardly universally acclaimed: a total of only 69 people have rated the album to date. If the album were indeed very popular and universally acclaimed to be a masterpiece then I would expect to see a far higher number of ratings (cf. “Close To The Edge”). My inference from this is that TAAB is not as popular amongst the Progressive Rock community (at least those fans who visit ProgArchives) as several other albums, despite having the highest arithmetic mean rating. If it were universally acclaimed, then there would be a lot more ratings than there are.

 

As to a “mediocre album riding high in the chart”, I don’t see any mediocre albums riding high in the chart. Or is this referring to “Train Of Thought” again (or to the other DREAM THEATER albums, for that matter)? If so, firstly please refer back to the charts and accompanying explanations earlier in this post. Secondly, my previous posts have, I hope, explained that mediocrity is in the eye of the beholder. Would, for example, the majority of fans of DREAM THEATER regard “Thick As A Brick” as a masterpiece? Would the majority of fans of JETHRO TULL regard “Train Of Thought” as a masterpiece? I very strongly doubt it. I don’t think the data indicates “mediocrity” for those albums – again, refer to the charts and discussion above.

 

To address one of the points you made in an earlier post:

Album A has only 3 5-star ratings and 3 4-star ratings.

Album B has only 70 3-star ratings and 10 2-star ratings.

 

To me, Album B is the more popular album. The net number of stars for Album A is 27 stars (these stars have two different ‘colours’, if you follow my meaning). The net number of stars for Album B is 180 stars (of a different ‘colour’ to Album A’s stars, if you follow my meaning). The handful of people that reviewed Album A think it’s the tops. A much larger number of people think Album B is good. That’s what popular means. If more people rate Album A in future it might overtake Album B (assuming the new people liked it as much as the handful who had rated it already). If Album A is indeed excellent in the eyes of many then it would only take another 13 people rating at 4 stars (11 if rating at 5 stars) in order for Album A to overtake Album B in the list. Thus higher ratings have a built-in weighting factor: fewer ratings are needed for an album’s rank to rise in the Top 50. If the Progressive Rock community really does deem an album excellent it would rise faster in the list than an album deemed good, and an album deemed poor (or deemed bad) would fall in the list, with an album deemed awful falling faster than an album deemed poor. Let’s call it the ‘percolation effect’. If, as I surmised in a previous post (as did BebieM), people are less likely to rate albums they don’t like then that would increase the percolation effect. Thus, as the number of ratings increases, I surmise that the validity of the derived ranking increases. I don’t think that can necessarily be said if using arithmetic means to compare albums.
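The net-star arithmetic in this example can be sketched as follows. It uses the plain 5..0 star values (matching the 27- and 180-star figures above rather than Algorithm No. 5's half-shifted values); the pairing and the function name are my reconstruction of the procedure described alongside Charts 4-8:

```python
from collections import Counter

# Approving levels pair with their mirrored disapproving levels:
# 5 <-> 0, 4 <-> 1, 3 <-> 2.
PAIRS = {5: 0, 4: 1, 3: 2}

def net_approving_stars(ratings):
    """Net 'approving stars' per level: at each approving level the
    count of mirrored disapproving ratings is subtracted before
    multiplying by that level's star value."""
    counts = Counter(ratings)
    return {level: level * (counts[level] - counts[mirror])
            for level, mirror in PAIRS.items()}

album_a = [5] * 3 + [4] * 3     # 3 five-star and 3 four-star ratings
album_b = [3] * 70 + [2] * 10   # 70 three-star and 10 two-star ratings

print(sum(net_approving_stars(album_a).values()))  # 27
print(sum(net_approving_stars(album_b).values()))  # 180
```

This reproduces the 27 vs. 180 comparison in the paragraph above.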

 

The Top 50 is a guideline. People can click on an album’s hyperlink in the list, read the reviews of the album and study the album’s ratings chart (a.k.a. frequency distribution) and see clearly the ratio of people rating the album who adored it, liked it, disliked it and hated it. If they browse the TAAB page they will see that 58 of the 69 people who rated it have rated it as a masterpiece of the genre. That may tell them something. If they browse to the TOT page they will see that 26 per cent of the 231 people who have rated the album to date did not like it but 106 rated it as a masterpiece. Can we tell which is the “better” album from these? No. Define “better”. We’re talking about subjective art. But we can see that TOT is very popular. Heck, it has more masterpiece ratings than TAAB has ratings.

 

If everyone really wants to change the Top 50 to something other than my original request to M@X, then be my guest. But please do not call it a popularity list, because it won’t be. If you want to have a list that is a recommendation of what different groups of fans deem to constitute quality, then you should call it something like: “Top 50 based on the arithmetic mean rating”. Even then, if you use the arithmetic mean as your means of deriving the rank in the list, you are ranking badly and will produce silly results like the ones originally highlighted by richardh for Algorithm No. 4 (also see, for example, my earlier post with a table and chart showing four hypothetical albums with very different ratings but all with the same arithmetic mean). If you still believe that the arithmetic mean is the best measure then see the URLs in my very long post of 10th March in this thread if you don’t feel my words alone are sufficient proof (here’s just one of them again: http://www.quickmba.com/stats/centralten/). If professionals do not regard the arithmetic mean as a good means of determining rank with data as skewed and otherwise irregular as in our case, why insist on using it?

 

You might also be interested to read the following paper:

http://gsep.pepperdine.edu/~cmccall/CERAFinal.pdf#search='an%20empirical%20examination%20of%20the%20likert%20scale'

that criticises the use of arithmetic means and summing the ratings per rating-level to evaluate Likert scales (which is basically what the ProgArchives 6-star system is). An approach based on percentages is proposed in that paper. See if you can understand it and develop something from it that is a) better than your flawed algorithm and b) that would also make people more comfortable than Algorithm No. 5 which, just to emphasise it yet again, is interested in popularity, as that is my interest. (Actually, are the majority of visitors to the Archives uncomfortable with the list as it stands? richardh seemed to think it is OK.) However, do note that the examples used in the above-mentioned paper assume that the sample size is the same in every case, i.e. the number of ratings per album is the same for every album. That is certainly not the case here.

 

As it seems we have very differing views, if you do not wish to investigate further – and unfortunately I do not have the time at present to do much more on this – and you feel strongly that the list on the Home Page should change (I am happy with it, as it satisfies my original request in May last year to have such a list on the Home Page) then you might like to try and convince the site owners of your viewpoint. If they are convinced then it is their prerogative to change the algorithm. Alternatively a possible compromise, so that people like me who want to know the popularity of albums (quantity and like/dislike, because the star-rating does affect the rank in Algorithm No. 5, as I’ve explained above), would be to remove the Top 50 list from the Home Page and make “Top 50” on the Home Page a hyperlink to another page which contains two lists: one using Algorithm No. 5 and one using Algorithm No. 4. That way, both camps could be satisfied and the visitor can decide which type of list s/he is more interested in. That page could also quote the progressive music charts produced by organisations such as the RIAA, BPI, magazines etc. It might be interesting to have these all on one page so that the visitor can make some comparisons. Another advantage would be that the SQL query would not be run every time someone visits the Home Page. You might like to suggest that to the site owners.

 

I have just seen DSOTM leapfrog TOT, so the charts earlier in this post would need to be adjusted. Nevertheless my explanation obviously remains the same. As to the issue of spamming, there have been 8 ratings added for TOT since 6 March and 6 ratings for DSOTM. These do not seem excessive to me, although, looking at the dates of the last 4 DSOTM reviews on that album’s page in the Archives, their submittal looks like it might have been prompted in an effort to move DSOTM up the list. I’m only speculating and, if it were indeed the case, it would not be a crime. I’ve done the same myself once or twice in the past: e.g. a review appears on the Home Page that I disagree with, so I’m prompted to write a review to counter it; or an album I like drops in the Top 50 and I’m prompted to finally get around to reviewing it after months of inaction. However, if anyone is actively trying to move TOT down the list then that would be a great shame and rather childish (not to mention naïve given the reasoning behind the list). I do hope that isn’t the case.



Edited by Fitzcarraldo
tuxon
Forum Senior Member
Joined: September 21 2004
Location: plugged-in
Points: 5502
Posted: March 11 2005 at 15:49

Fitz, and all interested parties,

Well, I've interpreted and calculated the data from the albums in the Top 50.

I tried to copy-paste the outcome, but apparently the post couldn't handle it, so I'll give a small summary of my findings.

 

Number of ratings
Average number of ratings of the albums currently in the Top 50 = 78
Number of ratings for Train Of Thought = 229 (roughly 3x the average)
Number two on the ratings scale is Yes's CTTE, with 145 (84 fewer than TOT)

Effect of the number of ratings on the Top 50
All top 10 albums have 100 or more ratings
The lowest-ranked album with 100 ratings is Yes's TFTO, at number 17

Averages
The lowest average is Dream Theater's Train of Thought, well below Pink Floyd's The Division Bell. These are also the only two averaging below 4 stars.

My recommendation for a failsafe mechanism to be incorporated into the algorithm stands: this means an IF-formula in the calculation.
Formula: album score = IF(number of ratings > 100; 100; number of ratings) * average rating

It isn't really necessary to subtract 2.5 from the average; the ranking remains the same.

This way the number of ratings stays very important (only 12 albums currently have 100 ratings or more, and only 25 albums have more than 70 ratings). I doubt there are albums with more than 80 reviews outside the current Top 50, and if there are, their averages are below 3.00.

I tried BebieM's proposal as well, in different ways; the results appeared normal, but there is a fault in it (an album with 100 reviews, 50 x 4-star and 50 x 3-star, would end up with 4 stars as its average). The idea behind it was not bad in itself, BTW.

 

Concluding:

The number of reviews is a good indication of popularity and ranking, but not a good indication of quality.
The algorithm I provided is better. Setting the ratings threshold at 100 is, however, arbitrary. The former algorithm had a similar formula using 30 (an approximation; I believe it was something like IF(number of ratings < 30; average; 0)). It didn't have the problem of huge quantities of reviews, but it did have the problem that relatively unknown bands were easily incorporated into the Top 50 (volatility of the mean: one 5-star rating weighs relatively heavily on the mean).

With the proposed algorithm, high quantities of reviews remain of significant influence, but an overdose of reviews does not poison the outcome. Like I said, the choice of 100 is arbitrary; 50 or 150 are also possible. The higher you make it, the more impact the number of reviews has.

Any questions?

 

BebieM
Forum Senior Member
Joined: November 01 2004
Location: Germany
Points: 854
Posted: March 10 2005 at 22:58

I think the biggest problem is that people who don't like particular albums usually don't write a review. Also, we're talking about the most popular albums of prog, so though the definition may classify the 3-star rating as a positive one, I think it doesn't really add that much to the popularity. Actually, it depends... Some reviewers say "Really interesting album, but I don't like the lyrics that much" ---> 3 stars; that should be taken as a positive rating. Others say "That is way worse than the band's best work; I give it 3 stars because I can still find a few good parts" ---> 3 stars, but not quite positive.

Since we don't rank ALL the prog albums, but only make a top50 list, i think 3 star-ratings should have the value 0 or maybe even a negative one. I also do see the problem that the positive-negative relation gets uneven, but I don't think a "good, but non-essential" rating really adds to the popularity of an album that's considered a masterpiece by a lot of other people.

Back to Top
Fitzcarraldo View Drop Down
Special Collaborator
Special Collaborator
Avatar
Honorary Collaborator

Joined: April 30 2004
Location: United Kingdom
Status: Offline
Points: 1835
Direct Link To This Post Posted: March 10 2005 at 21:41

Easy Livin wrote: "Thanks for all the background info Fitz, you've put a lot of time and thought (or is that train of thought) into it."

Easy Livin wrote: "The overriding concern I have about the present one remains though, and I think Tuxon's post implies a similar worry. That is, a mediocre album can ride high in the chart, simply because a lot of people have reviewed it and said they "quite like it." Anything "Good" or above results in additional "points" being gained. This means that albums which are universally acclaimed as masterpieces can appear lower in the chart than quite good albums, just because more people have rated the quite good ones. I don't like that; to me that does not fit well with the word popular."

I'll tackle this one at the same time as I reply to tuxon's post, Easy, if that's OK. I'm quite busy at work at the moment, so have less time to play. Actually, I wrote most of my looong post at 10,000 metres.

Easy Livin wrote: "There do seem to be a disproportionate number of reviews and ratings already for TOT on the site when compared to the amount of discussion there is about Dream Theater in the forum. Has there perhaps been some "spamming" already? Perhaps a limit on the maximum number of reviews, say the first 100, would help?"

If you look at the reviews page for TOT, the rate of posting looks quite normal. It could simply be because the album is popular. I can't tell about the ratings posted without reviews, of course, but maybe Max would be able to see a date_time stamp in the database for ratings only, as well as reviews? The band just seems to be very popular - I see their CDs in many shops. Perhaps DT fans don't find ProgArchives a very welcoming place to discuss Metal? Do our forums tend to focus on the 'classic bands' more? Perhaps the DT fans prefer to frequent other sites. I have browsed a few DT sites in the past to find out more about the albums, and they seemed to be thriving. I got the impression that the band is very popular.

Easy Livin wrote: "Another thought. In the Olympics, the medals table takes account of all the gold medals gained first. One gold medal is worth more than any number of silver. Worth considering?"

Do you mean apply a weighting factor to 5-star ratings, Easy? I'd prefer to keep weighting factors in the back pocket for now. What is niggling me about the data at the moment is that the distributions are very varied, heavily skewed in many cases, the populations different, and the ordinal scale (the words, i.e. the meaning of the scale points) not necessarily of equal interval, which screws up any parametric statistical analysis even more. One thing that tuxon, bless him, could do - 'cause I don't have the time - is to look at the medians for all the distributions and see if they can be used as a more meaningful ranking measure. There are ways to deal with skew - geometric means and all sorts of heavy non-parametric stats methods - but they involve some serious number crunching, so that rules them out.
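The median suggestion is cheap to compute straight from the per-star counts, and unlike the arithmetic mean it shrugs off heavy skew. A sketch (the counts are TOT's figures quoted later in this thread, used purely as sample data; the function name is mine):

```python
def ordinal_median(counts):
    """Median scale point of an ordinal rating distribution, given as
    {scale_point: number_of_ratings}. Walks the cumulative count up to
    the middle observation. (For even totals whose two middle values
    straddle two scale points, this returns the upper one - good
    enough for a rough ranking sketch.)"""
    total = sum(counts.values())
    midpoint = (total + 1) / 2
    cumulative = 0
    for point in sorted(counts):
        cumulative += counts[point]
        if cumulative >= midpoint:
            return point

# Train Of Thought's ratings, as tabulated later in the thread.
tot = {5: 107, 4: 34, 3: 29, 2: 20, 1: 34, 0: 4}
print(ordinal_median(tot))  # 4
```

A median of 4 ('Excellent addition') versus a mean of roughly 3.6 shows how differently the two measures read the same skewed distribution.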

And so to bed, as Samuel said.

Back to Top
Fitzcarraldo View Drop Down
Special Collaborator
Special Collaborator
Avatar
Honorary Collaborator

Joined: April 30 2004
Location: United Kingdom
Status: Offline
Points: 1835
Direct Link To This Post Posted: March 10 2005 at 20:54

tuxon,

If possible I will try to get some time in the next few days to analyse and reply to your initial post that followed my very lengthy post. I am very busy just at the moment, so it may be a day or it may be a few days, but I will comment, as you have raised some interesting points.

In the meantime I recommend that you prepare an Excel spreadsheet with the data for the albums currently in the Top 50 (you can obtain from each album's page in the Archives: a) the percentage values for each scale point (star-level), and b) the total number of ratings). From these you can derive the number of ratings per scale point, and that is the basic data for analysis of ranking algorithms.

Then you can easily implement Algorithm 5 in the Excel worksheet, plus any other algorithms that you care to experiment with or compare it to. Rather than just spending your time trying to pick holes in Algorithm No. 5, you could more usefully employ your time trying to develop an algorithm. If you do, you will need to be able to justify it using sound statistical and logical principles - no arbitrary chopping data off here and excluding ratings there unless there is a sound statistical reason for doing so. And do bear in mind what I have previously mentioned about the arithmetic mean.
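The derivation step described above is one multiplication per scale point, whether done in Excel or anywhere else. A sketch in Python (the percentage breakdown below is invented for illustration, not real Archives data):

```python
def counts_from_percentages(percentages, total_ratings):
    """Recover approximate per-star counts from the percentage
    breakdown and the total number of ratings shown on an album page.
    Because the displayed percentages are rounded, each recovered
    count may be off by one."""
    return {stars: round(pct / 100 * total_ratings)
            for stars, pct in percentages.items()}

# Invented percentage breakdown for a hypothetical 200-rating album.
example = {5: 40.0, 4: 25.0, 3: 20.0, 2: 10.0, 1: 4.0, 0: 1.0}
print(counts_from_percentages(example, 200))
# {5: 80, 4: 50, 3: 40, 2: 20, 1: 8, 0: 2}
```

From those per-star counts, any candidate ranking algorithm can be evaluated and compared side by side.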

 



Edited by Fitzcarraldo
Back to Top
Fitzcarraldo View Drop Down
Special Collaborator
Special Collaborator
Avatar
Honorary Collaborator

Joined: April 30 2004
Location: United Kingdom
Status: Offline
Points: 1835
Direct Link To This Post Posted: March 10 2005 at 20:33
Originally posted by tuxon:

 

In your calculation Train Of Thought acquires 262 points

 

Count  Stars  Weight  Points
 107     5     +2.5   +267.5
  34     4     +1.5    +51.0
  29     3     +0.5    +14.5
  20     2     -0.5    -10.0
  34     1     -1.5    -51.0
   4     0     -2.5    -10.0
Total                   262.0

An album with 100 solely 5-star ratings would have fewer points than Train Of Thought.

Please explain the logic in this 

tuxon,

 

You can see better how popular the two albums are from the chart below, which shows the number of people who have rated TOT and TUXON (your hypothetical album with only 100 5-star ratings) at each of the 6 scale points (star levels).

The number of people who approved of the albums is shown above the x-axis and the number of people who disapproved of the album is shown below the x-axis. I have represented the chart bars this way to indicate the opposing nature of the ordinal data (approval/disapproval) and also because that's how Algorithm No. 5 proceeds.

Firstly, remember that the scale on the x-axis is ordinal and represents attitudes which, very roughly speaking, could be considered as ranked 'sublime', 'excellent', 'good', 'not good', 'awful', 'use it for clay pigeon shooting'. They are, very roughly speaking, symmetrical about the line between 'good' and 'not good' on the x-axis.

For TOT, the ‘Poor. Only for completists’ vote cancels out the entire ‘Excellent addition’ vote, and the ‘Collectors/fans only’ vote cancels out about two thirds of the ‘Good, but not essential’ vote. Tot it all up and the popularity of the two albums in absolute terms (i.e. in terms of the net number of people who rate the album highly) is close, with TOT just slightly more popular. Hopefully it is easier to understand when presented in pictorial form. You have to forget about arithmetic means (well, not forget them completely, but understand that the arithmetic mean is not always the appropriate tool for making a ranking decision - see also my previous post).
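The cancellation argument above can be checked in a few lines of code. The weights (+2.5 down to -2.5 in steps of one, symmetric about the 'good'/'not good' boundary) are inferred from the arithmetic of the quoted points table (107 × 2.5 = 267.5, and so on); they are not stated explicitly anywhere in the thread:

```python
# Weights inferred from the quoted points table: symmetric about the
# 'good'/'not good' boundary, in steps of one star-level.
WEIGHTS = {5: 2.5, 4: 1.5, 3: 0.5, 2: -0.5, 1: -1.5, 0: -2.5}

def algorithm_5_points(counts):
    """Net approval points for an album, given {stars: count}:
    approvals add points, disapprovals cancel them out."""
    return sum(WEIGHTS[stars] * n for stars, n in counts.items())

tot = {5: 107, 4: 34, 3: 29, 2: 20, 1: 34, 0: 4}
tuxon = {5: 100}  # the hypothetical album with 100 solely 5-star ratings

print(algorithm_5_points(tot))    # 262.0
print(algorithm_5_points(tuxon))  # 250.0
```

The 34 one-star votes exactly cancel the 34 four-star votes (±1.5 each), which is why TOT's 228 ratings net out to 262 points, only a dozen ahead of the hypothetical 100-rating album's 250.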

Back to Top