Okay, this is really, really bugging me.
Our MySweetHoneybush Chocolate Malt Honeybush (http://steepster.com/teas/mysweethoneybush-com/13493-chocolate-malt) has six ratings. Five of them are visible: 100, 100, 99, 95, and 80. Based on those five scores and a sixth whose poster we can't see, the average rating is 81.
Whoever posted the sixth rating scored the tea VERY VERY low. And if that is actually someone who tried it and really didn’t like it, I’m fine with that. (Well, maybe not fine, but I’m not going to argue with them.) But how do I know that one of my competitors hasn’t figured out that rating a tea without leaving a tasting note leaves no trace of who rated it?
I’ve hesitated to say anything in the past, because I don’t want to give anyone any ideas, but I did mention it to Jason, and I hope he will consider flipping a switch and revealing who has posted all of these anonymous scores. I think it might be quite embarrassing to some tea companies, if my conspiracy theory is correct.
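For what it's worth, if the displayed 81 really were a plain mean of all six scores, the hidden sixth score can be back-solved. A quick sketch (assuming a simple average, which posts further down in this thread question):

```python
# Visible ratings on the Chocolate Malt Honeybush page
visible = [100, 100, 99, 95, 80]

# If the displayed 81 is a simple mean of six ratings,
# the hidden sixth rating must satisfy (sum(visible) + x) / 6 == 81
hidden = 6 * 81 - sum(visible)
print(hidden)  # 12
```

That implied 12 is why the hidden rating looks so suspiciously low; under a non-simple average (see below in the thread), the actual hidden number could be quite different.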
I agree with you.
I think that the anonymous scores should be eliminated, or at the very least we should be able to see all ratings so that we know where the low scores are coming from; that way we can have a little more insight into that low score.
Frank, while I agree with you, also remember that the “average” score for a tea is not the average at all. It’s not the total sum of scores divided by the number of scores. It’s some weird algorithm… you’ll especially notice it if you are the 1st person rating a tea. Both are concerns, but I think the latter should be fixed first, as it affects every single tea on Steepster.
I’ve seen teas that have a single person rating them (sometimes me), and they rate them super-high (90+), but the “Steepster score” is, like, 70 something. Seemed really weird.
I actually like this aspect of the algorithm; for one thing, I think if it didn’t work that way, the “Best Teas” category would be constantly swamped with new teas that one person loved. Or if one person hates a tea (and not many have tried it), it suddenly has a Steepster rating of 10 and no one will want to touch it.
teabird… I don’t see the last comment really being an issue, but maybe that’s cuz I just think weird lol. I’m gonna investigate low ratings just because I’m oddly drawn to them, so I’d realize if it was based on 1 score. I think it’s giving a disadvantage to a highly rated tea, especially one that’s new to the database. I’ve given a couple of 100s, and the rating it gave the tea was substantially lower.
I guess I think it should be a true average, and have the Best Teas page reflect only the teas with a minimum of X ratings. Say 5 or 10?
Actually I like the algorithm. I’m curious how it works, but I don’t think that it should be possible for a tea to get 90+% based on one or two scores. I want the best teas to be the ones that are consistently loved by most people who try them. This is more useful to me because it’s likely that these teas will be easier to get hold of, and because it stops the ratings from being so easily skewed.
If I’m interested in what an individual thinks and trust that individual, then I’ll look at their particular ratings, but I don’t want the global ratings to be skewed by some dude who decides that everything they drink deserves to be 100%, for example.
That is actually pretty terrible if that is in fact taking place. It especially saddens me on this particular tea, as it is amazing. I think I will brew a cup to console myself now. ;)
BUMMER…how low is low…I cannot see the low rating when I view the link or tisane…Frank, you know I LOVE this one…so I, too, am curious about this!
I didn’t think that was possible. What I see is 7 tasting notes and only 6 ratings, since 52Teas did not rate their own tea. 6+1=7. I think the algorithm is the culprit. But please remember that TeaEqualsBliss had two of the tasting notes, and she can only rate the tea once.
I’m not sure this is sabotage. I think it’s very possible that the Steepster ratings are not a simple average of the ratings people have given it, but are weighted toward the center (a score of 50). For example, this tea http://steepster.com/teas/red-blossom-tea-company/3107-alishan-spring-2009 has three ratings: 98, 95, and 97. Its Steepster score is 85.
This suggests that, perhaps, every tea is given a single invisible rating of 50 just for being added to the database. Each rating by a Steepsterite pulls the tea’s rating away from 50 and toward user opinion.
This wouldn’t fully explain Frank’s case, but it would mean that the one anonymous rating could be a 45 rather than a 0.
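The phantom-50 guess can actually be checked against the Alishan numbers quoted above. A minimal Bayesian-average sketch in Python, assuming each tea starts with exactly one invisible rating of 50 (the weight of 1 is my assumption; Steepster hasn't confirmed its formula):

```python
def steepster_score(ratings, prior=50.0, prior_weight=1):
    """Bayesian average: pretend every tea starts with `prior_weight`
    phantom ratings of `prior`; each real rating pulls the score
    away from the prior and toward user opinion."""
    return (prior * prior_weight + sum(ratings)) / (prior_weight + len(ratings))

# Alishan example from the post above: ratings 98, 95, 97 -> displayed 85
print(steepster_score([98, 95, 97]))  # 85.0

# A lone 90+ rating lands in the low 70s, matching the
# "rated it super-high, but the score is 70-something" report
print(steepster_score([95]))          # 72.5
```

Under this same model, the hidden rating on the Chocolate Malt tea would work out to 7 × 81 − 50 − 474 = 43 (ignoring any rounding in the displayed 81), consistent with "a 45 rather than a 0."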
For the record, I would also like it if ratings couldn’t be anonymous; I don’t care if someone doesn’t feel like writing a full tasting note, but I don’t want people with a financial stake in a rating messing with it dishonestly.
No sabotage. Whoops, wrong thread. This is the right one.
Yeah I disagree on the “better”, but thanks for putting more marbles in the “not sabotage” bucket. :)
Ah, thanks for the detail Ricky! The blog has been pretty quiet since I joined Steepster, so I wouldn’t have thought to check those archives for rating info. Reading up on Bayesian Averages now…
I don’t think this necessarily precludes sabotage.
I understand the idea behind the Bayesian averages, but that doesn’t mean that the scores used to calculate those averages are all above board.
This little teapot agrees with 52Teas that ratings should always be non-anonymous so that no one can diss a competitor’s tea. Or if some human is just mad at a tea company for whatever reason, they can’t keep posting low ratings on those teas. I know that ideally all you humans should play nice together (like us teawares do), but it doesn’t always happen.
Interesting. I don’t really know what’s going on with the rating system.. or how it necessarily works, but I do think that ratings should be attached to an account/a name. I don’t see anything wrong with giving low ratings/reviews if it is a person’s genuine opinion.. after all, different kinds of tea don’t always sit well with EVERYONE.