Difficulty Metrics

This forum is for discussing tournament formats, question styles, strategy, and such.
Red Panda Cub
Wakka
Posts: 194
Joined: Thu Dec 22, 2011 9:59 pm

Difficulty Metrics

Post by Red Panda Cub » Mon Aug 26, 2019 11:04 am

Assessing difficulty of questions is notoriously difficult, especially when the material covered is novel or under-asked. Even if the thing being asked about has come up before, without combing over tournament discussion or making use of only recently available advanced stats, it can still be hard to determine how well-known it is. As such, I think it would be useful to compile some metrics to help writers predict or otherwise assess the difficulty of their questions. I have two such metrics that I make use of and which I hope others might find helpful.

Wikipedia pageview statistics

Many writers use Wikipedia to find content to put in questions. Many players use Wikipedia to research and study. Therefore, it seems reasonable to conclude that the number of hits a given page gets will be a useful metric for predicting the difficulty of a question. Let's say, then, that we're writing a question on Hasidic Judaism, and we've found out about both the Satmar and Boyan dynasties, but we're not sure what order to clue them in our tossup. If we click on the "View history" tab at the top right of the page in question and then select "Pageviews," we are presented with detailed statistics about how often the page is accessed, and this data goes back quite a way. We'll then see that the Satmar page has averaged 7,195 views a month over the last year, while the Boyan page only has 814 a month. On this basis we can be fairly confident that Satmar should come up later than Boyan. Furthermore, since 814 views/month is a pretty respectable number, we can be confident that the Boyan dynasty is still worth clueing.
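Incidentally, the same numbers behind that "Pageviews" link are exposed through Wikimedia's public Pageviews REST API, so you can pull them programmatically when comparing a lot of candidate clues. A minimal sketch in Python (the helper names and the date range are my own; the endpoint format is from Wikimedia's API docs):

```python
from urllib.parse import quote

# Wikimedia's REST Pageviews API: the same data shown by the
# "Pageviews" tool linked from each article's "View history" tab.
API_BASE = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"

def pageviews_url(article, start="20180801", end="20190731",
                  project="en.wikipedia", access="all-access", agent="user"):
    """Build the monthly-granularity pageviews URL for one article."""
    # Article titles use underscores; percent-encode everything else
    # (safe="" so characters like parentheses are escaped too).
    title = quote(article.replace(" ", "_"), safe="")
    return (f"{API_BASE}/{project}/{access}/{agent}/"
            f"{title}/monthly/{start}/{end}")

def mean_monthly_views(api_response):
    """Average the 'views' field over the months the API returned."""
    items = api_response["items"]
    return sum(item["views"] for item in items) / len(items)
```

Fetch the URL with any HTTP client, parse the JSON, and pass the result to `mean_monthly_views`; doing that for each candidate answer gives you the views-per-month comparison described above without clicking through each page by hand.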

Goodreads ratings numbers

Let's say I want to write a question about Junichiro Tanizaki, and I'm not sure what order to mention the titles of Naomi and Some Prefer Nettles. Wikipedia pageviews may still be useful, but I (intuitively) think those statistics are a little less useful for books, since people learn about them in a variety of ways, and Wikipedia views don't necessarily map terribly well to how much something is actually read. However, if we go to the Goodreads page for Junichiro Tanizaki, we can see precisely how many ratings each of his books gets. Naomi has 4,279 and Some Prefer Nettles has 2,923, both quite a lot in the scheme of Tanizaki novels, so we can be happy with them coming up towards the end of the TU, and we now have a good empirical basis for putting one before the other.
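Once you've copied a few ratings counts down by hand, turning them into a clue order is mechanical: fewer ratings means more obscure, so it belongs earlier in the tossup. A trivial sketch of that step (the counts below are the ones quoted above; the function name is my own):

```python
def order_clues(ratings_counts):
    """Sort works from least-rated (harder, earlier clues) to
    most-rated (easier, later clues)."""
    return sorted(ratings_counts, key=ratings_counts.get)

# Goodreads ratings counts collected by hand from the author's page.
tanizaki = {
    "Naomi": 4279,
    "Some Prefer Nettles": 2923,
}
```

Here `order_clues(tanizaki)` puts Some Prefer Nettles before Naomi, matching the ordering argued for above; with a longer list of an author's works the same sort gives you a first-pass difficulty gradient for the whole tossup.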

Do people have any other metrics they use to help them assess difficulty when writing?
Joey Goldman
Oxford '17
City, University of London '19

Mike Bentley
Auron
Posts: 5862
Joined: Fri Mar 31, 2006 11:03 pm
Location: Bellevue, WA

Re: Difficulty Metrics

Post by Mike Bentley » Mon Aug 26, 2019 12:11 pm

For books and authors, a rough metric can be "is this author/book in the cultural zeitgeist?" For instance, have there been new editions, translations, or re-evaluations published recently? Have those works been widely reviewed? Sometimes something as arbitrary as an author's 100th birthday can spark a re-evaluation of their work and a flurry of new editions, letters, etc. This obviously shouldn't be the only metric you use when writing questions (if it is, your question set can end up pegged to the cultural news cycle). But it can be a tool for getting some sense of "importance" in the current environment.
Mike Bentley
VP of Editing, Partnership for Academic Competition Excellence
Adviser, Quizbowl Team at University of Washington
University of Maryland, Class of 2008

UlyssesInvictus
Yuna
Posts: 821
Joined: Thu Feb 10, 2011 7:38 pm

Re: Difficulty Metrics

Post by UlyssesInvictus » Mon Aug 26, 2019 2:49 pm

This one goes without saying, but for completeness's sake (and because basically everyone does it): just searching for the answerline on QuizDB / aseemsDB and seeing at what difficulties it's been mentioned before.

A less technical equivalent is also asking people well-versed in the field / subject how well-known/important they think it is (especially fellow QB players).

When I was writing the FILM set, I often used the site TheyShootPictures.com to get a quick assessment of the general critical consensus of some of the less popular films.
Raynor Kuang
quizdb.org
Harvard 2017, TJHSST 2013
I wrote GRAPHIC and FILM

Deviant Insider
Auron
Posts: 4663
Joined: Sun Jun 13, 2004 6:08 am
Location: Chicagoland

Re: Difficulty Metrics

Post by Deviant Insider » Mon Aug 26, 2019 4:21 pm

Searching the quiz bowl packet archives is sometimes better for this sort of thing because they have more sets to draw from. The drawbacks are that it takes a lot of effort to look through the prior uses and to sort things out when two things share a name.
David Reinstein
PACE VP of Outreach, Head Writer and Editor for Scobol Solo and Masonics (Illinois), TD for New Trier Scobol Solo and New Trier Varsity, Writer for NAQT (2011-2017), IHSSBCA Board Member, IHSSBCA Chair (2004-2014), PACE Member, PACE President (2016-2018), New Trier Coach (1994-2011)

touchpack
Rikku
Posts: 350
Joined: Tue Sep 20, 2011 12:25 am

Re: Difficulty Metrics

Post by touchpack » Mon Sep 02, 2019 11:52 pm

My go-to when writing jazz questions for assessing the difficulty of albums is looking at the number of reviews on Allmusic. The logic is similar to your Goodreads example--it gets you an approximate comparative measurement of how many people are actually listening to X/Y/Z album.
Billy Busse
Illinois '14
President, ACF
Writer/Subject Editor/Set Editor, NAQT
