Our Job Is To Help Them Make Something Of It

Now, after more than a year out of the classroom, Wataru, 16, has returned to school, though not a normal one. He and around two dozen teenagers like him are part of the inaugural class of Japan’s first e-sports high school, a private institution in Tokyo that opened last year.
The academy, which mixes traditional class work with hours of intensive video game training, was founded with the intention of feeding the growing global demand for professional gamers. But educators believe they have stumbled onto something more valuable: a model for getting students like Wataru back in school.


I came across this article in the New York Times, and found it fascinating. Wataru, the sixteen-year-old mentioned in the article, had dropped out of school after the pandemic because “he was getting nothing from school”. He preferred to stay at home and play video games all day.

This school, though – the one featured in the article – is one in which you’re taught competition strategies for games such as Fortnite and Valorant. Or you might be given – and this was my favorite sentence in the article – “a scientific lecture about the relative merits of Street Fighter characters”. And it’s not just theory, of course – after the lecture, the students formed groups to put the lesson into action.

This is what a classroom looks like:


If you’re curious, and are able to speak and understand the language, here’s what the infrastructure of the school looks like – it has forty Galleria XA7C-R37 gaming PCs. The curriculum includes the following genres of video games: FPS, third-person shooter, RTS and MOBA. I don’t know what these genres are, for I don’t play video games all that much.

But I applaud the initiative, and hope it scales, both within Japan and in other parts of the world.

You may ask why I applaud a school that teaches students how to play video games. And my answer is that I’m actually quite agnostic about how an educational institute is weird. All I ask is that it be sufficiently weird in at least one way. This particular school is weird about video games, but what about schools that are weird in other ways? What about a school that teaches you about dancing, for example?

Lynne’s gift for dancing was discovered by a doctor. She had been underperforming at school, so her mother took her to the doctor and described her fidgeting and lack of focus. After hearing everything her mother said, the doctor told Lynne that he needed to talk to her mother privately for a moment. He turned on the radio and walked out. He then encouraged her mother to look at Lynne, who was dancing to the radio. The doctor noted that she was a dancer, and encouraged Lynne’s mother to take her to dance school.


And if you’ve been tempted to sneer while reading about these newfangled ideas about alternate education – “video games and dancing in schools! Hmph, whatever next?!” – note that the first story is from December 2022, while the other story is from sometime in the 1930s. Everything with Sir Ken Robinson in it is always worth watching, but this video is a particularly fascinating one. Gillian Lynne’s story comes on at around the 15-minute mark, if you’d rather not watch the whole thing, but I hope you do.

But whether it is video games today or dancing a century ago – or whatever else might be around a hundred years from now, for that matter – the point isn’t about how young people learn best. Well, it is, but the first point that all of us would do well to internalize is that everybody learns differently.

And the idea that everybody learns best by sitting in a classroom and listening to a person drone on for hours on end is one that has been rejected by students year after year after year. But because it is cheap, scalable and easy to endlessly replicate, it is now a part of our culture – so much so that we think of students who are unable to be a part of this dreary ritual as not normal.

Of course they’re not normal, none of them are. They’re special, in their own way, as all of us are. That was the message in the brilliant talk given by Sir Ken Robinson. That everybody is talented in their own way.

And his call to action at the end of the talk is the title of today’s blogpost.

Our job isn’t to browbeat our students into downcast and sullen obedience and compliance. Our job is to figure out what motivates them to learn, by figuring out their special talent.

And then to help them make something of it.

Learning in the Age of AI

How should one think about learning in the age of AI?

That is, if you are a student in a class today, how can you use AI to make your experience of being a student better?

  1. Use AI to create work, but learn how to work with it to make it better: In my experience of having spoken with people about AI, it has been a bit of a bi-modal distribution. There are folks (and I’m very much one of them) who think of ChatGPT as a fantastic tool whose potential to be useful is only going to grow over time. And there are folks who triumphantly assert that AI simply isn’t good enough, citing examples of hallucinations, not-good-enough answers or sub-standard essays. These are all fair criticisms, but the last one in particular can easily be overcome by providing better prompts and by suggesting improvements. “Write a seven-paragraph essay on India’s economic reforms of 1991” is a barely acceptable prompt, for example. Mention specific people, events and dates that you might want it to include in the essay, ask it to revise certain paragraphs, ask it to write “like” a certain person, mention the conclusion you would like it to reach – spend time with it to make it better.
    All of my suggestions – and this is important! – require the student to know enough about the topic to be able to make them. You need to think about the prompt, you need to critically evaluate the first-pass answer, and you need to know enough to suggest suitable improvements. AI can take away the drudgery associated with polishing an essay, but it will still (so far) require you to know what you’re talking about. A student’s life today is more interesting, not just easier.
  2. Ask it to teach you stuff you didn’t understand: Small class sizes aren’t really a feature of most Indian colleges, in my experience. The idea that you will have five to ten students in class, and will therefore be able to have meaningful, extensive discussions about your doubts, is a far-fetched one in most Indian colleges. So treat AI as a very helpful research assistant who can clear up your doubts about a particular topic. This can very quickly become an addictive practice, because the AI will be able to carry out a much more detailed conversation about literally any topic you can think of than most (all?) of your peers. Converse with humans about your conversations with AI, and figure out a ratio that works for you. But corner solutions (of both kinds) are almost certainly sub-optimal.
  3. Check its “facts”: You will run into trouble if you accept its output as the gospel truth. It asserts facts that simply don’t exist, it will cite papers that it has made up on the spot and it will confidently tell you about books that were never written by people who’ve never existed. It is not about to replace search engines – in fact, search engines have become more useful since the launch of ChatGPT, not less.
  4. Use specialized AI tools: Of which there are hundreds, if not thousands. You can use AI to cite papers (Scite.ai), to design presentations (beautiful.ai), create simple animations (look it up) and so much more besides. Don’t restrict yourself to any one tool, and learn how to get better at improving all aspects of your workflow.
  5. Document your work with AI, and make it public: Create a very public repository of work that you have created with AI, and share how you’ve become better at working with AI. Your career depends on your ability to do this, and on your ability to teach other people to do this – so the more the evidence regarding this is in your favor, the stronger your argument for your own career. Begin early, and don’t be shy about showing the world what you’ve done, and how good a worker you are with AI by your side.

The Game Theory of Bazball

The term has its own Wikipedia article now!

Bazball is an informal cricketing term coined during the 2022 English cricket season. Bazball commonly refers to the style of play of the England national cricket team after the appointments of Brendon McCullum as Test cricket head coach, and Ben Stokes as England Test cricket captain, by English cricket managing director Rob Key, in May 2022. The Bazball style and mindset is said to have an emphasis on taking positive decisions in attack and defence, whether batting or in the field.


The article is worth reading in full, especially if you are a fan of cricket. But how does one think about Bazball if one is both a fan of cricket and of game theory?

  1. First, you can have fun defining what Bazball is, but what it has brought to the table where England is concerned is not up for debate. 10 wins out of 11 since the era began, a victory in Pakistan that is still hard to believe, and the second-fastest declaration in history – and there are many more records besides these to look up. Whatever it is, it is working – so far.
  2. One way to think about Bazball is to argue that it is the same style of play that has worked so well for England in the case of limited overs cricket. So why not bring the same fearless approach into test cricket too? And on the basis of the evidence thus far, why not indeed?
  3. You could argue that Brendon McCullum is in effect hastening what would have been an inevitable process in the medium/long term. Is it safe to say that Cheteshwar Pujara is the last of his breed when it comes to Indian batsmen? Will all test-playing nations have batsmen who are more naturally aggressive in five to ten years’ time? If yes, England just got there sooner under Stokes and McCullum.
  4. So the other teams must play catch-up, correct? They must respond by utilizing the same no-fear-no-holds-barred approach. Bazball, in other words, but the amped-up version. Beat ’em by getting better than ’em at their own game. That’d be one option, sure…
  5. But there were two ways to out-Pep Pep at the start of the previous decade. I’m talking football now, but you could either try to get even better at possession-based football than the OGs, or you could go the Mourinho route. Think about the Barcelona vs Inter Milan semi-final, for example. Similarly, you could try to out-Baz Bazball, or you could go in the opposite direction and play ultra-defensively.
  6. If you want to go the out-Baz Bazball route, it’ll be great for the spectators, and one will get to see high-octane series with a lot of risks being taken by both teams. But there will be teams that will lose a game too many by adopting the extremely risky route, and such teams might adapt by toning down their level of risk tolerance. You’ll see risk-taking approaches go through cycles before hitting upon some sort of an equilibrium.
  7. If you want to go the conservative route instead, you might push teams that go down the Bazball route to take even more risks in response. This may work, in which case these teams will be even more incentivized to go further down the high-risk path. Or it may not, in which case these teams may tone down their gung-ho approach a bit. But again, you’ll see risk-taking approaches go through cycles.
  8. Football has gone through many such cycles in its past, and this is a great book to read in this regard.
  9. As a fan of cricket and a student of game theory, I will find it fascinating to see how this plays out in cricket, especially in the context of shortening attention spans, the increasing popularity of T20 leagues, and the preference of players for playing ‘T20 style’.
  10. Get game theory out of the classroom, and into whatever fields you like to think about. Sports is just one example. But a subject like game theory comes alive when it helps you understand real-life situations better. And as a cricket fan, I can think of very few examples better than analyzing Bazball and its game theoretic implications!
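The cycles described in points 6 and 7 are exactly what best-response dynamics in a simple two-player game can produce. Here is a minimal Python sketch; the payoff numbers are invented for illustration (a matching-pennies structure, not anything estimated from real cricket data):

```python
# Toy model of the "risk-taking cycles" idea: two teams each pick a style,
# with invented, matching-pennies-style payoffs (NOT real data). Team A
# profits when the styles match; Team B profits when they differ. With no
# pure-strategy equilibrium, simultaneous best responses cycle forever.

STYLES = ["aggressive", "conservative"]

# PAYOFFS[(a_style, b_style)] = (payoff to Team A, payoff to Team B)
PAYOFFS = {
    ("aggressive", "aggressive"): (1, -1),
    ("aggressive", "conservative"): (-1, 1),
    ("conservative", "aggressive"): (-1, 1),
    ("conservative", "conservative"): (1, -1),
}

def best_response_a(b_style):
    """Team A's best reply to Team B's current style."""
    return max(STYLES, key=lambda a: PAYOFFS[(a, b_style)][0])

def best_response_b(a_style):
    """Team B's best reply to Team A's current style."""
    return max(STYLES, key=lambda b: PAYOFFS[(a_style, b)][1])

# Both teams revise their styles simultaneously each "round".
a, b = "aggressive", "aggressive"
history = [(a, b)]
for _ in range(8):
    a, b = best_response_a(b), best_response_b(a)
    history.append((a, b))

print(history[:5])
# The state visits all four style combinations and returns to the start:
# a period-4 cycle, with no stable pure-strategy profile anywhere.
```

Real teams obviously face a far richer strategy space than this two-action toy, but the mechanism – my best reply changes your best reply, which changes mine – is the engine behind the cycling described above.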

Matt Parker on the Greatest Maths Mistakes

Pranay Kotasthane on the defence budget

… which, of course, makes it self-recommending:

Some Days Are Diamonds

… and as the poet tells us, some days are stones.

Today, in the case of yours truly, is one of the latter ones. The daughter has been sniffling, coughing and battling a fever for the last three days, and while she is now much better (thank god), she has now passed the fever on to me.

But that’s not the reason today is a stone. The reason today is a stone is because I didn’t schedule a post for 10 am today. I’ve been on a bit of a good run – best as I can tell, the last time I missed posting was on the 30th of October last year, and while that isn’t great if the aim is to post daily, it certainly is better relative to the recent past.

And naturally, this is not a streak I would like to give up on. The sensible thing to do is to have some buffer posts ready, that can be deployed on days such as these. If I’m not up to sitting in front of a computer, filtering stuff I’ve read and deciding what to write about – and I’m really not up to it today – then I should be able to dip into my pitaara and schedule something that I’ve written in the past.

The good news is that I have 12 drafts waiting that will turn into good posts whenever I get around to finishing them. The bad news is that not one of them is complete. I teach economics for a living, but my real calling is procrastination.

Today’s post was going to be my notes from having read an article that I both enjoyed reading closely, and discussing with my students in class at the Gokhale Institute. I’m teaching behavioral economics this semester, and the essay in question has a lot of great points to think about in the context of biases and irrationality. I may come back to it in a later blog post, but for now, I’ll link to it, and leave as a snippet this lovely excerpt:

I’ve been tweeting about irrationality since 2017, and in that time I’ve noticed a disturbing pattern. Whenever I post about a cognitive bias or logical fallacy, my replies are soon invaded by leftists claiming it explains rightist beliefs, and by rightists claiming it explains leftist beliefs. In no cases will someone claim it explains their own beliefs. I’m likely guilty of this too; it feels effortless to diagnose others with biases and fallacies, but excruciatingly hard to diagnose oneself. As the famed decision theorist Daniel Kahneman quipped, “I’ve studied cognitive biases my whole life and I’m no better at avoiding them.”


And may I just say that the universe is rather good at trolling? I followed the author of this essay that I’m talking about on Twitter, and here’s a tweet that he recently retweeted:

Yes, yes, ok, fine.

On the Art of Doing and Communicating Research

Akshay Alladi had a lovely response to my post from yesterday:

There is a follow-up tweet to the tweet I’ve linked to above, please read it as well.

Today’s post is a placeholder post, in a manner of speaking. I want to expand upon each of the points I am going to make below. But for the moment, a simple list of five things I was reminded of when I read Akshay’s tweet.

  1. “Sure it works in practice, but does it work in theory?” is a joke, but as with all good jokes, it is funny because it is true.
  2. Research work is hard, and the greater the body of work that precedes it, the more incremental it tends to be. Or should I be saying ‘needs to be’?
  3. To what extent is the second point an inevitability given the depth of research in a particular field, and to what extent is it a rational response to your incentives and your choice set?
  4. Related to the last point, explained well towards the end of the same post: how do we incentivize folks to be bolder when it comes to taking up research questions, and what are the opportunity costs of doing so?
  5. Related to the fourth point now, and something that I would like to focus on myself: how do we get better at disseminating research, both within a particular community and to the world at large?

On Bryan Caplan’s Question

Bryan Caplan asked an interesting question on Twitter the other day:

I’m not sure if you can see what I voted for, but I am fairly sure you could have guessed!

But if you think about it a little bit, it’s actually not all that easy to answer this question. As with all good questions, there are nuances and layers. First, I asked my excellent research assistant to tell me more about what the phrase “epistemic value” might mean:

The phrase “epistemic value” refers to the degree to which a belief or piece of information contributes to our knowledge or understanding of the world. Epistemology is the branch of philosophy that concerns itself with the nature and scope of knowledge, and epistemic value is a term that is often used in discussions within this field.
Beliefs or pieces of information that have high epistemic value are those that are reliable, accurate, and well-supported by evidence. Conversely, beliefs or pieces of information that have low epistemic value are those that are unreliable, inaccurate, or poorly supported by evidence.
In general, beliefs or pieces of information that have high epistemic value are more valuable to us than those that have low epistemic value, because they are more likely to help us understand the world and make well-informed decisions.


I’ve always thought of the word epistemic as ‘about knowledge’, or ‘about knowing’. And the phrase ‘epistemic value’ seems to be – as expected – very similar. But that phrase in the first sentence of my RA’s answer is worth thinking about a little bit: ‘the degree to which a belief or piece of information contributes to our knowledge or understanding of the world’.

Now that makes the answer to Bryan’s question more difficult! To the extent that the top empirical papers of the last 5-10 years have been written using economic principles (and I see no reason to assume that this is not the case), they end up telling us more about the world than the principles that they have used. This has literally got to be true by definition.

They might tell us conditions under which, for a given geography and a given time period, these principles are upheld (or not). They might quantify the relationship between two variables, again for a given geography and a given time period. They might explain apparent violations of these principles, and explain why such phenomena occur.

To give you a simple, but completely hypothetical example – imagine a paper that examines how people change their consumption patterns in the face of high inflation in America in the year 2021. That wouldn’t make it a ‘top empirical econ paper’ of course – but leave that be for the moment. Standard econ theory will predict that consumers will respond to changing prices by changing their patterns of consumption (more fillers in your burger patties rather than only meat, for example). And this hypothetical paper of ours not only confirms this, but also quantifies it for a geography (USA) for a particular time period (2021).

But that would mean that the epistemic value of this paper is actually more than the wisdom of standard intro econ textbooks. Because your standard econ text will tell you that this will happen, but the textbook will be unable to tell you the extent to which this will be true for a given country for a given time period. That ‘piece of information’ re: the extent is an addition to the knowledge that we get from a standard econ text, and so by definition the paper has more epistemic value.

Whoops. I chose the first option in Bryan’s tweet, and it would seem that I am wrong.

Or am I?

For the year 2021, for the nation of the United States of America, the epistemic value of that paper is higher than the wisdom of a standard intro econ textbook. But the wisdom of the standard econ intro text will apply to all other geographies in the world in the year 2021, and will apply for all geographies for all years to come.

The paper wins in the specific, but becomes by definition inapplicable in the case of the general.

Time to roll out one of my favorite guns from my arsenal:

What are you optimizing for?

If you want more epistemic value for a given space and a given time, choose option 3 from Bryan’s tweet. If you want more epistemic value in general, choose option 1.

And even after having written this post, my own answer remains option 1 – but writing it helps me understand that I should ask one of my favorite questions more often.

What say you?

Older Adults Should Let Younger Adults Be Adults

The title of today’s post is a slightly longer version of a tweet written in response to Nitin Pai’s excellent article on just this topic:

Why do I think Nitin’s article is an excellent one? I’ll happily admit to my bias – I think it to be excellent because I happen to wholeheartedly agree with it. As always, do read the whole thing, which is about a whole lot more than what I’m going to talk about in today’s post.

What am I going to talk about? These two paragraphs:

One of our recent interns told me that she had to get her parent’s permission every time she wanted to step out of her campus. The college was more than 2,000km away from where her parents lived. But this was not a problem at all. The students had a gate pass app on their smartphones that would send a request to their parents’ smartphones, whose approval would be relayed to the security guards’ smartphones, and the gate would open (or remain closed, depending on what kind of parent you had). It did not matter that she was a smart, adult law student—without Mom’s permission, she couldn’t leave the campus.
As a parent, I am of course concerned about the safety of my children. But I am unable to fathom how an adult who can legally sign a contract, take a loan, have sex, get married, drive a truck, fly a plane, fight a war and vote in elections cannot leave the college campus without parental permission.


Students’ marksheets being shared with parents, parental approval being required before students can leave the campus, attendance records of students being shared with parents – as Nitin says, it is time we stop infantilizing our young adults. My specific point in today’s blogpost: this is especially true and relevant on college campuses.

Me, I happen to be of the opinion that attendance should not be mandatory in classrooms. It is my job as a teacher to make the class interesting enough for students to want to attend. It is not the student’s ‘duty’ to attend 75% (or any other number) of the classes. Fun question for you to ponder today: is a minimum attendance requirement a minimum support price regime for us professors? What does microeconomics teach us about price floors and price ceilings?
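For anyone who wants to play with that price-floor question, here is a minimal numeric sketch of what a binding floor does, using invented linear demand and supply curves (none of these numbers come from any real market, let alone from attendance data):

```python
# Toy linear market to illustrate a binding price floor. The demand and
# supply curves below are invented purely for illustration.

def excess_supply(floor_price, demand=lambda p: 100 - p, supply=lambda p: p):
    """Unsold quantity created by a price floor (zero if the floor doesn't bind)."""
    return max(0, supply(floor_price) - demand(floor_price))

# This market clears at a price of 50 (demand = supply = 50).
print(excess_supply(70))  # floor above equilibrium: 70 supplied, 30 demanded -> 40 unsold
print(excess_supply(40))  # floor below equilibrium doesn't bind -> 0
```

The parallel with a minimum support price: the floor guarantees the supplier a quantity that buyers would not choose on their own, and the mismatch doesn’t vanish – it just shows up elsewhere.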

But regardless of whether or not you agree with my point re: attendance, the consequences of not attending classes should be the sole responsibility of the adult in question. And the adult in question is the person in college, not their parents. You could argue that it is the parents – usually, in an Indian context – who stump up the fees, so they have a ‘right’ to know. But that is a conversation between the student and their parents, and I do not think the college need intervene.

The many other points that Nitin makes in his post regarding other nuances of this topic are also worth reading. But the point that resonated with me the most was the one I wanted to emphasize in this post: if you’re 18 and in college, this country thinks you’re old enough to elect its leaders. Surely, then, it also ought to treat you as an adult in all other respects. For if you’re deemed far too immature to decide for yourself whether or not to bunk classes, surely you’re not mature enough to vote in an election. No?

And therefore I say: older adults should let younger adults be adults.

On Specificity and Sensitivity

Before the pandemic came along, it was relatively more difficult to get students to be truly interested in the topic of specificity and sensitivity. And in a sense, understandably so. By that I do not mean the topic is not important – it absolutely is – but rather that I can understand why eyes may glaze over just a little bit:

Sensitivity and specificity mathematically describe the accuracy of a test which reports the presence or absence of a condition. If individuals who have the condition are considered “positive” and those who don’t are considered “negative”, then sensitivity is a measure of how well a test can identify true positives and specificity is a measure of how well a test can identify true negatives.


But when we’ve all got skin in the game, it’s a whole other story.

“We’re going to learn all about specificity and sensitivity today” is one way to begin a class on the topic.

“Let’s say you self-administered a Rapid Antigen Test in 2020, and the test came back positive. Do you have Covid or not?” is another.

Incentives matter!
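And to see why “do you have Covid or not?” is such a good hook, a minimal Bayes’ theorem sketch helps. The sensitivity, specificity and prevalence figures below are invented round numbers for illustration, not the measured properties of any real test:

```python
# P(actually infected | test positive), via Bayes' theorem. All numbers
# below are invented for illustration; they are not real test statistics.

def positive_predictive_value(sensitivity, specificity, prevalence):
    """Probability of having the condition given a positive test result."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# A test that catches 70% of true cases (sensitivity) and correctly clears
# 98% of healthy people (specificity), when 2% of the population is infected:
ppv = positive_predictive_value(sensitivity=0.70, specificity=0.98, prevalence=0.02)
print(f"P(infected | positive) = {ppv:.2f}")  # prints 0.42
```

A positive result from a seemingly accurate test, in other words, can still leave you closer to a coin toss than to certainty when prevalence is low. That surprise is what makes the question land.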

I’ve linked to this thread before, but it is worth sharing once again, for it remains the best way to quickly grok both what specificity and sensitivity are and how to untangle the two in your own head:

Why do I bring this up today? Because now that we’re past the pandemic, how do we now motivate students to learn about specificity and sensitivity?

By asking, as it turns out, if we’d prefer detection systems to pick up on more objects in the sky (sensitivity), or get better at picking up only the relevant objects in the sky (specificity)!

After the transit of the spy balloon this month, the North American Aerospace Defense Command, or NORAD, adjusted its radar system to make it more sensitive. As a result, the number of objects it detected increased sharply. In other words, NORAD is picking up more incursions because it is looking for them, spurred on by the heightened awareness caused by the furor over the spy balloon, which floated over the continental United States for a week before an F-22 shot it down on Feb. 4.


To a statistician, it doesn’t matter if it’s objects in the sky or objects in your body. The principle remains the same, and it is the principle that you should internalize as a student. But also, it is equally important that you ask yourself a very important, and a very underrated question once you’ve learned the principle in question:

Where else is this applicable?

I cannot begin to tell you how much more interesting things become when you ask and answer this question. UFOs and viruses in your body – what a class in statistics this would be!