Learn at Twitter Speed, Get Tested at AOL Speed

The title of today’s post is lifted directly from an MR post from yesterday, which you should read in its entirety.

Instead of hearing a rumor at the coffee shop and running down to the bank branch to wait on line to withdraw your money, now you can hear a rumor on Twitter or the group chat and use an app to withdraw money instantly. A tech-friendly bank with a highly digitally connected set of depositors can lose 25% of its deposits in hours, which did not seem conceivable in previous eras of bank runs.
But the other part of the problem is that, while depositors can panic faster and banks can give them their money faster, the lender-of-last-resort system on which all of this relies is still stuck in a slower, more leisurely era. “When the user interface improves faster than the core system, it means customers can act faster than the bank can react,” wrote Byrne Hobart. You can panic in an instant and withdraw your money with an app, but the bank can’t get more money without a series of phone calls and test trades that can only happen during regular business hours.


Try this variant on for size:

Instead of hearing about a concept in a classroom, and running to the library to get access to the book that explains it in greater detail, now you can hear about a concept on Twitter, or the group chat, and use ChatGPT to learn all about it instantly. A tech-friendly classroom with a highly digitally connected group of learners can learn much more about a topic in a couple of hours, which did not seem conceivable in previous learning environments.
But the other part of the problem is that, while learners can learn faster and LLMs can give them far more nuance and context on demand, the exam system on which all of this ultimately relies for certification is still stuck in a slower, more traditional era. “When the learning environment improves faster than the testing environment, it means learners can learn better than colleges can meaningfully test them,” wrote a grumpy old blogger. You can learn much more about a topic in a semester than you ever could before, but the college will still insist on making you memorize stuff so that you can choose five questions out of six to answer in a closed-book, pen-and-paper examination.

It’s not an exact analogy, of course. But there are two points to this blogpost:

  1. Where colleges and universities are concerned, this is a useful framework to deploy. And sure, I had fun tweaking that excerpt in order to maximize my snarkiness – but I’m not joking about the point being made. When students are able to learn far better, far more effectively, and far faster, but the testing environment doesn’t keep up with either the learning or its applications, that is a problem. Simply put, if teaching and learning with LLMs is best, but the college thinks that testing without access to LLMs is best, there’s a disconnect.
  2. The broader point, of course, is that you should be applying this framework to everything. Banks and colleges, sure. But what about government (at all levels)? What about software companies? What about delivery apps? What about <insert the place you work at here>? Which parts of your organization are already using LLMs in their workflows, or will be sooner rather than later? Which parts will be the most reluctant, and therefore the last to adapt to this brave new world? What imbalances might result? How should we incentivize the rate of adoption so that we optimize appropriately?

Note that this doesn’t necessarily mean incentivizing those reluctant to adopt! You might want to incentivize slower adoption of ChatGPT, if that’s what you think is best (and yes, that goes for colleges too). But if that’s the route you’re going to go down, think first about the competition. And note that in the age of LLMs, defining who your competition is isn’t as easy as it used to be.