Complements, Substitutes, AI and the Way Forward

One of the most popular blogposts on this blog is one that I wrote over five years ago: a simple explainer post about complements and substitutes.

An understanding of the concept of substitutes and complements is part of every economist’s arsenal, and it is useful in many surprising and unexpected ways. But never has it mattered more than in understanding the importance, the threat and the advantages of AI. A video that I have often linked to in the past, and will probably link to many times again, helps make this point clear:

When Steve Jobs says computers are like bicycles for the mind, he is saying that our mind becomes more powerful when we work with computers, rather than instead of them (substitutes) or infinitely worse, without them (almost all examinations conducted in higher education in India today).

And if you want to think about your career in this brave new world of ours, you really should be thinking about working with computers. Not against, or without. As it turns out, this is surprisingly hard to do for most of us. I invite you to walk into a higher education institute of your choice and listen to professors talk about how many students are copying during examinations. Nobody seems to ask why it is right and appropriate to check how good students are at doing work without computers. Why is this a skill that we’re building for folks who will be working in the 21st century?

And if you are learning how to work “effectively” without a computer – and again, that is what we train you for when we make you write three-hour pen-and-paper examinations in higher education – you are destroying your ability to earn more in the future.

I’m being quite serious.

The key questions will be: Are you good at working with intelligent machines or not? Are your skills a complement to the skills of the computer, or is the computer doing better without you? Worst of all, are you competing against the computer?

Cowen, Tyler. Average Is Over: Powering America Beyond the Age of the Great Stagnation. Penguin, 2013.

A lot of people are scared about job losses as a consequence of the rapid development of AI, and with good reason. AI can already do quite a few jobs better than humans can, but more than its current capabilities, what keeps a lot of us up at night is the rate of improvement. Not only is AI very good already, it is noticeably better than it was last year. And for the pessimists among us, the scarier part is that not only will AI be even better next year, the rate of improvement will itself improve. That is, AI’s abilities will not just improve in 2023 over 2022; the jump from 2022 to 2023 will be larger than the jump from 2021 to 2022. And that will be true(er) for 2025, and for 2026, and, well, there’s no telling where we’re headed.
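If the “improvement in the improvement” idea sounds abstract, here is a toy illustration in Python, with entirely made-up numbers standing in for AI capability. It only exists to show what a growing first difference looks like:

```python
# Toy illustration (made-up numbers): "the rate of improvement will itself improve".
capability = {2021: 10, 2022: 14, 2023: 20, 2024: 29}  # hypothetical capability scores

years = sorted(capability)
for prev, curr in zip(years, years[1:]):
    gain = capability[curr] - capability[prev]
    print(f"{prev} -> {curr}: gain of {gain}")
# Gains of 4, 6 and 9: each year's improvement is bigger than the last.
```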

But this is exactly why studying economics helps! Because both Steve Jobs and Tyler Cowen are, in effect, saying the same thing: so long as you plan your career by using computers/AI as a complement, you’re going to be just fine. If you think of your job as being substitutable – or if your job is, or will be, substitutable by a computer – well then, yes, you do have problems.

An underappreciated point is the inherent dynamism of this problem. While your job may not be substitutable by AI just yet, that is no reason to assume it will stay that way forever:


For example: is Coursera for Campus a complement to my teaching or a substitute for it? There are many factors that will decide the answer to this question, including quality, price and convenience among others, and complementarity today may well end up being substitutability tomorrow. If this isn’t clear, think about it this way: cars and drivers were complementary goods for decades, but today, is a self-driving car a complement or a substitute where a driver is concerned?

https://atomic-temporary-112243906.wpcomstaging.com/2022/04/18/supply-and-demand-complements-and-substitutes-and-dalle-e-2/
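If you want a slightly more formal Econ 101 test for this, cross-price elasticity of demand is the usual tool: when the price of one good rises, does the quantity demanded of the other fall (complements) or rise (substitutes)? Here is a minimal sketch in Python, with entirely made-up numbers for the driver/self-driving-car example above:

```python
def cross_price_elasticity(q_before, q_after, p_before, p_after):
    """Midpoint formula: % change in quantity of good B over % change in price of good A."""
    pct_dq = (q_after - q_before) / ((q_after + q_before) / 2)
    pct_dp = (p_after - p_before) / ((p_after + p_before) / 2)
    return pct_dq / pct_dp

# Made-up numbers: the price of hiring a driver rises from 100 to 120.
# Demand for chauffeur-driven car trips falls: cars and drivers are complements.
print(cross_price_elasticity(q_before=50, q_after=40, p_before=100, p_after=120))  # negative

# Demand for self-driving rides rises: drivers and self-driving cars are substitutes.
print(cross_price_elasticity(q_before=30, q_after=45, p_before=100, p_after=120))  # positive
```

A negative number says the two goods are complements; a positive number says they are substitutes, and the sign can flip over time, which is exactly the dynamism the excerpt above is warning about.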

But even so, I find myself being more optimistic about AI, and how it can make us more productive. I haven’t come across a better explainer than the one Ethan Mollick wrote in a lovely post called Four Paths to the Revelation:

I think the world is divided into two types of people: those obsessed with what creative AI means for their work & future and those who haven’t really tried creative AI yet. To be clear, a lot of people in the second category have technically tried AI systems and thought they were amusing, but not useful. It is easy to be deceived, because we naturally tend to try out AI in a way that highlights their weaknesses, not their strengths.
My goal in this post is to give you four experiments you can do, in less than 10 minutes each, with the free ChatGPT, in order to understand why you should care about it.

https://oneusefulthing.substack.com/p/four-paths-to-the-revelation

All four examples in this post are fantastic, but the third one is particularly relevant here. Ethan Mollick walks us through how AI can (see the sketch after this list):

  1. Give you ideas about what kind of business you might be able to set up given your skills
  2. Refine a particular idea that you would like to explore in greater detail
  3. Give you next steps in terms of actually taking that idea forward
  4. And even write out a letter that you might want to send to potential business collaborators
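If you want to try something like this yourself, here is a rough sketch of those four steps as prompts. The wording is my paraphrase rather than Mollick’s, and the skills listed are placeholders; paste each prompt into the free ChatGPT in turn, feeding its previous answer back in where it helps:

```python
# A rough, paraphrased sketch of the four-step workflow described above.
# Swap in your own skills; paste each prompt into ChatGPT, one after the other.
skills = "teaching economics, writing explainers, running workshops"  # placeholder

prompts = [
    f"I am good at {skills}. What kinds of businesses could I realistically set up?",
    "Take the most promising of those ideas and refine it in greater detail.",
    "What are the concrete next steps to actually take this idea forward?",
    "Draft a short letter I could send to a potential business collaborator about it.",
]

for step, prompt in enumerate(prompts, start=1):
    print(f"Step {step}: {prompt}\n")
```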

His earlier posts on his blog also help you understand how he himself is using ChatGPT in his daily workflow. He is a professor, and he helps you understand what a “mechanical” professor might be able to do:

To demonstrate why I think this is the case, I wanted to see how much of my work an AI could do right now. And I think the results will surprise you. While not nearly as good as a human professor at any task (please note, school administrators), and with some clear weaknesses, it can do a shocking amount right now. But, rather than be scared of AI, we should think about how these systems provide us an opportunity to help extend our own capabilities

https://oneusefulthing.substack.com/p/the-mechanical-professor (emphasis added)

Note the same idea being used here – it really is all about complementarity and substitutability.

AI can already create a syllabus and refine it; it can create an assignment and refine it; it can create a rubric for that assignment; it can create lecture notes; and it can write a rap song about a business management concept to make the content more interesting for students. I loathe the time spent creating documentation around education (every single teacher does), and it would take me a long time to come up with even a halfway passable rap song about substitutes and complements.

That last statement is no longer true: it took me twenty seconds.

Here are examples from outside the field of academia:

The question to ask isn’t “How long before I’m replaced?” The question to ask is “What can I do with the time that AI has saved me?” And the answer to that question should show that you are thinking deeply about how you can use (and continue to use!) AI as a useful complement.

If you don’t think about this, then yes, I do think that you and your job are in trouble. Get thinking!

Complements, Substitutes and Examinations

Writing all of what I wrote in February 2020 was a lot of fun, and gave rise to a series of interesting and interlinked ideas.

In today’s essay, I want to explore one of these interlinked ideas: I want to riff on the concept made famous by Steve Jobs: the computer as a bicycle for the mind. But with an Econ 101 twist to the topic!

I’ve already linked to the video where Steve Jobs speaks about this, but just in case you haven’t seen it, here’s the video:

As I mentioned in the post “Apple Through Five Articles”, Steve Jobs was essentially saying that the computer is a complementary good for the mind: that the mind becomes far more powerful, far more useful as a tool when used in conjunction with a computer.

A complement refers to a complementary good or service used in conjunction with another good or service. Usually, the complementary good has little to no value when consumed alone, but when combined with another good or service, it adds to the overall value of the offering. A product can be considered a compliment (sic) when it shares a beneficial relationship with another product offering, for example, an iPhone complements an app.

One way to understand Apple is to understand that Jobs effectively ensured that Apple built better and better computers. Apple has continued to do that even after Jobs has passed on, but they’ve been building computers all along. You can call them Macs and iPhones and iPads and Apple Watches, but they’re really computers.

But that’s not the focus of this piece. The focus of this piece is to think about this as an economist. If the mind is made more useful when it is able to complement the processing power of the computer, then the world is obviously more productive now that many more minds are being complemented by many more computers. Me writing this piece on my laptop, and you reading it on your device, is the most appropriate example – or so we shall assume.

But viewed this way, I would argue that we get the design of most of our examinations wrong. Rote memorization, or “mugging up”, is still the default method for evaluating whether a student has learnt a particular subject. Mugging up is just another way of saying that we ask students to substitute for the computer, not complement it!

When we reject open book examinations, when we reject the ability to write a paper using laptops/tablets that are connected to the internet, when we force students to substitute for computers, rather than use them to write better, richer, more informed answers, we’re actively rejecting the analogy of the bicycle for the mind.

To say nothing, of course, of the irrelevance of forcing people to write examinations for three hours using pen and paper. But that’s a topic for another day.

Right now, suffice it to say that when it comes to examinations in India, Steve Jobs would almost certainly have not approved.

Bottom line: If computers are a complement, our examinations are incorrectly designed, and we end up testing skills that are no longer relevant.

And the meta-skill you might take away from this essay is that a lot of ideas in economics are applicable in entirely surprising and unexpected areas!

I hope some of you disagree, and we can argue a bit about this. I look forward to it! 🙂

How do you interact with your computer?

“Alexa, play Hush, by Deep Purple.”

That’s my daughter, all of six years old. Leave aside for the moment the pride that I feel as a father and a fan of classic rock.

My daughter is coding.


My dad was in Telco for many years, which was what Tata Motors used to call itself back in the day. I do not remember the exact year, but he often regales us with stories about how Tata Motors procured its first computer. Programming it was not child’s play – in fact, interacting with it required the use of punch cards.

I do not know if it was the same type of computer, but watching this video gives us a clue about how computers of this sort worked.


The guy in the video, the computer programmer in Telco and my daughter are all doing the same thing: programming.

What is programming?

Here’s Wikiversity:

Programming is the art and science of translating a set of ideas into a program – a list of instructions a computer can follow. The person writing a program is known as a programmer (also a coder).

Go back to the very first sentence in this essay, and think about what it means. My daughter is instructing a computer called Alexa to play a specific song, by a specific artist. To me, that is a list of instructions a computer can follow.
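To make the point concrete, here is a hypothetical sketch, in Python, of that same request written out as the explicit instructions a computer can follow. None of this is Alexa’s actual code; the functions are invented purely for illustration:

```python
# Hypothetical sketch: the spoken request rewritten as explicit instructions.
# These functions are invented for illustration; this is not Alexa's real API.

def parse_intent(request: str) -> dict:
    """Toy parser: pull the song and artist out of a fixed phrase pattern."""
    _, _, rest = request.partition("play ")
    song, _, artist = rest.rstrip(".").partition(", by ")
    return {"action": "play", "song": song, "artist": artist}

def play(song: str, artist: str) -> None:
    """Stand-in for looking the track up and sending audio to the speaker."""
    print(f"Now playing '{song}' by {artist}")

request = "Alexa, play Hush, by Deep Purple."
intent = parse_intent(request)            # 1. understand what was asked
play(intent["song"], intent["artist"])    # 2. carry out the instruction
```

The six-year-old never sees any of these steps, but she has still written a program: a list of instructions the computer follows, expressed in plain speech.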

From using punch cards to using our voice and not even realizing that we’re programming: we’ve come a long, long way.


It’s one thing to be awed at how far we’ve come; it is quite another to think about the path we’ve taken to get there. When we learnt about mainframes, about Apple, about Microsoft and about laptops, we learnt about the evolution of computers, and some of the firms that helped us get there. I have not yet written about Google (we’ll get to it), but there’s another way to think about the evolution of computers: we think about how we interact with them.

Here’s an extensive excerpt from Wikipedia:

In the 1960s, Douglas Engelbart’s Augmentation of Human Intellect project at the Augmentation Research Center at SRI International in Menlo Park, California developed the oN-Line System (NLS). This computer incorporated a mouse-driven cursor and multiple windows used to work on hypertext. Engelbart had been inspired, in part, by the memex desk-based information machine suggested by Vannevar Bush in 1945.

Much of the early research was based on how young children learn. So, the design was based on the childlike primitives of eye-hand coordination, rather than use of command languages, user-defined macro procedures, or automated transformations of data as later used by adult professionals.

Engelbart’s work directly led to the advances at Xerox PARC. Several people went from SRI to Xerox PARC in the early 1970s. In 1973, Xerox PARC developed the Alto personal computer. It had a bitmapped screen, and was the first computer to demonstrate the desktop metaphor and graphical user interface (GUI). It was not a commercial product, but several thousand units were built and were heavily used at PARC, as well as other XEROX offices, and at several universities for many years. The Alto greatly influenced the design of personal computers during the late 1970s and early 1980s, notably the Three Rivers PERQ, the Apple Lisa and Macintosh, and the first Sun workstations.

The GUI was first developed at Xerox PARC by Alan Kay, Larry Tesler, Dan Ingalls, David Smith, Clarence Ellis and a number of other researchers. It used windows, icons, and menus (including the first fixed drop-down menu) to support commands such as opening files, deleting files, moving files, etc. In 1974, work began at PARC on Gypsy, the first bitmap What-You-See-Is-What-You-Get (WYSIWYG) cut & paste editor. In 1975, Xerox engineers demonstrated a Graphical User Interface “including icons and the first use of pop-up menus”.

In 1981 Xerox introduced a pioneering product, Star, a workstation incorporating many of PARC’s innovations. Although not commercially successful, Star greatly influenced future developments, for example at Apple, Microsoft and Sun Microsystems.

If you feel like diving into this topic and learning more about it, Daring Fireball has a lot of material about Alan Kay, briefly mentioned above.

So, as the Wikipedia article mentions, we moved away from punch cards and, using hand-eye coordination, entered the WIMP (windows, icons, menus, pointer) era.

It took a genius to move humanity into the next phase of machine-human interaction.


https://twitter.com/stevesi/status/1221853762534264832

The main tweet shown above is Steven Sinofsky rhapsodizing about how Steve Jobs and his firm were able to move away from the WIMP mode of thinking to using our fingers.

And from there, it didn’t take long to move to using just our voice as a means of interacting with the computers we now have all around us.

Voice-operated computing systems:

That leaves the business model, and this is perhaps Amazon’s biggest advantage of all: Google doesn’t really have one for voice, and Apple is for now paying an iPhone and Apple Watch strategy tax; should it build a Siri-device in the future it will likely include a healthy significant profit margin.

Amazon, meanwhile, doesn’t need to make a dime on Alexa, at least not directly: the vast majority of purchases are initiated at home; today that may mean creating a shopping list, but in the future it will mean ordering things for delivery, and for Prime customers the future is already here. Alexa just makes it that much easier, furthering Amazon’s goal of being the logistics provider — and tax collector — for basically everyone and everything.


Punch cards to WIMP, WIMP to fingers, and fingers to voice. As that last article makes clear, one needs to think not just of the evolution, but also about how business models have changed over time, and have caused input methods to change – but also how input methods have changed, and caused business models to change.

In other words, understanding technology is as much about understanding economics, and strategy, as it is about understanding technology itself.

In the next Tuesday essay, we’ll take a look at Google in greater detail, and then at emergent business models in the tech space.

 

Apple through five articles

I’ve said it before and I’ll say it again: if you are really and truly into the Apple ecosystem, you could do a lot worse than simply follow the blog Daring Fireball. I mean that in multiple ways. It is one of the most popular (if not the most popular) blogs on all things Apple. Popularity isn’t a great metric for deciding whether reading something is worth your time, but in this case, the popularity is spot on.

But it’s more than that: another reason for following John Gruber’s blog is you learn about a trait that is common to this blog and to Apple: painstaking attention to detail. Read this article, but read especially footnote 2, to get a sense of what I mean. There are many, many examples of Apple’s painstaking attention to detail, of course, but this story is one of my favorites.

There is no end to these kinds of stories, by the way. Here’s another:

Prior to the patent filing, Apple carried out research into breathing rates during sleep and found that the average respiratory rate for adults is 12–20 breaths per minute. They used a rate of 12 cycles per minute (the low end of the scale) to derive a model for how the light should behave to create a feeling of calm and make the product seem more human.

But finding the right rate wasn’t enough, they needed the light to not just blink, but “breathe.” Most previous sleep LEDs were just driven directly from the system chipset and could only switch on or off and not have the gradual glow that Apple integrated into their devices. This meant going to the expense of creating a new controller chip which could drive the LED light and change its brightness when the main CPU was shut down, all without harming battery life.

Anybody who is an Android/PC user instead (such as I) can’t help but be envious of this trait in Apple products.

There are many, many books/videos/podcasts you can refer to in order to understand Apple’s growth in the early years, and picking any one reference doesn’t necessarily mean the others are worse, but let’s begin with this simple one. Here’s Wikipedia on the same topic, in much more detail, of course.

Before it became one of the wealthiest companies in the world, Apple Inc. was a tiny start-up in Los Altos, California. Co-founders Steve Jobs and Steve Wozniak, both college dropouts, wanted to develop the world’s first user-friendly personal computer. Their work ended up revolutionizing the computer industry and changing the face of consumer technology. Along with tech giants like Microsoft and IBM, Apple helped make computers part of everyday life, ushering in the Digital Revolution and the Information Age.

The production function and complementary goods are two topics that every student of economics is taught again and again. Here’s how Steve Jobs explains it:

 

You can’t really learn the history of Apple without learning about John Sculley firing Steve Jobs.

According to Sculley’s wishes, Steve Jobs was to represent the company externally as a new Apple chairman without influencing the core business. As Jobs got wind of these plans to deprive him of his power, he tried to arrange a coup against Sculley on the Apple board. Sculley told the board: “I’m asking Steve to step down and you can back me on it and then I take responsibility for running the company, or we can do nothing and you’re going to find yourselves a new CEO.” The majority of the board backed the ex-Pepsi man and turned away from Steve Jobs.

And the one guy who helped Steve Jobs achieve his vision for Apple once Jobs came back was, of course, Jony Ive. This is a very, very long article but a fun read, not just about the relationship between Ive and Jobs, but also about Ive and Apple. Jony Ive no longer works at Apple of course (well, kinda sorta), but you can’t understand Apple without knowing more about Ive.

Jobs’s taste for merciless criticism was notorious; Ive recalled that, years ago, after seeing colleagues crushed, he protested. Jobs replied, “Why would you be vague?,” arguing that ambiguity was a form of selfishness: “You don’t care about how they feel! You’re being vain, you want them to like you.” Ive was furious, but came to agree. “It’s really demeaning to think that, in this deep desire to be liked, you’ve compromised giving clear, unambiguous feedback,” he said. He lamented that there were “so many anecdotes” about Jobs’s acerbity: “His intention, and motivation, wasn’t to be hurtful.”

Apple has been, for most of its history, defined by its hardware. That still remains true, for the most part. But where do Apple News, Apple Music and Apple TV fit in?

Apple, in the near future, will be as much about services as it is about hardware, and maybe more so. That is, according to Ben Thompson, the most likely (and correct, in his view) trajectory for Apple.

CEO Tim Cook and CFO Luca Maestri have been pushing the narrative that Apple is a services company for two years now, starting with the 1Q 2016 earnings call in January, 2016. At that time iPhone growth had barely budged year-over-year (it would fall the following three quarters), and it came across a bit as a diversion; after all, it’s not like the company was changing its business model