Apple through five articles

I’ve said it before and I’ll say it again: if you are really and truly into the Apple ecosystem, you could do a lot worse than simply following the blog Daring Fireball. I mean that in multiple ways. It is one of the most popular (if not the most popular) blogs on all things Apple. Popularity isn’t a great metric for deciding whether reading something is worth your time, but in this case, the popularity is spot on.

But it’s more than that: another reason for following John Gruber’s blog is you learn about a trait that is common to this blog and to Apple: painstaking attention to detail. Read this article, but read especially footnote 2, to get a sense of what I mean. There are many, many examples of Apple’s painstaking attention to detail, of course, but this story is one of my favorites.

There is no end to these kinds of stories, by the way. Here’s another:

Prior to the patent filing, Apple carried out research into breathing rates during sleep and found that the average respiratory rate for adults is 12–20 breaths per minute. They used a rate of 12 cycles per minute (the low end of the scale) to derive a model for how the light should behave to create a feeling of calm and make the product seem more human.

But finding the right rate wasn’t enough; they needed the light to not just blink, but “breathe.” Most previous sleep LEDs were driven directly from the system chipset and could only switch on or off, without the gradual glow that Apple integrated into their devices. This meant going to the expense of creating a new controller chip which could drive the LED light and change its brightness even when the main CPU was shut down, all without harming battery life.
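To make the behavior concrete, here is a minimal sketch of what a “breathing” brightness curve at 12 cycles per minute might look like. The raised-cosine shape and the function name are my own illustrative assumptions; Apple’s actual controller logic isn’t public.

```python
import math

def breathing_brightness(t_seconds, cycles_per_minute=12.0):
    """Brightness between 0.0 (off) and 1.0 (full) at time t_seconds.

    At 12 cycles per minute, one 'breath' takes 5 seconds: the light
    glows up smoothly, peaks mid-breath, and fades back down.
    """
    period = 60.0 / cycles_per_minute          # 5 seconds per breath
    phase = 2.0 * math.pi * t_seconds / period
    return 0.5 * (1.0 - math.cos(phase))       # smooth glow, no hard on/off
```

Sampled every few milliseconds by a dedicated controller, a curve like this produces the gradual glow that a simple on/off chipset pin cannot.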

Anybody who is an Android/PC user instead (such as me) can’t help but be envious of this trait in Apple products.

There are many, many books, videos, and podcasts you can consult to understand Apple’s growth in the early years, and no single reference necessarily makes the others worse, but let’s begin with this simple one. Here’s Wikipedia on the same topic, in much more detail, of course.

Before it became one of the wealthiest companies in the world, Apple Inc. was a tiny start-up in Los Altos, California. Co-founders Steve Jobs and Steve Wozniak, both college dropouts, wanted to develop the world’s first user-friendly personal computer. Their work ended up revolutionizing the computer industry and changing the face of consumer technology. Along with tech giants like Microsoft and IBM, Apple helped make computers part of everyday life, ushering in the Digital Revolution and the Information Age.

The production function and complementary goods are two topics that every student of economics is taught again and again. Here’s how Steve Jobs explains it:


You can’t really learn the history of Apple without learning about John Sculley firing Steve Jobs.

According to Sculley’s wishes, Steve Jobs was to represent the company externally as a new Apple chairman without influencing the core business. As Jobs got wind of these plans to deprive him of his power, he tried to arrange a coup against Sculley on the Apple board. Sculley told the board: “I’m asking Steve to step down and you can back me on it and then I take responsibility for running the company, or we can do nothing and you’re going to find yourselves a new CEO.” The majority of the board backed the ex-Pepsi man and turned away from Steve Jobs.

And the one guy who helped Steve Jobs achieve his vision for Apple once Jobs came back was, of course, Jony Ive. This is a very, very long article but a fun read, not just about the relationship between Ive and Jobs, but also about Ive and Apple. Jony Ive no longer works at Apple of course (well, kinda sorta), but you can’t understand Apple without knowing more about Ive.

Jobs’s taste for merciless criticism was notorious; Ive recalled that, years ago, after seeing colleagues crushed, he protested. Jobs replied, “Why would you be vague?,” arguing that ambiguity was a form of selfishness: “You don’t care about how they feel! You’re being vain, you want them to like you.” Ive was furious, but came to agree. “It’s really demeaning to think that, in this deep desire to be liked, you’ve compromised giving clear, unambiguous feedback,” he said. He lamented that there were “so many anecdotes” about Jobs’s acerbity: “His intention, and motivation, wasn’t to be hurtful.”

Apple has been, for most of its history, defined by its hardware. That still remains true, for the most part. But where do Apple News, Apple Music, and Apple TV fit in?

Apple, in the near future, will be as much about services as it is about hardware, and maybe more so. That is, according to Ben Thompson, the most likely (and, in his view, correct) trajectory for Apple.

CEO Tim Cook and CFO Luca Maestri have been pushing the narrative that Apple is a services company for two years now, starting with the 1Q 2016 earnings call in January, 2016. At that time iPhone growth had barely budged year-over-year (it would fall the following three quarters), and it came across a bit as a diversion; after all, it’s not like the company was changing its business model.

From Mainframes to Personal Computing: The Journey

From mainframes to desktops, from desktops to laptops, from laptops to phones, and from phones to watches. So far. As I said in the previous edition of Tech Tuesdays, my daughter doesn’t think of Alexa as a computer, but what is Alexa if not one?

We are an empowered species today: most of us – not all, to be sure, but most of us – carry around more computing power than was used to send people to the moon. Not only do we take it for granted, but the programming itself is done at such a high level that we aren’t even aware that we’re programming a machine.

For example: when my daughter says to Alexa, “Set a timer for five minutes”, she’s really programming a computer to emit a series of beeps in three hundred seconds, starting now. But this wasn’t always the case. There was a time when people were excited about the fact that they could bring a machine home into which they would have to laboriously (by our current standards) input a series of instructions to make it do certain things.
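Her request is, in effect, a one-line program. A minimal sketch of what that timer amounts to in code (the function name and the callback are my own illustrative choices, not anything Alexa actually runs):

```python
import threading

def set_timer(seconds, on_done):
    """Run on_done after `seconds` seconds have elapsed, starting now -
    the essence of 'Alexa, set a timer for five minutes' (300 seconds)."""
    timer = threading.Timer(seconds, on_done)
    timer.start()
    return timer

# Usage: after five minutes, emit the beeps (here, just a print).
# set_timer(300, lambda: print("Beep beep beep!"))
```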

What kind of machines were these? Who made them, for what reason? What changed in terms of ease of use, design, and available accessories – and with what results for us, as society?

In today’s set of five links, we take a look at the answers to some of these questions.

  1. “At the time that IBM had decided to enter the personal computer market in response to Apple’s early success, IBM was the giant of the computer industry and was expected to crush Apple’s market share. But because of these shortcuts that IBM took to enter the market quickly, they ended up releasing a product that was easily copied by other manufacturers using off the shelf, non-proprietary parts. So in the long run, IBM’s biggest role in the evolution of the personal computer was to establish the de facto standard for hardware architecture amongst a wide range of manufacturers. IBM’s pricing was undercut to the point where IBM was no longer the significant force in development, leaving only the PC standard they had established. Emerging as the dominant force from this battle amongst hardware manufacturers who were vying for market share was the software company Microsoft that provided the operating system and utilities to all PCs across the board, whether authentic IBM machines or the PC clones.”
    ..
    ..
    The excerpt above comes a long way into the Wikipedia article, and the correct way to read this article, if you ask me, is to scan through it, rather than read every single word. But the excerpt, for an economist, is the most interesting part, for it explains how Microsoft became Microsoft – because of an ill-thought-out strategy by IBM!
    ..
    ..
  2. “Although the company knew that it could not avoid competition from third-party software on proprietary hardware—Digital Research released CP/M-86 for the IBM Displaywriter, for example—it considered using the IBM 801 RISC processor and its operating system, developed at the Thomas J. Watson Research Center in Yorktown Heights, New York. The 801 processor was more than an order of magnitude more powerful than the Intel 8088, and the operating system more advanced than the PC DOS 1.0 operating system from Microsoft. Ruling out an in-house solution made the team’s job much easier and may have avoided a delay in the schedule, but the ultimate consequences of this decision for IBM were far-reaching.”
    ..
    ..
    As economists, we’re interested in understanding the fact that we have the power to be vastly more productive now that we all have our own personal computers, sure, but we’re also interested in finding out why firms that were the giants of their time (lookin’ at you, IBM) didn’t make the transition over to being the giants of the personal computing era. We’re interested in this in and of itself, of course, but also so that we can apply these lessons to the giants of our time.
    ..
    ..
  3. “The 90-minute presentation essentially demonstrated almost all the fundamental elements of modern personal computing: windows, hypertext, graphics, efficient navigation and command input, video conferencing, the computer mouse, word processing, dynamic file linking, revision control, and a collaborative real-time editor (collaborative work). Engelbart’s presentation was the first to publicly demonstrate all of these elements in a single system. The demonstration was highly influential and spawned similar projects at Xerox PARC in the early 1970s. The underlying technologies influenced both the Apple Macintosh and Microsoft Windows graphical user interface operating systems in the 1980s and 1990s.”
    ..
    ..
    I learnt about this only while researching links for this series: the mother of all demos that inspired, essentially, what we know as personal computing today. Fascinating stuff!
    ..
    ..
  4. “Now that computers were up and running with Microsoft’s operating system, the next step was to build tools to streamline the user experience. Today, it’s hard to imagine a world where computers didn’t run programs such as Microsoft Word, PowerPoint and Excel. These “productivity applications,” as Microsoft calls them, were revolutionary tools for getting work done. They automated many of the aspects of word processing, accounting, creating presentations and more. Plus, Microsoft’s deal with Apple allowed it to develop versions of these programs for Macintosh computers. Over the years, Microsoft has provided updates to the Office Suite, from additional programs (such as Outlook and Access) to additional features.”
    ..
    ..
    Yes, this is a listicle, but a useful, mostly informative one. The excerpt above comes midway through the article, and the rest of the article speaks about Microsoft’s attempted move towards becoming a hardware focused firm, and the subsequent move towards being, well, a software focused firm under Nadella. We’ll be focusing on Microsoft next Tuesday, so consider this an appetizer.
    ..
    ..
  5. “The iPhone’s potential was obviously deep, but it was so deep as to be unfathomable at the time. The original iPhone didn’t even shoot video; today the iPhone and iPhone-like Android phones have largely killed the point-and-shoot camera industry. It has obviated portable music players, audio recorders, paper maps, GPS devices, flashlights, walkie-talkies, music radio (with streaming music), talk radio (with podcasts), and more. Ride-sharing services like Uber and Lyft wouldn’t even make sense pre-iPhone. Social media is mobile-first, and in some cases mobile-only. More people read Daring Fireball on iPhones than on desktop computers.”
    ..
    ..
    John Gruber rhapsodizing about a whole variety of things, but mostly about how the iPhone was the culmination of the long journey that began with the move away from mainframes. It is exhilarating to realize how far we’ve come! Two weeks from now, we’ll also take a look at Apple’s long journey.

Tech: Understanding Mainframes Better

My daughter, all of six years old, doesn’t really know what a computer is.

Here’s what I mean by that: a friend of hers has a desktop in her bedroom, and to my daughter, that is a computer. My laptop is, well, a laptop – to her, not a computer. And she honestly thinks that the little black disk that sits on a coffee table in our living room is a person/thing called Alexa.

How to reconcile – both for her and for ourselves – the idea of what a computer is? The etymology of the word is very interesting – it actually referred to a person! While it is tempting to write a short essay on how Alexa has made it possible to complete the loop in this case, today’s links are actually about understanding mainframes better.

Over the next four or five weeks, we’ll trace out the evolution of computers from mainframes down to, well, Alexa!

  1. “Several manufacturers and their successors produced mainframe computers from the late 1950s until the early 21st Century, with gradually decreasing numbers and a gradual transition to simulation on Intel chips rather than proprietary hardware. The US group of manufacturers was first known as “IBM and the Seven Dwarfs”: usually Burroughs, UNIVAC, NCR, Control Data, Honeywell, General Electric and RCA, although some lists varied. Later, with the departure of General Electric and RCA, it was referred to as IBM and the BUNCH. IBM’s dominance grew out of their 700/7000 series and, later, the development of the 360 series mainframes.”
    ..
    ..
    Wikipedia’s article on mainframes contains a short history of the machines.
    ..
    ..
  2. “Mainframe is an industry term for a large computer. The name comes from the way the machine is build up: all units (processing, communication etc.) were hung into a frame. Thus the maincomputer is build into a frame, therefore: Mainframe. And because of the sheer development costs, mainframes are typically manufactured by large companies such as IBM, Amdahl, Hitachi.”
    ..
    ..
    This article was written a very long time ago, but is worth looking at for a simple explanation of what mainframes are. Their chronology is also well laid out – and the photographs alone are worth it!
    ..
    ..
  3. “Although only recognized as such many years later, the ABC (Atanasoff-Berry Computer) was really the first electronic computer. You might think “electronic computer” is redundant, but as we just saw with the Harvard Mark I, there really were computers that had no electronic components, and instead used mechanical switches, variable toothed gears, relays, and hand cranks. The ABC, by contrast, did all of its computing using electronics, and thus represents a very important milestone for computing.”
    ..
    ..
    This is your periodic reminder to please read Cixin Liu. But also, this article goes more into the details of what mainframe computers were than the preceding one. Please be sure to read through all three pages – and again, the photographs alone are worth the price of admission.
    ..
    ..
  4. A short, Philadelphia-focussed article that is only somewhat related to mainframes, but still – in my opinion – worth reading, because it gives you a what-if idea of the evolution of the business. Is that really how the name came about?! (see the quote about bugs below)
    ..
    ..
    “So Philly should really be known as “Vacuum Tube Valley,” Scherrer adds: “We want to trademark that.” He acknowledged the tubes were prone to moths — “the original computer bugs.”
    ..
    ..
  5. I’m a sucker for pictures of old technology (see especially the “Death to the Mainframe” picture)

Tech: Links for 5th November, 2019

  1. “Wearable technology, wearables, fashion technology, tech togs, or fashion electronics are smart electronic devices (electronic device with micro-controllers) that can be incorporated into clothing or worn on the body as implants or accessories.”
    ..
    ..
    Wikipedia on wearables.
    ..
    ..
  2. Wearables are bigger than you thought.
    ..
    ..
    “Wearables are now bigger than iPad and will soon be bigger than the Mac. And the glasses are supposedly coming next year, and the $250 AirPods Pro just shipped.”
    ..
    ..
  3. You’ve heard of Google Glass, presumably. But uh, one ring to rule ’em all…?
    ..
    ..
    “Amazon is experimenting with putting Alexa everywhere, and its latest experiment might be the wildest yet: a new smart ring called the Echo Loop that puts Alexa on your finger.”
    ..
    ..
  4. “And it goes without saying that the technology still matters: chips need to get faster (a massive Apple advantage), batteries need to improve (also an Apple specialty), and everything needs to get smaller. This, though, is the exact path taken by every piece of hardware since the advent of the industry. They are hard problems, but they are known problems, which is why smart engineers solve them.”
    ..
    ..
    The ever excellent Ben Thompson, writing about wearables in 2016. He was bullish then, and I suspect will be even more bullish now.
    ..
    ..
  5. All of which, I hope, will help contextualize Google’s latest acquisition.

RoW: Links for 16th October, 2019

Five links about the NBA, China and the United States of America

  1. “Apple removed an app late Wednesday that enabled protesters in Hong Kong to track the police, a day after facing intense criticism from Chinese state media for it, plunging the technology giant deeper into the complicated politics of a country that is fundamental to its business.”
    ..
    ..
    The NYT gives us useful background about the topic…
    ..
    ..
  2. “But Apple is in a particularly difficult position, due to the company’s success in China: Unlike several other big consumer tech companies, which either do little business in China or none at all, Apple has thrived in China. The country is Apple’s third-biggest market, which generates some $44 billion a year in sales. And Apple’s supply chain, which lets it produce the hundreds of millions of iPhones it sells around the world each year, is deeply embedded in China.”
    ..
    ..
    Recode explains the perils of integrating too successfully with China in terms of both backward linkages as well as final sales (that’s a loaded statement, worthy of a deeper analysis!)
    ..
    ..
  3. “This morning brings new and exciting news from the land of Apple. It appears that, at least on iOS 13, Apple is sharing some portion of your web browsing history with the Chinese conglomerate Tencent. This is being done as part of Apple’s “Fraudulent Website Warning”, which uses the Google-developed Safe Browsing technology as the back end. This feature appears to be “on” by default in iOS Safari, meaning that millions of users could potentially be affected.”
    ..
    ..
    Via John Gruber, over at Daring Fireball (please follow that blog!), a somewhat unsurprising, yet depressing revelation.
    ..
    ..
  4. “I am not particularly excited to write this article. My instinct is towards free trade, my affinity for Asia generally and Greater China specifically, my welfare enhanced by staying off China’s radar. And yet, for all that the idea of being a global citizen is an alluring concept and largely my lived experience, I find in situations like this that I am undoubtedly a child of the West. I do believe in the individual, in free speech, and in democracy, no matter how poorly practiced in the United States or elsewhere. And, in situations like this weekend, when values meet money, I worry just how many companies are capable of choosing the former?”
    ..
    ..
    Ben Thompson provides useful background and an even more useful overview of the larger picture.
    ..
    ..
  5. “Daryl Morey wrote a pro-Hong Kong tweet and had to retract it, and then both the Rockets and the NBA had to eat crow. ESPN — part of the Disney empire I might add — has given only tiny, tiny coverage to the whole episode, even though it is a huge story on non-basketball sites. I’ve been checking the espn/nba site regularly over the last 24 hours, and there is one small link in the upper corner, no featured story at all.”
    ..
    ..
    And finally, Tyler Cowen explains how incentives always and everywhere matter.

Tech: Links for 2nd July, 2019

Five articles from tech, but about something that took place about twelve years ago.

  1. “One of the most important trends in personal technology over the past few years has been the evolution of the humble cellphone into a true handheld computer, a device able to replicate many of the key functions of a laptop. But most of these “smart phones” have had lousy software, confusing user interfaces and clumsy music, video and photo playback. And their designers have struggled to balance screen size, keyboard usability and battery life.”
    ..
    ..
    Thus began Walt Mossberg’s review of the first ever iPhone. That review is fun to read in order to understand how far smartphones have come since then, and what we took for granted then versus what we take for granted now.
    ..
    ..
  2. “With the iPhone XS and Apple Neural Engine, the input isn’t an image, it’s the data right off the sensors. It’s really kind of nuts how fast the iPhone XS camera is doing things in the midst of capturing a single image or frame of video. One method is to create an image and then apply machine learning to it. The other is to apply machine learning to create the image. One way Apple is doing this with video is by capturing additional frames between frames while shooting 30 FPS video, even shooting 4K. The whole I/O path between the sensor and the Neural Engine is so fast the iPhone XS camera system can manipulate 4K video frames like Neo dodging bullets in The Matrix.”
    ..
    ..
    That was then, this is now – well, this is also last year. Here’s John Gruber reviewing the iPhone XS; reading both reviews one after the other shows just how far we’ve come.
    ..
    ..
  3. “…I’m not convinced that anyone at Google fully thought through the implication of favoring Android with their services. Rather, the Android team was fully committed to competing with iOS — as they should have been! — and human nature ensured that the rest of Google came along for the ride. Remember, given Google’s business model, winning marketshare was perfectly correlated with reaping outsized profits; it is easy to see how the thinking and culture that developed around Google’s core business failed to adjust to the zero-sum world of physical devices. And so, as that Gundotra speech exemplified, Android winning became synonymous with Google winning, when in fact Android was as much ouroboros as asset.”
    ..
    ..
    It’s not just technology that changed then – entire ecosystems and business models had to be changed, updated, pilfered. Microsoft, obviously, but most significantly, Google.
    ..
    ..
  4. “There’s that word I opened with: “future”. As awesome as our smartphones are, it seems unlikely that this is the end of computing. Keep in mind that one of the reasons all those pre-iPhone smartphone initiatives failed, particularly Microsoft’s, is that their creators could not imagine that there might be a device more central to our lives than the PC. Yet here we are in a world where PCs are best understood as optional smartphone accessories. I suspect we will one day view our phones the same way: incredibly useful devices that can do many tasks better than anything else, but not ones that are central for the simple reason that they will not need to be with us all of the time. After all, we will have our wearables.”
    ..
    ..
    One risk that all of us run is to think of the future in terms of what exists now – which is one reason why 2007 was such big news for tech and then for all of us. What might a similar moment be in the near future? Earlier, you had to have a computer, and it was nice to have a smartphone. Now, you have to have a smartphone, and it is nice to have a computer. When might it be nice to have a smartphone, while you have to have a ‘wearable’?
    ..
    ..
  5. “The Defense Advanced Research Projects Agency (DARPA) is developing a new “Molecular Informatics” program that uses molecules as computers. “Chemistry offers a rich set of properties that we may be able to harness for rapid, scalable information storage and processing,” Anne Fischer, program manager in DARPA’s Defense Sciences Office, said in a statement. “Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure as well as variables such as shape, size, or even color. This richness provides a vast design space for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current logic-based, digital architectures.” ”
    ..
    ..
    Not just the rather unimaginable (to me, at any rate) thought of molecules as computers (did I get that right?!), but also a useful timeline of how computers have evolved. Also note how the rate of “getting better” has gotten faster over time!