“[But] men may construe things after their fashion/Clean from the purpose of the things themselves” ~ Cicero in Shakespeare’s Julius Caesar
One of the biggest prevailing cultural myths in tech is that world-changing products are created by visionaries endowed with near-mystical powers to see and shape the future. But buying into this myth blinds you to the practical realities of your customers' needs. No one can predict the future, but with a little insight and a healthy dose of practicality, it's entirely possible to discern the needs of the present.
Case in point: the story of what many people consider to be the world's first computer program. But it didn't happen the way you probably imagine. It was written over 100 years ago, by the daughter of a famous Romantic poet.
To tell the story of the world's first computer program, we need to go back to early nineteenth-century England. It was a turbulent nation, one in which the naturalism and fraught emotionalism of the Romantic movement existed, often uncomfortably, alongside the burgeoning Industrial Revolution. After all, these two movements were in many ways philosophically opposed: Romanticism celebrated poetry, art, and individualism, while the rise of industry and mechanization meant the (gleeful) proliferation of steam engines, moving parts, and mass production.
But these worlds collided amicably in London in 1833, when Ada Byron, aka Lady Ada Lovelace -- daughter of the Romantic movement’s greatest poet anti-hero, Lord Byron -- attended a salon given by mathematical whiz Charles Babbage.
Here she saw a demonstration of a prototyped portion of his Difference Engine, a machine that could perform simple polynomial calculations. An amateur mathematician herself, Ada, the story goes, was able to see the mathematical poetry in the machine and its (to a modern eye) striking steampunk construction. Babbage later went on to draw up plans for a machine that could perform an even higher order of calculations, the Analytical Engine.
Ada was entranced -- some might say obsessed -- with the possibilities of this Analytical Engine, so much so that she, with Babbage’s encouragement, translated a French article about it by the Italian engineer Luigi Menabrea, appending copious supplemental notes of her own. Her notes contained an algorithm which she intended the computing machine to carry out; many computing historians point to this algorithm as the first computer program.
Babbage, however, was not a skilled marketer, and he was unable to convince anyone to fund the building of his theoretical machine. Ada died in 1852, at age 36, of uterine cancer; Babbage died in 1871, his engines still unbuilt.
In The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Age, Walter Isaacson says that Babbage, who is commonly credited as the father of the modern computer, was “100 years ahead of his time.” Of Ada, he rhapsodizes that she “envisioned the modern computer” and “was able to glimpse a future in which machines would become partners of the human imagination.”
One can hardly blame Isaacson for succumbing to such a romantic and idealized notion. This sort of story is, after all, the stuff of great novels and epic heroic tales. It’s compelling to imagine that Babbage and Lovelace were two great visionaries, able to accurately peer over 100 years into the future, yet tragically impeded by the blockheads who couldn’t share in their vision.
But the truth is, no one can see into the future -- not even a genius. The past (and our present) is littered with failed products that a so-called “visionary” was convinced would be THE Product of the Future. Humans are simply not wired to be able to take into account the thousands of factors at play to be able to accurately forecast such things. As Harvard professor and psychologist Dan Gilbert points out, we aren’t even any good at predicting our own future happiness, and “no matter how much time we spend thinking about the future, we don't get any better at predicting it.” How then, can we expect to predict the desires of anyone else, especially on a large scale?
But there is something we can do: We can understand the needs of the present.
That’s actually what Charles Babbage and Ada Lovelace were doing, after all: People needed a way to crunch numbers and make computations more easily and more quickly, and they had an idea as to how that job could be done with their beloved mechanical engines. The problem with accomplishing this was twofold: Babbage wasn’t able to make anyone understand why they would want his solution to their needs, and even if he could have, it’s unlikely that the technology of the time could have cut the metal moving parts of his Analytical Engine with sufficient precision.
By definition, Babbage and Lovelace were not “ahead of their time.” Ada did not envision our present world of computers and microprocessors. She and her cohort were imagining a solution to an unmet need; they merely lacked the funds and the technology to make it a reality. Were they creative? Yes. Geniuses? Very likely. Oracles? Certainly not.
An Apple a Day
For a more contemporary example of the myth of the so-called visionary vs. the reality of human beings successfully solving problems and creating products for unmet needs, one need look no further than Apple.
In 1977, the Apple II hit the market. It was built by Steve Wozniak, who envisioned it as a gaming platform; the other Steve (Jobs) perceived the need for a personal computer that required no assembly (unlike the Apple I, which was a kit computer). Perhaps more than any other modern-day innovator, Jobs has been lionized, mythologized, and, by some, even worshiped as a near-deity. In The Innovators, Isaacson states that Jobs wasn’t worried about the competition when IBM entered the PC market because “he had seen the future and was busy inventing it.”
Jobs -- like Babbage and Ada before him -- wasn’t a prophet, but he was particularly skilled at discerning the unmet needs of his present day. Indeed, the Apple II ultimately became a widespread phenomenon not because of some mysterious visionary power that Jobs possessed, but because another factor allowed it to fulfill more unmet needs for more people on a single platform.
That factor was VisiCalc.
Invented by Dan Bricklin and Bob Frankston, VisiCalc was the first commercially available consumer spreadsheet program. The idea for the software came about when Bricklin observed a professor filling in calculations on a chalkboard chart. Every time a single calculation was changed, the professor had to go and erase each corresponding figure to make the entire chart correct. What if, the young programmers thought, there was an easier way to do this job?
From the beginning of their endeavor, Bricklin wrote that he and his partner envisioned VisiCalc as a product, not a program. They even implemented design principles and constraints: “Decisions were made with the product in mind and, to the extent possible, the programming was towards this end,” Bricklin explains. “The goal was to give the user a conceptual model which was unsurprising -- it was called the principle of least surprise. We were illusionists synthesizing an experience. Our model was the spreadsheet -- a simple paper grid that would be laid out on a table. The paper grid provided an organizing metaphor for working with a series of numbers.”
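The chalkboard scene captures the core mechanic a spreadsheet automates: when one input changes, every figure computed from it must be updated. Here is a minimal sketch of that idea in Python (the `Sheet` class and cell names are hypothetical illustrations, not VisiCalc's actual design): cells hold either raw numbers or formulas over other cells, and formulas are recomputed on demand, so dependents are never stale.

```python
class Sheet:
    """A toy spreadsheet: cells hold numbers or formulas over other cells."""

    def __init__(self):
        self.values = {}    # cell name -> raw number
        self.formulas = {}  # cell name -> function of the sheet

    def set_value(self, name, number):
        self.values[name] = number

    def set_formula(self, name, func):
        self.formulas[name] = func

    def get(self, name):
        # Formulas are recomputed on every read, so any cell that depends
        # on a changed input is automatically up to date.
        if name in self.formulas:
            return self.formulas[name](self)
        return self.values[name]


sheet = Sheet()
sheet.set_value("A1", 100)
sheet.set_value("A2", 250)
sheet.set_formula("A3", lambda s: s.get("A1") + s.get("A2"))  # A3 = A1 + A2
print(sheet.get("A3"))  # 350

# Change one input; the dependent cell is correct on the next read --
# no manual erasing and rewriting of downstream figures.
sheet.set_value("A1", 500)
print(sheet.get("A3"))  # 750
```

Real spreadsheets avoid recomputing everything by tracking dependencies and recalculating only affected cells, but the user-facing illusion is the same: edit one number and the grid stays consistent.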
In its early version, due to memory requirements, VisiCalc worked only with the Apple II, and people reportedly plunked down over $1K (equivalent to $5K in 2015) in order to be able to use the (just under) $100 VisiCalc software.
Were Jobs, Frankston, and Bricklin sages gifted with the ability to look into the future? No. These two (admittedly ingenious) products, when put together, successfully filled more unmet needs for consumers than either product could have done alone, and did it better than any product had ever done before. And that practicality -- not prescience -- is why the product became so wildly successful in the marketplace.
Need more proof that the myth of the visionary innovator is just that? Let’s talk about robots.
In 1968, Douglas Engelbart presented his NLS online demo to an audience of 1,000 computer professionals, in what has become known as “The Mother of All Demos.” Among other things, this marked the debut of the computer mouse, shared-screen collaboration, hypertext, and dynamic file linking. Engelbart began the demo by asking his audience: “If in your office, you as an intellectual worker were supplied with a computer display backed up by a computer that was alive for you all day and was instantly responsive, how much value could you derive from that?”
How much value, indeed.
Meanwhile, down the hall from Engelbart’s demo, another scientist was presenting a film about a robot that could pretend to hear and see things, as reported by John Markoff in What the Dormouse Said. While robots may be the stuff of many a “visionary” predictive future fantasy, the lion’s share of the attention went to Engelbart’s computer demo, a demo that so richly revealed the many unmet needs that could be filled with a computer that is “responsive to all your needs.”
And, no matter how many fantastical versions of the robot the “visionaries” of today and tomorrow create, until and unless they make one that can fill heretofore unmet needs on a large scale, don’t expect to exchange your laptop for a real-life R2-D2 anytime soon.