The act of creation is surrounded by a fog of myths. Myths that creativity comes via inspiration. That original creations break the mold, that they’re the products of geniuses, and appear as quickly as electricity can heat a filament. But creativity isn’t magic: it happens by applying ordinary tools of thought to existing materials.
And the soil from which we grow our creations is something we scorn and misunderstand, even though it gives us so much… and that's copying. Put simply, copying is how we learn. We can’t introduce anything new until we’re fluent in the language of our domain, and we do that through emulation.
For instance, all artists spend their formative years producing derivative work.
Bob Dylan’s first album contained eleven cover songs.
Richard Pryor began his stand-up career doing a not-very-good imitation of Bill Cosby.
And Hunter S. Thompson re-typed The Great Gatsby just to get the feel of writing a great novel.
Nobody starts out original. We need copying to build a foundation of knowledge and understanding. And after that... things can get interesting.
After we’ve grounded ourselves in the fundamentals through copying, it’s then possible to create something new through transformation. Taking an idea and creating variations. This is time-consuming tinkering but it can eventually produce a breakthrough.
James Watt created a major improvement to the steam engine because he was assigned to repair a Thomas Newcomen steam engine. He then spent twelve years developing his version.
Christopher Latham Sholes modeled his typewriter keyboard on a piano. This design slowly evolved over five years into the QWERTY layout we still use today.
And Thomas Edison didn’t invent the light bulb — his first patent was “Improvement in Electric Lamps” — but he did produce the first commercially viable bulb... after trying 6,000 different materials for the filament.
These are all major advances, but they’re not original ideas so much as tipping points in a continuous line of invention by many different people.
But the most dramatic results can happen when ideas are combined. By connecting ideas together, creative leaps can be made, producing some of history's biggest breakthroughs.
Johannes Gutenberg’s printing press was invented around 1440, but almost all its components had been around for centuries.
Henry Ford and the Ford Motor Company didn’t invent the assembly line, interchangeable parts or even the automobile itself. But they combined all these elements in 1908 to produce the first mass market car, the Model T.
And the Internet slowly grew over several decades as networks and protocols merged. It finally hit critical mass in 1991 when Tim Berners-Lee added the World Wide Web.
These are the basic elements of creativity: copy, transform, and combine. And the perfect illustration of all these at work is the story of the devices we’re using right now. So let’s travel back to the dawn of the personal computer revolution and look at the company that started it all… Xerox.
Xerox invented the modern personal computer in the early seventies. The Alto was a mouse-driven system with a graphical user interface. Bear in mind that a popular personal computer of this era was operated with switches, and if you flipped them in the right order, you got to see blinking lights. The Alto was way ahead of its time. Eventually Apple got a load of the Alto, and later released not one but two computers with graphical interfaces, the Lisa and its more successful follow-up, the Macintosh.
The Alto was never a commercial product, but Xerox did release a system based on it in 1981, the Star 8010, two years before the Lisa, three years before the Mac. It was the Star and the Alto that served as the foundation for the Macintosh.
The Xerox Star used a desktop metaphor with icons for documents and folders. It had a pointer, scroll bars, and pop-up menus. These were huge innovations and the Mac copied every one of them. But it was a new combination of its own that set the Mac on the path to long-term success.
Apple aimed to merge the computer with the household appliance. The Mac was to be a simple device like a TV or a stereo. This was unlike the Star, which was intended for professional use, and vastly different from the cumbersome command-based systems that dominated the era. The Mac was for the home and this produced a cascade of transformations.
Firstly, Apple removed one of the buttons on the mouse to make its novel pointing device less confusing. Then they added the double-click for opening files. The Star used a separate key to open files. The Mac also let you drag icons around and move and resize windows. The Star didn’t have drag-and-drop — you moved and copied files by selecting an icon, pressing a key, then clicking a location. And you resized windows with a menu. The Star and the Alto both featured pop-up menus, but because the location of these would move around the screen, the user had to continually re-orient. The Mac introduced the menu bar, which stayed in the same place no matter what you were doing. And the Mac added the trash can to make deleting files more intuitive and less nerve-wracking.
And lastly, through compromise and clever engineering Apple managed to pare down the Mac’s price to $2,500. Still pretty expensive but much cheaper than the $10,000 Lisa or the $17,000 Star.
But what started it all was the graphical interface merged with the idea of the computer as household appliance. The Mac is a demonstration of the explosive potential of combinations. The Star and the Alto, on the other hand, are the products of years of elite research and development. They’re a testament to the slow power of transformation. But of course they too contain the work of others. The Alto and the Star are evolutionary branches that lead back to the NLS System, which introduced windows and the mouse, to Sketchpad, the first interactive drawing application, and even back to the Memex, a concept resembling the modern PC decades before it was possible.
The interdependence of our creativity has been obscured by powerful cultural ideas, but technology is now exposing this connectedness. We’re struggling legally, ethically and artistically to deal with these implications — and that’s our final episode, Part 4.
What if Xerox never decided to pursue the graphical interface? Or Thomas Edison found a different trade? What if Tim Berners-Lee never got the funding to develop the World Wide Web? Would our world be different? Would we be further behind?
History seems to tell us things wouldn’t be so different. Whenever there’s a major breakthrough, there are usually others on the same path. Maybe a bit behind, maybe not behind at all.
Isaac Newton and Gottfried Leibniz independently invented calculus in the late seventeenth century.
The theory of evolution was proposed by Darwin, of course, but Alfred Russel Wallace had pretty much the same idea at pretty much the same time. And Alexander Graham Bell and Elisha Gray filed patents for the telephone on the same day.
We call this multiple discovery — the same innovation emerging from different places. Science and invention are riddled with it, but it can also happen in the arts.
In film, for instance, we had three Coco Chanel movies released within nine months of each other.
Around 1999 we had a quartet of sci-fi movies about artificial reality.
Even Charlie Kaufman’s unusually original Synecdoche, New York bears an uncanny resemblance to Tom McCarthy’s novel Remainder. Both tell the story of a man who suddenly becomes wealthy and starts recreating moments of his life, even going so far as to recreate the recreations.
And actually, this — the video you’re watching — was written just before the New Yorker published a Malcolm Gladwell story about Apple, Xerox and the nature of innovation.
We’re all building with the same materials. And sometimes by coincidence we get similar results, but sometimes innovations just seem inevitable.