This year, more than 15 million earthlings strummed, drummed and blared along with rock classics on plastic toy instruments. And Tod Machover is the guy who let them do it.
Back in the early ’90s, in his lab at the Massachusetts Institute of Technology, the professor of musicology helped Harmonix founders Alex Rigopulos and Eran Egozy build the software technology that would later become the core of Guitar Hero and Rock Band. By adapting Machover’s basic research in music technology into a videogame application, they taught the world a new way to appreciate rock music.
“Thanks to Guitar Hero and Rock Band, certain musical values and qualities can more easily be explored by laymen,” says Machover. “You don’t need to be a trained musician anymore to appreciate music on a deeper level. It’s such a great invention. I wish I could say I came up with it.”
But Machover’s own merits in the process aren’t to be underestimated. Guitar Hero evolved out of more than two decades’ worth of Machover’s research at the junction of musicology and computer technology. As a young Juilliard-schooled master of musicology, Machover moved to Paris to become a composer-in-residence – and later the Music Research Director – of the Institut de Recherche et Coordination Acoustique/Musique (IRCAM), before returning to the States and occupying the Professor of Music and Media chair at MIT in 1985. During these first years at MIT, he developed a piece of technology that would later become the foundation for all of the principles behind Guitar Hero and Rock Band: the Hyperinstrument.
Machover built several musical instruments, such as a cello, a violin and a piano, with embedded touch sensors on their surface, and connected them to a PC. The software on this computer, written at Machover’s lab, would then pick up on the slightest variation in intonation and attack that a musician exerted over his instrument. This data allowed a computer-generated orchestra to accompany the musician’s work with an appropriate score. If, for instance, the computer noticed the musician was firing up his performance, the virtual orchestra would accentuate this further by layering on some percussion work; when the player toned it down, the orchestra would ease accordingly.
At first, the aim of Machover’s Hyperinstrument research was to offer virtuoso musicians an extra aid to improve their performances. But by the time Rigopulos and Egozy entered his lab as master’s students, the focus of the research had already broadened to a new and nobler goal: making music creation and appreciation more accessible for unskilled players.
The two students quickly got the message, says Machover. In 1994, as the subject for his master’s thesis, Rigopulos developed the Seed Music System, a piece of computer software that converted simple actions like button presses into complex musical sequences. That way, playing an instrument could be made so accessible that an unskilled musician would be able to play along with any song he liked.
“There is a large population of these people,” Machover notes. “They think music is important, but they don’t have the skills to play it. And you know why? It’s because, technologically speaking, the interface on which skilled musicians play – a musical instrument – is too primitive.”
With an unsophisticated musical interface, one action, like plucking a guitar string or pressing a piano key, results in only one musical note. Whoever wants to play a sequence of notes must know how to produce the exact sequence of corresponding actions. Together with friend, colleague and computer whiz Egozy, Rigopulos found the solution to that problem: an interface on which one action resulted in a string of notes instead of just one. All the player had to do was relay his musical intentions to the Seed Music System – from there, the intelligent software played sequences of notes according to pre-programmed rules of harmony, rhythm, melody and tempo.
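The core idea is easy to sketch in code: one gesture triggers a whole phrase, with the note choices delegated to pre-programmed rules. This is a minimal, invented illustration of that principle, not Harmonix’s actual software; the scale, the rules and all names here are assumptions.

```python
import random

# Illustrative "seed music" sketch: one simple action expands into a
# phrase of notes chosen by pre-programmed rules. (All names and rules
# here are invented for illustration, not the actual Seed Music System.)

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # one octave, MIDI note numbers

def expand_action(direction: str, length: int = 4) -> list[int]:
    """Map one gesture ('up' or 'down') to a short note sequence."""
    rng = random.Random(42)  # fixed seed so the example is deterministic
    start = rng.randrange(len(C_MAJOR) - length)
    notes = C_MAJOR[start:start + length]
    return notes if direction == "up" else list(reversed(notes))

print(expand_action("up"))    # an ascending scale fragment
print(expand_action("down"))  # the same fragment, descending
```

The point of the design is the asymmetry: the player supplies only an intention (a direction, a button press), and the harmonic and melodic detail is filled in by the software’s rules.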
At first, Rigopulos and Egozy tested the system with two joysticks. The player initiated a series of notes according to the direction in which he moved the left joystick, all the while making slight changes to the rhythm and harmony by using the right joystick. After that, the same system was tested again on a modified PC keyboard. “There was no talk of a plastic toy guitar just yet,” Machover says.
That idea came into play about 10 years later, when the two founded Harmonix in order to create music applications based on their MIT research. First, they built their inventions into some rather conventional music games for PC and PlayStation 2 – a few Karaoke games and a music game based on Sony’s EyeToy peripheral – which garnered mediocre sales. Then Rigopulos expanded on an idea that had been playing in his head for years: the guitar game controller.
On the surface, Guitar Hero is as simple as any of the other devices the duo had made under Machover’s tutelage: The player presses one or more colored buttons situated on the neck of the ersatz guitar while simultaneously strumming a simulated string on the body of the instrument. Then the game’s software translates that gesture into a recognizable rock guitar lick.
More fundamentally, however, Guitar Hero turned the principle of Rigopulos’ original Seed Music System upside down. The game requires the player to produce the right intentional, semi-musical actions at the right time in order to stay in line with the music, not the other way around. “The core software of Guitar Hero, based on the Seed Music System, breaks an existing piece of music down into several music sequences,” Machover explains. “It’s still a matter of simple actions resulting in elaborate music sequences, but the player has to press the right buttons.”
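That inversion is also easy to sketch: instead of expanding a gesture into music, the software reduces an existing song to timed, colored targets and checks the player’s presses against them. This is a hypothetical toy version of that matching logic, with invented names and an assumed timing window, not the game’s actual engine.

```python
# Illustrative note-matching sketch (invented, not Harmonix's engine):
# a song is reduced to timed "gems"; the player scores a hit when a
# pressed button matches the gem's color within a timing window.

WINDOW = 0.1  # seconds of allowed timing error (assumed value)

def score_presses(gems, presses, window=WINDOW):
    """gems and presses are lists of (time_sec, color). Returns hit count."""
    hits = 0
    for g_time, g_color in gems:
        for p_time, p_color in presses:
            if p_color == g_color and abs(p_time - g_time) <= window:
                hits += 1
                break  # each gem can be hit at most once
    return hits

gems = [(0.5, "green"), (1.0, "red"), (1.5, "yellow")]
presses = [(0.52, "green"), (1.3, "red"), (1.48, "yellow")]
print(score_presses(gems, presses))  # 2 hits: the red press comes too late
```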
After Rigopulos and Egozy released Guitar Hero at the end of 2005, things took off. The game was a sleeper hit, and by the time Guitar Hero II came to market in 2006, they had a multimillion-dollar franchise on their hands. In 2007, they caught the attention of media giant Viacom, which bought Harmonix for $175 million, making both of Machover’s former pupils wealthy men. That same year, they released Rock Band, a competitor to their own Guitar Hero franchise (the rights to which they had conceded to former publisher Activision) that added drums and vocals to the mix. To date, they have shipped more than 13 million units of Rock Band, gradually catching up with the 35 million copies their first invention has sold. The battle is still raging: This season, we’re seeing the likes of Guitar Hero 5, The Beatles: Rock Band, DJ Hero and Lego Rock Band hit store shelves.
While his former pupils are counting the cash, Machover hasn’t been sitting idle, either. With a new batch of students attending his lab, he’s forging new technological concepts that take the premise of his Hyperinstrument research a few miles further: After music appreciation, his focus is now on helping untrained musicians compose their own music and facilitating novice musicians along their learning curve.
There may be a new collaboration with his former master students underway as well. Harmonix is still based in Cambridge, Massachusetts, which is also the home of MIT, and the company shares ownership of two patents with Machover’s lab.
“We’ve kept in contact over the years,” says Machover. “Not as much as we’d like to, because of time constraints, but we still share ideas with each other. In fact, we’ve all decided to collaborate on new ideas in the near future. Right now, they’re too busy with their existing products; the videogame industry is moving very fast, and they’re required to keep the pace with their products. But the consumer will want something new soon, and we’ve got some great ideas and concepts lying around. Some of them build on stuff they were working on themselves 15 years ago.”
Rigopulos and Egozy have stated that they’re looking at Microsoft’s Project Natal motion camera technology for Rock Band 3 as a way to give players even more ways to convey musical intentions in a videogame. But Machover recalls that Egozy invented something way better himself nearly 15 years ago. By sheer coincidence, he stumbled upon a technology that could detect simple gestures from the amount of electricity that a human body absorbs. In fact, during his years at MIT, he even programmed an application that could read these gestures.
That wasn’t the objective of his research at the time – all he wanted to do was build a violin bow that could read the musician’s gestures via electrically charged sensors on the upper and lower sides of the instrument. But the results were erratic at best, and he quickly discovered why: The human body jams the readings by absorbing some of the electrical energy. The main line of research was eventually corrected, and the Las Vegas magician duo Penn & Teller even used one application of it in their act. But Machover and Egozy saw a second opportunity.
“Based on his initial mistake, we started a new line of research: How much electricity does your body absorb by conducting several kinds of actions?” Machover notes. “Eran used a new set of sensors to measure that and even built a simple musical application for it: a program that can transfer simple movements of the head into musical data.”
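In software terms, that kind of sensing reduces to mapping a raw absorption reading onto a musical control value. This is a speculative sketch of such a mapping; the sensor range, thresholds and function names are all invented for illustration and have nothing to do with Egozy’s actual program.

```python
# Hypothetical sketch: map a normalized body-capacitance reading (how
# much charge the body absorbs) to a MIDI pitch-bend value, 0..16383.
# The sensor range [lo, hi] is an assumed calibration, not a real spec.

def reading_to_pitch_bend(raw: float, lo: float = 0.2, hi: float = 0.8) -> int:
    """Clamp the reading to [lo, hi], then scale it linearly onto the
    14-bit MIDI pitch-bend range (8192 is the centered, no-bend value)."""
    clamped = min(max(raw, lo), hi)
    frac = (clamped - lo) / (hi - lo)
    return round(frac * 16383)

print(reading_to_pitch_bend(0.5))  # a mid-range reading -> near-center bend
```

A real application would stream such values continuously as the performer moves, turning a head tilt or a hand wave into a musical gesture.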
If they were to expand upon that line of research, they could build a music application that ditches the plastic instruments altogether. Could there be a Headbang Hero or Air Guitar Hero in the company’s future? “That would be kind of great, wouldn’t it?” says Machover.
Ronald Meeus rocks the nation of Belgium. He is waiting for your mail at email@example.com.