
Grammy Award-winning composer Winifred Phillips is one of the most successful and accomplished composers for video games, known for her compositional skill and stylistic diversity. Her dramatic works convey an “epic cinematic sound” (ScreenSounds), while her lighthearted scores are “captivating” (Washington Post). She has composed iconic music for eight major gaming franchises: Assassin’s Creed, God of War, Jurassic World, Lineage, LittleBigPlanet, The Sims, Total War, and Wizardry.
Phillips won a GRAMMY Award for Wizardry: Proving Grounds of the Mad Overlord and received a BAFTA nomination for Sackboy: A Big Adventure. Her scores have earned the D.I.C.E. Award for Outstanding Music Composition, six Game Audio Network Guild Awards, and four Hollywood Music in Media Awards. Beyond games, her music has been featured in trailers for Avengers: Endgame and performed worldwide as part of the Assassin’s Creed Symphony World Tour. She is the author of the bestselling book A Composer’s Guide to Game Music (MIT Press).
You’ve composed for more than 50 video games — from God of War to LittleBigPlanet and Wizardry. What first sparked your interest in music composition, and how did that path lead you into the world of video games?
I’ve been passionate about composing music for as long as I can remember! My first gig as a media composer was at National Public Radio, where I worked as the composer and sound designer for a series of dramas called Radio Tales. I had an awesome time working on that series. It focused on adaptations of literary classics like The Tell-Tale Heart, The War of the Worlds, 20,000 Leagues Under the Sea, The Odyssey, The Yellow Wallpaper, Gulliver’s Travels, and so many more. Each episode featured wall-to-wall music and sound design, so it kept me very busy. By the end, the series had over 100 episodes, and we’d won multiple awards.
Eventually, I had to think about what would come next. Having spent so many years creating music for larger-than-life stories, I’d grown to love the epic scope and grandeur of that kind of storytelling. I’d been a gamer since childhood, but it hadn’t occurred to me that music for games could become a career until after Radio Tales wrapped. Once the idea came to me, it made perfect sense, especially considering the kind of music composition work I’d been doing up to that point. I started reaching out to video game publishers and developers, introducing myself and hoping to make connections. I happened to connect with a Sony Interactive Entertainment music producer at just the right time — they were assembling a composer team for the very first God of War game. Since I already had experience composing music for large-scale epic narratives, I could demonstrate my skills, and they brought me on as a member of the composer team for that project.
When you begin working on a new game, what’s your creative process like? Where do you start shaping the emotional sound of that world?
Composing for games is unique. The composer must think about players constantly during the creative process. Their behavior and choices are central to the experience, and every element of the game — including the music — exists to support and accentuate players’ personal agency and freedom.
My creative process starts with the music design, which is closely linked to the gameplay design. Will the music adapt and change based on player choices? If so, how extensive will those changes be? Depending on the ambition of the interactivity design, a composer often has to think of music as modular —able to be broken down, reassembled, and recombined in multiple ways. I start by understanding the plan for musical interactivity, then channel my creative energies through this structural framework to create an emotional journey for players. Will the music build in intensity through its modular components? Can thematic development and leitmotifs be integrated?
Once I answer these questions, I dive into research, examining game design documents to understand the core experience the developers are creating. I then expand this research to cover any additional knowledge or tools I may need. For some projects, like God of War, Assassin’s Creed Liberation, and Wizardry, the narrative includes a ton of historical or mythological elements, so my research focuses on musical techniques, instruments, and styles that bring those periods to life. For more contemporary or futuristic projects, I explore genres suited to the game’s setting, characters, or gameplay mechanics. I also dive into background research that can help me envision the game world — this kind of exploration fuels my creativity and keeps me energized while I work.
Finally, I begin composing and recording tracks. This is where the fun begins. At this stage, my focus is on staying true to the music design and creative vision while channeling all my creativity into crafting a musical world that supports and enhances the game.
Video games offer a unique challenge for composers — the music must evolve and respond to player choices. What do you love most about composing for interactive worlds compared to film or television?
Composing interactive music is endlessly fascinating. Since the nature of musical interactivity in games evolves constantly, my creative process must adjust to the changes. It’s a cycle of reinvention that keeps my work fresh as I move from project to project.
Storytelling in traditional linear media is well-established, with long-standing methods and accepted practices. Music for video games, however, constantly reinvents itself to keep pace with the rapidly changing structure of the games. I’m challenged repeatedly to find new ways to solve musical problems tied to gameplay implementation. In many ways, music composition for games is much more difficult than for film and television — but that challenge forces me to innovate. There’s never a dull moment, and I value the chance to redefine myself and my work with each project.
Your music can range from the sweeping orchestral intensity of God of War to the joyful energy of Sackboy: A Big Adventure. Where do you draw inspiration and how do you approach switching between such dramatically different tones and styles?
I’ve been fortunate to compose for projects with wide stylistic range. I recognize that not all composers are afforded this opportunity, and I think it may be due to how my game composition career began. Around the time I joined the God of War team, I was also hired by High Voltage Software to compose the music for the Charlie and the Chocolate Factory game, a tie-in with the Tim Burton film. The music needed to capture that iconic gothic whimsy associated with Tim Burton projects.
God of War and Charlie and the Chocolate Factory released within four months of each other, and as the first two games on my resume, they couldn’t have been more different. They both helped define my style in opposite directions and opened doors to similar future projects. I’ve enjoyed exploring the darker textures in franchises like God of War, Jurassic World, Lineage, Assassin’s Creed, and Total War, while also moving into lighter projects like LittleBigPlanet, The Sims, Spore, and tie-ins like Shrek the Third and Speed Racer. Since my days composing for Radio Tales, I’ve moved between comedy and tragedy — scoring lively comedies like Gulliver’s Travels and darker pieces like The Fall of the House of Usher. Switching between bright and dark projects feels very natural to me.
Wizardry: Proving Grounds of the Mad Overlord earned you multiple awards, including a GRAMMY. How did you approach reimagining a legendary RPG soundtrack for modern players?
Wizardry was thrilling because I wasn’t reimagining anything; I was hired to compose a brand-new soundtrack. The developers at Digital Eclipse faced a complex and ambitious challenge: how could the original 1981 Wizardry game be adapted using Unreal Engine to appeal to modern players? Digital Eclipse approaches adaptations with immense respect for source material, aiming to bring classic games to new audiences while staying faithful to the original experience.
Complicating things, the original game had no musical score — just a little beep when players bumped into a wall on the Apple II computer. Later adaptations added gameplay, environments, and music, but those changes came years later. Digital Eclipse was bringing the 1981 game to modern players, so they focused on staying true to the original content while adding new elements to enhance the experience for a contemporary audience.
My job was to create music that evoked RPG nostalgia while delivering visceral drama, honoring the tabletop roleplaying roots of Wizardry while intensifying the medieval adventure with orchestral depth. I’m immensely proud that my score for Digital Eclipse’s version of Wizardry won a GRAMMY this year — it was the culmination of a dream project.
Technology has changed how music is made, from adaptive scores to spatial audio. How do you see innovation shaping the next generation of game music?
It’s hard to imagine the next big leap in game music technology. I wouldn’t have foreseen the innovations that came along in the last decade. Virtual reality (VR) hasn’t revolutionized gameplay, but it certainly accelerated audio technology development and sped up the process of innovation.
In spatial audio, the technology behind binaural recording dates back to the 1800s, while ambisonics has been around since the 1970s. Neither approach initially gained traction in game development, which relied heavily on surround sound for many years. Surround sound is helpful for simple positional audio, but it’s limited to a horizontal ring enfolding the listener. The introduction of the Oculus Rift highlighted this limitation, as the lack of vertical positioning became a real problem. Binaural and ambisonic technologies offered a solution, allowing audio to be positioned in a full sphere around the listener. Even though VR hasn’t become as widespread as expected, the more sophisticated audio techniques developed for VR can be applied to traditional games — and these technological steps fascinate me. I’m excited to see what’s next in game music and sound.
I’m also watching MIDI 2.0 and MIDI Polyphonic Expression technology, which promise more expressive and flexible tools for composers. Game audio implementation continues to evolve through new versions of game audio engines like Wwise, FMOD, and Elias, so I’m interested to see how these software packages shape the future of game music.
You’ve achieved so much across multiple mediums, including video games, film, TV, and live performance. When players hear your music years from now, what do you hope they feel or remember most?
I think of my work primarily as a game composer. By nature, the video game experience is often ephemeral. Games are dependent on their platforms. Many games designed for early consoles and computers have become nearly impossible to access now, as their associated hardware falls into obsolescence. That may be why remakes like Wizardry have become so popular, since they bring back games that had receded into obscurity. Nevertheless, many games never have the privilege of being remade and rediscovered.
As a game composer, I know that while the games I’m working on today might disappear, the music has the potential to live on. Music rekindles memories better than almost anything else, and I hope that when players hear my music, it helps them relive the memories of their own video game adventures.