06-17-2018, 07:34 PM
Blog: Composing video game music for virtual reality – Part 3
<div style="margin: 5px 5% 10px 5%;"><img src="http://www.sickgaming.net/blog/wp-content/uploads/2018/06/blog-composing-video-game-music-for-virtual-reality-part-3.jpg" width="800" height="659" title="" alt="" /></div><div><p><strong><i><small> The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.<br />The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company. </small></i></strong> </p>
<hr />
<p><img alt="In this article for and about the craft of video game composers, Winifred Phillips is pictured in this photo working in her music production studio." src="http://www.sickgaming.net/blog/wp-content/uploads/2018/06/blog-composing-video-game-music-for-virtual-reality-part-3.jpg" /></p>
<p><strong>By <a href="http://www.winifredphillips.com/" target="_blank">Winifred Phillips</a></strong> | <a href="http://winifredphillips.com/contact.php" target="_blank"><em>Contact</em></a> | <a href="http://www.twitter.com/winphillips" target="_blank"><em>Follow</em></a></p>
<p>So happy you’ve joined us! I’m video game composer Winifred Phillips. Welcome back to our four-part discussion of the role that music plays in virtual reality video games! These articles are based on the presentation I gave at this year’s Game Developers Conference in San Francisco. My talk was entitled <a href="http://schedule.gdconf.com/session/music-in-virtual-reality/851396" target="_blank">Music in Virtual Reality</a> (I’ve included the official description of the talk at the end of this article). If you haven’t read the previous two articles, you’ll find them here:</p>
<p>During my GDC presentation, I focused on three important questions for VR video game composers:</p>
<ul>
<li>Do we compose our music in 3D or 2D?</li>
<li>Do we structure our music to be Diegetic or Non-Diegetic?</li>
<li>Do we focus our music on enhancing player Comfort or Performance?</li>
</ul>
<p>While attempting to answer these questions during my GDC talk, I discussed my work on four of my own VR game projects – the <a href="http://bebylon.world/" target="_blank">Bebylon: Battle Royale</a> arena combat game from Kite & Lightning, the <a href="http://www.dragonfront.com/" target="_blank">Dragon Front</a> strategy game from High Voltage Software, the <a href="https://www.oculus.com/experiences/gear-vr/1519842891421902/" target="_blank">Fail Factory</a> comedy game from Armature Studio, and the <a href="http://www.scrapernetwork.com/" target="_blank">Scraper: First Strike</a> shooter/RPG from Labrodex Inc.</p>
<p>In these articles, I’ve been sharing the discussions and conclusions that formed the basis of my GDC talk, including numerous examples from these four VR game projects. So now let’s look at the second of our three questions:</p>
<h2>Do we structure our music to be Diegetic or Non-Diegetic?</h2>
<p><img alt="In this article discussing popular VR issues for video game composers, Winifred Phillips explores an example from one of her game music composition projects - the Dragon Front VR strategy game." src="http://www.sickgaming.net/blog/wp-content/uploads/2018/06/blog-composing-video-game-music-for-virtual-reality-part-3-1.jpg" />Before we launch into this discussion, let’s revisit one of the examples from the previous article. You’ll remember that we took a look at the Main Theme music I composed for the popular Dragon Front VR strategy game, in order to examine how music can best transition from a traditionally 2D stereo delivery to a 3D positional implementation. So in this case, the big victorious anthem that I composed for Dragon Front makes its first appearance as a bombastic stereo mix directly piped into the player’s headphones, and then transitions smoothly to a spatially positioned environmental sound issuing from a small in-game radio. Just as a reminder, let’s take another look at that:</p>
<p>[embedded content]</p>
<p>In this example, we see how the Dragon Front theme music starts as traditional underscore (that is, a non-diegetic score), but then moves into the VR space and becomes a diegetic score – one that is understood to be present in the game world. And that brings us to the second of the three core debates at the heart of music in VR: should music in VR be diegetic or non-diegetic?</p>
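<p>For readers curious about the implementation side, here is a rough, engine-agnostic sketch in Python of how such a hand-off can be wired up. It is not taken from Dragon Front’s actual code or from any particular audio middleware (the names and numbers are hypothetical): the idea is simply an equal-power crossfade that moves the music from a head-locked stereo bus onto a distance-attenuated, world-positioned emitter such as the in-game radio.</p>
<pre><code>
from dataclasses import dataclass
import math

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def distance(a: Vec3, b: Vec3) -> float:
    return math.sqrt((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2)

def diegetic_gain(listener: Vec3, emitter: Vec3, max_radius: float = 20.0) -> float:
    """Simple linear distance attenuation for the in-world (3D) radio emitter."""
    return max(0.0, 1.0 - distance(listener, emitter) / max_radius)

def handoff_gains(t: float, fade_start: float, fade_len: float):
    """Equal-power crossfade from the head-locked stereo bus to the 3D emitter.
    Returns (stereo_gain, emitter_gain) at time t: before fade_start the music
    is purely non-diegetic, after fade_start + fade_len it is purely diegetic."""
    x = min(max((t - fade_start) / fade_len, 0.0), 1.0)
    return math.cos(x * math.pi / 2.0), math.sin(x * math.pi / 2.0)

# Example: two seconds into a three-second hand-off
stereo_gain, emitter_gain = handoff_gains(t=2.0, fade_start=0.0, fade_len=3.0)
print(f"2D stereo bus: {stereo_gain:.2f}  3D radio emitter: {emitter_gain:.2f}")

# The emitter's level would also scale with the listener's distance to the radio
print(f"radio gain nearby: {diegetic_gain(Vec3(0, 0, 0), Vec3(3, 0, 0)):.2f}")
</code></pre>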
<p>It’s a thorny issue. As we know, musical underscore is absolutely vital in gaming – it creates momentum, motivates players and adds emotional texture to the story and the characters. However, in VR, the idea of presence becomes paramount. We want players to feel like they are inside the fiction of an awesome VR world. So, when the non-diegetic music starts playing, we worry that players might stop and wonder, ‘where’s this music coming from? Why am I hearing it?’</p>
<p>The obvious solution is to make all of the music in the game diegetic – somehow, in this VR world, all music comes from in-game sources that players can see in the environment around them. Here’s an example from one of my VR projects – Bebylon: Battle Royale, from developers Kite & Lightning.</p>
<p><img alt="In this article exploring the craft of VR music for video game composers, Winifred Phillips discusses an example from one of her own VR projects - the Bebylon: Battle Royale game for the famous Oculus Rift VR platform." src="http://www.sickgaming.net/blog/wp-content/uploads/2018/06/blog-composing-video-game-music-for-virtual-reality-part-3-2.jpg" />Bebylon is a great example of a completely diegetic score in VR. The whole premise hinges on immortal babies battling it out in over-the-top arena fights in a futuristic setting. Music during gameplay is represented by a group of in-game baby musicians, so the music originates from that source, and we’re able to see this happening in the VR world. So, let’s take a look at that:</p>
<p>[embedded content]</p>
<p>Bebylon: Battle Royale proves that it’s possible to get away with a completely diegetic score, but we’d need really specific circumstances to justify it. Most games won’t be able to make this approach work. So, what then? I’ve found that there are three strategies to ease non-diegetic music into VR:</p>
<ul>
<li>Keep it subtle and gradual,</li>
<li>Keep it dry and warm, and</li>
<li>Keep it both inside and outside the VR world.</li>
</ul>
<p>So let’s start with the first strategy – subtle and gradual.</p>
<p><img alt="In this article about music for the popular VR platforms (by a video game composer for video game composers) Winifred Phillips describes her work on the Scraper VR shooter/RPG." src="http://www.sickgaming.net/blog/wp-content/uploads/2018/06/blog-composing-video-game-music-for-virtual-reality-part-3-3.jpg" />We’ve already discussed this technique in the first article in this series, when we took a look at the ambient music for Scraper, a first-person VR shooter set inside colossal skyscrapers in a futuristic city. Exploring the massive buildings in the Scraper fictional universe requires a musical soundtrack to set the tone, but introducing it so that it feels natural in VR is a challenge.</p>
<p>In order to address this problem, I composed the ambient music in Scraper so that it would come and go in subtle, gradual ways. As a technique for music implementation in VR, this can be an effective approach. Let’s take another look at what that was like in Scraper:</p>
<p>[embedded content]</p>
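<p>If you’re wondering what “subtle and gradual” might look like in practice, here’s a simplified sketch in Python. The timings and function names are illustrative placeholders rather than the values used in the shipping game: ambient cues are scheduled with stretches of silence between them, and each cue rises and falls on a long linear envelope instead of starting or stopping abruptly.</p>
<pre><code>
import random

def schedule_ambient_cues(total_seconds: float,
                          cue_length: float = 90.0,
                          fade_seconds: float = 15.0,
                          min_gap: float = 30.0,
                          max_gap: float = 120.0):
    """Yield (start, fade_in_end, fade_out_start, end) times for ambient cues,
    leaving stretches of silence between entrances."""
    t = random.uniform(min_gap, max_gap)  # open with silence, not music
    while total_seconds >= t + cue_length:
        yield (t, t + fade_seconds, t + cue_length - fade_seconds, t + cue_length)
        t += cue_length + random.uniform(min_gap, max_gap)

def cue_gain(now: float, start: float, fade_in_end: float,
             fade_out_start: float, end: float) -> float:
    """Long linear fade-in / sustain / fade-out envelope for one ambient cue."""
    if now > end or start > now:
        return 0.0                                    # silence outside the cue
    if fade_in_end > now:
        return (now - start) / (fade_in_end - start)  # gradual entrance
    if now > fade_out_start:
        return (end - now) / (end - fade_out_start)   # gradual exit
    return 1.0                                        # sustained body of the cue

for cue in schedule_ambient_cues(total_seconds=600.0):
    print("ambient cue (start, fade-in end, fade-out start, end):",
          [round(v, 1) for v in cue])
</code></pre>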
<p>While this technique works well for the ambient music, it wasn’t an option for combat. Battles in Scraper are pretty intense – the music begins with a bang and keeps on whaling away until the room is cleared of enemies. At the beginning of the project, we’d decided on a stereo music mix rather than spatialization – considering how important audio cues are to expert first-person-shooter players, we didn’t want a spatialized score to introduce any confusion. My job at that point was to figure out a way to set the stereo music mix apart from the VR world, so that the player wouldn’t wonder where the music was coming from.</p>
<p><img alt="An illustration for the famous 'proximity effect' in sound recording - in this article for video game composers, Winifred Phillips explores the role of music in VR." src="http://www.sickgaming.net/blog/wp-content/uploads/2018/06/blog-composing-video-game-music-for-virtual-reality-part-3-4.jpg" />From here, I started thinking about proximity effect – it’s a term relating to microphone recording. You’ll notice proximity effect when someone speaks into a mike while leaning very close to it. The voice starts sounding really bassy and warm in tone, and the mike picks up a lot of the dry source signal, with less of the room acoustics coming through. When you listen with headphones to a recording with lots of proximity effect, it tends to feel like it’s inside your head. I thought – great! If the music is in our heads, we’re not going to be looking around, wondering where it’s coming from.</p>
<p>I recorded the music for Scraper with fairly dry acoustics, and when I mixed the music, I focused on keeping the tone warm and bassy, with a solid low end and some rich mids in the EQ spectrum. Here’s an example of how that worked in combat sequences of the Scraper VR game:</p>
<p>[embedded content]</p>
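<p>To make the routing side of this idea concrete, here is a small sketch of how a “dry and warm” music bus might sit alongside the spatialized world audio. The bus names and settings below are hypothetical, not the actual Scraper mix: the point is that the music stays 2D, gets essentially no reverb send, and carries a gentle low-shelf boost, while the world sound effects remain positional and share the room’s acoustics.</p>
<pre><code>
from dataclasses import dataclass

@dataclass
class BusSettings:
    spatialized: bool         # True = positioned in the VR world (3D source)
    reverb_send_db: float     # very low values keep the signal dry
    low_shelf_gain_db: float  # gentle boost below ~200 Hz for a warm, bassy tone
    high_shelf_gain_db: float # trim the top end so the mix sits "inside the head"

# Head-locked, non-diegetic combat music: dry, warm, and clearly not "in the room"
combat_music_bus = BusSettings(
    spatialized=False,        # direct stereo mix piped straight to the headphones
    reverb_send_db=-96.0,     # effectively no room acoustics on the music
    low_shelf_gain_db=3.0,    # solid low end
    high_shelf_gain_db=-1.5,  # soften the highs slightly
)

# World sound effects: spatialized and sharing the room's reverb
world_sfx_bus = BusSettings(
    spatialized=True,
    reverb_send_db=-12.0,
    low_shelf_gain_db=0.0,
    high_shelf_gain_db=0.0,
)

print(combat_music_bus)
print(world_sfx_bus)
</code></pre>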
<p><img alt="The logo of the Fail Factory game for the popular VR platform -- in this article for video game composers, Winifred Phillips explores an example from one of her own VR music composition projects." src="http://www.sickgaming.net/blog/wp-content/uploads/2018/06/blog-composing-video-game-music-for-virtual-reality-part-3-5.jpg" />I also recorded the music of Fail Factory with dry acoustics and a warm, bassy mix – this effect is especially prevalent during the Fail Factory tutorial.</p>
<p>In the Fail Factory Tutorial, the instructor zips around on a hover craft while offering tips and guidelines. In those circumstances, having the music in a dry, warm mix allows it to feel closer to the player, and more separated from the spatialized sounds from the instructor. Let’s check that out:</p>
<p>[embedded content]</p>
<p>So now let’s look at another approach, which I’ve called ‘Inside and Outside.’ If music is 3D – if it’s spatialized – we’re more likely to think it actually exists inside the fictional world. If music is 2D – if it’s a direct stereo mix – we’ll be more likely to accept it as non-diegetic, as outside the experience.</p>
<p><img alt="A depiction of the official logo of the Dragon Front VR game -- in an article written for video game composers, Winifred Phillips (video game composer) explores the role of music in projects for VR projects." src="http://www.sickgaming.net/blog/wp-content/uploads/2018/06/blog-composing-video-game-music-for-virtual-reality-part-3-6.jpg" />Remember the example I showed earlier from Dragon Front – when the main theme music of the game transitioned into a spatialized music source coming from inside the VR space? This is an example of music making the jump from non-diegetic to diegetic, and that can help the player accept the presence of music as a part of the VR game. Watch how players can look around in the Dragon Front hub area, locate the source of the music, and actually turn it off if they want to:</p>
<p>[embedded content]</p>
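<p>As a final implementation-flavored aside, here is a small sketch of what an interactive diegetic source like that radio might look like in code. Again, the class and names are illustrative rather than Dragon Front’s actual implementation: the emitter is positioned in the world, attenuates with distance, and simply goes silent when the player switches it off.</p>
<pre><code>
from dataclasses import dataclass

@dataclass
class RadioEmitter:
    """A diegetic music source the player can walk up to and switch off."""
    position: tuple           # world-space position of the radio prop
    track: str                # the cue the radio is playing
    powered_on: bool = True
    volume: float = 1.0

    def interact(self) -> None:
        # Called when the player uses the radio: toggles the diegetic music
        self.powered_on = not self.powered_on

    def output_gain(self, listener_distance: float, max_radius: float = 15.0) -> float:
        # Distance-attenuated gain; silent once the radio has been switched off
        if not self.powered_on:
            return 0.0
        return self.volume * max(0.0, 1.0 - listener_distance / max_radius)

radio = RadioEmitter(position=(4.0, 0.0, -2.0), track="main_theme")
print(radio.output_gain(listener_distance=3.0))   # audible while it's on
radio.interact()                                  # the player switches it off
print(radio.output_gain(listener_distance=3.0))   # now silent
</code></pre>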
<p>So we’ve now discussed the second of the three important questions for video game composers creating music for VR games:</p>
<ul>
<li>Do we compose our music in 3D or 2D?</li>
<li><u><em>Do we structure our music to be Diegetic or Non-Diegetic?</em></u></li>
<li>Do we focus our music on enhancing player Comfort or Performance?</li>
</ul>
<p>We’ve contemplated what role our music should play in the VR experience – whether it should be considered a part of the fictional world or an outside commentary that shapes the player’s emotional experience. Both roles are valid, but the choice between them is especially meaningful within the context of VR. The next article will focus on the third of the three questions: whether music in VR should enhance player comfort or player performance. Thanks for reading, and please feel free to leave your comments in the space below!</p>
<hr />
<table border="0" cellpadding="2" cellspacing="2" summary="A synopsis of the GDC 2018 lecture by Winifred Phillips for video game composers.">
<tbody>
<tr>
<td> </p>
<h2><u><strong>Music in Virtual Reality (GDC 2018 Session)</strong></u></h2>
<p> <br />
</p>
<p><img alt="Illustration of the VR projects to be discussed in a GDC talk presented by Winifred Phillips for video game composers." src="http://www.sickgaming.net/blog/wp-content/uploads/2018/06/blog-composing-video-game-music-for-virtual-reality-part-3-7.jpg" /><em>This lecture presented ideas for creating a musical score that complements an immersive VR experience. Composer Winifred Phillips shared tips from several of her VR projects. Beginning with a historical overview of positional audio technologies, Phillips addressed several important problems facing composers in VR.</em></p>
<p> <br />
</p>
<p><em>Topics included 3D versus 2D music implementation, and the role of spatialized audio in a musical score for VR. The use of diegetic and non-diegetic music were explored, including methods that blur the distinction between the two categories.</em></p>
<p> <br />
</p>
<p><em>The discussion also included an examination of the VIMS phenomenon (Visually Induced Motion Sickness), and the role of music in alleviating its symptoms. Phillips’ talk offered techniques for composers and audio directors looking to utilize music in the most advantageous way within a VR project.</em></p>
<p> <br />
</p>
<p><em><u><strong>Takeaway</strong></u></em></p>
<p> <br />
</p>
<p><em>Through examples from several VR games, Phillips provided an analysis of music composition strategies that help music integrate successfully in a VR environment. The talk included concrete examples and practical advice that audience members can apply to their own games.</em></p>
<p> <br />
</p>
<p><em><u><strong>Intended Audience</strong></u></em></p>
<p> <br />
</p>
<p><em>This session provided composers and audio directors with strategies for designing music for VR. It included an overview of the history of positional sound and the VIMS problem (useful knowledge for designers.)</em></p>
<p> <br />
</p>
<p><em>The talk was intended to be approachable for all levels (advanced composers may better appreciate the specific composition techniques discussed).</em></p>
<p> </td>
<p> </tr>
</tbody>
</table>
<hr />
<p><img alt="Photo of Winifred Phillips in her video game composers music production studio." src="http://www.sickgaming.net/blog/wp-content/uploads/2018/06/blog-composing-video-game-music-for-virtual-reality-part-3.png" />Winifred Phillips is an award-winning video game music composer whose most recent projects are the triple-A first person shooter <em>Homefront: The Revolution</em> and the <em>Dragon Front</em> VR game for Oculus Rift. Her credits include games in five of the most famous and popular franchises in gaming: <em>Assassin’s Creed, LittleBigPlanet, Total War, God of War, </em>and<em> The Sims</em>. She is the author of the award-winning bestseller <a href="http://amzn.com/0262026643" target="_blank">A COMPOSER’S GUIDE TO GAME MUSIC</a>, published by the MIT Press. As a VR game music expert, she writes frequently on the future of music in virtual reality games.</p>
<p>Follow her on Twitter <a href="http://www.twitter.com/winphillips" target="_blank">@winphillips</a>.</p>
</div>
<div style="margin: 5px 5% 10px 5%;"><img src="http://www.sickgaming.net/blog/wp-content/uploads/2018/06/blog-composing-video-game-music-for-virtual-reality-part-3.jpg" width="800" height="659" title="" alt="" /></div><div><p><strong><i><small> The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.<br />The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company. </small></i></strong> </p>
<hr />
<p><img alt="In this article for and about the craft of video game composers, Winifred Phillips is pictured in this photo working in her music production studio." src="http://www.sickgaming.net/blog/wp-content/uploads/2018/06/blog-composing-video-game-music-for-virtual-reality-part-3.jpg" /></p>
<p><strong>By <a href="http://www.winifredphillips.com/" target="_blank">Winifred Phillips</a></strong> | <a href="http://winifredphillips.com/contact.php" target="_blank"><em>Contact</em></a> | <a href="http://www.twitter.com/winphillips" target="_blank"><em>Follow</em></a></p>
<p>So happy you’ve joined us! I’m videogame composer Winifred Phillips. Welcome back to our four part discussion of the role that music plays in Virtual Reality video games! These articles are based on the presentation I gave at this year’s gathering of the famous Game Developer’s Conference in San Francisco. My talk was entitled <a href="http://schedule.gdconf.com/session/music-in-virtual-reality/851396" target="_blank">Music in Virtual Reality</a> (I’ve included the official description of my talk at this end of this article). If you haven’t read the previous two articles, you’ll find them here:</p>
<p>During my GDC presentation, I focused on three important questions for VR video game composers:</p>
<ul>
<li>Do we compose our music in 3D or 2D?</li>
<p> </p>
<li>Do we structure our music to be Diegetic or Non-Diegetic?</li>
<p> </p>
<li>Do we focus our music on enhancing player Comfort or Performance?</li>
<p>
</ul>
<p>While attempting to answer these questions during my GDC talk, I discussed my work on four of my own VR game projects – the <a href="http://bebylon.world/" target="_blank">Bebylon: Battle Royale</a> arena combat game from Kite & Lightning, the <a href="http://www.dragonfront.com/" target="_blank">Dragon Front</a> strategy game from High Voltage Software, the <a href="https://www.oculus.com/experiences/gear-vr/1519842891421902/" target="_blank">Fail Factory</a> comedy game from Armature Studio, and the <a href="http://www.scrapernetwork.com/" target="_blank">Scraper: First Strike</a> shooter/RPG from Labrodex Inc.</p>
<p>In these articles, I’ve been sharing the discussions and conclusions that formed the basis of my GDC talk, including numerous examples from these four VR game projects. So now let’s look at the second of our three questions:</p>
<h2>Do we structure our music to be Diegetic or Non-Diegetic?</h2>
<p><img alt="In this article discussing popular VR issues for video game composers, Winifred Phillips explores an example from one of her game music composition projects - the Dragon Front VR strategy game." src="http://www.sickgaming.net/blog/wp-content/uploads/2018/06/blog-composing-video-game-music-for-virtual-reality-part-3-1.jpg" />Before we launch into this discussion, let’s revisit one of the examples from the previous article. You’ll remember that we took a look at the Main Theme music I composed for the popular Dragon Front VR strategy game, in order to examine how music can best transition from a traditionally 2D stereo delivery to a 3D positional implementation. So in this case, the big victorious anthem that I composed for Dragon Front makes its first appearance as a bombastic stereo mix directly piped into the player’s headphones, and then transitions smoothly to a spatially positioned environmental sound issuing from a small in-game radio. Just as a reminder, let’s take another look at that:</p>
<p>[embedded content]</p>
<p>In this example, we see how the Dragon Front theme music starts as traditional underscore (that is, a non-diegetic score), but then moves into the VR space and becomes a diegetic score – one that is understood to be present in the game world. And that brings us to the second of the three core debates at the heart of music in VR: should music in VR be diegetic or non-diegetic?</p>
<p>It’s a thorny issue. As we know, musical underscore is absolutely vital in gaming – it creates momentum, motivates players and adds emotional texture to the story and the characters. However, in VR, the idea of presence becomes paramount. We want players to feel like they are inside the fiction of an awesome VR world. So, when the non-diegetic music starts playing, we worry that players might stop and wonder, ‘where’s this music coming from? Why am I hearing it?’</p>
<p>The obvious solution is to make all of the music in the game diegetic – somehow, in this VR world, all music comes from in-game sources that players can see in the environment around them. Here’s an example from one of my VR projects – Bebylon: Battle Royale, from developers Kite & Lightning.</p>
<p><img alt="In this article exploring the craft of VR music for video game composers, Winifred Phillips discusses an example from one of her own VR projects - the Bebylon: Battle Royale game for the famous Oculus Rift VR platform." src="http://www.sickgaming.net/blog/wp-content/uploads/2018/06/blog-composing-video-game-music-for-virtual-reality-part-3-2.jpg" />Bebylon is a great example of a completely diegetic score in VR. The whole premise hinges on immortal babies battling it out in over-the-top arena fights in a futuristic setting. Music during gameplay is represented by a group of in-game baby musicians, so the music originates from that source, and we’re able to see this happening in the VR world. So, let’s take a look at that:</p>
<p>[embedded content]</p>
<p>Bebylon: Battle Royale proves that its possible to get away with a completely diegetic score, but we’d need really specific circumstances to justify it. Most games won’t be able to make this approach work. So, what then? I’ve found that there are three strategies to ease non-diegetic music into VR:</p>
<ul>
<li>Keep it subtle and gradual,</li>
<p> </p>
<li>Keep it dry and warm, and</li>
<p> </p>
<li>Keep it both inside and outside the VR world.</li>
<p>
</ul>
<p>So let’s start with the first strategy – subtle and gradual.</p>
<p><img alt="In this article about music for the popular VR platforms (by a video game composer for video game composers) Winifred Phillips describes her work on the Scraper VR shooter/RPG." src="http://www.sickgaming.net/blog/wp-content/uploads/2018/06/blog-composing-video-game-music-for-virtual-reality-part-3-3.jpg" />We’ve already discussed this technique in the first article in this series, when we took a look at the ambient music for Scraper, a first-person VR shooter set inside colossal skyscrapers in a futuristic city. Exploring the massive buildings in the Scraper fictional universe requires a musical soundtrack to set the tone, but introducing it so that it feels natural in VR is a challenge.</p>
<p>In order to address this problem, I composed the ambient music in Scraper so that it would come and go in subtle, gradual ways. As a technique for music implementation in VR, this can be an effective approach. Let’s take another look at what that was like in Scraper:</p>
<p>[embedded content]</p>
<p>While this technique works well for the ambient music, it wasn’t an option for combat. Battles in Scraper are pretty intense – the music begins with a bang and keeps on whaling away until the room is cleared of enemies. At the beginning of the project, we’d decided on a stereo music mix rather than spatialization – considering how important audio cues are to expert first-person-shooter players, we didn’t want a spatialized score to introduce any confusion. My job at that point was to figure out a way to delineate the stereo music mix from the VR world so that the player wouldn’t wonder where the music was coming from.</p>
<p><img alt="An illustration for the famous 'proximity effect' in sound recording - in this article for video game composers, Winifred Phillips explores the role of music in VR." src="http://www.sickgaming.net/blog/wp-content/uploads/2018/06/blog-composing-video-game-music-for-virtual-reality-part-3-4.jpg" />From here, I started thinking about proximity effect – it’s a term relating to microphone recording. You’ll notice proximity effect when someone speaks into a mike while leaning very close to it. The voice starts sounding really bassy and warm in tone, and the mike picks up a lot of the dry source signal, with less of the room acoustics coming through. When you listen with headphones to a recording with lots of proximity effect, it tends to feel like it’s inside your head. I thought – great! If the music is in our heads, we’re not going to be looking around, wondering where it’s coming from.</p>
<p>I recorded the music for Scraper with fairly dry acoustics, and when I mixed the music, I focused on keeping the tone warm and bassy, with a solid low end and some rich mids in the EQ spectrum. Here’s an example of how that worked in combat sequences of the Scraper VR game:</p>
<p>[embedded content]</p>
<p><img alt="The logo of the Fail Factory game for the popular VR platform -- in this article for video game composers, Winifred Phillips explores an example from one of her own VR music composition projects." src="http://www.sickgaming.net/blog/wp-content/uploads/2018/06/blog-composing-video-game-music-for-virtual-reality-part-3-5.jpg" />I also recorded the music of Fail Factory with dry acoustics and a warm, bassy mix – this effect is especially prevalent during the Fail Factory tutorial.</p>
<p>In the Fail Factory Tutorial, the instructor zips around on a hover craft while offering tips and guidelines. In those circumstances, having the music in a dry, warm mix allows it to feel closer to the player, and more separated from the spatialized sounds from the instructor. Let’s check that out:</p>
<p>[embedded content]</p>
<p>So now let’s look at another approach, which I’ve called ‘Inside and Outside.’ If music is 3D – if it’s spatialized – we’re more likely to think it actually exists inside the fictional world. If music is 2D – if it’s a direct stereo mix – we’ll be more likely to accept it as non-diegetic, as outside the experience.</p>
<p><img alt="A depiction of the official logo of the Dragon Front VR game -- in an article written for video game composers, Winifred Phillips (video game composer) explores the role of music in projects for VR projects." src="http://www.sickgaming.net/blog/wp-content/uploads/2018/06/blog-composing-video-game-music-for-virtual-reality-part-3-6.jpg" />Remember the example I showed earlier from Dragon Front – when the main theme music of the game transitioned into a spatialized music source coming from inside the VR space? This is an example of music making the jump from non-diegetic to diegetic, and that can help the player accept the presence of music as a part of the VR game. Watch how players can look around in the Dragon Front hub area, locate the source of the music, and actually turn it off if they want to:</p>
<p>[embedded content]</p>
<p>So we’ve now discussed the second of the three important questions for video game composers creating music for VR games:</p>
<ul>
<li>Do we compose our music in 3D or 2D?</li>
<p> </p>
<li><u><em>Do we structure our music to be Diegetic or Non-Diegetic?</em></u></li>
<p> </p>
<li>Do we focus our music on enhancing player Comfort or Performance?</li>
<p>
</ul>
<p>We’ve contemplated what role our music should play in the VR experience – whether it should be considered a part of the fictional world or an outside commentary that shapes the player’s emotional experience. Both roles are valid, but the choice between them is especially meaningful within the context of VR. The next article will focus on the third of the three questions: whether music in VR should enhance player comfort or player performance. Thanks for reading, and please feel free to leave your comments in the space below!</p>
<hr />
<table border="0" cellpadding="2" cellspacing="2" summary="A synopsis of the GDC 2018 lecture by Winifred Phillips for video game composers.">
<tbody>
<tr>
<td> </p>
<h2><u><strong>Music in Virtual Reality (GDC 2018 Session)</strong></u></h2>
<p> <br />
</p>
<p><img alt="Illustration of the VR projects to be discussed in a GDC talk presented by Winifred Phillips for video game composers." src="http://www.sickgaming.net/blog/wp-content/uploads/2018/06/blog-composing-video-game-music-for-virtual-reality-part-3-7.jpg" /><em>This lecture presented ideas for creating a musical score that complements an immersive VR experience. Composer Winifred Phillips shared tips from several of her VR projects. Beginning with a historical overview of positional audio technologies, Phillips addressed several important problems facing composers in VR.</em></p>
<p> <br />
</p>
<p><em>Topics included 3D versus 2D music implementation, and the role of spatialized audio in a musical score for VR. The use of diegetic and non-diegetic music were explored, including methods that blur the distinction between the two categories.</em></p>
<p> <br />
</p>
<p><em>The discussion also included an examination of the VIMS phenomenon (Visually Induced Motion Sickness), and the role of music in alleviating its symptoms. Phillips’ talk offered techniques for composers and audio directors looking to utilize music in the most advantageous way within a VR project.</em></p>
<p> <br />
</p>
<p><em><u><strong>Takeaway</strong></u></em></p>
<p> <br />
</p>
<p><em>Through examples from several VR games, Phillips provided an analysis of music composition strategies that help music integrate successfully in a VR environment. The talk included concrete examples and practical advice that audience members can apply to their own games.</em></p>
<p> <br />
</p>
<p><em><u><strong>Intended Audience</strong></u></em></p>
<p> <br />
</p>
<p><em>This session provided composers and audio directors with strategies for designing music for VR. It included an overview of the history of positional sound and the VIMS problem (useful knowledge for designers.)</em></p>
<p> <br />
</p>
<p><em>The talk was intended to be approachable for all levels (advanced composers may better appreciate the specific composition techniques discussed).</em></p>
<p> </td>
<p> </tr>
</tbody>
</table>
<h2> </h2>
<hr />
<p><img alt="Photo of Winifred Phillips in her video game composers music production studio." src="http://www.sickgaming.net/blog/wp-content/uploads/2018/06/blog-composing-video-game-music-for-virtual-reality-part-3.png" />Winifred Phillips is an award-winning video game music composer whose most recent projects are the triple-A first person shooter <em>Homefront: The Revolution</em> and the <em>Dragon Front</em> VR game for Oculus Rift. Her credits include games in five of the most famous and popular franchises in gaming: <em>Assassin’s Creed, LittleBigPlanet, Total War, God of War, </em>and<em> The Sims</em>. She is the author of the award-winning bestseller <a href="http://amzn.com/0262026643" target="_blank">A COMPOSER’S GUIDE TO GAME MUSIC</a>, published by the MIT Press. As a VR game music expert, she writes frequently on the future of music in virtual reality games.</p>
<p>Follow her on Twitter <a href="http://www.twitter.com/winphillips" target="_blank">@winphillips</a>.</p>
</div>