Posted on Leave a comment

Blog: VR music tips for the game composer

The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.



By video game composer Winifred Phillips | Contact | Follow

The Game Developers Conference is always an awesome opportunity for game audio experts to learn and share experiences. I’ve given presentations at GDC for a few years now, and I’m always excited to hear about what’s new and notable in game audio. This year, the hot topic was virtual reality. In fact, the subject received its own dedicated sub-conference that took place concurrently with the main GDC show. The VRDC (Virtual Reality Developers Conference) didn’t focus particularly on the audio and music side of VR, but there were a couple of notable talks on that subject. In this article, let’s take a look at some of the more intriguing VR game music takeaways from those two talks. Along the way, I’ll also share some of my related experience as the composer of the music of the Dragon Front VR game for the Oculus Rift.

Inside and outside

The talks we’ll be discussing in this article are entitled “Audio Adventures in VR Worlds” and “The Sound Design of Star Wars: Battlefront VR.” Here’s a common issue that popped up in both talks:

Where should video game music be in a VR game? Should it feel like it exists inside the VR world, weaving itself into the immersive 3D atmosphere surrounding the player? Or should it feel like it’s somehow outside of the VR environment and is instead coasting on top of the experience, being conveyed directly to the player? The former approach suggests a spacious and expansive musical soundscape, and the latter would feel much closer and more personal. Is one of these approaches more effective in VR than the other? Which choice is best?

These two concepts share a lot in common with the traditional categories of diegetic and non-diegetic music in entertainment media. Diegetic music exists inside the fictional world, perceived by the characters within it, whereas non-diegetic music is inaudible to the characters and only exists for the benefit of the audience. VR presents an interesting twist to this usually straightforward dichotomy. When the entertainment experience is doing everything in its power to make us forget that we’re an audience, to the point where we achieve a sense of complete presence within the fictional world… what role does non-diegetic music play then?  If we can now consider ourselves as characters in the story, how do we hear music that story characters aren’t supposed to hear?

“VR goes beyond picture sync. It’s about sync of the world,” says music producer Joe Thwaites of Sony Interactive Entertainment Europe. In his talk about the music and sound of the game PlayStation VR Worlds, Thwaites explores the relationship between music and the VR environment. “The congruency between audio and visuals is key in maintaining that idea of believability,” Thwaites asserts, “which in turn makes immersiveness, and in turn makes presence.” In virtual reality development, the term ‘presence’ denotes the sensation of actually existing inside the virtual environment. According to Thwaites, a strong believable relationship between the aural and visual worlds can contribute to a more satisfying VR experience.

Music inside the world

As an example, Thwaites describes an interactive music implementation that he integrated into the ‘Ocean Descent’ section of PlayStation VR Worlds. In this portion of the game, Thwaites pulled the otherwise non-diegetic musical score more fully into the immersive world by creating an illusion that the in-game objects were reacting to the musical notes. “There’s a part called The Jellyfish Cave, where you descend into this sea of jellyfish,” Thwaites describes. “You get this 2D music,” he adds, “which bypasses the 3D audio plugin, so it goes straight to your ears.” In other words, the music is recorded in a traditional stereo mix and the output is fed directly to the player’s headphones without any spatial positioning in the virtual world. “Then, as you look around, these jellyfish light up as you look directly at them,” Thwaites goes on, “and they emit a tone in 3D in space so the music tone stays where it is in the world.” So, these tones have been attached to specific jellyfish in the virtual world, spatially positioned to emanate from those locations, as if special portions of the non-diegetic score have suddenly leapt into the VR world and taken up residence there. “And that has this really nice effect of creating this really immersive and magical moment which is really unique to VR,” Thwaites remarks.
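The actual implementation in PlayStation VR Worlds isn’t public, but the gaze-triggered part of the jellyfish moment can be sketched in a few lines: test whether the player is looking at an object by comparing the gaze direction against the direction to that object, then spawn a spatialized tone when the test passes. Everything here (the Vec3 type, the isGazedAt function, the cone threshold) is hypothetical illustration, not Sony’s code.

```cpp
#include <cmath>

// Minimal 3D vector for the sketch.
struct Vec3 { float x, y, z; };

static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

static float dot(Vec3 a, Vec3 b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// True when the player's gaze points at the object within a small cone.
// cosThreshold close to 1.0 means a narrow cone (e.g. 0.95 is roughly 18 degrees).
bool isGazedAt(Vec3 gazeDir, Vec3 playerPos, Vec3 objectPos, float cosThreshold) {
    Vec3 toObject = normalize({objectPos.x - playerPos.x,
                               objectPos.y - playerPos.y,
                               objectPos.z - playerPos.z});
    return dot(normalize(gazeDir), toObject) >= cosThreshold;
}
```

Each frame, when isGazedAt first becomes true for a given jellyfish, the game would trigger a one-shot musical tone through the 3D audio path at that jellyfish’s position, while the stereo bed keeps playing on the direct 2D output.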

So this method served to help non-diegetic music feel more natural within the VR environment. But what happens when pure non-diegetic music is an absolute necessity?

Music outside the world

In the game Star Wars Battlefront Rogue One X-Wing VR Mission, the audio team at Criterion Games were tasked with creating an authentic audio experience in a virtual reality environment dedicated to the eternally famous and popular Star Wars franchise. In this case, according to audio lead Jay Steen, pure non-diegetic music was a must. “Non-diegetic means not from a source in the scene. This is how most movies and flatscreen games handle the music. So the music plays through the direct out straight to the player’s ears and we were worried from what we’d heard about non-diegetic music that it would distract from immersion,” Steen confesses. “But we actually found the opposite. Maybe that’s because you can’t have a Star Wars story without the music. You don’t feel like you’re in Star Wars until the music kicks in.” According to Steen, the non-diegetic music worked in this circumstance because the audio team was careful to avoid repetition in the musical score. “We didn’t reuse or loop cues that much, and due to the linear structure of the mission we could kind of get away with this,” Steen points out. “We think that helps to not break immersion.”

My perspective on using non-diegetic music in VR:

Sometimes non-diegetic music can be introduced into a VR game, and then quickly transformed into diegetic music within the immersive environment in order to enhance player presence. In my musical score for the Dragon Front game for Oculus Rift, I composed a dramatic choral track for the opening main theme of the game. During the game’s initial logo sequence, the music is channeled directly to the player’s ears without any spatial positioning. However, this changes as soon as the player fully enters the initial environment (wherein the player navigates menus and prepares to enter matches). Once the logo sequence has completed, the music makes a quick transition, from a full-bodied direct stereo mix to the player’s headphones, to a spatially localized narrow mix located to the player’s lower right. Upon turning, players see that the music is now coming from a battered radio, which the player is free to turn on and off. The music is now fully diegetic, existing inside the game’s fictional world. Here’s a video showing this sequence in action:

[embedded content]
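A transition like the Dragon Front radio handoff is, at its core, a short crossfade between two output paths: the direct 2D stereo bus fades out while a spatialized 3D emitter at the radio’s position fades in. This is a minimal sketch under my own assumptions (the MixGains struct and transitionGains function are hypothetical names, and the equal-power curve is one common choice, not a description of the shipped mix):

```cpp
#include <cmath>
#include <algorithm>

// Gains applied to the same music track on two routes:
// the direct 2D stereo output and a 3D positional emitter (the radio).
struct MixGains { float direct2D; float spatial3D; };

// Equal-power crossfade: t runs from 0 (fully 2D, logo sequence)
// to 1 (fully 3D, music localized at the in-world radio).
MixGains transitionGains(float t) {
    t = std::clamp(t, 0.0f, 1.0f);
    const float halfPi = 1.57079632679f;
    return { std::cos(t * halfPi),   // 2D bed fades out
             std::sin(t * halfPi) }; // 3D emitter fades in
}
```

The equal-power curve keeps the perceived loudness roughly constant through the handoff, which helps the switch read as the music “moving into” the world rather than dipping out and back in.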

Music inside and outside

While non-diegetic music can be tricky in VR, sometimes it’s an important part of the overall aesthetic. Plus, there can be ways to integrate non-diegetic music into the spatial environment. Joe Thwaites of Sony Europe describes an interesting combination of diegetic and non-diegetic music that was integrated into the ‘VR Luge’ section of the PlayStation VR Worlds game. In this gameplay sequence, players ride feet-first on a luge that’s racing downhill amidst heavy vehicle traffic. The experience was designed to be a heart-stopping thrill ride. “So one of the experiments we did around the synchronization of the world was using a combination of diegetic and non-diegetic music to build tension as you zoomed down the hill,” Thwaites describes. “We used 3D car radios to introduce elements of percussion into the 2D soundtrack that was playing.” In the musical score for this sequence, the non-diegetic music presented a purely percussive rhythm, but as the player passed by other cars, the music would change. “So as you passed a car with a radio playing, an element of that 3D music would transition from the car into the 2D soundtrack.” In this way, the in-game radio music would briefly become a part of the game’s non-diegetic score, while still conveying spatial positioning inside the 3D world.
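One plausible way to drive that car-radio handoff is by proximity: a stem playing on a car’s 3D radio gets progressively folded into the 2D soundtrack as the luge approaches, then released again as the car falls behind. The sketch below is my own guess at the mechanism, not Sony’s implementation; stemBlendToward2D and the distance thresholds are hypothetical.

```cpp
#include <algorithm>

// Blend weight for one radio stem, driven by the player's distance to
// that car. Far away, the stem plays fully on the 3D radio (returns 0);
// up close, it is fully part of the non-diegetic 2D bed (returns 1);
// in between, it is split across both routes.
float stemBlendToward2D(float distance, float nearDist, float farDist) {
    if (farDist <= nearDist)                 // degenerate thresholds
        return distance <= nearDist ? 1.0f : 0.0f;
    float t = (farDist - distance) / (farDist - nearDist);
    return std::clamp(t, 0.0f, 1.0f);
}
```

Because the blend is continuous, the percussion element never pops between routes; it audibly slides from a point in the world into the score and back, which matches Thwaites’ description of elements transitioning from the car into the 2D soundtrack.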

So in these examples from PlayStation VR Worlds and Star Wars Battlefront Rogue One X-Wing VR Mission, we see that audio teams grapple constantly with the contrasting natures of diegetic and non-diegetic music. While it seems as though non-diegetic music has been relegated to a very traditional, non-spatially localized delivery, this may not always be the case. Jay Steen of Criterion Games spent some time considering the possibility of delivering the non-diegetic music of his Star Wars game with a more enveloping spatial texture. “We did do a quick experiment on it, and we found that it’s like having an orchestra sitting around you,” Steen says. “We didn’t want to evoke you sitting in the middle of an orchestral recording. We just wanted it to sound like the movie.” That being said, Steen doesn’t rule out the possibility of a more spatially-interesting mix for music in the future, including the use of ambisonic recordings for non-diegetic musical scores. “Ambisonic recordings of orchestras for example,” Steen speculates, “I think there’s something fun there. We haven’t experimented with it any more than that, but yeah, definitely, we’d want to try.”

Conclusion

So this concludes our look at two presentations from GDC 2017 that focused on issues that complicate music creation and implementation in virtual reality. I hope you’ve found this interesting, and please feel free to leave a comment in the space below!


Winifred Phillips is an award-winning video game music composer whose most recent projects are the triple-A first person shooter Homefront: The Revolution and the Dragon Front VR game for Oculus Rift. Her credits include games in five of the most famous and popular franchises in gaming: Assassin’s Creed, LittleBigPlanet, Total War, God of War, and The Sims. She is the author of the award-winning bestseller A COMPOSER’S GUIDE TO GAME MUSIC, published by the MIT Press. As a VR game music expert, she writes frequently on the future of music in virtual reality games.

Follow her on Twitter @winphillips.


Now Available on Steam – Skyworld, 25% off!

Skyworld is Now Available on Steam and is 25% off!*

Classic strategy gameplay reimagined for VR. Skyworld combines accessible turn-based strategy and fast-paced real-time battles, all set in intricately animated miniature worlds. Build and command your forces right on the battlefield, conquer all Skyworlds and compete in online multiplayer. Plus, DRAGONS!

*Offer ends October 24 at 10AM Pacific Time


Last Chance on Steam – Cabela’s® Big Game Hunter® Pro Hunts, 80% off!

Cabela’s® Big Game Hunter® Pro Hunts and the rest of the Cabela’s® lineup is 80% off!*

This is your last chance to buy the Cabela’s® hunting games on Steam; the games will be retired from the store on October 20th.

The most authentic hunting ballistics ever in a Cabela’s® game!
Traverse maps 4x the size of any previous Cabela’s® Big Game Hunter® game!
Track, scout and target your trophy animal in all new ways!

*Offer ends October 19 at 10AM Pacific Time


Free Weekend – Warhammer: End Times – Vermintide

Play Warhammer: End Times – Vermintide for FREE starting now through Thursday October 26th at 10AM Pacific Time. You can also pick up Warhammer: End Times – Vermintide at 75% off the regular price!*

If you already have Steam installed, click here to install or play Warhammer: End Times – Vermintide. If you don’t have Steam, you can download it here.

*Offer ends Thursday October 26th at 10AM Pacific Time


Report: UK dev Sumo Digital is preparing to go public

LittleBigPlanet 3, Snake Pass, and Crackdown 3 developer Sumo Digital is preparing to go public in an IPO (initial public offering) worth nearly $200 million.

As reported by The Times, the developer-publisher is getting ready for a ‘£150 million ($198 million) float,’ with sources claiming Zeus Capital has been hired to advise on a listing in London. 

If the IPO goes ahead, co-founders Carl Cavers and Paul Porter will apparently retain sizeable minority stakes in the studio. 

Sumo was founded in 2003 and is currently headquartered in Sheffield. The company runs three development studios: Sumo Digital Nottingham and Sumo Digital Sheffield in the UK, and Sumo Digital Pune in India.

The firm creates and publishes games for most platforms, and has worked on major franchises including Forza, Dead Space, Hitman, and Disney Infinity.


Blog: Bringing Galaxy on Fire to Vulkan – Part 4



Written by Max Röhrbein-Kling and Johannes Kuhlmann

Having frequently posted here on Gamasutra over the past couple of weeks, we have now reached part four of our series of blogs about our experience with bringing Galaxy on Fire 3 – Manticore to Vulkan.

Our posts follow this structure:

  1. Introduction and Fundamentals
  2. Handling Resources and Assets
  3. What We Have Learned
  4. Vulkan on Android (this post)
  5. Stats & Summary

In case you have not read our previous posts yet, here is our disclaimer once more: When we started working on the Android version of our game we decided to use Vulkan for rendering (there is also an OpenGL ES version, but that is not of interest here). This series is about our own experiences with implementing a Vulkan renderer and getting it to work on different devices, in particular on Android devices. So, we are mainly going to talk about the interesting aspects of our implementation and then dive into what we learned along the way.

First and foremost, the focus of our Vulkan renderer was to ship a game. That means it is more pragmatic than perfect and we have mainly done what has worked for us. We are not using any fancy stuff like custom allocators, parallel command generation, reusing command buffers, etc., etc. We do believe, though, that our implementation is still reasonably versatile and well done.

This fourth post covers the problems we encountered that are specific to Vulkan on Android.

Vulkan on Android 6

While proper Vulkan support was only added in Android 7 (or Android N, or Nougat, or API level 24), there are a few devices out there that already had Vulkan support on Android 6 (or Android M, or Marshmallow, or API level 23). Some of these devices are, for example, the Samsung Galaxy S7 (Edge) and Nvidia Shield Tablet.

It is possible to support such devices with a bit of extra effort. The problem is that you cannot depend on the Vulkan header and library being part of the Android SDK/NDK. Instead, you have to provide your own header file and load the library dynamically at runtime. Google’s Vulkan samples have a convenient wrapper for this that spares you all the typing.
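The loading step described above can be sketched as follows. This is a simplified stand-in for the wrapper in Google’s samples: it opens libvulkan.so with dlopen and resolves vkGetInstanceProcAddr, through which every other Vulkan entry point can then be fetched. To stay self-contained it declares its own function-pointer typedefs with void* in place of VkInstance (handles are pointer-sized on these targets), rather than pulling in a vulkan.h header; the loadVulkan name is our own.

```cpp
#include <dlfcn.h>
#include <cstdio>

// Minimal stand-ins for the Vulkan headers, so the sketch compiles alone.
typedef void (*PFN_vkVoidFunction)(void);
typedef PFN_vkVoidFunction (*PFN_vkGetInstanceProcAddr)(void* instance,
                                                        const char* pName);

// Try to load the Vulkan library at runtime. Returns false on devices
// without Vulkan, in which case the caller should fall back to OpenGL ES.
bool loadVulkan(PFN_vkGetInstanceProcAddr* outGpa) {
    void* lib = dlopen("libvulkan.so", RTLD_NOW | RTLD_LOCAL);
    if (!lib) {
        std::fprintf(stderr, "No Vulkan library on this device\n");
        *outGpa = nullptr;
        return false;
    }
    // All further entry points are resolved through this one symbol.
    *outGpa = reinterpret_cast<PFN_vkGetInstanceProcAddr>(
        dlsym(lib, "vkGetInstanceProcAddr"));
    return *outGpa != nullptr;
}
```

With the function pointer in hand, instance-level calls such as vkCreateInstance are looked up by name via the returned vkGetInstanceProcAddr, exactly as the real loader would do it.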

Note, however, that just because one device of a given model supports Vulkan on Android 6, that does not mean all devices of that model do. Vulkan support can vary with minor updates, and even the extent of that support may vary. For example, we have found one implementation reporting its API version as being 0.0.1 and not supporting the swapchain extension or validation layers at all. Another device told us it did not want to work with us by reporting the VK_ERROR_INCOMPATIBLE_DRIVER error.

So, make sure to check that the Vulkan implementation that a device provides is actually one you can work with and that it supports all the features you need. Otherwise, fail gracefully.
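A minimal version of that sanity check might look like the sketch below, assuming the caller has already queried the implementation (the version from vkEnumerateInstanceVersion or the physical-device properties, the extension names from vkEnumerateDeviceExtensionProperties). The vulkanUsable and hasExtension helpers are our own illustrative names; the version packing matches Vulkan’s VK_MAKE_VERSION layout of (major << 22) | (minor << 12) | patch.

```cpp
#include <cstring>
#include <vector>

// Case-sensitive lookup in a list of extension names reported by the driver.
bool hasExtension(const std::vector<const char*>& available, const char* wanted) {
    for (const char* name : available)
        if (std::strcmp(name, wanted) == 0) return true;
    return false;
}

// Gate before committing to the Vulkan renderer: require at least API
// version 1.0.0 and the swapchain extension. The broken implementation
// we met that reported 0.0.1 fails the first test.
bool vulkanUsable(unsigned apiVersion, const std::vector<const char*>& deviceExts) {
    const unsigned minVersion = (1u << 22); // VK_MAKE_VERSION(1, 0, 0)
    return apiVersion >= minVersion && hasExtension(deviceExts, "VK_KHR_swapchain");
}
```

If the check fails, the clean path is to tear down whatever was created and silently start the OpenGL ES renderer instead, so the player never sees the broken driver.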

Lifecycle Concerns

A challenge that is rather unique to Android is that you have to handle the case when your application is sent to the background. The player can pause and resume the application at pretty much any time by pressing the home button, for example. You have to handle this on the CPU side in order to not eat up all CPU cycles in the background. This would annoy the user by slowing down the phone and draining the battery.

Unfortunately, we could not find any documentation on what you have to do for Vulkan when this happens. Therefore, we had to figure this out ourselves by experimenting. We already knew from OpenGL ES that you want to avoid destroying your complete context (which would mean recreating all your textures, buffers, and so on). If you are careful, you can get away with only having your surface destroyed and recreated.

It is actually the same case with Vulkan. When the application is paused, your surface is destroyed. When it is resumed, you get a new surface which you will have to render into from then on. This is all a bit tricky as you have to be careful with synchronization and timing. Do not destroy the surface while still rendering into it, for example.

Destroying and recreating the surface also means that you will have to recreate your swapchain and its framebuffers. When all of that is done, you should have a smooth and quick pause and resume cycle.

Note, however, that you cannot pass the old swapchain to vkCreateSwapchainKHR(). You have to destroy it independently and create a completely new one. We assume this is due to the old swapchain already being invalid because the surface was destroyed.
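The resume sequence we converged on can be summarized as an ordered plan. The sketch below encodes that ordering as data (resumePlan is a hypothetical name, and the exact steps will differ per engine); the two hard constraints from our experiments are that you wait for in-flight work before tearing anything down, and that the old swapchain is destroyed outright instead of being passed as oldSwapchain to vkCreateSwapchainKHR.

```cpp
#include <string>
#include <vector>

// Ordered steps for handling an Android resume with a new surface.
// Comments note the constraints discussed in the text.
std::vector<std::string> resumePlan() {
    return {
        "vkDeviceWaitIdle",                        // never tear down mid-render
        "destroy framebuffers",
        "vkDestroySwapchainKHR",                   // old swapchain, destroyed independently
        "vkDestroySurfaceKHR",
        "create surface from new ANativeWindow",
        "vkCreateSwapchainKHR",                    // oldSwapchain = VK_NULL_HANDLE
        "create framebuffers",
    };
}
```

Keeping the sequence explicit like this makes the synchronization hazard obvious: every destroy step sits strictly after the wait-idle and strictly before its matching create.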

Debugging Tools

We started with implementing the Vulkan renderer on Windows. There, RenderDoc had our back when our rendered frames looked wrong and the validation layers did not provide enough insights.

For Android development, there are various tools aiming to satisfy your Vulkan debugging needs. RenderDoc also supports capturing from an Android app, but it is harder to set up. The major GPU vendors also provide their own tools.

Google is currently working on extracting the graphics debugger from Android Studio into a standalone tool.

Sadly, when we really wanted a frame capture, it was almost always on a device where the validation layers were not working. And all of the tools we tried require you to load a special validation layer for the capture. As a result, none of these tools provided a great deal of help. We therefore either tried to reproduce the problem on Windows or took a more manual approach by selectively disabling certain kinds of draw calls to track down the problem.

Conclusion

Implementing a Vulkan renderer is already a complex undertaking in itself. But from our point of view, there are even more pitfalls on Android. This is mainly caused by two factors: First, the absence of simple-to-use tools. And second, the presence of additional difficulties such as the application lifecycle and different versions of Android.

Interestingly, different GPUs from the same vendor often have the same manifestations of bugs. So, if you want to reproduce a problem, make sure to use a device with the exact same GPU or at least with one from the same vendor. This can be difficult in some cases as, for example, Samsung likes to ship different GPUs in different regions of the world.

In the next (and final) post, we will talk about select statistics and numbers that we collected from our Vulkan implementation.


Chinese firm buys 20% stake in Halo 4 co-developer Certain Affinity

Chinese firm Leyou Technologies has acquired a 20 percent stake in Austin-based developer Certain Affinity for $10 million. 

Established by a group of former Bungie employees back in 2006, Certain Affinity has co-developed a number of popular titles including Halo 4, Call of Duty: Black Ops, and Doom. 

Leyou was at one time best known as a poultry supplier, but recently made inroads into the games industry after acquiring Dirty Bomb creator Splash Damage and Warframe developer Digital Extremes.

The chicken peddler turned games mogul now hopes to work with Certain Affinity to create an “ambitious and exciting” new title. 

As part of the agreement, Leyou also has the option to snap up Certain Affinity’s remaining shares in 2021 for a valuation based on an agreed formula. If that deal goes ahead, it could cost Leyou up to $150 million. 

“With its proven track record producing high-quality video games, Certain Affinity possesses the technical capability and talent to create highly successful titles, which in turn will assist Leyou in further diversifying its video game portfolio and enhancing its revenue streams,” said Leyou CEO, Alex Xu.

“This strategic investment into Certain Affinity is consistent with the growth strategy of our company as we continue to look for opportunities to invest and increase our market share in the video gaming industry.”