Posted on Leave a comment

Bungie drops hidden Destiny 2 XP scaling following player outcry

The scaling system itself was unveiled by a Reddit user over the long weekend, after other posters noticed that the in-game experience bar seemed to take more time to fill after repeated activities, despite displaying consistent numbers.

The controversy, however, stems from the fact that, post-game, experience points are used to earn Destiny 2‘s cosmetic ‘Bright Engram’ loot boxes, which can otherwise only be purchased with Destiny 2‘s microtransaction-based ‘Silver’ currency. The hidden system reduced the amount of experience players received when grinding or completing activities in quick succession, slowing their progress toward those free boxes.

Additionally, the discovery sparked concerns that players were getting less experience than advertised after spending Silver on in-game experience boosters, or after buying Red Bull or Pop-Tarts to redeem a promotional experience boost.

Using the third-party app Destiny Item Manager (DIM) and a spreadsheet, Reddit user EnergiserX calculated how much experience the game said they were earning versus how much progress was actually being made on the in-game progress bar.

According to EnergiserX’s tests, roughly 130,000 experience points were lost over the course of three hours. At its worst, the hidden scaling system eventually awarded only 4 percent of the experience being reported.

Following that data, Bungie confirmed that it did have a system in place to scale experience gains behind the scenes, but that the system was “not performing in the way we’d like it to.” According to Bungie’s post, the now-removed system scaled up experience points for players participating in lengthier activities like PvP matches or the raid and scaled experience down for shorter, grindable activities like public events.

“We are not happy with the results, and we’ve heard the same from the community. Effective immediately, we are deactivating this system,” read the post. “As a result, players will see XP earn rates change for all activities across the board, but with all values being displayed consistently in the user interface. Over the course of the next week, we will be watching and reviewing XP game data to ensure that these changes meet our expectations, as well as yours. Any additional updates to this system will be communicated to you via our official channels.”

As of Sunday, the system has been removed from Destiny 2, though players were once again up in arms after noticing that the experience needed to earn a Bright Engram was quietly doubled from 80,000 to 160,000 in the same patch. Bungie later confirmed this quiet increase on Twitter, noting that displaying the now-correct 160,000 value will require a coming API update.


Xenoblade Chronicles 2: Behind Yasunori Mitsuda’s music


Hello, I’m Yasunori Mitsuda. Xenoblade Chronicles 2 was the biggest and most challenging project I have ever worked on. It involved a number of roles, including composing (needless to say), acting as a coordinator (for recording sessions) and a producer (managing budgets), and working as sound director: managing schedules, proofreading every composer’s scores, finalizing and polishing the scores and printing them all out, overseeing the sound controls for Nintendo Switch, and so on. I was especially careful in choosing the musicians, and in making the recording sessions as efficient as possible. Another point to highlight is that musicians from all over the world were involved, such as a chorus from Slovakia, an orchestra from Japan, vocal songs sung by Ms. Jen Bird, who came all the way from England, and finally the Irish chorus group ANÚNA. One of ANÚNA’s performances is featured in this music video.

All kinds of genres of music were made, so I am sure that the game will not bore you no matter how long you play it.

I was invited by the director [Tetsuya] Takahashi (Taka-san) to be involved in the Xenoblade Chronicles 2 project back on December 9th, 2014, which I was very excited about. Three months later, we held a meeting to discuss the direction of the music and the sound that Taka-san required. Then, after a few more months – when the direction and the amount of music were all set – we held another meeting including ACE and [Kenji] Hiramatsu-san to decide who would write what. Although generally we divided the music equally, I think the decision was made rather smoothly, considering that we wanted the fans to be satisfied and did not want to ruin the image already set by the first Xenoblade Chronicles.

Each composer was in touch directly with Taka-san to communicate about the music they were working on. For some of the demos, I had phone calls from Taka-san asking for my opinion on the music in question. Then we’d usually have the same thought and agree on something like, “Yeah, maybe that’s not right.” When Taka-san turns down a demo, he tells the composer exactly what is required in a clear way. He does so by putting himself in the composer’s position and choosing his words wisely, which makes it easy to make any necessary amendments.

Another thing is, when I work with Taka-san I always want to bring some new musicians or new music on board, and for Xenoblade Chronicles 2 I wanted to work with the Irish chorus group ANÚNA. I thought that with ANÚNA’s distinctive sound, it would be possible to express the mystical and majestic atmosphere that fits Xenoblade Chronicles 2 so well. Surprisingly, ANÚNA were planning on coming to Japan for a different project, so I soon asked for their schedule for a recording session. This was an absolute miracle! The vocal pieces which ANÚNA sings are all important town songs, with lyrics written by Taka-san himself.

Perhaps you may understand the game fully once you understand the meaning of the words…

The first time I came across the Irish chorus group ANÚNA was through one of their albums released in 1996, called “Deep Dead Blue”. Back then I was totally into the music of Northern Europe, including Finland, Ireland, Scotland and the Mediterranean region. Usually world music is built on a land’s distinctive instruments, whereas ANÚNA found their way of expression through the human voice, which can be considered the original musical instrument. The album “Deep Dead Blue” made a huge impression on me, as the chorus wasn’t like a classical one, nor Gregorian, nor Bulgarian… I felt that ANÚNA was a new type of chorus that I’d never heard before. My attention was drawn to the lead singer, Michael McGlynn, and I dreamt about making music with ANÚNA one day. After 20 years, my dream came true through the making of the game Xenoblade Chronicles 2.

We recorded four pieces in total, one of which will be released with a beautiful music video featuring ANÚNA. The piece is called “Shadow of the Lowlands”, and it is played in the Kingdom of Tantal in the game. I am sure it will give you a strange, mysterious sensation whilst walking around the Kingdom of Tantal. Please enjoy playing the game, as I am sure it will give you a totally new feeling that you have never felt before.

For more information about Xenoblade Chronicles 2, visit the official site.

Game Rated:

Language
Suggestive Themes
Use of Alcohol and Tobacco
Violence


Video Game Deep Cuts: HAL In The Clouds, Monster-Free

The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.


[Video Game Deep Cuts is a weekly newsletter from curator/video game industry veteran Simon Carless, rounding up the best longread & standout articles & videos about games, every weekend. This installment includes a look at HAL Laboratory’s last self-published title, a poke at a famous game artwork’s use of Mario’s clouds, and how SOMA’s patch allows you to play sans monsters. 

Just packing up for a long trip to Europe here (some work, some holiday, some gadding about), so apologies for the shorter-than-normal preamble. But I just remembered that I didn’t mention two notable new bits of GDC 2018 (that event I help organize!).

That’s both a new Vision Track of ‘mini-keynotes’ I’m helping to program, kicked off by Pixar on how they use real-time graphics to make their movies, and the new GDC Film Festival – get your buddies to enter their neat video game documentaries, won’t you?

Until next time!
– Simon, curator.]

——————

Frictional on designing SOMA’s new monster-free Safe Mode (Andy Kelly / PC Gamer, ARTICLE)
“A year ago I wrote about Wuss Mode, a popular fan-made mod for SOMA that makes its monsters harmless. It’s currently the game’s most subscribed Steam Workshop mod, which suggests a lot of people want to experience its dark, twisted story without the frustration of having to play hide-and-seek with biomechanical monstrosities.”

The Modern Design of 3D Platformers (ft Snoman) (HeavyEyed / YouTube, VIDEO)
“This year has been absolutely stacked with games from all kinds of genres, but the biggest and most pleasant surprise for me was just how much love 3D platformer collectathons got on a mainstream level. So let’s look at the design of Yooka Laylee, Snake Pass, A Hat in Time and Mario Odyssey to see if this is a return or just a quick flash in the pan hype revival.”

Why Does HAL Laboratory Only Make Nintendo Games? (Brian Crimmins / Waypoint, ARTICLE)
“Though today it is mostly known as a major partner with Nintendo, it’s easy to forget that HAL Laboratory began life as a developer and publisher in the late 80s and very early 90s. It was a small studio in these early days, but even here the team found a decent amount of success.”

An Interview With Ken Wong, Lead Designer of Monument Valley (Jamie Gilman / Resource, ARTICLE)
“Ryan Cash & Eli Cymet from Snowman have kindly allowed us to share highlights from episode one of their podcast Art & Craft in which they interview Ken Wong, lead designer of Monument Valley. They discuss Ken’s influences, how he entered the industry and the alternative ending to Monument Valley that didn’t make the cut.”

Competing in America’s Biggest Fighting Game Tournament: Evo 2017 (Waypoint / YouTube, VIDEO)
“Waypoint Presents: Evolution profiles the Evolution Championship Series (EVO) through the eyes of two of the best in fighting games – SonicFox of Echo Fox and Punk of Panda Global. [SIMON’S NOTE: this is a series of Waypoint documentaries that originally appeared on Disney XD, of all places, and I’ll try to post the other ones, because they seem rather good.]”

PUBG’s Second Big Tournament Shows It Still Needs Work As An Esport (Nathan Grayson / Kotaku Compete / ARTICLE)
“Over the weekend, 20 top PlayerUnknown’s Battlegrounds teams faced off at Intel Extreme Masters in Oakland, California… When the smoke cleared, a team that didn’t even get invited took top honors, besting big-name teams, without even winning the final match. This shows that PUBG has potential to be a very different kind of esport, but it’s not there yet.”

Atari and the dawn of video game culture (Charles Russo / The Six Fifty, ARTICLE)
“Mindful of both Atari’s lasting legacy and its current anniversary, we caught up with Tim Lapetino, author of Art of Atari, a gorgeous book that not only explains the glorious 8-bit history of the company, but properly celebrates the often-forgotten — yet entirely dynamic — design concepts which made the system a trailblazing phenomenon.”

Brazil’s Video Game Gray Markets (Drew Scanlon / Cloth Map / YouTube, VIDEO)
“Brazil’s complicated history with electronics has created an alternate universe of video games. [SIMON’S NOTE: some excellent console hardware weirdness here, starring my Video Game History Foundation Discord buddy Gus Lanzetta.]”

Portraying migrants’ struggles via cellphones in Bury Me, My Love (Joel Couture / Gamasutra, ARTICLE)
“Bury Me, My Love has players following the journey of their wife, Nour, as she works her way from Syria to Europe. However, players act as the husband, Majd, only able to know whatever parts of the journey Nour chooses to text back to him.”

Nier’s Yoko Taro On Success, Drinking, And Death (Kimberley Wallace / Game Informer, ARTICLE)
“Best known for directing the Nier and Drakengard series, Yoko Taro reached a new level of success after teaming up with Platinum Games for Nier: Automata, which sold over two million copies. We chatted with Taro about his newfound success and what’s next. Just like his esoteric games, our talk was anything but ordinary.”

Stephen’s Sausage Roll – The Best Puzzle Game I’ve Played (Joseph Andersen / YouTube, VIDEO)
“[SIMON’S NOTE: slightly late to the party on this analysis-heavy video, but it’s been interesting to see the level of praise for this game, which is staggeringly difficult and more expensive than you might think – looks like it’s 65% off in the Steam sale right now, though!]”

The complete history of Civilization (Fraser Brown / PC Gamer, ARTICLE)
“Welcome to the history of Civilization, a series that has been keeping us up until silly o’clock in the morning since the release of Sid Meier’s original game in September 1991. Civ turns all of human history into a playground that you can exploit, turn by turn, to bring your chosen nation to glory.”

[Post Mortem]: I thought I could ship at least 700 units to stay in business (Constantin Bacioiu / Gamasutra Blogs, ARTICLE)
“I’m not having the best of time writing this post but I feel like I have to. I have been warned against going full time indie by everyone on the internet and by my friends and family. I believed I could make it, all I had to do was ship just 700 units of my game on steam. I’m not even close.”

Everything but the Clouds (Patrick LeMieux / Vimeo, VIDEO)
“In didactic texts, artist talks, personal websites, and private interviews Cory Arcangel describes Super Mario Clouds as “an old Mario Brothers cartridge which I modified to erase everything but the clouds.”… However, attempting to reverse engineer Super Mario Clouds according to the artist’s original source code… reveals that Arcangel’s ROM hack does not actually contain Nintendo’s ROM.”

Apple Time Warp: Episode 3 – Nasir Gebelli (Part 1 of 3) (John Romero & Craig Johnston / Apple Time Warp, PODCAST)
“John Romero and Craig Johnston talk about the early days of games on the Apple ][… on this episode we have part 1 of an interview that John did with Nasir Gebelli who is very well known for great Apple ][ software and games including the hits Space Eggs and Gorgon, which were clones of Moon Cresta and Defender. [SIMON’S NOTE: this is an impossibly rare interview with Gebelli, who is also famed for programming the first 3 Final Fantasy games!]”
 

Animal Crossing: Pocket Camp impressions: Nintendo should be ashamed (Sam Machkovech / Ars Technica, ARTICLE)
“The series’ mix of simple, bright graphics, cute animal friends, house decorations, and quick-hit daily tasks seems like perfect tap-and-go gaming fodder… But before addressing any of that, we have to look closely at how Nintendo converted this game from a fixed-price, retail offering to a free-to-play microtransaction disaster—and how that has rotted Animal Crossing’s most rewarding elements from the inside-out.”

Reliving the Horror: Taking Resident Evil 7 Forward by Looking Back (GDC / YouTube, VIDEO)
“In this 2017 GDC talk, Capcom’s Koshi Nakanishi and Peter Fabiano explain how Capcom took the Resident Evil franchise somewhere new and different, while keeping true to the series’ original concepts.”

What are devs saying about the design of Super Mario Odyssey? (Joel Couture / Gamasutra, ARTICLE)
“With so many people and developers buzzing about the game and its constant array of new mechanics, Gamasutra reached out to several developers to see just what struck them about Mario’s newest outing. Many are playing the game and revelling in every aspect of its design.”

The case for and against loot boxes, according to developers (Wes Fenlon / PC Gamer, ARTICLE)
“I asked developers who have worked on triple-A and indie games about the process behind how loot boxes are designed and implemented, plus what the future holds for microtransactions given the current player backlash against them. Here’s what they had to say.”

Who is PLAYERUNKNOWN? – Noclip Profiles (Noclip / YouTube, VIDEO)
“PLAYERUNKNOWN’S BATTLEGROUNDS has taken the world of online PC shooters by storm in 2017. But who is the man behind the moniker? We sit down with Brendan Greene to talk about his love of military shooters, his journey into mod development and the success of his first game.”

——————

[REMINDER: you can sign up to receive this newsletter every weekend at tinyletter.com/vgdeepcuts – we crosspost to Gamasutra later on Sunday, but get it first via newsletter! Story tips and comments can be emailed to vgdeepcuts@simoncarless.com. MINI-DISCLOSURE: Simon is one of the organizers of GDC and Gamasutra & an advisor to indie publisher No More Robots, so you may sometimes see links from those entities in his picks. Or not!]


Steam Autumn Sale Continues!

The Steam Autumn Sale 2017 continues for Cyber Monday, with great deals across the Steam catalog. Take advantage of the Autumn Sale for the next two days!*

In addition to discounts on thousands of great games, join the nomination process for the Steam Awards. Nominate your favorite games across a variety of categories, and earn profile XP and badges for participating! Your nominations will help determine the finalists for each category. In December, you can vote on the winners for each category during the Steam Winter Sale. Learn more about the Steam Awards here.

*Offers end Tuesday November 28th at 10am Pacific.


Get mobile game dev tips from the maker of Good Pizza, Great Pizza at GDC 2018

It can be tricky to balance the demands of making a great mobile game and making a mobile game that earns enough to let you keep making mobile games.

At GDC 2018, TapBlaze president Anthony Lai will be offering fellow game makers advice on how they can do both in his talk “‘Good Pizza, Great Pizza’: Game Design, Iteration, and Business Lessons Learned.”

Drawing on TapBlaze’s experience taking its free-to-play mobile game Good Pizza, Great Pizza from 3,000 to 60,000 daily active users in the space of a year (without paid user acquisition), Lai will walk you through the process of making a good, sustainable game under significant resource constraints.

It promises to be a great talk, and anyone who makes time to attend it will walk away with actionable design, development, monetization and production tips for increasing the probability of success for their game. 

Plus we have lots more GDC 2018 announcements to make in the coming months. For more information about GDC 2018 visit the show’s official website, and subscribe to regular updates via Facebook, Twitter, or RSS.

Gamasutra and GDC are sibling organizations under parent UBM Americas


PC free-to-play revenue has doubled since 2012

Despite the negative perception surrounding microtransactions, free-to-play spending on PC has actually doubled since 2012. 

A new report from market analyst SuperData shows that revenue generated from freemium PC titles has risen to $22 billion in 2017 from $11 billion in 2012. 

Meanwhile, revenue from PC and console full game sales increased by 60 percent to $8 billion from $5 billion during that same period.

“Add-on content sales are increasingly out-earning the traditional one-time purchase model, and the trend shows no signs of slowing,” explains SuperData. 

“PC and console game publishers, who are aware that each segment has a finite audience, are looking for ways to further monetize both the existing audience and find new ways to attract new consumers by lowering the entry barriers.”

The report suggests future game monetization may even mean triple-A publishers completely doing away with $60 products in favor of “product ecosystems.”

However, as the recent Battlefront II loot crate debacle shows, it also warned that publishers still have some way to go when it comes to understanding how to best deploy free-to-play techniques. 

“EA have by no means been the first to get burned by what appears to consumers as money-grubbing techniques,” continues the report. 

“It remains to be seen what effect EA’s course correction on microtransactions will have on Battlefront II, but it’s fair to say the vocal fan community isn’t enthused. Despite this, it’s clear that gamers are continuing to spend on well-executed additional content, and the market presents a massive opportunity for publishers.”


Building the new PvE features in WoW: Battle for Azeroth

It’s been over ten years since World of Warcraft took the world by storm, and in that time, a whole slew of online games in other genres have been learning lessons from its success. 

That means now, in 2017, the still-going-strong World of Warcraft has to compete indirectly with games like Destiny and League of Legends for player attention, as both have taken core elements of the Warcraft formula and evolved them into new genres: Destiny built a first-person shooter with raids, while League and other MOBAs built their own empires on the most popular mod from Warcraft 3.

With that in mind, we reached out to Blizzard with a question for the current stewards of Warcraft: what new features in the Battle for Azeroth expansion help the long-running MMORPG stand out from its new competitors? 

Thankfully, creative director Alexander Afrasiabi and production director Jon Haight had some interesting insight for us about how island explorations are helping World of Warcraft create new co-operative experiences that help the game continue to evolve. 

“How do you make randomization become a thing that isn’t really frustrating when you get the wrong seed?”

First, a quick breakdown of why the announced “island expeditions” are a notable feature for World of Warcraft. In Battle for Azeroth, islands are instances meant to be experienced with multiple players in a party, just like dungeons. The key difference is, when the instance loads, not only will the loot tables be different every time, the enemies and encounters on each island will be different too. 

Since this expansion focuses on a renewed conflict between The Alliance and The Horde, players will either be pitted against an AI team representing the opposite faction, or if they choose, an actual group of opposing players, as they all race for the same randomly-generated goal. Many of these goals involve gathering Azerite, the new resource players can use to participate in end-game progression, but they may also involve killing specific enemies before the other team does. 

Afrasiabi jokes that this randomly-driven encounter design is a kind of “holy grail of gaming,” since it hopefully ensures that each push into the Islands is fresher than the usual scripted content seen in Warcraft raids. “We had years of experience building this content, how do we make something more compelling? How do we make something that doesn’t become stale after X uses,” asked Afrasiabi. “That led us down the path of randomization. How do you make randomization become a thing that isn’t really frustrating when you get the wrong seed?”

“This kind of challenge opens the door for unusual party configurations, meaning a 3-healer party could succeed in the same way a traditional tank/DPS/healer party could.”

Now that Afrasiabi and his team have had time to explore that idea, he says the biggest difference between these two forms of design is that dungeons, like his favorite raid The Halls of Valor, do a very good job delivering large encounters that reward practice but diminish “newness” over time. 

But on the islands, Afrasiabi says that they’ve been able to explore more emergent gameplay, with “little mini stories and micro-stories that are happening across these island explorations. Hopefully, the tenth run, the twentieth run, it’s different enough because of the randomization and the AI that you don’t feel like ‘man I’ve seen this before.’”

Part of the reason there’s more emergent play in islands is that performing well in them involves less prescribed choreography and more give-and-take interaction with the game’s artificial intelligence, or with players from the opposite faction racing for the same goal as you. Haight adds that this kind of challenge opens the door for unusual party configurations, meaning a 3-healer party could succeed in the same way a traditional tank/DPS/healer party could.

“The way islands are designed, players won’t be focusing on mastery and memorization so much as they will improvisation.”

Island encounters are also a lot shorter than raids, lasting 15-20 minutes instead of…well, hours, so there’s a big incentive for the Warcraft team to avoid that repetitive raid feel. “We want to make sure that all those prolonged play sessions you’re not [thinking] ‘this dungeon is awesome, but it can’t be awesome forever,’” says Afrasiabi.

Haight notes that the way islands are designed, players won’t be focusing on mastery and memorization so much as they will improvisation. “So it’s not a scripted behavior in the same way that a dungeon boss has. They’re going to react to you in the way that you act towards them,” he explains. “So it’s stimuli and response. It’s going to feel very different.”

There are a few other new features in Battle for Azeroth that are intended to shake up the traditional endgame formula. Warfronts, for instance, is a Horde-versus-Alliance-themed PvE feature that re-introduces strategy game mechanics to the Warcraft universe, as 20 players group up to manage an army of units while still running around the battlefields themselves.  

What’s interesting about this feature is that Afrasiabi says the original intent wasn’t to re-introduce classic Warcraft mechanics, but while building this mode they found they could call back to them while doing something different. “There’s always that breakpoint when you’re making a new game system where it goes from being not fun to fun,” he explains. “And that was our breakpoint, when we’re like ‘actually, we’ve already got things to reference and it feels like it really fulfills that fantasy of a world at war and it’s something we can kind of harken back to fondly.’”

As more online games use events and their own raiding systems to drive player retention, it’s interesting to see Blizzard go the extra mile in adding new instance-driven features with this expansion. With randomization and strategy mechanics back on the table, it’ll be interesting to watch these features grow once Battle for Azeroth launches. 


Blog: Creating an awesome game design pattern in C++

The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.


Hi Folks.

This is actually my first post on Gamasutra 🙂 I am here pretty much every day checking out cool posts; today is the day I add one myself 🙂 You will find the original post here.

In this article I want to talk about the Entity-Component-System (ECS) pattern. You can find a lot of information about it on the internet, so I am not going to go too deep into explanations here; instead, I will focus on my own implementation.

First things first: you will find the full source code of my ECS in my GitHub repository.

An Entity-Component-System – mostly encountered in video games – is a design pattern which allows you great flexibility in designing your overall software architecture[1]. Big companies like Unity, Epic and Crytek incorporate this pattern into their frameworks to provide a very rich tool for developers to build their software with. You can check out these posts to follow a broader discussion of the matter[2,3,4,5].

If you have read the articles I mentioned above, you will notice they all share the same goal: distributing different concerns and tasks between Entities, Components and Systems. These are the three big players in this pattern and are fairly loosely coupled. Entities are mainly used to provide a unique identifier, make the environment aware of the existence of a single individual, and function as a sort of root object that bundles a set of components. Components are nothing more than container objects that do not possess any complex logic. Ideally they are simple plain old data objects (PODs). Each type of component can be attached to an entity to provide some sort of property. For example, a “Health-Component” can be attached to an entity to make it mortal by giving it health, which is nothing more than an integer or floating point value in memory.
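As a minimal illustration of these roles (the type names below are my own, not taken from the library), an entity can be a bare identifier and components plain structs:

```cpp
#include <cstdint>

// An entity is little more than a unique identifier that bundles components.
using EntityId = std::uint32_t;

// Components are plain-old-data property bags with no complex logic.
struct HealthComponent {
    int current = 100;  // attaching this to an entity makes it "mortal"
    int maximum = 100;
};

struct PositionComponent {
    float x = 0.0f;
    float y = 0.0f;
};
```

Note how neither struct holds any behavior; systems will supply the logic that operates on this data.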

Up to this point most of the articles I came across agree about the purpose and use of entity and component objects, but for systems opinions differ. Some people suggest that systems are only aware of components. Furthermore, some say that for each type of component there should be a system, e.g. for “Collision-Components” there is a “Collision-System”, for “Health-Components” there is a “Health-System”, etc. This approach is rather rigid and does not consider the interplay of different components. A less restrictive approach is to let each system deal with all the components it is concerned with. For instance, a “Physics-System” should be aware of both “Collision-Components” and “Rigidbody-Components”, as both probably contain information necessary for physics simulation. In my humble opinion, systems are “closed environments”: they take ownership of neither entities nor components. Instead, they access them through independent manager objects, which in turn take care of the entities’ and components’ life-cycles.
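A sketch of that separation could look like the following (the class names here are illustrative assumptions, not the library’s actual interfaces): the manager owns the component storage, while the system merely borrows access each update and stores no pointers of its own.

```cpp
#include <cstdint>
#include <unordered_map>

using EntityId = std::uint32_t;

struct HealthComponent {
    int current = 100;
};

// The manager owns the components and controls their life-cycle.
class HealthComponentManager {
public:
    HealthComponent& attach(EntityId id) { return components_[id]; }
    std::unordered_map<EntityId, HealthComponent>& all() { return components_; }
private:
    std::unordered_map<EntityId, HealthComponent> components_;
};

// A system as a "closed environment": it reaches components only
// through the manager and keeps no entity or component pointers.
class RegenSystem {
public:
    void update(HealthComponentManager& manager) {
        for (auto& entry : manager.all())
            entry.second.current += 1;  // regenerate one point of health
    }
};
```

Because the system holds no state of its own, entities and components can be created and destroyed freely between updates without invalidating anything inside the system.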

This raises an interesting question: how do entities, components and systems communicate with each other if they are more or less independent? The answer depends on the implementation. For the implementation I am going to show you, the answer is event sourcing[6]. Events are distributed through an “Event-Manager”, and everyone who is interested in events can listen to what the manager has to say. If an entity, a system, or even a component has an important state change to communicate, e.g. “position changed” or “player died”, it can tell the Event-Manager, which will broadcast the event so that all subscribers to that event get notified. This way everything can be interconnected.
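A bare-bones version of such an event hub could look like this (the subscribe/emit API is my own simplification for illustration, not the author’s actual Event-Manager interface):

```cpp
#include <functional>
#include <string>
#include <unordered_map>
#include <vector>

// Minimal publish/subscribe hub: listeners register for a named event,
// and emitting that event notifies every registered subscriber.
class EventManager {
public:
    using Handler = std::function<void(const std::string&)>;

    void subscribe(const std::string& event, Handler handler) {
        handlers_[event].push_back(std::move(handler));
    }

    void emit(const std::string& event, const std::string& payload) {
        auto it = handlers_.find(event);
        if (it == handlers_.end()) return;  // nobody is listening
        for (auto& handler : it->second)
            handler(payload);
    }

private:
    std::unordered_map<std::string, std::vector<Handler>> handlers_;
};
```

With this in place, a health system could emit “player died” when hit points reach zero, and a UI system subscribed to that event would be notified without either side knowing about the other.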

Well, I guess the introduction above got longer than I was actually planning, but here we are 🙂 Before we dive deeper into the code – which is C++11, by the way – I will outline the main features of my architecture:

  • memory efficiency – to allow quick creation and removal of entity, component and system objects as well as events, I could not rely on standard new/delete-managed heap memory. The solution was, of course, a custom memory allocator.
  • logging – to see what is going on, I used log4cplus[7] for logging.
  • scalability – it is easy to implement new types of entities, components, systems and events, without any preset upper limit except your system’s memory.
  • flexibility – no dependencies exist between entities, components and systems (entities and components do have a sort of dependency, but do not contain any pointer logic referring to each other).
  • simple object lookup/access – easy retrieval of entity objects and their components through an EntityId, or a component-iterator to iterate over all components of a certain type.
  • flow control – systems have priorities and can depend on each other, so a topological order for their execution can be established.
  • ease of use – the library can easily be incorporated into other software; only one include.
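The flow-control point can be illustrated with a short sketch. The following is an assumption about the mechanism, not the library's actual scheduler: systems are identified by plain indices and an execution order is derived from declared dependencies via Kahn's algorithm.

```cpp
#include <cstddef>
#include <queue>
#include <utility>
#include <vector>

// Derive a topological execution order from system dependencies.
// Each edge {system, prerequisite} says "system runs after prerequisite".
std::vector<std::size_t> TopologicalOrder(
    std::size_t systemCount,
    const std::vector<std::pair<std::size_t, std::size_t>>& dependsOn)
{
    std::vector<std::vector<std::size_t>> dependents(systemCount);
    std::vector<std::size_t> inDegree(systemCount, 0);

    for (const auto& edge : dependsOn)
    {
        dependents[edge.second].push_back(edge.first); // prerequisite -> dependent
        ++inDegree[edge.first];
    }

    // start with all systems that depend on nothing
    std::queue<std::size_t> ready;
    for (std::size_t i = 0; i < systemCount; ++i)
        if (inDegree[i] == 0)
            ready.push(i);

    std::vector<std::size_t> order;
    while (!ready.empty())
    {
        std::size_t s = ready.front();
        ready.pop();
        order.push_back(s);

        // a finished system may unblock its dependents
        for (std::size_t d : dependents[s])
            if (--inDegree[d] == 0)
                ready.push(d);
    }
    return order; // fewer than systemCount entries if dependencies are cyclic
}
```

For example, if a render system depends on a physics system, which depends on an input system, the returned order is input, physics, render.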

The following figure depicts the overall architecture of my Entity-Component-System:

Figure-01: ECS Architecture Overview (ECS.dll).

As you can see, there are four differently colored areas in this picture, each defining a modular piece of the architecture. At the very bottom – actually at the very top in the picture above; it should be upside down – we have the memory management and the logging facilities (yellow area). These first-tier modules deal with very low-level tasks. They are used by the second-tier modules: the Entity-Component-System (blue area) and the event sourcing (red area). These modules mainly deal with object-management tasks. Sitting on top is the third-tier module, the ECS_Engine (green area). This high-level global engine object orchestrates all second-tier modules and takes care of initialization and destruction. All right, that was a short and very abstract overview; now let’s get into the details.

Memory Manager

Let’s start with the Memory-Manager. Its implementation is based on an article[8] I found on gamedev.net. The idea is to keep heap-memory allocations and releases to an absolute minimum. Therefore, only at application start, a big chunk of system memory is allocated with malloc. This memory is then managed by one or more custom allocators. There are many types of allocators[9] (linear, stack, free list…) and each of them has its pros and cons (which I am not going to discuss here). But even if they internally work in different ways, they all share a common public interface:

class Allocator
{
public:
    virtual void* allocate(size_t size) = 0;
    virtual void free(void* p) = 0;
};

The code snippet above is not complete, but it outlines the two major public methods each concrete allocator must provide:

  1. allocate – allocates a certain amount of bytes and returns the memory address of that chunk, and
  2. free – de-allocates a previously allocated chunk of memory, given its address.

Now with that said, we can do cool stuff like chaining up multiple allocators:

Figure-02: Custom allocator-managed memory.

As you can see, one allocator can get its chunk of memory – the one it is going to manage – from another (parent) allocator, which in turn could get its memory from yet another allocator, and so on. That way you can establish different memory-management strategies. For the implementation of my ECS I provide a root stack-allocator that gets an initially allocated chunk of 1GB of system memory. Second-tier modules allocate as much memory as they need from this root allocator and only free it when the application gets terminated.
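To make the idea concrete, here is a very small linear-allocator sketch that carves chunks out of one pre-allocated buffer. This is a simplified assumption of how such an allocator can look; a real one would also handle alignment, and the stack variant would additionally support freeing in LIFO order.

```cpp
#include <cstddef>

// Hands out consecutive chunks from a buffer owned by a parent allocator.
class LinearAllocator
{
    unsigned char* m_Memory;   // start of the managed chunk
    std::size_t    m_Capacity; // size of the chunk in bytes
    std::size_t    m_Offset;   // first free byte

public:
    LinearAllocator(void* memory, std::size_t capacity)
        : m_Memory(static_cast<unsigned char*>(memory)),
          m_Capacity(capacity),
          m_Offset(0)
    {}

    void* allocate(std::size_t size)
    {
        if (m_Offset + size > m_Capacity)
            return nullptr; // managed chunk exhausted

        void* p = m_Memory + m_Offset;
        m_Offset += size;
        return p;
    }

    // individual blocks cannot be freed; the allocator is reset as a whole
    void reset() { m_Offset = 0; }
};
```

Note the trade-off this design makes: allocation is a pointer bump and therefore extremely cheap, but memory can only be reclaimed all at once via reset().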

Figure-03: Possible distribution of global memory.

Figure-03 shows how the global memory could be distributed among the second-tier modules: “Global-Memory-User A” could be the Entity-Manager, “Global-Memory-User B” the Component-Manager and “Global-Memory-User C” the System-Manager.

Logging

I am not going to talk too much about logging, as I simply used log4cplus[7] to do this job for me. All I did was define a Logger base class hosting a log4cplus::Logger object and a few wrapper methods forwarding simple log calls like “LogInfo()”, “LogWarning()”, etc.

Entity-Manager, IEntity, Entity and Co.

Okay, now let’s talk about the real meat of my architecture: the blue area in Figure-01. You may have noticed the similar setup between all manager objects and their corresponding classes. Have a look at the EntityManager, IEntity and Entity classes, for example. The EntityManager class is supposed to manage all entity objects during application run-time. This includes tasks like creating, deleting and accessing existing entity objects. IEntity is an interface class and provides the very basic traits of an entity object, such as an object-identifier and a (static) type-identifier. It is static because it won’t change after program initialization. This type-identifier is also consistent over multiple application runs and may only change if the source code was modified.

class IEntity
{
    // code not complete!
    EntityId m_Id;

public:
    IEntity();
    virtual ~IEntity();

    virtual const EntityTypeId GetStaticEntityTypeID() const = 0;

    inline const EntityId GetEntityID() const { return this->m_Id; }
};

The type-identifier is an integer value and varies for each concrete entity class. This allows us to check the type of an IEntity object at run-time. Last but not least comes the Entity template class.

template<class T>
class Entity : public IEntity
{
    // code not complete!
    void operator delete(void*) = delete;
    void operator delete[](void*) = delete;

public:
    static const EntityTypeId STATIC_ENTITY_TYPE_ID;

    Entity() {}
    virtual ~Entity() {}

    virtual const EntityTypeId GetStaticEntityTypeID() const override { return STATIC_ENTITY_TYPE_ID; }
};

// constant initialization of entity type identifier
template<class T>
const EntityTypeId Entity<T>::STATIC_ENTITY_TYPE_ID = util::Internal::FamilyTypeID::Get();

This class’s sole purpose is the initialization of the unique type-identifier of a concrete entity class. I made use of two facts here: first, constant initialization[10] of static variables, and second, the nature of how template classes work. Each version of the template class Entity will have its own static variable STATIC_ENTITY_TYPE_ID, which in turn is guaranteed to be initialized before any dynamic initialization happens. The expression “util::Internal::FamilyTypeID::Get()” implements a sort of type-counter mechanism: internally it increments a counter every time it gets called with a different T, but always returns the same value when called with the same T again. I am not sure if that pattern has a special name, but it is pretty cool 🙂

At this point I also got rid of the delete and delete[] operators. This way I made sure nobody would accidentally call these guys. It also – as long as your compiler is smart enough – gives you a warning when trying to use the new or new[] operator on entity objects, as their counterparts are gone. These operators are not intended to be used, since the EntityManager class will take care of all this.

Alright, let’s summarize what we just learned. The manager class provides basic functionality such as creating, deleting and accessing objects. The interface class functions as the very root base class and provides a unique object-identifier and type-identifier. The template class ensures the correct initialization of the type-identifier and removes the delete/delete[] operators. This very same pattern of a manager, interface and template class is used for components, systems and events as well. The only, but important, way these groups differ is how the manager classes store and access their objects.
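The type-counter mechanism can be sketched in a few lines. This is an assumption about how such a FamilyTypeID helper can be built, not necessarily the article's exact implementation: every distinct T passed to Get() draws the next counter value, while repeated calls with the same T always return the same id.

```cpp
#include <cstddef>

using TypeID = std::size_t;

// One counter per "family" (e.g. all entities, all components, ...).
template<class FamilyBase>
class FamilyTypeID
{
    static TypeID s_Counter;

public:
    template<class T>
    static TypeID Get()
    {
        // the local static is initialized exactly once per distinct T
        static const TypeID STATIC_TYPE_ID = s_Counter++;
        return STATIC_TYPE_ID;
    }
};

template<class FamilyBase>
TypeID FamilyTypeID<FamilyBase>::s_Counter = 0;
```

Keeping a separate counter per family base class means entity type ids and component type ids each form their own dense 0..N range, which is convenient for indexing into per-type arrays.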

Let’s have a look at the EntityManager class first. Figure-04 shows the overall structure of how things are stored. 


Figure-04: Abstract view of the EntityManager class and its object storage.

When creating a new entity object, one uses the EntityManager::CreateEntity<T>(args…) method. This public method first takes a template parameter, which is the type of the concrete entity to be created. Secondly, it takes an optional number of parameters (can be empty) which are forwarded to the constructor of T. Forwarding these parameters happens through a variadic template[11]. During creation the following things happen internally:

  1. The ObjectPool[12] for entity objects of type T is acquired; if this pool does not exist, a new one is created.
  2. New memory is allocated from this pool; just enough to store one T object.
  3. Before actually calling the constructor of T, a new EntityId is acquired from the manager. This id is stored along with the previously allocated memory in a look-up table, so we can look up the entity instance later by that id.
  4. Next, the C++ placement-new operator[13] is called with the forwarded args… as input to create a new instance of T.
  5. Finally, the method returns the entity’s identifier.
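The steps above can be sketched as a self-contained manager. The names and members here are hypothetical simplifications: plain malloc stands in for the per-type ObjectPool, and destruction/cleanup is omitted.

```cpp
#include <cstddef>
#include <cstdlib>
#include <new>
#include <unordered_map>
#include <utility>

using EntityId = std::size_t;

class EntityManager
{
    std::unordered_map<EntityId, void*> m_EntityLUT; // EntityId -> object memory
    EntityId m_NextId = 0;

public:
    template<class T, class... Args>
    EntityId CreateEntity(Args&&... args)
    {
        // (1)+(2): acquire memory for one T; a real implementation would ask
        // the ObjectPool for entities of type T instead of calling malloc
        void* objectMemory = std::malloc(sizeof(T));

        // (3): acquire a fresh id and store it with the allocated memory
        EntityId id = m_NextId++;
        m_EntityLUT[id] = objectMemory;

        // (4): construct T in-place, forwarding the caller's arguments
        new (objectMemory) T(std::forward<Args>(args)...);

        // (5): return the entity's identifier
        return id;
    }

    template<class T>
    T* GetEntity(EntityId id)
    {
        auto it = m_EntityLUT.find(id);
        return (it != m_EntityLUT.end()) ? static_cast<T*>(it->second) : nullptr;
    }
};
```

The important property is that the caller only ever holds the EntityId; the raw pointer stays inside the manager's look-up table.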

After a new instance of an entity object has been created, you can access it via its unique object identifier (EntityId) and EntityManager::GetEntity(EntityId id). To destroy an instance of an entity object, one must call the EntityManager::DestroyEntity(EntityId id) method.

The ComponentManager class works the same way, plus one extension. Besides the object pools for storing all sorts of components, it must provide an additional mechanism for linking components to their owning entity objects. This constraint results in a two-step look-up: first we check if there is an entry for a given EntityId; if there is one, we check if this entity has a certain type of component attached by looking it up in a component list.

Figure-05: Component-Manager object storage overview.

Using the ComponentManager::CreateComponent<T>(EntityId id, args…) method allows us to add a component of type T to an entity. With ComponentManager::GetComponent<T>(EntityId id) we can access an entity’s components, where T specifies what type of component we want to access. If the component is not present, nullptr is returned. To remove a component from an entity, one uses the ComponentManager::RemoveComponent<T>(EntityId id) method. But wait, there is more. Another way of accessing components is the ComponentIterator, which lets you iterate over all existing components of a certain type T. This is handy if a system like the “Physics-System” wants to apply gravity to all “Rigidbody-Components”.
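The two-step look-up can be sketched with nested maps. The names here are hypothetical, and the real ComponentManager stores typed component objects in pools rather than void pointers; only the lookup shape is illustrated.

```cpp
#include <cstddef>
#include <unordered_map>

using EntityId        = std::size_t;
using ComponentTypeId = std::size_t;

class ComponentLookup
{
    // EntityId -> (component type -> component instance)
    std::unordered_map<EntityId,
        std::unordered_map<ComponentTypeId, void*>> m_EntityComponentMap;

public:
    void Add(EntityId id, ComponentTypeId type, void* component)
    {
        m_EntityComponentMap[id][type] = component;
    }

    void* Get(EntityId id, ComponentTypeId type)
    {
        // step 1: is there an entry for this entity at all?
        auto entityIt = m_EntityComponentMap.find(id);
        if (entityIt == m_EntityComponentMap.end())
            return nullptr;

        // step 2: does the entity have a component of the requested type?
        auto componentIt = entityIt->second.find(type);
        return (componentIt != entityIt->second.end()) ? componentIt->second : nullptr;
    }
};
```

Returning nullptr on a failed lookup at either step is exactly what lets callers treat "entity gone" and "component never attached" uniformly.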

The SystemManager class does not have any fancy extras for storing and accessing systems. A simple map is used to store each system along with its type-identifier as the key.

The EventManager class uses a linear-allocator that manages a chunk of memory used as an event buffer. Events are stored in that buffer and dispatched later. Dispatching the events clears the buffer so new events can be stored. This happens at least once every frame.
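The buffer-then-dispatch mechanism can be sketched as follows. The types are simplified assumptions: a std::vector stands in for the linear-allocator backed buffer, and listeners are plain callbacks instead of IEventListener objects.

```cpp
#include <functional>
#include <utility>
#include <vector>

struct Event { int type; };

class EventManager
{
    std::vector<Event> m_EventBuffer; // events stored here until dispatch
    std::vector<std::function<void(const Event&)>> m_Listeners;

public:
    void Subscribe(std::function<void(const Event&)> listener)
    {
        m_Listeners.push_back(std::move(listener));
    }

    // events are only buffered here, not delivered immediately
    void SendEvent(const Event& e) { m_EventBuffer.push_back(e); }

    // called at least once every frame
    void DispatchEvents()
    {
        for (const Event& e : m_EventBuffer)
            for (auto& listener : m_Listeners)
                listener(e);

        m_EventBuffer.clear(); // make room for the next frame's events
    }
};
```

Deferring delivery like this decouples the sender from the receivers: a system can fire events mid-update without re-entering other systems until the next dispatch point.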

Figure-06: Recap of the ECS architecture overview.

I hope at this point you have some idea of how things work in my ECS. If not, no worries; have a look at Figure-06 and let’s recap. As you can see, the EntityId is quite important, as you use it to access a concrete entity object instance and all its components. All components know their owner; that is, having a component object at hand, you can easily get the entity by asking the EntityManager class with the owner-id stored in that component.

To pass an entity around, you would never use its pointer directly; instead you can use events in combination with the EntityId. You could create a concrete event, let’s say “EntityDied”, and this event (which must be a plain old data object) has a member of type EntityId. Now, to notify all event listeners (IEventListener) – which could be entities, components or systems – we use EventManager::SendEvent(entityId). The event receiver on the other side can then use the provided EntityId and ask the EntityManager class for the entity object, or the ComponentManager class for a certain component of that entity. The reason for this detour is simple: at any point while the application is running, an entity or one of its components could be deleted by some logic. Because you do not want to clutter your code with extra clean-up logic, you rely on the EntityId instead. If a manager returns nullptr for that EntityId, you know the entity or component no longer exists.

The red square, by the way, corresponds to the one in Figure-01 and marks the boundaries of the ECS.

 

The Engine object

To make things a little more comfortable, I created an engine object. The engine object ensures easy integration and usage in client software. On the client side, one only has to include the “ECS/ECS.h” header and call the ECS::Initialize() method. A static global engine object will then be initialized (ECS::ECS_Engine) and can be used on the client side to access all the manager classes. Furthermore, it provides a SendEvent method for broadcasting events and an Update method, which automatically dispatches all events and updates all systems. ECS::Terminate() should be called before exiting the main program. This ensures that all acquired resources are freed. The code snippet below demonstrates the very basic usage of the ECS’s global engine object.

#include <ECS/ECS.h>

int main(int argc, char* argv[])
{
    // initialize global 'ECS_Engine' object
    ECS::Initialize();

    const float DELTA_TIME_STEP = 1.0f / 60.0f; // 60hz
    bool bQuit = false;

    // run main loop until quit
    while (bQuit == false)
    {
        // Update all Systems, dispatch all buffered events,
        // remove destroyed components and entities ...
        ECS::ECS_Engine->Update(DELTA_TIME_STEP);

        /*
        ECS::ECS_Engine->GetEntityManager()->...;
        ECS::ECS_Engine->GetComponentManager()->...;
        ECS::ECS_Engine->GetSystemManager()->...;
        ECS::ECS_Engine->SendEvent<T>(...);
        */

        // more logic ...
    }

    // destroy global 'ECS_Engine' object
    ECS::Terminate();
    return 0;
}

Conclusion

The Entity-Component-System described in this article is fully functional and ready to use. But, as usual, there are certainly a few things to improve. The following list outlines just a few ideas I came up with:

  • make it thread-safe,
  • run each system, or a group of systems, in threads with respect to their topological order,
  • refactor event sourcing and memory management and include them as modules,
  • serialization,
  • profiling.

I hope this article was helpful and you enjoyed reading it as much as I did writing it 🙂 If you want to see my ECS in action check out this demo:

[embedded content]

The BountyHunter demo makes heavy use of the ECS and demonstrates the strengths of this pattern. If you want to know how, have a look at this post.

So far …

Cheer’s, Tob’s.


References

[1] https://en.wikipedia.org/wiki/Entity-component-system
[2] http://gameprogrammingpatterns.com/component.html
[3] https://www.gamedev.net/articles/programming/general-and-gameplay-programming/understanding-component-entity-systems-r3013/
[4] https://github.com/junkdog/artemis-odb/wiki/Introduction-to-Entity-Systems
[5] http://scottbilas.com/files/2002/gdc_san_jose/game_objects_slides.pdf
[6] https://docs.microsoft.com/en-us/azure/architecture/patterns/event-sourcing
[7] https://sourceforge.net/p/log4cplus/wiki/Home/
[8] https://www.gamedev.net/articles/programming/general-and-gameplay-programming/c-custom-memory-allocation-r3010/
[9] https://github.com/mtrebi/memory-allocators
[10] http://en.cppreference.com/w/cpp/language/constant_initialization
[11] https://en.wikipedia.org/wiki/Variadic_template
[12] http://gameprogrammingpatterns.com/object-pool.html
[13] http://en.cppreference.com/w/cpp/language/new


Rovio shares fall by 20 percent thanks to rising UA costs

Rovio has seen its share price drop by 20 percent two months after its IPO, with declining profits and rising user acquisition (UA) costs the likely culprits. 

Shares began trading at around $13.60 when the Finnish outfit first went public, but are now going for around $11 per share.

The company’s financials for the three months ended September 30 show that revenues actually increased 41.2 percent year-over-year to $83.8 million. 

However, user acquisition costs rose by 308.7 percent to $26.3 million, while profits fell by 70 percent to $1.9 million. 

Meanwhile, revenue in Rovio’s games division rose by 40 percent, with the company explaining it received a monetization boost from its “top games.”

The company believes its games line-up will start to reap the benefits of its recent UA investments in around 8 to 10 months. 

“We significantly increased our investments in user acquisition, and at the same time in future revenues, for our top-performing games: investments increased to $26 million in the third quarter, which, as expected, reduced the profitability of the games business unit for the third quarter,” explained Rovio CEO Kati Levoranta.

“We expect the payback time for these investments to be 8 to 10 months. In August, Rovio launched a new game, Angry Birds Match, which has promising performance indicators and the potential to become one of Rovio’s best performing games.”


Blog: Creating a machine learning algorithm to illustrate Magic cards

The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.


[This post originally appeared on my blog]


Urza’s Dream Engine is a neural network project I’ve been working on for a few months. It’s a bot that creates art in the style of Magic: The Gathering cards. It began as an effort to create art to go with the cards created by the amazing Twitter bot, RoboRosewater, and grew into its own beast. It was my first foray into machine learning as well as my first project focused on producing still images.

You can see some of the output on the site for the project here: http://andymakes.com/urzasdreamengine/

Although Urza’s Dream Engine grew a life of its own, it culminated in a booster draft, held at Babycastles in NYC on Nov 18th 2017, using the RoboRosewater cards paired with my bot’s art. This was a game played by humans, but designed and illustrated by machines. In this post I’ll go through my process step by step starting from the initial idea to the Babycastles event.

The Urza’s Dream Engine site has downloads for the images, cards, and print-and-play booster packs used at the Babycastles event.

Conception

This whole project started because I was enamored with a twitter bot called @RoboRosewater. I certainly wasn’t the only one; as of this writing, the account has 23,500 followers. RoboRosewater is a neural network bot created by Chaz and Reed that has been running since 2015. Each day it posts a new Magic: The Gathering card generated by a machine learning algorithm. Basically, it is a program that has been trained on all of the existing MTG cards (roughly 30,000 over the course of the game’s two decade run) and which now attempts to create new ones in the same vein. The bot itself only generates the text for these cards.

Most of the cards it makes are borderline nonsense (but still very fun), but a surprising amount of the output is actually playable. Maybe broken in the sense that the card is too strong or too weak, but actually legal within the (fairly complex) rules of the game. Even these legal cards are still alien, like a glimpse into a bizarre alternate reality of the game. Some are amusingly pointless (like a card that states that players must pay the cost of the spells they cast). Others staple three seemingly unrelated abilities together. And some have genuinely interesting effects that as far as I know have never been used in the game’s long run. If you like Magic or bots I highly recommend following this account.


Given my interest in exploring unusual parts of the game in my Weird: MTG series, holding a RoboRosewater booster draft seemed like a natural fit. I had something of a convenient problem though: the cards posted on the RoboRosewater account use stock images of computers as their art. They basically all looked the same, and trying to play with them would be hugely difficult as card art is an important identifying feature, especially when playing with a new set. Imagine having to read 20 identical cards on a table every time you are thinking about your next move.

I say a “convenient” problem because I am usually keen to try a new project, and the challenge of creating illustrations for these cards sounded exciting. The text was generated via automation, so I knew the images should be as well. Early on I thought about trying to create an openFrameworks sketch that would cut up and recombine the existing art from the game, but it didn’t feel quite right. The cards were created by a neural network, so the art should be as well. I’d never messed around with neural networks myself, but I was as blown away by Google’s Deep Dream algorithm as anybody else, so now seemed like the time to learn!

Getting Started

I’m a total script kiddie when it comes to machine learning. People get their PhDs creating and working with these techniques. I’m a game designer who likes to tinker. As a result, I tried setting up a lot of different neural nets on my laptop (non-GPU enhanced, but I wasn’t going to let that stop me). With most, I ran into some roadblock after 5 or 6 hours that couldn’t be solved by importing a new Python library, and I’d move on to the next one. Finally I had success with a Tensorflow implementation of Deep Convolutional Generative Adversarial Networks by Taehoon Kim. I followed the instructions on the GitHub page and trained the bot on some sample images of faces. After tweaking some of the source code to work on my machine, I produced a set of my own slightly-off faces and I was off to the races.


Getting the Data

Part of how neural networks function is that before they can attempt to replicate a style, they must first train on an existing dataset to learn what the thing is. This dataset is no trivial thing. In order to work well, the data must be ordered (being of a similar type and presented in the same way) and there must be a lot of it. The sample set that came with Kim’s Tensorflow implementation contained 200,000 images of celebrities – all from the neck up, facing the camera and cropped to the same size.

Fortunately, the sheer volume of data associated with Magic: The Gathering is huge, probably rivaled only by major sports like baseball. It is the first & longest running collectible card game. It first launched in 1993, and Wizards of the Coast, the company that produces it, has released multiple new sets every year since then. This means that there are currently over 30,000 unique Magic cards. This pool, while small in terms of what would be best for neural net generated art, is still larger than any other game I could hope for. Of course, the art in the game is also not nearly as uniform as the celebrity data set I described above, but I didn’t need perfect output. I was ok with things getting weird and I wanted to see what the bot could do.


First things first, I needed to get that data. Luckily, I am not the first person who wanted to scrape info from all existing MTG cards. There is a piece of software called Gatherer Extractor that will download text and images within certain parameters from Wizards’ official MTG online card database, Gatherer. This software, released for free by MTG Salvation user chaudakh, prepares an Excel spreadsheet with all of the requested information and can also download the card scans (the image showing the full card). To my delight, it also has a check-box to crop the card image to just the art, saving me a Python or openFrameworks script to do the same thing.


I tested Gatherer Extractor with a single expansion set, and after seeing that it did what I wanted, I set it to work pulling the card text and image of every card in the game. The resulting spreadsheet makes my computer chug a little bit, but it had everything, cross-referenced with the image associated with each card.

Sorting

Card art in Magic: The Gathering is dependent on a lot of things, but there are some game factors that are frequent indicators of aspects of the illustration. Card color is a big one. Red cards tend to lean toward warm colors in their palette, blue cards toward cool colors, etc. Creatures also tend to look different than artifacts, which look different than sorcery spells (or so I thought – more on that in a bit).

Initially I had hoped to have the bot simply train on all of the card text and art so that it could learn the difference between, say, a red creature and a white enchantment itself, but I soon realized that this was more than I could do with my setup (or at least with my experience level). It became clear that the best course of action was to sort the art beforehand by whatever metric I wanted to use, and then have the bot train only on those images. To do this, I built a simple Python script that would let me set some parameters (for instance “blue creatures” or “enchantments”) and would then scan the spreadsheet to find all cards that met the criteria. For each card that fulfilled the search query, the image associated with that card was copied to a new folder to use as the training set.

Early Batches

The first training set I tried was all creatures with the “goblin” creature type. This resulted in a little under 400 images for the bot to train on. After a few hours the results were still rubbish. After more experimentation I would learn that I had made two errors: 400 images was a minuscule dataset, and a few hours was not nearly enough time to train, even with a good data set.


I switched gears and tried white creature cards, figuring that color was the major indicator of the palette used in the art and that creatures would at least all have a main subject in the art (as opposed to sorceries or enchantments). I let it run for four days this time. As it worked, it would spit out a sample image every hour or so. I was able to watch a formless mess turn into something more discernible. (The sample images are produced in an 8×8 grid, so each sheet contains 64 separate images produced by the bot).


This bird-fish giant was the first image produced by Urza’s Dream Engine (then unnamed) that I really loved. It’s still one of my favorites.

At this point, I knew things were working. I messed around with the settings for other groups, but as far as the code I was using, very little changed. The biggest modification was that I shifted from breaking up the images by card type and started using color as the only criteria for sorting. Contrary to my initial assumption, I realized that nearly every card in Magic has a central figure in the art, not just the creature cards. In order to illustrate the magical effect of a spell or enchantment the art shows a central figure either reaping the benefits of or being victimized by the spell in question. Since there was little artistic difference between them, there was no reason to make smaller training sets.


These three cards all represent spells in the game, but the art could just as easily go with a creature, since each one focuses on a character.

Creating and Posting

The next two months or so were spent with the bot quietly running in the background of my laptop (and absolutely destroying my battery life when unplugged). I worked on other projects, most notably finishing up my sequencer toy Bleep Space with musician Dan Friel. During this time I occasionally posted the images the bot created on this Tumblr and on Twitter. The images resonated with a lot of folks and I always enjoyed hearing what people saw in them. The images are abstract enough that while conveying the general vibe of MTG art, they function a bit like a Rorschach test, allowing for many interpretations.

In order to make the images a bit more human-readable, I used a few Photoshop actions that I could run as a batch. The main one took the 8×8 grid of images that the machine-learning bot spat out and cut it into 64 individual images. These images were then blown up slightly to be a bit easier to see and so that they would eventually fit nicely onto the RoboRosewater card frames.

Around this time, the images were featured in an article on Waypoint (The games culture wing of Vice) by Cameron Kunzelman. As I noted in the article, one of the things I was enjoying most about the bot was that it was generating images that seemed to have some of the flowing style of Rebecca Guay, my favorite illustrator for MTG.

On top are Guay’s illustrations for Bitterblossom and Regenerate. On the bottom are two of the pieces created by Urza’s Dream Engine.

Creating The RoboDraft

I was collecting and displaying the images, but I still wanted to do something with them. My background is in interaction, and that hasn’t changed. My original plan was to pair my images with RoboRosewater cards and hold a booster draft with them, and that was still what I wanted to do. My previous Weird MTG events had been much simpler (a booster draft using cards from Legends and Arabian Nights, a tournament only allowing cards from 1994, and a cube draft made of “rare” cards that were worth 25 cents or less). Most of these had involved printing proxies (printed cards used in place of real cards that are too valuable to actually play with), so I had a tool chain in place for that, but the scale of this one was much bigger, and it involved creating the card images before printing them.

A booster draft is a specific type of Magic tournament that involves creating a deck on the fly. The typical tournament type, called “constructed,” involves players building a deck at home with cards they own and bringing it to the tournament. Booster drafts are a type of tournament, known as “sealed,” where players come empty-handed and are given unopened MTG products to build their decks with. The “draft” part of “booster draft” comes from how players build their decks at these tournaments. The players sit in a circle, and each player opens a booster pack (containing 15 cards), selects one card to add to their pool, and passes the remaining cards to the player next to them. This process continues until each player has 15 cards, at which point the players open their next pack. Each player starts with three booster packs, so at the end each player has 45 cards to use in building their deck (not all of the cards need to be used), but because they drafted cards from the packs going around, the card pool for the tournament is actually much larger and players can exercise significant strategy in building their pool of 45 cards.

Being more interested in playing than actual deck-building (and being pretty awful at deck-building), booster draft has always been my favorite format. It also lends itself well to an event that uses cards that nobody owns or could possibly own. It did, however, mean that I had to construct booster packs for players.

Selecting the Cards

The first thing I needed to do was create the pool of cards to draw from. To do this I went through every image that @RoboRosewater had ever posted (roughly 830 at the time). To get started, I sorted the images into four categories. Here are my criteria and a ballpark figure for how many cards posted by the bot fell into each category:

Legal: Cards that could be played as written. I’d say roughly 40% of the cards posted by RoboRosewater were legal.


Legal but Functionless: Cards that could be played but effectively did nothing. While these are fun to see printed, they essentially represented dead cards in a booster pack. This category was small, making up only about 5% of the cards. (Yes, there are niche cases where these cards would do something, but the use case is so narrow as to effectively be 0 in a sealed format)


Nearly Legal: Cards that were not technically legal within the rules of the game, but which had only one obvious interpretation. These had to be things that could be fixed with a slight edit that would not depend on me making design decisions for the card. I was pretty strict about not editorializing, so anything that could potentially be interpreted multiple ways was pushed to the next category. The result was that this also made up about 5% of the total cards.


Illegal: Cards that simply did not work within the rules of the game. These cards tend to be collections of Magic terms that do not work together in any comprehensible way (they also tend to come from a shorter training period, as indicated by the art on the card). The longer the text on the card, the more likely it was to veer into this territory. This was the biggest category, with about 50% of the posted cards landing here. While these cards can be great fun to see – the close-but-no-cigar nature of their wording exists in the same vein as videos of robots falling down – they wouldn’t work for this event.

image

Power level did not factor into the decision making at all (besides avoiding cards that basically did nothing). My goal was not to create a balanced environment, and in fact, I think that trying to “correct” for the weird power levels created by the bot would have undermined the authenticity of the event. All of the cards in the “legal” or “nearly legal” categories were initially included, even cards like Teferi’s Curse that threatened to stall out games.

A quick note: it may seem at first glance that having less than half of the cards be legal suggests some kind of failing on the part of RoboRosewater and its creators, but this simply isn’t true. Magic: The Gathering is a deeply complex game, with many rules and card abilities accumulated over the years. The fact that RoboRosewater can create any cards that are legal is astounding. The fact that it can do it on a regular basis is all the more impressive. I would be over the moon if I made a similar project for any game that was able to succeed 40% of the time. By comparison, I included less than 10% of Urza’s Dream Engine’s total output on the website I made for it. Furthermore, the purpose of the bot is to create interesting cards. They were never meant to be played, so my metrics for what would work for this event are not the same ones the bot’s creators were using.

After sorting the cards into these groups, I broke them down by color. Booster drafts work best when each color is represented somewhat equally. I found that black was the weakest with only 40 viable cards, while white and green were overrepresented with around 60 each. Although I was reluctant to apply my own design decisions to the game, I decided that removing some white and green cards in order to keep some color balance was a good idea. I bumped a few of the cards that were in the “nearly legal” category from those colors and pulled a few more that were borderline functionless. This was the extent of my own design decisions for the draft.

The result was a 308-card set. Definitely a large set, but not so big that players in a draft wouldn’t see some of the same cards go around more than once. Because the generated cards have no rarity level (typically common, uncommon, rare & mythic), all cards were equally likely to show up in a given booster pack.
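With no rarity slots, filling a pack reduces to uniform sampling without replacement from the full pool. Here is a minimal sketch of that idea (the placeholder card names are mine, not the set's):

```python
import random

# A stand-in pool: 308 placeholder card names standing in for the real set.
pool = [f"card_{i}" for i in range(308)]

def build_pack(pool, pack_size=15, rng=random):
    # Uniform sampling without replacement: every card is equally likely,
    # and no card appears twice within a single pack.
    return rng.sample(pool, pack_size)

pack = build_pack(pool)
```

Sampling without replacement is what makes "no duplicates in a pack" fall out for free, rather than needing a separate check.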

Putting the Cards Together

Now that I had all of the RoboRosewater cards sorted by color (or type in the case of artifacts and land), it was time to combine them with the images that Urza’s Dream Engine had created. Once again, I wanted to maintain automation’s control over the output. My purpose in this process was to curate and facilitate that automation. I wanted a program to randomly pair cards with art of the same type.

To do this, I built an openFrameworks application that would accept an input card image folder and an input art image folder. Once given these folders (for example, red RoboRosewater cards & red card images produced by Urza’s Dream Engine), it would randomly select from the two pools, combine the images and save the output as a new image. As it did this, it would remove both the card and image from the pool to guarantee that there would be no repeats.
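The pairing logic can be sketched in a few lines. This is a Python analogue of what the openFrameworks app does, not the app's actual code, and the file names are hypothetical:

```python
import random

def pair_cards_with_art(cards, art, rng=random):
    """Randomly pair each card with a piece of art, removing both from
    their pools as they are used so that nothing repeats."""
    cards, art = list(cards), list(art)  # copy so the caller's lists survive
    pairs = []
    while cards and art:
        card = cards.pop(rng.randrange(len(cards)))
        image = art.pop(rng.randrange(len(art)))
        pairs.append((card, image))
    return pairs

# Hypothetical file names: one color's cards paired with that color's art.
pairs = pair_cards_with_art(["card_a.png", "card_b.png"],
                            ["art_1.png", "art_2.png"])
```

Popping from both pools is what guarantees the no-repeats rule; the randomness only decides which card meets which image.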

image

As I mentioned earlier, the abstract images produced by Urza’s Dream Engine lend themselves to multiple interpretations. This wound up being a serious boon when paired with the cards. Between RoboRosewater’s liberal use of abilities and fascinating names, and my own bot’s hazy images, nearly every pairing felt like it made sense, even though it was random beyond the color of the card. Although I knew the human mind’s drive to seek connections and narrative is an amazing thing, I was pleasantly surprised by just how right everything felt.

image

Printing the Boosters

Luckily, this was not my first Weird MTG event that involved making proxy boosters. I’ve created a tool for myself that does exactly this, and it’s available for free. The tool draws cards from input folders and creates PDFs of print-and-play-ready booster packs. The printable cards include a small number in the bottom corner to identify which pack they belong to. The purpose of this is to preserve the randomness generated by the program, including the rule that no pack contains duplicates of the same card.

image

I printed a total of 72 packs, enough for 24 players. The resulting stack had some real heft to it.

image

Cutting and Sorting

I took the whole thing to Kinkos and got to work at the paper cutter. It took a little over an hour to cut everything out.

image

Once I got home, my partner Jane and I watched shows while sleeving cards. Because these proxy cards are printed on paper, they do not have the weight to be played with by themselves. To get around this, they must be put in a protective card sleeve using a real card as a backing to give them the necessary weight. These sleeves are very common in the game, and are typically used to allow players to use valuable cards without worrying about scuffing them or having something spilled on them. I buy them in bulk.

image

Once they were sleeved (at the expense of nearly every common Magic card I own), they were ready to be sorted into their packs. This is where the little ID number at the bottom of the card comes in handy. Because I am usually cutting multiple sheets at a time, the order gets a little shuffled: a card winds up next not to its neighbors on its own sheet, but to the card that occupied the same position on the sheet above or below it. I started by grouping everything by its tens place (0-9, 10-19, etc.).
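That first sorting pass is just bucketing IDs by their tens digit; a quick sketch with made-up card IDs:

```python
def group_by_tens(card_ids):
    """Bucket printed card IDs into piles of ten: 0-9, 10-19, 20-29, ..."""
    piles = {}
    for cid in card_ids:
        piles.setdefault(cid // 10, []).append(cid)  # integer division picks the pile
    return piles

# Cards come off the cutter shuffled across neighboring sheets,
# so the IDs arrive in no particular order.
piles = group_by_tens([3, 17, 12, 8, 25, 19])
# piles[0] holds 3 and 8; piles[1] holds 17, 12, 19; piles[2] holds 25
```

Once everything is in a pile of ten, finding each card's exact pack is a much smaller search.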

image

Once they were in more manageable piles, I got to work placing them into their actual packs. At the end I had 72 stacks of 15 cards for 1080 total cards, enough for 24 players to draft. After counting each pile to make sure it had the right number, I put them in baggies that would act as surrogate wrappers. I packed up a bunch of basic lands along with them (for players to use when building their decks) and the setup for the event was complete.

image

Contact with RoboRosewater

In the week leading up to the event, I posted about it a lot. I was having great fun creating sample packs that not only showed the Urza’s Dream Engine art, but also asked players to think about the RoboRosewater cards in a game context rather than just as abstract individual cards. I also made sure to tag RoboRosewater in these posts. I had previously attempted to reach out to the developers both on Twitter and on MTG Salvation, the forum where the bot had originally been posted. Neither had any info about who actually made it, and I wanted to be sure that I could credit them and that I had their blessing.

Luckily, a few days before the event at Babycastles, I received a Twitter DM from the RoboRosewater account asking if there would be any pictures of the event. I was thrilled to hear from them, and honestly a bit starstruck. I was happy to know that they approved of the event and was glad for the opportunity to ask about how they wanted to be credited. I want to give a huge thanks to Chaz and Reed for making the bot that got the ball rolling on this project.

Playmats

As a prize to give away at the Babycastles event, I ordered two custom playmats from Inked Gaming with a collage of some of the images generated by Urza’s Dream Engine. It’s a small thing, but it was fun to see them, and I hope they’ll get some use from the folks who won them.

image

The Draft

The event itself was fantastic. I knew the cards would be interesting to play, but I wasn’t sure they would be fun. It turns out that as weird as the draft environment was, it was also very playable. The set offered multiple viable strategies both for deck building and playing. There are a few leftover packs and I’m looking forward to doing it again. A photographer, Lippe, took some lovely pictures of the event.

image

image

image

image


Closing Remarks

Magic is a game that I love, and this was a fascinating way to interact with it. It was also my first glimpse into machine learning and image generation as a whole. I’m very pleased with the results, both of Urza’s Dream Engine itself and of the booster draft it made possible. Finally, I want to extend another thanks to Chaz and Reed for creating RoboRosewater and giving me a tool to build around.