Sunday, September 26, 2010

Thinking Inside the Box

The word "innovation" gets thrown around a lot in the gaming community. "We want innovation," people will cry, "We want new experiences and fresh gameplay concepts!" And, contrasted with this, people will lament the procession of samey first-person shooters and motion-controlled minigame extravaganzas which flood today's marketplace. The assertion that innovation in dead in big-budget game development is, if hyperbolic, increasingly common even by industry analysts.

So what's the problem? Are the cries for innovation really falling on deaf ears? Is there really no way to get high-profile game developers out of this rut they're in? I say "No!" on both counts. The real problem here is that, for all the ranting and raving done on the subject, a lot of people don't seem to put much thought into what innovation actually is or, more importantly, how it comes about. If you ask me, the sameness of modern video games comes from the fact that modern game designers have too much freedom to do what they want as opposed to too little. Innovation comes about when you have restrictions to work around and sacrifices to make.

Imagine for a second that you had a magical pantry. (Just...just go with me, here.) In this pantry, you could get any kind of ingredient you wanted and thus make any kind of food imaginable. On the surface, you might believe that this would allow you to create new and innovative dishes every night, but think about it for a second. If you had access to every ingredient on Earth, why would you ever choose low-quality goods? Why would you ever bother with ingredients you don't like? If you hate eggplant, why would you use eggplant when you have so many other options you do like?

Now imagine a second magical pantry. This pantry also contains any ingredient imaginable, but only a few of them at a time. Every night the contents are randomized, so you might have a bunch of ingredients you like or you might not. And that, my friends, is when innovation really kicks in. One night you might have eggplant as your only available vegetable, so you'd have to really think about how to make it work. Maybe you could try cooking it a different way than normal or maybe you try covering it in a really rich sauce, but the fact that you're restricted in what you can use is forcing you to be creative.

Modern, big-budget game designers are a lot like the cook with that first pantry. Between the massiveness of their budgets and the beefiness of modern hardware, they have all the resources they could possibly want when designing a game. And, as such, they just continue to make the same "comfort food" they like best almost every time. Oh sure, some days they may feel a little adventurous and cook up something new, but they've no real reason to.

However, things weren't always like this. Going back to the NES, Super Nintendo and even Playstation eras, game design was more like the second scenario. Game designers had to fight against the upper limits of the hardware every time they created a game, and this forced them to think of innovative ways to do more with less. If you think back on all the old "classic" games, there wasn't a whole lot to them. In Super Mario Bros. you run, you jump, you duck and you occasionally find one of three power-ups. It's a simple but very elegant core of gameplay which the designers could iterate outward into a million different challenges, all pulling from the same relatively small set of tiles and enemies. The developers surmounted the limitations of the hardware by focusing on making core mechanics that were fun and versatile enough to stand the test of time.

Another example of innovation through limitations would be the first Silent Hill game. In many games of the Playstation and Nintendo 64 era, "draw distance" was a tricky problem. The hardware being what it was, the system just couldn't render that big an area at once, so the player always had to deal with a sort of "fog" blocking their view of far-away objects. When faced with this, Silent Hill's developers made a brilliant move and turned a hardware limitation into a core part of the game's atmosphere. In the haunted town of Silent Hill, the ever-present fog became a big part of gameplay, and the concern that a mutilated beast could come charging out of it at any moment kept the player on edge.

This process of "innovation through limitation" is also the reason you see so much creativity in small, independently-produced games. In the case of these indies, the limiting factor isn't the hardware, but rather the money. Indie start-ups only have so much money and time they can pour into a game, so they need to put a lot of thought into how to get the most bang for their meager buck. Just like in the old days, this means building an engaging (and usually unique) core to your gameplay which you can iterate out any number of ways. This is why you see indies taking advantage of things like procedurally-generated content and abstract or retro-stylization in their games. These elements which may be considered too strange or bland for big-budget studios are, like the eggplant, the ingredients with which indies are forced to work to get results.

"So that's a nice theory and all," you might be saying, "but I thought you said innovation isn't dead in big-budget studios. If just yelling at them to innovate won't work, what should we yell at them to do?" I'll admit it does sound pessimistic to say that the very freedoms game developers have is stifling their creativity, but I think that innovation through limitation can still drive the industry forward. It's all about following one simple rule...

DEMAND QUALITY!

Just think for a second how much the gaming community lets developers get away with. We have games that are glitched to the point of unplayability at launch and only fixed by patches down the line. We have games where human character models get loaded onto birds, ragdoll physics bug out for no reason and...I don't even know...yet we don't seem to care. We don't complain about getting obviously unfinished and untested games. We just let developers kinda do...whatever, and this is exactly the problem. As it is, a lot of modern games (particularly "open world" ones) feature a crushing amount of seemingly half-done content, as if the developers took every single idea the brainstorming room threw out and tried to cram it in there. The result tends to be a gigantic, incoherent mess that really illustrates the "less is more" concept.

Once upon a time, it was a lot easier to accept a game being buggy or feeling unfinished. The hardware was newer and the developers were smaller, some of them working with budgets no higher than the indies of today. But now, with the sheer amount of money and manpower going into AAA titles, there's no longer an excuse, and developers need to start being held more accountable for what they sell.

If we, as consumers, demanded games with actual polish to them, it would once again force developers to seriously consider what to put in and what to leave out. Developers would need to concentrate on having a core set of fun and engaging features (which are easier to playtest) rather than a zillion conflicting, stupid features no testing team in the world could ever do justice to.

Now that hardware has reached a point where just about anything is possible, it's up to us, the audience, to set the limitations which will drive the industry forward. Simply put, it's time for developers to once again start thinking inside the box.

Tuesday, September 7, 2010

Someone Make This: Minion Madness

No matter how many iterations my blogging may take, I will never get tired of having gimmick segments. To that end, welcome to the newest segment of my newly game-focused blog, "Someone Make This," in which I come up with game ideas for people to steal. Specifically, I come up with game ideas in order to illustrate an idea I'd have trouble getting across otherwise.

Today's game is inspired by this article on The Escapist, about "crossplay." Crossplay is the merging of single-player and multiplayer experiences, and it isn't strictly new. People have been playing with the idea of "single player games in which other players control the enemies" for years, and to an extent it's been done. Left 4 Dead, for example, allows people to play as the zombies and harass the protagonists. There have even been variations on Pac-Man which allow other players to be the ghosts. Still, this got me thinking about other ways you could pit players against each other in ostensibly "single player" situations.

I want you to picture a game called "Minion Madness."

Minion Madness is played online. A bunch of people join a game, and their roles are then randomly selected. Specifically, one person will be chosen as the "Hero," another as the "Overlord" and everyone else as "Minions."

From the perspective of the "Hero" the game is a sidescrolling platformer, akin to the Mario series. The goal is to get from one end of the level to the other, avoiding or defeating enemies and collecting powerups along the way. Maps are procedurally generated, but the platforming itself is always fairly simple. The main challenge comes from the enemies who will stand in your way.

This is where the "Overlord" comes in. From the Overlord's perspective, the game is more like a tower-defense strategy game. When the game begins, the Overlord is given two minutes to set down the enemies in the level. The Overlord has limited funds with which to do this, and more powerful enemies cost more coins to place. The Overlord's goal is to prevent the Hero from completing the level. Once all his minions are placed, the Overlord must sit back and watch the Hero run his gauntlet.

When the actual platforming starts, the rest of the players, the "Minions," take control of the enemies which have been placed. The gameplay of a Minion is very limited, and many enemies have one-button controls. An enemy which can only walk back and forth, for example, uses a single button to change direction. Other, more powerful enemies will have more advanced controls to allow the Minion to hinder the Hero more effectively. The goal of a Minion is, of course, to kill the Hero as the Hero comes charging through. What's in it for them? Well, this is where the overall structure of the game comes into play.

Again taking a cue from Mario, each "game" of Minion Madness consists of four levels, or "rounds" if you prefer. If the Hero runs out of lives during any of these rounds, the Overlord's team wins. If the Hero makes it through all four levels, the Hero wins. In addition, each person's performance in a level can help them out in a later one. The Hero, for instance, can collect coins in order to earn extra lives. The Overlord, by contrast, receives all the coins the Hero didn't collect in the level to bolster his funds for the next. (The Overlord also gets a hefty bonus of coins whenever the Hero loses a life.) The Minions don't care about coins, and instead are fighting for "ranks." If a Minion kills the Hero, he or she will become a "Gold-Rank" Minion. The next time that Minion is placed, they will be more powerful than before (faster, more resilient, etc.). As a result, gameplay in each level is very much about powering up for the next level, hopefully raising the stakes each time the Hero makes it through.
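
To make that carry-over economy concrete, here's the sort of end-of-round bookkeeping I'm imagining, again as a throwaway Python sketch. The bonus values and the extra-life threshold are made up; the point is just how little state the game needs to track.

    # End-of-round bookkeeping for the carry-over economy described above.
    # The 50-coin death bonus and 100-coin extra-life threshold are invented.

    LIFE_BONUS_FOR_OVERLORD = 50      # coins the Overlord gets per Hero death
    COINS_PER_EXTRA_LIFE = 100        # coins the Hero needs for one extra life

    def settle_round(level_coins, hero_collected, hero_deaths, killers, ranks):
        """Return the Hero's earned lives, the Overlord's new funds, and Minion ranks.

        level_coins    -- total coins placed in the level
        hero_collected -- coins the Hero actually picked up
        hero_deaths    -- lives the Hero lost this round
        killers        -- ids of Minions who landed a kill
        ranks          -- dict mapping Minion id to "normal" or "gold"
        """
        hero_extra_lives = hero_collected // COINS_PER_EXTRA_LIFE
        overlord_funds = (level_coins - hero_collected
                          + hero_deaths * LIFE_BONUS_FOR_OVERLORD)
        for minion in killers:
            ranks[minion] = "gold"    # gold-rank Minions come back stronger
        return hero_extra_lives, overlord_funds, ranks

    # Example: 400 coins in the level, the Hero grabbed 250 and died once.
    lives, funds, ranks = settle_round(400, 250, 1, ["player7"], {"player7": "normal"})
    # lives == 2, funds == 200, ranks == {"player7": "gold"}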

The fourth and final level of the game is unique in that it has a Boss Chamber at the end. Once the Overlord has distributed his minions, he himself takes control of the boss creature at the end. Depending on how many coins the Overlord has when all is said and done, his boss form will be more or less powerful. Minions don't have much to do with the fight, though their job is obviously to prevent the Hero from ever reaching it. Whoever wins, the game will ultimately start over, with roles being re-assigned.

What I like about this game is that it's a fusion of many largely single-player experiences into a meta-multiplayer game. From the perspective of the Hero, this plays just like a standard platformer, yet it makes the experience much more personal. Most platformers have a "one man against the world" element to them, but this time you're really being pitted against an army of other players, including a living, breathing Mastermind behind it all.

From the perspective of the Overlord this plays very much like a Tower Defense game, but what I like about it is how much it simulates the "evil overlord" experience. You're the one with all the plans, but in the end you must sit back and watch your clumsy and perhaps foolhardy henchmen carry them out. You get to really see what it's like to be a video game supervillain.

For the Minions, it's a somewhat more unique experience, but one I've always kind of wanted to see captured in a game. What's it like to be the little dude on level 1 whose only means of attack is to waddle back and forth, waiting in rapt anticipation for the two or three seconds which will determine your victory or defeat? Maybe I'm alone in wanting to step into that guy's shoes for a moment, but I think it's something that would be interesting to explore. So many multiplayer games, particularly MMORPGs, try to make every player feel like the hero of the story to the point where no one really is. It would be kinda cool for a game to have the balls to say "You're not special. You're a grunt just like 95% of the people here. If you want to be special, you're going to have to earn it."

This is why I think "crossplay" is so interesting. It allows players to step into the shoes of characters who have largely existed as abstractions in video games past. It turns every moment into a real, person vs. person narrative. I don't know if video gamers as a whole would relish the idea of being a Buzzy Beetle for a day, but as far as my personal tastes go, I want someone to make this.

Friday, August 27, 2010

You Mean You Have to Use Your Body? That's Like a Baby's Toy!

So, there's been a lot of confusion lately about who the Kinect is "for." So much confusion, in fact, that even Microsoft can't get their story straight. However, the general consensus seems to be that the Kinect will be focused on making games for "casual" players, which is generally code for simpler, more accessible games suitable for families. More specifically, it means MINIGAMES MINIGAMES MINIGAMES and very little depth.



The thing I don't understand is the assumption that the Kinect's full-body motion controls couldn't be put into a more "hardcore" context. It seems to me that a game in which your every motion can be used as a command would be an excellent way to make games much deeper than the ones we're playing today.

Let's think about it this way. In the average game, discounting something like an RTS, you are controlling a person. The buttons and joysticks on your controller almost all serve as commands for this person. As a result, you could think of the controller as a kind of abstracted way of controlling your game protagonist's body. Now consider the Kinect. The Kinect, at least in theory, has been touted as having the ability to locate your "skeleton" and determine exactly how you're moving. It stands to reason, then, that this "skeleton" could be used as the "skeleton" of a game character. Imagine a game in which your motions correspond 1:1 with the motions of your game hero. You punch, they punch. You duck, they duck. No canned animations, no fudging hitboxes, it's all you.

Think of the possibilities of making a fighting game out of this. I'm picturing a game structured very much like your standard 2D fighting game, only the character you play is using your skeleton, so your movements translate directly to theirs. The designers would have to be clever about it, particularly in determining how the game recognizes things like "blocking" or whether there would be ways to implement super moves like fireballs. See? That's something that would be an actual game, and the depth is...kinda infinite in a way, because your options are limited only by your ability to move.
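
Since I have no idea what the real Kinect toolkit looks like, take this as a pure napkin sketch in Python: assume some hypothetical tracker hands you a dictionary of joint positions every frame, and you have to decide whether the fighter is blocking. The joint names, thresholds, and coordinates are all made up.

    # Napkin sketch of reading a "block" off tracked skeleton joints. This is
    # not any real Kinect API; I'm assuming a hypothetical tracker that gives
    # us joint positions as (x, y, z) tuples in meters each frame.

    def is_blocking(joints, guard_height=0.15, guard_reach=0.35):
        """Call it a block when both hands are raised near head height and
        kept close to the shoulders (a guard), rather than thrown outward."""
        head_y = joints["head"][1]
        for hand, shoulder in (("left_hand", "left_shoulder"),
                               ("right_hand", "right_shoulder")):
            hx, hy, hz = joints[hand]
            sx, sy, sz = joints[shoulder]
            raised = hy >= head_y - guard_height   # hand up near the head
            tucked = abs(hx - sx) <= guard_reach   # hand not extended in a punch
            if not (raised and tucked):
                return False
        return True

    # A made-up frame of joint data, roughly a guard pose.
    frame = {
        "head":           (0.00, 1.70, 2.0),
        "left_shoulder":  (-0.20, 1.45, 2.0),
        "right_shoulder": (0.20, 1.45, 2.0),
        "left_hand":      (-0.15, 1.60, 1.9),
        "right_hand":     (0.15, 1.62, 1.9),
    }
    print(is_blocking(frame))   # True for this pose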

The potential for the Kinect to create things gamers would actually recognize as video games and not just diversions seems obvious to me, with the skeleton mapping alone. So why do developers continue to limit themselves in exploring the hardware's capabilities? I think it could be one of three things.

The first possibility is that developers don't trust themselves. Taking full advantage of dropping a player's skeleton into a game would no doubt be hard. It's a completely different animal from canned animations and would require inventing not only new engines, but new languages in which to give feedback to the player. It would require, y'know, creativity. And of course, with creativity comes the risk of screwing up and failing, and we all know how well that goes over at shareholder meetings.

The second possibility is that the developers don't trust the Kinect. It's been a point of concern just how "ready for primetime" the Kinect technology actually is. It's possible this "skeleton-mapping" I've been going on about doesn't work as well as Microsoft's marketing team would like you to believe. It's possible there are too many issues and hiccups with the technology to make anything deeper than light fluff. If that's the case it's particularly sad, because at least to my eyes the skeleton mapping abilities of the Kinect were all that was elevating it above being a glorified EyeToy.

The third possibility is that the developers don't trust gamers. Specifically, they worry that if they make a game that actually requires standing up and moving, they run the risk of alienating the 300-pound shut-ins whom they view as the ones buying "hardcore games." If this is what's going on, I think it's missing a key point. Hardcore gamers aren't the only people who can appreciate deep, "hardcore" game mechanics. A game can be deep while also being accessible, something that's always seemed to be at the core of Nintendo's design philosophy. Motion controls offer a fantastic opportunity to create a game that's deep but accessible, because the hurdles of memorizing buttons and getting used to joystick controls are put aside. Controls don't get much easier to grasp or deeper to master than "punch to punch."

Since Microsoft apparently hopes the Kinect will last a good five years, developers are looking at a good long time to unlock the hardware's true potential. I just hope they take the time to investigate it, because I really want to play that skeleton-mapped fighting game I was talking about. I am not kidding, here. That's a day-one purchase.

Wednesday, August 25, 2010

Sunshine Syndrome: Get Your Franchise Checked Today

Hello. I am Dr. Lex, and I'm here to talk to you about a very serious and very misunderstood illness. It's a disease which has claimed many a great game, taking franchises in their prime and leaving them with permanent, sometimes crippling black marks on their history. It's torn fanbases apart, ruined game careers, and for no good reason. I know it's a difficult subject to talk about, but the silence must be broken. Today we're going to talk...about Sunshine Syndrome.

The medical definition of Sunshine Syndrome goes thusly: "A condition in which a quality game is poorly received because it fails to live up to the expectations given by the franchise to which it is tied."

It is named for Super Mario Sunshine, an excellent game which was widely considered disappointing for not being the sequel to Super Mario 64 that people wanted. Sunshine Syndrome tends to strike older game franchises, and the more beloved they are, the more at risk they are: the higher fans' expectations of a franchise, the more likely it will be hit with Sunshine Syndrome. To give you a better idea of this condition, let's review a few cases.



Super Mario Sunshine. The game for which the condition is named and one of the more famous cases. Super Mario Sunshine was a wonderful game, beloved by critics but still falling short of fans' high expectations. The game's water-spraying mechanics were fun, but felt out of place in a Mario game. After all, Mario games are about the jumping and platforming, and while this game had both of those, they didn't feel like the center of the game. The release of Super Mario Galaxy drove the last nail into this odd duck's coffin. One wonders how an original IP might have fared with this game's watery antics, but the world will never know.



Ah, Starfox Adventures. Now, while Super Mario Sunshine may be a more famous case, this one is particularly tragic. This game, at the start of its development, was never intended to be a Starfox game at all. It had the Starfox characters and mythos added very late in development in an effort to help the game sell, despite the fact that the game had nothing to do with flying around and shooting stuff like a normal Starfox title. Instead, the entire game centered on Fox going planetside and hitting dinosaurs with a stick. Understandably, this game left fans and critics alike absolutely mystified as to what Rare and Nintendo were getting at, and it was heavily panned. The fact that this was never intended to be a Starfox game in the first place makes this a somewhat more clear-cut instance of Sunshine Syndrome than most, and it should be considered a case study for academic analysis of the condition.

Before we move on, I warn you, this final example is an extreme sufferer and may be difficult to look at. Those of more squeamish constitutions may wish to look away.



Now this...this is just sad. In many cases, franchises suffering from Sunshine Syndrome are able to make a recovery, slowly returning to the type and quality of games their fans expect. Sonic Adventure, however, should be viewed as a cautionary tale on how heavily Sunshine Syndrome can cripple a franchise. Sonic the Hedgehog, once a beloved video game icon, was never the same after his move to three dimensions. The complaints about Adventure, his first 3D outing, were rampant. It "didn't feel like Sonic," it was either "too fast" or "not fast enough," and so on. But, unfortunately, it didn't end there. Sonic Adventure 2 came out, attempting to fix the problems but receiving roughly the same criticism. Then came Sonic Heroes, then Shadow the Hedgehog, and for game after game after that Sonic Team attempted to recapture the magic of the franchise's glory days to no avail, while nostalgia only made the old games look better and the new games look worse by comparison as the years wore on. It reached a point where no one even knew what they wanted out of the franchise anymore, gamer or developer alike. In the most severe stages...well...viewer discretion is advised. While the final chapter has yet to be written on the blue hedgehog, his will be remembered as one of the most protracted bouts of Sunshine Syndrome on record.

So now that you've seen the problem up close, it is time to step back and look at it as a whole. Before we discuss prevention, let's first review the ways you can check a game for Sunshine Syndrome at home. Nothing invasive is required, just a few simple questions answered honestly. Here's a questionnaire:

1. If this game were an original IP with original characters, not tied to any sort of franchise, would this game have been better-received?
2. Do the majority of reviews from fans contain the words "It doesn't feel like (Franchise Name)"?
3. Name what you consider the core gameplay mechanic of this franchise. Do you feel this mechanic is poorly-represented in this game?
4. Did this game have a large discrepancy between review scores from professional game reviewers and reviews from fans? Specifically, were review scores from fans lower on the whole?

If you can answer yes to at least two of these questions with regard to a game you know, it may be a sufferer of Sunshine Syndrome. Do not panic. There is absolutely no need to despair. Many games with Sunshine Syndrome go on to lead happy, productive lives as part of people's game libraries in spite of their condition. Treat them as you would any other game and try not to let your preconceptions cloud your view. Remember that every game is some developer's pride and joy.

And speaking of developers, it is time to discuss ways to prevent Sunshine Syndrome before it starts. As always, prevention begins in the home, and a video game's home is the game studio. The most common cause of Sunshine Syndrome is a feeling by game developers (and their bosses) that games in a popular franchise will automatically garner more interest than new properties. This is often true. People will get excited for sequels to awesome games because they expect more awesomeness. This is, however, precisely the problem with heavy franchising and the reason Sunshine Syndrome exists.

If you're going to release a game as part of a beloved franchise, you need to seriously consider the expectations that will result from such a move, and whether or not you can deliver on those expectations. If you can't, you're putting not only your game but, as we've seen, your entire franchise at risk. Just one instance of Sunshine Syndrome can drive away fans and you may be hurting yourself for games upon games down the line. If you want to make a more modest or experimental game, do the responsible thing and make it a new property.

I know it's frightening, and it will require your marketing team to do actual work outside of shutting down independent documentaries you don't consider "on message," but in reality the risks of disappointing the fanbase one has built around a franchise cannot be ignored or overstated. People will always, always be more open to new IPs than you think they are.

Sunshine Syndrome can strike any franchise at any time, and it's still a threat. The upcoming X-COM game, for example, looks like an especially at-risk case with its move away from turn-based strategy and towards first-person...and blob-fighting...and also the 1950s for some reason. Spread awareness of this condition however you can, for everyone's sake. Let's work together so that there will be no more grieving investors.

Thank you for your time. I'm sorry if this topic was troublesome, but it had to be addressed. I am Dr. Lex...and...yeah, I really didn't think I was going to keep this doctor gimmick up through the entire post.

Old Games and New Names

Two things to talk about, today! Yeah, I said I was gonna tackle "sequelitis" in this installment, but a couple things have come up which I want to address, so that will have to wait for next time.

First up, it's time for ACTION ON-THE-SPOT VIDEO GAME INDUSTRY NEWS! There's been a lot of ruckus recently regarding the business, ethics, and business ethics of selling pre-owned video games. Basically, game companies don't make money on used game sales, and are now doing everything in their power to dissuade people from buying used. As developers lock players out of more and more used-game content with one-time codes and the like, tensions are rising, and people are starting to wonder what the solution to the problem is. Is there a way to encourage people to buy new without coming off as a massive jackass?

I don't know the solution...because frankly, I don't think there is one. Game developers may not like the used game market, but I think they're fighting a battle they cannot win. Developers cannot and will not stop used game sales. How do I know this? Because used game retailers have been doing their damnedest to put people off used games and it still hasn't worked!

Seriously, the used game business was incredibly repugnant even before this. Putting aside the issues I already covered with regard to the constant upselling one experiences in these establishments, buying used is always a huge gamble. There's no quality control in these places. Store clerks can't test EVERY GAME that comes in to see if it runs. The best that happens is that someone checks the back of the disc, confirms there are no obvious toothmarks, and slips it into a little paper cover to get jostled around a bazillion times before it actually reaches you. Then you get a disc with no instruction manual and (sometimes) no freaking box, instead getting a generic case with the words "HAYLOW THREE" scribbled in crayon on the front. You put it in your system and, assuming it boots at all, it plays decently, or at the very least never fails in a repeatable enough way for you to get a refund by bringing it back in.

And despite all this, people...still...do it. Saving ten bucks is so important to some people that they will actually put up with that crap. Do you developers out there really think you can do anything to these people that Gamestop and God haven't already done ten times over? All this "Project Ten Dollar" nonsense is like putting a 5% tax on crystal meth. You're not going to put people off it by doing that.

If you want people to buy new, all you can do is make something worth buying new. Use the hype machine. Make me want your game on day one. You know when I'm going to consider buying a game used? When it's a game I wasn't excited for, so I waited a month to see what the reviews were like before picking it up. If I'm super-hyped about a game, I will run out and buy it at launch (assuming you ship enough for me to do so, of course). Excluding the people for whom used games are an addiction (or an economic necessity), I think a lot of gamers operate the same way. So, surprise I guess, the secret to selling games is...making and marketing good games.

I...I hope that helps?

So that's the first item on the agenda, and the second concerns my blog itself. As you may have noticed, my recent posts have all started centering on video games, and in light of that I'm considering biting the bullet and converting this into a video game news-and-theory blog. That said, as part of the change, I've decided my blog will need a new name in order to get people's attention. There are a few directions I could take this.

I could go the "nonsensical but catchy" route and call my blog something like Flaguzzle or Zimboing. Y'know, one of those titles that betrays absolutely nothing about the content, but is intriguing nonetheless, and is spelled exactly as it sounds so it's easy to enter into a websearch. This tends to be the naming convention used by newsblogs like Kotaku, Destructoid, and all those other places you keep hearing a news story originally broke even though you never actually visit them.

I could also just make the title some random phrase from video game pop culture. I mean it worked for 1up.com, or webcomics like PvP. I could call my blog something like Full Clear or Assisted Aim or...um...Crafting System. Something like that. There's tons of phrases out there, and they all sound about equally like titles. Why, I bet I could just randomly point to a phrase on the back of a video game box and it would work as a blog title. Let's try it...

"A Persistent Internet Connection is Required to Play This Game"

Sounds like a winner to me!

And if I was feeling less creative I could go with one of those lame "informative" titles, like A Gamer's Thoughts or Game Story Discussions. Where's the fun in that, though? The point of a good blog title is to make your content seem more interesting and more important than it actually is. I may be some random nerd on the internet who thinks his outsider opinions on video games are valuable, but ideally my title should leave the impression that I'm a random nerd on the internet who knows his outsider opinions on video games are valuable. Going out there with a blog called "Hey Guys I'm A Gamer, Read This" is not going to get you the obsessive cult of personality which should come with amateur video game journalism. No one's gonna write a creepy Fandom!Secrets post about that mess.

Plus, I have the problem that my blog basically already has the best title ever. It's called "Atomic Chainsaw Apocalypse," a name specifically chosen because it was the most awesome title I could think of. How do you come up with a title cooler than that?

Clearly I will need to think on that. For now...man, I still need to buy Little King's Story. I wonder how much it goes for used.

Tuesday, August 24, 2010

Graphics Schmaphics

Hey, guess what time it is! It's time to spin another Gamasutra article into a vaguely-related tangent about game theory!

Today's article is this one here, in which some EA dude says a bunch of stuff about social gaming I don't care about. However, he does make the point that he thinks video game budgets have reached their peak, and that future games will be given more modest budgets. Now, the way he phrases this, he makes it sound like "more modest budget" means "games for more casual players," and this is the bit I wanted to talk about today.

You don't need a huge budget to make a game that appeals to hardcore video gamers!

This is something I'll never understand. Why do people continue to think that having the best production values ever equates to increased sales, when the opposite has proven true time and time again? I'm not trying to be one of those "APPRECIATE INDIE GAMES YOU PHILISTINES" bloggers, because I don't need to be. People already appreciate low-budget games so long as they have well-crafted gameplay. Case in point: Monday Night Combat, a game that came out this month as a low-budget, fifteen-dollar Xbox Live release. People are loving this game. And why not? It's an awesome (albeit very, very derivative) game with tons of personality. Then compare it with something like Mafia II, which also came out this month, which clearly poured a ton of money into creating a massive and fully-realized city...and manages to be the blandest thing ever, if reviews are to be believed.

A video game only really needs a fun core of gameplay in order to be engaging. I've played games people have coded in their basements for free and had more fun with them than I ever had with Grand Theft Auto 4. For all the gamers and critics out there who praise graphical fidelity and count pixels and all that, I feel like game companies really overestimate how much anybody actually cares about production values.

But the real proof comes from the numbers, and this is something I've never heard brought up before. Has anyone ever noticed that, when two video game consoles go head to head, the one with the better graphics almost always gets outsold by the lower-powered machine? Seriously, almost every time. I mean, let's ask Wikipedia.

Game Boy vs. Game Gear? The Game Gear had better graphics, in color even, but was absolutely decimated sales-wise by the monochrome Game Boy.

In the 16-bit era, you have the SNES outperforming the Sega Genesis, which seems like a victory for the graphical powerhouse...until you realize that both of these systems utterly crushed the "Neo Geo" in sales even though it was a far more powerful system than either. During this console generation, the Jaguar also tried to break into the market and ended up more or less dead on arrival despite being able to process goddamn polygons.

Then you have the Playstation vs. the Nintendo 64. While it's true the N64 wasn't capable of doing full-motion video (at least not very well), in actual in-game terms the N64 consistently had smoother, better-looking 3D graphics...and it was brutally, brutally outsold by the Playstation.

Playstation 2 vs. Xbox vs. Gamecube vs. Dreamcast! The Playstation 2, despite being graphically outclassed by the Xbox and the Gamecube, has sold more than the other three consoles combined, and by a significant margin, too!

And that brings us to today, with the Wii outselling all other consoles all around the world and the DS outselling the PSP despite considerably inferior processing power.

...

THAT...IS...A...GODDAMN...TREND!

I really can't think of a better indicator that gamers, all gamers, prefer games with solid and interesting gameplay over high-fidelity graphics. People don't buy consoles, and they don't buy games, just because of how good they look. If reduced budgets on future high-profile titles lead to a shift away from polygon-boosting and towards an examination of what really matters in terms of making a quality game, I'm all for it.

I know I'm oversimplifying. I know that. But, at the very least, I feel like this is an important jumping-off-point when considering game design theory. Are developers actually shooting themselves in the foot by pumping so much money into these games? Is there really such a thing as a "safe" investment? The mentality of "sequels are safer," which is part of the core of this problem, is something I intend to tackle next time. For now...I think I'm gonna go buy Little King's Story and enjoy the hell out of it.

Thursday, August 12, 2010

Narrative in Video Games and How Not to be Doing It Wrong...Maybe

Follow-up time! So yeah, last time I said that I would follow up my comprehensive dissection of the failings of video game storytelling (which totally didn't spill off on an unintended tangent about critics) with my thoughts on how a video game story could be told well. I actually went back and forth on whether I would write this, primarily because...what the hell do I know? There are tons of articles on the subject of game narrative out there, written by people who...y'know...actually study this stuff and do it for a living. My own experience writing for video games begins and ends with Spore: Galactic Adventures, so who's to say if my thoughts on the subject are valid at all?

But I guess you're still here, so let's get on with it. Here are my personal thoughts on how game storytellers can improve their narrative.

TIP 1: WRITE A WORLD, NOT JUST A SEQUENCE OF EVENTS

This is probably the biggest piece of advice I can give game developers. And yes, I'm speaking to all of the developers. The most important thing about a video game's story is that it be more than just a story. You have an entire world to work with and shape to be a part of your narrative. Use that canvas!

While there are probably better choices, right now I'm going to point to Bioshock as an example of this concept. If you look at it, there's very little "plot" in Bioshock. You're given a few objectives here and there, but for a large majority of the game you're just kind of a rat, alone in a maze and trying to get from point A to point B. However, this game has story in spades. Every area you encounter is full of hints as to what happened in the ruined city. Everything you see and experience ties back to establishing what this city was and what happened to it. In my last post I described a scene wherein I come upon an abandoned house in a video game and intuit what happened there. Bioshock is made of moments like that. I feel like I'm exploring, scavenging, and discovering, all the while learning more and more about this story being told all around me.

The reason it works and the reason it engages the player is that it makes you feel like the protagonist you're supposed to be. If the player gets the sense that their experience is being too tightly controlled (hello, Final Fantasy XIII), it will feel more like watching the characters than inhabiting them. Since your protagonist is the player's connection to the world, that's a bad thing.

TIP 2: GIVE YOUR STORY A MEMORY

(This is something I'm stunned I don't see more often in video games. It's such a simple way to not only make your story more dynamic, but also boost replay value.)

So, let's say I'm playing some Fantasy RPG. In fact, let's pretend it's that game I made up in the last post. From now on, that game is called ExampleQuest. So I'm playing ExampleQuest and I'm at a crossroads. Down one road is the town of Portside, which is under attack by pirates. Down the other road is the town of Ravencroft, which is under attack by zombies. I decide to go to Portside first and, hero that I am, fend off the pirates and kill the Pirate King in a duel. After that, I head to Ravencroft. Outside, two guards stop me.

Guard 1: "Stay back, sir. It's not safe to go any further."
Guard 2: "Bah, I heard this man killed the Pirate King himself. If anyone can solve our problem, he can."

And I'm let through. Now what does that sound like to you? Sounds like railroading, like if I'd gone to Ravencroft first I'd have been turned away. Nope: if I'd gone to Ravencroft first, I'd have gotten this exchange instead.

Guard 1: "Stay back, sir. It's not safe to go any further."
Guard 2: "The man is armed, at least. I suspect he could make it through town. Just...be careful in there, okay?"

And I'm let through.

That was a really long-winded example of a simple thing I'm trying to get across. "Consequence" in video games doesn't have to mean the story changing wildly, or some kind of "good" or "evil" bonus for your actions. Consequence can be as simple as a line of dialog changing, as above, to reflect what you've done. It's so simple, but it can be so powerful.
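
Just to show how cheap this is to pull off, here's that Ravencroft guard exchange written out as a few lines of Python. ExampleQuest is obviously made up, so the flag name is too; the whole "memory" is one boolean the game sets when you win the duel in Portside.

    # The Ravencroft guard exchange from above, driven by one world-state flag.
    # "killed_pirate_king" is the only thing the game has to remember.

    def ravencroft_guard_dialog(world_state):
        lines = ["Guard 1: Stay back, sir. It's not safe to go any further."]
        if world_state.get("killed_pirate_king"):
            lines.append("Guard 2: Bah, I heard this man killed the Pirate King "
                         "himself. If anyone can solve our problem, he can.")
        else:
            lines.append("Guard 2: The man is armed, at least. I suspect he could "
                         "make it through town. Just...be careful in there, okay?")
        return lines

    # Went to Portside first and won the duel:
    print("\n".join(ravencroft_guard_dialog({"killed_pirate_king": True})))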

Probably the most awesome example of this in a real video game is the infamous "court scene" in Chrono Trigger. Basically, at the start of Chrono Trigger, you go to this country fair where you can wander around, have fun, and do a variety of things. However, much, much later in the game, your character finds himself on trial, and the game starts bringing in a bunch of "character witnesses" to testify for or against you. In this scene, the game brings up a whole bunch of things you, the player, did at the fair at the start of the game. A lot of it is really inconsequential stuff you wouldn't think the game would keep track of (like whether you ran into a certain character or she ran into you). It's really powerful, and everyone remembers that scene despite the fact that it really has nothing to do with the gameplay. (You find yourself slated for execution no matter how the verdict goes, for...reasons. I'm not going to summarize the whole game.)

This is such an awesome storytelling device, and a device which is unique to video games. You can't do that in a movie! It's so simple, too. It's as simple as having two sets of dialog here and there. If your writers are too lazy for that, God help 'em.

TIP 3: MAKE CHARACTER DEVELOPMENT A PART OF GAMEPLAY

The mark of a good story, no matter the medium, is that the characters grow and change over the course of it. In a movie or novel, this tends to follow the pattern of a "character arc" with the character learning how to change themselves for the better right at that eleventh hour when it matters. In a video game or television series, this tends to be a bit more free-form, and character development is usually a slow-build over the course of the character's journey. (I'd argue that this slow-build development is actually a bit more true to real life. I mean, how often do your beliefs take a total 180-degree spin just because you had a bad day?)

Most game writers understand the significance of character development from a story standpoint. They'll usually stick some big redemption scene in the game's third act somewhere, with dramatic music and the hero vowing not to let the evil wizard hurt anyone else. Still, I feel this is missing an opportunity, as video games are supposed to be an interactive medium. Give character development some meaningful impact on gameplay.

A lot of people like to criticize "moral choice" systems in games, like the one in InFamous where you unlock more destructive powers the more of a jackass you are and unlock more heroic powers the more virtuous you are. While I feel like these mechanics are often ham-handedly implemented, I don't think the idea at their core is a bad one. The idea is to take away the "Grand Theft Auto Syndrome" where your character is built up as a noble dude in the story...despite how many pedestrians you run down when actually playing. Having your character's personality change to reflect the way you're playing makes a lot of sense. The main problem, I think, is that these games tend to paint in broad brushstrokes. Your playstyle results in your character being "good" or "evil" without much room for moral ambiguity.

I think the solution here is to get away from the "good" and "evil" dynamic. Make it more a matter of personality. Is your character brave or cowardly? Sarcastic or straightforward? Passionate or cold? Sympathetic or sadistic? People's personalities don't conform to just one slider. Now imagine that each of those sliders is tied to your character's growth, for example...

Brave vs. Cowardly = More Health vs. More Speed
Sarcastic vs. Straightforward = Ice Element vs. Fire Element
Passionate vs. Cold = More Offense vs. More Defense
Sympathetic vs. Sadistic = Healing Abilities vs. Destructive Abilities

And so on. (I...I don't know why Sarcasm is tied to Ice Magic, but just go with me, here.) Seriously, if nothing else, think of the replay value in a game where your character's personality could be so radically different from game to game. It wouldn't be that hard to implement in conversation, either. Just have each response check the appropriate slider. If someone asks you to help them, the game determines your character's reply based on their bravery. If someone asks you a silly question, the game determines whether you cut them down based on how sarcastic you are. If you're clever about it, you wouldn't need to write 30 different responses for every line, just two appropriate ones. Simple!
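
Here's roughly how little machinery that would take, as one more hedged Python sketch. The sliders run from -1.0 (cowardly, straightforward, cold, sadistic) to +1.0 (brave, sarcastic, passionate, sympathetic), and every stat formula and line of dialog is a placeholder I made up.

    # Personality sliders driving both stats and dialog, per the list above.
    # Sliders run from -1.0 to +1.0; all the formulas here are placeholders.

    BASE_HEALTH, BASE_SPEED = 100, 10

    def derive_stats(p):
        """Map the four personality sliders onto gameplay stats."""
        return {
            "health":  BASE_HEALTH + int(30 * p["bravery"]),   # brave = tougher
            "speed":   BASE_SPEED - int(3 * p["bravery"]),     # cowardly = faster
            "element": "ice" if p["sarcasm"] > 0 else "fire",
            "offense": 1.0 + 0.5 * max(p["passion"], 0.0),     # passionate = hits harder
            "defense": 1.0 + 0.5 * max(-p["passion"], 0.0),    # cold = takes hits better
            "spells":  "healing" if p["sympathy"] > 0 else "destructive",
        }

    def reply_to_plea_for_help(p):
        """Only two lines to write; the bravery slider picks one."""
        if p["bravery"] >= 0:
            return "Stand behind me. I'll handle this."
        return "Handle it yourself. I'll, uh, guard the exit."

    hero = {"bravery": -0.6, "sarcasm": 0.9, "passion": 0.2, "sympathy": -0.4}
    print(derive_stats(hero))            # a sneaky, sarcastic ice-mage type
    print(reply_to_plea_for_help(hero))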

I...can't really think of a good example of this from an existing game, but if anyone else can I'd love to play it.

And those are basically my three tips for telling a better video game story. Maybe I don't know what I'm talking about, but I can honestly say that any game that implemented all of these suggestions would be a game I'd pick up in a heartbeat. Anyone else feel the same?

Saturday, August 7, 2010

Narrative in Video Games and Why You're Doing it Wrong

Sup, people. Glad you're continuing to read my always-updating-and-not-at-all-frequently-abandoned blog. ACA appreciates your continued viewership, even though you're all probably reading this through Facebook and don't know what ACA even stands for.

Anyway, I've been doing a lot of thinking about storytelling in video games lately, primarily because a lot of other people have been thinking about storytelling in video games lately and have been posting opinions on it that tick me off. A good example of this game-story-related off-tick I've been experiencing can be seen in this article on Gamasutra. In it, Mr. Darby McDevitt rattles off a lot of complaints about the current state of video game storytelling. They're complaints I've seen in a lot of places, though this article lacks the "My Critical Studies degree is worthless and I'm taking my frustrations out on the world" subtext which tends to accompany such arguments when they appear on message boards. His basic thesis is that the nature of most video games, with the killing and the maiming and the "Oh God my face," undercuts their ability to convey a good narrative with all the juicy pathos of a Hollywood Oscar-bait flick. I feel like this article, as well as a lot of the articles and message board rants I see on this subject, is missing, or at least sidestepping, one key point, and it's the point I'd like to attack first and foremost right now.

VIDEO...GAMES...ARE...NOT...MOVIES

I don't mean this in the standard "making fun of Hideo Kojima" sense; I mean this in the critical sense. While the Gamasutra article avoids this particular fallacy, it's not uncommon for me to see people say "The way to tell if a game has a good story is to ask yourself if that story would work as a movie or novel." This is, in my mind, probably the worst way to think about constructing video game narrative. Asking if a video game's story would work as a movie is like asking if a movie would work as a stage play. You're not accomplishing anything by thinking that way. All you're doing is trying to cram a narrative format you don't understand into a more limited one which you can get your mind around.

Video games provide a much more complex framework for storytelling than your standard three-act narrative media. They offer the possibility of branching paths, world exploration, customization, random or procedural worldbuilding, cooperation and competition between audience members, and so much more. Any game that could be converted 1:1 into a movie or novel has severely underused the capabilities of the medium. It makes me wonder if the true problem with video game narrative isn't so much a failure of the creators as it is a failure of the critics. We have a lot of jargon for describing traditional, linear narratives: "acts," "motifs," "deconstruction," "subtext," and tons of other words you learn in film school. But we don't have anywhere near as strong a vocabulary for analyzing the narratives of video games.

Let's say I'm walking through a game world and come upon an abandoned house. Inside, there are blood and claw marks everywhere, indicating some wild animal got in and killed whoever was living there. I look around and, after making sure the wild animal isn't still around, decide to do a little snooping. I open up a cabinet and see a shiny axe. Deciding whoever lived here no longer has use for said axe, I add it to my inventory and continue on my way. What just happened there? What do you call it? If you're a game narrative scholar looking at that sequence of events, how do you break it down? At best, right now that sequence of events would all be lumped into the buzzword of ~emergent gameplay~, but it's way more complicated than that. I had my perceptions and emotions influenced six ways to Sunday in that five-minute span. I was curious, then analytical, then cautious, then curious again, then happy at my discovery of the axe, and then maybe remorseful about stealing from the dead. Just because I didn't have a cutscene shouting exposition at me doesn't mean a story wasn't told there.

When I say this is a failing of critics, what I mean is that the traditional way of things is to allow the audience to derive meaning from what creators make. Consider the early days of film. A lot of it was just people jackassing around with cameras like it was their favorite new toy, shooting visceral stories about gunfights and trains and...more gunfights...and...more trains. And why not? At the turn of the century, gunfights and trains were two of the most awesome things ever. It wasn't until later that people started to take a critical approach to analyzing films. They looked through all the old gunfight and train-based narratives, built a vocabulary to understand why they were so engaging, and thus inspired filmmakers to build on this knowledge to make a love scene just as engaging as a gunfight and an inner-city apartment look just as awesome as any train. (This may be something of an over-simplification, but I hope my point remains clear.)

We've yet to do this with video games, and we never will as long as we try to analyze them through the lens of existing media. You can't just tell someone to "grow up and make art" and expect them to know what to do. Video games are a new form of storytelling and need to be treated and developed as such.

I may make a follow up post in which I go into ways I personally feel game narrative can be improved, but this is really what I wanted to get off my chest. Maybe I'm totally off base here, and if I am feel free to leave a comment, but this seems like fairly obvious stuff to me.

Monday, May 24, 2010

Angry At Movies: Origins

So I said I'd be blogging more. I haven't. Sorry about that; real life and general laziness have gotten in the way. However, I also said that I had a new idea for something to blog about, and that was true.

Welcome to Angry At Movies, where I talk about all the trends and cliches I hate in modern cinema. These are things I feel have become so ingrained in our cultural storytelling that we don't even notice them anymore, and specifically don't notice how much they suck. What am I talking about? Well, let's get right to the first thing that's been pissing me off lately. I feel it only fitting that we start from the beginning, so what's making me angry at movies today?

ORIGIN STORIES

We're in an era of summer blockbusters where two things are seriously in vogue: 1) superhero movies and 2) reboots of established series. The result is a kind of critical mass of "origin" movies, retelling the epic beginnings of well-known and, more importantly, firmly established characters.

So why is this bad? On paper, it seems only logical to retell the origins of these characters. After all, people might be new to the series. Right? WRONG! AND YOU ARE WRONG FOR THINKING SO!

The primary flaw of origin stories is that they're structured in a way that directly deprives the audience of what they came to see. I'm going to use James Bond as the example here, as the Daniel Craig reboot of the Bond series is one of the worst offenders in this regard. When I go to a James Bond movie, I'm going to see James Bond. I already know who James Bond is. Even if I'd never seen a James Bond movie, I would know who James Bond is. He's a cultural icon. The man is everywhere. When I go to a James Bond movie, I go in with the expectation that I will see a super-suave, super-awesome spy running around exotic locales kicking ass. THAT IS WHAT I BOUGHT A TICKET FOR!

And did I get James Bond when I went to see "Casino Royale"? No. I got "Rookie Bond." I got to see his oh-so-charming awkward years before he was a hyper-competent MI6 agent. We got to see exactly how he learned to keep his emotions in check while on assignment, slowly becoming the Bond we all know and love. It's like a delightful coming-of-age story for this thirty-five-year-old secret agent! Who...why would anyone think this was a good idea for a movie? A JAMES BOND movie?

I just keep having to ask "why?" Why did we ever need a James Bond reboot? Was it just too hard for people to catch up with the Brosnan movies? Were people getting so lost in the complex continuity of the previous Bond films that we had to take a step back and start with a clean slate? And even if it was necessary, even if it was sooooo important that we distance ourselves from "Die Another Day," why an origin? "Dr. No" wasn't an origin. The Bond movies had never told his origin, and why should they? IT'S BORING! Because it isn't Bond! It's the rookie who will become Bond in the future.

Okay, I'll stop harping on Bond, much as it continues to piss me off. You might be asking your monitor, "Well, what's the proper way to reboot a series?" The answer, my friends, is Batman. No, not "Batman Begins," I'm talking about Tim Burton's 1989 "Batman" starring Michael Keaton. This was, in many ways, a continuity reboot. It was an effort to bring the darker, edgier Batman of the then-recent comics into pop culture, at a time when non-comic fans still associated Batman with the 1960s television series. This movie was specifically designed to bring in people new to the franchise. So, is it an origin story?

NO!

Oh, the movie tells Batman's origin story...in five minutes. The first few minutes of the movie tell us all we need to know. Rich kid, parents got shot, dresses up in a bat suit and fights crime. BAM! DONE! And within five minutes the Bat himself is on...the...screen! You wanted to see Batman? There's your Batman, and he's got the Batmobile and the Batcave and everything all ready to go. There's still an arc to the movie, and plenty of tension. Bruce Wayne meets an attractive reporter and is unsure whether or not he should reveal his secret identity to her, and of course he has to stop the Joker from running around poisoning people. There's a lot going on in the movie, and it's all Batman, with the moments of Not-Batman restricted to the bare minimum at the beginning.

That is how you handle an established character. AUDIENCES AREN'T STUPID! They don't need every detail of a character laid out before them in order to relate to them. They paid to see Batman, they know Batman is coming, so just show them Batman. It's like in a slasher movie where you have to wait half an hour before Freddy Krueger is even mentioned. There's building tension, and then there's just pissing me off! You know why Spider-Man 2 is better than Spider-Man 1? Because it opens with Spider-Man!

I could go on, but this rant would just get more incoherent from here. Bottom line: stop with all the damn origin stories for characters we already know. They're unnecessary, underestimate the audience, and most of all they're boring!

And if you want proof, see if you don't skip the first two paragraphs when you re-read this post.

Monday, May 3, 2010

Game Over

So there's a rumor going around that Game Crazy will soon be shutting its doors. I've gotta say, I believe it. The article in the link paints this as "the little guy" being crushed under the boot of the massive Gamestop corporation, but I see it as part of a larger issue. That issue? DEDICATED GAME STORES SUCK!

Seriously, I don't understand why anyone uses dedicated game stores anymore. Do they just like being hassled for pre-orders and membership cards and extra controllers? That's basically all that separates the game store experience from your average trip to Best Buy. Every person behind the counter is instructed to upsell the hell out of anyone who sets foot in the store, even if they just want to use the restroom. It's not because these places are evil or greedy. If greed alone paid the bills, Game Crazy would be thriving. The dark secret of game stores is that selling video games alone produces a very, very slim profit margin. I don't know the numbers, but apparently selling a new game yields next to no profit for the store.

At places like Best Buy, video games can be written off as a sort of loss leader. They've worked out ways to make a profit that don't involve badgering the customer to give them money for nothing. And if you're using Gamestop to buy used games on the cheap, I seriously suggest checking out eBay or Amazon.com. You'll find better deals, a way better selection, and it's really no more of a crapshoot whether the disc will work.

I honestly hope we're reaching a point where either dedicated game stores die entirely or they seriously rethink their business model. I'll give a suggestion for free right now. You know what people who buy video games also like? Comic books and tabletop games like Dungeons and Dragons. You know what you can sell at ridiculous mark-ups and yet people will still pay for them? Comic books and tabletop games like Dungeons and Dragons. Why are there no combined video game and comic book shops? It seems like the best idea ever. But what do I know about business?

(P.S. I intend to start blogging more often now. I have a few ideas for entries, so expect to see them soon!)