So why are these films so bad? We’ve identified a few mistakes that Hollywood keeps making. Play through our game and try to avoid these pitfalls on your adventure to making the first truly great video game movie.

Obstacle encountered! Answer this question to defeat the enemy on your journey to making a movie. Which aspect of the movie is most important: behind-the-scenes crew or a talented on-screen cast?

Sorry, trick question! Usually it’s the story that’s most important — and video game movies usually get that part wrong. “Prince of Persia: The Sands of Time” (2010) had high production values and noteworthy actors (Jake Gyllenhaal starred), but the Boston Globe’s Ty Burr still called it the “CVS brand” version of “The Lord of the Rings.”

Of course, that doesn’t mean the cast and crew are completely pointless. They can be the one thing standing in the way of a complete critical savaging. Take this year’s “Tomb Raider”: Variety film critic Owen Gleiberman credited the movie’s watchability to star Alicia Vikander. “Her Lara may be the most grounded and believable cinematic video-game protagonist I’ve seen,” he said.

On the other end of the spectrum, the visual effects in “Mortal Kombat” (1995) may look laughable nowadays, but its production values were a major factor in the few positive reviews it received.

Getting the story right probably seems obvious, but it’s easier said than done. Keep playing to learn about the other obstacles.

Now that we know we should focus on the script, what kind of game story would you pick?

Seriously? Another trick question? It’s almost like this game is a metaphor for something. One of the hardest parts of adapting video games is choosing the right story balance. Ideally, you want a game with a rich but still compact narrative. Let’s take a look at two examples: “Warcraft” (2016) and “Need for Speed” (2014).

“Need for Speed” failed because studios scrambled to adapt something with strong brand recognition rather than something that actually screams out for a film adaptation. The original game’s story is purposefully empty so that the gameplay — car races — can shine. There simply wasn’t a story to adapt.

On the other hand, mythology-rich games like “Warcraft” suffer when adapted into movies, too. Trying to stuff over 100 hours of gameplay into a two-hour movie is a doomed effort, and it results in what the Washington Post’s Stephanie Merry called a “convoluted mess of an introduction [that] requires more mental effort than any movie this idiotic deserves.”

A better game to adapt would be something like “Oxenfree,” which has an eerie coming-of-age story that can be finished in about five hours. Still, a lot of modern games with strong stories don’t need adaptations at all. “The Last of Us” has been universally acclaimed as one of the greatest video games of all time, with particular praise for the actors’ voice and motion-capture performances. Turning this story into a live-action one wouldn’t add much.

Let’s say that you’ve found the sweet spot with your source material: Solid but not overly complicated. How do you adapt it?

Listen, I know what you’re thinking. “If I wanted a metaphor about free will and inevitable failure, I’d go play ‘The Stanley Parable.’” But, like our game, Hollywood keeps trapping itself in an either-or mentality that leads to horrible movies. Balance is the key.

Take a look at 2016’s “Assassin’s Creed,” which stuck closely to many of the game’s concepts. The storyline was ineffective partly because, as a Forbes film review said, “so much of Assassin’s Creed’s storyline was created to solve very ‘video gamey’ problems.” In the game, much of the action takes place in the character’s mind. Inaccessible memories are used to explain why you can’t explore certain areas. Desynchronized memories occur when you fail and restart a mission. This makes for interesting game mechanics but didn’t translate well to the big screen.

But wait! Catch-22. Stray too far from the original source material, and you get something like the completely bonkers “Super Mario Bros.” (1993) movie. It gets credit for an attempt at originality, but when the new direction leads to a meteorite splitting the universe into two parallel dimensions, one where some dinosaurs survive and evolve into a humanoid race, resulting in Bowser/King Koopa? Game over. Do not continue.

And now we come to the final and biggest obstacle: Interactivity, where YOU get to be the protagonist in a video game. How does this translate to a movie?

Maybe one day virtual reality will advance to the point that you can act as The Rock running through your own personalized movie, but until that day, video games still have the interactive advantage. Much of the innate appeal of video games is that you get to be the protagonist. You implant yourself into the character, and you get to choose who you want to be.

Watching other gamers play online is also incredibly popular, but that’s still a different experience from watching a movie. Streaming on Twitch can still be interactive: you live-chat with the players, cheer as they make real-time decisions, and so on. Watching an actor on-screen do those same things without any interaction with the audience seems awfully dull in comparison.

Sure, the difficulty of translating interactivity to the screen isn’t the only reason that video game movies fail. After all, movie critics who’ve never played the original video games still despise these movies just fine. But this issue shows that filmmakers have that much more work when it comes to adapting video game material. It isn’t enough to rely on brand recognition or reuse the exact same story; something original and creative has to be added. The tough part is figuring out just what that original thing is.

It’s dangerous to go alone. Take our stories with you.

Source: Post research. Illustrations by Shelly Tan.