Frostpunk 2 Gets Chilling New Trailer And July Release Date

Today’s Xbox Partner Preview gave us a new look at Frostpunk 2, including a release date. First announced in 2021, 11 Bit Studios’ tense city-management sim arrives on July 25, and it’s also coming to PC Game Pass.

A new trailer shows off some of the game’s bleak, choice-driven gameplay. You’re charged with leading a city in an inhospitable frozen wasteland, making decisions about how it’s governed to keep citizens fed and, hopefully, happy. That includes managing labor, politics, and even the food supply, which can ingratiate you with citizens or, in the worst case, drive them to revolt.

Frostpunk 2 will first launch on PC but will come to consoles (including Xbox Game Pass) at a later date. You can read our review of the first Frostpunk here.

Researchers enhance peripheral vision in AI models

Peripheral vision enables humans to see shapes that aren’t directly in our line of sight, albeit with less detail. This ability expands our field of vision and can be helpful in many situations, such as detecting a vehicle approaching our car from the side.

Unlike humans, AI does not have peripheral vision. Equipping computer vision models with this ability could help them detect approaching hazards more effectively or predict whether a human driver would notice an oncoming object.

Taking a step in this direction, MIT researchers developed an image dataset that allows them to simulate peripheral vision in machine learning models. They found that training models with this dataset improved the models’ ability to detect objects in the visual periphery, although the models still performed worse than humans.

Their results also revealed that, unlike with humans, neither the size of objects nor the amount of visual clutter in a scene had a strong impact on the AI’s performance.

“There is something fundamental going on here. We tested so many different models, and even when we train them, they get a little bit better but they are not quite like humans. So, the question is: What is missing in these models?” says Vasha DuTell, a postdoc and co-author of a paper detailing this study.

Answering that question may help researchers build machine learning models that can see the world more like humans do. In addition to improving driver safety, such models could be used to develop displays that are easier for people to view.

Plus, a deeper understanding of peripheral vision in AI models could help researchers better predict human behavior, adds lead author Anne Harrington MEng ’23.

“Modeling peripheral vision, if we can really capture the essence of what is represented in the periphery, can help us understand the features in a visual scene that make our eyes move to collect more information,” she explains.

Their co-authors include Mark Hamilton, an electrical engineering and computer science graduate student; Ayush Tewari, a postdoc; Simon Stent, research manager at the Toyota Research Institute; and senior authors William T. Freeman, the Thomas and Gerd Perkins Professor of Electrical Engineering and Computer Science and a member of the Computer Science and Artificial Intelligence Laboratory (CSAIL); and Ruth Rosenholtz, principal research scientist in the Department of Brain and Cognitive Sciences and a member of CSAIL. The research will be presented at the International Conference on Learning Representations.

“Any time you have a human interacting with a machine — a car, a robot, a user interface — it is hugely important to understand what the person can see. Peripheral vision plays a critical role in that understanding,” Rosenholtz says.

Simulating peripheral vision

Extend your arm in front of you and put your thumb up — the small area around your thumbnail is seen by your fovea, the small depression in the middle of your retina that provides the sharpest vision. Everything else you can see is in your visual periphery. Your visual cortex represents a scene with less detail and reliability as it moves farther from that sharp point of focus.

Many existing approaches to model peripheral vision in AI represent this deteriorating detail by blurring the edges of images, but the information loss that occurs in the optic nerve and visual cortex is far more complex.

For a more accurate approach, the MIT researchers started with a technique used to model peripheral vision in humans. Known as the texture tiling model, this method transforms images to represent a human’s visual information loss.  

They modified this model so it could transform images similarly, but in a more flexible way that doesn’t require knowing in advance where the person or AI will point their eyes.

“That let us faithfully model peripheral vision the same way it is being done in human vision research,” says Harrington.

The researchers used this modified technique to generate a huge dataset of transformed images that appear more textural in certain areas, to represent the loss of detail that occurs when a human looks farther into the periphery.

Then they used the dataset to train several computer vision models and compared their performance with that of humans on an object detection task.
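To make that pipeline concrete, here is a minimal Python sketch of an eccentricity-dependent image transform. Note that it uses simple Gaussian blur as a stand-in for the texture tiling model's texture synthesis (the very simplification the researchers moved beyond), and the function name and parameters are illustrative rather than taken from the paper.

```python
# Toy sketch: degrade image detail with distance from a fixation point.
# This is blur-based foveation, a simpler stand-in for the texture
# tiling model described above, which replaces peripheral regions with
# matched texture statistics rather than blur.
import numpy as np
from PIL import Image, ImageFilter

def foveate(img: Image.Image, fixation=(0.5, 0.5), n_rings=4) -> Image.Image:
    """Return a copy of an RGB `img` that stays sharp at `fixation`
    (fractional x, y coordinates) and loses detail toward the edges."""
    w, h = img.size
    fx, fy = fixation[0] * w, fixation[1] * h
    xs, ys = np.meshgrid(np.arange(w), np.arange(h))
    # Normalized distance of every pixel from the fixation point.
    dist = np.hypot(xs - fx, ys - fy)
    dist /= dist.max()
    out = np.asarray(img, dtype=np.float64)
    for ring in range(1, n_rings + 1):
        # Each successive ring takes a more heavily degraded copy.
        blurred = np.asarray(
            img.filter(ImageFilter.GaussianBlur(radius=2 * ring)),
            dtype=np.float64,
        )
        mask = (dist >= ring / n_rings)[..., None]
        out = np.where(mask, blurred, out)
    return Image.fromarray(out.astype(np.uint8))
```

Applying a transform like this to every image in a large collection, with the texture tiling model in place of the blur, is roughly how the researchers' dataset was produced.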

“We had to be very clever in how we set up the experiment so we could also test it in the machine learning models. We didn’t want to have to retrain the models on a toy task that they weren’t meant to be doing,” she says.

Peculiar performance

Humans and models were shown pairs of transformed images which were identical, except that one image had a target object located in the periphery. Then, each participant was asked to pick the image with the target object.
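For the models' side of that experiment, the evaluation might look something like the sketch below: a two-alternative forced-choice loop in which a trial counts as correct when the image that actually contains the target receives the higher detection score. The `detector` interface here is a hypothetical assumption, not a detail from the paper.

```python
# Minimal two-alternative forced-choice (2AFC) evaluation sketch.
# `detector` is assumed to map a batched image tensor to a scalar
# confidence that the target object is present; this interface is an
# illustrative assumption, not the paper's actual evaluation code.
import torch

@torch.no_grad()
def two_afc_accuracy(detector, image_pairs) -> float:
    correct, total = 0, 0
    for img_with_target, img_without_target in image_pairs:
        score_present = detector(img_with_target.unsqueeze(0)).item()
        score_absent = detector(img_without_target.unsqueeze(0)).item()
        # Correct when the target-containing image scores higher.
        correct += int(score_present > score_absent)
        total += 1
    return correct / total
```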

“One thing that really surprised us was how good people were at detecting objects in their periphery. We went through at least 10 different sets of images that were just too easy. We kept needing to use smaller and smaller objects,” Harrington adds.

The researchers found that training models from scratch with their dataset led to the greatest performance boosts, improving their ability to detect and recognize objects. Fine-tuning a model with their dataset, a process that involves tweaking a pretrained model so it can perform a new task, resulted in smaller performance gains.

But in every case, the machines weren’t as good as humans, and they were especially bad at detecting objects in the far periphery. Their performance also didn’t follow the same patterns as humans.

“That might suggest that the models aren’t using context in the same way as humans are to do these detection tasks. The strategy of the models might be different,” Harrington says.

The researchers plan to continue exploring these differences, with a goal of finding a model that can predict human performance in the visual periphery. This could enable AI systems that alert drivers to hazards they might not see, for instance. They also hope to inspire other researchers to conduct additional computer vision studies with their publicly available dataset.

“This work is important because it contributes to our understanding that human vision in the periphery should not be considered just impoverished vision due to limits in the number of photoreceptors we have, but rather, a representation that is optimized for us to perform tasks of real-world consequence,” says Justin Gardner, an associate professor in the Department of Psychology at Stanford University who was not involved with this work. “Moreover, the work shows that neural network models, despite their advancement in recent years, are unable to match human performance in this regard, which should lead to more AI research to learn from the neuroscience of human vision. This future research will be aided significantly by the database of images provided by the authors to mimic peripheral human vision.”

This work is supported, in part, by the Toyota Research Institute and the MIT CSAIL METEOR Fellowship.

How sensory gamma rhythm stimulation clears amyloid in Alzheimer’s mice

Studies at MIT and elsewhere are producing mounting evidence that light flickering and sound clicking at the gamma brain rhythm frequency of 40 hertz (Hz) can reduce Alzheimer’s disease (AD) progression and treat symptoms in human volunteers as well as lab mice. In a new open-access study in Nature using a mouse model of the disease, MIT researchers reveal a key mechanism that may contribute to these beneficial effects: clearance of amyloid proteins, a hallmark of AD pathology, via the brain’s glymphatic system, a recently discovered “plumbing” network parallel to the brain’s blood vessels.

“Ever since we published our first results in 2016, people have asked me how does it work? Why 40Hz? Why not some other frequency?” says study senior author Li-Huei Tsai, Picower Professor of Neuroscience and director of The Picower Institute for Learning and Memory of MIT and MIT’s Aging Brain Initiative. “These are indeed very important questions we have worked very hard in the lab to address.”

The new paper describes a series of experiments, led by Mitch Murdock PhD ’23 when he was a brain and cognitive sciences doctoral student at MIT, showing that when sensory gamma stimulation increases 40Hz power and synchrony in the brains of mice, that prompts a particular type of neuron to release peptides. The study results further suggest that those short protein signals then drive specific processes that promote increased amyloid clearance via the glymphatic system.

“We do not yet have a linear map of the exact sequence of events that occurs,” says Murdock, who was jointly supervised by Tsai and co-author and collaborator Ed Boyden, Y. Eva Tan Professor of Neurotechnology at MIT, a member of the McGovern Institute for Brain Research and an affiliate member of the Picower Institute. “But the findings in our experiments support this clearance pathway through the major glymphatic routes.”

How sensory gamma rhythm stimulation clears amyloid in Alzheimer’s mice
Video: The Picower Institute

From gamma to glymphatics

Because prior research has shown that the glymphatic system is a key conduit for brain waste clearance and may be regulated by brain rhythms, Tsai and Murdock’s team hypothesized that it might help explain the lab’s prior observations that gamma sensory stimulation reduces amyloid levels in Alzheimer’s model mice.

Working with “5XFAD” mice, which genetically model Alzheimer’s, Murdock and co-authors first replicated the lab’s prior results that 40Hz sensory stimulation increases 40Hz neuronal activity in the brain and reduces amyloid levels. Then they set out to measure whether there was any correlated change in the fluids that flow through the glymphatic system to carry away wastes. Indeed, they measured increases in cerebrospinal fluid in the brain tissue of mice treated with sensory gamma stimulation compared to untreated controls. They also measured an increase in the rate of interstitial fluid leaving the brain. Moreover, in the gamma-treated mice they measured increased diameter of the lymphatic vessels that drain away the fluids, as well as increased accumulation of amyloid in cervical lymph nodes, the drainage site for that flow.

To investigate how this increased fluid flow might be happening, the team focused on the aquaporin 4 (AQP4) water channel of astrocyte cells, which enables the cells to facilitate glymphatic fluid exchange. When they blocked AQP4 function with a chemical, that prevented sensory gamma stimulation from reducing amyloid levels and from improving mouse learning and memory. And when, as an added test, they used a genetic technique for disrupting AQP4, that also interfered with gamma-driven amyloid clearance.

In addition to the fluid exchange promoted by AQP4 activity in astrocytes, another mechanism by which gamma waves promote glymphatic flow is by increasing the pulsation of neighboring blood vessels. Several measurements showed stronger arterial pulsatility in mice subjected to sensory gamma stimulation compared to untreated controls.

One of the best new techniques for tracking how a condition, such as sensory gamma stimulation, affects different cell types is to sequence their RNA to track changes in how they express their genes. Using this method, Tsai and Murdock’s team saw that gamma sensory stimulation indeed promoted changes consistent with increased astrocyte AQP4 activity.

Prompted by peptides

The RNA sequencing data also revealed that upon gamma sensory stimulation, a subset of neurons, called “interneurons,” experienced a notable uptick in the production of several peptides. This was not surprising in the sense that peptide release is known to be dependent on brain rhythm frequencies, but it was still notable because one peptide in particular, vasoactive intestinal peptide (VIP), is associated with Alzheimer’s-fighting benefits and helps to regulate vascular cells, blood flow, and glymphatic clearance.

Seizing on this intriguing result, the team ran tests that revealed increased VIP in the brains of gamma-treated mice. The researchers also used a sensor of peptide release and observed that sensory gamma stimulation resulted in an increase in peptide release from VIP-expressing interneurons.

But did this gamma-stimulated peptide release mediate the glymphatic clearance of amyloid? To find out, the team ran another experiment: They chemically shut down the VIP neurons. When they did so, and then exposed mice to sensory gamma stimulation, they found that there was no longer an increase in arterial pulsatility and there was no more gamma-stimulated amyloid clearance.

“We think that many neuropeptides are involved,” Murdock says. Tsai adds that a major new direction for the lab’s research will be determining what other peptides or molecular factors may be driven by sensory gamma stimulation.

Tsai and Murdock add that while this paper focuses on what is likely an important mechanism — glymphatic clearance of amyloid — by which sensory gamma stimulation helps the brain, it’s probably not the only underlying mechanism that matters. The clearance effects shown in this study occurred rather rapidly, but in lab experiments and clinical studies weeks or months of chronic sensory gamma stimulation have been needed to have sustained effects on cognition.

With each new study, however, scientists learn more about how sensory stimulation of brain rhythms may help treat neurological disorders.

In addition to Tsai, Murdock, and Boyden, the paper’s other authors are Cheng-Yi Yang, Na Sun, Ping-Chieh Pao, Cristina Blanco-Duque, Martin C. Kahn, Nicolas S. Lavoie, Matheus B. Victor, Md Rezaul Islam, Fabiola Galiana, Noelle Leary, Sidney Wang, Adele Bubnys, Emily Ma, Leyla A. Akay, TaeHyun Kim, Madison Sneve, Yong Qian, Cuixin Lai, Michelle M. McCarthy, Nancy Kopell, Manolis Kellis, and Kiryl D. Piatkevich.

Support for the study came from Robert A. and Renee E. Belfer, the Halis Family Foundation, Eduardo Eurnekian, the Dolby family, Barbara J. Weedon, Henry E. Singleton, the Hubolow family, the Ko Hahn family, Carol and Gene Ludwig Family Foundation, Lester A. Gimpelson, Lawrence and Debra Hilibrand, Glenda and Donald Mattes, Kathleen and Miguel Octavio, David B. Emmes, the Marc Haas Foundation, Thomas Stocky and Avni Shah, the JPB Foundation, the Picower Institute, and the National Institutes of Health.

Is this the future of fashion?

Until recently, bespoke tailoring — clothing made to a customer’s individual specifications — was the only way to have garments that provided the perfect fit for your physique. For most people, the cost of custom tailoring is prohibitive. But the invention of active fibers and innovative knitting processes is changing the textile industry.

“We all wear clothes and shoes,” says Sasha McKinlay MArch ’23, a recent graduate of the MIT Department of Architecture. “It’s a human need. But there’s also the human need to express oneself. I like the idea of customizing clothes in a sustainable way. This dress promises to be more sustainable than traditional fashion for both the consumer and the producer.”

McKinlay is a textile designer and researcher at the Self-Assembly Lab who designed the 4D Knit Dress with Ministry of Supply, a fashion company specializing in high-tech apparel. The dress combines several technologies to create personalized fit and style: heat-activated yarns, computerized knitting, and robotic activation around each garment generate the sculpted fit. A team at Ministry of Supply led the decisions on the stable yarns, color, original size, and overall design.

“Everyone’s body is different,” says Skylar Tibbits, associate professor in the Department of Architecture and founder of the Self-Assembly Lab. “Even if you wear the same size as another person, you’re not actually the same.”

4D Knit Dress: Transforming Style
Video: Self-Assembly Lab

Active textiles

Students in the Self-Assembly Lab have been working with dynamic textiles for several years. The yarns they create can change shape, change properties, change insulation, or become breathable. Previous garment-tailoring applications include sweaters and face masks. Tibbits says the 4D Knit Dress is a culmination of everything the students have learned from working with active textiles.

McKinlay helped produce the active yarns, created the concept design, developed the knitting technique, and programmed the lab’s industrial knitting machine. Once the garment design is programmed into the machine, it can quickly produce multiple dresses. The placement of the active yarns in the design allows the dress to take on a variety of styles, such as pintucks, pleats, an empire waist, or a cinched waist.

“The styling is important,” McKinlay says. “Most people focus on the size, but I think styling is what sets clothes apart. We’re all evolving as people, and I think our style evolves as well. After fit, people focus on personal expression.”

Danny Griffin MArch ’22, a current graduate student in architectural design, doesn’t have a background in garment making or the fashion industry. Tibbits asked Griffin to join the team due to his experience with robotics projects in construction. Griffin translated the heat activation process into a programmable robotic procedure that would precisely control its application.

“When we apply heat, the fibers shorten, causing the textile to bunch up in a specific zone, effectively tightening the shape as if we’re tailoring the garment,” says Griffin. “There was a lot of trial and error to figure out how to orient the robot and the heat gun. The heat needs to be applied in precise locations to activate the fibers on each garment. Another challenge was setting the temperature and the timing for the heat to be applied.”

It took a while to determine how the robot could reach all areas of the dress.

“We couldn’t use a commercial heat gun — which is like a handheld hair dryer — because they’re too large,” says Griffin. “We needed a more compact design. Once we figured it out, it was a lot of fun to write the script for the robot to follow.”

A dress can begin with one design — pintucks across the chest, for example — and be worn for months before having heat re-applied to alter its look. Subsequent applications of heat can tailor the dress further.

Beyond fit and fashion

Efficiently producing garments is a “big challenge” in the fashion industry, according to Gihan Amarasiriwardena ’11, the co-founder and president of Ministry of Supply.

“A lot of times you’ll be guessing what a season’s style is,” he says. “Sometimes the style doesn’t do well, or some sizes don’t sell out. They may get discounted very heavily or eventually they end up going to a landfill.”

“Fast fashion” is a term that describes clothes that are inexpensive, trendy, and easily disposed of by the consumer. They are designed and produced quickly to keep pace with current trends. The 4D Knit Dress, says Tibbits, is the opposite of fast fashion. Unlike the traditional “cut-and-sew” process in the fashion industry, the 4D Knit Dress is made entirely in one piece, which virtually eliminates waste.

“From a global standpoint, you don’t have tons of excess inventory because the dress is customized to your size,” says Tibbits.

McKinlay says she hopes this new technology will reduce the inventory waste that retailers typically face at the end of each season.

“The dress could be tailored in order to adapt to these changes in styles and tastes,” she says. “It may also be able to absorb some of the size variations that retailers need to stock. Instead of extra-small, small, medium, large, and extra-large sizes, retailers may be able to have one dress for the smaller sizes and one for the larger sizes. Of course, these are the same sustainability points that would benefit the consumer.”

The Self-Assembly Lab has collaborated with Ministry of Supply on projects with active textiles for several years. Late last year, the team debuted the 4D Knit Dress at the company’s flagship store in Boston, complete with a robotic arm working its way around a dress as customers watched. For Amarasiriwardena, it was an opportunity to gauge interest and receive feedback from customers interested in trying the dress on.

“If the demand is there, this is something we can create quickly,” says Amarasiriwardena, in contrast to the usual design and manufacturing process, which can take years.

Griffin and McKinlay were on hand for the demonstration and were pleased with the results. With the “technical barriers” overcome, Griffin sees many different avenues for the project.

“This experience leaves me wanting to try more,” he says.

McKinlay too would love to work on more styles.

“I hope this research project helps people rethink or reevaluate their relationship with clothes,” says McKinlay. “Right now when people purchase a piece of clothing it has only one ‘look.’ But, how exciting would it be to purchase one garment and reinvent it to change and evolve as you change or as the seasons or styles change? I’m hoping that’s the takeaway that people will have.”

Princess Peach: Showtime Demo Now Available On Switch

Princess Peach: Showtime hits Switch exclusively later this month on March 22, and ahead of its launch, Nintendo has released a demo for the game. In it, you can play as two of Peach’s transformations: Swordfighter Peach and Patissiere Peach. The demo is available to download on Switch right now.

“Swing, strike, dodge, and counterattack as Swordfighter Peach and cut across an action-packed stage,” a press release reads. “Then, turn into Patissiere Peach and get ready to whip up an array of delectable desserts to prevent the Sweet Festival from experiencing a serious sugar crash.” 

As you can see in the gameplay overview trailer above, each of Peach’s transformations grants her distinct and unique abilities that she’ll need to save the plays at Sparkle Theater. Plus, the above trailer demonstrates some of the different customization options players have at their disposal to add flair to Peach’s dress and Stella’s ribbon. 

Princess Peach: Showtime hits Switch on March 22. 

While waiting for its launch, read Game Informer’s Princess Peach: Showtime impressions after going hands-on with the game, and then check out these pink Nintendo Switch Joy-Con launching alongside Princess Peach: Showtime.


Are you going to check out the Princess Peach: Showtime demo? Let us know in the comments below!

Penny’s Big Breakaway Review – A Swinging Pendulum

Coming off the success of Sonic Mania, the development team behind one of the best games in Sega’s storied series is back with an all-new franchise. Much as that studio, now known as Evening Star, crafted its previous effort as a love letter to a bygone era of platforming, Penny’s Big Breakaway is a fond tribute to the 3D platformers of the late ’90s. Evening Star clearly knows how to design a fantastic new entry in this well-worn genre, but some important issues drag down an otherwise strong game.

As Penny, a street performer whose yo-yo is transformed by a cosmic entity, you must leap, swing, spin, and dash through more than 11 colorful, themed worlds of stages. Each world is more vibrant than the last, complemented by an upbeat soundtrack full of catchy tracks that push the action forward. In moving through these levels, Penny’s Big Breakaway steps into the spotlight in a big way. With the help of her enhanced yo-yo, Penny can pull off satisfying movement-based combos. Once you master the basics, jumping into the air, swinging from her yo-yo, landing in a roll, and smoothly launching into another combo with a twirl feels fantastic. In combat, however, I struggled with accidentally sending Penny flying off a cliff since double-tapping the attack button also initiates a dash.

Thankfully, combat is only a small piece of the overall pie, and from the moment the movement mechanics clicked with me to the moment I watched the credits roll, I adored building momentum as I sped through the stages. Those terrific moves are accentuated by top-tier level design. Evening Star provides players with a ton of expertly designed courses that play into Penny’s abilities. Penny’s Big Breakaway is at its best when you’re moving quickly through obstacle courses, and the levels give you plenty of opportunities to do so; even the optional side objectives often require you to complete the given task within a time limit.

I relished every twisting path that let me quickly roll through, but I also enjoyed exploring every corner I could to find the collectibles used to purchase extra-challenging bonus stages. Levels typically offer branching pathways, and I loved trying to find the best route through the stages, though the fixed camera sometimes discouraged me from poking around too much. I was also disappointed by how many times I clipped through a stage element and had to restart from a checkpoint.

Sadly, the entire experience is brought down by a problem many early 3D platformers struggled with: depth perception. By the time I beat the story mode, I had lost count of the number of times I missed a seemingly easy jump because I couldn’t tell where Penny was in relation to the platform I was trying to land on. While the obvious answer is to look at her shadow’s position on the platform, my brain constantly needed to perform the calculus of whether Penny was where she looked like she was or where the game said she was. Unfortunately, this permeates the entire experience, poisoning the well of the overall gameplay.

A smaller issue that often rears its ugly head is screen-crowding. One of the key elements Penny’s Big Breakaway uses to propel the player forward is a group of penguins that swarm you in a capture attempt. Each time this happens, it immediately raises the level of on-screen chaos, but it sometimes goes too far as the penguins obscure everything happening in the level. Add to that an intrusive UI element that pops up when you’re near a side mission, and on multiple occasions, I had to blindly perform a leap of faith and hope for the best.

It’s a shame so many problems weigh on this otherwise enjoyable adventure. Even with the screen-crowding, bugs, and depth-perception troubles, I still look back fondly on the superb level design and movement mechanics. But because of those important detractors, Penny’s Big Breakaway lands as a solid 3D platformer unable to swing to the great heights it felt destined for.

Here Are The Nominees For The 20th Annual BAFTA Games Awards

The 20th BAFTA Games Awards take place on Thursday, April 11. Streaming live on BAFTA’s YouTube and Twitch channels from Queen Elizabeth Hall in London starting at 7 p.m. BST/11 a.m. PT/2 p.m. ET, this spin-off of the prestigious film/TV awards show celebrates 40 games released in 2023 boasting “an outstanding level of creative excellence.”

The full list of categories and nominees has been revealed. Baldur’s Gate 3 leads the pack with 10 nominations, though Marvel’s Spider-Man 2 and Alan Wake 2 aren’t far behind with 9 and 8 nominations, respectively. Winners are decided by BAFTA’s global membership, composed of experienced minds in the game industry. The EE Players’ Choice award is the only category open to public voting, which you can do here. The nominees are:

Best Game

  • Alan Wake 2
  • Baldur’s Gate 3
  • Dave the Diver
  • The Legend of Zelda: Tears of the Kingdom
  • Marvel’s Spider-Man 2
  • Super Mario Bros. Wonder

Animation 

  • Alan Wake 2
  • Hi-Fi Rush
  • Hogwarts Legacy
  • Marvel’s Spider-Man 2
  • Star Wars Jedi: Survivor
  • Super Mario Bros. Wonder

Artistic Achievement 

  • Alan Wake 2
  • Baldur’s Gate 3
  • Cocoon
  • Diablo IV
  • Final Fantasy XVI
  • Hi-Fi Rush

Audio Achievement

  • Alan Wake 2
  • Call of Duty: Modern Warfare III
  • Hi-Fi Rush
  • The Legend of Zelda: Tears of the Kingdom
  • Marvel’s Spider-Man 2
  • Star Wars Jedi: Survivor

British Game

  • Cassette Beasts
  • Dead Island 2
  • Disney Illusion Island
  • Football Manager 2024
  • Viewfinder
  • Warhammer Age of Sigmar: Realms of Ruin

Debut Game

  • Cocoon
  • Dave the Diver
  • Dredge
  • Stray Gods: The Role-Playing Musical
  • Venba
  • Viewfinder

Evolving Game

  • Cyberpunk 2077
  • Final Fantasy XIV
  • Fortnite
  • Forza Horizon 5
  • Genshin Impact
  • No Man’s Sky

Family

  • Cocoon
  • Dave the Diver
  • Disney Illusion Island
  • Hi-Fi Rush
  • Hogwarts Legacy
  • Super Mario Bros. Wonder

Game Beyond Entertainment

  • Chants of Sennaar
  • Goodbye Volcano High
  • Tchia
  • Terra Nil
  • Thirsty Suitors

Game Design

  • Cocoon
  • Dave the Diver
  • Dredge
  • The Legend of Zelda: Tears of the Kingdom
  • Marvel’s Spider-Man 2
  • Viewfinder

Multiplayer

  • Baldur’s Gate 3
  • Call of Duty: Modern Warfare III
  • Diablo IV
  • Forza Motorsport
  • Party Animals
  • Super Mario Bros. Wonder

Music

  • Alan Wake 2
  • Assassin’s Creed Mirage
  • Baldur’s Gate 3
  • The Legend of Zelda: Tears of the Kingdom
  • Marvel’s Spider-Man 2
  • Star Wars Jedi: Survivor

Narrative

  • Alan Wake 2
  • Baldur’s Gate 3
  • Dredge
  • Final Fantasy XVI
  • The Legend of Zelda: Tears of the Kingdom
  • Star Wars Jedi: Survivor

New Intellectual Property

  • Chants of Sennaar 
  • Dave the Diver
  • Dredge
  • Hi-Fi Rush
  • Jusant
  • Viewfinder

Performer in a Leading Role

  • Amelia Tyler as Narrator – Baldur’s Gate 3
  • Cameron Monaghan as Cal Kestis – Star Wars Jedi: Survivor
  • Nadji Jeter as Miles Morales – Marvel’s Spider-Man 2
  • Neil Newbon as Astarion – Baldur’s Gate 3
  • Samantha Béart as Karlach – Baldur’s Gate 3
  • Yuri Lowenthal as Peter Parker – Marvel’s Spider-Man 2

Performer in a Supporting Role

  • Andrew Wincott as Raphael – Baldur’s Gate 3
  • Debra Wilson as Cere Junda – Star Wars Jedi: Survivor
  • Ralph Ineson as Cidolfus “Cid” Telamon – Final Fantasy XVI
  • Sam Lake as Alex Casey – Alan Wake 2
  • Tony Todd as Venom – Marvel’s Spider-Man 2
  • Tracy Wiles as Jaheira – Baldur’s Gate 3

Technical Achievement 

  • Alan Wake 2
  • Final Fantasy XVI
  • Horizon Call of the Mountain
  • The Legend of Zelda: Tears of the Kingdom
  • Marvel’s Spider-Man 2
  • Starfield

EE Players’ Choice

  • Baldur’s Gate 3
  • Cyberpunk 2077
  • Fortnite
  • The Legend of Zelda: Tears of the Kingdom
  • Lethal Company
  • Marvel’s Spider-Man 2

No Rest For The Wicked Cover Story, WWE 2K24, And The Impending Moomin Invasion | GI Show (Feat. Suriel Vazquez)

This week on The Game Informer Show podcast, host Marcus Stewart is joined by Kyle Hilliard, Charles Harte, and former Game Informer editor and current narrative designer at Big Blue Sky Games, Suriel Vazquez, to talk about a smattering of video games, new and old. First up, we dive deep into our most recent cover story on No Rest for the Wicked, talk a bit about how realistic and nauseating the driving is in Pacific Drive, learn about WWE 2K24, and prepare for the impending Moomin invasion. We also mop up some games like Like a Dragon: Infinite Wealth and Penny’s Big Breakaway before Suriel shares details about the game he has been working on, Merchants of Rosewall. And we finish out the show by answering some reader questions.

Follow us on social media: Marcus Stewart (@MarcusStewart7), Kyle Hilliard (@KyleMHilliard), Charles Harte (@chuckduck365), Suriel Vazquez (@SurielVazquez)

The Game Informer Show is a weekly gaming podcast covering the latest video game news, industry topics, exclusive reveals, and reviews. Join host Alex Van Aken every Thursday to chat about your favorite games – past and present – with Game Informer staff, developers, and special guests from around the industry. Listen on Apple Podcasts, Spotify, or your favorite podcast app.

The Game Informer Show – Podcast Timestamps:

00:00:00 – Intro
00:03:20 – Cover Story: No Rest for the Wicked
00:28:38 – Pacific Drive
00:40:03 – Snufkin: Melody of Moominvalley
00:48:20 – WWE 2K24 Review
01:02:06 – Like a Dragon: Infinite Wealth
01:12:17 – Penny’s Big Breakaway
01:16:11 – The Outlast Trials
01:22:51 – Housekeeping and Listener Questions
01:44:52 – The Lunch Break: Like A Dance Break but with Lunch (Working Title)

Vote For The Greatest Game Of All Time In Our Bracket Tournament

No matter the industry – sports, television, movies, or games – the debate over “the greatest of all time” inevitably comes up. These debates are often full of superfluous apples-to-oranges comparisons and, as a result, are unlikely to produce a definitive answer everyone will be happy with. But that’s what makes them so fun. In that spirit, we’re taking the month of March to hold our own bracket-style tournament, voted on by readers, to determine the greatest video game of all time.

Though we here at Game Informer assembled the bracket, even that was heavily influenced by the Reader Vote we held in 2018 to coincide with our 300th issue. Using the results of that vote combined with input from staff, general sentiment in various online communities, and critical reception for more recent games, we assembled a starting bracket of 64 games. From here, we’ll hold two rounds of voting per week until we whittle this field down to one ultimate winner.

Will this result in a definitive selection of the best game ever released? Or will it reveal a coordinated effort from a particular gaming community committed to seeing their favorite game rise through the ranks? Only time will tell. 

While this is meant to be a fun experiment to see how the results shake out within the Game Informer readership and community, we hope you’ll come back to root for your favorites every Monday and Thursday until we are able to crown our champion in the 2024 Game Gauntlet. Be sure to bookmark this page or follow us on social media to be alerted when a new round starts! 

[Image: the full bracket, click to enlarge]

Region 1

Region 2

Region 3

Region 4