Get Your First Look At Dragon Age: The Veilguard’s Real-Time Action Combat In First Gameplay Trailer

Developer BioWare has released the first official gameplay trailer for its upcoming RPG, Dragon Age: The Veilguard. Featuring more than 20 minutes of gameplay, this is the first look at the game’s action combat, the magical city of Minrathous, the player-character Rook, some of the companions who are joining Rook’s journey, how dialogue choices work, and more. 

In the gameplay trailer, which begins in Minrathous, we see returning fan favorites Varric Tethras and Lace Harding alongside newcomer companion Neve Gallus, a private detective mage, and Rook. Solas, the once-ally mage turned foe at the end of Dragon Age: Inquisition’s Trespasser DLC, is attempting to destroy the Veil, a barrier between the magical Fade and the continent of Thedas. Our team, seemingly not quite the titular Veilguard just yet, is attempting to stop him. 

Check it out for yourself in the official gameplay trailer for Dragon Age: The Veilguard below:

[embedded content]

At just 20 minutes, this gameplay reveal is a tease of what’s to come in the final game when it hits PlayStation 5, Xbox Series X/S, and PC this fall. However, Dragon Age: The Veilguard is Game Informer’s next cover story, and in the latest issue, you can get an exclusive in-depth look at what happens next in the game alongside new details about the extensive character creator, the game’s main hub location, the action combat, and so much more.

Be sure to subscribe here to ensure our Dragon Age: The Veilguard issue reaches your mailbox. And keep an eye on Game Informer’s Dragon Age: The Veilguard coverage hub for upcoming articles about the game with exclusive details, interviews, video features, and more. 

For more about the game, check out this recent Dragon Age: The Veilguard trailer that introduces the game’s cast, and then read about how the name changed from Dreadwolf to The Veilguard.


What do you think of this gameplay reveal for Dragon Age: The Veilguard? Let us know in the comments below!

Gears Of War: E-Day Is A Prequel Set 14 Years Before The First Game Starring Marcus Fenix

Xbox has revealed Gears of War: E-Day, and it’s a prequel “origin story” set 14 years before the very first Gears of War game. Revealed during today’s Xbox Games Showcase, E-Day stars fan-favorite hero Marcus Fenix and gives players the chance to experience the terror of the infamous Emergence Day. 

In the trailer, which features an orchestral remix of Gary Jules’ Mad World (the very same Mad World from the Gears of War launch trailer), we see Marcus and returning hero Dom Santiago come face to face with the Locust Horde. And though we know the Locust Horde quite well, it appears to be Marcus and Dom’s first bout with these infamous enemies. 

Check it out for yourself in the Gears of War: E-Day reveal trailer below:

[embedded content]

“Experience the brutal horror of Emergence Day through the eyes of Marcus Fenix in the origin story of one of gaming’s most acclaimed sagas,” the trailer’s description reads. “Fourteen years before Gears of War, war heroes Marcus Fenix and Dom Santiago return home to face a new nightmare: the Locust Horde. These subterranean monsters, grotesque and relentless, erupt from below, laying siege on humanity itself.”

Xbox says E-Day has been developed from the ground up with Unreal Engine 5 and promises to deliver “unprecedented graphical fidelity.” 

There’s no release date or window for E-Day yet. 


Are you excited about Gears of War: E-Day? Let us know in the comments below!

Fallout 76’s Skyline Valley Launches Next Week, Play As A Ghoul Starting In 2025

The next expansion for Fallout 76 is called Skyline Valley and it launches next week. More specifically, Skyline Valley expands the game’s map southward to the all-new Shenandoah region on June 12. 

There, players will investigate the cause of an electrical storm circling overhead and unravel the mystery surrounding Vault 63. Inside this mysterious vault, players will meet its dwellers and discover a new Ghoul type called The Lost.

Check it out for yourself in the Fallout 76: Skyline Valley trailer below:

[embedded content]

As you can see from the trailer above, the Shenandoah region is not looking so great in post-apocalyptic America. The trailer also reveals that Fallout 76 players can play as Ghouls starting sometime next year.


Are you going to play Fallout 76’s Skyline Valley expansion next week? Let us know in the comments below!

Mixtape Is A Sharp-Looking Coming Of Age Story With Music From Devo, Smashing Pumpkins, And More

Beethoven & Dinosaur, the developer behind The Artful Escape, revealed its next game during the 2024 Xbox Showcase, and it’s all about being a teenager, getting into trouble, listening to music, and apparently flying through the air (though it is likely a metaphor). The trailer shows off an impressive art style and character animation that drops frames à la Spider-Man: Into the Spider-Verse, and the game will feature music from bands like Devo, Roxy Music, Lush, and The Smashing Pumpkins, among others.

[embedded content]

The game is being published by Annapurna Interactive and is coming some time in 2025.

Clair Obscur: Expedition 33 Is A Slick-Looking Fantasy RPG Coming Next Year

Clair Obscur: Expedition 33 was one of the brand-new reveals during today’s Xbox Games Showcase. Developed by Sandfall Interactive as the studio’s debut title, the turn-based RPG unfolds in a fantastical world governed by a powerful figure who wipes out countless innocents every year via a mysterious number.

Clair Obscur’s world is at the mercy of the Paintress, who awakens once a year to paint a cursed number on a monolith. That number causes every person of that age to instantly vanish in wisps of smoke. The citizens dealing with this recurring threat know that her next number is “33,” and a brave collection of them have one day to mount an expedition to her island to stop her once and for all. However, these “Expeditioners” are merely the latest of dozens of failed attempts at killing the Paintress. Thus, this team will retrace the steps of every squad that came before them as they explore a dreamlike world filled with dangerous threats.

[embedded content]

The game is powered by Unreal Engine 5 and sports slick-looking turn-based action. Clair Obscur: Expedition 33 is coming in 2025 to PlayStation 5, Xbox Series X/S (including launching day one on Game Pass), and PC. 

Microsoft Flight Simulator 2024 Will Let You Live Out Your Aviation Career Dreams This November

Xbox has revealed that Microsoft Flight Simulator 2024 will launch this November. More specifically, it hits Xbox Series X/S and PC on November 19. 

Developer Asobo Studio revealed this today during the Xbox Games Showcase with a new trailer that highlights some of the aviation careers players can take on in the game later this year. That includes aerial ambulances and advertisers, VIP charter captains, and more. 

Check it out for yourself in the latest Microsoft Flight Simulator 2024 trailer below:

[embedded content]

“[Microsoft Flight Simulator 2024] features the largest, most diverse and detailed fleet of aircraft, the most complete representation of airports and air traffic, and the most visually stunning rendition of Earth ever created,” an Xbox Wire post reads. “This brand-new simulator is designed to take advantage of the latest technologies in simulation, cloud, machine learning, graphics, and gaming. Microsoft Flight Simulator 2024 goes beyond merely operating the aircraft; it will allow simmers to pursue their dream of an aviation career.” 

Careers include firefighting, search and rescue, commercial airline piloting, remote cargo ops, charter service, air racing, and more. 

Microsoft Flight Simulator 2024 hits Xbox Series X/S and PC on November 19, 2024. It will launch day one on Game Pass for Xbox, PC, and Cloud. 


Are you going to check out Microsoft Flight Simulator 2024 later this year? Let us know in the comments below!

New computer vision method helps speed up screening of electronic materials

Boosting the performance of solar cells, transistors, LEDs, and batteries will require better electronic materials, made from novel compositions that have yet to be discovered.

To speed up the search for advanced functional materials, scientists are using AI tools to identify promising materials from hundreds of millions of chemical formulations. In tandem, engineers are building machines that can print hundreds of material samples at a time based on chemical compositions tagged by AI search algorithms.

But to date, there’s been no similarly speedy way to confirm that these printed materials actually perform as expected. This last step of material characterization has been a major bottleneck in the pipeline of advanced materials screening.

Now, a new computer vision technique developed by MIT engineers significantly speeds up the characterization of newly synthesized electronic materials. The technique automatically analyzes images of printed semiconducting samples and quickly estimates two key electronic properties for each sample: band gap (a measure of electron activation energy) and stability (a measure of longevity).

The new technique accurately characterizes electronic materials 85 times faster than the standard benchmark approach.

The researchers intend to use the technique to speed up the search for promising solar cell materials. They also plan to incorporate the technique into a fully automated materials screening system.

“Ultimately, we envision fitting this technique into an autonomous lab of the future,” says MIT graduate student Eunice Aissi. “The whole system would allow us to give a computer a materials problem, have it predict potential compounds, and then run 24-7 making and characterizing those predicted materials until it arrives at the desired solution.”

“The application space for these techniques ranges from improving solar energy to transparent electronics and transistors,” adds MIT graduate student Alexander (Aleks) Siemenn. “It really spans the full gamut of where semiconductor materials can benefit society.”

Aissi and Siemenn detail the new technique in a study appearing today in Nature Communications. Their MIT co-authors include graduate student Fang Sheng, postdoc Basita Das, and professor of mechanical engineering Tonio Buonassisi, along with former visiting professor Hamide Kavak of Cukurova University and visiting postdoc Armi Tiihonen of Aalto University.

Power in optics

Once a new electronic material is synthesized, the characterization of its properties is typically handled by a “domain expert” who examines one sample at a time using a benchtop tool called a UV-Vis, which scans through different colors of light to determine where the semiconductor begins to absorb more strongly. This manual process is precise but also time-consuming: A domain expert typically characterizes about 20 material samples per hour — a snail’s pace compared to some printing tools that can lay down 10,000 different material combinations per hour.

“The manual characterization process is very slow,” Buonassisi says. “They give you a high amount of confidence in the measurement, but they’re not matched to the speed at which you can put matter down on a substrate nowadays.”

To speed up the characterization process and clear one of the largest bottlenecks in materials screening, Buonassisi and his colleagues looked to computer vision — a field that applies computer algorithms to quickly and automatically analyze optical features in an image.

“There’s power in optical characterization methods,” Buonassisi notes. “You can obtain information very quickly. There is richness in images, over many pixels and wavelengths, that a human just can’t process but a computer machine-learning program can.”

The team realized that certain electronic properties — namely, band gap and stability — could be estimated based on visual information alone, if that information were captured with enough detail and interpreted correctly.

With that goal in mind, the researchers developed two new computer vision algorithms to automatically interpret images of electronic materials: one to estimate band gap and the other to determine stability.

The first algorithm is designed to process visual data from highly detailed, hyperspectral images.

“Instead of a standard camera image with three channels — red, green, and blue (RGB) — the hyperspectral image has 300 channels,” Siemenn explains. “The algorithm takes that data, transforms it, and computes a band gap. We run that process extremely fast.”
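The article doesn’t reproduce the band gap routine itself, but the general shape of reading an absorption edge out of hyperspectral data can be sketched briefly. The Python snippet below is a minimal illustration rather than the authors’ code; the array names (cube, wavelengths_nm, sample_mask) and the steepest-slope edge heuristic are assumptions:

```python
import numpy as np

HC_EV_NM = 1239.84  # photon energy (eV) times wavelength (nm) for light

def estimate_band_gap(cube: np.ndarray, wavelengths_nm: np.ndarray,
                      sample_mask: np.ndarray) -> float:
    """Rough band gap estimate (eV) for one printed sample.

    cube           : hyperspectral absorbance data, shape (H, W, n_bands)
    wavelengths_nm : wavelength of each band in nm, shape (n_bands,)
    sample_mask    : boolean mask of the sample's pixels, shape (H, W)
    """
    # Average the spectrum over the sample's pixels -> shape (n_bands,).
    spectrum = cube[sample_mask].mean(axis=0)

    # Convert wavelengths to photon energies and sort by increasing energy.
    energies = HC_EV_NM / wavelengths_nm
    order = np.argsort(energies)
    energies, spectrum = energies[order], spectrum[order]

    # Crude absorption-edge estimate: the photon energy at which the
    # absorbance rises fastest (the steepest point of the edge).
    slope = np.gradient(spectrum, energies)
    return float(energies[np.argmax(slope)])
```

A real pipeline would more likely fit a Tauc-style model to the absorption edge rather than take a single slope maximum, but the overall flow (average the sample’s spectrum, convert wavelengths to photon energies, locate the edge) is the same.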

The second algorithm analyzes standard RGB images and assesses a material’s stability based on visual changes in the material’s color over time.

“We found that color change can be a good proxy for degradation rate in the material system we are studying,” Aissi says.

Material compositions

The team applied the two new algorithms to characterize the band gap and stability for about 70 printed semiconducting samples. They used a robotic printer to deposit samples on a single slide, like cookies on a baking sheet. Each deposit was made with a slightly different combination of semiconducting materials. In this case, the team printed different ratios of perovskites — a type of material that is expected to be a promising solar cell candidate, though it is also known to degrade quickly.

“People are trying to change the composition — add a little bit of this, a little bit of that — to try to make [perovskites] more stable and high-performance,” Buonassisi says.

Once they printed 70 different compositions of perovskite samples on a single slide, the team scanned the slide with a hyperspectral camera. Then they applied an algorithm that visually “segments” the image, automatically isolating the samples from the background. They ran the new band gap algorithm on the isolated samples and automatically computed the band gap for every sample. The entire band gap extraction process took about six minutes.
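The segmentation step itself isn’t described in detail. A common way to isolate printed droplets on a uniform slide background is intensity thresholding followed by connected-component labeling; the sketch below uses scikit-image and is an illustrative stand-in for, not a reconstruction of, the team’s algorithm (the min_area cutoff is an assumed value):

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def segment_samples(cube: np.ndarray, min_area: int = 200):
    """Return one boolean mask per printed sample found on the slide.

    cube : hyperspectral image, shape (H, W, n_bands)
    """
    # Collapse the spectral axis to a single intensity image.
    intensity = cube.mean(axis=2)

    # Separate droplets from the slide background with Otsu's threshold.
    foreground = intensity > threshold_otsu(intensity)

    # Label connected regions and keep those large enough to be samples.
    labeled = label(foreground)
    return [labeled == region.label
            for region in regionprops(labeled) if region.area >= min_area]
```

Each returned mask could then be passed, one sample at a time, to a band gap routine like the earlier sketch.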

“It would normally take a domain expert several days to manually characterize the same number of samples,” Siemenn says.

To test for stability, the team placed the same slide in a chamber in which they varied the environmental conditions, such as humidity, temperature, and light exposure. They used a standard RGB camera to take an image of the samples every 30 seconds over two hours. They then applied the second algorithm to the images of each sample over time to estimate the degree to which each droplet changed color, or degraded under various environmental conditions. In the end, the algorithm produced a “stability index,” or a measure of each sample’s durability. 
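The article doesn’t define the stability index mathematically. One plausible proxy, consistent with the description above, is to measure how far each sample’s average RGB color drifts from its starting color across the two-hour image series. The sketch below assumes hypothetical inputs (a time-ordered frames array and a per-sample mask) and an arbitrary squashing of drift into a bounded score:

```python
import numpy as np

def stability_index(frames: np.ndarray, sample_mask: np.ndarray) -> float:
    """Color-change proxy for degradation; closer to 1.0 means more stable.

    frames      : RGB frames over time, shape (T, H, W, 3), e.g. one per 30 s
    sample_mask : boolean mask of the sample's pixels, shape (H, W)
    """
    # Mean RGB color of the sample in every frame -> shape (T, 3).
    colors = np.array([frame[sample_mask].mean(axis=0) for frame in frames])

    # Average color drift relative to the first frame.
    drift = np.linalg.norm(colors - colors[0], axis=1).mean()

    # Arbitrary squashing: 1.0 for no visible change, approaching 0 as
    # the sample's color drifts further from its starting point.
    return float(1.0 / (1.0 + drift))
```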

As a check, the team compared their results with manual measurements of the same droplets, taken by a domain expert. Compared to the expert’s benchmark estimates, the team’s band gap and stability results were 98.5 percent and 96.9 percent as accurate, respectively, and 85 times faster.

“We were constantly shocked by how these algorithms were able to not just increase the speed of characterization, but also to get accurate results,” Siemenn says.  “We do envision this slotting into the current automated materials pipeline we’re developing in the lab, so we can run it in a fully automated fashion, using machine learning to guide where we want to discover these new materials, printing them, and then actually characterizing them, all with very fast processing.”

This work was supported, in part, by First Solar. 

5 ways generative AI will impact CISOs & cyber security teams – CyberTalk

EXECUTIVE SUMMARY:

Enterprises and individuals have adopted generative AI at an extremely impressive rate. In 2024, generative AI is projected to reach 77.8 million users worldwide, an adoption rate more than double that of smartphones and tablets over a comparable time frame.

While the integration of generative AI into work environments offers coveted agility and productivity gains, such benefits remain tenuous without the right workforce (and societal) structures in place to support AI-driven growth.

It nearly goes without saying: generative AI introduces a new layer of complexity into organizational systems. Effective workplace transformation, the kind that enables people to use generative AI for efficiency and productivity gains, depends on our ability to secure it, secure our people, and secure our processes.

In the second half of 2024, CISOs and cyber security teams can facilitate the best possible generative AI-based business outcomes by framing discussions and focal points around the following:

5 ways generative AI will impact CISOs and security teams

1. Expanded responsibilities. It may as well be written on a neon sign: generative AI will add new ‘to-dos’ to CISOs’ (already extensive) list of responsibilities. Only 9% of CISOs say that they are currently prepared to manage the risks associated with generative AI.

New generative AI-related responsibilities will involve dealing with data security and privacy, access control, model integrity and security, and user training, among other things.

2. AI governance. As generative AI’s footprint expands within enterprises, cyber security leaders must develop comprehensive governance frameworks to mitigate corresponding risks.

This includes addressing the potential for “shadow generative AI,” referring to the unsanctioned use of generative AI tooling. Shadow generative AI poses challenges that parallel those associated with shadow IT.

To build a strategic AI governance plan for your organization, start with an assessment of your organization’s unique needs and generative AI use-cases.

3. User training. Successful AI governance hinges on effective user awareness and training initiatives. Currently, only 17% of organizations have fully trained their teams on the risks around generative AI.

Prioritize generative AI awareness programs so as to communicate acceptable and unacceptable use-cases. This ultimately minimizes the potential for painful cyber security stumbles.

4. The dual-use dilemma. This concept refers to the notion that generative AI technologies can be applied for both beneficial and malicious gain.

The overwhelming majority of CISOs (70%) believe that generative AI will lead to an imbalance in “firepower,” enabling cyber criminals to wreak havoc on organizations at an unprecedented rate.

Will AI-generated phishing emails achieve higher click-through rates and drive a higher volume of attacks? No one knows. In the interim, CISOs are advised to proactively update and upgrade cyber security technologies.

5. AI in security tooling. Just over a third of CISOs currently use AI, either extensively or peripherally, within cyber security functions. However, within the next 12 months, 61% of CISOs intend to explore opportunities for generative AI implementation in security processes and protocols.

If your organization is currently assessing AI-based cyber security threat prevention technologies, see how Check Point’s Infinity AI Copilot can advance your initiatives. Learn more here.

Also, be sure to check out this CISO’s Guide to AI. Lastly, to receive cyber security thought leadership articles, groundbreaking research and emerging threat analyses each week, subscribe to the CyberTalk.org newsletter.

The 10 Best Dedicated Hosting Providers in 2024

Dedicated hosting is the highest tier of web hosting – you get a whole server to yourself and don’t need to share the server’s resources with any other website owner. The perks? Exceptional website performance, unlimited bandwidth, exclusive IP address, infinite flexibility, and unmatched security.  However,…

Digital Humans Are Not Just AI with a Face

Digital humans used to be simple chatbots that often misunderstood questions, which many people found frustrating. Now, they’ve evolved into advanced virtual agents that can communicate as effectively as the best customer service representatives, possess expert-level knowledge, and look strikingly like real humans.  These advanced digital…