Warner Bros. Games Acquires MultiVersus Developer 

Warner Bros. Games has acquired Player First Games, the developer of its cross-over platform fighter MultiVersus. The price of the sale was not disclosed, but according to a press release, Player First Games will continue to operate under its current leadership of co-founders Tony Huynh and Chris White. 

The announcement comes only a couple of months after MultiVersus’ relaunch on May 28. Player First Games served as a work-for-hire studio on the game, and Warner Bros. is apparently pleased enough with its performance to formally bring the studio in-house. It also comes only a few days after Warner Bros. shuttered the entire mobile division of Mortal Kombat developer NetherRealm Studios.

“We have worked with Player First Games over several years to create and launch MultiVersus, and we are very pleased to welcome this talented team to Warner Bros. Games,” said David Haddad, president of Warner Bros. Games. “The bright and creative team at Player First Games adds to our extensive development capabilities.”

“Our team is excited to join the Warner Bros. Games family, and we feel that this will be great for MultiVersus overall,” Huynh said. “We are working to make the MultiVersus game experience the best it can be and having our development team integrated with the publisher is optimum for the players.”

MultiVersus is a 2v2 spin on the Super Smash Bros. blueprint that pits a variety of characters from different Warner Bros. properties against each other. It was first launched in open beta in 2022 (read our review of that version here), then taken down months later to be rebuilt and expanded before returning this year. MultiVersus recently kicked off its second content season, and it has received new fighters such as The Joker, Agent Smith, Jason Voorhees, and the recently announced Samurai Jack and Beetlejuice.

Roadmap details how to improve exoplanet exploration using the JWST

The launch of NASA’s James Webb Space Telescope (JWST) in 2021 kicked off an exciting new era for exoplanet research, especially for scientists looking at terrestrial planets orbiting stars other than our sun. But three years into the telescope’s mission, some scientists have run into challenges that have slowed down progress.

In a recent paper published in Nature Astronomy, the TRAPPIST-1 JWST Community Initiative lays out a step-by-step roadmap to overcome the challenges they faced while studying the TRAPPIST-1 system by improving the efficiency of data gathering to benefit the astronomy community at large.

“A whole community of experts came together to tackle these complex cross-disciplinary challenges to design the first multiyear observational strategy to give JWST a fighting chance at identifying habitable worlds over its lifetime,” says Julien de Wit, an associate professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS) and one of the lead authors of the paper.

Two-for-one deal

Located 41 light years from Earth, the TRAPPIST-1 system with its seven planets presents a unique opportunity to study a large system with multiple planets of different compositions, similar to our own solar system.

“It’s a dream target: You have not one, but maybe three, planets in the habitable zone, so you have a way to actually compare in the same system,” says René Doyon from the Université de Montréal, who co-led the study with de Wit. “There are only a handful of well-characterized temperate rocky planets for which we can hope to detect their atmosphere, and most of them are within the TRAPPIST-1 system.”

Astronomers like de Wit and Doyon study exoplanet atmospheres through a technique called transmission spectroscopy, where they look at the way starlight passes through a planet’s potential atmosphere to see what elements are present. Transmission spectra are collected when the planet passes in front of its host star.

The planets within the TRAPPIST-1 system have short orbital periods. As a result, their transits frequently overlap. Transit observation times are usually allotted in five-hour windows, and when scheduled properly, close to half of these windows can catch at least two transits. This “two-for-one” saves both time and money while doubling data collection.

Stellar contamination

Stars are not uniform; their surfaces can vary in temperature, creating spots that can be hotter or cooler. Molecules like water vapor can condense in cool spots and interfere with transmission spectra. Stellar information like this can be difficult to disentangle from the planetary signal and give false indications of a planet’s atmospheric composition, creating what’s known as “stellar contamination.” While it has often been ignored, the improved capabilities of the JWST have revealed the challenges stellar contamination introduces when studying planetary atmospheres.

EAPS research scientist Ben Rackham ran into these challenges when they derailed his initial PhD research on small exoplanets using the Magellan Telescopes in Chile. He’s now seeing the same problem he first encountered as a graduate student repeating itself with the new JWST data.

“As we predicted from that earlier work with data from ground-based telescopes, the very first spectral signatures we’re getting with JWST don’t really make any sense in terms of a planetary interpretation,” he says. “The features are not what we would expect to see, and they change from transit to transit.”

Rackham and David Berardo, a postdoc in EAPS, have been working with de Wit on ways to correct for stellar contamination using two different methods: improving models of stellar spectra and using direct observations to derive corrections.

“By observing a star as it rotates, we can use the sensitivity of JWST to get a clearer picture of what its surface looks like, allowing for a more accurate measuring of the atmosphere of planets that transit it,” says Berardo. This, combined with studying back-to-back transits as proposed in the roadmap, collects useful data on the star that can be used to filter out stellar contamination from both future studies and past ones.

Beyond TRAPPIST-1

The current roadmap was born from the efforts of the TRAPPIST-1 JWST Community Initiative to bring together separate programs that each focused on individual planets, a fragmentation that prevented them from leveraging the optimal transit observation windows.

“We understood early on that this effort would ‘take a village’ to avoid the efficiency traps of small observation programs,” says de Wit. “Our hope now is that a large-scale community effort guided by the roadmap can be initiated to yield deliverables at a timely pace.” De Wit hopes that it could result in identifying habitable, or uninhabitable, worlds around TRAPPIST-1 within a decade.

Both de Wit and Doyon believe that the TRAPPIST-1 system is the best place for conducting fundamental research on exoplanet atmospheres that will extend to studies in other systems. Doyon thinks that “the TRAPPIST-1 system will be useful not only for TRAPPIST-1 itself, but also to learn how to do very precise correction of stellar activity which will be beneficial to many other transmission spectroscopy programs also affected by stellar activity.”

“We have within reach fundamental and transforming answers with a clear roadmap to them,” says de Wit. “We just need to follow it diligently.” 

Q&A: “As long as you have a future, you can still change it”

Tristan Brown is the S.C. Fang Chinese Language and Culture Career Development Professor at MIT. He specializes in the law, science, environment, and religion of late imperial China, a period running from the 16th through the early 20th centuries.

In this Q&A, Brown discusses how his areas of historical research can be useful for examining today’s pressing environmental challenges. This is part of an ongoing series exploring how the MIT School of Humanities, Arts, and Social Sciences is addressing the climate crisis.

Q: Why does this era of Chinese history resonate so much for you? How is it relevant to contemporary times and challenges?

A: China has always been interesting to historians because it has a long-recorded history, with data showing how people have coped with environmental and climate changes over the centuries. We have tons of records of various kinds of ecological issues, environmental crises, and the associated outbreaks of calamities, famine, epidemics, and warfare. Historians of China have a lot to offer ongoing conversations about climate.

More specifically, I research conflicts over land and resources that erupted when China was undergoing huge environmental, economic, demographic, and political pressures, and the role that feng shui played as local communities and the state tried to mediate those conflicts. [Feng shui is an ancient Chinese practice combining cosmology, spatial aesthetics, and measurement to divine the right balance between the natural and built environment.] Ultimately, the Qing (1644-1912) state was unable to manage these conflicts, and feng shui–based attempts to make decisions about conserving or exploiting certain areas blew up by the end of the 19th century in the face of pressures to industrialize. This is the subject of my first book, “Laws of the Land: Fengshui and the State in Qing Dynasty China.”

Q: Can you give a sense of how feng shui was used to determine outcomes in environmental cases?

A: We tend to think of feng shui as a popular design mechanism today. While this isn’t completely inaccurate, there was much more to it than that in Chinese history, when it evolved over many centuries. Specifically, there are lots of insights in feng shui that reflect the ways in which people recorded the natural world, explained how components in the environment related to one another, and understood why and how bad things happened. There is an interesting concept in feng shui that your environment affects your health, and specifically your children’s (i.e., descendants and progeny) health. That concept is found across premodern feng shui literature and is one of the fundamental principles of the whole knowledge system.

During the period I research, the Qing, the primary fuel energy sources in China came from timber and coal. There were legal cases where communities argued against efforts to mine a local mountain, saying that it could injure the feng shui (i.e., undermine the cosmological balance of natural forces and spatial integrity) of a mountain and hurt the fortunes of an entire region. People were suspicious of coal mining in their communities. They had seen or heard about mines collapsing and flooded mine shafts, and they had watched runoff ruin good farmland, causing crops to fail, and perhaps even children to fall ill. Coal mining disturbed the human-earth connection, and thus the relationship between people and nature. People invoked feng shui to express an idea that the extraction of rocks and minerals from the land can have detrimental effects on living communities. Whether out of a sincere community-based concern or out of a more self-interested NIMBYism, feng shui was the primary discourse invoked in these cases.

Not all efforts to conserve areas from mining succeeded, especially as foreign imperialism encroached on China, threatening government and local control over the economy. It became gradually clear to China’s elites that the country had to industrialize to survive, and this involved the difficult and even violent process of taking people from farm work and bringing them to cities, building railways, cutting millions of trees, and mining coal to power it all.

Q: This makes it seem as if the Chinese swept away feng shui whenever it presented a hurdle, putting the country on the path to coal dependence, pollution, and a carbon-emitting future.

A: Feng shui has not disappeared in China, but there’s no doubt that development in the form of industrialization took precedence in the 20th century, when feng shui became officially labeled a “superstition” on the national stage. When I first went to China in 2007, city air was so polluted I couldn’t see the horizon. I was 18 years old and the air in some northern cities like Shijiazhuang honestly felt scary. I’ve returned many times since then, of course, and there has been great improvement in air quality, because the government made it a priority.

Feng shui is a future-oriented knowledge, concerned with identifying events that have happened in the past that are related to things happening today, and using that information to influence future events. As Richard Smith of Rice University argues, the Chinese have used history to order the past, ritual to order the present, and divination to order the future. Consider, for instance, Xiong’an, a new development area outside of Beijing that is physically marking the era of Xi Jinping’s tenure as paramount leader. As soon as the site was selected, people in China started talking about its feng shui, both out of potential environmental concerns and as a subtle form of political commentary. MIT’s own Sol Andrew Stokols in the Department of Urban Studies and Planning (DUSP) has a fantastic new dissertation examining that new area.

In short, the feng shui masters of old said there will be floods and droughts and bad stuff happening in the future if a course correction isn’t made. But at the same time, in feng shui there’s never a situation that is hopeless; there is no lost cause. So, there is optimism in the knowledge and rhetoric of feng shui that I think might be applicable as time goes on with climate change. As long as you have a future, you can still change it. 

Q: In 2023, you were awarded one of the first grants of MIT’s Climate Nucleus, the faculty committee charged with seeing through the Institute’s climate action plan over the decade. What have you been up to courtesy of this fund?

A: Well, it all started years ago, when I started thinking about the great number of mountains in China associated with Buddhism or Daoism that have become national parks in recent decades. Some of these mountains host trees and plant species that are not found in any other part of China. For my grant, I wanted to find out how these mountains have managed to incubate such rare species for the last 2,000 years. And it’s not as simple as just saying, well, Buddhism, right? Because there are plenty of Buddhist mountains that have not fared as well ecologically. The religious landscape is part of the answer, but there’s also all the messiness of material history that surrounds such a mountain.

With this grant, I am bringing together a group of scholars of religion, historians, and engineers working in conservation ecology, and we’re trying to figure out what makes some of these places religiously and environmentally distinctive. People come to the project with different approaches. My MIT colleague Serguei Saavedra in the Department of Civil and Environmental Engineering uses new models in system ecology to measure the resilience of environments under various stresses. My colleague in religious studies, Or Porath at Tel Aviv University, is asking when and how Asian religions have centered — or ignored — animals and animal welfare. Another collaboration with MIT’s Siqi Zheng in DUSP and Wen-Chi Liao at the National University of Singapore is looking at how we can use artificial intelligence, machine learning, and classical feng shui manuals to teach computers how to analyze the value of a property’s feng shui in Sinophone communities around the world. There’s a lot going on!

Q: How do you bring China’s unique environmental history and law into your classroom, and make it immediate and relevant to the world students face today?

A: History is always part of the answer. I mean, whether it’s for an economist, a political scientist, or an architect, history matters. Likewise, when you’re confronting climate change and all these struggles regarding the environment and various crises involving ecosystems, it’s always a good idea to look at how human beings in the past dealt with similar crises. It doesn’t give you a prediction on what would happen in the future, but it gives you some range of possibilities, many of which may at first appear counterintuitive or surprising.

That’s exactly what the humanities do. My job is to make MIT undergraduates care about a people who are no longer alive, who walked the earth a thousand years ago, who confronted terrible times of conflict and hunger. Sometimes these people left behind a written record about their world, and sometimes they didn’t. But we try to hear them out regardless. I want students to develop empathy for these strangers and wonder what it would be like to walk in their shoes. Every one of those people is someone’s ancestor, and they very well could have been your ancestor.

In my class 21H.186 (Nature and Environment in China), we look at the historical precedents that might be useful for today’s environmental challenges, ranging from urban pollution to domestic recycling systems. The fact we’re still here to ask historical questions is itself significant. When we feel despair about climate change, we can ask, “How did individuals endure the changed course of the Yellow River or the Little Ice Age?” Even when it is recording tragedies, history can be understood as an enduring form of hope.

Study across multiple brain regions discerns Alzheimer’s vulnerability and resilience factors

An open-access MIT study published today in Nature provides new evidence for how specific cells and circuits become vulnerable in Alzheimer’s disease, and homes in on other factors that may help some people show resilience to cognitive decline, even amid clear signs of disease pathology.

To highlight potential targets for interventions to sustain cognition and memory, the authors engaged in a novel comparison of gene expression across multiple brain regions in people with or without Alzheimer’s disease, and conducted lab experiments to test and validate their major findings.

Brain cells all have the same DNA, but what makes them differ, both in their identity and their activity, is how they express those genes. The new analysis measured gene expression differences in more than 1.3 million cells of more than 70 cell types in six brain regions from 48 tissue donors, 26 of whom died with an Alzheimer’s diagnosis and 22 of whom did not. As such, the study provides a uniquely large, far-ranging, and yet detailed accounting of how brain cell activity differs amid Alzheimer’s disease by cell type, by brain region, by disease pathology, and by each person’s cognitive assessment while still alive.

“Specific brain regions are vulnerable in Alzheimer’s and there is an important need to understand how these regions or particular cell types are vulnerable,” says co-senior author Li-Huei Tsai, Picower Professor of Neuroscience and director of The Picower Institute for Learning and Memory and the Aging Brain Initiative at MIT. “And the brain is not just neurons. It’s many other cell types. How these cell types may respond differently, depending on where they are, is something fascinating we are only at the beginning of looking at.”

Co-senior author Manolis Kellis, professor of computer science and head of MIT’s Computational Biology Group, likens single-cell RNA profiling, the technique used to measure gene expression, to a much more advanced “microscope” than the one that first allowed Alois Alzheimer to characterize the disease’s pathology more than a century ago.

“Where Alzheimer saw amyloid protein plaques and phosphorylated tau tangles in his microscope, our single-cell ‘microscope’ tells us, cell by cell and gene by gene, about thousands of subtle yet important biological changes in response to pathology,” says Kellis. “Connecting this information with the cognitive state of patients reveals how cellular responses relate with cognitive loss or resilience, and can help propose new ways to treat cognitive loss. Pathology can precede cognitive symptoms by a decade or two before cognitive decline becomes diagnosed. If there’s not much we can do about the pathology at that stage, we can at least try to safeguard the cellular pathways that maintain cognitive function.”

Hansruedi Mathys, a former MIT postdoc in the Tsai Lab who is now an assistant professor at the University of Pittsburgh; Carles Boix PhD ’22, a former graduate student in Kellis’s lab who is now a postdoc at Harvard Medical School; and Leyla Akay, a graduate student in Tsai’s lab, led the study analyzing the prefrontal cortex, entorhinal cortex, hippocampus, anterior thalamus, angular gyrus, and the midtemporal cortex. The brain samples came from the Religious Orders Study and the Rush Memory and Aging Project at Rush University.

Neural vulnerability and Reelin

Some of the earliest signs of amyloid pathology and neuron loss in Alzheimer’s occur in memory-focused regions called the hippocampus and the entorhinal cortex. In those regions, and in other parts of the cerebral cortex, the researchers were able to pinpoint a potential reason why. One type of excitatory neuron in the hippocampus and four in the entorhinal cortex were significantly less abundant in people with Alzheimer’s than in people without. Individuals with depletion of those cells performed significantly worse on cognitive assessments. Moreover, many vulnerable neurons were interconnected in a common neuronal circuit. And just as importantly, several either directly expressed a protein called Reelin, or were directly affected by Reelin signaling. In all, therefore, the findings distinctly highlight especially vulnerable neurons, whose loss is associated with reduced cognition, that share a neuronal circuit and a molecular pathway.

Tsai notes that Reelin has become prominent in Alzheimer’s research because of a recent study of a man in Colombia. He had a rare mutation in the Reelin gene that caused the protein to be more active, and was able to stay cognitively healthy at an advanced age despite having a strong family predisposition to early-onset Alzheimer’s. The new study shows that loss of Reelin-producing neurons is associated with cognitive decline. Taken together, it might mean that the brain benefits from Reelin, but that neurons that produce it may be lost in at least some Alzheimer’s patients.

“We can think of Reelin as having maybe some kind of protective or beneficial effect,” Akay says. “But we don’t yet know what it does or how it could confer resilience.”

In further analysis, the researchers also found that the specifically vulnerable inhibitory neuron subtypes identified in the prefrontal cortex in a previous study from this group were also involved in Reelin signaling, further reinforcing the significance of the molecule and its signaling pathway.

To further check their results, the team directly examined the human brain tissue samples and the brains of two kinds of Alzheimer’s model mice. Sure enough, those experiments also showed a reduction in Reelin-positive neurons in the human and mouse entorhinal cortex.

Resilience associated with choline metabolism in astrocytes

To find factors that might preserve cognition, even amid pathology, the team examined which genes, in which cells, and in which regions, were most closely associated with cognitive resilience, which they defined as residual cognitive function, above the typical cognitive loss expected given the observed pathology.

Their analysis yielded a surprising and specific answer: across several brain regions, astrocytes that expressed genes associated with antioxidant activity and with choline metabolism and polyamine biosynthesis were significantly associated with sustained cognition, even amid high levels of tau and amyloid. The results reinforced previous research findings led by Tsai and Susan Lundqvist in which they showed that dietary supplementation with choline helped astrocytes cope with the dysregulation of lipids caused by the most significant Alzheimer’s risk gene, the APOE4 variant. The polyamine findings also pointed to spermidine, a molecule available as a dietary supplement that may have anti-inflammatory properties, although such an association would need further work to be established causally.

As before, the team went beyond the predictions from the single-cell RNA expression analysis to make direct observations in the brain tissue samples. Samples from cognitively resilient individuals indeed showed increased expression of several of the astrocyte-expressed genes predicted to be associated with cognitive resilience.

Expression of the gene GPCPD1 in astrocyte cells is associated with cognitive resilience in people with Alzheimer’s pathology. Here, white arrows indicate instances of GPCPD1 expression (blue) in astrocyte cells (denoted by AQP4 staining in magenta). There is much more expression in tissue from the cognitively resilient person (right).

Image: Tsai Lab/The Picower Institute


New analysis method, open dataset

To analyze the mountains of single-cell data, the researchers developed a robust new methodology based on groups of coordinately expressed genes (known as “gene modules”), exploiting the correlated expression patterns of functionally related genes in the same module.

“In principle, the 1.3 million cells we surveyed could use their 20,000 genes in an astronomical number of different combinations,” explains Kellis. “In practice, however, we observe a much smaller subset of coordinated changes. Recognizing these coordinated patterns allows us to infer much more robust changes, because they are based on multiple genes in the same functionally connected module.”

He offered this analogy: With many joints in their bodies, people could move in all kinds of crazy ways, but in practice they engage in many fewer coordinated movements like walking, running, or dancing. The new method enables scientists to identify such coordinated gene expression programs as a group.

While Kellis and Tsai’s labs have already reported several noteworthy findings from the dataset, the researchers expect that many more significant discoveries remain to be found in the trove of data. To facilitate such discovery, the team posted analytical and visualization tools along with the data on Kellis’s website.

“The dataset is so immensely rich. We focused on only a few aspects that are salient that we believe are very, very interesting, but by no means have we exhausted what can be learned with this dataset,” Kellis says. “We expect many more discoveries ahead, and we hope that young researchers (of all ages) will dive right in and surprise us with many more insights.”

Going forward, Kellis says, the researchers are studying the control circuitry associated with the differentially expressed genes, to understand the genetic variants, the regulators, and other driver factors that can be modulated to reverse disease circuitry across brain regions, cell types, and different stages of the disease.

Additional authors of the study include Ziting Xia, Jose Davila Velderrain, Ayesha P. Ng, Xueqiao Jiang, Ghada Abdelhady, Kyriaki Galani, Julio Mantero, Neil Band, Benjamin T. James, Sudhagar Babu, Fabiola Galiana-Melendez, Kate Louderback, Dmitry Prokopenko, Rudolph E. Tanzi, and David A. Bennett.

Support for the research came from the National Institutes of Health, The Picower Institute for Learning and Memory, The JPB Foundation, the Cure Alzheimer’s Fund, The Robert A. and Renee E. Belfer Family Foundation, Eduardo Eurnekian, and Joseph DiSabato.

Mission directors announced for the Climate Project at MIT

The Climate Project at MIT has appointed leaders for each of its six focal areas, or Climate Missions, President Sally Kornbluth announced in a letter to the MIT community today.

Introduced in February, the Climate Project at MIT is a major new effort to change the trajectory of global climate outcomes for the better over the next decade. The project will focus MIT’s strengths on six broad climate-related areas where progress is urgently needed. The mission directors in these fields, representing diverse areas of expertise, will collaborate with faculty and researchers across MIT, as well as each other, to accelerate solutions that address climate change.

“The mission directors will be absolutely central as the Climate Project seeks to marshal the Institute’s talent and resources to research, develop, deploy and scale up serious solutions to help change the planet’s climate trajectory,” Kornbluth wrote in her letter, adding: “To the faculty members taking on these pivotal roles: We could not be more grateful for your skill and commitment, or more enthusiastic about what you can help us all achieve, together.”

The Climate Project will expand and accelerate MIT’s efforts to both reduce greenhouse gas emissions and respond to climate effects such as extreme heat, rising sea levels, and reduced crop yields. At the urgent pace needed, the project will help the Institute create new external collaborations and deepen existing ones to develop and scale climate solutions.

The Institute has pledged an initial $75 million to the project, including $25 million from the MIT Sloan School of Management to launch a complementary effort, the new MIT Climate Policy Center. MIT has more than 300 faculty and senior researchers already working on climate issues, in collaboration with their students and staff. The Climate Project at MIT builds on their work and the Institute’s 2021 “Fast Forward” climate action plan.

Richard Lester, MIT’s vice provost for international activities and the Japan Steel Industry Professor of Nuclear Science and Engineering, has led the Climate Project’s formation; MIT will shortly hire a vice president for climate to oversee the project. The six Climate Missions and the new mission directors are as follows:

Decarbonizing energy and industry

This mission supports advances in the electric power grid as well as the transition across all industry — including transportation, computing, heavy production, and manufacturing — to low-emissions pathways.

The mission director is Elsa Olivetti PhD ’07, MIT’s associate dean of engineering and the Jerry McAfee Professor in Engineering, who has been a professor of materials science and engineering since 2014.

Olivetti analyzes and improves the environmental sustainability of materials throughout the life cycle and across the supply chain, by linking physical and chemical processes to systems impact. She researches materials design and synthesis using natural language processing, builds models of material supply and technology demand, and assesses the potential from recovering value from industrial waste through experimental approaches. Olivetti has experience building partnerships across the Institute and working with industry to implement large-scale climate solutions through her role as co-director of the MIT Climate and Sustainability Consortium (MCSC) and as faculty lead for PAIA, an industry consortium on the carbon footprinting of computing.

Restoring the atmosphere, protecting the land and oceans

This mission is centered on removing or storing greenhouse gases that have already been emitted into the atmosphere, such as carbon dioxide and methane, and on protecting ocean and land ecosystems, including food and water systems.

MIT has chosen two mission directors: Andrew Babbin and Jesse Kroll. The two bring together research expertise from two critical domains of the Earth system, oceans and the atmosphere, as well as backgrounds in both the science and engineering underlying our understanding of Earth’s climate. As co-directors, they jointly link MIT’s School of Science and School of Engineering in this domain.

Babbin is the Cecil and Ida Green Career Development Professor in MIT’s Program in Atmospheres, Oceans, and Climate. He is a marine biogeochemist whose specialty is studying the carbon and nitrogen cycle of the oceans, work that is related to evaluating the ocean’s capacity for carbon storage, an essential element of this mission’s work. He has been at MIT since 2017.

Kroll is a professor in MIT’s Department of Civil and Environmental Engineering, a professor of chemical engineering, and the director of the Ralph M. Parsons Laboratory. He is a chemist who studies organic compounds and particulate matter in the atmosphere, in order to better understand how perturbations to the atmosphere, both intentional and unintentional, can affect air pollution and climate.

Empowering frontline communities

This mission focuses on the development of new climate solutions in support of the world’s most vulnerable populations, in areas ranging from health effects to food security, emergency planning, and risk forecasting.

The mission director is Miho Mazereeuw, an associate professor of architecture and urbanism in MIT’s Department of Architecture in the School of Architecture and Planning, and director of MIT’s Urban Risk Lab. Mazereeuw researches disaster resilience, climate change, and coastal strategies. Her lab has engaged in design projects ranging from physical objects to software, while exploring methods of engaging communities and governments in preparedness efforts, skills she brings to bear on building strong collaborations with a broad range of stakeholders.

Mazereeuw is also co-lead of one of the five projects selected in MIT’s Climate Grand Challenges competition in 2022, an effort to help communities prepare by understanding the risk of extreme weather events for specific locations.

Building and adapting healthy, resilient cities

A majority of the world’s population lives in cities, so urban design and planning is a crucial part of climate work, involving transportation, infrastructure, finance, government, and more.

Christoph Reinhart, the Alan and Terri Spoon Professor of Architecture and Climate and director of MIT’s Building Technology Program in the School of Architecture and Planning, is the mission director in this area. The Sustainable Design Lab that Reinhart founded when he joined MIT in 2012 has launched several technology startups, including Mapdwell Solar System, now part of Palmetto Clean Technology, as well as Solemma, maker of environmental building design software used in architectural practice and education worldwide. Reinhart’s online course on Sustainable Building Design has enrolled over 55,000 individuals and forms part of MIT’s XSeries Program in Future Energy Systems.

Inventing new policy approaches

Climate change is a unique crisis. With that in mind, this mission aims to develop new institutional structures and incentives — in carbon markets, finance, trade policy, and more — along with decision support tools and systems for scaling up climate efforts.

Christopher Knittel brings extensive knowledge of these topics to the mission director role. The George P. Shultz Professor and Professor of Applied Economics at the MIT Sloan School of Management, Knittel has produced high-impact research in multiple areas; his studies on emissions and the automobile industry have evaluated fuel-efficiency standards, changes in vehicle fuel efficiency, market responses to fuel-price changes, and the health impact of automobiles.

Beyond that, Knittel has also studied the impact of the energy transition on jobs, conducted high-level evaluations of climate policies, and examined energy market structures. He joined the MIT faculty in 2011. He also serves as the director of the MIT Climate Policy Center, which will work closely with all six missions.

Wild cards

This mission consists of what the Climate Project at MIT calls “unconventional solutions outside the scope of the other missions.” While all the missions will be charged with encouraging unorthodox approaches within their domains, this one has a broad mandate to seek out and promote ideas that fall outside the others.

The mission director in this case is Benedetto Marelli, the Paul M. Cook Career Development Associate Professor in MIT’s Department of Civil and Environmental Engineering. Marelli’s research group develops biopolymers and bioinspired materials with reduced environmental impact compared to traditional technologies. He engages with research at multiple scales, including nanofabrication, and the research group has conducted extensive work on food security and safety while exploring new techniques to reduce waste through enhanced food preservation and to precisely deliver agrochemicals in plants and in soil.

As Lester and other MIT leaders have noted, the Climate Project at MIT is still being shaped, and will have the flexibility to accommodate a wide range of projects, partnerships, and approaches needed for thoughtful, fast-moving change. By filling out the leadership structure, today’s announcement is a major milestone in making the project operational.

In addition to the six Climate Missions, the Climate Project at MIT includes Climate Frontier Projects, which are efforts launched by these missions, and a Climate HQ, which will support fundamental research, education, and outreach, as well as new resources to connect research to the practical work of climate response.


Matrox ConvertIP Enables ST2110 for the Sphere in Las Vegas – Videoguys

In this article from Matrox, they give an overview of how Fuse Technical Group created a one-of-a-kind experience with Matrox Video technology for the Las Vegas Sphere. Las Vegas has unveiled a groundbreaking live entertainment venue that redefines the concert experience through cutting-edge technology and immersive design. This state-of-the-art venue boasts a 580,000-square-foot LED exterior and an unparalleled 160,000-square-foot, 16K-resolution wraparound LED screen inside. Combined with a spatial audio system and 4D physical effects, it offers showgoers an immersive audio-visual spectacle like no other.

A Legendary Debut

In September 2023, a legendary musical act inaugurated this remarkable venue with a residency, delivering its first-ever show. Behind the scenes, multimedia system designer Fuse Technical Group played a pivotal role in bringing this vision to life. Fuse is renowned for its expertise in creating awe-inspiring video and lighting systems, LED solutions, media servers, control systems, and more for events worldwide.

Advanced ST 2110 Workflow for 16K Video Outputs

For this groundbreaking residency, Fuse Technical Group developed an advanced video playback system capable of handling 16K resolution and augmenting it with live inputs, all carried over a complete SMPTE ST 2110 IP backbone. This system was designed to deliver stunning visual elements throughout the show, setting a new standard for live entertainment.

Overcoming Technical Challenges with Expertise

The development of this system required Fuse to leverage their extensive experience in SDI and event production. The content feeding the screens operated on ST 2110 at 4K (4096 x 2160), necessitating 26 4K outputs to drive the entire system. This setup allowed for dynamic live input placement on the LED canvas, creating an immersive visual experience for the audience.
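As a rough sanity check of the scale described above — and assuming each of the 26 outputs runs at the stated DCI 4K resolution of 4096 x 2160 (the article does not give the exact canvas geometry) — the aggregate pixel count the system must drive can be computed directly:

```python
# Back-of-envelope estimate of the total pixels driven by the playback system.
# Assumption: all 26 outputs run at DCI 4K (4096 x 2160), per the article;
# the actual LED canvas mapping is not specified and may differ.
OUTPUTS = 26
WIDTH, HEIGHT = 4096, 2160

pixels_per_output = WIDTH * HEIGHT
total_pixels = OUTPUTS * pixels_per_output

print(f"Pixels per 4K output:  {pixels_per_output:,}")  # 8,847,360
print(f"Aggregate pixel count: {total_pixels:,}")       # 230,031,360
```

Roughly 230 million pixels per frame helps explain why a conventional SDI distribution approach would struggle here, and why an IP-based ST 2110 fabric was chosen instead.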

“Working with ST 2110 is very different from working with SDI,” said Ryan Middlemiss, Fuse Technical Group’s director of media servers. “ST 2110 offers limitless distribution capabilities, but it also presents unique challenges.”

Partnership with Matrox Video

To navigate these challenges, Fuse turned to Matrox Video for their expertise in ST 2110. Matrox provided critical hardware and software solutions, including ConductIP for routing and orchestrating ST 2110 sources, and ConvertIP devices to convert high-resolution SDI into ST 2110. This partnership ensured low latency, 25G network speeds, and 2022-7 redundancy, meeting the high demands of the project.

“The support from Matrox Video was crucial,” Middlemiss noted. “Their responsiveness and engineering support, especially with last-minute changes, were key to our success.”

Efficient Remote Operation with Matrox Extio 3

The live production system included 30 computers, operated remotely by a programmer and two technicians using Matrox Extio 3 IP KVM extenders. This setup allowed them to control multiple monitors and computers efficiently from a single workstation, significantly enhancing operational efficiency during the show.

“Extio 3 empowers the techs and the programmer to be as efficient as possible,” said Middlemiss. “Its efficiency is incredibly valuable, especially when there’s no time to spare.”

Conclusion

Las Vegas’ next-generation entertainment venue sets a new standard for live shows with its advanced technology and immersive design. The collaboration between Fuse Technical Group and Matrox Video demonstrates the power of combining expertise and innovative solutions to create unforgettable experiences. Concertgoers are treated to the audiovisual experience of a lifetime, showcasing the future of live entertainment.

Read the full article from Matrox HERE
