Dealing with the limitations of our noisy world

Tamara Broderick first set foot on MIT’s campus when she was a high school student, as a participant in the inaugural Women’s Technology Program. The monthlong summer academic experience gives young women a hands-on introduction to engineering and computer science.

What is the probability that she would return to MIT years later, this time as a faculty member?

That’s a question Broderick could probably answer quantitatively using Bayesian inference, a statistical approach to probability that tries to quantify uncertainty by continuously updating one’s assumptions as new data are obtained.
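
For a concrete, if toy, picture of that updating process, here is a minimal Beta-Binomial sketch in Python; the prior, the data, and every name in it are invented for illustration:

```python
# Minimal sketch of Bayesian updating: a Beta prior over an unknown
# probability p is revised as each new yes/no observation arrives.
from scipy import stats

alpha, beta = 1.0, 1.0          # Beta(1, 1) = uniform prior over p
observations = [1, 0, 1, 1, 0]  # hypothetical binary data

for y in observations:
    alpha += y                  # each success bumps alpha
    beta += 1 - y               # each failure bumps beta

posterior = stats.beta(alpha, beta)
print(f"posterior mean: {posterior.mean():.3f}")
print(f"95% credible interval: {posterior.interval(0.95)}")  # quantified uncertainty
```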

In her lab at MIT, the newly tenured associate professor in the Department of Electrical Engineering and Computer Science (EECS) uses Bayesian inference to quantify uncertainty and measure the robustness of data analysis techniques.

“I’ve always been really interested in understanding not just ‘What do we know from data analysis,’ but ‘How well do we know it?’” says Broderick, who is also a member of the Laboratory for Information and Decision Systems and the Institute for Data, Systems, and Society. “The reality is that we live in a noisy world, and we can’t always get exactly the data that we want. How do we learn from data but at the same time recognize that there are limitations and deal appropriately with them?”

Broadly, her focus is on helping people understand the confines of the statistical tools available to them and, sometimes, working with them to craft better tools for a particular situation.

For instance, her group recently collaborated with oceanographers to develop a machine-learning model that can make more accurate predictions about ocean currents. In another project, she and others worked with degenerative disease specialists on a tool that helps severely motor-impaired individuals utilize a computer’s graphical user interface by manipulating a single switch.

A common thread woven through her work is an emphasis on collaboration.

“Working in data analysis, you get to hang out in everybody’s backyard, so to speak. You really can’t get bored because you can always be learning about some other field and thinking about how we can apply machine learning there,” she says.

Hanging out in many academic “backyards” is especially appealing to Broderick, who struggled even from a young age to narrow down her interests.

A math mindset

Growing up in a suburb of Cleveland, Ohio, Broderick had an interest in math for as long as she can remember. She recalls being fascinated by the idea of what would happen if you kept adding a number to itself, starting with 1+1=2 and then 2+2=4.

“I was maybe 5 years old, so I didn’t know what ‘powers of two’ were or anything like that. I was just really into math,” she says.

Her father recognized her interest in the subject and enrolled her in a Johns Hopkins program called the Center for Talented Youth, which gave Broderick the opportunity to take three-week summer classes on a range of subjects, from astronomy to number theory to computer science.

Later, in high school, she conducted astrophysics research with a postdoc at Case Western Reserve University. In the summer of 2002, she spent four weeks at MIT as a member of the first class of the Women’s Technology Program.

She especially enjoyed the freedom offered by the program, and its focus on using intuition and ingenuity to achieve high-level goals. For instance, the cohort was tasked with building a device with LEGOs that they could use to biopsy a grape suspended in Jell-O.

The program showed her how much creativity is involved in engineering and computer science, and piqued her interest in pursuing an academic career.

“But when I got into college at Princeton, I could not decide — math, physics, computer science — they all seemed super-cool. I wanted to do all of it,” she says.

She settled on pursuing an undergraduate math degree but took all the physics and computer science courses she could cram into her schedule.

Digging into data analysis

After receiving a Marshall Scholarship, Broderick spent two years at Cambridge University in the United Kingdom, earning a master of advanced study in mathematics and a master of philosophy in physics.

In the U.K., she took a number of statistics and data analysis classes, including her first class on Bayesian data analysis in the field of machine learning.

It was a transformative experience, she recalls.

“During my time in the U.K., I realized that I really like solving real-world problems that matter to people, and Bayesian inference was being used in some of the most important problems out there,” she says.

Back in the U.S., Broderick headed to the University of California at Berkeley, where she joined the lab of Professor Michael I. Jordan as a grad student. She earned a PhD in statistics with a focus on Bayesian data analysis. 

She decided to pursue a career in academia and was drawn to MIT by the collaborative nature of the EECS department and by how passionate and friendly her would-be colleagues were.

Her first impressions panned out, and Broderick says she has found a community at MIT that helps her be creative and explore hard, impactful problems with wide-ranging applications.

“I’ve been lucky to work with a really amazing set of students and postdocs in my lab — brilliant and hard-working people whose hearts are in the right place,” she says.

One of her team’s recent projects involves a collaboration with an economist who studies the use of microcredit, or the lending of small amounts of money at very low interest rates, in impoverished areas.

The goal of microcredit programs is to raise people out of poverty. Economists run randomized controlled trials in which some villages in a region receive microcredit and others don’t. They then want to generalize the results, predicting the expected outcome if microcredit were applied to villages outside the study.

But Broderick and her collaborators have found that results of some microcredit studies can be very brittle. Removing one or a few data points from the dataset can completely change the results. One issue is that researchers often use empirical averages, where a few very high or low data points can skew the results.

Using machine learning, she and her collaborators developed a method that can determine how many data points must be dropped to change the substantive conclusion of the study. With their tool, a scientist can see how brittle the results are.
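
Her group’s published tools rely on fast approximations to scale to realistic analyses; the toy Python sketch below asks the same question for the simplest possible estimator, an empirical average whose sign is the “conclusion” (the function name and data are invented for illustration):

```python
import numpy as np

def points_to_flip_mean_sign(data):
    """Smallest number of points whose removal flips the sign of the mean.

    For a plain average, dropping the most extreme values moves the mean
    fastest, so this greedy scan is exact.
    """
    data = np.asarray(data, dtype=float)
    sign = np.sign(data.mean())
    # Sort so the values that most support the current conclusion come first.
    ordered = np.sort(data)[::-1] if sign > 0 else np.sort(data)
    for k in range(1, len(data)):
        if np.sign(ordered[k:].mean()) != sign:
            return k          # dropping the k most supportive points flips it
    return len(data)

# A "positive effect" driven by a single outlier: one dropped point flips it.
effects = [12.0, -0.5, -0.3, -0.4, -0.2]
print(points_to_flip_mean_sign(effects))  # -> 1
```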

“Sometimes dropping a very small fraction of data can change the major results of a data analysis, and then we might worry how far those conclusions generalize to new scenarios. Are there ways we can flag that for people? That is what we are getting at with this work,” she explains.

At the same time, she is continuing to collaborate with researchers in a range of fields, such as genetics, to understand the pros and cons of different machine-learning techniques and other data analysis tools.

Happy trails

Exploration is what drives Broderick as a researcher, and it also fuels one of her passions outside the lab. She and her husband enjoy collecting patches they earn by hiking all the trails in a park or trail system.

“I think my hobby really combines my interests of being outdoors and spreadsheets,” she says. “With these hiking patches, you have to explore everything and then you see areas you wouldn’t normally see. It is adventurous, in that way.”

They’ve discovered some amazing hikes they would never have known about, but also embarked on more than a few “total disaster hikes,” she says. But each hike, whether a hidden gem or an overgrown mess, offers its own rewards.

And just like in her research, curiosity, open-mindedness, and a passion for problem-solving have never led her astray.

Startup accelerates progress toward light-speed computing

Our ability to cram ever-smaller transistors onto a chip has enabled today’s age of ubiquitous computing. But that approach is finally running into limits, with some experts declaring an end to Moore’s Law and to a related principle known as Dennard scaling.

Those developments couldn’t be coming at a worse time. Demand for computing power has skyrocketed in recent years thanks in large part to the rise of artificial intelligence, and it shows no signs of slowing down.

Now Lightmatter, a company founded by three MIT alumni, is continuing the remarkable progress of computing by rethinking the lifeblood of the chip. Instead of relying solely on electricity, the company also uses light for data processing and transport. The company’s first two products, a chip specializing in artificial intelligence operations and an interconnect that facilitates data transfer between chips, use both photons and electrons to drive more efficient operations.

“The two problems we are solving are ‘How do chips talk?’ and ‘How do you do these [AI] calculations?’” Lightmatter co-founder and CEO Nicholas Harris PhD ’17 says. “With our first two products, Envise and Passage, we’re addressing both of those questions.”

In a nod to the size of the problem and the demand for AI, Lightmatter raised just north of $300 million in 2023 at a valuation of $1.2 billion. Now the company is demonstrating its technology with some of the largest technology companies in the world in hopes of reducing the massive energy demand of data centers and AI models.

“We’re going to enable platforms on top of our interconnect technology that are made up of hundreds of thousands of next-generation compute units,” Harris says. “That simply wouldn’t be possible without the technology that we’re building.”

From idea to $100K

Prior to MIT, Harris worked at the semiconductor company Micron Technology, where he studied the fundamental devices behind integrated chips. The experience made him see how the traditional approach for improving computer performance — cramming more transistors onto each chip — was hitting its limits.

“I saw how the roadmap for computing was slowing, and I wanted to figure out how I could continue it,” Harris says. “What approaches can augment computers? Quantum computing and photonics were two of those pathways.”

Harris came to MIT to work on photonic quantum computing for his PhD under Dirk Englund, an associate professor in the Department of Electrical Engineering and Computer Science. As part of that work, he built silicon-based integrated photonic chips that could send and process information using light instead of electricity.

The work led to dozens of patents and more than 80 research papers in prestigious journals like Nature. But another technology also caught Harris’s attention at MIT.

“I remember walking down the hall and seeing students just piling out of these auditorium-sized classrooms, watching relayed live videos of lectures to see professors teach deep learning,” Harris recalls, referring to the artificial intelligence technique. “Everybody on campus knew that deep learning was going to be a huge deal, so I started learning more about it, and we realized that the systems I was building for photonic quantum computing could actually be leveraged to do deep learning.”

Harris had planned to become a professor after his PhD, but he realized he could attract more funding and innovate more quickly through a startup, so he teamed up with Darius Bunandar PhD ’18, who was also studying in Englund’s lab, and Thomas Graham MBA ’18. The co-founders successfully launched into the startup world by winning the 2017 MIT $100K Entrepreneurship Competition.

Seeing the light

Lightmatter’s Envise chip takes the part of computing that electrons do well, like memory, and combines it with what light does well, like performing the massive matrix multiplications of deep-learning models.

“With photonics, you can perform multiple calculations at the same time because the data is coming in on different colors of light,” Harris explains. “In one color, you could have a photo of a dog. In another color, you could have a photo of a cat. In another color, maybe a tree, and you could have all three of those operations going through the same optical computing unit, this matrix accelerator, at the same time. That drives up operations per area, and it reuses the hardware that’s there, driving up energy efficiency.”
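
A loose electronic analogy for what Harris describes, though not Lightmatter’s actual design, is a batched matrix multiply in which one set of programmed weights acts on several inputs at once, one per wavelength:

```python
import numpy as np

# Conceptual sketch of wavelength multiplexing: the same optical matrix unit
# (here, the weight matrix W) processes several inputs simultaneously, each
# carried on its own "color" of light. All names and shapes are illustrative.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8))       # weights programmed into the accelerator

inputs_by_color = {
    "red":   rng.standard_normal(8),  # e.g., an encoded photo of a dog
    "green": rng.standard_normal(8),  # e.g., an encoded photo of a cat
    "blue":  rng.standard_normal(8),  # e.g., an encoded photo of a tree
}

# Electronically, sending all colors through the same unit at once is just a
# batched matrix multiply: one hardware pass, three results.
X = np.stack(list(inputs_by_color.values()), axis=1)  # shape (8, 3)
Y = W @ X                                             # shape (4, 3)
print(Y.shape)
```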

Passage exploits the latency and bandwidth advantages of light to link processors, much as fiber optic cables use light to send data over long distances. It also enables chips as big as entire wafers to act as a single processor. Sending information between chips is central to running the massive server farms that power cloud computing and run AI systems like ChatGPT.

Both products are designed to bring energy efficiencies to computing, which Harris says are needed to keep up with rising demand without bringing huge increases in power consumption.

“By 2040, some predict that around 80 percent of all energy usage on the planet will be devoted to data centers and computing, and AI is going to be a huge fraction of that,” Harris says. “When you look at computing deployments for training these large AI models, they’re headed toward using hundreds of megawatts. Their power usage is on the scale of cities.”

Lightmatter is currently working with chipmakers and cloud service providers for mass deployment. Harris notes that because the company’s equipment runs on silicon, it can be produced by existing semiconductor fabrication facilities without massive changes in process.

The ambitious plans are designed to open up a new path forward for computing that would have huge implications for the environment and economy.

“We’re going to continue looking at all of the pieces of computers to figure out where light can accelerate them, make them more energy efficient, and faster, and we’re going to continue to replace those parts,” Harris says. “Right now, we’re focused on interconnect with Passage and on compute with Envise. But over time, we’re going to build out the next generation of computers, and it’s all going to be centered around light.”

A careful rethinking of the Iraq War

The term “fog of war” expresses the chaos and uncertainty of the battlefield. Often, it is only in hindsight that people can grasp what was unfolding around them.

Now, additional clarity about the Iraq War has arrived in the form of a new book by MIT political scientist Roger Petersen, which dives into the war’s battlefield operations, political dynamics, and long-term impact. The U.S. launched the Iraq War in 2003 and formally wrapped it up in 2011, but Petersen analyzes the situation in Iraq up to the present day and considers what the future holds for the country.

After a decade of research, Petersen identifies four key factors for understanding Iraq’s situation. First, the U.S. invasion created chaos and a lack of clarity in terms of the hierarchy among Shia, Sunni, and Kurdish groups. Second, given these conditions, organizations that comprised a mix of militias, political groups, and religious groups came to the fore and captured elements of the new state the U.S. was attempting to set up. Third, by about 2018, the Shia groups became dominant, establishing a hierarchy, and along with that dominance, sectarian violence has fallen. Finally, the hybrid organizations established many years ago are now highly integrated into the Iraqi state.

Petersen has also come to believe two things about the Iraq War are not fully appreciated. One is how widely U.S. strategy varied over time in response to shifting circumstances.

“This was not one war,” says Petersen. “This was many different wars going on. We had at least five strategies on the U.S. side.”

And while the expressed goal of many U.S. officials was to build a functioning democracy in Iraq, the intense factionalism of Iraqi society led to further military struggles, between and among religious and ethnic groups. Thus, U.S. military strategy shifted as this multisided conflict evolved.

“What really happened in Iraq, and the thing the United States and Westerners did not understand at first, is how much this would become a struggle for dominance among Shias, Sunnis, and Kurds,” says Petersen. “The United States thought they would build a state, and the state would push down and penetrate society. But it was society that created militias and captured the state.”

Attempts to construct a well-functioning state, in Iraq or elsewhere, must confront this factor, Petersen adds. “Most people think in terms of groups. They think in terms of group hierarchies, and they’re motivated when they believe their own group is not in a proper space in the hierarchy. This is the emotion of resentment. I think this is just human nature.”

Petersen’s book, “Death, Dominance, and State-Building: The U.S. in Iraq and the Future of American Military Intervention,” is published today by Oxford University Press. Petersen is the Arthur and Ruth Sloan Professor of Political Science at MIT and a member of the Security Studies Program based at MIT’s Center for International Studies.

Research on the ground

Petersen spent years interviewing people who were on the ground in Iraq during the war, from U.S. military personnel to former insurgents to regular Iraqi citizens, while extensively analyzing data about the conflict.

“I didn’t really come to conclusions about Iraq until six or seven years of applying this method,” he says.

Ultimately, one core fact about the country heavily influenced the trajectory of the war. Iraq’s Sunni Muslims made up about 20 percent or less of the country’s population but had been politically dominant before the U.S. took military action. After the U.S. toppled former dictator Saddam Hussein, it created an opening for the Shia majority to grasp more power.

“The United States said, ‘We’re going to have democracy and think in individual terms,’ but this is not the way it played out,” Petersen says. “The way it played out was, over the years, the Shia organizations became the dominant force. The Sunnis and Kurds are now basically subordinate within this Shia-dominated state. The Shias also had advantages in organizing violence over the Sunnis, and they’re the majority. They were going to win.”

As Petersen details in the book, a central unit of power became the political militia, based on ethnic and religious identification. One Shia militia, the Badr Organization, had trained professionally for years in Iran. The local Iraqi leader Moqtada al-Sadr could recruit Shia fighters from among the 2 million people living in the Sadr City slum. And no political militia wanted to back a strong multiethnic government.

“They liked this weaker state,” Petersen says. “The United States wanted to build a new Iraqi state, but what we did was create a situation where multiple large Shia militias make deals with each other.”

A captain’s war

In turn, these dynamics meant the U.S. had to shift military strategies numerous times, occasionally in high-profile ways. The five strategies Petersen identifies are clear, hold, build (CHB); decapitation; community mobilization; homogenization; and war-fighting.

“The war from the U.S. side was highly decentralized,” Petersen says. Military captains, who typically command about 140 to 150 soldiers, had fairly wide latitude in how they chose to fight.

“It was a captain’s war in a lot of ways,” Petersen adds.

The point is emphatically driven home in one chapter, “Captain Wright goes to Baghdad,” co-authored with Col. Timothy Wright PhD ’18, who wrote his MIT political science dissertation based on his experience in company command during the surge period.

As Petersen also notes, drawing on government data, the U.S. managed to suppress violence fairly effectively at times, particularly before 2006 and after 2008. “The professional soldiers tried to do a good job, but some of the problems they weren’t going to solve,” Petersen says.

Still, all of this raises a conundrum. If trying to start a new state in Iraq was always likely to lead to an increase in Shia power, is there really much the U.S. could have done differently?

“That’s a million-dollar question,” Petersen says.

Perhaps the best way to engage with it, Petersen notes, is to recognize the importance of studying how factional groups grasp power through use of violence, and how that emerges in society. It is a key issue running throughout Petersen’s work, and one, he notes, that has often been studied by his graduate students in MIT’s Security Studies Program.

“Death, Dominance, and State-Building” has received praise from foreign-policy scholars. Paul Staniland, a political scientist at the University of Chicago, has said the work combines “intellectual creativity with careful attention to on-the-ground dynamics,” and is “a fascinating macro-level account of the politics of group competition in Iraq. This book is required reading for anyone interested in civil war, U.S. foreign policy, or the politics of violent state-building.”

Petersen, for his part, allows that he was pleased when one Marine who served in Iraq read the manuscript in advance and found it interesting.

“He said, ‘This is good, and it’s not the way we think about it,’” Petersen says. “That’s my biggest compliment, to have a practitioner say it made them think. If I can get that kind of reaction, I’ll be pleased.”

Power when the sun doesn’t shine

In 2016, at the huge Houston energy conference CERAWeek, MIT materials scientist Yet-Ming Chiang found himself talking to a Tesla executive about a thorny problem: how to store the output of solar panels and wind turbines for long durations.        

Chiang, the Kyocera Professor of Materials Science and Engineering, and Mateo Jaramillo, a vice president at Tesla, knew that utilities lacked a cost-effective way to store renewable energy to cover peak levels of demand and to bridge the gaps during windless and cloudy days. They also knew that the scarcity of raw materials used in conventional energy storage devices needed to be addressed if renewables were ever going to displace fossil fuels on the grid at scale.

Energy storage technologies can facilitate access to renewable energy sources, boost the stability and reliability of power grids, and ultimately accelerate grid decarbonization. The global market for these systems — essentially large batteries — is expected to grow tremendously in the coming years. A study by the nonprofit LDES (Long Duration Energy Storage) Council pegs the long-duration energy storage market at between 80 and 140 terawatt-hours by 2040. “That’s a really big number,” Chiang notes. “Every 10 people on the planet will need access to the equivalent of one EV [electric vehicle] battery to support their energy needs.”
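
As a rough sanity check of that comparison (using the midpoint of the study’s range, an assumed 2040 population of about nine billion, and an assumed large EV pack of roughly 100 kilowatt-hours; none of these figures are from the article):

$$\frac{110\ \text{TWh}}{9 \times 10^{9}\ \text{people}} \approx 12\ \text{kWh per person} \approx 120\ \text{kWh per 10 people},$$

which is on the order of one large EV battery pack.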

In 2017, one year after they met in Houston, Chiang and Jaramillo joined forces to co-found Form Energy in Somerville, Massachusetts, with MIT graduates Marco Ferrara SM ’06, PhD ’08 and William Woodford PhD ’13, and energy storage veteran Ted Wiley.

“There is a burgeoning market for electrical energy storage because we want to achieve decarbonization as fast and as cost-effectively as possible,” says Ferrara, Form’s senior vice president in charge of software and analytics.

Investors agreed. Over the next six years, Form Energy would raise more than $800 million in venture capital.

Bridging gaps

The simplest battery consists of an anode, a cathode, and an electrolyte. During discharge, electrons flow through an external circuit from the negative anode to the positive cathode, while ions move through the electrolyte to complete the circuit. During charge, an external voltage reverses the process: the anode becomes the positive terminal, the cathode becomes the negative terminal, and electrons move back to where they started. The materials used for the anode, cathode, and electrolyte determine the battery’s weight, power, and cost “entitlement,” which is the total cost at the component level.

During the 1980s and 1990s, the use of lithium revolutionized batteries, making them smaller, lighter, and able to hold a charge for longer. The storage devices Form Energy has devised are rechargeable batteries based on iron, which has several advantages over lithium. A big one is cost.

Chiang once declared to the MIT Club of Northern California, “I love lithium-ion.” Two of the four MIT spinoffs Chiang founded center on innovative lithium-ion batteries. But at hundreds of dollars a kilowatt-hour (kWh) and with a storage capacity typically measured in hours, lithium-ion was ill-suited for the use he now had in mind.

The approach Chiang envisioned had to be cost-effective enough to boost the attractiveness of renewables. Making solar and wind energy reliable enough for millions of customers meant storing it long enough to fill the gaps created by extreme weather, grid outages, lulls in the wind, and stretches of cloudy days.

To be competitive with legacy power plants, Chiang’s method had to come in at around $20 per kilowatt-hour of stored energy — one-tenth the cost of lithium-ion battery storage.

But how to transition from expensive batteries that store and discharge over a couple of hours to some as-yet-undefined, cheap, longer-duration technology?

“One big ball of iron”

That’s where Ferrara comes in. Ferrara has a PhD in nuclear engineering from MIT and a PhD in electrical engineering and computer science from the University of L’Aquila in his native Italy. In 2017, as a research affiliate at the MIT Department of Materials Science and Engineering, he worked with Chiang to model the grid’s need to manage renewables’ intermittency.

How intermittent depends on where you are. In the United States, for instance, there’s the windy Great Plains; the sun-drenched, relatively low-wind deserts of Arizona, New Mexico, and Nevada; and the often-cloudy Pacific Northwest.

Ferrara, in collaboration with Professor Jessika Trancik of MIT’s Institute for Data, Systems, and Society and her MIT team, modeled four representative locations in the United States and concluded that energy storage with capacity costs below roughly $20/kWh and discharge durations of multiple days would allow a wind-solar mix to provide cost-competitive, firm electricity in resource-abundant locations.

Now that they had a time frame, they turned their attention to materials. At the price point Form Energy was aiming for, lithium was out of the question. Chiang looked at plentiful and cheap sulfur. But a sulfur, sodium, water, and air battery had technical challenges.

Thomas Edison once used iron as an electrode, and iron-air batteries were first studied in the 1960s. They were too heavy to make good transportation batteries. But this time, Chiang and team were looking at a battery that sat on the ground, so weight didn’t matter. Their priorities were cost and availability.

“Iron is produced, mined, and processed on every continent,” Chiang says. “The Earth is one big ball of iron. We wouldn’t ever have to worry about even the most ambitious projections of how much storage the world might use by mid-century.” If Form ever moves into the residential market, “it’ll be the safest battery you’ve ever parked at your house,” Chiang laughs. “Just iron, air, and water.”

Scientists call it reversible rusting. While discharging, the battery takes in oxygen and converts iron to rust. Applying an electrical current converts the rusty pellets back to iron, and the battery “breathes out” oxygen as it charges. “In chemical terms, you have iron, and it becomes iron hydroxide,” Chiang says. “That means electrons were extracted. You get those electrons to go through the external circuit, and now you have a battery.”
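
In textbook terms, the reversible rusting Chiang describes corresponds to the standard alkaline iron-air half-reactions below; these come from the general iron-air literature, and Form’s exact cell chemistry may differ:

$$
\begin{aligned}
\text{anode (discharge):}\quad & \mathrm{Fe} + 2\,\mathrm{OH^-} \rightarrow \mathrm{Fe(OH)_2} + 2e^- \\
\text{cathode (discharge):}\quad & \mathrm{O_2} + 2\,\mathrm{H_2O} + 4e^- \rightarrow 4\,\mathrm{OH^-} \\
\text{overall:}\quad & 2\,\mathrm{Fe} + \mathrm{O_2} + 2\,\mathrm{H_2O} \rightarrow 2\,\mathrm{Fe(OH)_2}
\end{aligned}
$$

Charging drives the overall reaction in reverse, which is when the battery “breathes out” oxygen.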

Form Energy’s battery modules are approximately the size of a washer-and-dryer unit. They are stacked in 40-foot containers, and several containers are electrically connected with power conversion systems to build storage plants that can cover several acres.

The right place at the right time

The modules don’t look or act like anything utilities have contracted for before.

That’s one of Form’s key challenges. “There is not widespread knowledge of needing these new tools for decarbonized grids,” Ferrara says. “That’s not the way utilities have typically planned. They’re looking at all the tools in the toolkit that exist today, which may not contemplate a multi-day energy storage asset.”

Form Energy’s customers are largely traditional power companies seeking to expand their portfolios of renewable electricity. Some are in the process of decommissioning coal plants and shifting to renewables.

Ferrara’s research pinpointing the need for very low-cost multi-day storage provides key data for power suppliers seeking to determine the most cost-effective way to integrate more renewable energy.

Using the same modeling techniques, Ferrara and team show potential customers how the technology fits in with their existing system, how it competes with other technologies, and how, in some cases, it can operate synergistically with other storage technologies.

“They may need a portfolio of storage technologies to fully balance renewables on different timescales of intermittency,” he says. But other than the technology developed at Form, “there isn’t much out there, certainly not within the cost entitlement of what we’re bringing to market.”  Thanks to Chiang and Jaramillo’s chance encounter in Houston, Form has a several-year lead on other companies working to address this challenge. 

In June 2023, Form Energy closed its biggest deal to date for a single project: Georgia Power’s order for a 15-megawatt/1,500-megawatt-hour system. That order brings Form’s total amount of energy storage under contracts with utility customers to 40 megawatts/4 gigawatt-hours. To meet the demand, Form is building a new commercial-scale battery manufacturing facility in West Virginia.

The fact that Form Energy is creating jobs in an area that lost more than 10,000 steel jobs over the past decade is not lost on Chiang. “And these new jobs are in clean tech. It’s super exciting to me personally to be doing something that benefits communities outside of our traditional technology centers.

“This is the right time for so many reasons,” Chiang says. He says he and his Form Energy co-founders feel “tremendous urgency to get these batteries out into the world.”

This article appears in the Winter 2024 issue of Energy Futures, the magazine of the MIT Energy Initiative.

Eight from MIT named 2024 Sloan Research Fellows

Eight members of the MIT faculty are among 126 early-career researchers honored with 2024 Sloan Research Fellowships by the Alfred P. Sloan Foundation. Representing the departments of Chemistry, Electrical Engineering and Computer Science, and Physics, and the MIT Sloan School of Management, the awardees will receive a two-year, $75,000 fellowship to advance their research.

“Sloan Research Fellowships are extraordinarily competitive awards involving the nominations of the most inventive and impactful early-career scientists across the U.S. and Canada,” says Adam F. Falk, president of the Alfred P. Sloan Foundation. “We look forward to seeing how fellows take leading roles shaping the research agenda within their respective fields.”

Jacob Andreas is an associate professor in the Department of Electrical Engineering and Computer Science (EECS) as well as the Computer Science and Artificial Intelligence Laboratory (CSAIL). His research aims to build intelligent systems that can communicate effectively using language and learn from human guidance. Andreas has been named a Kavli Fellow by the National Academy of Sciences, and has received the NSF CAREER award, MIT’s Junior Bose and Kolokotrones teaching awards, and paper awards at ACL, ICML, and NAACL.

Adam Belay, Jamieson Career Development Associate Professor of EECS in CSAIL, focuses on operating systems and networking, specifically developing practical and efficient methods for microsecond-scale distributed computing, which has many applications pertaining to resource management in data centers. His operating system, Caladan, reallocates server resources on a microsecond scale, resulting in high CPU utilization with low tail latency. Belay has also contributed to load balancing and application-integrated far memory in operating system designs.

Soonwon Choi, assistant professor of physics, is a researcher in the Center for Theoretical Physics, a division of the Laboratory for Nuclear Science. His research is focused on the intersection of quantum information and out-of-equilibrium dynamics of quantum many-body systems, specifically exploring the dynamical phenomena that occur in strongly interacting quantum many-body systems far from equilibrium and designing their novel applications for quantum information science. Recent contributions from Choi, recipient of the Inchon Award, include the development of simple methods to benchmark the quality of analog quantum simulators. His work allows for efficiently and easily characterizing quantum simulators, accelerating the goal of utilizing them in studying exotic phenomena in quantum materials that are difficult to synthesize in a laboratory.

Maryam Farboodi, the Jon D. Gruber Career Development Assistant Professor of Finance in the MIT Sloan School of Management, studies the economics of big data. She explores how big data technologies have changed trading strategies and financial outcomes, as well as the consequences of the emergence of big data for technological growth in the real economy. She also works on developing methodologies to estimate the value of data. Furthermore, Farboodi studies intermediation and network formation among financial institutions, and the spillovers to the real economy. She is also interested in how information frictions shape the local and global economic cycles.

Lina Necib PhD ’17, an assistant professor of physics and a member of the MIT Kavli Institute for Astrophysics and Space Research, explores the origin of dark matter through a combination of simulations and observational data that correlate the dynamics of dark matter with that of the stars in the Milky Way. She has investigated the local dynamic structures in the solar neighborhood using the Gaia satellite, contributed to building a catalog of local accreted stars using machine learning techniques, and discovered a new stream called Nyx. Necib is interested in employing Gaia in conjunction with other spectroscopic surveys to understand the dark matter profile in the local solar neighborhood, the center of the galaxy, and in dwarf galaxies.

Arvind Satyanarayan is an assistant professor of computer science and leader of the CSAIL Visualization Group. Satyanarayan uses interactive data visualization as a petri dish to study intelligence augmentation, asking how computational representations and software systems help amplify our cognition and creativity while respecting our agency. His work has been recognized with an NSF CAREER award, best paper awards at academic venues such as ACM CHI and IEEE VIS, and honorable mentions among practitioners including Kantar’s Information is Beautiful Awards. Systems he helped develop are widely used in industry, on Wikipedia, and in the Jupyter/Python data science communities.

Andrew Vanderburg, an assistant professor of physics and a member of the Kavli Institute, explores the use of machine learning, especially deep neural networks, in the detection of exoplanets, or planets that orbit stars other than the sun. He is interested in developing cutting-edge techniques and methods to discover new planets outside of our solar system, and in studying the planets we find to learn their detailed properties. Vanderburg conducts astronomical observations using facilities on Earth like the Magellan Telescopes in Chile as well as space-based observatories like the Transiting Exoplanet Survey Satellite and the James Webb Space Telescope. Once the data from these telescopes are in hand, he develops new analysis methods that help extract as much scientific value as possible.

Xiao Wang is a core institute member of the Broad Institute of MIT and Harvard, and the Thomas D. and Virginia Cabot Assistant Professor of Chemistry. She started her lab in 2019 to develop and apply new chemical, biophysical, and genomic tools to better probe and understand tissue function and dysfunction at the molecular level. Specifically, with in situ sequencing of nucleic acids as the core approach, Wang aims to develop high-resolution and highly multiplexed molecular imaging methods across multiple scales toward understanding the physical and chemical basis of brain wiring and function. She is the recipient of a Packard Fellowship, NIH Director’s New Innovator Award, and is a Searle Scholar.

Brain surgery training from an avatar

Benjamin Warf, a renowned neurosurgeon at Boston Children’s Hospital, stands in the MIT.nano Immersion Lab. More than 3,000 miles away, his virtual avatar stands next to Matheus Vasconcelos in Brazil as the resident practices delicate surgery on a doll-like model of a baby’s brain.

With a pair of virtual-reality goggles, Vasconcelos can watch Warf’s avatar demonstrate a brain surgery procedure before replicating the technique himself, asking questions of Warf’s digital twin as he goes.

“It’s an almost out-of-body experience,” Warf says of watching his avatar interact with the residents. “Maybe it’s how it feels to have an identical twin?”

And that’s the goal: Warf’s digital twin bridged the distance, allowing him to be functionally in two places at once. “It was my first training using this model, and it had excellent performance,” says Vasconcelos, a neurosurgery resident at Santa Casa de São Paulo School of Medical Sciences in São Paulo, Brazil. “As a resident, I now feel more confident and comfortable applying the technique in a real patient under the guidance of a professor.”

Warf’s avatar arrived via a new project launched by medical simulator and augmented reality (AR) company EDUCSIM. The company is part of the 2023 cohort of START.nano, MIT.nano’s deep-tech accelerator that offers early-stage startups discounted access to MIT.nano’s laboratories.

In March 2023, Giselle Coelho, EDUCSIM’s scientific director and a pediatric neurosurgeon at Santa Casa de São Paulo and Sabará Children’s Hospital, began working with technical staff in the MIT.nano Immersion Lab to create Warf’s avatar. By November, the avatar was training future surgeons like Vasconcelos.

“I had this idea to create the avatar of Dr. Warf as a proof of concept, and asked, ‘What would be the place in the world where they are working on technologies like that?’” Coelho says. “Then I found MIT.nano.”

Capturing a surgeon

As a neurosurgery resident, Coelho was so frustrated by the lack of practical training options for complex surgeries that she built her own model of a baby brain. The physical model contains all the structures of the brain and can even bleed, “simulating all the steps of a surgery, from incision to skin closure,” she says.

She soon found that simulators and virtual reality (VR) demonstrations reduced the learning curve for her own residents. Coelho launched EDUCSIM in 2017 to expand the variety and reach of the training for residents and experts looking to learn new techniques.

Those techniques include a procedure to treat infant hydrocephalus that was pioneered by Warf, the director of neonatal and congenital neurosurgery at Boston Children’s Hospital. Coelho had learned the technique directly from Warf and thought his avatar might be the way for surgeons who couldn’t travel to Boston to benefit from his expertise.

To create the avatar, Coelho worked with Talis Reks, the AR/VR/gaming/big data IT technologist in the Immersion Lab.

“A lot of technology and hardware can be very expensive for startups to access as they start their company journey,” Reks explains. “START.nano is one way of enabling them to utilize and afford the tools and technologies we have at MIT.nano’s Immersion Lab.”

Coelho and her colleagues needed high-fidelity and high-resolution motion-capture technology, volumetric video capture, and a range of other VR/AR technologies to capture Warf’s dexterous finger motions and facial expressions. Warf visited MIT.nano on several occasions to be digitally “captured,” including performing an operation on the physical baby model while wearing special gloves and clothing embedded with sensors.

“These technologies have mostly been used for entertainment or VFX [visual effects] or CGI [computer-generated imagery],” says Reks. “But this is a unique project, because we’re applying it now for real medical practice and real learning.”

One of the biggest challenges, Reks says, was helping to develop what Coelho calls “holoportation” — transmitting the 3D, volumetric video capture of Warf in real time over the internet so that his avatar can appear in transcontinental medical training.

The Warf avatar has synchronous and asynchronous modes. The training that Vasconcelos received was in the asynchronous mode, where residents can observe the avatar’s demonstrations and ask it questions. The answers, delivered in a variety of languages, come from AI algorithms that draw from previous research and an extensive bank of questions and answers provided by Warf.

In the synchronous mode, Warf operates his avatar from a distance in real time, Coelho says. “He could walk around the room, he could talk to me, he could orient me. It’s amazing.”

Coelho, Warf, Reks, and other team members demonstrated a combination of the modes in a second session in late December. This demo consisted of volumetric live video capture between the Immersion Lab and Brazil, spatialized and visible in real-time through AR headsets. It significantly expanded upon the previous demo, which had only streamed volumetric data in one direction through a two-dimensional display.

Powerful impacts

Warf has a long history of training desperately needed pediatric neurosurgeons around the world, most recently through his nonprofit Neurokids. Remote and simulated training has been an increasingly large part of training since the pandemic, he says, although he doesn’t feel it will ever completely replace personal hands-on instruction and collaboration.

“But if in fact one day we could have avatars, like this one from Giselle, in remote places showing people how to do things and answering questions for them, without the cost of travel, without the time cost and so forth, I think it could be really powerful,” Warf says.

The avatar project is especially important for surgeons serving remote and underserved areas like the Amazon region of Brazil, Coelho says. “This is a way to give them the same level of education that they would get in other places, and the same opportunity to be in touch with Dr. Warf.”

One baby treated for hydrocephalus at a recent Amazon clinic had traveled by boat 30 hours for the surgery, according to Coelho.

Training surgeons with the avatar, she says, “can change reality for this baby and can change the future.”

I’m Learning How To Drive And Pacific Drive Is Helping

Pacific Drive is a game about your relationship with your car. As you navigate a reality-bending doomsday loop, your vehicle is the only thing keeping you from impending death – if you’re good enough at driving it. I’ve had fun in this world speeding through the forest and grabbing glowing yellow orbs, but the car is one of the most interesting parts to me for one reason: I don’t have a driver’s license yet.

After years of procrastination and two expired temporary permits, I’ve finally started learning to drive in earnest. I spent years of adulthood in the dark, and it’s been fascinating to open my eyes to something that is an everyday experience for millions of people. And as much as learning to drive has shown me a new side of the real world, it’s also making me see video game driving in a different light, and Pacific Drive is the first instance of that.

For starters, you drive in first person, which is something I’ve never tried in a game before. I typically swap to a third-person perspective as quickly as possible because it gives me a better sense of what’s going on around me. The thing is, when I do that, I’m basically playing as the car, not the person driving the car. Mirrors are irrelevant when you have a camera floating around to get perspective, and all you need to do to start the engine is press the gas button.

In Pacific Drive, you play from the driver’s seat, and you have to familiarize yourself with the space. Starting the car is more than the tap of a button – you have to aim at the ignition and insert the key to start the engine, then aim at the gear shift to move the car from park into drive. You’re free to exit the car at will as well, but if you leave the car on, you’ll waste gas, and if you don’t put it in park, it will roll down a hill. While driving, you have to physically turn your character’s head to look in the mirrors to see what’s behind you – I just wish it had the backup camera from my partner’s Honda Accord.

I also have a more basic appreciation for the anatomy of a car. Granted, Pacific Drive does simplify things (I don’t need a license to know a hatchback trunk door and a driver’s side door don’t have the same crafting recipe in real life [or a crafting recipe at all, for that matter]), but by forcing me to repair and upgrade the car by looking at its individual parts, I have a deeper understanding than I otherwise would. 

For example, an early goal has the player craft a handbrake. In order to actually put it in the car, you need to open the hood, at which point you can see a translucent image of where the handbrake will actually go once it’s in the car. Seeing it highlighted at the base of the vehicle with lines connecting to each wheel was a surprise to me, partially because I had never considered how a handbrake actually works and partially because I was impressed the developers took the steps to actually illustrate the entire mechanism. It would have been much easier to just have the handbrake exist as an upgrade on a menu, but seeing where it’s supposed to go and putting it there myself gave me ownership over the whole thing.

I’m not suggesting Pacific Drive is some kind of groundbreaking car simulator, but it is more of a car simulator than I thought it would be, and now that I’m learning to drive, I have a very real appreciation for the way the developers spent time to add some realism to the car and its upgrades. Going the (pun intended) extra mile in areas like this really makes Pacific Drive stand out, and it’s made me far more invested in the game – and my car – than I would have been otherwise. 

My partner walked past me as I played the other day, crafting upgrades in the garage. I pointed to the rusted sedan on the screen and proudly said, “You see this? This is my car. I installed the handbrake myself.”

Balatro Review And Why Unicorn Overlord’s Battle System Rules | GI Show (Feat. Eric Van Allen)

In this week’s episode of The Game Informer Show, returning guest Eric Van Allen (Destructoid, Axe of the Blood God) joins us to discuss our Balatro review, spoiler-free thoughts about the length of Final Fantasy VII Rebirth, the early hours of Penny’s Big Breakaway, and finally, how the combat system in Vanillaware’s Unicorn Overlord mimics Final Fantasy XII’s Gambits (that’s a very good thing).

Lastly, we want to highlight our former colleague and close friend, Blake Hester, who was affected by a corporate restructuring at our parent company this week. Blake’s original reporting and his work as our features editor have transformed Game Informer over the last three-and-a-half years into an outlet known for thoughtful reporting on games and the people making them. Please go support Blake and follow his work right here.

Follow us on social media: Blake Hester (@MetallicaIsRad), Alex Van Aken (@itsVanAken), Kyle Hilliard (@KyleMHilliard), Marcus Stewart (@MarcusStewart7), Eric Van Allen (@SeaMoosi)

The Game Informer Show is a weekly gaming podcast covering the latest video game news, industry topics, exclusive reveals, and reviews. Join host Alex Van Aken every Thursday to chat about your favorite games – past and present – with Game Informer staff, developers, and special guests from around the industry. Listen on Apple Podcasts, Spotify, or your favorite podcast app.

The Game Informer Show – Podcast Timestamps:

00:00:00 – Intro

00:02:04 – Blake Hester Was Laid Off From Game Informer

00:11:23 – Balatro Review

00:27:50 – Unicorn Overlord Demo

00:44:07 – Penny’s Big Breakaway

01:03:40 – Final Fantasy VII Rebirth

01:18:32 – Housekeeping