Dealing with the limitations of our noisy world

Tamara Broderick first set foot on MIT’s campus when she was a high school student, as a participant in the inaugural Women’s Technology Program. The monthlong summer academic experience gives young women a hands-on introduction to engineering and computer science.

What is the probability that she would return to MIT years later, this time as a faculty member?

That’s a question Broderick could probably answer quantitatively using Bayesian inference, a statistical approach to probability that tries to quantify uncertainty by continuously updating one’s assumptions as new data are obtained.
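The update rule at the heart of that approach can be sketched in a few lines. Below is a minimal, illustrative Beta-Bernoulli example in Python (the data are made up): each batch of observations turns the current prior into a posterior, which then serves as the prior for the next batch.

```python
from fractions import Fraction

def update_beta(alpha, beta, successes, failures):
    """Conjugate update: a Beta(alpha, beta) prior observing Bernoulli
    data yields a Beta(alpha + successes, beta + failures) posterior."""
    return alpha + successes, beta + failures

# Start with a uniform prior, Beta(1, 1), over an unknown success rate.
alpha, beta = 1, 1

# Observe data in batches; the posterior after each batch becomes the
# prior for the next -- the "continuous updating" of Bayesian inference.
for successes, failures in [(3, 1), (2, 2), (0, 4)]:
    alpha, beta = update_beta(alpha, beta, successes, failures)

posterior_mean = Fraction(alpha, alpha + beta)
print(alpha, beta, posterior_mean)  # Beta(6, 8), posterior mean 3/7
```

The posterior mean drifts toward whatever the accumulating data support, while the full Beta distribution keeps track of how uncertain that estimate still is.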

In her lab at MIT, the newly tenured associate professor in the Department of Electrical Engineering and Computer Science (EECS) uses Bayesian inference to quantify uncertainty and measure the robustness of data analysis techniques.

“I’ve always been really interested in understanding not just ‘What do we know from data analysis,’ but ‘How well do we know it?’” says Broderick, who is also a member of the Laboratory for Information and Decision Systems and the Institute for Data, Systems, and Society. “The reality is that we live in a noisy world, and we can’t always get exactly the data that we want. How do we learn from data but at the same time recognize that there are limitations and deal appropriately with them?”

Broadly, her focus is on helping people understand the confines of the statistical tools available to them and, sometimes, working with them to craft better tools for a particular situation.

For instance, her group recently collaborated with oceanographers to develop a machine-learning model that can make more accurate predictions about ocean currents. In another project, she and others worked with degenerative disease specialists on a tool that helps severely motor-impaired individuals utilize a computer’s graphical user interface by manipulating a single switch.

A common thread woven through her work is an emphasis on collaboration.

“Working in data analysis, you get to hang out in everybody’s backyard, so to speak. You really can’t get bored because you can always be learning about some other field and thinking about how we can apply machine learning there,” she says.

Hanging out in many academic “backyards” is especially appealing to Broderick, who struggled even from a young age to narrow down her interests.

A math mindset

Growing up in a suburb of Cleveland, Ohio, Broderick had an interest in math for as long as she can remember. She recalls being fascinated by the idea of what would happen if you kept adding a number to itself, starting with 1+1=2 and then 2+2=4.

“I was maybe 5 years old, so I didn’t know what ‘powers of two’ were or anything like that. I was just really into math,” she says.

Her father recognized her interest in the subject and enrolled her in a Johns Hopkins program called the Center for Talented Youth, which gave Broderick the opportunity to take three-week summer classes on a range of subjects, from astronomy to number theory to computer science.

Later, in high school, she conducted astrophysics research with a postdoc at Case Western Reserve University. In the summer of 2002, she spent four weeks at MIT as a member of the first class of the Women’s Technology Program.

She especially enjoyed the freedom offered by the program, and its focus on using intuition and ingenuity to achieve high-level goals. For instance, the cohort was tasked with building a device with LEGOs that they could use to biopsy a grape suspended in Jell-O.

The program showed her how much creativity is involved in engineering and computer science, and piqued her interest in pursuing an academic career.

“But when I got into college at Princeton, I could not decide — math, physics, computer science — they all seemed super-cool. I wanted to do all of it,” she says.

She settled on pursuing an undergraduate math degree but took all the physics and computer science courses she could cram into her schedule.

Digging into data analysis

After receiving a Marshall Scholarship, Broderick spent two years at Cambridge University in the United Kingdom, earning a master of advanced study in mathematics and a master of philosophy in physics.

In the UK, she took a number of statistics and data analysis classes, including her first class on Bayesian data analysis in the field of machine learning.

It was a transformative experience, she recalls.

“During my time in the U.K., I realized that I really like solving real-world problems that matter to people, and Bayesian inference was being used in some of the most important problems out there,” she says.

Back in the U.S., Broderick headed to the University of California at Berkeley, where she joined the lab of Professor Michael I. Jordan as a grad student. She earned a PhD in statistics with a focus on Bayesian data analysis. 

She decided to pursue a career in academia and was drawn to MIT by the collaborative nature of the EECS department and by how passionate and friendly her would-be colleagues were.

Her first impressions panned out, and Broderick says she has found a community at MIT that helps her be creative and explore hard, impactful problems with wide-ranging applications.

“I’ve been lucky to work with a really amazing set of students and postdocs in my lab — brilliant and hard-working people whose hearts are in the right place,” she says.

One of her team’s recent projects involves a collaboration with an economist who studies the use of microcredit, or the lending of small amounts of money at very low interest rates, in impoverished areas.

The goal of microcredit programs is to raise people out of poverty. Economists run randomized controlled trials in which villages in a region either receive or do not receive microcredit. They want to generalize the study results, predicting the expected outcome if one applies microcredit to other villages outside of their study.

But Broderick and her collaborators have found that results of some microcredit studies can be very brittle. Removing one or a few data points from the dataset can completely change the results. One issue is that researchers often use empirical averages, where a few very high or low data points can skew the results.

Using machine learning, she and her collaborators developed a method that can determine how many data points must be dropped to change the substantive conclusion of the study. With their tool, a scientist can see how brittle the results are.

“Sometimes dropping a very small fraction of data can change the major results of a data analysis, and then we might worry how far those conclusions generalize to new scenarios. Are there ways we can flag that for people? That is what we are getting at with this work,” she explains.
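For the special case of an empirical average, the intuition behind such a diagnostic can be sketched directly. The toy function below is not the team's actual method (which handles general estimators); it greedily removes the most extreme values on the dominant side, which for a simple mean is the fastest way to move it, and counts how many removals flip the sign of the estimated effect.

```python
def min_drops_to_flip_sign(data):
    """Count how many points must be dropped before the empirical mean
    changes sign. For a plain average, removing the largest values on
    the dominant side first shifts the mean fastest, so this greedy
    order gives the minimal count."""
    total = sum(data)
    originally_positive = total > 0
    # Sort so the values pulling the mean toward its current sign
    # are dropped first.
    ordered = sorted(data, reverse=originally_positive)
    for dropped, x in enumerate(ordered, start=1):
        total -= x
        if (total > 0) != originally_positive:
            return dropped
    return len(data)

# Toy "treatment effects": two large outliers drive a positive mean.
effects = [2.0, 1.5, 0.5, -0.3, -0.2]
print(min_drops_to_flip_sign(effects))  # 2 -- dropping 2 of 5 points flips the sign
```

A result that survives only until two of five observations are removed would count as brittle in exactly the sense described above.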

At the same time, she is continuing to collaborate with researchers in a range of fields, such as genetics, to understand the pros and cons of different machine-learning techniques and other data analysis tools.

Happy trails

Exploration is what drives Broderick as a researcher, and it also fuels one of her passions outside the lab. She and her husband enjoy collecting patches they earn by hiking all the trails in a park or trail system.

“I think my hobby really combines my interests of being outdoors and spreadsheets,” she says. “With these hiking patches, you have to explore everything and then you see areas you wouldn’t normally see. It is adventurous, in that way.”

They’ve discovered some amazing hikes they would never have known about, but also embarked on more than a few “total disaster hikes,” she says. But each hike, whether a hidden gem or an overgrown mess, offers its own rewards.

And just like in her research, curiosity, open-mindedness, and a passion for problem-solving have never led her astray.

Startup accelerates progress toward light-speed computing

Our ability to cram ever-smaller transistors onto a chip has enabled today’s age of ubiquitous computing. But that approach is finally running into limits, with some experts declaring an end to Moore’s Law and a related principle, known as Dennard scaling.

Those developments couldn’t be coming at a worse time. Demand for computing power has skyrocketed in recent years thanks in large part to the rise of artificial intelligence, and it shows no signs of slowing down.

Now Lightmatter, a company founded by three MIT alumni, is continuing the remarkable progress of computing by rethinking the lifeblood of the chip. Instead of relying solely on electricity, the company also uses light for data processing and transport. The company’s first two products, a chip specializing in artificial intelligence operations and an interconnect that facilitates data transfer between chips, use both photons and electrons to drive more efficient operations.

“The two problems we are solving are ‘How do chips talk?’ and ‘How do you do these [AI] calculations?’” Lightmatter co-founder and CEO Nicholas Harris PhD ’17 says. “With our first two products, Envise and Passage, we’re addressing both of those questions.”

In a nod to the size of the problem and the demand for AI, Lightmatter raised just north of $300 million in 2023 at a valuation of $1.2 billion. Now the company is demonstrating its technology with some of the largest technology companies in the world in hopes of reducing the massive energy demand of data centers and AI models.

“We’re going to enable platforms on top of our interconnect technology that are made up of hundreds of thousands of next-generation compute units,” Harris says. “That simply wouldn’t be possible without the technology that we’re building.”

From idea to $100K

Prior to MIT, Harris worked at the semiconductor company Micron Technology, where he studied the fundamental devices behind integrated chips. The experience made him see how the traditional approach for improving computer performance — cramming more transistors onto each chip — was hitting its limits.

“I saw how the roadmap for computing was slowing, and I wanted to figure out how I could continue it,” Harris says. “What approaches can augment computers? Quantum computing and photonics were two of those pathways.”

Harris came to MIT to work on photonic quantum computing for his PhD under Dirk Englund, an associate professor in the Department of Electrical Engineering and Computer Science. As part of that work, he built silicon-based integrated photonic chips that could send and process information using light instead of electricity.

The work led to dozens of patents and more than 80 research papers in prestigious journals like Nature. But another technology also caught Harris’s attention at MIT.

“I remember walking down the hall and seeing students just piling out of these auditorium-sized classrooms, watching relayed live videos of lectures to see professors teach deep learning,” Harris recalls, referring to the artificial intelligence technique. “Everybody on campus knew that deep learning was going to be a huge deal, so I started learning more about it, and we realized that the systems I was building for photonic quantum computing could actually be leveraged to do deep learning.”

Harris had planned to become a professor after his PhD, but he realized he could attract more funding and innovate more quickly through a startup, so he teamed up with Darius Bunandar PhD ’18, who was also studying in Englund’s lab, and Thomas Graham MBA ’18. The co-founders successfully launched into the startup world by winning the 2017 MIT $100K Entrepreneurship Competition.

Seeing the light

Lightmatter’s Envise chip takes the part of computing that electrons do well, like memory, and combines it with what light does well, like performing the massive matrix multiplications of deep-learning models.

“With photonics, you can perform multiple calculations at the same time because the data is coming in on different colors of light,” Harris explains. “In one color, you could have a photo of a dog. In another color, you could have a photo of a cat. In another color, maybe a tree, and you could have all three of those operations going through the same optical computing unit, this matrix accelerator, at the same time. That drives up operations per area, and it reuses the hardware that’s there, driving up energy efficiency.”
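As a loose software analogy for that wavelength multiplexing (not Lightmatter's actual hardware or API), the sketch below applies one fixed weight matrix to several inputs in a single batched operation: each input row stands in for data arriving on a different color of light, and the single matrix multiply stands in for the shared optical accelerator they all pass through at once.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))       # the one shared "optical" matrix unit
inputs = rng.standard_normal((3, 3))  # three signals, one per "color" of light

# One pass through the shared unit computes all three products at once,
# just as the accelerator processes every wavelength simultaneously.
outputs = inputs @ W.T                # shape (3, 4): one result per color

# Each row matches the product that color would get on its own.
assert np.allclose(outputs[1], W @ inputs[1])
```

The point of the analogy is reuse: one piece of hardware (here, one matrix) serves many data streams simultaneously, which is where the claimed gains in operations per area and energy efficiency come from.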

Passage leverages light’s latency and bandwidth advantages to link processors, much as fiber optic cables use light to send data over long distances. It also enables chips as big as entire wafers to act as a single processor. Sending information between chips is central to running the massive server farms that power cloud computing and run AI systems like ChatGPT.

Both products are designed to bring energy efficiencies to computing, which Harris says are needed to keep up with rising demand without bringing huge increases in power consumption.

“By 2040, some predict that around 80 percent of all energy usage on the planet will be devoted to data centers and computing, and AI is going to be a huge fraction of that,” Harris says. “When you look at computing deployments for training these large AI models, they’re headed toward using hundreds of megawatts. Their power usage is on the scale of cities.”

Lightmatter is currently working with chipmakers and cloud service providers for mass deployment. Harris notes that because the company’s equipment runs on silicon, it can be produced by existing semiconductor fabrication facilities without massive changes in process.

The ambitious plans are designed to open up a new path forward for computing that would have huge implications for the environment and economy.

“We’re going to continue looking at all of the pieces of computers to figure out where light can accelerate them, make them more energy efficient, and faster, and we’re going to continue to replace those parts,” Harris says. “Right now, we’re focused on interconnect with Passage and on compute with Envise. But over time, we’re going to build out the next generation of computers, and it’s all going to be centered around light.”

A careful rethinking of the Iraq War

The term “fog of war” expresses the chaos and uncertainty of the battlefield. Often, it is only in hindsight that people can grasp what was unfolding around them.

Now, additional clarity about the Iraq War has arrived in the form of a new book by MIT political scientist Roger Petersen, which dives into the war’s battlefield operations, political dynamics, and long-term impact. The U.S. launched the Iraq War in 2003 and formally wrapped it up in 2011, but Petersen analyzes the situation in Iraq through the current day and considers what the future holds for the country.

After a decade of research, Petersen identifies four key factors for understanding Iraq’s situation. First, the U.S. invasion created chaos and a lack of clarity in terms of the hierarchy among Shia, Sunni, and Kurdish groups. Second, given these conditions, organizations that comprised a mix of militias, political groups, and religious groups came to the fore and captured elements of the new state the U.S. was attempting to set up. Third, by about 2018, the Shia groups became dominant, establishing a hierarchy, and along with that dominance, sectarian violence has fallen. Finally, the hybrid organizations established many years ago are now highly integrated into the Iraqi state.

Petersen has also come to believe two things about the Iraq War are not fully appreciated. One is how widely U.S. strategy varied over time in response to shifting circumstances.

“This was not one war,” says Petersen. “This was many different wars going on. We had at least five strategies on the U.S. side.”

And while the expressed goal of many U.S. officials was to build a functioning democracy in Iraq, the intense factionalism of Iraqi society led to further military struggles, between and among religious and ethnic groups. Thus, U.S. military strategy shifted as this multisided conflict evolved.

“What really happened in Iraq, and the thing the United States and Westerners did not understand at first, is how much this would become a struggle for dominance among Shias, Sunnis, and Kurds,” says Petersen. “The United States thought they would build a state, and the state would push down and penetrate society. But it was society that created militias and captured the state.”

Attempts to construct a well-functioning state, in Iraq or elsewhere, must confront this factor, Petersen adds. “Most people think in terms of groups. They think in terms of group hierarchies, and they’re motivated when they believe their own group is not in a proper space in the hierarchy. This is the emotion of resentment. I think this is just human nature.”

Petersen’s book, “Death, Dominance, and State-Building: The U.S. in Iraq and the Future of American Military Intervention,” is published today by Oxford University Press. Petersen is the Arthur and Ruth Sloan Professor of Political Science at MIT and a member of the Security Studies Program based at MIT’s Center for International Studies.

Research on the ground

Petersen spent years interviewing people who were on the ground in Iraq during the war, from U.S. military personnel to former insurgents to regular Iraqi citizens, while extensively analyzing data about the conflict.

“I didn’t really come to conclusions about Iraq until six or seven years of applying this method,” he says.

Ultimately, one core fact about the country heavily influenced the trajectory of the war. Iraq’s Sunni Muslims made up about 20 percent or less of the country’s population but had been politically dominant before the U.S. took military action. After the U.S. toppled former dictator Saddam Hussein, it created an opening for the Shia majority to grasp more power.

“The United States said, ‘We’re going to have democracy and think in individual terms,’ but this is not the way it played out,” Petersen says. “The way it played out was, over the years, the Shia organizations became the dominant force. The Sunnis and Kurds are now basically subordinate within this Shia-dominated state. The Shias also had advantages in organizing violence over the Sunnis, and they’re the majority. They were going to win.”

As Petersen details in the book, a central unit of power became the political militia, based on ethnic and religious identification. One Shia militia, the Badr Organization, had trained professionally for years in Iran. The local Iraqi leader Moqtada al-Sadr could recruit Shia fighters from among the 2 million people living in the Sadr City slum. And no political militia wanted to back a strong multiethnic government.

“They liked this weaker state,” Petersen says. “The United States wanted to build a new Iraqi state, but what we did was create a situation where multiple large Shia militias make deals with each other.”

A captain’s war

In turn, these dynamics meant the U.S. had to shift military strategies numerous times, occasionally in high-profile ways. The five strategies Petersen identifies are clear, hold, build (CHB); decapitation; community mobilization; homogenization; and war-fighting.

“The war from the U.S. side was highly decentralized,” Petersen says. Military captains, who typically command about 140 to 150 soldiers, had fairly wide latitude in how they chose to fight.

“It was a captain’s war in a lot of ways,” Petersen adds.

The point is emphatically driven home in one chapter, “Captain Wright Goes to Baghdad,” co-authored with Col. Timothy Wright PhD ’18, who wrote his MIT political science dissertation based on his experience in company command during the surge period.

As Petersen also notes, drawing on government data, the U.S. managed to suppress violence fairly effectively at times, particularly before 2006 and after 2008. “The professional soldiers tried to do a good job, but some of the problems they weren’t going to solve,” Petersen says.

Still, all of this raises a conundrum. If trying to start a new state in Iraq was always likely to lead to an increase in Shia power, is there really much the U.S. could have done differently?

“That’s a million-dollar question,” Petersen says.

Perhaps the best way to engage with it, Petersen notes, is to recognize the importance of studying how factional groups grasp power through use of violence, and how that emerges in society. It is a key issue running throughout Petersen’s work, and one, he notes, that has often been studied by his graduate students in MIT’s Security Studies Program.

“Death, Dominance, and State-Building” has received praise from foreign-policy scholars. Paul Staniland, a political scientist at the University of Chicago, has said the work combines “intellectual creativity with careful attention to on-the-ground dynamics,” and is “a fascinating macro-level account of the politics of group competition in Iraq. This book is required reading for anyone interested in civil war, U.S. foreign policy, or the politics of violent state-building.”

Petersen, for his part, allows that he was pleased when one marine who served in Iraq read the manuscript in advance and found it interesting.

“He said, ‘This is good, and it’s not the way we think about it,’” Petersen says. “That’s my biggest compliment, to have a practitioner say it made them think. If I can get that kind of reaction, I’ll be pleased.”