Insights into political outsiders
As the old saw has it, 90 percent of politics is just showing up. Which is fine for people who are already engaged in the political system and expect to influence it. What about everyone else? The U.S. has millions and millions of people who typically do not vote or participate in politics. Is there a way into political life for those who are normally disconnected from it?
This is a topic MIT political scientist Ariel White has been studying closely over the last decade. White conducts careful empirical research on typically overlooked subjects, such as the relationship between incarceration and political participation; the way people interact with government administrators; and how a variety of factors, from media coverage to income inequality, influence engagement with politics.
The media heavily cover the views of frequent voters in certain areas, but they pay far less attention to citizens who do not vote regularly but could. Understanding such people might help us better grasp U.S. politics.
“I think there is a much broader story to be told here,” says White, an associate professor in MIT’s Department of Political Science.
Study by study, her research has been telling that story. Even short, misdemeanor-linked jail terms, White has found, reduce the likelihood that people will vote — and lower the propensity of family members to vote as well. When people are convicted of felonies, they often lose their right to vote, but they also vote at low rates when eligible. Other studies by White also suggest that an 8 percent minimum wage increase leads to an increase in turnout of about one-third of 1 percent, and that those receiving public benefits are far less likely to vote than those who do not.
These issues are often viewed in partisan terms, although the reality, White thinks, is considerably more complex. When evaluating infrequent or disconnected voters, we do not know enough to make assumptions about these matters.
“Getting people with past criminal convictions registered and voting, when they are eligible, is not a surefire partisan advantage for anybody,” White says. “There’s a lot of heterogeneity in this group, which is not what people assume. Legislators tend to treat this as a partisan issue, but at the mass public level you see less polarization, and more people are willing to support a path for others back into daily life.”
Experiences matter
White grew up near Rochester, New York, and majored in economics and government at Cornell University. She says that initially she never considered entering academia, and tried her hand at a few jobs after graduation. One of them, working as an Americorps-funded paralegal in a legal services office, had a lasting influence; she started thinking more about the nature of government-citizen interactions in these settings.
“It really stuck in my mind the way people’s experiences, one-on-one with a person who is representing government, when trying to get benefits, really shapes people’s views about how government is going to operate and see them, and what they can expect from the state,” White says. “People’s experiences with government matter for what they do politically.”
Before long, White was accepted into the doctoral program at Harvard University, where she earned an MA in 2012 and her PhD in 2016. White then joined the MIT faculty, also in 2016, and has remained at the Institute ever since.
White’s first published paper, in 2015, co-authored with Julie Faller and Noah Nathan, found that government officials tended to have different levels of responsiveness when providing voting information to people of apparently different ethnicities. It won an award from the American Political Science Association. (Nathan is now also a faculty member at MIT.)
Since then, White has published a string of papers examining how many factors interact with voting propensities. In one study focused on Pennsylvania, she found that public benefits recipients made up 20 percent of eligible voters in 2020 but just 12 percent of those who voted. When examining the criminal justice system, White has found that even short-term jail time leads to a turnout drop of several percentage points among the incarcerated. Family members of those serving even short jail sentences are less likely to vote in the near term too, although their participation rebounds over time.
“People don’t often think of incarceration as a thing they connect with politics,” White says. “Descriptively, with many people who have had the experience of incarceration or criminal convictions, or who are living in families or neighborhoods with a lot of it, we don’t see a lot of political action, and we see low levels of voting. Given how widespread incarceration is in the U.S., it seems like one of the most common and impactful things the government can do. But for a long time it was left to sociology to study.”
How to reach people?
Having determined that citizens are less likely to vote in many circumstances, White’s research is now evolving toward a related question: What are the most viable ways of changing that? To be sure, nothing is likely to create a tsunami of new voters. Even where people convicted of felonies can vote from prison, she found in still another study, they do so at single-digit rates. People who are used to not voting are not going to start voting at high rates, on aggregate.
Still, this fall, White led a new field experiment on getting unregistered voters to both register and vote. She and her colleagues designed the study to see whether friends of unregistered voters might be especially able to get people in their networks to join the voter rolls. The results are still under review. But for White, it is a new area where many kinds of experiments and studies seem possible.
“Political science in general and the world of actual practicing political campaigns knows an awful lot about how to get registered voters to turn out to vote,” White says. “There’s so much work on get-out-the-vote activities, mailers and calls and texts. We know way, way less about the 1-in-4 or so eligible voters who are simply not registered at all, and are in a very real sense invisible in the political landscape. Overwhelmingly, the people I’m curious about fall into that category.”
It is also a subject that she hopes will sustain the interest of her students. White’s classes tend to be filled by students with many different registered majors but an abiding interest in civic life. White wants them to come away with a more informed sense of their civic landscape, as well as new tools for conducting clean empirical studies. And, who knows? Like White herself, some of her students may end up making a career out of political engagement, even if they don’t know it yet.
“I really like working with MIT students,” White says. “I do hope my students gain some key understandings about what we know about political life, and how we can know about it, which I think are likely to be helpful to them in a variety of realms. My hope is they take a fundamental understanding of social science research, and some big questions, and some big concepts, out into the world.”
Coffee fix: MIT students decode the science behind the perfect cup
Elaine Jutamulia ’24 took a sip of coffee with a few drops of anise extract. It was her second try.
“What do you think?” asked Omar Orozco, standing at a lab table in MIT’s Breakerspace, surrounded by filters, brewing pots, and other coffee paraphernalia.
“I think when I first tried it, it was still pretty bitter,” Jutamulia said thoughtfully. “But I think now that it’s steeped for a little bit — it took out some of the bitterness.”
Jutamulia and current MIT senior Orozco were part of class 3.000 (Coffee Matters: Using the Breakerspace to Make the Perfect Cup), a new MIT course that debuted in spring 2024. The class combines lectures on chemistry and the science of coffee with hands-on experimentation and group projects. Their project explored how additives such as anise, salt, and chili oil influence coffee extraction — the process of dissolving flavor compounds from ground coffee into water — to improve taste and correct common brewing errors.
Alongside tasting, they used an infrared spectrometer to identify the chemical compounds in their coffee samples that contribute to flavor. Does anise make bitter coffee smoother? Could chili oil balance the taste?
“Generally speaking, if we could make a recommendation, that’s what we’re trying to find,” Orozco said.
A three-unit “discovery class” designed to help first-year students explore majors, 3.000 was widely popular, enrolling more than 50 students. Its success was driven by the beverage at its core and the class’s hands-on approach, which pushes students to ask and answer questions they might not have otherwise.
For aeronautics and astronautics majors Gabi McDonald and McKenzie Dinesen, coffee was the draw, but the class encouraged them to experiment and think in new ways. “It’s easy to drop people like us in, who love coffee, and, ‘Oh my gosh, there’s this class where we can go make coffee half the time and try all different kinds of things?’” McDonald says.
Percolating knowledge
The class pairs weekly lectures on topics such as coffee chemistry, the anatomy and composition of a coffee bean, the effects of roasting, and the brewing process with tasting sessions — students sample coffee brewed from different beans, roasts, and grinds. In the MIT Breakerspace, a new space on campus conceived and managed by the Department of Materials Science and Engineering (DMSE), students use equipment such as a digital optical microscope to examine ground coffee particles and a scanning electron microscope, which shoots beams of electrons at samples to reveal cross-sections of beans in stunning detail.
Once students learn to operate instruments for guided tasks, they form groups and design their own projects.
“The driver for those projects is some question they have about coffee raised by one of the lectures or the tasting sessions, or just something they’ve always wanted to know,” says DMSE Professor Jeffrey Grossman, who designed and teaches the class. “Then they’ll use one or more of these pieces of equipment to shed some light on it.”
Grossman traces the origins of the class to his initial vision for the Breakerspace, a laboratory for materials analysis and lounge for MIT undergraduates. Opened in November 2023, the space gives students hands-on experience with materials science and engineering, an interdisciplinary field combining chemistry, physics, and engineering to probe the composition and structure of materials.
“The world is made of stuff, and these are the tools to understand that stuff and bring it to life,” says Grossman. So he envisioned a class that would give students an “exploratory, inspiring nudge.”
“Then the question wasn’t the pedagogy, it was, ‘What’s the hook?’ In materials science, there are a lot of directions you could go, but if you have one that inspires people because they know it and maybe like it already, then that’s exciting.”
Cup of ambition
That hook, of course, was coffee, the second-most-consumed beverage after water. It captured students’ imagination and motivated them to push boundaries.
Orozco brought a fair amount of coffee knowledge to the class. In 2023, he taught in Mexico through the MISTI Global Teaching Labs program, where he toured several coffee farms and acquired a deeper knowledge of the beverage. He learned, for example, that black coffee, contrary to general American opinion, isn’t naturally bitter; bitterness arises from certain compounds that develop during the roasting process.
“If you properly brew it with the right beans, it actually tastes good,” says Orozco, a humanities and engineering major. A year later, in 3.000, he expanded his understanding of making a good brew, particularly through the group project with Jutamulia and other students to fix bad coffee.
The group prepared a control sample of “perfectly brewed” coffee — based on taste, coffee-to-water ratio, and other standards covered in class — alongside coffee that was under-extracted and over-extracted. Under-extracted coffee, made with water that isn’t hot enough or brewed for too short a time, tastes sharp or sour. Over-extracted coffee, brewed with too much coffee or for too long, tastes bitter.
The group then added the additives to these samples and analyzed them using Fourier Transform Infrared (FTIR) spectroscopy, measuring how the coffee absorbed infrared light to identify flavor-related compounds. Jutamulia examined FTIR readings taken from a sample with lime juice to see how the citric acid influenced its chemical profile.
“Can we find any correlation between what we saw and the existing known measurements of citric acid?” asks Jutamulia, who studied computation and cognition at MIT, graduating last May.
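That kind of comparison can be made concrete in a few lines of analysis code. The sketch below is an illustration only, with hypothetical file names and an assumed absorption band: it compares the mean FTIR absorbance of a control brew and a lime-juice-dosed brew in the carbonyl (C=O) region, roughly 1680-1760 cm^-1, where carboxylic acids such as citric acid absorb strongly.

```python
# A minimal sketch: hypothetical file names, assumed band choice.
import numpy as np

def load_spectrum(path):
    """Load a two-column CSV: wavenumber (cm^-1), absorbance."""
    data = np.loadtxt(path, delimiter=",")
    return data[:, 0], data[:, 1]

def band_mean(wavenumber, absorbance, lo=1680.0, hi=1760.0):
    """Mean absorbance within a wavenumber window (carbonyl region by default)."""
    mask = (wavenumber >= lo) & (wavenumber <= hi)
    return absorbance[mask].mean()

wn_ctrl, ab_ctrl = load_spectrum("control_brew.csv")     # hypothetical file
wn_lime, ab_lime = load_spectrum("lime_juice_brew.csv")  # hypothetical file

print(f"carbonyl-band absorbance: control={band_mean(wn_ctrl, ab_ctrl):.3f}, "
      f"lime={band_mean(wn_lime, ab_lime):.3f}")
```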
Another group dove into coffee storage, questioning why conventional wisdom advises against freezing.
“We just wondered why that’s the case,” says electrical engineering and computer science major Noah Wiley, a coffee enthusiast with his own espresso machine.
The team compared storage methods such as freezing brewed coffee, freezing ground coffee, and grinding whole beans after freezing, evaluating each method's impact on flavor and chemical composition.
“Then we’re going to see which ones taste good,” says Wiley. The team used a class coffee review sheet to record attributes like acidity, bitterness, sweetness, and overall flavor, pairing the results with FTIR analysis to determine how storage affected taste.
Wiley acknowledged that “good” is subjective. “Sometimes there’s a group consensus. I think people like fuller coffee, not watery,” he says.
Other student projects compared caffeine levels in different coffee types, analyzed the effect of microwaving coffee on its chemical composition and flavor, and investigated the differences between authentic and counterfeit coffee beans.
“We gave the students some papers to look at in case they were interested,” says Justin Lavallee, Breakerspace manager and co-teacher of the class. “But mostly we told them to focus on something they wanted to learn more about.”
Drip, drip, drip
Beyond answering specific questions about coffee, both students and teachers gained deeper insights into the beverage.
“Coffee is a complicated material. There are thousands of molecules in the beans, which change as you roast and extract them,” says Grossman. “The number of ways you can engineer this collection of molecules — it’s profound, ranging from where and how the coffee’s grown to how the cherries are then treated to get the beans to how the beans are roasted and ground to the brewing method you use.”
Dinesen learned firsthand, discovering, for example, that darker roasts have less caffeine than lighter roasts, puncturing a common misconception. “You can vary coffee so much — just with the roast of the bean, the size of the ground,” she says. “It’s so easily manipulatable, if that’s a word.”
In addition to learning about the science and chemistry behind coffee, Dinesen and McDonald gained new brewing techniques, like using a pour-over cone. The pair even incorporated coffee making and testing into their study routine, brewing coffee while tackling problem sets for another class.
“I would put my pour-over cone in my backpack with a Ziploc bag full of grounds, and we would go to the Student Center and pull out the cone, a filter, and the coffee grounds,” McDonald says. “And then we would make pour-overs while doing a P-set. We tested different amounts of water, too. It was fun.”
Tony Chen, a materials science and engineering major, reflected on the title of 3.000 — “Using the Breakerspace to Make the Perfect Cup” — and whether making a perfect cup is possible. “I don’t think there’s one perfect cup because each person has their own preferences. I don’t think I’ve gotten to mine yet,” he says.
Enthusiasm for coffee’s complexity and the discovery process was exactly what Grossman hoped to inspire in his students. “The best part for me was also just seeing them developing their own sense of curiosity,” he says.
He recalled a moment early in the class when students, after being given a demo of the optical microscope, saw the surface texture of a magnified coffee bean, the mottled shades of color, and the honeycomb-like pattern of tiny irregular cells.
“They’re like, ‘Wait a second. What if we add hot water to the grounds while it’s under the microscope? Would we see the extraction?’ So, they got hot water and some ground coffee beans, and lo and behold, it looked different. They could see the extraction right there,” Grossman says. “It’s like they have an idea that’s inspired by the learning, and they go and try it. I saw that happen many, many times throughout the semester.”
Personal interests can influence how children’s brains respond to language
A recent study from the McGovern Institute for Brain Research shows how interests can modulate language processing in children’s brains and paves the way for personalized brain research.
The study, described in a paper in Imaging Neuroscience, was conducted in the lab of MIT professor and McGovern Institute investigator John Gabrieli, and led by senior author Anila D’Mello, a recent McGovern postdoc who is now an assistant professor at the University of Texas Southwestern Medical Center and the University of Texas at Dallas.
“Traditional studies give subjects identical stimuli to avoid confounding the results,” says Gabrieli, who is the Grover Hermann Professor of Health Sciences and Technology and a professor of brain and cognitive sciences at MIT. “However, our research tailored stimuli to each child’s interest, eliciting stronger — and more consistent — activity patterns in the brain’s language regions across individuals.”
This work unveils a new paradigm that challenges current methods and shows how personalization can be a powerful strategy in neuroscience. The paper’s co-first authors are Halie Olson, a postdoc at the McGovern Institute, and Kristina Johnson PhD ’21, an assistant professor at Northeastern University and former doctoral student at the MIT Media Lab. “Our research integrates participants’ lived experiences into the study design,” says Johnson. “This approach not only enhances the validity of our findings, but also captures the diversity of individual perspectives, often overlooked in traditional research.”
Taking interest into account
When it comes to language, our interests are like operators behind the switchboard. They guide what we talk about and who we talk to. Research suggests that interests are also potent motivators and can help improve language skills. For instance, children score higher on reading tests when the material covers topics that are interesting to them.
But neuroscience has shied away from using personal interests to study the brain, especially in the realm of language. This is mainly because interests, which vary between people, could throw a wrench into experimental control — a core principle that drives scientists to limit factors that can muddle the results.
Gabrieli, D’Mello, Olson, and Johnson ventured into this unexplored territory. The team wondered if tailoring language stimuli to children’s interests might lead to higher responses in language regions of the brain. “Our study is unique in its approach to control the kind of brain activity our experiments yield, rather than control the stimuli we give subjects,” says D’Mello. “This stands in stark contrast to most neuroimaging studies that control the stimuli but might introduce differences in each subject’s level of interest in the material.”
In their recent study, the authors recruited a cohort of 20 children to investigate how personal interests affected the way the brain processes language. Caregivers described their child’s interests to the researchers, spanning baseball, train lines, “Minecraft,” and musicals. During the study, children listened to audio stories tuned to their unique interests. They were also presented with audio stories about nature (this was not an interest among the children) for comparison. To capture brain activity patterns, the team used functional magnetic resonance imaging (fMRI), which measures changes in blood flow caused by underlying neural activity.
New insights into the brain
“We found that, when children listened to stories about topics they were really interested in, they showed stronger neural responses in language areas than when they listened to generic stories that weren’t tailored to their interests,” says Olson. “Not only does this tell us how interests affect the brain, but it also shows that personalizing our experimental stimuli can have a profound impact on neuroimaging results.”
The researchers noticed a particularly striking result. “Even though the children listened to completely different stories, their brain activation patterns were more overlapping with their peers when they listened to idiosyncratic stories compared to when they listened to the same generic stories about nature,” says D’Mello. This, she notes, points to how interests can boost both the magnitude and consistency of signals in language regions across subjects without changing how these areas communicate with each other.
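One common way to quantify that kind of overlap is inter-subject correlation. The sketch below is a rough stand-in, not the study's actual pipeline: it runs a leave-one-out inter-subject correlation on synthetic language-region time courses, with all array shapes and names chosen purely for illustration.

```python
# A minimal sketch with synthetic data; not the study's analysis code.
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_timepoints = 20, 200
# Rows: one averaged language-region BOLD time course per child (synthetic).
bold = rng.standard_normal((n_subjects, n_timepoints))

def isc_leave_one_out(timecourses):
    """Correlate each subject with the mean time course of all the others."""
    n = timecourses.shape[0]
    scores = []
    for i in range(n):
        rest = np.delete(timecourses, i, axis=0).mean(axis=0)
        scores.append(np.corrcoef(timecourses[i], rest)[0, 1])
    return np.array(scores)

print("mean ISC:", isc_leave_one_out(bold).mean())
```

Higher mean ISC for the interest-tailored stories than for the shared nature stories would correspond to the overlap the researchers describe.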
Gabrieli noted another finding: “In addition to the stronger engagement of language regions for content of interest, there was also stronger activation in brain regions associated with reward and also with self-reflection.” Personal interests are individually relevant and can be rewarding, potentially driving higher activation in these regions during personalized stories.
These personalized paradigms might be particularly well-suited to studies of the brain in unique or neurodivergent populations. Indeed, the team is already applying these methods to study language in the brains of autistic children.
This study breaks new ground in neuroscience and serves as a prototype for future work that personalizes research to unearth further knowledge of the brain. In doing so, scientists can compile a more complete understanding of the type of information that is processed by specific brain circuits and more fully grasp complex functions such as language.
The role of modeling in the energy transition
Joseph F. DeCarolis, administrator for the U.S. Energy Information Administration (EIA), has one overarching piece of advice for anyone poring over long-term energy projections.
“Whatever you do, don’t start believing the numbers,” DeCarolis said at the MIT Energy Initiative (MITEI) Fall Colloquium. “There’s a tendency when you sit in front of the computer and you’re watching the model spit out numbers at you … that you’ll really start to believe those numbers with high precision. Don’t fall for it. Always remain skeptical.”
This event was part of MITEI’s new speaker series, MITEI Presents: Advancing the Energy Transition, which connects the MIT community with the energy experts and leaders who are working on scientific, technological, and policy solutions that are urgently needed to accelerate the energy transition.
The point of DeCarolis’s talk, titled “Stay humble and prepare for surprises: Lessons for the energy transition,” was not that energy models are unimportant. On the contrary, DeCarolis said, energy models give stakeholders a framework that allows them to consider present-day decisions in the context of potential future scenarios. However, he repeatedly stressed the importance of accounting for uncertainty, and not treating these projections as “crystal balls.”
“We can use models to help inform decision strategies,” DeCarolis said. “We know there’s a bunch of future uncertainty. We don’t know what’s going to happen, but we can incorporate that uncertainty into our model and help come up with a path forward.”
Dialogue, not forecasts
EIA is the statistical and analytic agency within the U.S. Department of Energy, with a mission to collect, analyze, and disseminate independent and impartial energy information to help stakeholders make better-informed decisions. Although EIA analyzes the impacts of energy policies, the agency does not make or advise on policy itself. DeCarolis, who was previously professor and University Faculty Scholar in the Department of Civil, Construction, and Environmental Engineering at North Carolina State University, noted that EIA does not need to seek approval from anyone else in the federal government before publishing its data and reports. “That independence is very important to us, because it means that we can focus on doing our work and providing the best information we possibly can,” he said.
Among the many reports produced by EIA is the agency’s Annual Energy Outlook (AEO), which projects U.S. energy production, consumption, and prices. Every other year, the agency also produces the AEO Retrospective, which shows the relationship between past projections and actual energy indicators.
“The first question you might ask is, ‘Should we use these models to produce a forecast?’” DeCarolis said. “The answer for me to that question is: No, we should not do that. When models are used to produce forecasts, the results are generally pretty dismal.”
DeCarolis pointed to wildly inaccurate past projections about the proliferation of nuclear energy in the United States as an example of the problems inherent in forecasting. However, he noted, there are “still lots of really valuable uses” for energy models. Rather than using them to predict future energy consumption and prices, DeCarolis said, stakeholders should use models to inform their own thinking.
“[Models] can simply be an aid in helping us think and hypothesize about the future of energy,” DeCarolis said. “They can help us create a dialogue among different stakeholders on complex issues. If we’re thinking about something like the energy transition, and we want to start a dialogue, there has to be some basis for that dialogue. If you have a systematic representation of the energy system that you can advance into the future, we can start to have a debate about the model and what it means. We can also identify key sources of uncertainty and knowledge gaps.”
Modeling uncertainty
The key to working with energy models is not to try to eliminate uncertainty, DeCarolis said, but rather to account for it. One way to better understand uncertainty, he noted, is to look at past projections, and consider how they ended up differing from real-world results. DeCarolis pointed to two “surprises” over the past several decades: the exponential growth of shale oil and natural gas production (which had the impact of limiting coal’s share of the energy market and therefore reducing carbon emissions), as well as the rapid rise in wind and solar energy. In both cases, market conditions changed far more quickly than energy modelers anticipated, leading to inaccurate projections.
“For all those reasons, we ended up with [projected] CO2 [carbon dioxide] emissions that were quite high compared to actual,” DeCarolis said. “We’re a statistical agency, so we’re really looking carefully at the data, but it can take some time to identify the signal through the noise.”
Although EIA does not produce forecasts in the AEO, people have sometimes interpreted the reference case in the agency’s reports as predictions. In an effort to illustrate the unpredictability of future outcomes in the 2023 edition of the AEO, the agency added “cones of uncertainty” to its projection of energy-related carbon dioxide emissions, with ranges of outcomes based on the difference between past projections and actual results. One cone captures 50 percent of historical projection errors, while another represents 95 percent of historical errors.
“They capture whatever bias there is in our projections,” DeCarolis said of the uncertainty cones. “It’s being captured because we’re comparing actual [emissions] to projections. The weakness of this, though, is: who’s to say that those historical projection errors apply to the future? We don’t know that, but I still think that there’s something useful to be learned from this exercise.”
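The construction itself is straightforward. The sketch below uses made-up numbers, not EIA data: it derives 50 percent and 95 percent "cones" around a point projection from the empirical quantiles of past projection errors, mirroring the retrospective comparison DeCarolis describes.

```python
# A minimal sketch with hypothetical numbers; not EIA's methodology or data.
import numpy as np

# Historical fractional errors of past projections vs. actual outcomes,
# one entry per past projection (illustrative values).
past_errors = np.array([-0.12, -0.05, 0.02, 0.08, 0.15, -0.03, 0.20, -0.18])

projection = 4800.0  # e.g., projected CO2 emissions (million metric tons)

for coverage in (0.50, 0.95):
    lo_q, hi_q = (1 - coverage) / 2, 1 - (1 - coverage) / 2
    lo, hi = np.quantile(past_errors, [lo_q, hi_q])
    print(f"{int(coverage * 100)}% cone: "
          f"{projection * (1 + lo):.0f} to {projection * (1 + hi):.0f}")
```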
The future of energy modeling
Looking ahead, DeCarolis said, there is a “laundry list of things that keep me up at night as a modeler.” These include the impacts of climate change; how those impacts will affect demand for renewable energy; how quickly industry and government will overcome obstacles to building out clean energy infrastructure and supply chains; technological innovation; and increased energy demand from data centers running compute-intensive workloads.
“What about enhanced geothermal? Fusion? Space-based solar power?” DeCarolis asked. “Should those be in the model? What sorts of technology breakthroughs are we missing? And then, of course, there are the unknown unknowns — the things that I can’t conceive of to put on this list, but are probably going to happen.”
In addition to capturing the fullest range of outcomes, DeCarolis said, EIA wants to be flexible, nimble, transparent, and accessible — creating reports that can easily incorporate new model features and produce timely analyses. To that end, the agency has undertaken two new initiatives. First, the 2025 AEO will use a revamped version of the National Energy Modeling System that includes modules for hydrogen production and pricing, carbon management, and hydrocarbon supply. Second, an effort called Project BlueSky is aiming to develop the agency’s next-generation energy system model, which DeCarolis said will be modular and open source.
DeCarolis noted that the energy system is both highly complex and rapidly evolving, and he warned that “mental shortcuts” and the fear of being wrong can lead modelers to ignore possible future developments. “We have to remain humble and intellectually honest about what we know,” DeCarolis said. “That way, we can provide decision-makers with an honest assessment of what we think could happen in the future.”
How hard is it to prevent recurring blackouts in Puerto Rico?
Researchers at MIT’s Laboratory for Information and Decision Systems (LIDS) have shown that using decision-making software and dynamic monitoring of weather and energy use can significantly improve resiliency in the face of weather-related outages, and can also help to efficiently integrate renewable energy sources into the grid.
The researchers point out that the system they suggest might have prevented or at least lessened the kind of widespread power outage that Puerto Rico experienced last week by providing analysis to guide rerouting of power through different lines and thus limit the spread of the outage.
The computer platform, which the researchers describe as DyMonDS, for Dynamic Monitoring and Decision Systems, can be used to enhance the existing operating and planning practices used in the electric industry. The platform supports interactive information exchange and decision-making between the grid operators and grid-edge users — all the distributed power sources, storage systems and software that contribute to the grid. It also supports optimization of available resources and controllable grid equipment as system conditions vary. It further lends itself to implementing cooperative decision-making by different utility- and non-utility-owned electric power grid users, including portfolios of mixed resources, users, and storage. Operating and planning the interactions of the end-to-end high-voltage transmission grid with local distribution grids and microgrids represents another major potential use of this platform.
This general approach was illustrated using a set of publicly available data on both meteorology and details of electricity production and distribution in Puerto Rico. Extended AC optimal power flow software developed by SmartGridz Inc. is used for system-level optimization of controllable equipment. This provides real-time guidance for deciding how much power, and through which transmission lines, should be channeled by adjusting plant dispatch and voltage-related set points and, in extreme cases, where to reduce or cut power in order to maintain physically implementable service for as many customers as possible. The team found that the use of such a system can help to ensure that the greatest number of critical services maintain power even during a hurricane, and at the same time can lead to a substantial decrease in the need for construction of new power plants thanks to more efficient use of existing resources.
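The underlying optimization can be illustrated on a toy example. The sketch below is a textbook DC optimal power flow on a made-up three-bus triangle network, not the SmartGridz software: it picks the cheapest generator dispatch that respects line limits, and a binding limit forces generation to shift so that power reroutes through less-loaded lines.

```python
# A minimal sketch: toy 3-bus network, made-up costs and limits.
import numpy as np
from scipy.optimize import linprog

# PTDF rows: lines 1-2, 1-3, 2-3; columns: injections at buses 1 and 2.
# Bus 3 is the slack and carries the full 100 MW load; equal line reactances.
PTDF = np.array([[1/3, -1/3],
                 [2/3,  1/3],
                 [1/3,  2/3]])
line_limit = 55.0      # MW, same thermal limit on each line
load = 100.0           # MW at bus 3
cost = [20.0, 30.0]    # $/MWh for generators at buses 1 and 2

# Line-flow constraints: -limit <= PTDF @ g <= limit
A_ub = np.vstack([PTDF, -PTDF])
b_ub = np.full(6, line_limit)

res = linprog(c=cost,
              A_ub=A_ub, b_ub=b_ub,
              A_eq=[[1.0, 1.0]], b_eq=[load],  # generation meets load
              bounds=[(0, 80), (0, 80)],       # generator capacities
              method="highs")
g = res.x
print("dispatch (MW):", g, "line flows (MW):", PTDF @ g)
```

In this toy case the cheap generator would run at its 80 MW cap, but that would overload the 1-3 line, so the solver backs it down to 65 MW and reroutes the difference, the kind of set-point adjustment the article describes.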
The findings are described in a paper in the journal Foundations and Trends in Electric Energy Systems, by MIT LIDS researchers Marija Ilic and Laurentiu Anton, along with recent alumna Ramapathi Jaddivada.
“Using this software,” Ilic says, they show that “even during bad weather, if you predict equipment failures, and by using that information exchange, you can localize the effect of equipment failures and still serve a lot of customers, 50 percent of customers, when otherwise things would black out.”
Anton says that “the way many grids today are operated is sub-optimal.” As a result, “we showed how much better they could do even under normal conditions, without any failures, by utilizing this software.” The savings resulting from this optimization, under everyday conditions, could be in the tens of percents, they say.
The way utility systems plan currently, Ilic says, “usually the standard is that they have to build enough capacity and operate in real time so that if one large piece of equipment fails, like a large generator or transmission line, you still serve customers in an uninterrupted way. That’s what’s called N-minus-1.” Under this policy, if one major component of the system fails, they should be able to maintain service for at least 30 minutes. That system allows utilities to plan for how much reserve generating capacity they need to have on hand. That’s expensive, Ilic points out, because it means maintaining this reserve capacity all the time, even under normal operating conditions when it’s not needed.
In addition, “right now there are no criteria for what I call N-minus-K,” she says. If bad weather causes five pieces of equipment to fail at once, “there is no software to help utilities decide what to schedule” in terms of keeping the most customers, and the most important services such as hospitals and emergency services, provided with power. They showed that even with 50 percent of the infrastructure out of commission, it would still be possible to keep power flowing to a large proportion of customers.
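The shape of that N-minus-K scheduling problem can be sketched simply. The toy example below uses illustrative numbers, not real utility data or the researchers' method: given whatever deliverable capacity survives a multi-failure event, it brute-forces the set of loads to serve so that high-priority services such as hospitals come first.

```python
# A minimal sketch with illustrative loads and priority weights.
from itertools import combinations

loads = {  # name: (MW demand, priority weight)
    "hospital":    (30, 10.0),
    "emergency":   (20, 10.0),
    "industrial":  (60, 2.0),
    "residential": (90, 1.0),
}

def best_service_plan(capacity_mw):
    """Pick the subset of loads with the highest priority-weighted demand
    that fits under the surviving deliverable capacity."""
    names = list(loads)
    best, best_score = (), -1.0
    for r in range(len(names) + 1):
        for subset in combinations(names, r):
            mw = sum(loads[n][0] for n in subset)
            if mw > capacity_mw:
                continue
            score = sum(loads[n][0] * loads[n][1] for n in subset)
            if score > best_score:
                best, best_score = subset, score
    return best

# Suppose failures leave half of the normal 200 MW deliverable capacity.
print(best_service_plan(100.0))  # -> ('hospital', 'emergency')
```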
Their work on analyzing the power situation in Puerto Rico started after the island had been devastated by Hurricanes Irma and Maria. Most of the electric generation capacity is in the south, yet the largest loads are in San Juan, in the north, and Mayaguez in the west. When transmission lines get knocked down, a lot of rerouting of power needs to happen quickly.
With the new system, “the software finds the optimal adjustments for set points,” Anton says. For example, voltages can be changed to redirect power through less-congested lines, or raised to lessen power losses.
The software also helps in the long-term planning for the grid. As many fossil-fuel power plants are scheduled to be decommissioned soon in Puerto Rico, as they are in many other places, planning for how to replace that power without having to resort to greenhouse gas-emitting sources is a key to achieving carbon-reduction goals. And by analyzing usage patterns, the software can guide the placement of new renewable power sources where they can most efficiently provide power where and when it’s needed.
As plants are retired or as components are affected by weather, “We wanted to ensure the dispatchability of power when the load changes,” Anton says, “but also when crucial components are lost, to ensure the robustness at each step of the retirement schedule.”
One thing they found was that “if you look at how much generating capacity exists, it’s more than the peak load, even after you retire a few fossil plants,” Ilic says. “But it’s hard to deliver.” Strategic planning of new distribution lines could make a big difference.
Jaddivada, director of innovation at SmartGridz, says that “we evaluated different possible architectures in Puerto Rico, and we showed the ability of this software to ensure uninterrupted electricity service. This is the most important challenge utilities have today. They have to go through a computationally tedious process to make sure the grid functions for any possible outage in the system. And that can be done in a much more efficient way through the software that the company developed.”
The project was a collaborative effort among the MIT LIDS researchers and others at MIT Lincoln Laboratory and the Pacific Northwest National Laboratory, with support from SmartGridz software.