Personal interests can influence how children’s brains respond to language

A recent study from the McGovern Institute for Brain Research shows how interests can modulate language processing in children’s brains and paves the way for personalized brain research.

The paper, which appears in Imaging Neuroscience, was conducted in the lab of MIT professor and McGovern Institute investigator John Gabrieli, and led by senior author Anila D’Mello, a recent McGovern postdoc who is now an assistant professor at the University of Texas Southwestern Medical Center and the University of Texas at Dallas.

“Traditional studies give subjects identical stimuli to avoid confounding the results,” says Gabrieli, who is the Grover Hermann Professor of Health Sciences and Technology and a professor of brain and cognitive sciences at MIT. “However, our research tailored stimuli to each child’s interest, eliciting stronger — and more consistent — activity patterns in the brain’s language regions across individuals.” 

This work unveils a new paradigm that challenges current methods and shows how personalization can be a powerful strategy in neuroscience. The paper’s co-first authors are Halie Olson, a postdoc at the McGovern Institute, and Kristina Johnson PhD ’21, an assistant professor at Northeastern University and former doctoral student at the MIT Media Lab. “Our research integrates participants’ lived experiences into the study design,” says Johnson. “This approach not only enhances the validity of our findings, but also captures the diversity of individual perspectives, often overlooked in traditional research.”

Taking interest into account

When it comes to language, our interests are like operators behind the switchboard. They guide what we talk about and who we talk to. Research suggests that interests are also potent motivators and can help improve language skills. For instance, children score higher on reading tests when the material covers topics that are interesting to them.

But neuroscience has shied away from using personal interests to study the brain, especially in the realm of language. This is mainly because interests, which vary between people, could throw a wrench into experimental control — a core principle that drives scientists to limit factors that can muddle the results.

Gabrieli, D’Mello, Olson, and Johnson ventured into this unexplored territory. The team wondered if tailoring language stimuli to children’s interests might lead to higher responses in language regions of the brain. “Our study is unique in its approach to control the kind of brain activity our experiments yield, rather than control the stimuli we give subjects,” says D’Mello. “This stands in stark contrast to most neuroimaging studies that control the stimuli but might introduce differences in each subject’s level of interest in the material.”

In their recent study, the authors recruited a cohort of 20 children to investigate how personal interests affected the way the brain processes language. Caregivers described their child’s interests to the researchers, spanning baseball, train lines, “Minecraft,” and musicals. During the study, children listened to audio stories tuned to their unique interests. They were also presented with audio stories about nature (this was not an interest among the children) for comparison. To capture brain activity patterns, the team used functional magnetic resonance imaging (fMRI), which measures changes in blood flow caused by underlying neural activity.

New insights into the brain

“We found that, when children listened to stories about topics they were really interested in, they showed stronger neural responses in language areas than when they listened to generic stories that weren’t tailored to their interests,” says Olson. “Not only does this tell us how interests affect the brain, but it also shows that personalizing our experimental stimuli can have a profound impact on neuroimaging results.”

The researchers noticed a particularly striking result. “Even though the children listened to completely different stories, their brain activation patterns were more overlapping with their peers when they listened to idiosyncratic stories compared to when they listened to the same generic stories about nature,” says D’Mello. This, she notes, points to how interests can boost both the magnitude and consistency of signals in language regions across subjects without changing how these areas communicate with each other.
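One common way to quantify that kind of across-subject consistency is to correlate each pair of children’s activation maps and average the results. The study’s exact analysis pipeline isn’t reproduced here; the sketch below is only a minimal illustration of the idea, with synthetic activation vectors standing in for real fMRI data.

```python
# Minimal sketch (not the study's pipeline): compare across-subject consistency
# of language-region activation for personalized vs. generic stories.
# Each row is one child's voxel-wise activation vector; all values are synthetic.
import numpy as np

def mean_pairwise_correlation(activations: np.ndarray) -> float:
    """Average Pearson correlation between every pair of subjects' maps."""
    corr = np.corrcoef(activations)                 # subjects x subjects matrix
    pairs = corr[np.triu_indices_from(corr, k=1)]   # unique off-diagonal pairs
    return float(pairs.mean())

rng = np.random.default_rng(0)
shared = rng.normal(size=500)                              # toy shared "language response"
personalized = shared + 0.5 * rng.normal(size=(20, 500))   # stronger shared component
generic = 0.3 * shared + rng.normal(size=(20, 500))        # weaker shared component

print("personalized stories:", round(mean_pairwise_correlation(personalized), 2))
print("generic stories:     ", round(mean_pairwise_correlation(generic), 2))
```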

Gabrieli noted another finding: “In addition to the stronger engagement of language regions for content of interest, there was also stronger activation in brain regions associated with reward and also with self-reflection.” Personal interests are individually relevant and can be rewarding, potentially driving higher activation in these regions during personalized stories.

These personalized paradigms might be particularly well-suited to studies of the brain in unique or neurodivergent populations. Indeed, the team is already applying these methods to study language in the brains of autistic children.

This study breaks new ground in neuroscience and serves as a prototype for future work that personalizes research to unearth further knowledge of the brain. In doing so, scientists can compile a more complete understanding of the type of information that is processed by specific brain circuits and more fully grasp complex functions such as language. 

The role of modeling in the energy transition

Joseph F. DeCarolis, administrator for the U.S. Energy Information Administration (EIA), has one overarching piece of advice for anyone poring over long-term energy projections.

“Whatever you do, don’t start believing the numbers,” DeCarolis said at the MIT Energy Initiative (MITEI) Fall Colloquium. “There’s a tendency when you sit in front of the computer and you’re watching the model spit out numbers at you … that you’ll really start to believe those numbers with high precision. Don’t fall for it. Always remain skeptical.”

This event was part of MITEI’s new speaker series, MITEI Presents: Advancing the Energy Transition, which connects the MIT community with the energy experts and leaders who are working on scientific, technological, and policy solutions that are urgently needed to accelerate the energy transition.

The point of DeCarolis’s talk, titled “Stay humble and prepare for surprises: Lessons for the energy transition,” was not that energy models are unimportant. On the contrary, DeCarolis said, energy models give stakeholders a framework that allows them to consider present-day decisions in the context of potential future scenarios. However, he repeatedly stressed the importance of accounting for uncertainty, and not treating these projections as “crystal balls.”

“We can use models to help inform decision strategies,” DeCarolis said. “We know there’s a bunch of future uncertainty. We don’t know what’s going to happen, but we can incorporate that uncertainty into our model and help come up with a path forward.”

Dialogue, not forecasts

EIA is the statistical and analytic agency within the U.S. Department of Energy, with a mission to collect, analyze, and disseminate independent and impartial energy information to help stakeholders make better-informed decisions. Although EIA analyzes the impacts of energy policies, the agency does not make or advise on policy itself. DeCarolis, who was previously professor and University Faculty Scholar in the Department of Civil, Construction, and Environmental Engineering at North Carolina State University, noted that EIA does not need to seek approval from anyone else in the federal government before publishing its data and reports. “That independence is very important to us, because it means that we can focus on doing our work and providing the best information we possibly can,” he said.

Among the many reports produced by EIA is the agency’s Annual Energy Outlook (AEO), which projects U.S. energy production, consumption, and prices. Every other year, the agency also produces the AEO Retrospective, which shows the relationship between past projections and actual energy indicators.

“The first question you might ask is, ‘Should we use these models to produce a forecast?’” DeCarolis said. “The answer for me to that question is: No, we should not do that. When models are used to produce forecasts, the results are generally pretty dismal.”

DeCarolis pointed to wildly inaccurate past projections about the proliferation of nuclear energy in the United States as an example of the problems inherent in forecasting. However, he noted, there are “still lots of really valuable uses” for energy models. Rather than using them to predict future energy consumption and prices, DeCarolis said, stakeholders should use models to inform their own thinking.

“[Models] can simply be an aid in helping us think and hypothesize about the future of energy,” DeCarolis said. “They can help us create a dialogue among different stakeholders on complex issues. If we’re thinking about something like the energy transition, and we want to start a dialogue, there has to be some basis for that dialogue. If you have a systematic representation of the energy system that you can advance into the future, we can start to have a debate about the model and what it means. We can also identify key sources of uncertainty and knowledge gaps.”

Modeling uncertainty

The key to working with energy models is not to try to eliminate uncertainty, DeCarolis said, but rather to account for it. One way to better understand uncertainty, he noted, is to look at past projections, and consider how they ended up differing from real-world results. DeCarolis pointed to two “surprises” over the past several decades: the exponential growth of shale oil and natural gas production (which had the impact of limiting coal’s share of the energy market and therefore reducing carbon emissions), as well as the rapid rise in wind and solar energy. In both cases, market conditions changed far more quickly than energy modelers anticipated, leading to inaccurate projections.

“For all those reasons, we ended up with [projected] CO2 [carbon dioxide] emissions that were quite high compared to actual,” DeCarolis said. “We’re a statistical agency, so we’re really looking carefully at the data, but it can take some time to identify the signal through the noise.”

Although EIA does not produce forecasts in the AEO, people have sometimes interpreted the reference case in the agency’s reports as predictions. In an effort to illustrate the unpredictability of future outcomes in the 2023 edition of the AEO, the agency added “cones of uncertainty” to its projection of energy-related carbon dioxide emissions, with ranges of outcomes based on the difference between past projections and actual results. One cone captures 50 percent of historical projection errors, while another represents 95 percent of historical errors.

“They capture whatever bias there is in our projections,” DeCarolis said of the uncertainty cones. “It’s being captured because we’re comparing actual [emissions] to projections. The weakness of this, though, is: who’s to say that those historical projection errors apply to the future? We don’t know that, but I still think that there’s something useful to be learned from this exercise.”
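As a rough illustration of how such cones can be built (a general sketch of the idea, not EIA’s published methodology), one can pool the historical errors of past projections at a given horizon, take the band that covers 50 or 95 percent of those errors, and wrap that band around the new projection:

```python
# Illustrative "cone of uncertainty" built from historical projection errors.
# All numbers are hypothetical; this is not EIA's actual method or data.
import numpy as np

# Percent by which past projections at this horizon exceeded (+) or fell short
# of (-) the values eventually observed.
errors_pct = np.array([12.0, 6.0, 10.0, 7.5, -2.0, 4.0, 9.0, 1.5])

projection_mt = 4600.0  # hypothetical projected CO2 emissions, million metric tons

for coverage in (50, 95):
    lo, hi = np.percentile(errors_pct, [(100 - coverage) / 2, (100 + coverage) / 2])
    cone_low = projection_mt / (1 + hi / 100)   # if we overestimate again, actual is lower
    cone_high = projection_mt / (1 + lo / 100)  # if we underestimate, actual is higher
    print(f"{coverage}% cone: {cone_low:.0f} to {cone_high:.0f} Mt")
```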

The future of energy modeling

Looking ahead, DeCarolis said, there is a “laundry list of things that keep me up at night as a modeler.” These include the impacts of climate change; how those impacts will affect demand for renewable energy; how quickly industry and government will overcome obstacles to building out clean energy infrastructure and supply chains; technological innovation; and increased energy demand from data centers running compute-intensive workloads.

“What about enhanced geothermal? Fusion? Space-based solar power?” DeCarolis asked. “Should those be in the model? What sorts of technology breakthroughs are we missing? And then, of course, there are the unknown unknowns — the things that I can’t conceive of to put on this list, but are probably going to happen.”

In addition to capturing the fullest range of outcomes, DeCarolis said, EIA wants to be flexible, nimble, transparent, and accessible — creating reports that can easily incorporate new model features and produce timely analyses. To that end, the agency has undertaken two new initiatives. First, the 2025 AEO will use a revamped version of the National Energy Modeling System that includes modules for hydrogen production and pricing, carbon management, and hydrocarbon supply. Second, an effort called Project BlueSky is aiming to develop the agency’s next-generation energy system model, which DeCarolis said will be modular and open source.

DeCarolis noted that the energy system is both highly complex and rapidly evolving, and he warned that “mental shortcuts” and the fear of being wrong can lead modelers to ignore possible future developments. “We have to remain humble and intellectually honest about what we know,” DeCarolis said. “That way, we can provide decision-makers with an honest assessment of what we think could happen in the future.” 

How hard is it to prevent recurring blackouts in Puerto Rico?

Researchers at MIT’s Laboratory for Information and Decision Systems (LIDS) have shown that using decision-making software and dynamic monitoring of weather and energy use can significantly improve resiliency in the face of weather-related outages, and can also help to efficiently integrate renewable energy sources into the grid.

The researchers point out that the system they suggest might have prevented or at least lessened the kind of widespread power outage that Puerto Rico experienced last week by providing analysis to guide rerouting of power through different lines and thus limit the spread of the outage.

The computer platform, which the researchers describe as DyMonDS, for Dynamic Monitoring and Decision Systems, can be used to enhance the existing operating and planning practices used in the electric industry. The platform supports interactive information exchange and decision-making between the grid operators and grid-edge users — all the distributed power sources, storage systems, and software that contribute to the grid. It also supports optimization of available resources and controllable grid equipment as system conditions vary. It further lends itself to cooperative decision-making by different utility- and non-utility-owned electric power grid users, including portfolios of mixed resources, users, and storage. Operating and planning the interactions of the end-to-end high-voltage transmission grid with local distribution grids and microgrids represents another major potential use of this platform.

This general approach was illustrated using a set of publicly available data on both meteorology and details of electricity production and distribution in Puerto Rico. An extended AC Optimal Power Flow software package developed by SmartGridz Inc. was used for system-level optimization of controllable equipment. It provides real-time guidance for deciding how much power should be channeled, and through which transmission lines, by adjusting plant dispatch and voltage-related set points, and, in extreme cases, where to reduce or cut power in order to maintain physically implementable service for as many customers as possible. The team found that the use of such a system can help to ensure that the greatest number of critical services maintain power even during a hurricane, and at the same time can lead to a substantial decrease in the need for construction of new power plants thanks to more efficient use of existing resources.

The findings are described in a paper in the journal Foundations and Trends in Electric Energy Systems, by MIT LIDS researchers Marija Ilic and Laurentiu Anton, along with recent alumna Ramapathi Jaddivada.

“Using this software,” Ilic says, they show that “even during bad weather, if you predict equipment failures, and by using that information exchange, you can localize the effect of equipment failures and still serve a lot of customers, 50 percent of customers, when otherwise things would black out.”

Anton says that “the way many grids today are operated is sub-optimal.” As a result, “we showed how much better they could do even under normal conditions, without any failures, by utilizing this software.” The savings resulting from this optimization, under everyday conditions, could be in the tens of percent, they say.

The way utility systems plan currently, Ilic says, “usually the standard is that they have to build enough capacity and operate in real time so that if one large piece of equipment fails, like a large generator or transmission line, you still serve customers in an uninterrupted way. That’s what’s called N-minus-1.” Under this policy, if one major component of the system fails, they should be able to maintain service for at least 30 minutes. That system allows utilities to plan for how much reserve generating capacity they need to have on hand. That’s expensive, Ilic points out, because it means maintaining this reserve capacity all the time, even under normal operating conditions when it’s not needed.

In addition, “right now there are no criteria for what I call N-minus-K,” she says. If bad weather causes five pieces of equipment to fail at once, “there is no software to help utilities decide what to schedule” in terms of keeping the most customers, and the most important services such as hospitals and emergency services, provided with power. They showed that even with 50 percent of the infrastructure out of commission, it would still be possible to keep power flowing to a large proportion of customers.
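The reliability criteria Ilic describes can be made concrete with a toy screening loop: remove each set of K components in turn and check whether the remaining generating capacity still covers the load. The sketch below is a capacity-only simplification with made-up numbers; the actual software also models line limits, voltages, and the timing requirements mentioned above.

```python
# Toy N-minus-K screening: after K simultaneous generator outages, can the
# remaining capacity still cover peak load? Capacity-only, hypothetical data;
# real tools (like the AC optimal power flow described above) also model
# network flows, line limits, and voltages.
from itertools import combinations

generators_mw = {"south_1": 400, "south_2": 350, "north_gt": 150, "west_gt": 120}
peak_load_mw = 800

def survives(outaged: set) -> bool:
    remaining = sum(mw for name, mw in generators_mw.items() if name not in outaged)
    return remaining >= peak_load_mw

for k in (1, 2):
    failing = [combo for combo in combinations(generators_mw, k) if not survives(set(combo))]
    status = "secure" if not failing else f"violated for outages {failing}"
    print(f"N-minus-{k}: {status}")
```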

Their work on analyzing the power situation in Puerto Rico started after the island had been devastated by hurricanes Irma and Maria. Most of the electric generation capacity is in the south, yet the largest loads are in San Juan, in the north, and Mayaguez in the west. When transmission lines get knocked down, a lot of rerouting of power needs to happen quickly.

With the new systems, “the software finds the optimal adjustments for set points,” Anton says. For example, adjusting voltages can redirect power through less-congested lines or reduce power losses.

The software also helps in the long-term planning for the grid. As many fossil-fuel power plants are scheduled to be decommissioned soon in Puerto Rico, as they are in many other places, planning for how to replace that power without having to resort to greenhouse gas-emitting sources is a key to achieving carbon-reduction goals. And by analyzing usage patterns, the software can guide the placement of new renewable power sources where they can most efficiently provide power where and when it’s needed.

As plants are retired or as components are affected by weather, “We wanted to ensure the dispatchability of power when the load changes,” Anton says, “but also when crucial components are lost, to ensure the robustness at each step of the retirement schedule.”

One thing they found was that “if you look at how much generating capacity exists, it’s more than the peak load, even after you retire a few fossil plants,” Ilic says. “But it’s hard to deliver.” Strategic planning of new distribution lines could make a big difference.

Jaddivada, director of innovation at SmartGridz, says that “we evaluated different possible architectures in Puerto Rico, and we showed the ability of this software to ensure uninterrupted electricity service. This is the most important challenge utilities have today. They have to go through a computationally tedious process to make sure the grid functions for any possible outage in the system. And that can be done in a much more efficient way through the software that the company developed.”

The project was a collaborative effort between the MIT LIDS researchers and others at MIT Lincoln Laboratory and the Pacific Northwest National Laboratory, with support from SmartGridz software.

New filter captures and recycles aluminum from manufacturing waste

Used in everything from soda cans and foil wrap to circuit boards and rocket boosters, aluminum is the second-most-produced metal in the world after steel. By the end of this decade, demand is projected to drive up aluminum production by 40 percent worldwide. This steep rise will magnify aluminum’s environmental impacts, including any pollutants that are released with its manufacturing waste.

MIT engineers have developed a new nanofiltration process to curb the hazardous waste generated from aluminum production. Nanofiltration could potentially be used to process the waste from an aluminum plant and retrieve any aluminum ions that would otherwise have escaped in the effluent stream. The captured aluminum could then be upcycled and added to the bulk of the produced aluminum, increasing yield while simultaneously reducing waste.

The researchers demonstrated the new membrane’s performance in lab-scale experiments, using it to filter various solutions that were similar in content to the waste streams produced by aluminum plants. They found that the membrane selectively captured more than 99 percent of aluminum ions in these solutions.

If scaled up and implemented in existing production facilities, the membrane technology could reduce the amount of wasted aluminum and improve the environmental quality of the waste that plants generate.

“This membrane technology not only cuts down on hazardous waste but also enables a circular economy for aluminum by reducing the need for new mining,” says John Lienhard, the Abdul Latif Jameel Professor of Water in the Department of Mechanical Engineering, and director of the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) at MIT. “This offers a promising solution to address environmental concerns while meeting the growing demand for aluminum.”

Lienhard and his colleagues report their results in a study appearing today in the journal ACS Sustainable Chemistry and Engineering. The study’s co-authors include MIT mechanical engineering undergraduates Trent Lee and Vinn Nguyen, and Zi Hao Foo SM ’21, PhD ’24, who is a postdoc at the University of California at Berkeley.

A recycling niche

Lienhard’s group at MIT develops membrane and filtration technologies for desalinating seawater and remediating various sources of wastewater. In looking for new areas to apply their work, the team found an unexplored opportunity in aluminum and, in particular, the wastewater generated from the metal’s production.

As part of aluminum’s production, metal-rich ore, called bauxite, is first mined from open pits, then put through a series of chemical reactions to separate the aluminum from the rest of the mined rock. These reactions ultimately produce aluminum oxide, in a powdery form called alumina. Much of this alumina is then shipped to refineries, where the powder is poured into electrolysis vats containing a molten mineral called cryolite. When a strong electric current is applied, cryolite breaks alumina’s chemical bonds, separating aluminum and oxygen atoms. The pure aluminum then settles in liquid form to the bottom of the vat, where it can be collected and cast into various forms.

Cryolite electrolyte acts as a solvent, facilitating the separation of alumina during the molten salt electrolysis process. Over time, the cryolite accumulates impurities such as sodium, lithium, and potassium ions — gradually reducing its effectiveness in dissolving alumina. At a certain point, the concentration of these impurities reaches a critical level, at which the electrolyte must be replaced with fresh cryolite to maintain process efficiency. The spent cryolite, a viscous sludge containing residual aluminum ions and impurities, is then transported away for disposal.

“We learned that for a traditional aluminum plant, something like 2,800 tons of aluminum are wasted per year,” says lead author Trent Lee. “We were looking at ways that the industry can be more efficient, and we found cryolite waste hadn’t been well-researched in terms of recycling some of its waste products.”

A charged kick

In their new work, the researchers aimed to develop a membrane process to filter cryolite waste and recover aluminum ions that inevitably make it into the waste stream. Specifically, the team looked to capture aluminum while letting through all other ions, especially sodium, which builds up significantly in the cryolite over time.

The team reasoned that if they could selectively capture aluminum from cryolite waste, the aluminum could be poured back into the electrolysis vat without adding excessive sodium that would further slow the electrolysis process.

The researchers’ new design is an adaptation of membranes used in conventional water treatment plants. These membranes are typically made from a thin sheet of polymer material that is perforated by tiny, nanometer-scale pores, the size of which is tuned to let through specific ions and molecules.

The surface of conventional membranes carries a natural, negative charge. As a result, the membranes repel any ions that carry the same negative charge, while attracting positively charged ions and letting them flow through.

In collaboration with the Japanese membrane company Nitto Denko, the MIT team sought to examine the efficacy of commercially available membranes that could let most positively charged ions in cryolite wastewater pass through while repelling and capturing aluminum ions. However, aluminum ions also carry a positive charge, of +3, whereas sodium and the other cations carry a lesser positive charge of +1.

Motivated by the group’s recent work investigating membranes for recovering lithium from salt lakes and spent batteries, the team tested a novel Nitto Denko membrane with a thin, positively charged coating covering the membrane. The coating’s charge is just positive enough to strongly repel and retain aluminum while allowing less positively charged ions to flow through.

“The aluminum is the most positively charged of the ions, so most of it is kicked away from the membrane,” Foo explains.

The team tested the membrane’s performance by passing solutions with various balances of ions, similar to those found in cryolite waste, through the membrane. They observed that the membrane consistently captured 99.5 percent of aluminum ions while allowing sodium and the other cations to pass through. They also varied the pH of the solutions and found that the membrane maintained its performance even after sitting in a highly acidic solution for several weeks.

“A lot of this cryolite waste stream comes at different levels of acidity,” Foo says. “And we found the membrane works really well, even within the harsh conditions that we would expect.”
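A standard way to express this kind of performance is the observed rejection of each ion, computed from feed and permeate concentrations. The snippet below shows only that generic calculation, with made-up concentrations chosen to mimic the reported behavior; it is not data from the study.

```python
# Generic membrane-rejection calculation: R = 1 - c_permeate / c_feed.
# Concentrations are hypothetical, picked to echo the reported behavior
# (very high aluminum rejection, low sodium rejection).
feed_mmol = {"Al3+": 10.0, "Na+": 200.0}
permeate_mmol = {"Al3+": 0.05, "Na+": 180.0}

for ion, feed in feed_mmol.items():
    rejection = 1 - permeate_mmol[ion] / feed
    print(f"{ion}: rejection = {rejection:.1%}")
```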

The new experimental membrane is about the size of a playing card. To treat cryolite waste in an industrial-scale aluminum production plant, the researchers envision a scaled-up version of the membrane, similar to what is used in many desalination plants, where a long membrane is rolled up in a spiral configuration, through which water flows.

“This paper shows the viability of membranes for innovations in circular economies,” Lee says. “This membrane provides the dual benefit of upcycling aluminum while reducing hazardous waste.”

20+ Free Admin Dashboard Templates for Figma

A great dashboard is both attractive and informative. Users should be able to get what they need effortlessly. The look should be clean and easy to understand. The result is something users want to visit time and again.

Designing a dashboard from scratch is a huge task, though. Things can get complicated in a hurry with so many widgets competing for attention. Who has the time to deal with all of this?

That’s what makes a Figma template so helpful. A beautiful and functional design is already in place. There are also components for you to use, duplicate, and customize. That means your project will be off to a running start.

Does it sound like something you could use? If so, check out our collection of free admin dashboard templates for Figma. There are options here for virtually every use case. Choose your favorite and get started!

You might also like our collection of web and mobile UI templates for Figma.

Give users an easy-to-navigate experience with this Figma UI template. It features a high-contrast color scheme, beautiful design components, and outstanding typography. Use it, and you’ll have a professional-grade dashboard in no time.

Dash - Free Dashboard UI Figma

Download this Figma dashboard template and gain access to over 500 UI components. You’ll find charts, buttons, card layouts, navigation bars, and more. It provides the ultimate flexibility for your project.

UI Dashboard Builder for Figma

Here’s a UI kit that includes everything you need to build a dashboard layout. It includes multiple screens in both light and dark modes. It also uses Figma variables for easier customization.

Dashboard UI Kit for Figma

Crown is a dashboard template inspired by Material Design – Google’s open-source design system. That familiar foundation makes everything seem intuitive. The components are colorful, and the layout is roomy.

Crown - Material Admin Dashboard UI Kit for Figma

This open-source dashboard template was designed to work with React. The package includes several templates and components with light and dark versions. It’s a versatile choice for building web applications.

Horizon UI - Trendiest Open Source Admin Template Dashboard for Figma

Create an analytics-focused dashboard using this Figma template. It features a modern aesthetic and support for multiple color schemes. The template uses layers, making it easy to customize to suit your needs.

Website Admin Dashboard for Figma

Kanban boards are great for organizing information for individuals and teams. This Figma template uses the concept to help you build a task management app. Use its clean design to improve communication and stay focused.

Dashboard Task Boards Figma Free

Sales Analytics Dashboard UI Kit has 16 predesigned screens for different use cases. You’ll also find plenty of widgets, well-organized layers, and an easy-to-customize setup. It’s also built for accessibility and meets WCAG 2 requirements.

Sales Analytics Dashboard – Light UI for Figma

The components included in this UI kit will make your dashboard project a breeze. It’s all here: dropdowns, modal windows, navigation, charts, form elements, and more. Use them to build a custom application that’s beautiful and functional.

Admin System UI Kit for Figma

Here’s a different take on the traditional dashboard screen. NewsNet focuses on content more than statistics. That makes it a great choice for company intranets or personalized web portals. There are several creative possibilities here.

NewsNet - News Dashboard for Figma

This free dashboard UI kit focuses on finance. You might use it for a company’s accounting department or as part of an employee portal. The design is clean and easy to read.

Free UI Kit - Dashboard Payroll for Figma

Create a custom dashboard layout in minutes using these beautifully designed component cards. Mix and match them to display an array of stats and info. These colorful cards are flexible, and many include crisp graphics.

Full Charts Components for Figma

This stylish template is perfect for use as an analytics dashboard. It includes all the basics in a simple and colorful layout. Customize it to your heart’s content. You’ll save time without sacrificing quality.

Analytics Dashboard - Built with Fikri Chart Library (Free)

BankDash is a free dashboard UI kit that includes over 300 screen layouts. It uses the latest Figma features such as variables and auto layout. That makes it a fit for virtually any type of project.

BankDash - Dashboard UI Kit

There’s a lot to like about this free dashboard template. It’s clean, colorful, and includes templates for both mobile and desktop viewports. You’ll find plenty of resources to get your project off the ground.

Dashboard Free for Figma

This free Figma dashboard template includes plenty of ready-made components. Each can be customized to fit your content and color scheme. Pick your favorites and build a user-friendly interface!

Dashboard Figma Free

Do you want to build a collaborative dashboard? This calendar UI template will give you a terrific head start. It includes views for mobile and desktop. In addition, it outlines tasks in an easy-to-follow format.

Dashboard Calendar UI

Digesto is a dashboard template that focuses on content organization. It’s perfect for user portals, client reputation tracking, or any project where media is front and center. The template includes six screens and several attractive components.

Free Figma Template: Digesto AI Summarizer

This free open-source admin dashboard kit includes an atomic design system. The template features UI elements such as tables, charts, and forms. You’ll also have access to light and dark versions in an easy-to-edit package.

Sneat – Free Figma Admin Dashboard UI Kit

With more than 350 global styles and 10,000+ components, Untitled UI is a powerful package. That provides plenty of options for building a dashboard to match your needs. If you can dream it, you can do it.

Untitled UI – FREE Figma UI Kit and Design System v2.0

Use this dashboard UI kit for real estate and property management projects. Its well-organized layout will help users stay on top of their tasks in style. The kit includes one screen, a component set, and a style guide.

Property Management Dashboard UI Kit

Form UI Kit uses a monochromatic color scheme to enhance legibility. It includes all the basics to build an attractive and functional dashboard. There’s enough here to cover a variety of needs.

Form UI Kit - Free in Figma

Users of Tailwind CSS will want to check out this admin dashboard template. The kit includes four distinctive dashboard layouts and over 400 UI elements. It’s a great way to combine the popular CSS framework with your dashboard project.

Free Figma Tailwind Admin Dashboard – TailAdmin

Build a Beautiful Dashboard in Less Time

Dashboards are among the most important and most difficult design projects. Users depend on them to perform tasks and gather information. However, building an effective one requires excellent attention to detail.

The templates in this collection are designed to make your job easier. They provide a solid foundation to build upon. The design and layout are taken care of. That allows you to focus on executing your plan.

Now that you have so many outstanding templates within reach – what will you build?



Loren Graham, professor emeritus of the history of science, dies at 91

Loren R. Graham, professor emeritus of the history of science who served on the MIT faculty for nearly three decades, died on Dec. 15, 2024, at the age of 91.

Graham received a BS in chemical engineering from Purdue University in 1955, the same year his classmate, acquaintance, and future NASA astronaut and moon walker Neil Armstrong graduated with a BS in aeronautical engineering. Graham went on to earn a PhD in history in 1964 from Columbia University, where he taught from 1965 until 1978. 

In 1978, Graham joined the MIT Program in Science, Technology, and Society (STS) as a professor of the history of science. His specialty during his tenure with the program was in the history of science in Russia and the Soviet Union in the 19th, 20th, and 21st centuries. His work focused on Soviet and Marxist philosophy of science and science politics.

Much of Graham’s career spanned the Cold War. He participated in one of the first academic exchange programs between the United States and the Soviet Union from 1960 to 1961 and marched in the Moscow May Day Parade just weeks after Yuri Gagarin became the first human in space. In 1965, he received a Fulbright Award to do research in the Soviet Union.

Graham wrote extensively on the influence of social context in science and the study of contemporary science and technology in Russia. He also experimented with writing a nonfiction mystery, “Death in the Lighthouse” (2013), and with making documentary films. His publications include “Science, Philosophy and Human Behavior in the Soviet Union” (1987), “Science and the Soviet Social Order” (1990), “Science in Russia and the Soviet Union: A Short History” (1993), “The Ghost of the Executed Engineer” (1993), “A Face in the Rock” (1995), and “What Have We Learned About Science and Technology from the Russian Experience?” (1998).

His publication “Science, Philosophy and Human Behavior in the Soviet Union” was nominated for the National Book Award in 1987. He received the George Sarton Medal from the History of Science Society in 1996 and the Follo Award of the Michigan Historical Society in 2000 for his contributions to Michigan history.

Many former colleagues recall the impact he had at MIT. In 1988, with fellow faculty member Merritt Roe Smith, professor emeritus of history, he played a leading role in establishing the graduate program in the history and social study of science and technology that is now known as HASTS. This interdisciplinary graduate Program in History, Anthropology, and Science, Technology, and Society has become one of the most selective graduate programs at MIT.

“Loren was an intellectual innovator and role model for teaching and advising,” says Sherry Turkle, MIT professor of sociology. “And he was a wonderful colleague. … He experimented. He had fun. He cared about writing and about finding joy in work.”

Graham served on the STS faculty until his retirement in 2006.

Throughout his life, Graham was a member of many foundations and honorary societies, including the U.S. Civilian Research and Development Foundation, the American Philosophical Society, the American Academy of Arts and Sciences, and the Russian Academy of Natural Science.

He was also a member of several boards of trustees, including that of George Soros’ International Science Foundation, which supported Russian scientists after the collapse of the Soviet Union. For many years he served on the board of trustees of the European University at St. Petersburg, remaining an active member of its development board until 2024. After he donated thousands of books from his own library to the university, a special collection was established there in his name.

In 2012, Graham was awarded a medal by the Russian Academy of Sciences at a ceremony in Moscow for his contributions to the history of science. “His own life as a scholar covered a great deal of important history,” says David Mindell, MIT professor of aeronautics and astronautics and the Dibner Professor of the History of Engineering and Manufacturing.

Graham is survived by his wife, Patricia Graham, and daughter, Meg Peterson.

Richard Locke PhD ’89 named dean of the MIT Sloan School of Management

Richard Locke PhD ’89, a prominent scholar and academic administrator with a wide range of leadership experience, has been named the new dean of the MIT Sloan School of Management. The appointment is effective July 1.

In becoming the school’s 10th dean, Locke is rejoining the Institute, where he previously served in multiple roles from 1988 to 2013, as a faculty member, a department head, and a deputy dean of MIT Sloan. After leaving MIT, Locke was a senior leader at Brown University, including seven and a half years as Brown’s provost. Since early 2023, he has been dean of Apple University, an educational unit within Apple Inc. focused on educating the company’s employees on leadership, management, and the company’s culture and organization.

“I am thrilled to be returning to MIT Sloan,” says Locke, whose formal title will be the John C Head III Dean at MIT Sloan. “It is a special place, with its world-class faculty, innovative research and educational programs, and close-knit community, all within the MIT ecosystem.”

He adds: “All of these assets give MIT Sloan an opportunity to chart the future — to shape how new technologies will reconfigure industries and careers, how new enterprises will be created and run, how individuals will work and live, and how national economies will develop and adapt. It will be exciting and fun to work with great colleagues and to help lead the school to its next phase of global prominence and impact.”

As dean at MIT Sloan, Locke follows David C. Schmittlein, who stepped down in February 2024 after a nearly 17-year tenure. Georgia Perakis, the William F. Pounds Professor of Operations Research and Statistics and Operations Management at MIT Sloan, has been serving as the interim John C Head III Dean since then and will continue in the role until Locke begins.

Institute leaders welcomed Locke back, citing his desire to help MIT Sloan address significant global challenges, including climate change, the role of artificial intelligence in society, and new health care solutions, while refining best practices for businesses and workplaces.

“MIT Sloan has been very fortunate in its leaders. Both Dave Schmittlein and Georgia Perakis set a high bar, and we continue that tradition with the selection of Rick Locke,” says MIT President Sally A. Kornbluth. “Beyond his wide-ranging experience and accomplishments and superb academic credentials, I have come to know Rick as an outstanding leader, both from the years when we were both provosts and through his thoughtful service on the MIT Corporation. Rick has always impressed me with his intellectual breadth, personal grace, and fresh ideas. We’re delighted that he will be rejoining our campus community.”

In a letter to the MIT community, MIT Provost Cynthia Barnhart praised Locke’s “transformative career” and noted how she and the search committee agree “that Rick’s depth of experience makes him a once-in-a-generation leader who will ‘hit the ground sprinting’” as MIT Sloan’s next dean.

Barnhart added: “The committee and I were impressed by his vision for removing frictions that slow research efforts, his exceptional track record of raising substantial funds to support academic communities, and his strong grasp of and attentiveness to the interests and needs of MIT Sloan’s constituencies.”

A political scientist by training, Locke has conducted high-profile research on labor practices in global supply chains, among other topics. His career has also included efforts to bring together stakeholders, from multinational firms to supply-chain workers, in an effort to upgrade best practices in business.

Locke is widely known for a vigorous work ethic, a humane manner around co-workers, and a leadership outlook that blends idealism about civic engagement with realism about global challenges.

His wide-ranging work and interests make Locke well-suited to MIT Sloan. The school has about 115 tenure-track faculty and 1,600 students spread over eight degree programs, with numerous initiatives and academic groups connecting core management topics with more specialized areas relating to the innovation economy and entrepreneurship, the social impact of business and technology, policy development, and much more.

MIT conducted an extensive search process for the position, evaluating internal and external candidates over the last several months. The search committee’s co-chairs were Kate Kellogg, the David J. McGrath Jr Professor of Management and Innovation at MIT Sloan; and Andrew W. Lo, the Charles E. and Susan T. Harris Professor at MIT Sloan.

The committee solicited and received extensive feedback about the position and the school from stakeholders, including faculty, students, staff, and alumni, while engaging with MIT leadership about the role.

“MIT Sloan occupies a rare position in the world as a management school connected to one of the great engineering and scientific universities,” Kellogg says.

She adds: “Rick has a strong track record of bringing faculty from different domains together, and we think he is going to be great at connecting Sloan even further to the rest of MIT, around grand challenges such as climate, AI, and health care.”

Lo credits Schmittlein for “an incredible 17-year legacy of extraordinary leadership,” observing that Schmittlein helped MIT Sloan expand in size, consolidate its strengths, and build new programs. About Perakis, Kellogg notes, “Georgia’s outstanding work as dean has built on these strengths and sparked important new innovations and partnerships in areas like AI and entrepreneurship. She’s also expanded the school’s footprint in Southeast Asia and helped advance key Institute-wide priorities like the Climate Project at MIT and the Generative AI consortium.”

Kellogg and Lo expressed confidence that Locke would help MIT Sloan continue to adapt and grow.

“MIT and MIT Sloan are at inflection points in our ability to invent the future, given the role technology is playing in virtually every aspect of our lives,” Lo says. “Rick has the same vision and ambitions that we do, and the experience and skills to help us realize that vision. We couldn’t be more excited by this choice.”

Lo adds: “Rick is a first-rate scholar and first-rate educator who really gets our mission and core values and ethos. Dave was an extraordinary dean, and we expect the same from Rick. He sees the full potential of MIT Sloan and how to achieve it.”

Locke received his BA from Wesleyan University and an MA in education from the University of Chicago. He earned his doctorate in political science at MIT, writing a dissertation about local politics and industrial change in Italy, under the supervision of now-Institute Professor Suzanne Berger.

Locke joined the MIT faculty as an assistant professor of international management, was promoted in 1993 to an associate professor of management and political science, and earned tenure in 1996. In 2000, he was named the Alvin J. Siteman Professor of Entrepreneurship, becoming a full professor in 2001.

In 2010, Locke took on a new role at MIT, heading the Department of Political Science, a position he held through 2013; he was also given a new endowed professorship, the Class of 1922 Professor of Political Science and Management. During the same time frame, Locke also served as deputy dean at MIT Sloan, from 2009 through 2010, and then again from 2012 through 2013.

Locke moved to Brown in order to take the position of director of the Thomas J. Watson Institute for International and Public Affairs. In 2015, he was named Brown’s provost, the university’s chief academic officer and budget officer.

During his initial chapter at MIT Sloan, Locke co-founded MIT’s Global Entrepreneurship Lab (G-Lab) as well as other action learning programs, helped the effort to double the size of the Sloan Fellows Program, and worked to update MIT Sloan Executive Education programs, among other projects.

Locke has authored or co-authored five books and dozens of journal articles and book chapters, helping open up the study of global labor practices while also examining the political implications of industrial changes and labor relations. For his research on working conditions in global supply chains, Locke was given the Faculty Pioneer for Academic Leadership award by the Aspen Institute’s Business and Society Program, the Progress Medal from the Society of Progress, the Dorothy Day Award for Outstanding Labor Research from the American Political Science Association, and the Responsible Research in Management Award.

His books include “Remaking the Italian Economy” (1995); “Employment Relations in a Changing World Economy” (co-edited with Thomas Kochan and Michael Piore, 1995); “Working in America” (co-authored with Paul Osterman, Thomas Kochan, and Michael Piore, 2001); “The Promise and Limits of Private Power: Promoting Labor Standards in a Global Economy” (2013); and “Production in the Innovation Economy” (co-edited with Rachel Wellhausen, 2014).

A committed educator, Locke has won numerous awards for teaching in his career, including the Graduate Management Society Teaching Award, in 1990; the Excellence in Teaching Award from MIT Sloan, in 2003; the Class of 1960 Innovation in Teaching Award, from MIT, in 2007; and the Jamieson Prize for Excellence in Teaching, from MIT, in 2008.

Over the course of his career, Locke has been a visiting professor or scholar at several universities, including Bocconi University in Milan; the Harvard Kennedy School; the Saïd Business School of the University of Oxford; the Universidade Federal do Rio de Janeiro, Brazil; the Università Ca’ Foscari of Venice, Italy; the Università degli Studi di Milano, Italy; Georg-August-Universität, in Göttingen, Germany; and the Università Federico II in Naples, Italy.

Locke has remained connected to MIT even over the most recent decade of his career, including his service as a member of the MIT Corporation.

“I loved my time at MIT Sloan because of its wonderful mix of ambition, energy, and drive for excellence, but also humility,” Locke says. “We knew that we didn’t always have all the answers, but were curious to learn more, and eager to do the work to find solutions to some of the world’s great challenges. Now as dean, I look forward to once again being part of this wonderful community.”

A new way to determine whether a species will successfully invade an ecosystem

When a new species is introduced into an ecosystem, it may succeed in establishing itself, or it may fail to gain a foothold and die out. Physicists at MIT have now devised a formula that can predict which of those outcomes is most likely.

The researchers created their formula based on analysis of hundreds of different scenarios that they modeled using populations of soil bacteria grown in their laboratory. They now plan to test their formula in larger-scale ecosystems, including forests. This approach could also be helpful in predicting whether probiotics or fecal microbiota treatments (FMT) would successfully combat infections of the human GI tract.

“People eat a lot of probiotics, but many of them can never invade our gut microbiome at all, because if you introduce it, it does not necessarily mean that it can grow and colonize and benefit your health,” says Jiliang Hu SM ’19, PhD ’24, the lead author of the study.

MIT professor of physics Jeff Gore is the senior author of the paper, which appears today in the journal Nature Ecology and Evolution. Matthieu Barbier, a researcher at the Plant Health Institute Montpellier, and Guy Bunin, a professor of physics at Technion, are also authors of the paper.

Population fluctuations

Gore’s lab specializes in using microbes to analyze interspecies interactions in a controlled way, in hopes of learning more about how natural ecosystems behave. In previous work, the team has used bacterial populations to demonstrate how changing the environment in which the microbes live affects the stability of the communities they form.

In this study, the researchers wanted to study what determines whether an invasion by a new species will succeed or fail. In natural communities, ecologists have hypothesized that the more diverse an ecosystem is, the more it will resist an invasion, because most of the ecological niches will already be occupied and few resources are left for an invader.

However, in both natural and experimental systems, scientists have observed that this is not consistently true: While some highly diverse populations are resistant to invasion, other highly diverse populations are more likely to be invaded.

To explore why both of those outcomes can occur, the researchers set up more than 400 communities of soil bacteria, which were all native to the soil around MIT. The researchers established communities of 12 to 20 species of bacteria, and six days later, they added one randomly chosen species as the invader. On the 12th day of the experiment, they sequenced the genomes of all the bacteria to determine if the invader had established itself in the ecosystem.

In each community, the researchers also varied the nutrient levels in the culture medium on which the bacteria were grown. When nutrient levels were high, the microbes displayed strong interactions, characterized by heightened competition for food and other resources, or mutual inhibition through mechanisms such as pH-mediated cross-toxin effects. Some of these populations formed stable states in which the fraction of each microbe did not vary much over time, while others formed communities in which most of the species fluctuated in number.

The researchers found that these fluctuations were the most important factor in the outcome of the invasion. Communities that had more fluctuations tended to be more diverse, but they were also more likely to be invaded successfully.

“The fluctuation is not driven by changes in the environment, but it is internal fluctuation driven by the species interaction. And what we found is that the fluctuating communities are more readily invaded and also more diverse than the stable ones,” Hu says.

In some of the populations where the invader established itself, the other species remained, but in smaller numbers. In other populations, some of the resident species were outcompeted and disappeared completely. This displacement tended to happen more often in ecosystems where there were stronger competitive interactions between species.

In ecosystems that had more stable, less diverse populations, with stronger interactions between species, invasions were more likely to fail.

Regardless of whether the community was stable or fluctuating, the researchers found that the fraction of the original species that survived in the community before invasion predicts the probability of invasion success. This “survival fraction” could be estimated in natural communities by taking the ratio of the diversity within a local community (measured by the number of species in that area) to the regional diversity (number of species found in the entire region).

“It would be exciting to study whether the local and regional diversity could be used to predict susceptibility to invasion in natural communities,” Gore says.
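In code, the estimate Gore describes is just a ratio of species counts. The sketch below shows the arithmetic under the assumption that local and regional richness are known; the study’s fitted relationship between this fraction and invasion probability is not reproduced here.

```python
# Survival fraction as described above: local species richness divided by
# regional species richness. Counts are hypothetical examples.
def survival_fraction(local_species: int, regional_species: int) -> float:
    if regional_species <= 0:
        raise ValueError("regional species pool must be non-empty")
    return local_species / regional_species

# e.g., 14 of the 20 inoculated species persisted before the invader arrived
print(f"survival fraction = {survival_fraction(14, 20):.2f}")
```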

Predicting success

The researchers also found that under certain circumstances, the order in which species arrived in the ecosystem played a role in whether an invasion was successful. When the interactions between species were strong, the chances of a species becoming successfully incorporated went down when that species was introduced after other species had already become established.

When the interactions were weak, this “priority effect” disappeared, and the same stable equilibrium was reached no matter what order the microbes arrived in.

“Under a strong interaction regime, we found the invader has some disadvantage because it arrived later. This is of interest in ecology because people have always found that in some cases the order in which species arrived matters a lot, while in the other cases it doesn’t matter,” Hu says.

The researchers now plan to try to replicate their findings in ecosystems for which species diversity data is available, including the human gut microbiome. Their formula could allow them to predict the success of probiotic treatment, in which beneficial bacteria are consumed orally, or FMT, an experimental treatment for severe infections such as C. difficile, in which beneficial bacteria from a donor’s stool are transplanted into a patient’s colon.

“Invasions can be harmful or can be good depending on the context,” Hu says. “In some cases, like probiotics, or FMT to treat C. difficile infection, we want the healthy species to invade successfully. Also for soil protection, people introduce probiotics or beneficial species to the soil. In that case people also want the invaders to succeed.”

The research was funded by the Schmidt Polymath Award and the Sloan Foundation.

MIT affiliates awarded 2024 National Medals of Science, Technology

Four MIT faculty members are among 23 world-class researchers who have been awarded the nation’s highest honors for scientists and innovators, the White House announced today.

Angela Belcher and Emery Brown were each presented with the National Medal of Science at a White House ceremony this afternoon, and Paula Hammond ’84, PhD ’93, and Feng Zhang were awarded the National Medal of Technology and Innovation.

Belcher, the James Mason Crafts Professor of Biological Engineering and Materials Science and Engineering and a member of the Koch Institute for Integrative Cancer Research, was honored for her work designing novel materials for applications that include solar cells, batteries, and medical imaging.

Brown, the Edward Hood Taplin Professor of Medical Engineering and Computational Neuroscience, was recognized for work that has revealed how anesthesia affects the brain. Brown is also a member of MIT’s Picower Institute for Learning and Memory and Institute for Medical Engineering and Science (IMES).

Hammond, an MIT Institute Professor, vice provost for faculty, and member of the Koch Institute, was honored for developing methods for assembling thin films that can be used for drug delivery, wound healing, and many other applications.

Zhang, the James and Patricia Poitras Professor of Neuroscience at MIT and a professor of brain and cognitive sciences and biological engineering, was recognized for his work developing molecular tools, including the CRISPR genome-editing system, that have the potential to diagnose and treat disease. Zhang is also an investigator at the McGovern Institute for Brain Research and a core member of the Broad Institute of MIT and Harvard.

Two additional MIT alumni also accepted awards: Richard Lawrence Edwards ’76, a professor at the University of Minnesota, received a National Medal of Science for his work in geochemistry. And Noubar Afeyan PhD ’87 accepted one of two National Medals of Technology and Innovation awarded to an organization. These awards went to the biotechnology companies Moderna, which Afeyan co-founded, and Pfizer, for their development of vaccines for Covid-19.

This year, the White House awarded the National Medal of Science to 14 recipients and named nine individual awardees of the National Medal of Technology and Innovation, along with two organizations. To date, nearly 100 MIT affiliates have won one of these two honors.

“Emery Brown is at the forefront of the Institute’s collaborations among neuroscience, medicine, and patient care. His research has shifted the paradigm for brain monitoring during general anesthesia for surgery. His pioneering approach based on neural oscillations, as opposed to solely monitoring vital signs, promises to revolutionize how anesthesia medications are delivered to patients,” says Nergis Mavalvala, dean of MIT’s School of Science. “Feng Zhang is one of the preeminent researchers in CRISPR technologies that have accelerated the pace of science and engineering, blending entrepreneurship and scientific discovery. These new molecular technologies can modify the cell’s genetic information, engineer vehicles to deliver these tools into the correct cells, and scale to restore organ function. Zhang will apply these life-altering innovations to diseases such as neurodegeneration, immune disorders, and aging.”

Hammond and Belcher are frequent collaborators, and each of them has had significant impact on the fields of nanotechnology and nanomedicine.

“Angela Belcher and Paula Hammond have made tremendous contributions to science and engineering, and I’m thrilled for each of them to receive this well-deserved recognition,” says Anantha Chandrakasan, dean of the School of Engineering and chief innovation and strategy officer at MIT. “By harnessing the processes of nature, Angela’s innovations have impacted fields from energy to the environment to medicine. Her non-invasive imaging system has improved outcomes for patients diagnosed with many types of cancer. Paula’s pioneering research in nanotechnology helped transform the ways in which we deliver and administer drugs within the body — through her technique, therapeutics can be customized and sent directly to specifically targeted cells, including cancer cells.”

Growing materials with viruses

Belcher, who joined the MIT faculty in 2002 and served as head of the Department of Biological Engineering from 2019 to 2023, initially heard that she was being considered for the National Medal of Science in September, and in mid-December, found out she had won.

“It was quite shocking and just a huge honor. It’s an honor to be considered, and then to get the email and the call that I actually was receiving it was humbling,” she says.

Belcher, who earned a bachelor’s degree in creative studies and a PhD in inorganic chemistry from the University of California at Santa Barbara, has focused much of her research on developing ways to use biological systems, such as viruses, to grow materials.

“Since graduate school, I’ve been fascinated with trying to understand how nature makes materials and then applying those processes, whether directly through biological molecules, or through evolving biological molecules or biological organisms, to make materials that are of technological importance,” she says.

Early in her career, she developed a technique for generating materials by engineering viruses to self-assemble into nanoscale scaffolds that can be coated with inorganic materials to form functional devices such as batteries, semiconductors, solar cells, and catalysts. This approach allows for exquisite control over the electronic, optical, and magnetic properties of the material.

In the late 2000s, then-MIT president Susan Hockfield asked Belcher to join the newly formed Koch Institute, whose mission is to bring together scientists and engineers to seek new ways to diagnose and treat cancer. Not knowing much about cancer biology, Belcher was hesitant at first, but she ended up moving her lab to the Koch Institute and applying her work to the new challenge.

One of her first projects, on which she collaborated with Hammond, was a method for using shortwave infrared light to image cancer cells. This technology, eventually commercialized by a company called Cision Vision, is now being used in hospitals to image lymph nodes during cancer surgery, helping surgeons determine whether a tumor has spread.

Belcher is now focused on finding technologies to detect other cancers, especially ovarian cancer, which is difficult to diagnose in early stages, as well as developing cancer vaccines.

Unlocking the mysteries of anesthesia

Brown, who has been on the MIT faculty since 2005, said he was “overjoyed” when he found out he would receive the National Medal of Science.

“I’m extremely excited and quite honored to receive such an award, because it is one of the pinnacles of recognition in the scientific field in the United States,” he says.

Much of Brown’s work has focused on achieving a better understanding of what happens in the human brain under anesthesia. Trained as an anesthesiologist, Brown earned his MD from Harvard Medical School and a PhD in statistics from Harvard University.

Since 1992, he has been a member of the Harvard Medical School faculty and a practicing anesthesiologist at Massachusetts General Hospital. Early in his research career, he worked on developing methods to characterize the properties of the human circadian clock. These included characterizing the clock’s phase response curve to light, accurately measuring its intrinsic period, and measuring the impact of physiologically designed schedules on shift worker performance. Later, he became interested in developing signal processing methods to characterize how neurons represent signals and stimuli in their ensemble activity.

In collaboration with Matt Wilson, an MIT professor of neuroscience, Brown devised algorithms to decode the position of an animal in its environment by reading the activity of a small group of place cell neurons in the animal’s brain. Other applications of these methods included characterizing learning, controlling brain-machine interfaces, and controlling brain states such as medically induced coma.
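
For readers curious what this kind of decoding involves, the sketch below shows one textbook approach: a maximum-likelihood decoder that assumes each place cell fires independently with Poisson statistics. It is a simplified illustration, not the specific point-process algorithms developed by Brown and his collaborators, and all names and numbers are hypothetical.

```python
import numpy as np

def decode_position(spike_counts, tuning_curves, dt=0.25):
    """Pick the most likely position given one time bin of place-cell spikes.

    spike_counts: (n_cells,) spikes observed in a bin of length dt seconds
    tuning_curves: (n_cells, n_positions) expected firing rate (Hz) of each
        cell at each candidate position
    Assumes independent Poisson spiking; returns the index of the position
    with the highest log-likelihood.
    """
    expected = tuning_curves * dt                     # expected counts per bin
    log_lik = (spike_counts[:, None] * np.log(expected + 1e-12) - expected).sum(axis=0)
    return int(np.argmax(log_lik))

# Hypothetical example: 5 cells, 20 candidate positions along a track.
rng = np.random.default_rng(0)
tuning = rng.uniform(0.5, 20.0, size=(5, 20))
counts = rng.poisson(tuning[:, 7] * 0.25)             # animal is actually near position 7
print(decode_position(counts, tuning))
```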

“I was practicing anesthesia at the time, and as I saw more and more of what the neuroscientists were doing, it occurred to me we could use their paradigms to study anesthesia, and we should, because we weren’t doing that,” he says. “Anesthesia was not being looked at as a neuroscience subdiscipline. It was looked at as a subdiscipline of pharmacology.”

Over the past two decades, Brown’s work has revealed how anesthesia drugs induce unconsciousness in the brain, along with other altered arousal states. Anesthesia drugs such as propofol dramatically alter the brain’s intrinsic oscillations. These oscillations can be seen with electroencephalography (EEG). During the awake state, these oscillations usually have high frequency and low amplitude, but as anesthetic drugs are given, they shift generally to low frequency, high amplitude. Working with MIT professors Earl Miller and Ila Fiete, as well as collaborators at Massachusetts General Hospital and Boston University, Brown has shown that these changes disrupt normal communication between different brain regions, leading to loss of consciousness.

Brown has also shown that these EEG oscillations can be used to monitor whether a patient is too deeply unconscious, and he has developed a closed-loop anesthesia delivery system that can maintain a patient’s anesthesia state at precisely desired levels. Brown and colleagues have also developed methods to accelerate recovery from anesthesia. More precise control and accelerated recovery could help to prevent the cognitive impairments that often affect patients after they emerge from anesthesia. Accelerating recovery from anesthesia has also suggested ways to accelerate recovery from coma.
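
In outline, such a closed-loop system reads a depth-of-anesthesia index derived from the EEG and nudges the drug infusion rate to keep that index at a target value. The snippet below is a generic proportional-integral feedback step meant only to convey the idea; the gains, variable names, and index are illustrative assumptions, not details of the system Brown’s group built.

```python
def infusion_step(target_index, measured_index, rate, integral_err,
                  kp=0.05, ki=0.01, dt=1.0):
    """One update of a schematic PI controller for drug infusion.

    target_index / measured_index: desired vs. EEG-derived anesthesia index
    rate: current infusion rate; integral_err: accumulated error
    Returns the new infusion rate (clamped at zero) and the updated error integral.
    """
    error = target_index - measured_index
    integral_err += error * dt
    new_rate = rate + kp * error + ki * integral_err
    return max(new_rate, 0.0), integral_err
```

In a real system this loop would run continuously, with safety limits and a model of how the drug affects the measured index.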

Building multifunctional materials

Hammond, who earned both her bachelor’s degree and PhD in chemical engineering from MIT, has been a member of the faculty since 1995 and was named an Institute Professor in 2021. She was also the 2023-24 recipient of MIT’s Killian Award, the highest honor that the faculty bestows.

Early in her career, Hammond developed a novel technique for generating functional thin-film materials by stacking layers of charged polymeric materials. This approach can be used to build polymers with highly controlled architectures by alternately exposing a surface to positively and negatively charged particles.

She initially used this layer-by-layer assembly technique to build ultrathin batteries and fuel cell electrodes, before turning her attention to biomedical applications. To adapt the films for drug delivery, she came up with ways to incorporate drug molecules into the layers of the film. These molecules are then released when the particles reach their targets.

“We began to look at bioactive materials and how we could sandwich them into these layers and use that as a way to deliver the drug in a very controlled fashion, at the right time and in the right place,” she says. “We are using the layering as a way to modify the surface of a nanoparticle so that there is a very high and selective affinity for the cancer cells we’re targeting.”

Using this technique, she has created drug-delivery nanoparticles that are coated with molecules that specifically target cancer cells, with a particular focus on ovarian cancer. These particles can be tailored to carry chemotherapy drugs such as cisplatin, immunotherapy agents, or nucleic acids such as messenger RNA.

Working with colleagues around MIT, she has also developed materials that can be used to promote wound healing, blood clotting, and tissue regeneration.

“What we have found is that these layers are very versatile. They can coat a very broad range of substrates, and those substrates can be anything from a bone implant, which can be quite large, down to a nanoparticle, which is 100 nanometers,” she says.

Designing molecular tools

Zhang, who earned his undergraduate degree from Harvard University in 2004, has contributed to the development of multiple molecular tools to accelerate the understanding of human disease. While a graduate student at Stanford University, from which he received his PhD in 2009, Zhang worked in the lab of Professor Karl Deisseroth. There, he worked on a protein called channelrhodopsin, which he and Deisseroth believed held potential for engineering mammalian cells to respond to light.

The resulting technique, known as optogenetics, is now widely used in neuroscience and other fields. By engineering neurons to express light-sensitive proteins such as channelrhodopsin, researchers can either stimulate or silence the cells’ electrical impulses by shining different wavelengths of light on them. This has allowed for detailed study of the roles of specific populations of neurons in the brain, and the mapping of neural circuits that control a variety of behaviors.

In 2011, about a month after joining the MIT faculty, Zhang attended a talk by Harvard Medical School Professor Michael Gilmore, who studies the pathogenic bacterium Enterococcus. The scientist mentioned that these bacteria protect themselves from viruses with DNA-cutting enzymes known as nucleases, which are part of a defense system known as CRISPR.

“I had no idea what CRISPR was, but I was interested in nucleases,” Zhang told MIT News in 2016. “I went to look up CRISPR, and that’s when I realized you might be able to engineer it for use for genome editing.”

In January 2013, Zhang and members of his lab reported that they had successfully used CRISPR to edit genes in mammalian cells. The CRISPR system includes a nuclease called Cas9, which can be directed to cut a specific genetic target by RNA molecules known as guide strands.

Since then, scientists in fields from medicine to plant biology have used CRISPR to study gene function and investigate the possibility of correcting faulty genes that cause disease. More recently, Zhang’s lab has devised many enhancements to the original CRISPR system, such as making the targeting more precise and preventing unintended cuts in the wrong locations.

The National Medal of Science was established in 1959 and is administered for the White House by the National Science Foundation. The medal recognizes individuals who have made outstanding contributions to science and engineering.

The National Medal of Technology and Innovation was established in 1980 and is administered for the White House by the U.S. Department of Commerce’s Patent and Trademark Office. The award recognizes those who have made lasting contributions to America’s competitiveness and quality of life and helped strengthen the nation’s technological workforce.

An abundant phytoplankton feeds a global network of marine microbes

One of the hardest-working organisms in the ocean is the tiny, emerald-tinged Prochlorococcus marinus. These single-celled “picoplankton,” which are smaller than a human red blood cell, can be found in staggering numbers throughout the ocean’s surface waters, making Prochlorococcus the most abundant photosynthesizing organism on the planet. (Collectively, Prochlorococcus fix as much carbon as all the crops on land.) Scientists continue to find new ways that the little green microbe is involved in the ocean’s cycling and storage of carbon.

Now, MIT scientists have discovered a new ocean-regulating ability in the small but mighty microbes: cross-feeding of DNA building blocks. In a study appearing today in Science Advances, the team reports that Prochlorococcus shed these extra compounds into their surroundings, where they are then “cross-fed,” or taken up by other ocean organisms, either as nutrients or energy sources, or to regulate their metabolism. Prochlorococcus’ rejects, then, are other microbes’ resources.

What’s more, this cross-feeding occurs on a regular cycle: Prochlorococcus tend to shed their molecular baggage at night, when enterprising microbes quickly consume the cast-offs. For a microbe called SAR11, the most abundant bacteria in the ocean, the researchers found that the nighttime snack acts as a relaxant of sorts, forcing the bacteria to slow down their metabolism and effectively recharge for the next day.

Through this cross-feeding interaction, Prochlorococcus could be helping many microbial communities to grow sustainably, simply by giving away what they don’t need. And they’re doing so in a way that could set the daily rhythms of microbes around the world.

“The relationship between the two most abundant groups of microbes in ocean ecosystems has intrigued oceanographers for years,” says co-author and MIT Institute Professor Sallie “Penny” Chisholm, who played a role in the discovery of Prochlorococcus in 1986. “Now we have a glimpse of the finely tuned choreography that contributes to their growth and stability across vast regions of the oceans.”

Given that Prochlorococcus and SAR11 suffuse the surface oceans, the team suspects that the exchange of molecules from one to the other could amount to one of the major cross-feeding relationships in the ocean, making it an important regulator of the ocean carbon cycle.

“By looking at the details and diversity of cross-feeding processes, we can start to unearth important forces that are shaping the carbon cycle,” says the study’s lead author, Rogier Braakman, a research scientist in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS).

Other MIT co-authors include Brandon Satinsky, Tyler O’Keefe, Shane Hogle, Jamie Becker, Robert Li, Keven Dooley, and Aldo Arellano, along with Krista Longnecker, Melissa Soule, and Elizabeth Kujawinski of Woods Hole Oceanographic Institution (WHOI).

Spotting castaways

Cross-feeding occurs throughout the microbial world, though the process has mainly been studied in close-knit communities. In the human gut, for instance, microbes are in close proximity and can easily exchange and benefit from shared resources.

By comparison, Prochlorococcus are free-floating microbes that are regularly tossed and mixed through the ocean’s surface layers. While scientists assume that the plankton are involved in some amount of cross-feeding, exactly how this occurs, and who would benefit, have historically been challenging to probe; anything that Prochlorococcus cast away would have vanishingly low concentrations and be exceedingly difficult to measure.

But in work published in 2023, Braakman teamed up with scientists at WHOI, who pioneered ways to measure small organic compounds in seawater. In the lab, they grew various strains of Prochlorococcus under different conditions and characterized what the microbes released. They found that among the major “exudants,” or released molecules, were purines and pyridines, which are molecular building blocks of DNA. The molecules also happen to be nitrogen-rich — a fact that puzzled the team. Prochlorococcus are mainly found in ocean regions that are low in nitrogen, so it was assumed they’d want to retain any and all nitrogen-containing compounds they can. Why, then, were they instead throwing such compounds away?

Global symphony

In their new study, the researchers took a deep dive into the details of Prochlorococcus’ cross-feeding and how it influences various types of ocean microbes.

They set out to study how Prochlorococcus use purine and pyridine in the first place, before expelling the compounds into their surroundings. They compared published genomes of the microbes, looking for genes that encode purine and pyridine metabolism. Tracing the genes forward through the genomes, the team found that once the compounds are produced, they are used to make DNA and replicate the microbes’ genome. Any leftover purine and pyridine is recycled and used again, though a fraction of the stuff is ultimately released into the environment. Prochlorococcus appear to make the most of the compounds, then cast off what they can’t.

The team also looked at gene expression data and found that genes involved in recycling purine and pyridine peak several hours after the recognized peak in genome replication that occurs at dusk. The question then was: What could be benefiting from this nightly shedding?

For this, the team looked at the genomes of more than 300 heterotrophic microbes — organisms that consume organic carbon rather than making it themselves through photosynthesis. They suspected that such carbon-feeders could be likely consumers of Prochlorococcus’ organic rejects. They found that most of the heterotrophs carried genes for taking up either purines or pyridines, or in some cases both, suggesting that microbes have evolved along different paths in terms of how they cross-feed.

The group zeroed in on one purine-preferring microbe, SAR11, as it is the most abundant heterotrophic microbe in the ocean. When they then compared the genes across different strains of SAR11, they found that various types use purines for different purposes, from simply taking them up and using them intact to breaking them down for their energy, carbon, or nitrogen. What could explain the diversity in how the microbes were using Prochlorococcus’ cast-offs?

It turns out the local environment plays a big role. Braakman and his collaborators performed a metagenome analysis in which they compared the collectively sequenced genomes of all microbes in over 600 seawater samples from around the world, focusing on SAR11 bacteria. Metagenome sequences were collected alongside measurements of various environmental conditions and geographic locations in which they are found. This analysis showed that the bacteria gobble up purine for its nitrogen when the nitrogen in seawater is low, and for its carbon or energy when nitrogen is in surplus — revealing the selective pressures shaping these communities in different ocean regimes.
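
Conceptually, that comparison boils down to grouping metagenome samples by their nitrogen regime and asking which purine-utilization genes SAR11 carries in each. The toy analysis below illustrates the idea; the column names and numbers are hypothetical and do not come from the study’s dataset.

```python
import pandas as pd

# Hypothetical table: one row per metagenome sample, with seawater nitrate
# concentration and the relative abundance of SAR11 genes for two purine uses.
samples = pd.DataFrame({
    "nitrate_umol_per_L": [0.02, 0.05, 3.1, 4.8, 0.01, 2.9],
    "purine_nitrogen_genes": [0.81, 0.77, 0.12, 0.09, 0.85, 0.15],
    "purine_carbon_energy_genes": [0.10, 0.14, 0.70, 0.75, 0.08, 0.68],
})

# Split samples into low- vs. high-nitrogen regimes and compare gene profiles,
# mirroring the qualitative pattern described in the article.
samples["regime"] = pd.cut(samples["nitrate_umol_per_L"],
                           bins=[0, 1, float("inf")], labels=["low-N", "high-N"])
print(samples.groupby("regime", observed=True)[
    ["purine_nitrogen_genes", "purine_carbon_energy_genes"]].mean())
```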

“The work here suggests that microbes in the ocean have developed relationships that advance their growth potential in ways we don’t expect,” says co-author Kujawinski.

Finally, the team carried out a simple experiment in the lab to see if they could directly observe a mechanism by which purine acts on SAR11. They grew the bacteria in cultures, exposed them to various concentrations of purine, and unexpectedly found that it caused the cells to slow down their normal metabolic activities and even their growth. However, when the researchers put these same cells under environmentally stressful conditions, the cells kept growing strong and healthy, as if the metabolic pause triggered by purines had primed them for growth and buffered them against the effects of the stress.

“When you think about the ocean, where you see this daily pulse of purines being released by Prochlorococcus, this provides a daily inhibition signal that could be causing a pause in SAR11 metabolism, so that the next day when the sun comes out, they are primed and ready,” Braakman says. “So we think Prochlorococcus is acting as a conductor in the daily symphony of ocean metabolism, and cross-feeding is creating a global synchronization among all these microbial cells.”

This work was supported, in part, by the Simons Foundation and the National Science Foundation.