Q&A: Phillip Sharp and Amy Brand on the future of open-access publishing
Providing open access to scholarly publications is a long-running issue with new developments on the horizon. Last year, the U.S. federal government’s Office of Science and Technology Policy mandated that starting in 2026 publishers must provide open access to publications stemming from federal funding. That provides more impetus for the open-access movement in academia.
Meanwhile, other trends are changing academic publishing, including consolidation of journal titles and provision of access by having authors (and their home institutions) pay for publication costs. With these developments unfolding, a group of MIT scholars is releasing a new white paper about academic open-access publishing. The paper gathers information, identifies outstanding questions, and calls for further research and data to inform policy on the subject.
The group was chaired by Institute Professor Emeritus Phillip A. Sharp, of the Department of Biology and Koch Institute for Integrative Cancer Research, who co-authored the report along with William B. Bonvillian, senior director of special projects at MIT Open Learning; Robert Desimone, director of the McGovern Institute for Brain Research; Barbara Imperiali, the Class of 1922 Professor of Biology; David R. Karger, professor of electrical engineering; Clapperton Chakanetsa Mavhunga, professor of science, technology, and society; Amy Brand, director and publisher of the MIT Press; Nick Lindsay, director for journals and open access at the MIT Press; and Michael Stebbins of Science Advisors, LLC.
MIT News spoke with Sharp and Brand about the state of open-access publishing.
Q: What are the key benefits of open access, as you see it?
Amy Brand: As an academic publisher running the MIT Press, we have embraced open access in both books and journals for a long time because it is our mission to support our authors and get their research out into the world. Whether it’s completely removing paywalls and barriers, or keeping prices low, we do whatever we can to disseminate the content that we publish. Even before we were talking about federal policies, this was a priority at the MIT Press.
Phillip Sharp: As a scientist, I’m interested in having my research make the largest impact it can, to help solve some of the challenges of society. And open access, making research available to people around the world, is an important aspect of that. But the quality of research is dependent upon peer review. So, I think open access policies need to be considered and promoted in the context of a very valuable and vigorous peer-review publication process.
Q: What are the key elements of this report?
Brand: The first part of the report is a history of open access, and the second part is a list of questions driving toward evidence-based policy. On the one hand, there are questions such as: How does policy impact the day-to-day work of researchers and their students? What are the impacts on the lab? Other questions have to do with the impacts on the publishing industry. One reason I was invested in doing this is concerns about the impact on nonprofit publishers, on university presses, on scientific societies that publish. Some of the questions we raise have to do with understanding the impact on smaller, nonprofit publishers and ultimately knowing how to protect their viability.
Sharp: The current policies for open access being required by OSTP’s Nelson Memo dramatically change who is paying for publication, where the resources come from for publication. It puts a lot of emphasis on the research institute or other sources to cover that. And that raises another issue in open access: Will this limit publications from researchers at institutes that cannot afford the charge? The scientific community is very international, and the impact of science in many countries is incredibly important. So dealing with the [impact of] open access is something that needs to be developed with evidence and policy.
The report notes that if open access were covered by an institution for all publications at $3,000 per article, MIT’s total cost would be $25 million per year. That’s going to be a challenge. And if it’s a challenge for MIT, it’s going to be an enormous challenge in a number of other places.
Q: What are some additional points about open access that we should keep in mind?
Brand: The Nelson Memo also provides that self-archiving is one of the ways to comply with the policy — which means authors can take an earlier version of an article and put it in an institutional repository. Here at MIT we have the DSpace repository that contains many of the papers that faculty publish. The economics of that are very different, and it’s also a little unclear how that’s going to play out. We recently saw one scientific society decide to implement a charge around that, something the community has never seen before.
But as we essentially have a system that already creates incentives for publishers to increase these article processing charges, the publication charges, there are a lot of questions about how publishers who do high-quality peer review will be sustained, and where that money is going to come from.
Sharp: When you come to the data side of the issue, it’s complicated because of the value of the data itself. It’s important that data is collected, together with metadata about the research process, and made available to others. It’s also time to talk about this in the academic community.
Q: The report makes clear that there are multiple trends here: consolidation in for-profit publishing, growth of open-access publications, fiscal pressure on university libraries, and now the federal mandate. Complicated as the present may be, it does seem that MIT wants to look ahead on this issue.
Brand: I do think in the publishing community, and certainly in the university press community, we’ve been way out in front on this for a while, and with some of the business models we helped implement and test and create, we’re finding other publishers are following suit and they are interested. But right now, with the new federal policy, most publishers have no choice but to begin asking: What does sustainable high-quality publishing mean if, as a publisher, I have to distribute all or some of this content in open digital form?
Sharp: The purpose of this report is to stimulate that conversation: more numbers, every bit of evidence. Communities have been responsible for the quality of science in different disciplines, and sharing the responsibility of peer review is something that motivates a lot of engagement. Sustaining that is important for the discipline. Without that sustainability, there will be slower progress in science, in my opinion.
With a quantum “squeeze,” clocks could keep even more precise time, MIT researchers propose
The practice of keeping time hinges on stable oscillations. In a grandfather clock, the length of a second is marked by a single swing of the pendulum. In a digital watch, the vibrations of a quartz crystal mark much smaller fractions of time. And in atomic clocks, the world’s state-of-the-art timekeepers, the oscillations of a laser beam stimulate atoms to vibrate 9.2 billion times per second. These smallest, most stable divisions of time set the timing for today’s satellite communications, GPS systems, and financial markets.
A clock’s stability depends on the noise in its environment. A slight wind can throw a pendulum’s swing out of sync. And heat can disrupt the oscillations of atoms in an atomic clock. Eliminating such environmental effects can improve a clock’s precision. But only by so much.
A new MIT study finds that even if all noise from the outside world is eliminated, the stability of clocks, laser beams, and other oscillators would still be vulnerable to quantum mechanical effects. The precision of oscillators would ultimately be limited by quantum noise.
But in theory, there’s a way to push past this quantum limit. In their study, the researchers also show that by manipulating, or “squeezing,” the states that contribute to quantum noise, the stability of an oscillator could be improved, even past its quantum limit.
“What we’ve shown is, there’s actually a limit to how stable oscillators like lasers and clocks can be, that’s set not just by their environment, but by the fact that quantum mechanics forces them to shake around a little bit,” says Vivishek Sudhir, assistant professor of mechanical engineering at MIT. “Then, we’ve shown that there are ways you can even get around this quantum mechanical shaking. But you have to be more clever than just isolating the thing from its environment. You have to play with the quantum states themselves.”
The team is working on an experimental test of their theory. If they can demonstrate that they can manipulate the quantum states in an oscillating system, the researchers envision that clocks, lasers, and other oscillators could be tuned to super-quantum precision. These systems could then be used to track infinitesimally small differences in time, such as the fluctuations of a single qubit in a quantum computer or the presence of a dark matter particle flitting between detectors.
“We plan to demonstrate several instances of lasers with quantum-enhanced timekeeping ability over the next several years,” says Hudson Loughlin, a graduate student in MIT’s Department of Physics. “We hope that our recent theoretical developments and upcoming experiments will advance our fundamental ability to keep time accurately, and enable new revolutionary technologies.”
Loughlin and Sudhir detail their work in an open-access paper published in the journal Nature Communications.
Laser precision
In studying the stability of oscillators, the researchers looked first to the laser — an optical oscillator that produces a wave-like beam of highly synchronized photons. The invention of the laser is largely credited to physicists Arthur Schawlow and Charles Townes; its name is an acronym for “light amplification by stimulated emission of radiation.”
A laser’s design centers on a “lasing medium” — a collection of atoms, usually embedded in glass or crystals. In the earliest lasers, a flash tube surrounding the lasing medium would stimulate electrons in the atoms to jump up in energy. When the electrons relax back to lower energy, they give off some radiation in the form of a photon. Two mirrors, on either end of the lasing medium, reflect the emitted photon back into the atoms to stimulate more electrons, and produce more photons. One mirror, together with the lasing medium, acts as an “amplifier” to boost the production of photons, while the second mirror is partially transmissive and acts as a “coupler” to extract some photons out as a concentrated beam of laser light.
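The pump-amplify-couple-out cycle described above can be caricatured with textbook two-level laser rate equations. The sketch below is a generic illustration under assumed, arbitrary rates, not a model drawn from the MIT study: N tracks the excited atoms in the lasing medium, and n tracks the photons circulating between the mirrors.

```python
# A minimal sketch of the pump / amplify / couple-out cycle described
# above, using textbook-style laser rate equations (not the paper's model).
# N: excited atoms in the lasing medium; n: photons in the cavity.
# All rates are illustrative, in arbitrary units.
pump = 10.0      # pumping rate (the role of the flash tube)
gamma = 1.0      # spontaneous decay rate of excited atoms
B = 0.1          # stimulated-emission coupling between atoms and light
kappa = 0.5      # loss rate through the partially transmissive coupler

N, n = 0.0, 1e-3  # start unpumped, with a tiny seed of photons
dt = 1e-3
for _ in range(200_000):
    dN = pump - gamma * N - B * N * n   # pumping, decay, stimulated emission
    dn = B * N * n - kappa * n          # gain vs. light coupled out
    N += dN * dt
    n += dn * dt

# In steady state, the photons leaving through the coupler, kappa * n
# per unit time, form the output laser beam.
print(f"steady state: excited atoms N = {N:.2f}, cavity photons n = {n:.2f}")
print(f"output beam rate = {kappa * n:.2f}")
```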
Soon after the laser’s invention, Schawlow and Townes hypothesized that a laser’s stability should be limited by quantum noise. Others have since tested their hypothesis by modeling the microscopic features of a laser. Through very specific calculations, they showed that, indeed, imperceptible quantum interactions among the laser’s photons and atoms could limit the stability of their oscillations.
“But this work had to do with extremely detailed, delicate calculations, such that the limit was understood, but only for a specific kind of laser,” Sudhir notes. “We wanted to enormously simplify this, to understand lasers and a wide range of oscillators.”
Putting the “squeeze” on
Rather than focus on a laser’s physical intricacies, the team looked to simplify the problem.
“When an electrical engineer thinks of making an oscillator, they take an amplifier, and they feed the output of the amplifier into its own input,” Sudhir explains. “It’s like a snake eating its own tail. It’s an extremely liberating way of thinking. You don’t need to know the nitty gritty of a laser. Instead, you have an abstract picture, not just of a laser, but of all oscillators.”
In their study, the team drew up a simplified representation of a laser-like oscillator. Their model consists of an amplifier (such as a laser’s atoms), a delay line (for instance, the time it takes light to travel between a laser’s mirrors), and a coupler (such as a partially reflective mirror).
The team then wrote down the equations of physics that describe the system’s behavior, and carried out calculations to see where in the system quantum noise would arise.
“By abstracting this problem to a simple oscillator, we can pinpoint where quantum fluctuations come into the system, and they come in at two places: the amplifier and the coupler that allows us to get a signal out of the oscillator,” Loughlin says. “If we know those two things, we know what the quantum limit on that oscillator’s stability is.”
Sudhir says scientists can use the equations they lay out in their study to calculate the quantum limit in their own oscillators.
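To make that abstraction concrete, here is a minimal numerical sketch of such a feedback oscillator, written for illustration rather than taken from the paper: an amplifier feeding its own input through a delay line and a coupler, with noise injected at exactly the two points the researchers identify. All parameter values, and the crude saturation rule, are assumptions.

```python
# A toy feedback oscillator: amplifier -> delay line -> coupler -> back
# to the amplifier. Noise enters at the two places named in the article
# (the amplifier and the coupler). Everything here is illustrative.
import numpy as np

rng = np.random.default_rng(0)

n_steps = 20_000       # round trips through the delay line
gain = 1.02            # amplifier gain per round trip (just above loss)
coupling = 0.02        # fraction of power extracted by the coupler
amp_noise = 1e-3       # fluctuations injected by the amplifier
coupler_noise = 1e-3   # fluctuations leaking in through the coupler

field = 1.0 + 0j       # complex field circulating in the loop
phases = np.empty(n_steps)

for k in range(n_steps):
    # Amplifier: apply gain and inject amplifier noise.
    field = gain * field + amp_noise * (rng.normal() + 1j * rng.normal())
    # Coupler: extract some power as output; noise leaks in through it.
    field = np.sqrt(1 - coupling) * field \
        + coupler_noise * (rng.normal() + 1j * rng.normal())
    # Crude gain saturation: clamp the amplitude so the loop settles
    # to a steady oscillation, as a real laser's gain medium would.
    if abs(field) > 1.0:
        field /= abs(field)
    phases[k] = np.angle(field)

# Saturation pins the amplitude, but the phase performs a random walk.
# This phase diffusion is what limits timing stability.
steps = np.diff(np.unwrap(phases))
print(f"phase diffusion per round trip: {steps.std():.2e} rad")
```

The behavior this toy model exhibits is the generic one described above: saturation pins the oscillation’s amplitude, while the injected fluctuations accumulate in its phase, which is what degrades the oscillator as a timekeeper.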
What’s more, the team showed that this quantum limit might be overcome, if quantum noise in one of the two sources could be “squeezed.” Quantum squeezing is the idea of minimizing quantum fluctuations in one aspect of a system at the expense of proportionally increasing fluctuations in another aspect. The effect is similar to squeezing air from one part of a balloon into another.
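The balloon analogy can be made quantitative with the textbook single-mode squeezed state; this is a generic illustration, not the paper’s calculation. In units where ħ = 1, a vacuum state has a variance of 1/2 in each quadrature, and squeezing by a parameter r trades variance between the two while their product, the Heisenberg bound, stays fixed:

```python
# Quadrature variances of a squeezed vacuum state with squeeze parameter r
# (units with hbar = 1, vacuum variance 1/2 per quadrature). Squeezing
# shrinks one variance by e^(-2r) and inflates the other by e^(+2r),
# leaving the uncertainty product unchanged.
import math

for r in (0.0, 0.5, 1.0):
    var_x = 0.5 * math.exp(-2 * r)  # squeezed quadrature (e.g., timing)
    var_p = 0.5 * math.exp(+2 * r)  # anti-squeezed quadrature (e.g., power)
    print(f"r = {r:.1f}: var_x = {var_x:.3f}, var_p = {var_p:.3f}, "
          f"product = {var_x * var_p:.3f}")
```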
In the case of a laser, the team found that if quantum fluctuations in the coupler were squeezed, it could improve the precision, or the timing of oscillations, in the outgoing laser beam, even as noise in the laser’s power would increase as a result.
“When you find some quantum mechanical limit, there’s always some question of how malleable is that limit?” Sudhir says. “Is it really a hard stop, or is there still some juice you can extract by manipulating some quantum mechanics? In this case, we find that there is, which is a result that is applicable to a huge class of oscillators.”
This research is supported, in part, by the National Science Foundation.