An interstellar instrument takes a final bow

They planned to fly for four years and to get as far as Jupiter and Saturn. But nearly half a century and 15 billion miles later, NASA’s twin Voyager spacecraft have far exceeded their original mission, winging past the outer planets and busting out of our heliosphere, beyond the influence of the sun. The probes are currently making their way through interstellar space, traveling farther than any human-made object.

Along their improbable journey, the Voyagers made first-of-their-kind observations at all four giant outer planets and their moons using only a handful of instruments, including MIT’s Plasma Science Experiments — identical plasma sensors that were designed and built in the 1970s in Building 37 by MIT scientists and engineers.

The Plasma Science Experiment (also known as the Plasma Spectrometer, or PLS for short) measured charged particles in planetary magnetospheres, the solar wind, and the interstellar medium, the material between stars. Since launching on the Voyager 2 spacecraft in 1977, the PLS has revealed new phenomena near all the outer planets and in the solar wind across the solar system. The experiment played a crucial role in confirming the moment when Voyager 2 crossed the heliosphere and moved outside of the sun’s regime, into interstellar space.

Now, to conserve the little power left on Voyager 2 and prolong the mission’s life, the Voyager scientists and engineers have made the decision to shut off MIT’s Plasma Science Experiment. It’s the first in a line of science instruments that will progressively blink off over the coming years. On Sept. 26, the Voyager 2 PLS sent its last communication from 12.7 billion miles away, before it received the command to shut down.

MIT News spoke with John Belcher, the Class of 1922 Professor of Physics at MIT, who was a member of the original team that designed and built the plasma spectrometers, and John Richardson, principal research scientist at MIT’s Kavli Institute for Astrophysics and Space Research, who is the experiment’s principal investigator. Both Belcher and Richardson offered their reflections on the retirement of this interstellar piece of MIT history.

Q: Looking back at the experiment’s contributions, what are the greatest hits, in terms of what MIT’s Plasma Spectrometer has revealed about the solar system and interstellar space?

Richardson: A key PLS finding at Jupiter was the discovery of the Io torus, a plasma donut surrounding Jupiter, formed from sulfur and oxygen from Io’s volcanoes (which were discovered in Voyager images). At Saturn, PLS found a magnetosphere full of water and oxygen that had been knocked off of Saturn’s icy moons. At Uranus and Neptune, the tilt of the magnetic fields led to PLS seeing smaller density features, with Uranus’s plasma disappearing near the planet. Another key PLS observation was of the termination shock, the first observation of the plasma at the largest shock in the solar system, where the solar wind stops being supersonic. This boundary had a huge drop in speed and an increase in the density and temperature of the solar wind. And finally, PLS documented Voyager 2’s crossing of the heliopause by detecting a stopping of outward-flowing plasma. This signaled the end of the solar wind and the beginning of the local interstellar medium (LISM). Although not designed to measure the LISM, PLS constantly measured the interstellar plasma currents beyond the heliosphere. It is very sad to lose this instrument and data!

Belcher: It is important to emphasize that PLS was the result of decades of development by MIT Professor Herbert Bridge (1919-1995) and Alan Lazarus (1931-2014). The first version of the instrument they designed was flown on Explorer 10 in 1961. And the most recent version is flying on the Parker Solar Probe, which is collecting measurements very close to the sun to understand the origins of the solar wind. Bridge was the principal investigator for plasma probes on spacecraft that visited the sun and every major planetary body in the solar system.

Q: During their tenure aboard the Voyager probes, how did the plasma sensors do their job over the last 47 years?

Richardson: There were four Faraday cup detectors designed by Herb Bridge that measured currents from the ions and electrons that entered the detectors. By measuring these particles at different energies, we could find the plasma velocity, density, and temperature in the solar wind and in the four planetary magnetospheres Voyager encountered. Voyager data were (and still are) sent to Earth every day and received by NASA’s Deep Space Network antennas. Keeping two 1970s-era spacecraft going for 47 years and counting has been an amazing feat of JPL engineering prowess — you can google the most recent rescue, when Voyager 1 lost some memory in November of 2023 and stopped sending data. JPL figured out the problem and was able to reprogram the flight data system from 15 billion miles away, and all is back to normal now. Shutting down PLS involves sending a command that will reach Voyager 2 about 19 hours later, freeing up enough power for the rest of the spacecraft to continue operating.

Q: Once the plasma sensors have shut down, how much more could Voyager do, and how far might it still go?

Richardson: Voyager will still measure the galactic cosmic rays, magnetic fields, and plasma waves. The available power decreases about 4 watts per year as the plutonium which powers them decays. We hope to keep some of the instruments running until the mid-2030s, but that will be a challenge as power levels decrease.

Belcher: Nick Oberg at the Kapteyn Astronomical Institute in the Netherlands has made an exhaustive study of the future of the spacecraft, using data from the European Space Agency’s Gaia spacecraft. In about 30,000 years, the spacecraft will reach the distance of the nearest stars. Because space is so vast, there is zero chance that the spacecraft will collide directly with a star in the lifetime of the universe. The spacecraft’s surfaces will erode through microcollisions with vast clouds of interstellar dust, but that happens very slowly.

In Oberg’s estimate, the Golden Records [identical records that were placed aboard each probe, which contain selected sounds and images representing life on Earth] are likely to survive for over 5 billion years. After those 5 billion years, things are difficult to predict, since at that point the Milky Way will collide with its massive neighbor, the Andromeda galaxy. During this collision, there is a one-in-five chance that the spacecraft will be flung into the intergalactic medium, where there is little dust and little weathering. In that case, it is possible that the spacecraft will survive for trillions of years. A trillion years is about 100 times the current age of the universe. Earth itself will cease to exist in about 6 billion years, when the sun enters its red giant phase and engulfs it.

In a “poor man’s” version of the Golden Record, Robert Butler, the chief engineer of the Plasma Instrument, inscribed the names of the MIT engineers and scientists who had worked on the spacecraft on the collector plate of the side-looking cup. Butler’s home state was New Hampshire, and he put the state motto, “Live Free or Die,” at the top of the list of names. Thanks to Butler, although New Hampshire will not survive for a trillion years, its state motto might. The flight spare of the PLS instrument is now displayed at the MIT Museum, where you can see the text of Butler’s message by peering into the side-looking sensor. 

Those Non-Design Technologies Web Designers Need to Know

We call ourselves web designers and developers. However, the job often goes beyond those narrow margins.

Freelancers and small agencies deal with a range of issues beyond design and code. We become the first person our clients contact when they have a question – even when we aren’t directly involved with the subject matter.

  • I just received this message from Google. What does it mean?
  • Why can’t I receive email from my website?
  • My website was hacked. Help!

Yes, we are the catch-all technical support representatives. No matter the problem, web designers are the solution. That’s what some clients think, at least.

We’re often the link between clients and technology. And perhaps we shouldn’t try to tackle every problem. But it wouldn’t hurt to brush up on a few non-design technologies.

With that in mind, here are a few areas that web designers should study. You know, just in case.


SEO & Site Indexing Basics

Search engine optimization (SEO) is a niche unto itself. Some professionals specialize in making sure websites are indexed and rank well.

That doesn’t stop clients from asking their web designer, though. Site owners want to rank highly in Google search results. And they are often in the dark about how to do it.

To that end, it’s worth learning the basics of SEO. Even if the subject makes your skin crawl.

You’ll be able to explain the hows and whys to clients. That will help them make more informed decisions about content. They may decide to jump in feet first with an SEO professional.

Clients will ask you about SEO. A little background knowledge makes you look smart!
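
Want a quick technical check? The sketch below is a rough, standard-library-only Python script (the URL is a placeholder) that fetches a page and reports its title, meta description, robots directive, and canonical link – a few of the on-page signals search engines read first.

```python
# Rough indexing check using only the Python standard library.
# Replace the placeholder URL with the page you want to inspect.
from html.parser import HTMLParser
from urllib.request import urlopen


class SEOTagParser(HTMLParser):
    """Collects the title, description/robots meta tags, and canonical link."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.metas = {}
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name") in ("description", "robots"):
            self.metas[attrs["name"]] = attrs.get("content", "")
        elif tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data


url = "https://example.com/"  # placeholder
parser = SEOTagParser()
parser.feed(urlopen(url).read().decode("utf-8", errors="replace"))

print("Title:      ", parser.title.strip() or "(missing)")
print("Description:", parser.metas.get("description", "(missing)"))
print("Robots:     ", parser.metas.get("robots", "(not set, indexable by default)"))
print("Canonical:  ", parser.canonical or "(missing)")
```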

DNS & Email Delivery

Launching or moving a website often includes changing a domain’s DNS settings. These settings ensure that the domain points visitors to the right server.

DNS is much more than that, though. There are also settings for configuring email, which has become a hot topic these days.

Email providers are increasingly requiring domain owners to authenticate the mail sent from their domains. Domains without SPF, DKIM, or DMARC records may have email delivery issues. Gmail, for example, may reject mail from unauthenticated domains or send it to spam.

What does this have to do with web design? Well, websites with contact forms can fall victim to these issues. The same goes for eCommerce websites. If the domain isn’t authenticated, clients and customers may never receive those emails.

Now is the time to learn how DNS works. You’ll want to pay special attention to email. Clients without an IT department may need your help ensuring smooth email delivery.
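
Curious where a client’s domain stands? Here is a minimal sketch that looks up the SPF and DMARC records. It assumes the third-party dnspython package (pip install dnspython) and uses example.com as a placeholder domain.

```python
# Check for the TXT records that mail providers look for.
# Assumes the third-party dnspython package: pip install dnspython
import dns.resolver

domain = "example.com"  # placeholder: use the client's domain


def txt_records(name: str) -> list[str]:
    """Return all TXT strings published at a hostname, or [] if none exist."""
    try:
        answers = dns.resolver.resolve(name, "TXT")
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return []
    return [b"".join(record.strings).decode() for record in answers]


spf = [r for r in txt_records(domain) if r.startswith("v=spf1")]
dmarc = [r for r in txt_records(f"_dmarc.{domain}") if r.startswith("v=DMARC1")]

print("SPF:  ", spf[0] if spf else "missing")
print("DMARC:", dmarc[0] if dmarc else "missing")
# DKIM is published at <selector>._domainkey.<domain>, so you also need the
# selector name used by the sending service (e.g., google._domainkey.example.com).
```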

DNS & Email Resources

Email deliverability issues can be prevented by adding domain verification records.

Security for Websites and Beyond

We live in an age of online insecurity. Malicious actors don’t take a minute off. Instead, they continue to wreak havoc.

Sure, we talk about web security quite a bit. And we try our best to build a virtual moat around websites. But websites are still being compromised.

We’re learning that security goes deeper than installing updates or tweaking .htaccess files. The security of a user’s device also plays a role.

Stolen session cookies are a prime example. Hackers can grab them off of a compromised device. A “bulletproof” website is no match for a phone with an info stealer installed. With a valid session cookie in hand, an attacker can waltz right in and do whatever that user’s account allows.

Understanding how device security impacts the web is crucial. It’s something that can benefit us and our clients. After all, a single weak link can break the chain.
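
We can’t patch every user’s device. But we can limit how useful a stolen cookie is. Here is a minimal sketch, assuming a Python back end, of the attributes a session cookie should carry. These flags won’t stop malware that reads the browser’s cookie store directly, but a short lifetime means a stolen cookie ages out quickly.

```python
# Build a hardened Set-Cookie header with the Python standard library.
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["session_id"] = "example-session-token"  # placeholder value
cookie["session_id"]["httponly"] = True    # not readable by JavaScript
cookie["session_id"]["secure"] = True      # only sent over HTTPS
cookie["session_id"]["samesite"] = "Lax"   # not sent on cross-site POST requests
cookie["session_id"]["max-age"] = 60 * 60  # expires after one hour
cookie["session_id"]["path"] = "/"

# The header a framework would send with the response:
print(cookie["session_id"].OutputString())
# session_id=example-session-token; HttpOnly; Max-Age=3600; Path=/; SameSite=Lax; Secure
```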

Command Line Tools

Some of us cringe at the mere thought of using a command line tool. Hasn’t that stuff gone the way of the dinosaur?

Nothing could be further from the truth. Command line tools like WP-CLI (the WordPress command line interface) remain popular. Why is that? It’s all about power and efficiency.

The command line doesn’t have the overhead of a graphical user interface (GUI). Thus, it handles bulk operations faster. For example, a database search-and-replace finishes much more quickly from the command line than it would through a GUI tool.

You can also do a lot of behind-the-scenes work with your web server. The command line may be the only way to run specific tasks.

It’s worth brushing up on command-line operations. They are a huge time saver in the right circumstances.
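
As a concrete example, here is a small Python wrapper around WP-CLI’s search-replace command, the kind of thing you might run after moving a site to a new domain. It assumes WP-CLI is installed and that the script runs from the WordPress root; the --dry-run flag reports what would change without touching the database.

```python
# Run a WP-CLI search-and-replace from a reusable script.
# Assumes WP-CLI ("wp") is installed and this runs from the WordPress root.
import subprocess

old_url = "http://old.example.com"   # placeholder values
new_url = "https://new.example.com"

# --dry-run prints a summary of what would change without writing anything.
subprocess.run(["wp", "search-replace", old_url, new_url, "--dry-run"], check=True)

# Once the dry run looks right, drop --dry-run and flush caches and rewrites:
# subprocess.run(["wp", "search-replace", old_url, new_url], check=True)
# subprocess.run(["wp", "cache", "flush"], check=True)
# subprocess.run(["wp", "rewrite", "flush"], check=True)
```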

Become a More Well-Rounded Web Designer

The skills above are all adjacent to web design. And the need for this knowledge is growing.

Perhaps that has always been the case with SEO. Meanwhile, security and DNS seem to be just about mandatory these days.

Working with clients means you inevitably will face questions about these subjects. Freelancers and small agencies don’t always have an expert within reach. So, it’s up to us to find answers.

The command line is more about adding another tool to your toolbox. The improved efficiency will benefit you. And the result is better service for your clients.

Web designers tend to be specialists. We focus on the front-end or back-end. But the more we know, the more well-rounded we become.

It’s one way to stay on the cutting edge of the industry for years to come.

Q&A: A new initiative to help strengthen democracy

In the United States and around the world, democracy is under threat. Anti-democratic attitudes have become more prevalent, partisan polarization is growing, misinformation is omnipresent, and politicians and citizens sometimes question the integrity of elections. 

With this backdrop, the MIT Department of Political Science is launching an effort to establish a Strengthening Democracy Initiative. In this Q&A, department head David Singer, the Raphael Dorman-Helen Starbuck Professor of Political Science, discusses the goals and scope of the initiative.

Q: What is the purpose of the Strengthening Democracy Initiative?

A: Well-functioning democracies require accountable representatives, accurate and freely available information, equitable citizen voice and participation, free and fair elections, and an abiding respect for democratic institutions. It is unsettling for the political science community to see more and more evidence of democratic backsliding in Europe, Latin America, and even here in the U.S. While we cannot single-handedly stop the erosion of democratic norms and practices, we can focus our energies on understanding and explaining the root causes of the problem, and devising interventions to maintain the healthy functioning of democracies.

MIT political science has a history of generating important research on many facets of the democratic process, including voting behavior, election administration, information and misinformation, public opinion and political responsiveness, and lobbying. The goals of the Strengthening Democracy Initiative are to place these various research programs under one umbrella, to foster synergies among our various research projects and between political science and other disciplines, and to mark MIT as the country’s leading center for rigorous, evidence-based analysis of democratic resiliency.

Q: What is the initiative’s research focus?

A: The initiative is built upon three research pillars. One pillar is election science and administration. Democracy cannot function without well-run elections and, just as important, popular trust in those elections. Even within the U.S., let alone other countries, there is tremendous variation in the electoral process: whether and how people register to vote, whether they vote in person or by mail, how polling places are run, how votes are counted and validated, and how the results are communicated to citizens.

The MIT Election Data and Science Lab is already the country’s leading center for the collection and analysis of election-related data and dissemination of electoral best practices, and it is well positioned to increase the scale and scope of its activities.

The second pillar is public opinion, a rich area of study that includes experimental studies of public responses to misinformation and analyses of government responsiveness to mass attitudes. Our faculty employ survey and experimental methods to study a range of substantive areas, including taxation and health policy, state and local politics, and strategies for countering political rumors in the U.S. and abroad. Faculty research programs form the basis for this pillar, along with longstanding collaborations such as the Political Experiments Research Lab, an annual omnibus survey in which students and faculty can participate, and frequent conferences and seminars.

The third pillar is political participation, which includes the impact of the criminal justice system and other negative interactions with the state on voting, the creation of citizen assemblies, and the lobbying behavior of firms on Congressional legislation. Some of this research relies on machine learning and AI to cull and parse an enormous amount of data, giving researchers visibility into phenomena that were previously difficult to analyze. A related research area on political deliberation brings together computer science, AI, and the social sciences to analyze the dynamics of political discourse in online forums and the possible interventions that can attenuate political polarization and foster consensus.

The initiative’s flexible design will allow for new pillars to be added over time, including international and homeland security, strengthening democracies in different regions of the world, and tackling new challenges to democratic processes that we cannot see yet.

Q: Why is MIT well-suited to host this new initiative?

A: Many people view MIT as a STEM-focused, highly technical place. And indeed it is, but there is a tremendous amount of collaboration across and within schools at MIT — for example, between political science and the Schwarzman College of Computing and the Sloan School of Management, and between the social science fields and the schools of science and engineering. The Strengthening Democracy Initiative will benefit from these collaborations and create new bridges between political science and other fields. It’s also important to note that this is a nonpartisan research endeavor. The MIT political science department has a reputation for rigorous, data-driven approaches to the study of politics, and its position within the MIT ecosystem will help us to maintain a reputation as an “honest broker,” and to disseminate path-breaking, evidence-based research and interventions to help democracies become more resilient.

Q: Will the new initiative have an educational mission?

A: Of course! The department has a long history of bringing in scores of undergraduate researchers via MIT’s Undergraduate Research Opportunities Program. The initiative will be structured to provide these students with opportunities to study various facets of the democratic process, and for faculty to have a ready pool of talented students to assist with their projects. My hope is to provide students with the resources and opportunities to test their own theories by designing and implementing surveys in the U.S. and abroad, and use insights and tools from computer science, applied statistics, and other disciplines to study political phenomena. As the initiative grows, I expect more opportunities for students to collaborate with state and local officials on improvements to election administration, and to study new puzzles related to healthy democracies.

Postdoctoral researchers will also play a prominent role by advancing research across the initiative’s pillars, supervising undergraduate researchers, and handling some of the administrative aspects of the work.

Q: This sounds like a long-term endeavor. Do you expect this initiative to be permanent?

A: Yes. We already have the pieces in place to create a leading center for the study of healthy democracies (and how to make them healthier). But we need to build capacity, including resources for a pool of researchers to shift from one project to another, which will permit synergies between projects and foster new ones. A permanent initiative will also provide the infrastructure for faculty and students to respond swiftly to current events and new research findings — for example, by launching a nationwide survey experiment, or collecting new data on an aspect of the electoral process, or testing the impact of a new AI technology on political perceptions. As I like to tell our supporters, there are new challenges to healthy democracies that were not on our radar 10 years ago, and no doubt there will be others 10 years from now that we have not imagined. We need to be prepared to do the rigorous analysis on whatever challenges come our way. And MIT Political Science is the best place in the world to undertake this ambitious agenda in the long term.

Microelectronics projects awarded CHIPS and Science Act funding

MIT and Lincoln Laboratory are participants in four microelectronics proposals selected for funding through the Northeast Microelectronics Coalition (NEMC) Hub. The funding comes from the Microelectronics Commons, a $2 billion initiative of the CHIPS and Science Act to strengthen U.S. leadership in semiconductor manufacturing and innovation. The regional awards are among 33 projects announced as part of a $269 million federal investment.

U.S. Department of Defense (DoD) and White House officials announced the awards during an event on Sept. 18, hosted by the NEMC Hub at MIT Lincoln Laboratory. The NEMC Hub, a division of the Massachusetts Technology Collaborative, leads a network of more than 200 member organizations across the region to enable the lab-to-fab transition of critical microelectronics technologies for the DoD. The NEMC Hub is one of eight regional hubs forming a nationwide chip network under the Microelectronics Commons and is executed through the Naval Surface Warfare Center Crane Division and the National Security Technology Accelerator (NSTXL).

“The $38 million in project awards to the NEMC Hub are a recognition of the capability, capacity, and commitment of our members,” said Mark Halfman, NEMC Hub director. “We have a tremendous opportunity to grow microelectronics lab-to-fab capabilities across the Northeast region and spur the growth of game-changing technologies.”

“We are very pleased to have Lincoln Laboratory be a central part of the vibrant ecosystem that has formed within the Microelectronics Commons program,” said Mark Gouker, assistant head of the laboratory’s Advanced Technology Division and NEMC Hub advisory group representative. “We have made strong connections to academia, startups, DoD contractors, and commercial sector companies through collaborations with our technical staff and by offering our microelectronics fabrication infrastructure to assist in these projects. We believe this tighter ecosystem will be important to future Microelectronics Commons programs as well as other CHIPS and Science Act programs.”

The nearly $38 million award to the NEMC Hub is expected to support six collaborative projects, four of which will involve MIT and/or Lincoln Laboratory.

“These projects promise significant gains in advanced microelectronics technologies,” said Ian A. Waitz, MIT’s vice president for research. “We look forward to working alongside industry and government organizations in the NEMC Hub to strengthen U.S. microelectronics innovation, workforce and education, and lab-to-fab translation.”

The projects selected for funding support key technology areas identified in the federal call for competitive proposals. MIT campus researchers will participate in a project advancing commercial leap-ahead technologies, titled “Advancing DoD High Power Systems: Transition of High Al% AlGaN from Lab to Fab,” and another in the area of 5G/6G, called “Wideband, Scalable MIMO arrays for NextG Systems: From Antennas to Decoders.”

Researchers both at Lincoln Laboratory and on campus will contribute to a quantum technology project called “Community‐driven Hybrid Integrated Quantum‐Photonic Integrated circuits (CHIQPI).”

Lincoln Laboratory researchers will also participate in the “Wideband Same‐Frequency STAR Array Platform Based on Heterogeneous Multi-Domain Self‐Interference Cancellation” project.

The anticipated funding for these four projects follows a $7.7 million grant awarded earlier this year to MIT from the NEMC Hub, alongside an agreement between MIT and Applied Materials, to add advanced nanofabrication equipment and capabilities to MIT.nano.

The funding comes amid construction of the Compound Semiconductor Laboratory – Microsystem Integration Facility (CSL-MIF) at Lincoln Laboratory. The CSL-MIF will complement Lincoln Laboratory’s existing Microelectronics Laboratory, which has remained the U.S. government’s most advanced silicon-based research and fabrication facility for decades. When completed in 2028, the CSL-MIF is expected to play a vital role in the greater CHIPS and Science Act ecosystem.

“Lincoln Laboratory has a long history of developing advanced microelectronics to enable critical national security systems,” said Melissa Choi, Lincoln Laboratory director. “We are excited to embark on these awarded projects, leveraging our microelectronics facilities and partnering with fellow hub members to be at the forefront of U.S. microelectronics innovation.”

Officials who spoke at the Sept. 18 event emphasized the national security and economic imperatives to building a robust microelectronics workforce and innovation network.

“The Microelectronics Commons is an essential part of the CHIPS and Science Act’s whole-of-government approach to strengthen the U.S. microelectronics ecosystem and secure lasting technical leadership in this critical sector,” said Dev Shenoy, the principal director for microelectronics in the Office of the Under Secretary of Defense for Research and Engineering. “I believe in the incredible impact this work will have for American economies, American defense, and the American people.”

“The secret sauce of what made the U.S. the lead innovator in the world for the last 100 years was the coming together of the U.S. government and the public sector, together with the private sector and teaming up with academia and research,” said Amos Hochstein, special presidential coordinator for global infrastructure and energy security at the U.S. Department of State. “That is what enabled us to be the forefront of innovation and technology, and that is what we have to do again.”