A portable light system that can digitize everyday objects

When Nikola Tesla predicted we’d have handheld phones that could display videos, photographs, and more, his musings seemed like a distant dream. Nearly 100 years later, smartphones are like an extra appendage for many of us.

Digital fabrication engineers are now working toward expanding the display capabilities of other everyday objects. One avenue they’re exploring is reprogrammable surfaces — items whose appearance we can digitally alter — which could help users present important information, such as health statistics, as well as new designs on things like a wall, mug, or shoe.

Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), the University of California at Berkeley, and Aarhus University have taken an intriguing step forward by fabricating “PortaChrome,” a portable light system and design tool that can change the color and textures of various objects. Equipped with ultraviolet (UV) and red, green, and blue (RGB) LEDs, the device can be attached to everyday objects like shirts and headphones. Once a user creates a design and sends it to a PortaChrome machine via Bluetooth, the surface can be programmed into multicolor displays of health data, entertainment, and fashion designs.

Video: “PortaChrome: A portable light system that can digitize everyday objects” (MIT CSAIL)

To make an item reprogrammable, the object must be coated with photochromic dye, an invisible ink that can be turned into different colors with light patterns. Once it’s coated, individuals can create and relay patterns to the item via the team’s graphic design software, or use the team’s API to interact with the device directly and embed data-driven designs. When attached to a surface, PortaChrome’s UV lights saturate the dye while the RGB LEDs desaturate it, activating the colors and ensuring each pixel is toned to match the intended design.
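
The saturate-then-desaturate process can be sketched in code. The snippet below is only a simplified illustration, assuming a linear relation between target channel brightness and desaturation time, a fixed UV pass, and a hypothetical send_frame call over Bluetooth; it is not the actual PortaChrome firmware or API.

```python
# Simplified per-pixel exposure planning for a photochromic display.
# The linear exposure model, fixed UV pass, and send_frame() are assumptions.
from dataclasses import dataclass

@dataclass
class PixelPlan:
    uv_seconds: float     # saturate the dye to its full-color state
    rgb_seconds: tuple    # per-channel desaturation times (R, G, B)

def plan_pixel(target_rgb, max_desat_seconds=60.0, uv_seconds=10.0):
    """Convert an 8-bit RGB target color into exposure times for one pixel.
    Brighter target channels get longer desaturation from the matching LED."""
    r, g, b = (c / 255.0 for c in target_rgb)
    return PixelPlan(uv_seconds,
                     (r * max_desat_seconds,
                      g * max_desat_seconds,
                      b * max_desat_seconds))

# A 2x2 design: red, blue / yellow, dark gray.
design = [[(255, 0, 0), (0, 0, 255)],
          [(255, 255, 0), (40, 40, 40)]]
frame = [[plan_pixel(px) for px in row] for row in design]
# send_frame(frame)  # hypothetical Bluetooth call to the device
```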

The researchers’ integrated light system changes objects’ colors in less than four minutes on average, eight times faster than their prior work, “Photo-Chromeleon.” This speed boost comes from switching to a light source that makes direct contact with the object to transmit the UV and RGB rays. Photo-Chromeleon instead relied on a projector to activate the dye’s color-changing properties, so the light reaching the object’s surface was at a reduced intensity.

“PortaChrome provides a more convenient way to reprogram your surroundings,” says Yunyi Zhu ’20, MEng ’21, an MIT PhD student in electrical engineering and computer science, affiliate of CSAIL, and lead author on a paper about the work. “Compared with our projector-based system from before, PortaChrome is a more portable light source that can be placed directly on top of the photochromic surface. This allows the color change to happen without user intervention and helps us avoid contaminating our environment with UV. As a result, users can wear their heart rate chart on their shirt after a workout, for instance.”

Giving everyday objects a makeover

In demos, PortaChrome displayed health data on different surfaces. A user hiked with PortaChrome sewn onto their backpack, putting it in direct contact with the back of their shirt, which was coated in photochromic dye. Altitude and heart rate sensors sent data to the lighting device, and a reprogramming script developed by the researchers converted the readings into a chart, creating a health visualization on the back of the user’s shirt. In a similar demo, MIT researchers displayed a heart gradually coming together on the back of a tablet to show how a user was progressing toward a fitness goal.
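
As a rough illustration of how a reprogramming script might turn sensor readings into pixels, the sketch below bins heart-rate samples into a coarse bar chart that a per-pixel planner like the one above could consume. The grid size, colors, and heart-rate range are assumptions made for the example, not details of the team’s script.

```python
# Illustrative only: downsample heart-rate samples into a coarse pixel chart.
# Grid dimensions, colors, and the 60-180 bpm range are assumed values.
def heart_rate_to_chart(samples, width=16, height=8, lo=60, hi=180,
                        bar_color=(220, 40, 40), bg_color=(255, 255, 255)):
    """Bin samples into `width` columns; each column becomes a vertical bar
    whose height reflects the average heart rate in that bin."""
    grid = [[bg_color] * width for _ in range(height)]
    bin_size = max(1, len(samples) // width)
    for col in range(width):
        chunk = samples[col * bin_size:(col + 1) * bin_size] or [lo]
        avg = sum(chunk) / len(chunk)
        bar = max(0, min(height, round((avg - lo) / (hi - lo) * height)))
        for row in range(height - bar, height):
            grid[row][col] = bar_color
    return grid  # rows of RGB tuples, ready for per-pixel exposure planning

chart = heart_rate_to_chart([72, 85, 110, 140, 155, 150, 130, 100] * 4)
```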

PortaChrome also showed a flair for customizing wearables. For example, the researchers redesigned a pair of white headphones with sideways blue lines and horizontal yellow and purple stripes. The headphones were coated with photochromic dye, and the team attached the PortaChrome device to the inside of the headphone case. Finally, the researchers reprogrammed their patterns onto the object, producing a result that resembled watercolor art. Using the same process, they also recolored a wrist splint to match different clothes.

Eventually, the work could be used to digitize consumers’ belongings. Imagine putting on a cloak that can change your entire shirt design, or using your car cover to give your vehicle a new look.

PortaChrome’s main ingredients

On the hardware end, PortaChrome is a combination of four main ingredients. The portable device consists of a textile base as a sort of backbone, one textile layer with the UV LEDs soldered on, another with the RGB LEDs attached, and a silicone diffusion layer to top it off. Resembling a translucent honeycomb, the silicone layer covers the interlaced UV and RGB LEDs and directs their light toward individual pixels to properly illuminate a design over a surface.

This device can be flexibly wrapped around objects with different shapes. For tables and other flat surfaces, you could place PortaChrome on top, like a placemat. For a curved item like a thermos, you could wrap the light source around like a coffee cup sleeve to ensure it reprograms the entire surface.

The portable, flexible light system is crafted with tools commonly available in maker spaces (laser cutters, for example), and the same method can be replicated with flexible PCB materials and other mass-manufacturing systems.

While PortaChrome can already convert our surroundings into dynamic displays fairly quickly, Zhu and her colleagues believe it could benefit from further speed boosts. They’d like to use smaller LEDs, with the likely result that a surface could be reprogrammed in seconds with a higher-resolution design, thanks to increased light intensity.

“The surfaces of our everyday things are encoded with colors and visual textures, delivering crucial information and shaping how we interact with them,” says Georgia Tech postdoc Tingyu Cheng, who was not involved with the research. “PortaChrome is taking a leap forward by providing reprogrammable surfaces with the integration of flexible light sources (UV and RGB LEDs) and photochromic pigments into everyday objects, pixelating the environment with dynamic color and patterns. The capabilities demonstrated by PortaChrome could revolutionize the way we interact with our surroundings, particularly in domains like personalized fashion and adaptive user interfaces. This technology enables real-time customization that seamlessly integrates into daily life, offering a glimpse into the future of ‘ubiquitous displays.’”

Zhu is joined by nine CSAIL affiliates on the paper: MIT PhD student and MIT Media Lab affiliate Cedric Honnet; former visiting undergraduate researchers Yixiao Kang, Angelina J. Zheng, and Grace Tang; MIT undergraduate student Luca Musk; University of Michigan Assistant Professor Junyi Zhu SM ’19, PhD ’24; recent postdoc and Aarhus University assistant professor Michael Wessely; and senior author Stefanie Mueller, the TIBCO Career Development Associate Professor in the MIT departments of Electrical Engineering and Computer Science and Mechanical Engineering and leader of the HCI Engineering Group at CSAIL.

This work was supported by the MIT-GIST Joint Research Program and was presented at the ACM Symposium on User Interface Software and Technology in October.

Asteroid grains shed light on the outer solar system’s origins

Tiny grains from a distant asteroid are revealing clues to the magnetic forces that shaped the far reaches of the solar system over 4.6 billion years ago.

Scientists at MIT and elsewhere have analyzed particles of the asteroid Ryugu, which were collected by the Japanese Aerospace Exploration Agency’s (JAXA) Hayabusa2 mission and brought back to Earth in 2020. Scientists believe Ryugu formed on the outskirts of the early solar system before migrating in toward the asteroid belt, eventually settling into an orbit between Earth and Mars.

The team analyzed Ryugu’s particles for signs of any ancient magnetic field that might have been present when the asteroid first took shape. Their results suggest that if there was a magnetic field, it would have been very weak. At most, such a field would have been about 15 microtesla. (The Earth’s own magnetic field today is around 50 microtesla.)

Even so, the scientists estimate that such a low-grade field intensity would have been enough to pull together primordial gas and dust to form the outer solar system’s asteroids and potentially play a role in giant planet formation, from Jupiter to Neptune.

The team’s results, which are published today in the journal AGU Advances, show for the first time that the distal solar system likely harbored a weak magnetic field. Scientists have known that a magnetic field shaped the inner solar system, where Earth and the terrestrial planets were formed. But it was unclear whether such a magnetic influence extended into more remote regions, until now.

“We’re showing that, everywhere we look now, there was some sort of magnetic field that was responsible for bringing mass to where the sun and planets were forming,” says study author Benjamin Weiss, the Robert R. Shrock Professor of Earth and Planetary Sciences at MIT. “That now applies to the outer solar system planets.”

The study’s lead author is Elias Mansbach PhD ’24, who is now a postdoc at Cambridge University. MIT co-authors include Eduardo Lima, Saverio Cambioni, and Jodie Ream, along with Michael Sowell and Joseph Kirschvink of Caltech, Roger Fu of Harvard University, Xue-Ning Bai of Tsinghua University, Chisato Anai and Atsuko Kobayashi of the Kochi Advanced Marine Core Research Institute, and Hironori Hidaka of Tokyo Institute of Technology.

A far-off field

Around 4.6 billion years ago, the solar system formed from a dense cloud of interstellar gas and dust, which collapsed into a swirling disk of matter. Most of this material gravitated toward the center of the disk to form the sun. The remaining bits formed a solar nebula of swirling, ionized gas. Scientists suspect that interactions between the newly formed sun and the ionized disk generated a magnetic field that threaded through the nebula, helping to drive accretion and pull matter inward to form the planets, asteroids, and moons.

“This nebular field disappeared around 3 to 4 million years after the solar system’s formation, and we are fascinated with how it played a role in early planetary formation,” Mansbach says.

Scientists previously determined that a magnetic field was present throughout the inner solar system — a region that spanned from the sun to about 7 astronomical units (AU), out to where Jupiter is today. (One AU is the distance between the sun and the Earth.) The intensity of this inner nebular field was somewhere between 50 and 200 microtesla, and it likely influenced the formation of the inner terrestrial planets. Such estimates of the early magnetic field are based on meteorites that landed on Earth and are thought to have originated in the inner nebula.

“But how far this magnetic field extended, and what role it played in more distal regions, is still uncertain because there haven’t been many samples that could tell us about the outer solar system,” Mansbach says.

Rewinding the tape

The team got an opportunity to analyze samples from the outer solar system with Ryugu, an asteroid that is thought to have formed in the early outer solar system, beyond 7 AU, and was eventually brought into orbit near the Earth. In December 2020, JAXA’s Hayabusa2 mission returned samples of the asteroid to Earth, giving scientists a first look at a potential relic of the early distal solar system.

The researchers acquired several grains of the returned samples, each about a millimeter in size. They placed the particles in a magnetometer — an instrument in Weiss’ lab that measures the strength and direction of a sample’s magnetization. They then applied an alternating magnetic field to progressively demagnetize each sample.

“Like a tape recorder, we are slowly rewinding the sample’s magnetic record,” Mansbach explains. “We then look for consistent trends that tell us if it formed in a magnetic field.”

They determined that the samples held no clear sign of a preserved magnetic field. This suggests that either there was no nebular field present in the outer solar system where the asteroid first formed, or the field was so weak that it was not recorded in the asteroid’s grains. If the latter is the case, the team estimates such a weak field would have been no more than 15 microtesla in intensity.
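
To give a sense of how a number like that upper limit can be extracted, the sketch below follows a common, simplified normalization approach from paleomagnetism: regress the natural remanence lost during stepwise demagnetization against a laboratory remanence acquired in a known bias field, then scale by that field. It is a generic illustration rather than the team’s actual analysis; the calibration factor and the toy data are assumptions.

```python
import numpy as np

def paleointensity_estimate(nrm, arm, lab_field_ut, calibration=1.0):
    """Estimate an ancient field by regressing natural remanence (NRM) against a
    laboratory anhysteretic remanence (ARM) imparted in a known bias field, with
    both measured over the same stepwise demagnetization levels."""
    slope, _ = np.polyfit(arm, nrm, 1)  # efficiency of ancient vs. lab magnetization
    return abs(slope) * lab_field_ut / calibration

# Toy demagnetization curves (arbitrary units). A nearly flat NRM trend like
# this one yields only a weak estimate, i.e., an upper limit on the ancient field.
rng = np.random.default_rng(0)
af_steps = np.linspace(0, 100, 11)                   # peak alternating field, in mT
arm = np.exp(-af_steps / 40.0)                       # lab remanence decays smoothly
nrm = 0.02 * arm + 0.005 * rng.standard_normal(11)   # little coherent natural signal

print(f"Estimated field: {paleointensity_estimate(nrm, arm, lab_field_ut=50.0):.1f} microtesla")
```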

The researchers also reexamined data from previously studied meteorites. They specifically looked at “ungrouped carbonaceous chondrites” — meteorites that have properties that are characteristic of having formed in the distal solar system. Scientists had estimated the samples were not old enough to have formed before the solar nebula disappeared. Any magnetic field record the samples contain, then, would not reflect the nebular field. But Mansbach and his colleagues decided to take a closer look.

“We reanalyzed the ages of these samples and found they are closer to the start of the solar system than previously thought,” Mansbach says. “We think these samples formed in this distal, outer region. And one of these samples does actually have a positive field detection of about 5 microtesla, which is consistent with an upper limit of 15 microtesla.”

This updated sample, combined with the new Ryugu particles, suggests that the outer solar system, beyond 7 AU, hosted a very weak magnetic field that was nevertheless strong enough to pull matter in from the outskirts to eventually form the outer planetary bodies, from Jupiter to Neptune.

“When you’re further from the sun, a weak magnetic field goes a long way,” Weiss notes. “It was predicted that it doesn’t need to be that strong out there, and that’s what we’re seeing.”

The team plans to look for more evidence of distal nebular fields with samples from another far-off asteroid, Bennu, which were delivered to Earth in September 2023 by NASA’s OSIRIS-REx spacecraft.

“Bennu looks a lot like Ryugu, and we’re eagerly awaiting first results from those samples,” Mansbach says.

This research was supported, in part, by NASA.

Startup gives surgeons a real-time view of breast cancer during surgery

Breast cancer is the second most common cancer among women in the United States and the second leading cause of cancer death, affecting one in eight women overall.

Most women with breast cancer undergo lumpectomy surgery to remove the tumor along with a rim of healthy tissue surrounding it. After the procedure, the removed tissue is sent to a pathologist, who examines its edges, or margins, for signs of disease. Unfortunately, about 20 percent of women who have lumpectomies must undergo a second surgery to remove more tissue.

Now, an MIT spinout is giving surgeons a real-time view of cancerous tissue during surgery. Lumicell has developed a handheld device and an optical imaging agent that, when combined, allow surgeons to scan the tissue within the surgical cavity and visualize residual cancer cells. Surgeons view these images on a monitor, which can guide them in removing additional tissue during the procedure.

In a clinical trial of 357 patients, Lumicell’s technology not only reduced the need for second surgeries but also revealed tissue suspected to contain cancer cells that may have otherwise been missed by the standard of care lumpectomy.

The company received U.S. Food and Drug Administration approval for the technology earlier this year, marking a major milestone for Lumicell and the founders, who include MIT professors Linda Griffith and Moungi Bawendi along with PhD candidate W. David Lee ’69, SM ’70. Much of the early work developing and testing the system took place at the Koch Institute for Integrative Cancer Research at MIT, beginning in 2008.

The FDA approval also held deep personal significance for some of Lumicell’s team members, including Griffith, a two-time breast cancer survivor, and Lee, whose wife’s passing from the disease in 2003 changed the course of his life.

An interdisciplinary approach

Lee ran a technology consulting group for 25 years before his wife was diagnosed with breast cancer. Watching her battle the disease inspired him to develop technologies that could help cancer patients.

His neighbor at the time was Tyler Jacks, the founding director of the Koch Institute. Jacks invited Lee to a series of meetings at the Koch involving professors Robert Langer and Bawendi, and Lee eventually joined the Koch Institute as an integrative program officer in 2008, where he began exploring an approach for improving imaging in living organisms with single-cell resolution using charge-coupled device (CCD) cameras.

“CCD pixels at the time were each 2 or 3 microns and spaced 2 or 3 microns,” Lee explains. “So the idea was very simple: to stabilize a camera on a tissue so it would move with the breathing of the animal, so the pixels would essentially line up with the cells without any fancy magnification.”

That work led Lee to begin meeting regularly with a multidisciplinary group including Lumicell co-founders Bawendi, currently the Lester Wolfe Professor of Chemistry at MIT and winner of the 2023 Nobel Prize in Chemistry; Griffith, the School of Engineering Professor of Teaching Innovation in MIT’s Department of Biological Engineering and an extramural faculty member at the Koch Institute; Ralph Weissleder, a professor at Harvard Medical School; and David Kirsch, formerly a postdoc at the Koch Institute and now a scientist at the Princess Margaret Cancer Center.

“On Friday afternoons, we’d get together, and Moungi would teach us some chemistry, Lee would teach us some engineering, and David Kirsch would teach some biology,” Griffith recalls.

Through those meetings, the researchers began to explore the effectiveness of combining Lee’s imaging approach with engineered proteins that would light up where the immune system meets the edge of tumors, for use during surgery. To begin testing the idea, the group received funding from the Koch Institute Frontier Research Program via the Kathy and Curt Marble Cancer Research Fund.

“Without that support, this never would have happened,” Lee says. “When I was learning biology at MIT as an undergrad, genetics weren’t even in the textbooks yet. But the Koch Institute provided education, funding, and most importantly, connections to faculty, who were willing to teach me biology.”

In 2010, Griffith was diagnosed with breast cancer.

“Going through that personal experience, I understood the impact that we could have,” Griffith says. “I had a very unusual situation and a bad kind of tumor. The whole thing was nerve-wracking, but one of the most nerve-wracking times was waiting to find out if my tumor margins were clear after surgery. I experienced that uncertainty and dread as a patient, so I became hugely sensitized to our mission.”

The approach Lumicell’s founders eventually settled on begins two to six hours before surgery, when patients receive the optical imaging agent through an IV. Then, during surgery, surgeons use Lumicell’s handheld imaging device to scan the walls of the breast cavity. Lumicell’s cancer detection software highlights, on a computer monitor, regions suspected to contain residual cancer, which the surgeon can then remove. The process adds less than 7 minutes on average to the procedure.
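
For rough intuition about what real-time guidance software of this kind must do, the sketch below flags unusually bright, contiguous regions in a single fluorescence frame. This is not Lumicell’s detection algorithm; the smoothing, threshold, and minimum-region-size values are arbitrary assumptions chosen for the example.

```python
# Generic bright-region flagging in a fluorescence image (illustrative only).
import numpy as np
from scipy import ndimage

def flag_bright_regions(fluorescence, sigma=2.0, z_thresh=4.0, min_pixels=25):
    """Return a boolean mask of connected regions whose smoothed signal sits
    well above the frame's background distribution."""
    smoothed = ndimage.gaussian_filter(fluorescence.astype(float), sigma)
    z = (smoothed - smoothed.mean()) / (smoothed.std() + 1e-9)
    mask = z > z_thresh
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    keep = [i + 1 for i, s in enumerate(sizes) if s >= min_pixels]
    return np.isin(labels, keep)

# Synthetic 256x256 frame with one bright spot standing in for residual signal.
frame = np.random.poisson(10, (256, 256)).astype(float)
frame[100:110, 150:160] += 80
print("flagged pixels:", int(flag_bright_regions(frame).sum()))
```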

“The technology we developed allows the surgeon to scan the actual cavity, whereas pathology only looks at the lump removed, and [pathologists] make their assessment based on looking at about 1 or 2 percent of the surface area,” Lee says. “Not only are we detecting cancer that was left behind to potentially eliminate second surgeries, we are also, very importantly, finding cancer in some patients that wouldn’t be found in pathology and may not generate a second surgery.”

Exploring other cancer types

Lumicell is currently exploring whether its imaging agent is activated in other tumor types, including prostate, sarcoma, esophageal, and gastric cancers, among others.

Lee ran Lumicell between 2008 and 2020. After stepping down as CEO, he decided to return to MIT to get his PhD in neuroscience, a full 50 years since he earned his master’s. Shortly thereafter, Howard Hechler took over as Lumicell’s president and chief operating officer.

Looking back, Griffith credits MIT’s culture of learning for the formation of Lumicell.

“People like David [Lee] and Moungi care about solving problems,” Griffith says. “They’re technically brilliant, but they also love learning from other people, and that’s what makes MIT special. People are confident about what they know, but they are also comfortable in that they don’t know everything, which drives great collaboration. We work together so that the whole is bigger than the sum of the parts.”


