The Future of Cloud Security: 20 Statistics & Trends to Track

EXECUTIVE SUMMARY:

In a decade driven by digital transformation, the increased reliance on cloud computing has presented unprecedented opportunities for businesses, enabling scalability and efficiency. However, the shift to the cloud has also introduced challenges, particularly around cyber security.

As you strategize and prepare for the remainder of the year, it’s crucial to maintain an in-depth understanding of the contemporary cloud security landscape. In this article, explore 20 statistics and trends that will keep your organization up-to-date and ready to combat tomorrow’s threats.

The future of cloud security: Statistics and trends

1. Cloud breaches on the rise. Seventy-nine percent of companies have contended with at least one cloud breach in the last 18 months, underscoring the urgent need for more robust cloud security measures.

2. It’s tougher for large businesses. Nearly a third of all large businesses say that securing cloud assets and resources is a major challenge, highlighting the heightened complexity that organizations with extensive digital footprints face.

3. Expanding volume of cloud data. By the end of 2025, experts predict that the cloud will host a staggering 100 zettabytes of data. This means that cloud storage demands will soar, as will the need for unified and automated cloud security.

4. Ubiquity of cloud adoption. Ninety-two percent of organizations already have some of their IT environment hosted in the cloud. In other words, cloud infrastructure has been embraced, and organizations need to keep proactively expanding cloud security initiatives.

5. Sensitive data in the cloud. Nearly 50% of businesses use the cloud to store sensitive data, both encrypted and unencrypted. Organizations must prioritize effective encryption practices to mitigate the risks associated with unauthorized access and breaches.

6. Real-time security assessments lagging. Only one in five organizations assesses its overall cloud security posture in real time, indicating that many organizations have a gap in their security measures.

7. Challenges in multi-cloud environments. Eighty-six percent of organizations experience issues while managing data in multi-cloud environments. Streamline and simplify your multi-cloud security this year.

8. Insecure APIs a key concern. In one survey, more than 50% of cyber security experts cited insecure APIs as a critical cloud security concern. Strengthening API security should be a focal point of any cloud security strategy.

9. Cost of breaches in hybrid environments. In 2023, the average cost of a breach in a hybrid cloud environment was $3.61 million, with projections indicating a continued, severe financial impact caused by security lapses.

10. Automation in cloud security. Seventy-eight percent of companies currently assess cloud security through automation. As the year progresses, an increasing number of organizations should, and likely will, implement automated cloud security solutions (a minimal sketch of one such check follows this list).

11. Rise of Zero Trust approach. Eighty percent of enterprises have stated that they are considering, evaluating or deploying Zero Trust. This reflects a shift towards more stringent cyber security models.

12. Demand for comprehensive assessments. Nearly 80% of organizations are seeking a more comprehensive cloud security assessment. This trend highlights increased awareness of cloud security challenges and a growing demand for holistic security solutions.

13. Targeting newer cloud technologies. In 2024, cyber security experts anticipate more frequent attacks that target newer cloud technologies, such as container-based resources. Be sure to leverage corresponding security best practices.

14. Cloud security spending forecast. Cloud security is expected to remain the fastest-growing area of security and risk management spending this year. In other words, organizations appear to be prioritizing cloud security investments.

15. Projected spending in 2024. Cloud security spending in 2024 is predicted to reach $7 billion, signifying a substantial financial commitment to fortifying cloud infrastructure.

16. AI-driven cloud management. In 2024, AI-driven cloud management is expected to shift from novelty to norm, enabling organizations to enhance security performance and advance overall cloud operations.

17. Focus on serverless security. As serverless computing expands, organizations need to pay greater attention to its security by adopting serverless security tools and best practices.

18. Continued growth of Kubernetes. In 2024, Kubernetes is expected to maintain growth and relevance, pointing to its importance in container orchestration and management.

19. Increased cloud threat detection spending. Nearly 90% of organizations expect to increase spending on cloud detection and response (CDR).

20. Adaptability as a key theme. Organizations will likely need to cultivate new mindsets and expand efforts to secure sensitive data amid evolving threats. Adaptability is central to ensuring resilient, future-proof cloud security strategies.
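
To make the automation theme above concrete, here is a minimal sketch of what one automated posture check might look like. It assumes an AWS environment with the boto3 SDK installed and credentials already configured; the specific control shown, flagging S3 buckets that lack a fully enabled public access block, is only one illustrative example of the kind of check a scheduled job could run, not a complete posture assessment.

```python
# Minimal sketch of an automated cloud posture check (illustrative only).
# Assumes AWS credentials are configured and boto3 is installed.
import boto3
from botocore.exceptions import ClientError


def find_buckets_missing_public_access_block() -> list[str]:
    """Return names of S3 buckets without a fully enabled public access block."""
    s3 = boto3.client("s3")
    flagged = []
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            config = s3.get_public_access_block(Bucket=name)[
                "PublicAccessBlockConfiguration"
            ]
            # Flag the bucket unless every public-access protection is enabled.
            if not all(config.values()):
                flagged.append(name)
        except ClientError as err:
            if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
                flagged.append(name)  # no public access block configured at all
            else:
                raise
    return flagged


if __name__ == "__main__":
    for name in find_buckets_missing_public_access_block():
        print(f"Bucket without a full public access block: {name}")
```

In practice, a check like this would run on a schedule, cover many more controls, and feed its findings into a dashboard or ticketing system rather than printing to standard output.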

Conclusion

Ready for what’s next? These statistics and insights point to a few key takeaways: organizations need to be proactive and to adopt comprehensive cloud security solutions.

But more than that, cloud security success hinges on the ability to stay agile, innovative, and committed to continuous improvement.

More future of cloud security resources

  • 10 cloud security essentials 2024 – Read article
  • CEO Gil Shwed explains how Check Point CloudGuard works – Watch video
  • Discover advanced cloud security product information – Here

MIT Faculty Founder Initiative announces finalists for second competition

The MIT Faculty Founder Initiative has announced 12 finalists for the 2023-24 MIT-Royalty Pharma Prize Competition. The competition, which is supported by Royalty Pharma, aims to support female faculty entrepreneurs in biotechnology and provide them with resources to help take their ideas to commercialization. 

“We are building a playbook to get inventions out of the lab towards impacting patients by connecting female faculty to the innovation ecosystem and creating a community of peers,” says Sangeeta Bhatia, the John J. and Dorothy Wilson Professor of Health Sciences and Technology and of Electrical Engineering and Computer Science (EECS), and faculty director of the MIT Faculty Founder Initiative.

Throughout the academic year, finalists for the prize competition will receive support through a number of events, workshops, and programs. These activities focus on topics ranging from executive education classes in entrepreneurship to intellectual property and fundraising strategy. Participants also have access to over 50 best-in-class executives, investors, and advisors who have volunteered to provide mentorship and guidance to the finalists as they further develop their startup ideas.

This spring, the cohort will pitch their ideas to a selection committee of faculty, biotech founders, and venture capitalists. The grand prize winner will receive $250,000 in discretionary funds, and the breakthrough science award winner and runner-up award winner will each receive $100,000. The winners will be announced at a showcase event on May 2, at which the entire cohort will share their work. All participants also receive a $10,000 stipend for participating in the competition.

“The support the MIT Faculty Founder Initiative provides female entrepreneurs in biotech is tremendous. Participants receive truly invaluable guidance from some of the world’s top experts to help hone their ideas and launch companies that have the potential to make a real impact in the biotech space,” adds Anantha Chandrakasan, dean of the School of Engineering and the Vannevar Bush Professor of Electrical Engineering and Computer Science.  

The MIT Faculty Founder Initiative was launched in 2020 by the MIT School of Engineering, in collaboration with the Martin Trust Center for MIT Entrepreneurship. The idea for the program stemmed from a research project Bhatia conducted alongside Susan Hockfield, MIT Corporation life member, MIT president emerita, and professor of neuroscience, and Nancy Hopkins, professor emerita of biology. The team discovered that of the 250 biotech startups created by MIT professors, fewer than 10 percent had been founded by women, who made up 22 percent of all faculty.

In their research, the team estimated that if female faculty founded startups at the same rate as their male counterparts, there would be 40 more biotech companies.

“What that means is 40 more potential medicines. The societal impact of that is really important. It’s a lost opportunity,” says Bhatia, who co-wrote an editorial in Science alongside Hopkins and Hockfield.

In 2021, the Faculty Founder Initiative launched its first prize competition, which was supported by Northpond Ventures. Nine finalists pitched their ideas, with Ellen Roche, Latham Family Career Development Professor, an associate professor of mechanical engineering, and a core faculty member of the Institute for Medical Engineering and Science (IMES), taking the grand prize. Eight of the nine participants have continued on their entrepreneurial journey.

The second prize competition cohort includes researchers affiliated with MIT as well as Brown University.

“We are thrilled to be supporting the 2023-2024 MIT-Royalty Pharma Prize Competition and this cohort of 12 brilliant researchers. Their ideas can lead to transformative solutions for patients around the world,” says Pablo Legorreta, founder and CEO of Royalty Pharma.

The 2023-24 finalists include:

  • Anne Carpenter, institute scientist at the Broad Institute of MIT and Harvard, serves as the senior director of the Imaging Platform. She is an expert in developing and applying methods for extracting quantitative information from biological images, especially in a high-throughput manner. Her group’s open-source CellProfiler software is used by thousands of biologists worldwide and their Cell Painting assay has been adopted throughout the pharma industry to accelerate drug discovery. Carpenter earned a BS from Purdue University and a PhD from the University of Illinois at Urbana-Champaign.
     
  • Kareen Coulombe, associate professor of engineering, is the director of graduate studies in biomedical engineering at Brown University and leads the Coulombe Lab for Heart Regeneration and Health. She studies cardiac regenerative medicine — from fundamentals of tissue formation and contractility to integration with the host heart — to develop translational therapies for heart disease patients around the world. Coulombe received a BS from the University of Rochester and a PhD from the University of Washington.
     
  • Betar Gallant, Class of 1922 Career Development Professor and associate professor of mechanical engineering, leads the Gallant Energy and Carbon Conversion Lab. Her research focuses on advanced battery chemistries and materials for high-energy rechargeable and primary batteries. She is also developing insights into reaction mechanisms that underpin advanced greenhouse gas mitigation technologies. Gallant received her BS, master’s degree, and PhD from MIT.
     
  • Carolina Haass-Koffler, associate professor of psychiatry and human behavior and associate professor of behavioral and social sciences at Brown University, is the chief of Brown’s Clinical Neuroscience Lab. As a translational investigator, she combines preclinical and clinical research to examine bio-behavioral mechanisms of addiction and develop novel therapeutic interventions. Haass-Koffler received her BS from the University of California at Berkeley, her PharmD from the University of California at San Francisco, and her PhD from Università di Camerino.
     
  • Stephanie Jones is a professor of neuroscience at Brown University. Her research integrates human brain imaging and computational neuroscience methods to study brain dynamics in health and disease. She aims to develop biophysically principled models of neural circuits that bridge electrophysiological measures of brain function to the underlying cellular and network level dynamics. Jones received a BS and master’s degree in mathematics from Boston College, and a PhD in mathematics from Boston University, followed by neuroscience training at Massachusetts General Hospital (MGH). 
     
  • Laura Lewis is the Athinoula A. Martinos Associate Professor of IMES and EECS at MIT, principal investigator in the Research Laboratory of Electronics at MIT, and an associate faculty member at the Martinos Center for Biomedical Imaging at MGH. Lewis focuses on neuroimaging approaches that better map brain function, with a particular focus on sleep. She is developing computational and signal processing approaches for neuroimaging data and applying these tools to study how neural computation is dynamically modulated across sleep, wake, attentional, and affective states. Lewis earned a BS at McGill University and a PhD at MIT.
     
  • Frederike Petzschner is an assistant professor at the Carney Institute for Brain Science at Brown University. She also serves as the director of the Brainstorm Program, an incubator program that accelerates the translation of computational brain science to clinical applications and commercialization. She and her team at the PEAC (Psychiatry, Embodiment, and Computation) Lab study the latent cognitive processes that underpin perception and decision-making in both healthy individuals and those suffering from obsessive-compulsive disorder, addiction, and, most recently, chronic pain. The group recently launched SOMA, a digital tool designed to assist individuals with chronic pain. Petzschner received a BS and MS from the University of Würzburg and a PhD from Ludwig-Maximilians University in Munich.
     
  • Theresa Raimondo is an assistant professor of engineering at Brown University. Her research broadly centers on the design of RNA-lipid nanoparticles (LNPs) for therapeutic applications. By modulating both the RNA molecule (structure and sequence) and the lipid nanoparticle formulation, her team can deliver RNA-LNPs to immune cells in vivo for immunotherapy. In this application, siRNA-LNPs are used as a novel cancer checkpoint inhibitor therapy. Raimondo received a BS from Brown University and an MS and PhD from Harvard University.
     
  • Ritu Raman, the Brit (1961) and Alex (1949) d’Arbeloff Career Development Professor in Engineering Design and assistant professor of mechanical engineering at MIT, designs adaptive living materials powered by assemblies of living cells for applications ranging from medicine to machines. Currently, she is focused on building living neuromuscular tissues to advance understanding of human disease and restore mobility after injury or trauma. Raman received a BS from Cornell University and an MS and PhD as an NSF Fellow from the University of Illinois at Urbana-Champaign.
     
  • Deblina Sarkar, the AT&T Career Development Professor and assistant professor of media arts and sciences at MIT, is the founder and director of the Nano-Cybernetic Biotrek research group. She conducts transdisciplinary research fusing engineering, applied physics, and biology, aiming to bridge the gap between nanotechnology and synthetic biology to develop disruptive technologies for nanoelectronic devices and create new paradigms for life-nanomachine symbiosis. She received a BTech from the Indian Institute of Technology and an MS and PhD from the University of California at Santa Barbara.
     
  • Jessica Stark starts this month as an assistant professor in the departments of Biological Engineering and Chemical Engineering and at the Koch Institute for Integrative Cancer Research at MIT. She develops biological technologies to realize the largely untapped potential of glycans for immunological discovery and immunotherapy. Stark received a BS from Cornell University and a PhD from Northwestern University.
     
  • Joelle Straehla is a Charles W. (1995) and Jennifer C. Johnson Clinical Investigator at the Koch Institute, a pediatric oncologist at Dana-Farber/Boston Children’s Cancer and Blood Disorders Center, and an instructor of pediatrics at Harvard Medical School. She conducts research at the intersection of nanotechnology and systems biology with the ultimate goal of accelerating cancer nanomedicine translation. She received a BS from the University of Florida and an MD from Northwestern University.

Q&A: What sets the recent Japan earthquake apart from others?

On Jan. 1, a magnitude 7.6 earthquake struck the western side of Japan on the Noto Peninsula, killing over 200 people. Japan is prone to earthquakes, including a magnitude 9.1 earthquake in 2011 that triggered a tsunami and killed almost 20,000 people.

William Frank, the Victor P. Starr Career Development Professor in the Department of Earth, Atmospheric and Planetary Sciences at MIT, has been studying an earthquake swarm in the region where the most recent earthquake occurred. He explains the difference between subduction earthquakes and earthquake swarms, and why the unknown nature of these swarms makes predictions hard.

Q: Why is Japan prone to earthquakes?

A: Japan is prone to earthquakes because it is at the western edge of the Pacific plate, at a complicated junction where two plates are subducting, or plunging, beneath the tectonic plate that Japan is sitting on. It’s at the interface between those plates where you’re going to have a lot of earthquakes, because you’re generating stress as the plates move past one another.

But interestingly, this earthquake was not due to subduction. It’s on the west coast of the island, and the subduction zones are on the east coast. There are still a lot of active tectonics that are not related to subduction. This one place is enigmatic [as to] why there are so many earthquakes, but there’s been this earthquake swarm happening there since 2020. This latest earthquake is the latest big earthquake in the swarm.

This map shows the shaking intensity surrounding the earthquake’s epicenter as colored contour lines, while the box over the Noto Peninsula designates the slip amplitude in the area. (Image courtesy of the U.S. Geological Survey)

Q: What is an earthquake swarm, and how can you tell this earthquake is a part of it?

A: Normally you have the big earthquake, what we call the mainshock, that is followed by a sequence of aftershocks. But in a swarm, there’s no clear mainshock because there’s a lot of earthquakes before and there’s also a lot of earthquakes afterwards. Often there will just be one earthquake that will be bigger than the rest sometime within that swarm duration.

Earthquake swarms are typically around plate boundaries. There’s a lot of them in subduction zones but not only [there] — there are also earthquake swarms, for example, in Southern California. These can last days, months, years. We call it a swarm because it’s generating many more earthquakes than we’d expect from that region, in sustained activity, for the past few years.

Q: How can you tell a swarm from general seismic activity in the region?

A: It’s not obvious; it’ll take some time before you realize that what’s happening now is not what was happening previously. Typically, it’s something that ramps up, attains some sort of sustained activity level, and then ramps back down.

Q: Tying it into the 2011 earthquake, which caused significantly more damage, what makes an earthquake more damaging than others?

A: The 2011 earthquake was the subduction of the Pacific plate beneath Japan. In there, you have a lot of fault real estate that can rupture altogether and generate a magnitude 9 earthquake. That earthquake was offshore of Japan, so the shaking was strong, but the biggest damage came from the tsunami. The sudden motion of the seafloor moved the water on top, and that created a big tsunami that then caused its own set of aftereffects and damage to the coast of Japan.

For this earthquake on the Noto Peninsula, because it was beneath the land, it’s not going to have that sudden uplift of the water on top to feed a tsunami. After the New Year’s Day earthquake, the Japanese authorities initially put out a bunch of tsunami alerts, but eventually removed them when they realized that the event wasn’t expected to generate the motion necessary for a tsunami. Depending on the tectonic context, an earthquake may or may not generate a tsunami, and the tsunami is often the hazard that causes the most damage.

Q: What can these swarms tell us about future activity in the region?

A: Going back to the mainshock/aftershock earthquake sequence, we know that there’s going to be an elevated rate [of activity] for the next few days or months, and that these earthquakes are going to happen in the general region of where that big earthquake happened.

For a swarm, because we don’t understand it as well, we don’t have a clear idea of what’s going to happen. Sometimes we’ve seen swarms that are actually stopped by a big earthquake, and then there’s nothing else afterwards — it sort of shuts down the system. Sometimes it’s just the biggest earthquake in a long sequence of earthquakes.

Q: You’ll often hear people talking about big earthquakes being foreshocks or predictors for bigger earthquakes to come. Do we need to be worried about this being a foreshock?

A: When we are thinking about something along the lines of what you just mentioned, it’s because we’re thinking about the earthquake budget along a tectonic plate boundary. On a boundary, we know the relative motion and we know that the plates are pretty much rigid, so that all the motion is being accommodated at the interface between the two plates. That gives us some budget for how these plates are going to move over a long period of time.

Let’s say, for example, there’s a magnitude 7, but we know that there’s potentially enough slip budget for a magnitude 8. Then maybe that magnitude 7 will change the stress state of that tectonic environment and make it so that the 8 comes sooner than it would have if the 7 hadn’t happened.

But that’s when we put ourselves within the context of a tectonic plate interface, like the subduction zones off the coast of East Japan. For this swarm, we don’t have a good idea beforehand of what the actual structures are that are going to host the earthquakes. Because we don’t have a good idea of where the earthquakes can potentially happen, we can’t use that simple model of a slip budget along a fault. Until we have a better understanding of which structures are hosting the earthquakes and the relative motion we expect on those over a long period of time, we can’t really forecast what will happen.

Generating the policy of tomorrow

As first-year students in the Social and Engineering Systems (SES) doctoral program within the MIT Institute for Data, Systems, and Society (IDSS), Eric Liu and Ashely Peake share an interest in investigating housing inequality issues.

They also share a desire to dive head-first into their research.

“In the first year of your PhD, you’re taking classes and still getting adjusted, but we came in very eager to start doing research,” Liu says.

Liu, Peake, and many others found an opportunity to do hands-on research on real-world problems at the MIT Policy Hackathon, an initiative organized by students in IDSS, including the Technology and Policy Program (TPP). The weekend-long, interdisciplinary event — now in its sixth year — continues to gather hundreds of participants from around the globe to explore potential solutions to some of society’s greatest challenges.

This year’s theme, “Hack-GPT: Generating the Policy of Tomorrow,” sought to capitalize on the popularity of generative AI (like the chatbot ChatGPT) and the ways it is changing how we think about technical and policy-based challenges, according to Dansil Green, a second-year TPP master’s student and co-chair of the event.

“We encouraged our teams to utilize and cite these tools, thinking about the implications that generative AI tools have on their different challenge categories,” Green says.

After 2022’s hybrid event, this year’s organizers pivoted back to a virtual-only approach, allowing them to increase the overall number of participants in addition to increasing the number of teams per challenge by 20 percent.

“Virtual allows you to reach more people — we had a high number of international participants this year — and it helps reduce some of the costs,” Green says. “I think going forward we are going to try and switch back and forth between virtual and in-person because there are different benefits to each.”

“When the magic hits”

Liu and Peake competed in the housing challenge category, where they could gain research experience in their actual field of study. 

“While I am doing housing research, I haven’t necessarily had a lot of opportunities to work with actual housing data before,” says Peake, who recently joined the SES doctoral program after completing an undergraduate degree in applied math last year. “It was a really good experience to get involved with an actual data problem, working closer with Eric, who’s also in my lab group, in addition to meeting people from MIT and around the world who are interested in tackling similar questions and seeing how they think about things differently.”

Joined by Adrian Butterton, a Boston-based paralegal, as well as Hudson Yuen and Ian Chan, two software engineers from Canada, Liu and Peake formed what would end up being the winning team in their category: “Team Ctrl+Alt+Defeat.” They quickly began organizing a plan to address the eviction crisis in the United States.

“I think we were kind of surprised by the scope of the question,” Peake laughs. “In the end, I think having such a large scope motivated us to think about it in a more realistic kind of way — how could we come up with a solution that was adaptable and therefore could be replicated to tackle different kinds of problems.”

Watching the challenge on the livestream together on campus, Liu says they immediately went to work, and could not believe how quickly things came together.

“We got our challenge description in the evening, came out to the purple common area in the IDSS building and literally it took maybe an hour and we drafted up the entire project from start to finish,” Liu says. “Then our software engineer partners had a dashboard built by 1 a.m. — I feel like the hackathon really promotes that really fast dynamic work stream.”

“People always talk about the grind or applying for funding — but when that magic hits, it just reminds you of the part of research that people don’t talk about, and it was really a great experience to have,” Liu adds.

A fresh perspective

“We’ve organized hackathons internally at our company and they are great for fostering innovation and creativity,” says Letizia Bordoli, senior AI product manager at Veridos, a German-based identity solutions company that provided this year’s challenge in Data Systems for Human Rights. “It is a great opportunity to connect with talented individuals and explore new ideas and solutions that we might not have thought about.”

The challenge provided by Veridos was focused on finding innovative solutions to universal birth registration, something Bordoli says only benefited from the fact that the hackathon participants were from all over the world.

“Many had local and firsthand knowledge about certain realities and challenges [posed by the lack of] birth registration,” Bordoli says. “It brings fresh perspectives to existing challenges, and it gave us an energy boost to try to bring innovative solutions that we may not have considered before.”

New frontiers

Alongside the housing and data systems for human rights challenges was a challenge in health, as well as a first-time opportunity to tackle an aerospace challenge in the area of space for environmental justice.

“Space can be a very hard challenge category to do data-wise since a lot of data is proprietary, so this really developed over the last few months with us having to think about how we could do more with open-source data,” Green explains. “But I am glad we went the environmental route because it opened the challenge up to not only space enthusiasts, but also environment and climate people.”

One of the participants to tackle this new challenge category was Yassine Elhallaoui, a system test engineer from Norway who specializes in AI solutions and has 16 years of experience working in the oil and gas fields. Elhallaoui was a member of Team EcoEquity, which proposed an increase in policies supporting the use of satellite data to ensure proper evaluation and increase water resiliency for vulnerable communities.

“The hackathons I have participated in in the past were more technical,” Elhallaoui says. “Starting with [MIT Science and Technology Policy Institute Director Kristen Kulinowski’s] workshop about policy writers and the solutions they came up with, and the analysis they had to do … it really changed my perspective on what a hackathon can do.”

“A policy hackathon is something that can make real changes in the world,” she adds.

Faculty, staff, students to evaluate ways to decarbonize MIT’s campus

With a goal to decarbonize the MIT campus by 2050, the Institute must look at “new ideas, transformed into practical solutions, in record time,” as stated in “Fast Forward: MIT’s Climate Action Plan for the Decade.” This charge calls on the MIT community to explore game-changing and evolving technologies with the potential to move campuses like MIT away from carbon emissions-based energy systems.

To help meet this tremendous challenge, the Decarbonization Working Group — a new subset of the Climate Nucleus — recently launched. Composed of appointed MIT faculty, researchers, and students, the working group is leveraging its members’ expertise to meet the charge of exploring and assessing existing and in-development solutions to decarbonize the MIT campus by 2050. The group is specifically charged with informing MIT’s efforts to decarbonize the campus’s district energy system.

Co-chaired by Director of Sustainability Julie Newman and Department of Architecture Professor Christoph Reinhart, the working group includes members with deep knowledge of low- and zero-carbon technologies and grid-level strategies. In convening the group, Newman and Reinhart sought out members researching these technologies as well as exploring their practical use. “In my work on multiple projects on campus, I have seen how cutting-edge research often relies on energy-intensive equipment,” shares PhD student and group member Ippolyti Dellatolas. “It’s clear how new energy-efficiency strategies and technologies could use campus as a living lab and then broadly deploy these solutions across campus for scalable emissions reductions.” This approach is one of MIT’s strong suits and a recurring theme in its climate action plans — using the MIT campus as a test bed for learning and application. “We seek to study and analyze solutions for our campus, with the understanding that our findings have implications far beyond our campus boundaries,” says Newman.

The efforts of the working group represent just one part of the multipronged approach to identify ways to decarbonize the MIT campus. The group will work in parallel and at times collaboratively with the team from the Office of the Vice President for Campus Services and Stewardship that is managing the development plan for potential zero-carbon pathways for campus buildings and the district energy system. In May 2023, MIT engaged Affiliated Engineers, Inc. (AEI), to support the Institute’s efforts to identify, evaluate, and model various carbon-reduction strategies and technologies to provide MIT with a series of potential decarbonization pathways. Each of the pathways must demonstrate how to manage the generation of energy and its distribution and use on campus. As MIT explores electrification, a significant challenge will be the availability of resilient clean power from the grid to help generate heat for our campus without reliance on natural gas.

When the Decarbonization Working Group began work this fall, members took the time to learn more about current systems and baseline information. Beginning this month, members will organize analysis around each of their individual areas of expertise and interest and begin to evaluate existing and emerging carbon reduction technologies. “We are fortunate that there are constantly new ideas and technologies being tested in this space and that we have a committed group of faculty working together to evaluate them,” Newman says. “We are aware that not every technology is the right fit for our unique dense urban campus, and nor are we solving for a zero-carbon campus as an island, but rather in the context of an evolving regional power grid.”

Supported by funding from the Climate Nucleus, the working group’s technology evaluations will include site visits to locations where priority technologies are currently deployed or being tested. These site visits may range from university campuses implementing district geothermal and heat pumps to test sites of deep geothermal or microgrid infrastructure manufacturers. “This is a unique moment for MIT to demonstrate leadership by combining best decarbonization practices, such as retrofitting building systems to achieve deep energy reductions and converting to low-temperature district heating systems with ‘nearly there’ technologies such as deep geothermal, micronuclear, energy storage, and ubiquitous occupancy-driven temperature control,” says Reinhart. “As first adopters, we can find out what works, allowing other campuses to follow us at reduced risks.”

The findings and recommendations of the working group will be delivered in a report to the community at the end of 2024. There will be opportunities for the MIT community to learn more about MIT’s decarbonization efforts at community events on Jan. 24 and March 14, as well as MIT’s Sustainability Connect forum on Feb. 8.

Zev Farbman, Co-Founder & CEO at Lightricks – Interview Series

Zev Farbman is the Co-Founder & CEO at Lightricks, a pioneer in innovative technology that bridges the gap between imagination and creation. As an AI-first company, with a mission to build an innovative photo and video creation platform, they aim to enable content creators and brands…

AlphaGeometry: How DeepMind’s AI Masters Geometry Problems at Olympian Levels?

In the ever-evolving landscape of artificial intelligence, the conquest of cognitive abilities has been a fascinating journey. Mathematics, with its intricate patterns and creative problem-solving, stands as a testament to human intelligence. While recent advancements in language models have excelled in solving word problems, the realm…

Nier Reincarnation Shuts Down In April

Nier Reincarnation, the story-driven mobile RPG set in the popular Nier universe, will end service this April. While the game will continue to receive support until then, its final story campaign will serve as its last big hurrah.

Reincarnation first launched on July 28, 2021, for iOS and Android and takes place at an unspecified point in the Nier timeline. Like the previous titles, Reincarnation is an RPG set within a mysterious prison world called The Cage, with players controlling different protagonists divided across multiple story arcs. The game has also hosted several crossover events with games such as the remake of Nier Replicant, Final Fantasy XIV, and Persona 5.

The shutdown news came the same day as the launch of the latest chapter of its final story arc, The People And The World, dubbed Act II: The Return. Check out the trailer below.

In a shutdown notice, Square Enix states, 

We regret to inform you that NieR Re[in]carnation will be ending service with the conclusion of The People and the World.

The final chapter of The People and the World is planned to be released Mar. 28, 2024. The game will run for a month following the final chapter’s release until Apr. 29, 2024 22:00 PST when service will officially end. We would like to express our deepest gratitude to our players for your patronage over the 2.5 years since we launched on July 28, 2021.

Until the end of service, we will continue to add content and characters, as well as hold various events and campaigns, so we hope you will continue to enjoy NieR Re[in]carnation until the end.

Until service concludes, Square Enix will offer 10 free summons per day, weekly gem gifts, and other perks. Sales of items on the Premium Shop have ceased. You can read the full rollout of final updates and events in Square Enix’s blog post.

Former editor Jason Guisao was impressed with Nier Reincarnation, hailing it in an opinion piece as one of the strongest mobile titles available, writing that “Lengthy narratives and skill-based gameplay loops have come in console/PC-ports like Genshin Impact and PUBG Mobile. Nier Reincarnation, however, employs a happy medium. Heartfelt plotlines with striking visuals, Keiichi Okabe’s mesmerizing score, unintrusive microtransactions, and simple, but rewarding, combat controls establish Reincarnation as one of the best console-like games on mobile devices.”

The Legend Of Zelda, Super Mario Composer Koji Kondo To Be Inducted Into AIAS Hall Of Fame

Koji Kondo, longtime composer for The Legend of Zelda and Super Mario Bros. franchises, will be inducted into the Academy of Interactive Arts & Sciences Hall of Fame at this year’s D.I.C.E. Awards. The presentation will take place at the 27th annual D.I.C.E. Awards in Las Vegas, Nevada, during the ceremony on Thursday, February 15, at 8 p.m. PT/11 p.m. ET.

“I am deeply thankful for being selected by D.I.C.E. for this important award,” Kondo writes in a press release. “It is a true honor to be recognized this way, and I am extremely humbled. Thanks to the help from the many people surrounding me and the support from our customers and fans, I was fortunate enough to be involved in game music development for decades.  I am grateful for everyone who helped and supported me.

“I will continue my efforts in the music and sound aspects of development to hopefully make everyone’s game experience even more enjoyable for years to come.” 

Meggan Scavio, president of AIAS, says Kondo’s musical work is timeless and impactful, spanning multiple generations of fans across the world. Scavio adds, “For over 39 years, he has delighted us all in the industry with sounds and songs that are recognized by so many fans and became a pop culture phenomenon. We are honored to be able to induct Kondo-san into the Hall of Fame.” 

Kondo was born on August 13, 1961, in Nagoya, in Japan’s Aichi prefecture. He began working for Nintendo in April 1984 after graduating from the Osaka University of Arts, according to a press release. At Nintendo, Kondo was responsible for sound programming, music, and sound effects for what are now the company’s two biggest franchises, Super Mario Bros. and The Legend of Zelda.

Kondo’s portfolio includes musical composition for The Legend of Zelda: Ocarina of Time, Super Mario Sunshine, New Super Mario Bros., Super Mario Galaxy, The Legend of Zelda: Skyward Sword, and last year’s Super Mario Bros. Wonder, amongst others. Today, he is the senior officer of Nintendo’s entertainment planning and development division. 

After his induction into the AIAS Hall of Fame, Kondo will join the likes of Ed Boon, Tim Schafer, Connie Booth, Todd Howard, Hideo Kojima, Leslie Benzies, Bonnie Ross, Dan and Sam Houser, and Tim Sweeney, to name a few. 

For more about the composer and the music he’s created, read Game Informer’s interview with Koji Kondo from last year here, and then check out this story about an upcoming Legend of Zelda orchestral concert Nintendo is uploading to YouTube next month. After that, read Game Informer’s Super Mario Bros. Wonder review.


What’s your favorite Koji Kondo song? Let us know in the comments below!