Rapid AI Advances Spotlight Critical Global Tech Skills Shortage

For a perfect example of just how quickly technology evolves, look no further than ChatGPT. While artificial intelligence, chatbots, and virtual assistants were hardly new concepts prior to ChatGPT launching, it managed to take the conversation to the next level. Today, it seems like AI is…

Generative AI and Robotics: Are We on the Brink of a Breakthrough?

Imagine a world where robots can compose symphonies, paint masterpieces, and write novels. This fascinating fusion of creativity and automation, powered by Generative AI, is not a dream anymore; it is reshaping our future in significant ways. The convergence of Generative AI and robotics is leading…

The societal implications of digital WMDs – CyberTalk

Bryan Neilson is an experienced Cyberspace & Intelligence Operations professional who built his career supporting Cyberspace Operations, Intelligence Collection, and Counterintelligence for the U.S. Intelligence Community. Bryan’s work, which has spanned the globe, can be directly tied to saving the lives of countless officers and assets, enabling kinetic military objectives, and helping to build and maintain the strategic advantage of the United States throughout cyberspace and beyond. Fusing his proficiencies in Cyberspace Operations and Human Intelligence, Bryan has become a trailblazer in his industry and has brought his unique expertise to Check Point Software Technologies – where he serves as Regional Architect, Evangelist, and global Subject Matter Expert in sophisticated cyberspace and intelligence tradecraft.

In the last several days, the cyber security industry has been rocked by a rare acknowledgement from U.S. government officials regarding the likelihood of extensive compromise of U.S. critical infrastructure by specific state-sponsored hacking groups. In this public pronouncement, the United States’ National Security Agency (NSA) revealed the extent to which it (and other federal agencies) believes that specific nation-state sponsored actors have been actively and successfully engaging in broad campaigns to compromise various systems controlling critical infrastructure components within the U.S.

It has been a long-held belief among many cyberspace professionals that sophisticated state-enabled offensive actors have been actively and covertly compromising various critical infrastructure systems and networks across the United States and its allies – activity that has been ongoing for several years. Nevertheless, these public statements from the NSA – an organization known for keeping such issues and ‘troubles’ concealed from the general public – suggest mounting concerns among U.S. intelligence, military, diplomatic, and congressional officials.

Furthermore, U.S. officials have noted how this observed ‘buildup’ is predominantly targeting critical infrastructure systems of little to no intelligence value, raising alarm that the sole motivation behind this activity is to gain a strategic advantage (the ability to disrupt U.S. and allied critical infrastructure) in the event conflicts arise.

Since early 2023, when the NSA and Microsoft collaboratively identified and publicly revealed the existence of China’s Volt Typhoon program and alluded to the extent to which this mission had gained strategic access among critical infrastructure, worry throughout Washington has been mounting. The primary concerns are three-fold: A) strategic pre-positioning and control over U.S. critical infrastructure represents a substantive threat to the United States government, economy, and society; B) such wide-ranging pre-positioning has the potential to fundamentally shift the balance of power and displace the United States’ strategic advantage and dominance within the Cyberspace Domain; and C) such pre-positioning activity provides adversarial nation states with a “first-strike” capacity against the United States. These concerns have been echoed by Air Force General Timothy Haugh (Commander of U.S. Cyber Command and the top military official in the United States for cyberspace) in a telling statement made to the Washington Times: “We see attempts to be latent in a network that is critical infrastructure, that has no intelligence value, which is why it is so concerning.”

Recent public statements from the NSA and the subsequent comments from the Commander of U.S. Cyber Command paint a rather bleak picture for the continuing security of United States critical infrastructure – and in turn, the future stability and resiliency of the U.S. government, economy, and society. Nevertheless, it is imperative to remember that the pre-positioning activity some U.S. adversaries are being accused of is neither new nor unprecedented, nor is it, legally speaking, an act of outright hostility. Many countries with at least moderately sophisticated cyberspace operations capacities are actively engaged in the premeditated, organized, nation-sanctioned, and clandestine compromise of systems and networks for the sole purpose of gaining a strategic advantage over their adversaries – the United States being no exception. Lacking the critical element of direct and overt hostility, such activity is predominantly viewed and handled in the same manner as espionage, rather than as actions indicative of war.

Chartered, in part, with maintaining and increasing the strategic advantage and dominance the United States has long held throughout the cyberspace domain, U.S. Cyber Command actively engages in this same strategic pre-positioning against U.S. adversaries. Such maneuvers are intended to, and ultimately do, result in the compromise of and surreptitious control over thousands of systems and networks deemed advantageous to the interests and strategic advantage of the United States (systems and networks critical to the governmental, military, economic, and societal functions of other nations). This type of activity neither intends nor results in any immediate denial effect and therefore does not meet the legal standard of Cyberspace Attack – a hostile act. Rather, this type of activity is more aligned with acts of Cyberspace Exploitation.

Understanding this subtle yet crucial nuance between cyberspace attack and cyberspace exploitation is paramount to properly framing the situation that the world now faces. Cyberspace attack and cyberspace exploitation are two sides of the same coin. While both seek the compromise of systems, networks, data, and other assets, they fundamentally differ in both execution and motivation.

Cyberspace Attack, being of more substantial concern, consists of acts carried out within or through the cyberspace domain that have either the intent or the result of causing immediate denial effects (defined as any form of degradation, disruption, or destruction). Actions carried out in this manner are still classified as Cyberspace Attack even if the denial effect impacts resources outside the cyberspace domain. Cyberspace Exploitation, on the other hand, does not arise from the motivation of causing an immediate denial effect. Rather, Cyberspace Exploitation consists of acts of espionage or enablement carried out within or through the cyberspace domain. Lacking any motivation or outcome of an immediate denial effect, acts of Cyberspace Exploitation are not considered directly hostile and are, from a legal, military, and diplomatic perspective, handled much differently – through espionage, military maneuvers, counterintelligence, international pressure, and diplomacy. Notable, however, is the standard setting forth “Enablement Activity” as an act of cyberspace exploitation. Such enablement activity consists of actions carried out for the purpose of enabling future activity or operations within or outside the cyberspace domain – regardless of the intent, motivation, or ultimate outcome inherent to such future activity.
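
To make the distinction concrete, the following minimal sketch classifies an observed activity according to the definitions above. It is illustrative only; the type and field names are inventions of this sketch, not terms from any official doctrine.

```python
from dataclasses import dataclass

@dataclass
class CyberActivity:
    intends_denial: bool   # degradation, disruption, or destruction is the intent
    caused_denial: bool    # an immediate denial effect actually resulted
    purpose: str           # e.g., "espionage" or "enablement"

def classify(activity: CyberActivity) -> str:
    """Toy classifier mirroring the attack/exploitation distinction described above."""
    # Intent OR result of an immediate denial effect -> Cyberspace Attack (a hostile act),
    # even when the denial effect lands on resources outside the cyberspace domain.
    if activity.intends_denial or activity.caused_denial:
        return "Cyberspace Attack"
    # No immediate denial effect: espionage or enablement -> Cyberspace Exploitation,
    # handled through counterintelligence, international pressure, and diplomacy.
    return "Cyberspace Exploitation"

# Strategic pre-positioning with no immediate denial effect classifies as exploitation.
print(classify(CyberActivity(intends_denial=False, caused_denial=False, purpose="enablement")))
```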

Cyberspace Operations (which includes the aforementioned Cyberspace Attack and Cyberspace Exploitation, along with Cyberspace Security and Cyberspace Defense) establishes the current legal, military, and diplomatic doctrine and framework adopted by a majority of countries. The pre-positioning activity that is now raising alarm within the United States, while concerning and notable, represents non-hostile enablement activity within the discipline of Cyberspace Exploitation. The inclusion of “enablement activity” under the umbrella of Cyberspace Exploitation is a direct causal factor in the increased targeting and successful compromise of critical infrastructure systems around the world.

The rapid expansion in actors capable and willing to engage in cyberspace exploitation combined with the relative ease by which many critical infrastructure components can be compromised has led to a new “Mutually Assured Destruction” (MAD) style buildup of offensive capabilities and strategically pre-compromised and controlled critical assets. Though not directly hostile, this enablement activity does tactically position an actor to have control over the critical infrastructure of another country – thus providing the actor the ability to cause substantial damage to the country’s government, military, economy, and society.

Today, the world finds itself again in the grips of a transformed Cold War – watching the proliferation, buildup, and strategic placement of weapons of mass destruction. Reminiscent of global issues faced in decades past, this race towards mutually assured destruction is now driven by computer code rather than fissile material – a new age of weapons known as Digital Weapons of Mass Destruction.

"Let us hope the will of good men is enough to counter the terrible strength of this thing that was put in motion" - Donaldson, R. (Director) - Thirteen Days, New Line CinemaThe implications go beyond the direct impacts these digital WMDs would have on the physical world to the social and psychological impacts that they could have on people. In his 1955 book titled, “The Sane Society” social psychologist Erich Fromm describes the “Socially Patterned Defect”: a systemic illness underlying and inherent to modern societies, that absent the distractions of modern technology, would present in clinical signs of neurosis, psychosis, and socially-deviant behaviors among the population. Though more than half a century has passed since originally theorized, the hypothesis of a Socially Patterned Defect has been tested and upheld throughout the decades – even in today’s modern world. The aggressive adoption by modern societies of technologies providing on-demand access to real-time communications and information represents a new social and public health threat posed by such Digital WMDs. Unfortunate, but true, is the fact that most societies and individuals within the modern world are ill-prepared and would be effectively unable to function in a world without the modern technologies they have come to rely on.

Consider, as one example, the very real possibility of disruption to a nation’s power and communications infrastructure. While undoubtedly damaging to the nation’s government, military, and economy, the impact such an event would have on its society could be far more substantial. The co-dependency and reliance most modern societies have on current technology creates an ideal comorbidity condition in which any unexpected, immediate, and long-term absence of such technology could cause this Socially Patterned Defect to emerge – resulting in mass disorder, public health and law enforcement crises, and ultimately societal and government collapse within the impacted population(s). Such effects resulting from a population’s loss of modern technology are not simply theoretical but have been observed on numerous occasions (albeit at relatively small scale) in the aftermath of recent natural disasters. This scenario represents a simple and limited-in-scope example of what is possible and of interest to sophisticated actors today. Considering that the enablement activity being observed aims to acquire control over the whole of a nation’s critical infrastructure (communications, energy, emergency services, healthcare, transportation, and water systems – to name a few), the outcomes could be even more grave.

While the totality of impact such Digital WMDs would have on society seems dire, there is hope on the horizon. In May 2024, the United States Department of State published the “International Cyberspace and Digital Policy Strategy”. Laying the foundation for a brighter, more secure, and more sustainable future, this policy seeks to set the cornerstone of a more diplomatic approach to cyberspace. Though seen as a watershed moment in the history of cyberspace, these efforts are still very much in their infancy and will take years to fully formalize and canonize; they could also be easily disrupted should tensions between key nations reach a point where conflict involving hostile actions within or through cyberspace seems warranted. Until such time, this new strategy is only complementary to, and in no way contradicts or supersedes, the current military-minded doctrine of cyberspace operations.

With the stakes so high and any global realignment of doctrine so far off, it now rests on the shoulders of the global collective of cyberspace operations and cyber security professionals to help drive the world toward this more secure reality – one where Digital WMDs are less prolific and the thought of triggering such weapons is considered a taboo in the same vein as the use of nuclear weapons. As an industry, the most powerful countermeasures are not the cyber security technologies – they have time and again proven inadequate and unable to stand up against sophisticated offensive capabilities – but rather the knowledge, expertise, good nature, and voices of these unique professionals. In the interest of prevention, advocating for non-proliferation, disarmament, and international oversight and control of Digital WMDs is essential. Through this, governments can be pressured to ensure such weapons are rarely used; and if they are, that they are employed in a restrained manner accounting for all reasonable measures to ensure societal stability. While seeking prevention would be ideal, mutually agreed global disarmament and restraint among nations that possess (or could easily develop) Digital WMDs is doubtful. Therefore, a measure of focus must be shifted to preparation and response rather than prevention.

With this new Cold War being fought out within a realm that is largely intangible and through actions rarely perceptible or considered, the seriousness and criticality of the situation the world now faces is often overlooked or not entirely comprehended. Just as populations around the world took measures in preparation for nuclear war throughout the mid-20th century, the world once again must proactively prepare for the possibility of conflict involving actions taken through cyberspace intended to result in disruptions to critical infrastructure. Everyone, from individuals to the largest organizations and educational institutions, to governments must preemptively address these threats and plan for a reality where critical services are made unavailable for an extended period of time.

Organizations can take strategic and common-sense measures to help ensure they are better prepared for such possibilities. Building comprehensive Continuity of Operations Plans that include contingencies for loss of critical infrastructure is fundamental. Through this, organizations should identify resources and services that are deemed critical (those a company would be unable to function without) and identify alternative means of operations should these resources and services be made unavailable. Organizations should also seek to establish substitute communications strategies, alternate work site locations, and disaster-scenario personnel reporting requirements. Additionally, any continuity of operations program should account for identification and loss of human resources that provide or hold critical knowledge for the organization.
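
As one way to picture such a plan, the sketch below models a continuity-of-operations register as simple structured data. The fields and entries are hypothetical examples invented for this sketch, not drawn from any standard or from the article.

```python
from dataclasses import dataclass

@dataclass
class CriticalResource:
    name: str                        # resource or service the organization cannot function without
    owner: str                       # team or role responsible for it
    alternatives: list[str]          # substitute means of operating if it becomes unavailable
    max_tolerable_outage_hours: int  # how long operations can continue without it

# Hypothetical entries in a continuity-of-operations register.
coop_register = [
    CriticalResource("Grid power at primary site", "Facilities",
                     ["Diesel generators", "Alternate work site"], 4),
    CriticalResource("Internet and VoIP telephony", "IT",
                     ["Satellite link", "Out-of-band radio check-in procedure"], 8),
    CriticalResource("Payroll processing expertise", "HR",
                     ["Cross-trained backup staff", "Documented manual process"], 72),
]

# A simple review pass: flag any critical resource with no documented alternative.
for resource in coop_register:
    if not resource.alternatives:
        print(f"GAP: no alternative identified for {resource.name}")
```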

To be more proactive, organizations should build teams (or partner with services) to provide real-time monitoring, investigations, digital forensics, incident handling, cyber threat intelligence, and proactive threat hunting capabilities. Governments must also come to the table and lower the bar for entry to build strategic public-private partnerships for the purposes of sharing critical information and intelligence. While sophisticated offensive activity can very likely go unseen even with the latest incident response strategies, technologies, and intelligence, this remains the best method of identifying and curtailing the compromise of critical systems for the purpose of pre-positioning.  Furthermore, where employed, organizations should exercise restraint in the use and deployment of counteroffensive capabilities, actions, and services to avoid causing further escalation.

"Knowledge is of no practical value unless it is put into practice." - Anton ChekhovLastly, while an uncomfortable conversations, all organizations and individuals must come to grips with the limitations and fallibility of many modern security technologies. Where most of these technologies are employed, a sobering fact must be acknowledged: no matter how robust a system is believed to be, the likelihood of previous, current, and ongoing compromise by a sophisticated actor is unquestionable – even more so for any system controlling or maintaining critical infrastructure. Nevertheless, there do exist some truly capable frameworks employing a consolidated and comprehensive approach coupled with AI-powered and cloud-delivered next-generation capabilities. Leveraging these advanced all-encompassing solutions (such as the Check Point Infinity Platform) remains the only method proven successful in preventing sophisticated offensive activity.

Microscope system sharpens scientists’ view of neural circuit connections

The brain’s ability to learn comes from “plasticity,” in which neurons constantly edit and remodel the tiny connections called synapses that they make with other neurons to form circuits. To study plasticity, neuroscientists seek to track it at high resolution across whole cells, but plasticity doesn’t wait for slow microscopes to keep pace, and brain tissue is notorious for scattering light and making images fuzzy. In an open access paper in Scientific Reports, a collaboration of MIT engineers and neuroscientists describes a new microscopy system designed for fast, clear, and frequent imaging of the living brain.

The system, called “multiline orthogonal scanning temporal focusing” (mosTF), works by scanning brain tissue with lines of light in perpendicular directions. As with other live brain imaging systems that rely on “two-photon microscopy,” this scanning light “excites” photon emission from brain cells that have been engineered to fluoresce when stimulated. The new system proved in the team’s tests to be eight times faster than a two-photon scope that goes point by point, and proved to have a four-fold better signal-to-background ratio (a measure of the resulting image clarity) than a two-photon system that just scans in one direction.

“Tracking rapid changes in circuit structure in the context of the living brain remains a challenge,” says co-author Elly Nedivi, the William R. (1964) and Linda R. Young Professor of Neuroscience in The Picower Institute for Learning and Memory and MIT’s departments of Biology and Brain and Cognitive Sciences. “While two-photon microscopy is the only method that allows high-resolution visualization of synapses deep in scattering tissue, such as the brain, the required point-by-point scanning is mechanically slow. The mosTF system significantly reduces scan time without sacrificing resolution.”

Scanning a whole line of a sample is inherently faster than just scanning one point at a time, but it kicks up a lot of scattering. To manage that scattering, some scope systems just discard scattered photons as noise, but then they are lost, says lead author Yi Xue SM ’15, PhD ’19, an assistant professor at the University of California at Davis and a former graduate student in the lab of corresponding author Peter T.C. So, professor of mechanical engineering and biological engineering at MIT. Newer single-line systems and the mosTF system produce a stronger signal (thereby resolving smaller and fainter features of stimulated neurons) by algorithmically reassigning scattered photons back to their origin. In a two-dimensional image, that process is better accomplished by using the information produced by a two-dimensional, perpendicular-direction system such as mosTF than by a one-dimensional, single-direction system, Xue says.

“Our excitation light is a line, rather than a point — more like a light tube than a light bulb — but the reconstruction process can only reassign photons to the excitation line and cannot handle scattering within the line,” Xue explains. “Therefore, scattering correction is only performed along one dimension for a 2D image. To correct scattering in both dimensions, we need to scan the sample and correct scattering along the other dimension as well, resulting in an orthogonal scanning strategy.”
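
To illustrate the idea Xue describes, here is a toy NumPy sketch that collapses scatter back onto the excitation line along one axis per scan direction and then combines the two orthogonal corrections. It is a conceptual illustration only, not the reconstruction algorithm from the paper; in particular, the geometric-mean combination step is an assumption made for this sketch.

```python
import numpy as np

def reassign_line_scan(frames: np.ndarray, axis: int) -> np.ndarray:
    """Toy photon reassignment for one scan direction.

    frames[i] is the (H x W) camera image recorded while excitation line i is active.
    axis=0: horizontal lines (one per row); scatter is collapsed along the vertical axis.
    axis=1: vertical lines (one per column); scatter is collapsed along the horizontal axis.
    """
    n_lines, H, W = frames.shape
    out = np.zeros((H, W))
    for i in range(n_lines):
        profile = frames[i].sum(axis=axis)  # reassign every photon in this frame back to line i
        if axis == 0:
            out[i, :] = profile             # horizontal excitation line at row i
        else:
            out[:, i] = profile             # vertical excitation line at column i
    return out

def combine_orthogonal(rows_img: np.ndarray, cols_img: np.ndarray) -> np.ndarray:
    # Geometric mean keeps only signal both scan directions agree on
    # (an illustrative combination choice, not the published method).
    return np.sqrt(np.clip(rows_img, 0, None) * np.clip(cols_img, 0, None))

# Synthetic stacks: one camera frame per horizontal line and one per vertical line.
H = W = 64
row_frames = np.random.poisson(1.0, size=(H, H, W)).astype(float)
col_frames = np.random.poisson(1.0, size=(W, H, W)).astype(float)
image = combine_orthogonal(reassign_line_scan(row_frames, axis=0),
                           reassign_line_scan(col_frames, axis=1))
```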

In the study the team tested their system head-to-head against a point-by-point scope (a two-photon laser scanning microscope — TPLSM) and a line-scanning temporal focusing microscope (lineTF). They imaged fluorescent beads through water and through a lipid-infused solution that better simulates the kind of scattering that arises in biological tissue. In the lipid solution, mosTF produced images with a 36-times better signal-to-background ratio than lineTF.

For a more definitive proof, Xue worked with Josiah Boivin in the Nedivi lab to image neurons in the brain of a live, anesthetized mouse, using mosTF. Even in this much more complex environment, where the pulsations of blood vessels and the movement of breathing provide additional confounds, the mosTF scope still achieved a four-fold better signal-to-background ratio. Importantly, it was able to reveal the features where many synapses dwell: the spines that protrude along the vine-like processes, or dendrites, that grow out of the neuron cell body. Monitoring plasticity requires being able to watch those spines grow, shrink, come, and go across the entire cell, Nedivi says.

“Our continued collaboration with the So lab and their expertise with microscope development has enabled in vivo studies that are unapproachable using conventional, out-of-the-box two-photon microscopes,” she adds.

So says he is already planning further improvements to the technology.

“We’re continuing to work toward the goal of developing even more efficient microscopes to look at plasticity even more efficiently,” he says. “The speed of mosTF is still limited by needing to use high-sensitivity, low-noise cameras that are often slow. We are now working on a next-generation system with new types of detectors, such as hybrid photomultipliers or avalanche photodiode arrays, that are both sensitive and fast.”

In addition to Xue, So, Boivin, and Nedivi, the paper’s other authors are Dushan Wadduwage and Jong Kang Park.

The National Institutes of Health, Hamamatsu Corp., Samsung Advanced Institute of Technology, Singapore-MIT Alliance for Research and Technology Center, Biosystems and Micromechanics, The Picower Institute for Learning and Memory, The JPB Foundation, and The Center for Advanced Imaging at Harvard University provided support for the research.

Arvind, longtime MIT professor and prolific computer scientist, dies at 77

Arvind Mithal, the Charles W. and Jennifer C. Johnson Professor in Computer Science and Engineering at MIT, head of the faculty of computer science in the Department of Electrical Engineering and Computer Science (EECS), and a pillar of the MIT community, died on June 17. Arvind, who went by the mononym, was 77 years old.

A prolific researcher who led the Computation Structures Group in the Computer Science and Artificial Intelligence Laboratory (CSAIL), Arvind served on the MIT faculty for nearly five decades.

As a scientist, Arvind was well known for important contributions to dataflow computing, which seeks to optimize the flow of data to take advantage of parallelism, achieving faster and more efficient computation.
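
As a generic illustration of the dataflow idea only (this toy is not Arvind’s Id or pH languages, nor the Monsoon hardware), an operation fires as soon as its operands are available, so independent operations can run in parallel:

```python
from concurrent.futures import ThreadPoolExecutor

# Toy dataflow graph for (2 + 3) - (4 * 5): each node fires once its inputs are
# available, so the two independent nodes can execute in parallel.
def add(x, y): return x + y
def mul(x, y): return x * y

with ThreadPoolExecutor() as pool:
    s = pool.submit(add, 2, 3)                           # node 1
    p = pool.submit(mul, 4, 5)                           # node 2, independent of node 1
    diff = pool.submit(lambda: s.result() - p.result())  # node 3 waits on both results
    print(diff.result())                                 # -> -15
```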

In the last 25 years, his research interests broadened to include developing techniques and tools for formal modeling, high-level synthesis, and formal verification of complex digital devices like microprocessors and hardware accelerators, as well as memory models and cache coherence protocols for parallel computing architectures and programming languages.

Those who knew Arvind describe him as a rare individual whose interests and expertise ranged from high-level, theoretical formal systems all the way down through languages and compilers to the gates and structures of silicon hardware.

The applications of Arvind’s work are far-reaching, from reducing the amount of energy and space required by data centers to streamlining the design of more efficient multicore computer chips.

“Arvind was both a tremendous scholar in the fields of computer architecture and programming languages and a dedicated teacher, who brought systems-level thinking to our students. He was also an exceptional academic leader, often leading changes in curriculum and contributing to the Engineering Council in meaningful and impactful ways. I will greatly miss his sage advice and wisdom,” says Anantha Chandrakasan, chief innovation and strategy officer, dean of engineering, and the Vannevar Bush Professor of Electrical Engineering and Computer Science.

“Arvind’s positive energy, together with his hearty laugh, brightened so many people’s lives. He was an enduring source of wise counsel for colleagues and for generations of students. With his deep commitment to academic excellence, he not only transformed research in computer architecture and parallel computing but also brought that commitment to his role as head of the computer science faculty in the EECS department. He left a lasting impact on all of us who had the privilege of working with him,” says Dan Huttenlocher, dean of the MIT Schwarzman College of Computing and the Henry Ellis Warren Professor of Electrical Engineering and Computer Science.

Arvind developed an interest in parallel computing while he was a student at the Indian Institute of Technology in Kanpur, from which he received his bachelor’s degree in 1969. He earned a master’s degree and PhD in computer science in 1972 and 1973, respectively, from the University of Minnesota, where he studied operating systems and mathematical models of program behavior. He taught at the University of California at Irvine from 1974 to 1978 before joining the faculty at MIT.

At MIT, Arvind’s group studied parallel computing and declarative programming languages, and he led the development of two parallel computing languages, Id and pH. He continued his work on these programming languages through the 1990s, publishing the book “Implicit Parallel Programming in pH” with co-author R.S. Nikhil in 2001, the culmination of more than 20 years of research.

In addition to his research, Arvind was an important academic leader in EECS. He served as head of computer science faculty in the department and played a critical role in helping with the reorganization of EECS after the establishment of the MIT Schwarzman College of Computing.

“Arvind was a force of nature, larger than life in every sense. His relentless positivity, unwavering optimism, boundless generosity, and exceptional strength as a researcher was truly inspiring and left a profound mark on all who had the privilege of knowing him. I feel enormous gratitude for the light he brought into our lives and his fundamental impact on our community,” says Daniela Rus, the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science and the director of CSAIL.

His work on dataflow and parallel computing led to the Monsoon project in the late 1980s and early 1990s. Arvind’s group, in collaboration with Motorola, built 16 dataflow computing machines and developed their associated software. One Monsoon dataflow machine is now in the Computer History Museum in Mountain View, California.

Arvind’s focus shifted in the 1990s when, as he explained in a 2012 interview for the Institute of Electrical and Electronics Engineers (IEEE), funding for research into parallel computing began to dry up.

“Microprocessors were getting so much faster that people thought they didn’t need it,” he recalled.

Instead, he began applying techniques his team had learned and developed for parallel programming to the principled design of digital hardware.

In addition to mentoring students and junior colleagues at MIT, Arvind also advised universities and governments in many countries on research in parallel programming and semiconductor design.

Based on his work on digital hardware design, Arvind founded Sandburst in 2000, a fabless semiconductor chip company. He served as the company’s president for two years before returning to the MIT faculty, while continuing as an advisor. Sandburst was later acquired by Broadcom.

Arvind and his students also developed Bluespec, a programming language designed to automate the design of chips. Building off this work, he co-founded the startup Bluespec, Inc., in 2003, to develop practical tools that help engineers streamline device design.

Over the past decade, he was dedicated to advancing undergraduate education at MIT by bringing modern design tools to courses 6.004 (Computation Structures) and 6.191 (Introduction to Deep Learning), and incorporating Minispec, a programming language that is closely related to Bluespec.

Arvind was honored for these and other contributions to dataflow and multithread computing, and the development of tools for the high-level synthesis of hardware, with membership in the National Academy of Engineering in 2008 and the American Academy of Arts and Sciences in 2012. He was also named a distinguished alumnus of IIT Kanpur, his undergraduate alma mater.

“Arvind was more than a pillar of the EECS community and a titan of computer science; he was a beloved colleague and a treasured friend. Those of us with the remarkable good fortune to work and collaborate with Arvind are devastated by his sudden loss. His kindness and joviality were unwavering; his mentorship was thoughtful and well-considered; his guidance was priceless. We will miss Arvind deeply,” says Asu Ozdaglar, deputy dean of the MIT Schwarzman College of Computing and head of EECS.

Among numerous other awards, including membership in the Indian National Academy of Sciences and fellowship in the Association for Computing Machinery and IEEE, he received the Harry H. Goode Memorial Award from IEEE in 2012, which honors significant contributions to theory or practice in the information processing field.

A humble scientist, Arvind was the first to point out that these achievements were only possible because of his outstanding and brilliant collaborators. Chief among those collaborators were the undergraduate and graduate students he felt fortunate to work with at MIT. He maintained excellent relationships with them both professionally and personally, and valued these relationships more than the work they did together, according to family members.

In summing up the key to his scientific success, Arvind put it this way in the 2012 IEEE interview: “Really, one has to do what one believes in. I think the level at which most of us work, it is not sustainable if you don’t enjoy it on a day-to-day basis. You can’t work on it just because of the results. You have to work on it because you say, ‘I have to know the answer to this,’” he said.

He is survived by his wife, Gita Singh Mithal, their two sons Divakar ’01 and Prabhakar ’04, their wives Leena and Nisha, and two grandchildren, Maya and Vikram. 

Hidetaka Miyazaki Talks Why Bloodborne Is Special To Him And How It Led To Elden Ring

Bloodborne arrived on PS4 in 2015 and immediately became one of the best games of the year, earning a 9.75 out of 10 from Game Informer and even taking home our award for Best PlayStation Exclusive. But something about that game has stuck with players. In addition to resonating with a wider audience than many of From Software’s most iconic titles to that point, Bloodborne delivered an engrossing world full of mystery and challenge, causing it to remain top-of-mind for many Soulslike fans even today. In the lead-up to the launch of Elden Ring: Shadow of the Erdtree, I sat down with the creator of Elden Ring, Dark Souls, Sekiro: Shadows Die Twice, Demon’s Souls, and Bloodborne to learn why the 2015 PS4 exclusive holds as special a place in his heart as it does in the hearts of the game’s many fans.

For Miyazaki, who has directed nearly every game in From Software’s legendary Souls catalog (including Elden Ring, Sekiro, and Bloodborne), the relationship between offense and defense started getting more fully fleshed out during the development of Bloodborne. “It’s become something much more fluid and active, I think, which was a very defining characteristic of Sekiro, and it’s something I’ve been thinking about since Bloodborne,” Miyazaki says. “Perhaps in Sekiro, it appears most obviously or it’s the clearest form that I think that philosophy can embody. And personally, I think there’s one more level we can crank it up to and sharpen that and hone in on that mechanic even more, but I think Sekiro was a big turning point.”

When I mention that Bloodborne was the first From Software game that clicked with me, Miyazaki smiles. “I’m very glad to hear that,” he exclaims. “Bloodborne is a special game for me as well. I’m very, very happy to hear you say that.”

I then follow up, asking the director why the critically acclaimed title sticks out in the vast pool of his other beloved creations. “A couple of reasons,” he begins. “The first one being it was probably one of the most challenging development cycles we’ve had from a studio perspective. The second, perhaps bigger element is how personal it was for me in the sense that I’ve imparted a lot of my own ideas into this game, whether it be story, the world-building component, or even the game mechanics and game systems that are in place. It is perhaps the strongest reflection of my type of flavoring of a game that one can experience.”

However, Miyazaki’s influence is undeniable in the entire From Software catalog, which can likely be attributed to the fact that he has been heavily involved in the stage and level design from Demon’s Souls all the way up to Elden Ring. “My approach of making games as the game director, it’s like sandwiching from a very high, conceptual level and painting the final image of what we’re trying to achieve, as well as going really granular on some of the detailed elements of what the players experience,” he says. “By sandwiching the game development process, the middle almost has only one place to go which is completing that whole experience. Of course, the high-level conceptual stuff might be easy to imagine, but of the details that I pick and choose to oversee myself, the level design is one of them because I think that experience really creates and raises the floor of what players are going to feel and experience through the game design. This is true with Elden Ring and true with Dark Souls as well: I’ll look at what’s being done and say, ‘Alright, this, this, this, and this I’m going to oversee,’ because I know which points in that experience are going to be the most effective and sandwich the high-level vision plus the details that players see.”

That influence and approach carried into Elden Ring, the latest critically acclaimed From Software title in the Soulslike subgenre. “In the case of Elden Ring, there was the very high-level conceptual vision, and then there were the details,” Miyazaki says. “The defining details for me throughout that game was artwork, the level design, the animation, as well as the text that you see on screen. I think that was the strongest supporting factor that helped elevate the entire experience for players. And because Elden Ring was such a massive experience to design certain levels, we did hand off to other level designers and game designers and I think that is what helps the company grow massively through this experience. Every game has a different set of details that need special attention and one of them that seemed appropriate to work collaboratively or hand off to other designers was the level design in this case. And that, again, I think helps elevate the company as a whole in terms of the talent we have.”

Though Bloodborne and Sekiro are tentpole moments in the evolution of the Soulslike subgenre, Elden Ring is the most successful game in the young category’s history. Not only does it carry an almost unheard-of 96 out of 100 on review aggregation site Metacritic (including a rare 10 out of 10 from Game Informer), but it also took home several Game of the Year awards, including from Game Informer and The Game Awards.

Now, players have an excuse to jump back into that acclaimed 2022 title as From Software is poised to release Shadow of the Erdtree, the long-awaited DLC for Elden Ring. For more on Shadow of the Erdtree, be sure to check out our glowing review of the latest DLC right here.

MIT-Takeda Program wraps up with 16 publications, a patent, and nearly two dozen projects completed

When the Takeda Pharmaceutical Co. and the MIT School of Engineering launched their collaboration focused on artificial intelligence in health care and drug development in February 2020, society was on the cusp of a globe-altering pandemic and AI was far from the buzzword it is today.

As the program concludes, the world looks very different. AI has become a transformative technology across industries including health care and pharmaceuticals, while the pandemic has altered the way many businesses approach health care and changed how they develop and sell medicines.

For both MIT and Takeda, the program has been a game-changer.

When it launched, the collaborators hoped the program would help solve tangible, real-world problems. By its end, the program has yielded a catalog of new research papers, discoveries, and lessons learned, including a patent for a system that could improve the manufacturing of small-molecule medicines.

Ultimately, the program allowed both entities to create a foundation for a world where AI and machine learning play a pivotal role in medicine, leveraging Takeda’s expertise in biopharmaceuticals and the MIT researchers’ deep understanding of AI and machine learning.

“The MIT-Takeda Program has been tremendously impactful and is a shining example of what can be accomplished when experts in industry and academia work together to develop solutions,” says Anantha Chandrakasan, MIT’s chief innovation and strategy officer, dean of the School of Engineering, and the Vannevar Bush Professor of Electrical Engineering and Computer Science. “In addition to resulting in research that has advanced how we use AI and machine learning in health care, the program has opened up new opportunities for MIT faculty and students through fellowships, funding, and networking.”

What made the program unique was that it was centered around several concrete challenges spanning drug development that Takeda needed help addressing. MIT faculty had the opportunity to select the projects based on their area of expertise and general interest, allowing them to explore new areas within health care and drug development.

“It was focused on Takeda’s toughest business problems,” says Anne Heatherington, Takeda’s research and development chief data and technology officer and head of its Data Sciences Institute.

“They were problems that colleagues were really struggling with on the ground,” adds Simon Davies, the executive director of the MIT-Takeda Program and Takeda’s global head of statistical and quantitative sciences. Takeda saw an opportunity to collaborate with MIT’s world-class researchers, who were working only a few blocks away. Takeda, a pharmaceutical company with global headquarters in Japan, has its global business units and R&D center just down the street from the Institute.

As part of the program, MIT faculty were able to select what issues they were interested in working on from a group of potential Takeda projects. Then, collaborative teams including MIT researchers and Takeda employees approached research questions in two rounds. Over the course of the program, collaborators worked on 22 projects focused on topics including drug discovery and research, clinical drug development, and pharmaceutical manufacturing. Over 80 MIT students and faculty joined more than 125 Takeda researchers and staff on teams addressing these research questions.

The projects centered around not only hard problems, but also the potential for solutions to scale within Takeda or within the biopharmaceutical industry more broadly.

Some of the program’s findings have already resulted in wider studies. One group’s results, for instance, showed that using artificial intelligence to analyze speech may allow for earlier detection of frontotemporal dementia, while making that diagnosis more quickly and inexpensively. Similar algorithmic analyses of speech in patients diagnosed with ALS may also help clinicians understand the progression of that disease. Takeda is continuing to test both AI applications.
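
Purely as a generic illustration of this kind of pipeline, and in no way the MIT-Takeda team’s actual features or model, a handful of speech-derived features can be fed to an off-the-shelf classifier; the feature names and synthetic data below are invented for this sketch.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in features per recording: [words per minute, pause ratio, pitch variability].
# A real pipeline would extract these (and many more) from transcribed, segmented audio.
controls = rng.normal([140.0, 0.15, 1.0], [15.0, 0.03, 0.15], size=(50, 3))
patients = rng.normal([100.0, 0.30, 0.6], [15.0, 0.06, 0.15], size=(50, 3))
X = np.vstack([controls, patients])
y = np.array([0] * 50 + [1] * 50)

# Plain logistic regression as a stand-in classifier; the score only reflects this toy data.
clf = LogisticRegression(max_iter=1000)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```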

Other discoveries and AI models that resulted from the program’s research have already had an impact. Using a physical model and AI learning algorithms can help detect particle size, mix, and consistency for powdered, small-molecule medicines, for instance, speeding up production timelines. Based on their research under the program, collaborators have filed for a patent for that technology.

For injectable medicines like vaccines, AI-enabled inspections can also reduce process time and false rejection rates. Replacing human visual inspections with AI processes has already shown measurable impact for the pharmaceutical company.
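
For clarity on the metric itself, the false rejection rate is simply the share of good units wrongly rejected by inspection; the counts below are made up solely to illustrate the calculation and are not Takeda’s figures.

```python
# False rejection rate (FRR) = good units wrongly rejected / good units inspected.
# All counts below are hypothetical, for illustrating the metric only.
good_units_inspected = 10_000
good_rejected_manual = 150
good_rejected_ai = 40

frr_manual = good_rejected_manual / good_units_inspected
frr_ai = good_rejected_ai / good_units_inspected
print(f"manual FRR: {frr_manual:.1%}, AI-enabled FRR: {frr_ai:.1%}")  # 1.5% vs 0.4%
```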

Heatherington adds, “Our lessons learned are really setting the stage for what we’re doing next, really embedding AI and gen-AI [generative AI] into everything that we do moving forward.”

Over the course of the program, more than 150 Takeda researchers and staff also participated in educational programming organized by the Abdul Latif Jameel Clinic for Machine Learning in Health. In addition to providing research opportunities, the program funded 10 students through SuperUROP, the Advanced Undergraduate Research Opportunities Program, as well as two cohorts from the DHIVE health-care innovation program, part of the MIT Sandbox Innovation Fund Program.

Though the formal program has ended, certain aspects of the collaboration will continue, such as the MIT-Takeda Fellows program, which supports graduate students as they pursue groundbreaking research related to health and AI. During its run, the program supported 44 MIT-Takeda Fellows and will continue to support MIT students through an endowment fund. Organic collaboration between MIT and Takeda researchers will also carry forward. And the program’s collaborators are working to create a model for similar academic and industry partnerships to widen the impact of this first-of-its-kind collaboration.

David Autor named the inaugural Daniel (1972) and Gail Rubinfeld Professor in Economics

The Department of Economics has announced David Autor as the inaugural holder of the Daniel (1972) and Gail Rubinfeld Professorship in Economics, effective July 1. 

The endowed chair is made possible by the generosity of Daniel and Gail Rubinfeld. Daniel Rubinfeld SM ’68, PhD ’72 is the Robert L. Bridges Professor of Law and professor of economics emeritus at the University of California at Berkeley, and professor of law emeritus at New York University.

“The Rubinfeld Professorship in Economics is important for two reasons,” Rubinfeld says. “First, it allows MIT to wisely manage its resources. Second, as an economist, I believe it’s efficient for the economics department to plan for the long term, which this endowment allows.” 

MIT will use the fund to provide a full professorship for senior faculty in the Department of Economics. Faculty with research and teaching interests in the area of applied microeconomics will receive first preference.

David Autor’s scholarship explores the labor-market impacts of technological change and globalization on job polarization, skill demands, earnings levels and inequality, and electoral outcomes. He is a faculty co-director of the recently launched MIT Shaping the Future of Work Initiative.

“I am privileged to be the inaugural holder of the Rubinfeld Professorship in Economics, honoring Daniel Rubinfeld’s illustrious career of scholarship and public service. As the Daniel (1972) and Gail Rubinfeld Professor of Economics, I aim to honor Dan Rubinfeld’s legacy by contributing in both domains,” Autor says.

Before joining Berkeley and NYU, Rubinfeld spent 11 years teaching at the University of Michigan at Ann Arbor.

Rubinfeld has been a fellow at the National Bureau of Economic Research, the Center for Advanced Study in the Behavioral Sciences, and the John Simon Guggenheim Memorial Foundation. Rubinfeld previously served as deputy assistant attorney general for antitrust in the U.S. Department of Justice.

Jon Gruber, department chair and Ford Professor of Economics, says the Rubinfelds’ gift illustrates two important lessons.

“The first is the ongoing power of the MIT education — Daniel’s PhD helped him to build an important career both inside and outside of academia, and this gift will help ensure others continue to benefit from this powerful experience,” says Gruber. “The second is the importance of support directly to the economics department at this time of rapidly growing costs of research.”

“Nothing ensures the future strength of an academic department as much as endowed professorships,” adds Agustin Rayo, the Kenan Sahin Dean of the MIT School of Humanities, Arts, and Social Sciences. “This seminal gift by Gail and Daniel Rubinfeld will have a lasting impact on the success of MIT economics for decades to come. We are deeply grateful for their generous investment in the department.”

Autor has received numerous awards for both his scholarship — the National Science Foundation CAREER Award, an Alfred P. Sloan Foundation Fellowship, the Sherwin Rosen Prize for outstanding contributions to the field of Labor Economics, the Andrew Carnegie Fellowship in 2019, and the Society for Progress Medal in 2021 — and for his teaching, including the MIT MacVicar Faculty Fellowship.

In 2020, Autor received the Heinz 25th Anniversary Special Recognition Award from the Heinz Family Foundation for his work “transforming our understanding of how globalization and technological change are impacting jobs and earning prospects for American workers.” 

In 2023, Autor was recognized as one of two NOMIS Distinguished Scientists.

Autor earned a BA in psychology from Tufts University in 1989 and a PhD in public policy from Harvard University’s Kennedy School of Government in 1999.

Zelda Is The Protagonist In The Legend Of Zelda: Echoes of Wisdom, Out September

For years, fans of the Legend of Zelda have clamored for the titular princess to star in her own game, but even as she’s become a more prominent character in recent entries, the Zelda-led Zelda game has yet to appear on store shelves. In today’s Nintendo Direct, that wish was finally granted; in The Legend of Zelda: Echoes of Wisdom, it’s up to Princess Zelda to save Hyrule when Link is captured.

While the game takes its art style from 2019’s Link’s Awakening remake, this title is not a remake of any kind, and there’s no clear indication that it is connected to Link’s Awakening. In this adventure, Zelda uses a new magic item called the Tri Rod to journey across Hyrule. The Tri Rod can create “echoes” of items, like tables, beds, or boxes, to climb and explore the overworld and its dungeons, but it doesn’t stop there. Echoes of water blocks can be used to swim up and over certain obstacles, while trampolines allow players to easily leap across gaps.

Throughout the gameplay demonstration, series producer Eiji Aonuma explains that players can also make echoes of enemies, and that these echoed enemies will fight on your side in combat. Zelda captures a moblin to fight some slimes, then uses meat to lure in some bird enemies and summons a deku baba to snap them up. Aonuma goes on to say that there are so many echoes in the game that he hasn’t even counted them all – we’ll have to learn what the limits of echoes are, if any, some other time.

As the trailer continues, we get more glimpses of who Zelda will be interacting with throughout the game, including two kinds of Zoras, some Deku shrubs, a Sheikah person (potentially Impa), and the Great Deku Tree. It also features some 2D platforming and underwater sections, as well as Zelda using birds and plants with helicopter-like leaves to glide.

The game launches alongside a golden, Hyrule-themed Switch Lite. Luckily, you won’t have to wait long for either of them: the handheld and The Legend of Zelda: Echoes of Wisdom will both be available later this year, on September 26.