How to Manage Your Website’s Technical Debt
The web seems to move at the speed of light. The tools and best practices we use today will soon be outdated. It’s a vicious cycle we repeat again and again.
That often leaves us with some form of technical debt. It could be a WordPress theme that isn’t compatible with the latest version of PHP. Or a hacked-together layout that won’t adapt to future needs. The worst case is software that is no longer supported.
It will impact every website sooner or later. There are ways to manage or even prevent it, though.
So, how do you keep technical debt from becoming a nightmare? Let’s review a few tips for minimizing the impact.
Build with Sustainability in Mind
The first step is to reduce the chances for technical debt to take hold. In practice, it’s about building with sustainability in mind.
There are several things you can do. For one, use tools that are popular and well-maintained. It’s not a guarantee of smooth sailing. It does increase the chances of future viability, though.
Let’s use WordPress as an example. The content management system (CMS) has existed for over 20 years. It is continually updated. It also supports a large, thriving ecosystem of themes and plugins.
Perhaps there’s another CMS that catches your eye. It hit the market only recently – there aren’t many users yet.
There’s nothing wrong with this new CMS. But is it sustainable? Only time will tell. Therefore, it may not be the best long-term solution. Using it comes with some level of risk.
Best practices also guard against technical debt. Use the latest standards when writing code. Don’t rely on CSS hacks to build layouts. Comment your code and take detailed notes.
The idea is to think about the present and future. That could save you some headaches down the road.
Perform Regular Audits of Your Website
The status of your stack can change in an instant. Thus, it’s a good idea to perform regular audits.
A website audit should cover both hardware and software. On the hardware side, make sure your web hosting is still viable. Check your site’s performance and resource usage. The results should tell you if you need to upgrade.
You’ll also want to look closely at the software you’re using. Start with the server’s OS. Move on to versions of PHP, MySQL, or whatever you have in place. These items are crucial to your site’s well-being.
From there, it’s time to look at your CMS, themes, and plugins. Also, review any software dependencies – JavaScript libraries are a good example.
Look for outdated items. Are updates available? Is each one still actively maintained?
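To make this concrete, here is a minimal sketch of how part of a software audit could be automated. The inventory of tools and the version numbers are hypothetical placeholders; in practice you would fill them in from your own stack.

```python
# Minimal audit sketch: compare the software you rely on against the minimum
# versions you consider acceptable. The inventory below is hypothetical.

def parse(version: str) -> tuple:
    """Turn a version string like '8.1.27' into (8, 1, 27) so it compares numerically."""
    return tuple(int(part) for part in version.split("."))

# item: (installed version, minimum version you want to support)
inventory = {
    "PHP":       ("7.4.33", "8.1.0"),
    "MySQL":     ("8.0.36", "8.0.0"),
    "WordPress": ("6.4.3",  "6.4.0"),
    "jQuery":    ("1.12.4", "3.7.0"),
}

for name, (installed, minimum) in inventory.items():
    status = "OK" if parse(installed) >= parse(minimum) else "OUTDATED"
    print(f"{name:<10} installed {installed:<8} minimum {minimum:<8} {status}")
```

Running a check like this on a schedule turns the audit from a memory exercise into a repeatable task.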
This process will help you identify potential problems. From there, you can take action.
So, how often should you audit your site? A yearly review is fine for small websites. Large and mission-critical sites benefit from twice-yearly or quarterly inspections.
Use Change as an Opportunity
Perhaps you found an item or two that needs addressing. That’s OK – change is inevitable!
The good news is that change also presents an opportunity. You can reassess how your website is working. There is a chance to build a more stable foundation for the future.
In some cases, you may have to swap one item for another. For example, maybe a WordPress plugin you use has been abandoned.
Now is the time to find a replacement that will offer better longevity. It’s also possible that you no longer need what the old plugin offers. That’s one way to reduce technical debt.
You might also need to modernize your code. We often do this when dealing with PHP compatibility issues.
It’s not only a chance to use the latest version of PHP. You can also look for ways to improve functionality and security. After all, reviewing the code you wrote years ago can show how far you’ve come. There’s a chance to build it better and stronger.
You can do more than bring your website up to date. You can also make forward-thinking changes. The hope is that you can lessen the technical debt you have today – and for the future.
Take Control of Your Site’s Technical Debt
Every website will deal with technical debt. That’s part of its lifecycle.
The difference is in how much debt you’ll face. Critical thinking early in the site-building process can reduce your burden. To that end, always search for the most stable and functional solution.
Changes will come eventually. That’s an opportunity to recalibrate your approach. You can review what works and what doesn’t. The lessons you learn will come in handy as your site evolves.
The key is to think about each step you take. Consider how it will impact your site today, tomorrow, and a year from now.
You probably won’t eliminate the need for change. However, you can learn how to make change more manageable.
Institute Professor Emeritus John Little, a founder of operations research and marketing science, dies at 96
MIT Institute Professor Emeritus John D.C. Little ’48, PhD ’55, an inventive scholar whose work significantly influenced operations research and marketing, died on Sept. 27, at age 96. Having entered MIT as an undergraduate in 1945, he was part of the Institute community over a span of nearly 80 years and served as a faculty member at the MIT Sloan School of Management since 1962.
Little’s career was characterized by innovative computing work, an interdisciplinary and expansive research agenda, and research that was both theoretically robust and useful in practical terms for business managers. Little had a strong commitment to supporting and mentoring others at the Institute, and played a key role in helping shape the professional societies in his fields, such as the Institute for Operations Research and the Management Sciences (INFORMS).
He may be best known for his formulation of “Little’s Law,” a foundational result in operations research that describes the dynamics of queuing. Broadly, the theorem, expressed as L = λW, states that the long-run average number of customers in a queuing system equals their average arrival rate multiplied by their average time spent in the system. This result can be applied to many systems, from manufacturing to health care to customer service, and helps quantify and fix business bottlenecks, among other things.
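As an illustration of the identity (not drawn from the obituary), here is a minimal simulation sketch of a single-server, first-in-first-out queue with invented arrival and service rates; it estimates the average number in the system directly and compares it with λW.

```python
# Illustrative single-server FIFO queue: estimate L directly and compare with
# lambda * W. Arrival and service rates are invented for this example.
import random
from bisect import bisect_right

random.seed(1)
n = 20_000
arrival_rate, service_rate = 0.8, 1.0          # lambda < mu, so the queue is stable

# Poisson arrivals (exponential inter-arrival times) and exponential service times.
t, arrivals, services = 0.0, [], []
for _ in range(n):
    t += random.expovariate(arrival_rate)
    arrivals.append(t)
    services.append(random.expovariate(service_rate))

# FIFO departures: service starts when both the customer and the server are ready.
departures, server_free_at = [], 0.0
for a, s in zip(arrivals, services):
    start = max(a, server_free_at)
    server_free_at = start + s
    departures.append(server_free_at)

T = departures[-1]                                          # observation window
lam = n / T                                                 # measured arrival rate
W = sum(d - a for a, d in zip(arrivals, departures)) / n    # mean time in system

# Estimate L by sampling how many customers are in the system at random instants.
def in_system(u: float) -> int:
    return bisect_right(arrivals, u) - bisect_right(departures, u)

L = sum(in_system(random.uniform(0, T)) for _ in range(20_000)) / 20_000
print(f"L = {L:.2f}   lambda * W = {lam * W:.2f}")          # the two agree closely
```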
Little is widely considered to have been instrumental in the development of both operations research and marketing science, where he also made a range of advances, starting in the 1960s. Drawing on innovations in computer modeling, he analyzed a broad range of issues in marketing, from customer behavior and brand loyalty to firm-level decisions, often about advertising deployment strategy. Little’s research methods evolved to incorporate the new streams of data that information technology increasingly made available, such as the purchasing information obtained from barcodes.
“John Little was a mentor and friend to so many of us at MIT and beyond,” says Georgia Perakis, the interim John C. Head III Dean of MIT Sloan. “He was also a pioneer — as the first doctoral student in the field of operations research, as the founder of the Marketing Group at MIT Sloan, and with his research, including Little’s Law, published in 1961. Many of us at MIT Sloan are lucky to have followed in John’s footsteps, learning from his research and his leadership both at the school and in many professional organizations, including the INFORMS society where he served as its first president. I am grateful to have known and learned from John myself.”
Little’s longtime colleagues in the marketing group at MIT Sloan shared those sentiments.
“John was truly an academic giant with pioneering work in queuing, optimization, decision sciences, and marketing science,” says Stephen Graves, the Abraham J. Siegel Professor Post Tenure of Management at MIT Sloan. “He also was an exceptional academic leader, being very influential in the shaping and strengthening of the professional societies for operations research and for marketing science. And he was a remarkable person as a mentor and colleague, always caring, thoughtful, wise, and with a New England sense of humor.”
John Dutton Conant Little was born in Boston and grew up in Andover, Massachusetts. At MIT he majored in physics and edited the campus’ humor magazine. Working at General Electric after graduation, he met his future wife, Elizabeth Alden PhD ’54; they both became doctoral students in physics at MIT, starting in 1951.
Alden studied ferroelectric materials, which exhibit complex properties of polarization, and produced a thesis titled, “The Dynamic Behavior of Domain Walls in Barium Titanate,” working with Professor Arthur R. von Hippel. Little, advised by Professor Philip Morse, used MIT’s famous Whirlwind I computer for his dissertation work. His thesis, titled “Use of Storage Water in a Hydroelectric System,” modeled the optimally low-cost approach to distributing water held by dams. It was a thesis in both physics and operations research, and appears to be the first one ever granted in operations research.
Little then served in the U.S. Army and spent five years on the faculty at what is now Case Western Reserve University, before returning to the Institute in 1962 as an associate professor of operations research and management at MIT Sloan. Having worked at the leading edge of using computing to tackle operations problems, Little began applying computer modeling to marketing questions. His research included models of consumer choice and promotional spending, among other topics.
Little published several dozen scholarly papers across operations research and marketing, as well as co-editing, along with Robert C. Blattberg and Rashi Glazer, a 1994 book, “The Marketing Information Revolution,” published by Harvard Business School Press. Ever the wide-ranging scholar, he even published several studies about optimizing traffic signals and traffic flow.
Still, in addition to Little’s Law, some of his key work came from studies in marketing and management. In an influential 1970 paper in Management Science, Little outlined the specifications that a good data-driven management model should have, emphasizing that business leaders should be given tools they could thoroughly grasp.
In a 1979 paper in Operations Research, Little described the elements needed to develop a robust model of ad expenditures for businesses, such as the geographic distribution of spending, and a firm’s spending over time. And in a 1983 paper with Peter Guadagni, published in Marketing Science, Little used the advent of scanner data for consumer goods to build a powerful model of consumer behavior and brand loyalty, which has remained influential.
Separate though these topics might be, Little always sought to explain the dynamics at work in each case. As a scholar, he “had the vision to perceive marketing as source of interesting and relevant unexplored opportunities for OR [operations research] and management science,” wrote Little’s MIT colleagues John Hauser and Glen Urban in a biographical chapter about him, “Profile of John D.C. Little,” for the book “Profiles in Operations Research,” published in 2011. In it, Hauser and Urban detail the lasting contributions these papers and others made.
By 1967, Little had co-founded the firm Management Decision Systems, which modeled marketing problems for major companies and was later purchased by Information Resources, Inc., on whose board Little served.
In 1989, Little was named Institute Professor, MIT’s highest faculty honor. He had previously served as director of the MIT Operations Research Center. At MIT Sloan he was the former head of the Management Science Area and the Behavioral and Policy Sciences Area.
For all his productivity as a scholar, Little also served as a valued mentor to many, while opening his family home outside of Boston to overseas-based faculty and students for annual Thanksgiving dinners. He also took pride in encouraging women to enter management and academia. In just one example, he was the principal faculty advisor for the late Asha Seth Kapadia SM ’65, one of the first international and female students at Sloan, who studied queuing theory and later became a longtime professor at the University of Texas School of Public Health.
Additionally, current MIT Sloan professor Juanjuan Zhang credits Little for inspiring her interest in the field; today Zhang is the John D.C. Little Professor of Marketing at MIT Sloan.
“John was a larger-than-life person,” Zhang says. “His foundational work transformed marketing from art to art, science, and engineering, making it a process that ordinary people can follow to succeed. He democratized marketing.”
Little’s presence as an innovative, interdisciplinary scholar who also encouraged others to pursue their own work is fundamental to the way he is remembered at MIT.
“John pioneered in operations research at MIT and is widely known for Little’s Law, but he did even more work in marketing science,” said Urban, an emeritus dean of MIT Sloan and the David Austin Professor in Marketing, Emeritus. “He founded the field of operations research modeling in marketing, with analytic work on adaptive advertising, and did fundamental work on marketing response. He was true to our MIT philosophy of “mens et manus” [“mind and hand”] as he proposed that models should be usable by managers as well as being theoretically strong. Personally, John hired me as an assistant professor in 1966 and supported my work in the following 55 years at MIT. I am grateful to him, and sad to lose a friend and mentor.”
Hauser, the Kirin Professor of Marketing at MIT Sloan, added: “John made seminal contributions to many fields from operations to management science to founding marketing science. More importantly, he was a unique colleague who mentored countless faculty and students and who, by example, led with integrity and wit. I, and many others, owe our love of operations research and marketing science to John.”
In recognition of his scholarship, Little was elected to the National Academy of Engineering, and was a fellow of the American Association for the Advancement of Science. Among other honors, the American Marketing Association gave Little its Charles Parlin Award for contributions to the practice of marketing research, in 1979, and its Paul D. Converse Award for lifetime achievement, in 1992. Little was the first president of INFORMS, which honored him with its George E. Kimball Medal. Little was also president of The Institute of Management Sciences (TIMS), and the Operations Research Society of America (ORSA).
An avid jogger, biker, and seafood chef, Little was dedicated to his family. He was predeceased by his wife, Elizabeth, and his two sisters, Margaret and Francis. Little is survived by his children Jack, Sarah, Thomas, and Ruel; eight grandchildren; and two great-grandchildren. Arrangements for a memorial service have been entrusted to the Dee Funeral Home in Concord, Massachusetts.
Artificial intelligence meets “blisk” in new DARPA-funded collaboration
A recent award from the U.S. Defense Advanced Research Projects Agency (DARPA) brings together researchers from Massachusetts Institute of Technology (MIT), Carnegie Mellon University (CMU), and Lehigh University (Lehigh) under the Multiobjective Engineering and Testing of Alloy Structures (METALS) program. The team will research novel design tools for the simultaneous optimization of shape and compositional gradients in multi-material structures that complement new high-throughput materials testing techniques, with particular attention paid to the bladed disk (blisk) geometry commonly found in turbomachinery (including jet and rocket engines) as an exemplary challenge problem.
“This project could have important implications across a wide range of aerospace technologies. Insights from this work may enable more reliable, reusable rocket engines that will power the next generation of heavy-lift launch vehicles,” says Zachary Cordero, the Esther and Harold E. Edgerton Associate Professor in the MIT Department of Aeronautics and Astronautics (AeroAstro) and the project’s lead principal investigator. “This project merges classical mechanics analyses with cutting-edge generative AI design technologies to unlock the plastic reserve of compositionally graded alloys, allowing safe operation in previously inaccessible conditions.”
Different locations in blisks require different thermomechanical properties and performance, such as creep resistance, low-cycle fatigue resistance, and high strength. Large-scale production also necessitates consideration of cost and sustainability metrics, such as the sourcing and recycling of alloys, in the design.
“Currently, with standard manufacturing and design procedures, one must come up with a single magical material, composition, and processing parameters to meet ‘one part-one material’ constraints,” says Cordero. “Desired properties are also often mutually exclusive, prompting inefficient design tradeoffs and compromises.”
Although a one-material approach may be optimal for a single location in a component, it may leave other locations exposed to failure, or it may require a critical material to be carried throughout an entire part when it is only needed in a specific location. With the rapid advancement of additive manufacturing processes that enable voxel-based composition and property control, the team sees unique opportunities for leap-ahead performance in structural components.
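As a toy illustration of the multiobjective tradeoff described above (and not of the METALS tools themselves), the sketch below scores hypothetical candidate designs on two invented, competing objectives and keeps only the non-dominated (Pareto-optimal) ones.

```python
# Toy multiobjective tradeoff: keep only the non-dominated (Pareto-optimal)
# candidate designs. Objectives and scores are invented for illustration.
import random

random.seed(0)
# Each candidate: (creep resistance, low-cycle fatigue life) -- both to be maximized.
candidates = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(50)]

def dominates(a, b):
    """a dominates b if it is at least as good in both objectives and not identical."""
    return a[0] >= b[0] and a[1] >= b[1] and a != b

pareto = [c for c in candidates
          if not any(dominates(other, c) for other in candidates)]

for creep, fatigue in sorted(pareto):
    print(f"creep resistance {creep:.2f}   fatigue life {fatigue:.2f}")
```

The actual design problem couples many more objectives with shape and compositional variables at every location in the part, which is where the program’s generative design tools come in.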
Cordero’s collaborators include Zoltan Spakovszky, the T. Wilson (1953) Professor in Aeronautics in AeroAstro; A. John Hart, the Class of 1922 Professor and head of the Department of Mechanical Engineering; Faez Ahmed, ABS Career Development Assistant Professor of mechanical engineering at MIT; S. Mohadeseh Taheri-Mousavi, assistant professor of materials science and engineering at CMU; and Natasha Vermaak, associate professor of mechanical engineering and mechanics at Lehigh.
The team’s expertise spans hybrid integrated computational material engineering and machine-learning-based material and process design, precision instrumentation, metrology, topology optimization, deep generative modeling, additive manufacturing, materials characterization, thermostructural analysis, and turbomachinery.
“It is especially rewarding to work with the graduate students and postdoctoral researchers collaborating on the METALS project, spanning from developing new computational approaches to building test rigs operating under extreme conditions,” says Hart. “It is a truly unique opportunity to build breakthrough capabilities that could underlie propulsion systems of the future, leveraging digital design and manufacturing technologies.”
This research is funded by DARPA under contract HR00112420303. The views, opinions, and/or findings expressed are those of the author and should not be interpreted as representing the official views or policies of the Department of Defense or the U.S. government and no official endorsement should be inferred.
Study finds mercury pollution from human activities is declining
MIT researchers have some good environmental news: Mercury emissions from human activity have been declining over the past two decades, despite global emissions inventories that indicate otherwise.
In a new study, the researchers analyzed measurements from all available monitoring stations in the Northern Hemisphere and found that atmospheric concentrations of mercury declined by about 10 percent between 2005 and 2020.
They used two separate modeling methods to determine what is driving that trend. Both techniques pointed to a decline in mercury emissions from human activity as the most likely cause.
Global inventories, on the other hand, have reported opposite trends. These inventories estimate atmospheric emissions using models that incorporate average emission rates of polluting activities and the scale of these activities worldwide.
“Our work shows that it is very important to learn from actual, on-the-ground data to try and improve our models and these emissions estimates. This is very relevant for policy because, if we are not able to accurately estimate past mercury emissions, how are we going to predict how mercury pollution will evolve in the future?” says Ari Feinberg, a former postdoc in the Institute for Data, Systems, and Society (IDSS) and lead author of the study.
The new results could help inform scientists who are embarking on a collaborative, global effort to evaluate pollution models and develop a more in-depth understanding of what drives global atmospheric concentrations of mercury.
However, due to a lack of data from global monitoring stations and limitations in the scientific understanding of mercury pollution, the researchers couldn’t pinpoint a definitive reason for the mismatch between the inventories and the recorded measurements.
“It seems like mercury emissions are moving in the right direction, and could continue to do so, which is heartening to see. But this was as far as we could get with mercury. We need to keep measuring and advancing the science,” adds co-author Noelle Selin, an MIT professor in the IDSS and the Department of Earth, Atmospheric and Planetary Sciences (EAPS).
Feinberg and Selin, his MIT postdoctoral advisor, are joined on the paper by an international team of researchers that contributed atmospheric mercury measurement data and statistical methods to the study. The research appears this week in the Proceedings of the National Academy of Sciences.
Mercury mismatch
The Minamata Convention is a global treaty that aims to cut human-caused emissions of mercury, a potent neurotoxin that enters the atmosphere from sources like coal-fired power plants and small-scale gold mining.
The treaty, which was signed in 2013 and went into force in 2017, is evaluated every five years. The first meeting of its conference of parties coincided with disheartening news reports that said global inventories of mercury emissions, compiled in part from information from national inventories, had increased despite international efforts to reduce them.
This was puzzling news for environmental scientists like Selin. Data from monitoring stations showed atmospheric mercury concentrations declining during the same period.
Bottom-up inventories combine emission factors, such as the amount of mercury that enters the atmosphere when coal mined in a certain region is burned, with estimates of pollution-causing activities, like how much of that coal is burned in power plants.
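In code, a stylized bottom-up calculation might look like the sketch below; the source categories, emission factors, and activity levels are invented placeholders, not real inventory data.

```python
# Stylized bottom-up inventory: emissions = emission factor x activity level,
# summed over source categories. All numbers are invented placeholders.

sources = {
    # category: (emission factor, kg Hg per unit of activity; activity per year)
    "coal combustion":         (0.10, 8000.0),
    "small-scale gold mining": (0.90, 1500.0),
    "waste incineration":      (0.05, 2000.0),
}

total = 0.0
for category, (factor, activity) in sources.items():
    emissions = factor * activity               # kg Hg per year from this category
    total += emissions
    print(f"{category:<24} {emissions:8.0f} kg Hg/yr")

print(f"{'total':<24} {total:8.0f} kg Hg/yr")
```

Errors in either the factors or the activity estimates propagate directly into the total, which is one way a bottom-up inventory can drift away from what monitoring stations actually observe.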
“The big question we wanted to answer was: What is actually happening to mercury in the atmosphere and what does that say about anthropogenic emissions over time?” Selin says.
Modeling mercury emissions is especially tricky. First, mercury is the only metal that is in liquid form at room temperature, so it has unique properties. Moreover, mercury that has been removed from the atmosphere by sinks like the ocean or land can be re-emitted later, making it hard to identify primary emission sources.
At the same time, mercury is more difficult to study in laboratory settings than many other air pollutants, especially due to its toxicity, so scientists have limited understanding of all chemical reactions mercury can undergo. There is also a much smaller network of mercury monitoring stations, compared to other polluting gases like methane and nitrous oxide.
“One of the challenges of our study was to come up with statistical methods that can address those data gaps, because available measurements come from different time periods and different measurement networks,” Feinberg says.
Multifaceted models
The researchers compiled data from 51 stations in the Northern Hemisphere. They used statistical techniques to aggregate data from nearby stations, which helped them overcome data gaps and evaluate regional trends.
By combining data from 11 regions, their analysis indicated that Northern Hemisphere atmospheric mercury concentrations declined by about 10 percent between 2005 and 2020.
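A bare-bones version of turning station records into a regional trend might look like the following; the annual means are invented, and the study’s statistical treatment is considerably more sophisticated than a plain least-squares fit.

```python
# Sketch of estimating a regional trend from a few station records.
# Annual-mean concentrations (ng/m^3) are invented for illustration.
years = list(range(2005, 2021))

stations = {
    "station_A": [1.60 - 0.010 * i for i in range(len(years))],
    "station_B": [1.55 - 0.012 * i for i in range(len(years))],
    "station_C": [1.65 - 0.009 * i for i in range(len(years))],
}

def slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)

# Average the stations into a regional series, then fit a linear trend.
regional = [sum(vals) / len(stations) for vals in zip(*stations.values())]
b = slope(years, regional)
change = 100 * b * (years[-1] - years[0]) / regional[0]
print(f"trend: {b:.4f} ng/m^3 per year ({change:.1f}% over 2005-2020)")
```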
Then the researchers used two modeling methods — biogeochemical box modeling and chemical transport modeling — to explore possible causes of that decline. Box modeling was used to run hundreds of thousands of simulations to evaluate a wide array of emission scenarios. Chemical transport modeling is more computationally expensive but enables researchers to assess the impacts of meteorology and spatial variations on trends in selected scenarios.
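To give a sense of what a biogeochemical box model is, here is a deliberately minimal two-box sketch: an atmosphere and a surface reservoir exchanging mercury, with a slow burial sink. The rate constants, reservoir sizes, and emission scenarios are placeholders, not the values or structure used in the study.

```python
# Deliberately minimal two-box model: atmosphere <-> surface exchange plus a
# slow burial sink. All rates, reservoir sizes, and emissions are placeholders.

def atmospheric_burden(emissions_per_year, years=100, dt=0.1):
    atm, surf = 4000.0, 300_000.0                  # initial masses (arbitrary units)
    k_dep, k_reemit, k_burial = 0.7, 0.01, 0.005   # first-order rates (1/yr)
    for _ in range(int(years / dt)):
        dep = k_dep * atm               # deposition: atmosphere -> surface
        reemit = k_reemit * surf        # legacy re-emission: surface -> atmosphere
        burial = k_burial * surf        # permanent removal from the system
        atm += (emissions_per_year - dep + reemit) * dt
        surf += (dep - reemit - burial) * dt
    return atm

# Compare a constant-emissions scenario with one where emissions are cut by 10%.
base = atmospheric_burden(2000.0)
reduced = atmospheric_burden(1800.0)
print(f"constant emissions: {base:.0f}   10% cut: {reduced:.0f}")
print(f"relative difference: {100 * (reduced - base) / base:.1f}%")
```

Even in this toy setup, a 10 percent emissions cut yields a smaller percentage drop in the atmospheric burden over the simulated window, because legacy mercury keeps re-emitting from the surface reservoir; effects like this are part of what makes attributing observed trends to primary emissions difficult.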
For instance, they tested the hypothesis that an additional environmental sink is removing more mercury from the atmosphere than previously thought. The models allowed them to assess whether an unknown sink of that magnitude is plausible.
“As we went through each hypothesis systematically, we were pretty surprised that we could really point to declines in anthropogenic emissions as being the most likely cause,” Selin says.
Their work underscores the importance of long-term mercury monitoring stations, Feinberg adds. Many stations the researchers evaluated are no longer operational because of a lack of funding.
While their analysis couldn’t zero in on exactly why the emissions inventories didn’t match up with actual data, they have a few hypotheses.
One possibility is that global inventories are missing key information from certain countries. For instance, the researchers resolved some discrepancies when they used a more detailed regional inventory from China. But there was still a gap between observations and estimates.
They also suspect the discrepancy might be the result of changes in two large sources of mercury that are particularly uncertain: emissions from small-scale gold mining and mercury-containing products.
Small-scale gold mining involves using mercury to extract gold from soil and is often performed in remote parts of developing countries, making it hard to estimate. Yet small-scale gold mining contributes about 40 percent of human-made emissions.
In addition, it’s difficult to determine how long it takes the pollutant to be released into the atmosphere from discarded products like thermometers or scientific equipment.
“We’re not there yet where we can really pinpoint which source is responsible for this discrepancy,” Feinberg says.
In the future, researchers from multiple countries, including MIT, will collaborate to study and improve the models they use to estimate and evaluate emissions. This research will be influential in helping that project move the needle on monitoring mercury, he says.
This research was funded by the Swiss National Science Foundation, the U.S. National Science Foundation, and the U.S. Environmental Protection Agency.