Turn the tables: Advance your security with evidence-backed expert insights – CyberTalk
EXECUTIVE SUMMARY:
In 2023, the cyber threat landscape evolved at a record-breaking pace. Global cyber attacks increased by roughly 48%. Threats became more sophisticated and expensive to contend with than ever before.
Cyber security leaders who overlook recent developments risk being blindsided by powerful, persistent and potentially damaging cyber adversaries. The bad actors are well-financed and are finding network “footholds” in unprecedented ways.
More than two-thirds of companies report having experienced a cyber attack in the last 12 months. The alarming truth is that businesses aren’t adapting fast enough. As cyber crime accelerates, will your organization be able to keep up?
Included in the report
Newly developed ransomware techniques have intensified the malware’s impact on businesses. In 2023, Check Point Research observed a notable spike in large-scale ransomware attacks intended to disrupt multiple businesses in quick succession. Actual incidents impacted hundreds or thousands of entities.
“By failing to prepare, you are preparing to fail.” – Benjamin Franklin
Other evolving threats are even more pernicious. Threat actors have developed new tactics that covertly exploit edge devices to execute large-scale DDoS attacks, spam campaigns and network takeovers. Attackers have also increased their use of AI to scale their efforts; AI is now used to analyze information, enhance phishing threats and automate attacks.
The aforementioned represent just a fraction of the ways in which cyber attackers and attacks are becoming more sophisticated. Ensure that your organization knows which advanced security tools to implement. In the Check Point Research Cyber Security Report, get expert recommendations for strategic and innovative products that can keep pace with the latest threats.
Context as a compass
Context around contemporary threats can be just as critical as product recommendations: context is what enables your organization to ‘see around corners’ and predict problems. Context defines agility. In short, context enables organizations to effectively anticipate, adapt to and respond to threats.
The environmental information included in the Check Point Research Cyber Security Report empowers security leaders to identify issues with greater accuracy and to present a stronger response.
Further details
In 2024, for the vast majority of organizations, confronting cyber security challenges will be a core business objective. As your organization looks ahead, ensure that it accounts for the latest cyber security trends, research, intelligence analyses and recommendations — as explained by preeminent industry experts.
Get valuable insights that can translate to stronger cyber security, improved business resilience, fewer nightmares and more sleep for your security staff. Discover why leading brands trust the Check Point Research Cyber Security Report. Download now.
For more information about the forces shaping the cyber threat landscape, subscribe to the CyberTalk.org newsletter. Get timely insights, cutting-edge analyses and more, delivered straight to your inbox each week.
Using generative AI to improve software testing
Generative AI is getting plenty of attention for its ability to create text and images. But those media represent only a fraction of the data that proliferate in our society today. Data are generated every time a patient goes through a medical system, a storm impacts a flight, or a person interacts with a software application.
Using generative AI to create realistic synthetic data around those scenarios can help organizations more effectively treat patients, reroute planes, or improve software platforms — especially in scenarios where real-world data are limited or sensitive.
For the last three years, the MIT spinout DataCebo has offered a generative software system called the Synthetic Data Vault to help organizations create synthetic data to do things like test software applications and train machine learning models.
The Synthetic Data Vault, or SDV, has been downloaded more than 1 million times, with more than 10,000 data scientists using the open-source library for generating synthetic tabular data. The founders — Principal Research Scientist Kalyan Veeramachaneni and alumna Neha Patki ’15, SM ’16 — believe the company’s success is due to SDV’s ability to revolutionize software testing.
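As a rough illustration of that workflow, the sketch below follows SDV’s publicly documented single-table API: fit a generative model to a pandas DataFrame, then sample synthetic rows with the same statistical shape. The table, column names, and values here are invented for illustration, and exact class names may differ between SDV releases.

```python
import numpy as np
import pandas as pd
from sdv.metadata import SingleTableMetadata
from sdv.single_table import GaussianCopulaSynthesizer

rng = np.random.default_rng(0)

# Hypothetical "real" table standing in for a sensitive production dataset.
real_data = pd.DataFrame({
    "balance": rng.gamma(shape=2.0, scale=500.0, size=1000).round(2),
    "account_type": rng.choice(["checking", "savings"], size=1000),
    "is_active": rng.random(1000) > 0.1,
})

# Describe column types so the synthesizer knows how to model each field.
metadata = SingleTableMetadata()
metadata.detect_from_dataframe(real_data)

# Learn the statistical properties of the real data, then sample as many
# synthetic rows as a test suite or ML experiment needs.
synthesizer = GaussianCopulaSynthesizer(metadata)
synthesizer.fit(real_data)
synthetic_data = synthesizer.sample(num_rows=5000)

print(synthetic_data.head())
```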
SDV goes viral
In 2016, Veeramachaneni’s group in the Data to AI Lab unveiled a suite of open-source generative AI tools to help organizations create synthetic data that matched the statistical properties of real data.
Companies can use synthetic data instead of sensitive information in programs while still preserving the statistical relationships between datapoints. Companies can also use synthetic data to run new software through simulations to see how it performs before releasing it to the public.
Veeramachaneni’s group first came across the problem while working with companies that wanted to share their data for research.
“MIT helps you see all these different use cases,” Patki explains. “You work with finance companies and health care companies, and all those projects are useful to formulate solutions across industries.”
In 2020, the researchers founded DataCebo to build more SDV features for larger organizations. Since then, the use cases have been as impressive as they’ve been varied.
With DataCebo’s new flight simulator, for instance, airlines can plan for rare weather events in a way that would be impossible using only historic data. In another application, SDV users synthesized medical records to predict health outcomes for patients with cystic fibrosis. A team from Norway recently used SDV to create synthetic student data to evaluate whether various admissions policies were meritocratic and free from bias.
In 2021, the data science platform Kaggle hosted a competition for data scientists that used SDV to create synthetic data sets to avoid using proprietary data. Roughly 30,000 data scientists participated, building solutions and predicting outcomes based on the company’s realistic data.
And as DataCebo has grown, it’s stayed true to its MIT roots: All of the company’s current employees are MIT alumni.
Supercharging software testing
Although its open-source tools are used for a wide variety of purposes, the company is focused on growing its traction in software testing.
“You need data to test these software applications,” Veeramachaneni says. “Traditionally, developers manually write scripts to create synthetic data. With generative models, created using SDV, you can learn from a sample of data collected and then sample a large volume of synthetic data (which has the same properties as real data), or create specific scenarios and edge cases, and use the data to test your application.”
For example, if a bank wanted to test a program designed to reject transfers from accounts with no money in them, it would have to simulate many accounts simultaneously transacting. Doing that with data created manually would take a lot of time. With DataCebo’s generative models, customers can create any edge case they want to test.
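A minimal sketch of how such an edge case might be generated with SDV’s conditional sampling helpers, building on the hypothetical fitted synthesizer and columns from the earlier sketch (the zero-balance scenario and field names are illustrative, not taken from DataCebo’s customer work):

```python
from sdv.sampling import Condition

# Request an edge case: otherwise-realistic accounts that all have a zero
# balance, so the transfer-rejection logic under test gets exercised.
zero_balance = Condition(column_values={"balance": 0.0}, num_rows=500)

# SDV fixes the conditioned column and generates the remaining fields from
# the learned distributions (it may use rejection sampling internally).
edge_case_accounts = synthesizer.sample_from_conditions(conditions=[zero_balance])

print(edge_case_accounts.head())
```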
“It’s common for industries to have data that is sensitive in some capacity,” Patki says. “Often when you’re in a domain with sensitive data you’re dealing with regulations, and even if there aren’t legal regulations, it’s in companies’ best interest to be diligent about who gets access to what at which time. So, synthetic data is always better from a privacy perspective.”
Scaling synthetic data
Veeramachaneni believes DataCebo is advancing the field of what it calls synthetic enterprise data, or data generated from user behavior on large companies’ software applications.
“Enterprise data of this kind is complex, and there is no universal availability of it, unlike language data,” Veeramachaneni says. “When folks use our publicly available software and report back whether it works on a certain pattern, we learn a lot of these unique patterns, and it allows us to improve our algorithms. From one perspective, we are building a corpus of these complex patterns, which for language and images is readily available.”
DataCebo also recently released features to improve SDV’s usefulness, including tools to assess the “realism” of the generated data (the SDMetrics library) and a way to compare models’ performance (SDGym).
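As a hedged sketch of what that assessment can look like in practice, the snippet below runs an SDMetrics quality report over the hypothetical real and synthetic tables from the earlier sketches; argument names follow the SDMetrics documentation, though details may vary between versions.

```python
from sdmetrics.reports.single_table import QualityReport

# Score how closely the synthetic table matches the real one: column-level
# distributions ("Column Shapes") and pairwise correlations ("Column Pair
# Trends"), each reported on a 0-1 scale.
report = QualityReport()
report.generate(
    real_data=real_data,
    synthetic_data=synthetic_data,
    metadata=metadata.to_dict(),  # SDMetrics expects a plain metadata dict
)

print(report.get_score())
print(report.get_details(property_name="Column Shapes"))
```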
“It’s about ensuring organizations trust this new data,” Veeramachaneni says. “[Our tools offer] programmable synthetic data, which means we allow enterprises to insert their specific insight and intuition to build more transparent models.”
As companies in every industry rush to adopt AI and other data science tools, DataCebo is ultimately helping them do so in a way that is more transparent and responsible.
“In the next few years, synthetic data from generative models will transform all data work,” Veeramachaneni says. “We believe 90 percent of enterprise operations can be done with synthetic data.”