Skip Levens, Marketing Director, Media & Entertainment, Quantum – Interview Series

Skip Levens is a product leader and AI strategist at Quantum, a leader in data management solutions for AI and unstructured data. He is currently responsible for driving engagement, awareness, and growth for Quantum’s end-to-end solutions. Throughout his career – which has included stops at organizations like Apple, Backblaze, Symply, and Active Storage – he has successfully led marketing, business development, and evangelism efforts, launched new products, built relationships with key stakeholders, and driven revenue growth.

Quantum provides end-to-end data solutions that help organizations manage, enrich, and protect unstructured data, such as video and audio files, at scale. Its technology focuses on transforming that data into actionable insights, enabling businesses to extract value and make informed decisions. Quantum’s platform offers secure, scalable, and flexible solutions, combining onsite infrastructure with cloud capabilities. The company’s approach allows businesses to efficiently handle data growth while ensuring security and flexibility throughout the data lifecycle.

Can you provide an overview of Quantum’s approach to AI-driven data management for unstructured data?

By helping organizations integrate artificial intelligence (AI) and machine learning (ML) into their key business operations, Quantum enables customers to effectively manage and unlock meaningful value from their unstructured data, creating actionable insights that lead to better business decisions. By building their own AI/ML tools, companies can move from simply coping with the influx of data and content to leveraging insights as a new driver of efficiency, one that ultimately amplifies human expertise in all phases of business operations.

How does Quantum’s AI technology analyze unstructured data, and what are some key innovations that set your platform apart from competitors?

In the initial stages of adopting AI/ML tools, many organizations find that their workflows become disordered and disconnected and that they lose track of their data, making it difficult to enforce security and protection standards. Too often, early development is hampered by ill-suited storage and file system performance.

We developed Myriad, a high-performance, software-defined file storage and intelligent fabric environment, to elegantly meet the challenges of integrating AI/ML pipelines and high-performance workflows together, unifying workflows without the hardware constraints and limitations of other systems. Myriad is a clear departure from legacy hardware and storage constraints: built with the latest storage and cloud technologies, it is entirely microservices-driven and orchestrated by Kubernetes to be a highly responsive system that rarely requires admin interaction. Myriad is architected to draw the highest performance from NVMe storage and intelligent fabric networking, with near-instantaneous remote direct memory access (RDMA) connections between every component. The result is an innovative system that responds intelligently and automatically to changes and requires minimal admin intervention for common tasks. By making the intelligent fabric part of the system, Myriad is also intrinsically load-balanced, presenting multiple 100Gbps ports of bandwidth behind a single, balanced IP address.

Pairing Myriad with our cloud-like object storage system, ActiveScale, allows organizations to archive and preserve even the largest data lakes and content. The combination offers customers a true end-to-end data management solution for their AI pipelines. Moreover, when delivered alongside our CatDV solution, customers can tag and catalog data to further enrich their data and prepare it for analysis and AI.

Could you share insights on the use of AI with video surveillance at the Paris Olympics, and what other large-scale events or organizations have utilized this technology?

Machine learning can develop repeatable actions that recognize patterns of interest in video and derive insights from a flood of real-time video data at a scale larger and faster than is possible through human effort alone. Video surveillance, for example, can use AI to capture and flag suspicious behavior as it occurs, even when hundreds of cameras are feeding the model information. A human attempting this task can only process one event at a time, whereas AI-powered video surveillance can take on thousands of cases simultaneously.

Another application is crowd sentiment analysis, which can track long queues and pinpoint potential frustrations. These are all events that a security expert can reliably flag, but by using AI/ML systems to continuously watch simultaneous feeds, those experts are freed to take appropriate action when needed, dramatically boosting overall effectiveness and safety.
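As a rough illustration of that scale argument, the sketch below screens many camera feeds concurrently with a placeholder analysis function. It is not Quantum's implementation; the model call, camera IDs, and alert threshold are assumptions for demonstration only.

```python
# Illustrative sketch only: screening many camera feeds in parallel.
# analyze_frame() is a hypothetical stand-in for a real video-analysis model.
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass
from typing import Optional
import random

@dataclass
class Alert:
    camera_id: str
    label: str
    confidence: float

def analyze_frame(camera_id: str, frame: bytes) -> Optional[Alert]:
    """Hypothetical model call: flag suspicious activity in a single frame."""
    score = random.random()  # stand-in for a real inference score
    if score > 0.95:
        return Alert(camera_id, "suspicious_behavior", score)
    return None

def screen_feeds(frames_by_camera: dict) -> list:
    """Screen every camera's latest frame concurrently and collect any alerts."""
    with ThreadPoolExecutor(max_workers=32) as pool:
        results = pool.map(lambda item: analyze_frame(*item), frames_by_camera.items())
    return [alert for alert in results if alert is not None]

if __name__ == "__main__":
    # 200 simulated cameras, each contributing its most recent frame
    frames = {f"cam-{i:03d}": b"<frame bytes>" for i in range(200)}
    for alert in screen_feeds(frames):
        print(f"{alert.camera_id}: {alert.label} ({alert.confidence:.2f})")
```

The point is simply that the per-feed work fans out across workers, so adding cameras does not serialize the review the way it would for a single human operator.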

What are the primary challenges organizations face when implementing AI for unstructured data analysis, and how does Quantum help mitigate these challenges?

Organizations must completely reimagine their approach to storage, as well as data and content management as a whole. Most organizations grow their storage capabilities organically, usually in response to one-off needs, and this creates multi-vendor confusion and unnecessary complexity.

With the adoption of AI, organizations must now simplify the storage that underpins their operations. Oftentimes, this means implementing a “hot” tier for initial data ingest, a landing zone where applications and users can work as fast as possible. Then, a large “cold” tier is added that can archive massive amounts of data and protect it cost-effectively, with the ability to move that data back into the “hot” processing workflow almost instantaneously.

Reimagining storage as fewer, more compact solutions greatly lowers the burden on admin staff. This kind of “hot/cold” data management solution is ideal for AI/ML workflow integration, and Quantum solutions enable customers to create a highly agile, flexible platform that is streamlined and easy to manage.
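To make the hot/cold idea concrete, here is a minimal sketch of an age-based tiering policy. It models the two tiers as plain directories at assumed paths; a real deployment would pair a fast file system (hot) with an object-storage archive such as ActiveScale (cold), and the 30-day threshold is an arbitrary example.

```python
# Minimal sketch of an age-based "hot/cold" tiering policy over two directories.
import shutil
import time
from pathlib import Path

HOT = Path("/data/hot")      # fast landing zone for active work (assumed path)
COLD = Path("/data/cold")    # low-cost archive tier (assumed path)
AGE_LIMIT = 30 * 24 * 3600   # demote files untouched for 30 days (assumed policy)

def demote_cold_candidates() -> None:
    """Move files that have gone cold out of the hot tier into the archive."""
    now = time.time()
    for f in list(HOT.rglob("*")):
        if f.is_file() and now - f.stat().st_atime > AGE_LIMIT:
            target = COLD / f.relative_to(HOT)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(f), str(target))

def recall(relative_path: str) -> Path:
    """Bring an archived file back into the hot tier for reprocessing."""
    src = COLD / relative_path
    dst = HOT / relative_path
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.move(str(src), str(dst))
    return dst
```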

How do Quantum’s AI innovations integrate with other AI-powered tools and technologies to enhance organizational growth and efficiency?

Many people think storage for AI/ML tools is only about feeding graphics processing units (GPUs), but that’s just one small part of the equation. Though speed and high performance are instrumental in feeding data as fast as possible to the GPUs performing data analysis, the bigger picture revolves around how an organization can integrate iterative, ongoing AI/ML development, training, and inference loops based on custom data. Oftentimes the first and most important AI/ML task addressed is building “knowledge bots” or “counselor bots” that use proprietary data to inform internal knowledge workers. To make those knowledge bots useful and unique to each organization, large amounts of specialized information are required to inform the model that trains them. Cue an AI-powered storage solution: if that proprietary data is well-ordered and readily available in a streamlined storage workflow, it will be far easier to organize into types, sets, and catalogs of data, which will, in turn, ensure that those knowledge bots are highly informed about the organization’s unique needs.

Can you elaborate on the AI-enabled workflow management features and how they streamline data processes?

We’re building a host of AI-enabled workflow management tools that integrate directly into our storage solutions to automate tasks and provide valuable real-time insights, enabling fast and informed decision-making across organizations. These tools rely on new and advanced data classification and tagging systems that use AI both to organize data and make it easily retrievable, and to perform standard actions on that media, such as conforming it to a certain size, which significantly reduces the manual effort needed to organize data into training sets.

Intelligent automation tools manage data movement, backup, and compliance tasks based on set policies, ensuring consistent application and reducing administrative burdens. Real-time analytics and monitoring also offer immediate insights into data usage patterns and potential issues, automatically maintaining data integrity and quality throughout its entire lifecycle.
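As a simple illustration of tagging feeding a searchable catalog, the sketch below walks a media directory, tags each file, and groups files by tag. The classify() function, directory path, and tag names are hypothetical placeholders, not CatDV or any Quantum API.

```python
# Illustrative sketch of AI-assisted tagging feeding a searchable catalog.
# classify() is a hypothetical placeholder for a real model or tagging service.
import json
from pathlib import Path

def classify(path: Path) -> list:
    """Hypothetical classifier: return a few tags for one media file."""
    tags = [path.suffix.lstrip(".").lower() or "unknown"]
    if path.stat().st_size > 1_000_000_000:   # flag assets over ~1 GB
        tags.append("large-asset")
    return tags

def build_catalog(root: Path) -> dict:
    """Map each tag to the files that carry it, ready for training-set selection."""
    catalog = {}
    for f in root.rglob("*"):
        if f.is_file():
            for tag in classify(f):
                catalog.setdefault(tag, []).append(str(f))
    return catalog

if __name__ == "__main__":
    catalog = build_catalog(Path("/data/media"))          # assumed media location
    Path("catalog.json").write_text(json.dumps(catalog, indent=2))
    print(f"{len(catalog)} tags across {sum(len(v) for v in catalog.values())} files")
```

Once assets are grouped by tag like this, assembling a training set becomes a matter of selecting tags rather than hand-sorting files.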

What is the outlook for AI-powered data management, and what trends do you foresee in the coming years?

As these tools evolve and become multi-modal, they will allow more expressive and open-ended ways of working with your data. In the future, you’ll be able to have a “conversation” with your system and be presented with information or analytics of interest, such as “What is the fastest-growing type of data in my hot zone right now?” This level of specialization will be a differentiator for the organizations that build these tools into their storage solutions, making them more accurate and efficient even when confronted with constant new streams of evolving data.
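A toy version of that example question might look like the following, where “growth” is simply bytes written per file extension over the last day. The hot-zone path and time window are assumptions, and a real system would answer from its own metadata catalog rather than by walking the file system.

```python
# Toy version of the query above: which file type is growing fastest in the hot zone?
# "Growth" here is just bytes written in the last 24 hours, grouped by extension.
import time
from collections import defaultdict
from pathlib import Path

def fastest_growing_type(hot_zone: Path, window_seconds: int = 86400) -> str:
    cutoff = time.time() - window_seconds
    growth = defaultdict(int)
    for f in hot_zone.rglob("*"):
        if f.is_file() and f.stat().st_mtime >= cutoff:
            growth[f.suffix.lower() or "<none>"] += f.stat().st_size
    return max(growth, key=growth.get) if growth else "<no recent writes>"

print(fastest_growing_type(Path("/data/hot")))   # assumed hot-zone path
```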

What role do your cloud-based analytics and storage-as-a-service offerings play in the overall data management strategy?

Organizations with significant and expanding storage requirements often struggle to keep up with demand, especially when operating on limited budgets. Public cloud storage can lead to high and unpredictable costs, making it challenging to accurately estimate and purchase years’ worth of storage needs in advance. Many customers want the public cloud experience of a known, projected operating cost without the surprise egress or API charges that public cloud can bring. To answer this need, we developed Quantum GO to give customers that private cloud experience, with a low initial entry point and low fixed monthly payment options for a true storage-as-a-service experience in their own facility. As storage requirements increase, Quantum GO gives customers the added advantage of a simple ‘pay-as-you-grow’ subscription model, offering enhanced flexibility and scalability in a cost-effective manner.

How does Quantum plan to stay ahead in the rapidly evolving AI and data management landscape?

In today’s world, being merely a “storage provider” is not enough. Newly evolving data and business challenges require an intelligent, AI-empowering data platform that helps customers to maximize the value of their data. At Quantum, we continue to innovate and invest in enhanced capabilities for our customers to help them easily and effectively work with troves of data throughout their entire lifecycles.

We are expanding AI capabilities that uplevel the tagging, cataloguing, and organizing of data, making it easier than ever to search, find, and analyze it to extract more value and insight. We will continue to enhance our AI capabilities for automatic video transcription, translating audio and video files into other languages within seconds, and enabling quick searches across thousands of files to identify spoken words or locate specific items.
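As a rough sketch of that last capability, searching for a spoken phrase can be as simple as scanning transcripts, assuming each video already has a sidecar .txt transcript produced by an earlier speech-to-text step. The directory path and phrase below are illustrative only.

```python
# Rough sketch of searching transcripts for a spoken phrase. Assumes each video
# already has a sidecar .txt transcript produced by a speech-to-text step.
from pathlib import Path

def find_spoken_phrase(transcript_dir: Path, phrase: str) -> list:
    """Return transcripts (and by extension their videos) containing the phrase."""
    phrase = phrase.lower()
    return [t for t in transcript_dir.rglob("*.txt")
            if phrase in t.read_text(errors="ignore").lower()]

for hit in find_spoken_phrase(Path("/data/transcripts"), "opening ceremony"):  # assumed path
    print(hit)
```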

What advice would you give to organizations just beginning their journey with AI and unstructured data management?

AI/ML has generated enormous hype, and because of that it can be difficult to parse out what’s practical and useful. Organizations must first think about the data being created and pinpoint how it’s being generated, captured, and preserved. Further, organizations must seek out a storage solution that is ready to access and retrieve data as needed, and one that will help guide both day-to-day workflows and future evolution. Even if it’s hard to agree on what the ultimate AI goals are, taking steps now to make sure that storage systems and data workflows are streamlined, simplified, and robust will pay enormous dividends when integrating current and future AI/ML initiatives. Organizations will then be well-positioned to keep exploring how these AI/ML tools can advance their mission without worrying about whether they can properly support them with the right data management platform.

Thank you for the great interview; readers who wish to learn more should visit Quantum.