10 Key Takeaways From Sam Altman’s Talk at Stanford

In a recent Q&A session at Stanford University, Sam Altman, the visionary CEO of OpenAI, shared invaluable insights on the future of artificial intelligence and its potential impact on society. As the co-founder of the research organization behind groundbreaking AI models like GPT and DALL-E, Altman’s…

Chuck Ros, SoftServe: Delivering transformative AI solutions responsibly

As the world embraces the transformative potential of AI, SoftServe is at the forefront of developing cutting-edge AI solutions while prioritising responsible deployment. Ahead of AI & Big Data Expo North America – where the company will showcase its expertise – Chuck Ros, Industry Success Director…

Andrew Gordon, Senior Research Consultant, Prolific – Interview Series

Andrew Gordon draws on his robust background in psychology and neuroscience to uncover insights as a researcher. With a BSc in Psychology, an MSc in Neuropsychology, and a PhD in Cognitive Neuroscience, Andrew leverages scientific principles to understand consumer motivations, behavior, and decision-making. Prolific was created by researchers…

AniPortrait: Audio-Driven Synthesis of Photorealistic Portrait Animation

Over the years, the creation of realistic and expressive portrait animations from static images and audio has found a range of applications including gaming, digital media, virtual reality, and more. Despite these potential applications, it is still difficult for developers to create frameworks capable…

5 Best B2B Customer Support Tools (May 2024)

In today’s fast-paced business landscape, providing exceptional customer support is crucial for B2B companies looking to build long-lasting relationships with their clients. To meet the evolving needs of customers and streamline support operations, businesses are turning to advanced tools and platforms that offer a range of…

Optimizing Memory for Large Language Model Inference and Fine-Tuning

Large language models (LLMs) like GPT-4, BLOOM, and LLaMA have achieved remarkable capabilities by scaling up to billions of parameters. However, deploying these massive models for inference or fine-tuning is challenging due to their immense memory requirements. In this technical blog, we will explore techniques for…