Akamai, a cybersecurity and cloud computing company, has introduced Akamai Cloud Inference, a new service that delivers 3x higher throughput, 60% lower latency, and an 86% reduction in costs compared to traditional hyperscale infrastructure. The solution lets organizations put predictive and large language models (LLMs) to work in real-world applications, running on the highly distributed Akamai Cloud platform to overcome the limitations of centralized cloud models.
According to Adam Karon, Chief Operating Officer and General Manager of the Cloud Technology Group at Akamai, the future of AI lies in bringing data closer to users and devices, a task that legacy clouds struggle with. While the training of LLMs will continue to occur in large hyperscale data centers, the inferencing work will be done at the edge, leveraging Akamai’s platform that has been developed over the past 25 years to differentiate itself from other cloud providers.
Akamai Cloud Inference offers various tools for platform engineers and developers to build and run AI applications and data-intensive workloads closer to end users, resulting in significant improvements in throughput and latency. By utilizing Akamai’s solution, businesses can save substantially on AI inference costs compared to traditional hyperscale infrastructure. Key components of Akamai Cloud Inference include a versatile compute arsenal, advanced data management capabilities, containerization for optimized performance, and edge computing capabilities.
Overall, Akamai Cloud Inference provides a robust platform for low-latency, AI-powered applications, enabling companies to meet user demands effectively. Running on Akamai’s massively distributed platform, the solution delivers high throughput for data-intensive workloads across a global network of points of presence. As AI adoption evolves, enterprises are shifting focus from training large models to using lightweight AI models tailored to solve specific business problems, offering a better return on investment.
In conclusion, the increasing generation of data outside centralized data centers is driving the need for AI solutions that leverage data closer to its source. Distributed cloud and edge architectures are becoming essential for operational intelligence use cases, enabling enterprises to make real-time, informed decisions and enhance personalized experiences. Akamai Cloud has already seen success with various customer applications, such as in-car voice assistance, AI-powered crop management, and automated product description generators. Karon explained that AI inference is comparable to using a GPS, quickly applying knowledge, recalculating in real time, and adjusting to changes to guide you to your destination. He emphasized that inference is the next frontier for AI.
About Akamai
Akamai is a leading cybersecurity and cloud computing company that empowers and protects businesses online. Their top-notch security solutions, advanced threat intelligence, and global operations team work together to provide comprehensive defense to safeguard enterprise data and applications worldwide. Akamai’s full-stack cloud computing solutions offer high performance and affordability on a widely distributed platform. Global enterprises rely on Akamai for industry-leading reliability, scalability, and expertise to confidently grow their businesses. Learn more at akamai.com and akamai.com/blog, or follow Akamai Technologies on Twitter and LinkedIn.
Contacts
Akamai Media Relations: [email protected]
Akamai Investor Relations: [email protected]
SOURCE: Akamai Technologies, Inc.