298 | Breaking Analysis | Worker Bee AGI – Why AWS Is Betting on Practical Agents, Not Messiah AGI

At AWS re:Invent 2025, Amazon faced a dual mandate – speak to millions of long-standing cloud customers while countering a persistent narrative that the company is lagging in AI. In our view, AWS chose a distinctly pragmatic path. Rather than chasing the holy grail of what we call messiah AGI or even competing head-on with frontier-scale LLMs, the company emphasized foundational agentic scaffolding and customizable large and small language models. This approach aligns with our thesis that the real near-term value in AI lies inside the enterprise – what we see as “worker-bee AGI” – not in aspirational, generalized intelligence.

How Agentic AI Rewires a SaaS Business: Lessons from a Unicorn

Digital labor is no longer emerging — it is becoming the defining operating model of modern service businesses. According to the Digital Labor Transformation Index, over 61% of enterprises believe the rise of digital labor is now inevitable, and the organizations seeing the highest ROI are those that shift from basic automation to knowledge-centered, agentic work. Few companies embody this shift as clearly as Vantaca, the newly minted $1.25B unicorn redefining community management through an AI-first architecture.

In this episode of The Next Frontiers of AI, Scott Hebner is joined by CEO Ben Currin to unpack how Vantaca rewired itself around a “UI, API, and AI-first” model, and how its platform now operationalizes millions of agentic workflows that free humans from low-value tasks and elevate their capacity for real community-building work.

Matt Garman’s re:Invent 2025 Keynote: AWS Balances its Legacy with a Cloud for the AI Era

AWS CEO Matt Garman used the re:Invent 2025 keynote to fuse the cloud of the past with the AI-native cloud taking shape today. His message: AI is not a “bolt-on” superglued to AWS services; rather, it is an evolution of the cloud itself. The keynote emphasized the company’s infrastructure, silicon roadmap, model platform, and emerging agentic ecosystem as the best place to build AI applications. The scale at which AWS is executing on this vision is impressive, but the real question on everyone’s mind was how AWS will balance the needs of its legacy cloud customers while also leading in AI innovation.

Zayo’s DynamicLink Brings Cloud-Like Consumption to Enterprise Networks

As enterprises accelerate their AI initiatives and embrace increasingly distributed architectures, the network is shifting from background utility to strategic enabler. theCUBE Research data shows that 93% of organizations now view the network as more important to achieving business goals than it was just two years ago. Against this backdrop, Zayo is aiming to make […]

The Agentic AI Masquerade: How to Tell What’s Real vs. Marketing

The industry is racing to claim “agentic AI,” but the reality looks very different. Scott Hebner and David Linthicum reveal why only 17% of enterprises are actually building real AI agents, what distinguishes assistants from agents, and why reasoning—not prompting—defines the next frontier of autonomous intelligence.

298 | Breaking Analysis | Resetting GPU Depreciation — Why AI Factories Bend, But Don’t Break, Useful Life Assumptions

In January 2020, Amazon changed the depreciation schedule for its server assets from three years to four years. Amazon made the move because it found it could extend the useful life of its servers beyond three years. Moore’s Law was waning, and at Amazon’s scale the company could serve a diverse set of use cases, squeezing more value out of its EC2 assets for a longer period of time. Other hyperscalers followed suit, and today the big three all assume six-year depreciation schedules for server assets.
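To illustrate why the useful-life assumption matters, the sketch below runs a simple straight-line depreciation calculation under three-, four-, and six-year lives. The fleet cost is a hypothetical round number chosen purely for illustration, not an Amazon figure.

```python
# Minimal straight-line depreciation sketch (hypothetical numbers, for illustration only).

def annual_depreciation(cost: float, useful_life_years: int, salvage: float = 0.0) -> float:
    """Annual straight-line depreciation expense for an asset pool."""
    return (cost - salvage) / useful_life_years

fleet_cost = 10_000_000_000  # assumed $10B server fleet (not a reported figure)

for years in (3, 4, 6):
    expense = annual_depreciation(fleet_cost, years)
    print(f"{years}-year useful life: ${expense / 1e9:.2f}B annual depreciation expense")
```

Under these assumed numbers, stretching the schedule from three to six years roughly halves the annual depreciation expense recognized against the same asset base, which is why useful-life assumptions draw so much scrutiny as GPU fleets scale.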
