Media and entertainment is undergoing a structural shift driven by fragmented audiences, new content formats and the growing role of AI in production workflows. The traditional broadcast model — built around scheduled programming and horizontal viewing — no longer reflects how audiences consume content.
Platforms such as TikTok, Instagram Reels and YouTube Shorts have normalized vertical video as a primary format, particularly among younger viewers. For organizations such as Fox Corp., this creates a dual challenge: delivering content across multiple platforms while managing cost and operational complexity.
At the same time, AI has moved beyond experimentation. The real value now lies at inference — where AI operates in real time within production environments, shaping outputs as content is created.
Live production, especially in sports, is one of the most complex areas in media. Content must be captured, formatted and distributed instantly across multiple channels.
Historically, adapting 16:9 broadcast feeds into vertical formats required dedicated teams, manual framing and post-event editing. This approach struggles to keep up with audience expectations for real-time highlights and engagement during live events.
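The core geometric step in that adaptation — selecting a 9:16 window from a 16:9 frame — can be sketched as follows. This is an illustrative calculation, not Fox's or AWS's implementation; the `subject_x` input stands in for the subject position that a real system would obtain from ML-based detection and tracking.

```python
def vertical_crop(frame_w: int, frame_h: int, subject_x: int) -> tuple[int, int, int, int]:
    """Compute a 9:16 crop window inside a wider frame, centered on a
    subject's x-coordinate. Returns (left, top, width, height)."""
    crop_h = frame_h                  # keep the full vertical resolution
    crop_w = round(crop_h * 9 / 16)   # 9:16 aspect ratio
    # Center on the subject, clamped so the window stays inside the frame.
    left = min(max(subject_x - crop_w // 2, 0), frame_w - crop_w)
    return left, 0, crop_w, crop_h

# A 1080p broadcast frame with the subject near the left edge:
print(vertical_crop(1920, 1080, 200))  # → (0, 0, 608, 1080)
```

Manual reframing amounts to a human choosing `subject_x` shot by shot; real-time inference automates exactly that choice.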
Fox addressed this by building dedicated “vertical control rooms” to support social content. While effective, the model was labor-intensive and limited how much focus teams could place on storytelling.
Across the industry, the pressure points are consistent: Meeting audience demand now requires real-time format transformation built directly into the production pipeline.
The collaboration between Amazon Web Services Inc. and Fox focused on integrating AI inference into live production rather than adding it as a separate step.
The goal was to position AI as a set of supporting agents that handle repetitive tasks, allowing production teams to focus on creative decisions.
AWS introduced Elemental Inference as an extension of its Elemental media services, integrating directly into tools such as MediaLive. Instead of a standalone system, inference becomes part of the production stack.
While sports is the primary use case, capabilities such as speaker and scene detection extend the same model to news and talk programming.
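To make "scene detection" concrete, here is a toy hard-cut detector based on mean absolute pixel difference between consecutive frames. It is a deliberately simplified stand-in, assuming grayscale frames flattened to lists of pixel values; production systems use trained models rather than a fixed threshold.

```python
def scene_cuts(frames: list[list[int]], threshold: float = 40.0) -> list[int]:
    """Return indices where a hard cut likely occurs, using the mean
    absolute pixel difference between consecutive frames."""
    cuts = []
    for i in range(1, len(frames)):
        prev, cur = frames[i - 1], frames[i]
        diff = sum(abs(a - b) for a, b in zip(prev, cur)) / len(cur)
        if diff > threshold:
            cuts.append(i)
    return cuts

# Three near-identical "frames", then a jump to a much brighter scene:
frames = [[10] * 8, [12] * 8, [11] * 8, [200] * 8]
print(scene_cuts(frames))  # → [3]
```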
Elemental Inference improves operational efficiency by reducing manual workflows and accelerating time-to-publish. More importantly, it increases engagement by enabling real-time content distribution across platforms.
It also shifts how teams work. By offloading repetitive tasks, production staff can focus on storytelling and audience engagement. At the same time, tighter integration between content and audience data creates a feedback loop: create → distribute → analyze → optimize, in real time.
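One way to picture that loop is as a simple control cycle in which each round of audience analytics adjusts the next round of content. The heuristic below (clip length tuned by completion rate) is hypothetical, chosen only to show the shape of the analyze → optimize step.

```python
def optimize_length(current_s: int, watch_rate: float) -> int:
    """One analyze→optimize step: shorten clips that viewers abandon,
    lengthen clips they finish (hypothetical heuristic, bounded 15–90s)."""
    if watch_rate < 0.5:
        return max(15, int(current_s * 0.8))
    if watch_rate > 0.9:
        return min(90, int(current_s * 1.2))
    return current_s

# Simulated loop: publish a 60s clip, measure, adjust, repeat.
length = 60
for measured in (0.4, 0.4, 0.95):  # analytics from each distribution cycle
    length = optimize_length(length, measured)
print(length)  # → 45
```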
At NAB 2026, inference emerged as a foundational layer in media production. AI is no longer an add-on; it is becoming embedded infrastructure.
For media organizations, the implication is clear, and the pattern extends beyond media: across industries, the value of AI is realized when it is integrated into real-world systems and workflows.