RED Cinema Cameras for Live Event Production

Red Pixel runs three completely separate camera systems, each built for a different job. They share a switcher and a tech booth, but the equipment, the control workflow, and the production role are distinct. Understanding what each system does — and why you'd deploy one over another — is what separates a thoughtful camera package from a pile of expensive gear.

System 1: RED Cinema Cameras — Premium Acquisition

Our RED camera package consists of 2x RED V-Raptor X and 2x RED Komodo X. These are cinema cameras, not broadcast cameras. The distinction is fundamental: they're designed to capture the highest possible image quality for post-production, and they also happen to feed a live switcher via RED's CINE BROADCAST protocol.

RED CINE BROADCAST Workflow

Each RED camera connects to a rack-mountable RED CINE BROADCAST base station via an SMPTE 311M hybrid fiber-optic camera cable with LEMO SMPTE 304 connectors. The base station is the bridge between cinema and broadcast — it receives high-frame-rate R3D data from the camera and outputs synchronized 12G-SDI and SMPTE ST 2110 IP feeds for the live switcher, while simultaneously recording full-resolution RAW data.

The CINE BROADCAST CCU uses NVIDIA GPUs (RTX 6000 Ada or L40S class) for real-time decompression and debayering of R3D data with minimal latency. It supports live broadcasting of up to two channels of 4K 60p (HDR/SDR) via 12G-SDI, plus SMPTE ST 2110 IP output, including 4K 60p JPEG XS compressed streams per VSF TR-08. Advanced workflows include 3x and 4x super slow-motion from 8K 120fps R3D data, and AI/ML augmentation via deep integration with NVIDIA's processing pipeline.

Control is via Cyanview RCP panels at the tech booth. Each RCP gives the operator remote iris, gain, white balance, and lift/gamma/gain color controls — broadcast-style operation on a cinema camera. The CCU applies color adjustments to the live broadcast output while the RAW data passes through untouched to the recorder. One camera, two completely independent output paths.
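That split — a graded live path alongside an untouched RAW path — is easy to model in code. The sketch below is a simplified stand-in, not RED's actual pipeline: it applies one common lift/gamma/gain formulation to the live samples, while the archive path is a straight pass-through.

```python
# Simplified model of the CCU's two output paths: the shader's
# lift/gamma/gain (LGG) grade touches only the live feed, while the
# RAW samples pass through to the recorder untouched. Illustrative
# only -- not RED's actual processing.

def apply_lgg(value, lift=0.0, gamma=1.0, gain=1.0):
    """One common LGG formulation on a normalized 0..1 sample."""
    v = value * gain + lift          # gain scales, lift offsets blacks
    v = min(max(v, 0.0), 1.0)        # clamp to legal range
    return v ** (1.0 / gamma)        # gamma bends the midtones

def ccu_outputs(raw_frame, lift, gamma, gain):
    """Return (live_feed, raw_archive) from one frame of samples."""
    live = [apply_lgg(s, lift, gamma, gain) for s in raw_frame]
    return live, raw_frame           # RAW path is a straight pass-through

frame = [0.0, 0.25, 0.5, 1.0]
live, archive = ccu_outputs(frame, lift=0.05, gamma=1.2, gain=1.1)
```

However the shader paints the live feed, `archive` comes back identical to `frame` — one camera, two independent deliverables.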

Our RED CCU Rack

  • 2x RED CINE BROADCAST CCU (rack-mount base stations)
  • 2x Cyanview RCP (remote control panels)
  • 2x KAIROS multi-axis controller (lens and positioning control)
  • 1x Blackmagic Smartview Duo (confidence monitoring)

The KAIROS multi-axis controllers handle motorized lens control (zoom, focus, iris) for the RED cameras — essential when the camera is on a jib or in a position where an operator can't physically touch the lens.

When to Deploy RED

RED cameras earn their place on shows where the post-event content has real value: keynotes that become on-demand libraries, product launches that feed months of marketing, earnings presentations that get archived for investor relations. The live broadcast feed is already professional-grade via the CCU. The RAW cinema archive is what makes the investment pay off — footage that can be graded in DaVinci Resolve with full creative latitude, no clipping, no compression artifacts.

System 2: Panasonic AK-UC3000 — Broadcast Studio Cameras

Our broadcast camera package is 3x Panasonic AK-UC3000 studio cameras. These are a completely different animal from the RED system — purpose-built for live production with traditional broadcast CCU control.

AK-UC3000 Capabilities

The AK-UC3000 uses a large-format 4K MOS sensor with exceptional sensitivity — F10 in High Sense Mode with an S/N ratio exceeding 60 dB. It shoots simultaneously in 4K and HD, accepts standard 2/3" broadcast lenses (so your existing glass works), and outputs 2160/59.94p, 1080/59.94p, and 720/59.94p over fiber. Broadcast-specific features include flash band compensation, high-speed scan for skew reduction, shock-less gain (-6 dB to 36 dB), user gamma curves, black gamma correction, multi-step DNR, and easy matrix adjustment.

These are the features that matter during a live show: predictable exposure behavior, stable color science under mixed lighting, and zero surprises when you cut between cameras.

Panasonic Broadcast CCU Rack

  • 3x Panasonic AK-UCU500 Camera Control Units
  • 3x Panasonic AK-HRP1000 Remote Operation Panels
  • 1x Barco 8x8 Matrix (signal routing)
  • 1x Blackmagic Smartview Duo (confidence monitoring)

Each AK-UC3000 connects to its AK-UCU500 CCU via optical fiber — supporting uncompressed 4K transmission over distances up to 2,000 meters between camera and control unit. The AK-HRP1000 operation panels give the shader (camera control operator) full paint control: iris, gain, white balance, detail, gamma, matrix — the traditional broadcast paint workflow that's been refined over decades.
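Some quick arithmetic shows why uncompressed 4K demands a fiber trunk (or a 12G-SDI link, at roughly 11.88 Gbps) in the first place. The figures below assume 10-bit 4:2:2 sampling — 20 bits per pixel on average, typical for broadcast — and ignore blanking and ancillary data, so real link rates run somewhat higher.

```python
# Rough payload bandwidth for uncompressed video. Assumes 10-bit 4:2:2
# sampling (20 bits per pixel average) and ignores blanking/ancillary
# data, so actual link rates are somewhat higher than these figures.
def video_bandwidth_gbps(width, height, fps, bits_per_pixel=20):
    return width * height * fps * bits_per_pixel / 1e9

uhd60 = video_bandwidth_gbps(3840, 2160, 60)   # ~9.95 Gbps
hd60  = video_bandwidth_gbps(1920, 1080, 60)   # ~2.49 Gbps
print(f"UHD 60p: {uhd60:.2f} Gbps, HD 60p: {hd60:.2f} Gbps")
```

Roughly 10 Gbps of picture data per camera, per direction — which is why the camera-to-CCU link is fiber rather than copper.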

When to Deploy Broadcast Cameras

The AK-UC3000 is the backbone of any multi-camera live switch. It's predictable, it's stable under challenging lighting, it plays nicely with broadcast switchers, and the operators who run them have muscle memory built over thousands of shows. For events where the live program output is the primary deliverable — corporate town halls, awards ceremonies, multi-camera IMAG — these are the cameras that carry the show.

System 3: Panasonic PTZ Cameras — Robotic Coverage

Our PTZ package includes 4x Panasonic AW-UE160 and 4x AW-UE150 robotic cameras. These serve a fundamentally different role from both the RED and broadcast systems: they provide automated, remotely operated coverage without a dedicated camera operator per position.

AW-UE160 Capabilities

The AW-UE160 is Panasonic's flagship PTZ — the first remote camera to support SMPTE ST 2110 for broadcast-grade IP transmission. It features a 1-inch MOS sensor (the largest in any Panasonic PTZ), a 20x optical zoom with 75.1-degree wide angle of view, and outputs 4K 60p over 12G-SDI, HDMI, optical fiber, and NDI/NDI|HX simultaneously. The autofocus system combines phase detection AF with contrast AF for continuous subject tracking. Additional features include FHD 120fps high-speed capture for slow-motion, HDR/HLG compatibility, built-in ND filters (1/4, 1/16, 1/64), and PoE++ power delivery.

The AW-UE150 is the previous-generation flagship with similar capabilities — 1-inch sensor, 20x zoom, 4K 60p output — and remains a rock-solid workhorse for positions where the UE160's ST 2110 support isn't needed.

When to Deploy PTZ

PTZ cameras go where you can't (or don't want to) put a human operator: audience reaction cameras mounted overhead, stage-left/right coverage positions with no room for a tripod, roaming wide shots during breakout sessions, and discreet speaker-tracking positions in intimate settings. They also serve as safety cameras — always-on wide shots that give the TD a fallback if a manned camera goes down.

AI Auto-Tracking: Panasonic AW-SF100

This is where our PTZ system goes from "remotely operated cameras" to "autonomous camera intelligence." The Panasonic AW-SF100 Auto Tracking Software is an AI-powered plug-in that runs on dedicated GPU-equipped tracking servers and controls PTZ cameras autonomously using facial recognition, human body detection, and deep learning-based motion tracking.

How It Works

The AW-SF100 analyzes the IP video stream from each PTZ camera in real time using NVIDIA GPU acceleration. It combines multiple tracking methods: facial detection identifies and locks onto a specific person (you can pre-register faces so the software knows who to follow), body template matching keeps tracking even when the subject turns away from the camera, and deep learning-based motion detection maintains the track through occlusion and rapid movement.

The software sends pan/tilt/zoom commands back to the camera over IP — no external tracking hardware, no beacons, no wearable transmitters on the talent. The subject just walks, talks, and moves naturally. The camera follows.
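The control loop behind this is conceptually simple, even if the detection models are not. The toy sketch below works in hypothetical pixel-unit pan/tilt values with a hard-coded subject position standing in for a real detector — it is not Panasonic's protocol or the SF100's logic, just the closed-loop idea: measure the subject's offset from frame center, send a proportional correction, repeat.

```python
# Toy sketch of IP-based auto tracking as a proportional control loop:
# find the subject's offset from frame center each tick and nudge
# pan/tilt by a fraction of the error. The static `subject` stands in
# for a real detector; units and gains are illustrative, not the
# AW-SF100's actual (proprietary) implementation.

FRAME_W, FRAME_H = 1920, 1080
GAIN = 0.3   # fraction of the centering error corrected per tick

def tracking_step(apparent_x, apparent_y, pan, tilt):
    """One control tick: return updated (pan, tilt) in pixel units."""
    err_x = apparent_x - FRAME_W / 2   # +ve: subject right of center
    err_y = apparent_y - FRAME_H / 2   # +ve: subject below center
    return pan + GAIN * err_x, tilt + GAIN * err_y

# Presenter standing right of frame; the camera converges smoothly.
subject = (1400, 540)                  # true position, centered vertically
pan, tilt = 0.0, 0.0
for _ in range(20):
    # panning shifts the subject's apparent position in frame
    pan, tilt = tracking_step(subject[0] - pan, subject[1] - tilt, pan, tilt)
```

A gain below 1.0 is what produces the "no overshoot, no hunting" behavior: each tick closes only part of the gap, so the framing eases onto the subject instead of snapping past it.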

Our Tracking Infrastructure

  • 4x Asus GPU Tracking Servers running Panasonic AW-SF100
  • Each server handles up to 4 cameras simultaneously (with the right GPU — RTX 3060 or higher for 4-camera tracking)
  • Supports all 8 of our PTZ cameras (4x AW-UE160 + 4x AW-UE150) simultaneously
  • Web-based control interface accessible from any device on the production network

Real-World Auto-Tracking Use Cases

Consider a pharmaceutical product launch with 6 presenters rotating through a 3-hour program. Without auto-tracking, you need a PTZ operator glued to a joystick for every camera, manually following each speaker. With the AW-SF100, you register each presenter's face during rehearsal, assign tracking zones, and let the AI handle the follow. The cameras track smoothly and predictably — no overshoot, no hunting, no losing the subject when they turn to gesture at a slide.

For corporate town halls where executives walk the stage with a handheld mic, auto-tracking eliminates the most common PTZ failure mode: a human operator who gets distracted or reacts too slowly to a direction change. The AI doesn't get distracted. It doesn't need a bathroom break during a 4-hour general session.

We still use manual PTZ control for creative shots — audience reactions, dramatic reveals, specific framings that require artistic judgment. But for speaker tracking, the SF100 is more consistent than a human operator over the course of a long show day.

How the Three Systems Work Together

On a full-scale production, all three camera systems feed into the same switching infrastructure — typically a Ross Carbonite Ultra for program switching. But each system has its own CCU rack, its own control workflow, and its own operator chain:

  • RED cameras are controlled by Cyanview RCPs via RED CINE BROADCAST CCUs — cinema workflow
  • AK-UC3000s are controlled by AK-HRP1000 panels via AK-UCU500 CCUs — broadcast workflow
  • PTZ cameras are controlled by AW-SF100 auto-tracking or manual joystick — robotic workflow

The TD sees all feeds on the same multiviewer and cuts between them identically on the switcher. But behind the scenes, each system is optimized for its specific role: cinema acquisition, broadcast reliability, or autonomous coverage. Mixing them in the same show isn't a compromise — it's how you get the best of all three worlds without any single system carrying a burden it wasn't designed for.
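That three-chain split can be summarized as documentation-as-code. The snippet below is a small illustrative inventory of the package described above — not an actual control-system configuration file.

```python
# Illustrative inventory of the three control chains feeding one
# switcher. Mirrors the gear lists above; documentation-as-code only,
# not a real control-system config.
CONTROL_CHAINS = {
    "RED cinema":       {"ccu": "RED CINE BROADCAST CCU",
                         "panel": "Cyanview RCP",           "count": 4},
    "Panasonic studio": {"ccu": "AK-UCU500",
                         "panel": "AK-HRP1000",             "count": 3},
    "Panasonic PTZ":    {"ccu": None,                       # IP-controlled
                         "panel": "AW-SF100 / joystick",    "count": 8},
}

def switcher_inputs():
    """Total camera feeds landing on the program switcher."""
    return sum(chain["count"] for chain in CONTROL_CHAINS.values())

print(switcher_inputs())   # 4 cinema + 3 studio + 8 PTZ feeds
```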

The Bottom Line

Choosing between RED, broadcast, and PTZ isn't an either/or decision. Each system exists to solve a different problem, and the most capable productions deploy all three. RED captures premium archive footage while feeding a live broadcast. Panasonic studio cameras deliver rock-solid multi-camera switching with decades of proven broadcast workflow. PTZ cameras with AI auto-tracking provide autonomous coverage that scales without adding operators. At Red Pixel, we bring all three systems to every major show — and the right combination to every smaller one.

Red Pixel Consulting deploys RED cinema, Panasonic broadcast, and AI-tracked PTZ camera systems for live events nationwide. Three distinct camera workflows, one integrated production. Your live show is broadcast-ready in real time. Your archive is cinema-grade. Your coverage is autonomous where it counts.

Ready for Cinema-Grade Live Event Production?

Red Pixel Consulting designs and deploys RED camera systems for corporate events, conferences, and broadcast productions nationwide.

Work With Us