Pipelines¶
Pipelines are the core of Flex Video, defining how video is ingested, processed, and delivered.
Pipeline Types¶
SimplePipeline¶
SimplePipeline provides a high-level configuration interface:
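A minimal configuration might look like the following (a sketch: the `"mode": "simple"` value is an assumption by analogy with the multiview and advanced examples on this page, and the source URI is illustrative):

```json
{
  "id": "entrance-cam",
  "mode": "simple",
  "source": { "uri": "rtsp://192.168.1.100:8554/live" },
  "encoding": {
    "codec": "h264",
    "bitrate": 750,
    "width": 1920,
    "height": 1080,
    "fps": 30,
    "quality": 5
  },
  "output": { "uri": "rtsp://0.0.0.0:8731/entrance-cam" }
}
```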
Best for:
- Standard transcoding workflows
- Users without GStreamer expertise
- Consistent, maintainable configurations
MultiViewPipeline¶
MultiViewPipeline composes 1-4 video inputs into a single output stream:
```json
{
  "id": "quad-view",
  "mode": "multiview",
  "inputs": [
    {
      "label": "North Gate",
      "source": { "uri": "rtsp://192.168.1.100:8554/live" }
    },
    {
      "label": "South Gate",
      "source": { "uri": "rtsp://192.168.1.101:8554/live" }
    }
  ],
  "encoding": {
    "codec": "av1",
    "bitrate": 750,
    "width": 1920,
    "height": 1080,
    "fps": 30,
    "quality": 5
  },
  "output": {
    "uri": "rtsp://0.0.0.0:8731/quad-view"
  }
}
```
Best for:
- Multi-camera monitoring (compose up to 4 feeds into one stream)
- Side-by-side or quad-view layouts
- Reducing the number of output streams while monitoring multiple sources
Input Sources¶
Each input accepts the same source types as SimplePipeline (RTSP, UDP, file, test pattern), plus direct V4L2 camera paths. The web interface provides a camera picker for selecting connected cameras.
Camera Exclusivity
Each V4L2 camera can be used by only one input at a time, both within a single multiview pipeline and across all running pipelines. Attempting to use the same camera in more than one input returns a 409 PIPELINE_CONFLICT error.
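A conflicting request might return a response along these lines (a sketch: the response body shape, message text, and device path are assumptions; only the 409 status and PIPELINE_CONFLICT code are documented):

```json
{
  "error": "PIPELINE_CONFLICT",
  "message": "Camera /dev/video0 is already in use by pipeline 'quad-view'"
}
```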
Layout¶
Layout is computed automatically based on the number of inputs:
| Inputs | Layout |
|---|---|
| 1 | Full resolution |
| 2 | Side-by-side (2x1) |
| 3 | 2x2 grid (one slot empty) |
| 4 | 2x2 grid |
Per-Input Controls¶
Each input supports independent configuration:
| Option | Description |
|---|---|
| label | Display name (e.g., "North Gate") |
| source | Video source URI and latency settings |
| transform | Mirror, rotate (0/90/180/270), grayscale |
| crop | Crop edges (top, right, bottom, left pixels) |
| overlay | Per-input text or timestamp overlay |
Global Controls¶
These apply to the composed output:
| Option | Description |
|---|---|
| encoding | AV1 codec (fixed), bitrate, resolution, FPS |
| output | Output URI (RTSP, UDP, TCP) and MTU |
| overlay | Text/timestamp overlay on the composed output |
| framegrab | Periodic AVIF capture from the composed output |
| geolocation | GPS coordinates for announcements |
| metadata | CoT, TAK Server, Lattice, mDNS tags |
AdvancedPipeline¶
AdvancedPipeline accepts raw GStreamer pipeline strings:
```json
{
  "id": "custom-pipeline",
  "mode": "advanced",
  "raw_pipeline": "videotestsrc ! videoconvert ! <flex:h264 /> ! rtph264pay ! udpsink host=239.1.1.1 port=5004"
}
```
Best for:
- Custom processing requirements
- Unusual source/sink combinations
- GStreamer experts
Encoder Macros¶
Advanced pipelines support encoder macro tags that expand to optimized encoder configurations. Use these instead of typing full encoder element strings:
| Macro | Expands To |
|---|---|
| `<flex:av1 />` | AV1 encoder + parser |
| `<flex:h264 />` | H.264 encoder + parser |
| `<flex:h265 />` | H.265 encoder + parser |
Set bitrate in kbps (default: 500, range: 5–5,000):
```json
{
  "id": "macro-pipeline",
  "mode": "advanced",
  "raw_pipeline": "videotestsrc ! queue ! videoconvert ! <flex:av1 bitrate=500 /> ! fakesink"
}
```
How macros work
Macros are expanded server-side when the pipeline starts. The unexpanded tags are what get stored and returned in API responses. License checks are enforced per macro codec.
Source Configuration¶
Supported Sources¶
| Scheme | Description | Example |
|---|---|---|
| rtsp:// | RTSP stream | rtsp://camera.local:8554/live |
| udp:// | UDP unicast/multicast | udp://239.1.1.1:5004 |
| file:// | Local file | file:///video/sample.ts |
| test:// | Test pattern | test://smpte |
RTSP Sources¶
Latency tuning:
| Network | Recommended latency_ms |
|---|---|
| Wired LAN | 50-100 |
| WiFi | 200-500 |
| WAN/Internet | 500-2000 |
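For example, a WiFi-connected camera might be configured as follows (a sketch: the latency_ms name follows the table above, but its placement inside the source block is an assumption):

```json
"source": {
  "uri": "rtsp://camera.local:8554/live",
  "latency_ms": 300
}
```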
UDP Multicast Sources¶
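A multicast source can name the receiving interface explicitly with the network_interface field (a sketch: the field's placement inside the source block is an assumption):

```json
"source": {
  "uri": "udp://239.1.1.1:5004",
  "network_interface": "eth0"
}
```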
The network interface can also be embedded directly in the URI:
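For example (the query-parameter syntax shown here is an assumption):

```json
"source": { "uri": "udp://239.1.1.1:5004?interface=eth0" }
```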
Both formats are equivalent. If both are provided, the network_interface field takes priority.
Network Interface
Set the network interface when using multicast to ensure packets arrive on the correct interface.
Test Patterns¶
Available patterns for testing without a real source:
| Pattern | Description |
|---|---|
| smpte | SMPTE color bars |
| ball | Bouncing ball |
| snow | Random noise |
| black | Solid black |
| checkers-1 | 1px checkerboard |
| circular | Circular pattern |
| zone-plate | Zone plate |
Video Decoding¶
Flex Video automatically selects the best available decoder for incoming video streams. On supported hardware, hardware-accelerated decoders are used when available, reducing CPU load and improving performance. If no hardware decoder is available for a given codec, Flex Video falls back to software decoding.
| Platform | Hardware Decode Support | Status |
|---|---|---|
| Raspberry Pi 5 | H.265/HEVC | Supported |
| NXP i.MX8M Plus | H.264, H.265, VP8, VP9 | Supported |
| x86_64 Intel 10th gen+ / AMD (VAAPI) | Varies by GPU | Experimental |
The installer automatically enables hardware decoding on supported platforms. On x86_64 systems with VAAPI, hardware-accelerated encoding is also available. To force software-only decoding (for example, to troubleshoot playback issues), set the following in /opt/flex/.env:
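For example (the variable name below is hypothetical; consult the .env template shipped with your installation for the actual key):

```
# Hypothetical variable name -- check your installation's .env template
FLEX_FORCE_SOFTWARE_DECODE=true
```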
Restart services after changing this value. See Hardware Acceleration for platform-specific setup details.
Encoding Configuration¶
```json
"encoding": {
  "codec": "h264",
  "bitrate": 750,
  "width": 1920,
  "height": 1080,
  "fps": 30,
  "quality": 5
}
```
Codecs¶
| Codec | License | Efficiency | Compatibility |
|---|---|---|---|
| h264 | Free | Good | Excellent |
| h265 | Licensed | Better | Good |
| av1 | Free | Best | Limited |
Bitrate Guidelines¶
AV1 is significantly more efficient than H.264/H.265. The following guidelines are for AV1 encoding:
| Resolution | Low | Recommended | High |
|---|---|---|---|
| 320x240 | 5 kbps | 25 kbps | 50 kbps |
| 480p | 50 kbps | 100 kbps | 150 kbps |
| 720p | 100 kbps | 175 kbps | 250 kbps |
| 1080p | 250 kbps | 500 kbps | 750 kbps |
| 4K | 750 kbps | 900 kbps | 1024 kbps |
For H.264 and H.265, use approximately 2-3x the AV1 values above.
Bitrate is the video target
The bitrate value is the video target bitrate at CBR (constant bitrate). The actual stream will fluctuate around this value and average to the target over time. KLV metadata and Codec2 audio (if enabled) add additional overhead on top of the video bitrate and are not included in this calculation.
Quality Preset¶
The quality setting (1-10) controls the encoder speed/quality tradeoff. Default is 5.
| Value | Latency | Use Case |
|---|---|---|
| 1 | Lowest | Absolute minimum latency. Required for FPV drones and other latency-critical applications. |
| 2-3 | Low | Good for live streaming. Recommended for 4K on low-memory hardware. |
| 5 | Medium | Default. Recommended for most real-time use cases up to 1080p. |
| 6-10 | Higher | Uses more CPU and adds encoding latency. Only recommended for low-resolution streams (480p or smaller) on constrained hardware. |
Tip
For FPV drone feeds or any application where latency is critical, always use quality 1. For 4K streams on devices with limited swap/memory, use quality 2-3 to avoid resource exhaustion. Values above 5 increase latency and resource usage significantly — only use them on low-resolution streams (480p and below).
Output Configuration¶
RTSP Output (Recommended)¶
Outputs to the built-in RTSP server. Clients connect to rtsp://server-ip:8731/stream-name. See Receiving & Playback for compatible players and plugin requirements.
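For example (a sketch: the stream name is illustrative, and the port matches the examples above):

```json
"output": { "uri": "rtsp://0.0.0.0:8731/entrance-cam" }
```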
UDP Multicast Output¶
The network interface can also be embedded in the URI:
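For example (a sketch: the mtu field name is an assumption based on the MTU option listed earlier, and the interface query-parameter syntax is also an assumption):

```json
"output": {
  "uri": "udp://239.1.1.1:5004?interface=eth0",
  "mtu": 1400
}
```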
MTU Settings¶
| Value | Use Case |
|---|---|
| 1200 | VPN, tunnels |
| 1400 | Default, safe |
| 1500 | Standard Ethernet |
| 9000 | Jumbo frames |
Optional Features¶
Overlays¶
Add text or timestamp overlays:
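For example (a sketch: the text and timestamp field names are assumptions based on the overlay descriptions above):

```json
"overlay": {
  "text": "North Gate",
  "timestamp": true
}
```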
Transform¶
Apply video transformations:
Rotation values: 0, 90, 180, 270
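For example (a sketch: field names follow the mirror/rotate/grayscale options listed in the per-input controls table):

```json
"transform": {
  "mirror": true,
  "rotate": 180,
  "grayscale": false
}
```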
Crop¶
Crop the video frame:
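For example, trimming 120 pixels from the bottom edge (a sketch: field names follow the top/right/bottom/left options in the per-input controls table):

```json
"crop": {
  "top": 0,
  "right": 0,
  "bottom": 120,
  "left": 0
}
```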
Geolocation¶
Set static or dynamic position using latitude/longitude or MGRS coordinates:
```json
"geolocation": {
  "latitude": 38.8977,
  "longitude": -77.0365,
  "altitude": 100.0,
  "heading": 45.0,
  "is_dynamic": false
}
```
Alternatively, provide an MGRS coordinate instead of latitude/longitude:
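For example (a sketch: the MGRS string is illustrative and is not a conversion of the coordinates above):

```json
"geolocation": {
  "mgrs": "18SUJ2337907393",
  "is_dynamic": false
}
```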
When both are provided, latitude/longitude take precedence. Responses include latitude, longitude, and computed mgrs.
Set is_dynamic: true when position will be updated from KLV metadata.
Pipeline Lifecycle¶
```mermaid
stateDiagram-v2
    ready --> playing : play
    playing --> ready : stop
```
Runtime Editing¶
You can adjust encoding settings on a running pipeline without deleting and recreating it. This is supported for SimplePipeline and MultiViewPipeline.
Editable while playing:
| Setting | Effect |
|---|---|
| Bitrate | Stream restarts |
| Width / Height | Stream restarts |
| FPS | Stream restarts |
| Geolocation | Instant (no restart) |
Stream Restart
Changing bitrate, resolution, or FPS causes a brief stream interruption while the pipeline is stopped, updated, and restarted. Geolocation changes are applied instantly.
In the web interface, click Edit on a running Simple or Multiview pipeline to enter live edit mode. Encoding fields are editable while the stream plays. Click Apply Encoding Update to send the changes. If the update fails, the pipeline automatically rolls back to its previous settings.
AdvancedPipeline must be stopped before editing.
Automatic Recovery¶
Pipelines are automatically restored if the video processing backend restarts unexpectedly. Running streams will experience a brief interruption but resume without user intervention.
On-Demand vs Auto-Start¶
| Setting | Behavior |
|---|---|
| on_demand: true | Pipeline created in ready state, must call /play |
| on_demand: false | Pipeline starts immediately on creation |
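For example, a standby pipeline that waits for an explicit play call (a sketch: the id is illustrative, and the encoding block is omitted here on the assumption that defaults apply):

```json
{
  "id": "standby-cam",
  "mode": "simple",
  "on_demand": true,
  "source": { "uri": "test://smpte" },
  "output": { "uri": "rtsp://0.0.0.0:8731/standby-cam" }
}
```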
Pipeline Limits¶
Pipeline limits are controlled by your license:
- Check limits: GET /flex/licensed
- Exceeding limits returns HTTP 402
Best Practices¶
- Use descriptive IDs: entrance-cam-h264, not pipe1
- Match resolution to source: Don't upscale unnecessarily
- Set appropriate latency: Higher for unreliable networks
- Use RTSP output: Better client compatibility than UDP
- Monitor pipeline state: Use SSE for real-time updates