Gatekeeper Control Center
Module 2.5 • Hybrid AI Pipeline
Inference Model
OFFLINE
Gatekeeper Engine
0 Qualified
0 Filtered
Live Ingestion Stream
LATENCY: <50ms
Stream Idle / No Packets
Inference Engine: Llama-3-8B-Instruct
SYSTEM_PROMPT.MD
DB_WRITE_ACTIVE
Policy Configuration
VAR_A: Search-Based Format
SRC: search_terms.raw_title
PASS: HOW_TO
DROP: REVIEW, COMPARISON, TIERLIST, NEWS, VLOG
VAR_B: Temporal Segment
SRC: search_terms.length_class
ULTRASHORT <20s
SHORT 20s-30s
NORMAL 30s-4m
LONG 4m-15m
ULTRALONG 15m-60m
EXTREME >60m
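The VAR_B tiers form a ladder of thresholds, which can be sketched as a single classifier. The thresholds (20s, 30s, 4m, 15m, 60m) come from the class table above; the function name and the seconds-based input are assumptions.

```python
# VAR_B sketch: map a duration in seconds to its temporal segment.
def length_class(duration_s: float) -> str:
    """Classify a duration (seconds) into a VAR_B length class."""
    if duration_s < 20:
        return "ULTRASHORT"
    if duration_s < 30:
        return "SHORT"
    if duration_s <= 4 * 60:
        return "NORMAL"
    if duration_s <= 15 * 60:
        return "LONG"
    if duration_s <= 60 * 60:
        return "ULTRALONG"
    return "EXTREME"
```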
Heuristic Decision Boundaries (ms/s)
ULTRASHORT
SHORT
NORMAL_MIN
NORMAL_MAX
LONG_MAX
ULTRALONG_MAX
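The boundary names above line up with the VAR_B thresholds. A sketch of those boundaries as constants, in milliseconds per the panel's (ms/s) header; the numeric values are converted from the VAR_B class table (20s, 30s, 4m, 15m, 60m), not read from the UI itself.

```python
# Heuristic decision boundaries, expressed in milliseconds.
SECOND_MS = 1_000
MINUTE_MS = 60 * SECOND_MS

BOUNDARIES_MS = {
    "ULTRASHORT": 20 * SECOND_MS,    # upper edge of ULTRASHORT (<20s)
    "SHORT": 30 * SECOND_MS,         # upper edge of SHORT (<30s)
    "NORMAL_MIN": 30 * SECOND_MS,    # lower edge of NORMAL (30s)
    "NORMAL_MAX": 4 * MINUTE_MS,     # upper edge of NORMAL (4m)
    "LONG_MAX": 15 * MINUTE_MS,      # upper edge of LONG (15m)
    "ULTRALONG_MAX": 60 * MINUTE_MS, # upper edge of ULTRALONG (60m)
}
```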