Real-time AI search and rescue for the drones you already own.
It’s 2:47 AM. Someone’s daughter hasn’t come home. You have a $1,400 drone, four hours of darkness, and 12,000 acres of forest to search.
Volunteer search and rescue teams already own the hardware. A Mavic, a Mini 4 Pro, sometimes an Air 3S that the chief bought himself. What they don’t have is the software.
DJI FlightHub is thousands of dollars a month. Skydio Enterprise is a procurement cycle they’ll never win. So the volunteer crew watches a thermal feed on a tablet by truck-light, scrubbing the same hillside for the fourth time, eyes too tired to be sure of anything.
The drone sees more than the operator can. Huginn closes that gap.
YOLO catches the obvious.
A person in a clearing. A vehicle on a forest road.
Claude Vision catches what YOLO misses.
A partial silhouette through foliage. Smoke at a tree line. An orange jacket half-buried in leaves.
OCR scrapes the telemetry off the video itself.
Every detection lands on the map with coordinates — even when your drone won’t expose its GPS feed.
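A minimal sketch of that trick, in Python with only the standard library. Many drones burn telemetry into the video as on-screen text; after OCR pulls that text off a frame, a pattern match can recover coordinates. The `GPS(...)` overlay format and the function name here are illustrative assumptions, not Huginn's actual parser; real overlays vary by drone model and firmware.

```python
import re

# Assumed OSD format: "GPS(47.6097,-122.3331,14S)" burned into the frame.
# Real overlays differ per model/firmware; this pattern is illustrative.
OSD_PATTERN = re.compile(r"GPS\((?P<lat>-?\d+\.\d+),\s*(?P<lon>-?\d+\.\d+)")

def parse_osd_coords(ocr_text: str):
    """Return (lat, lon) parsed from OCR'd overlay text, or None."""
    m = OSD_PATTERN.search(ocr_text)
    if not m:
        return None
    return float(m.group("lat")), float(m.group("lon"))

print(parse_osd_coords("H 120m D 340m GPS(47.6097,-122.3331,14S)"))
# → (47.6097, -122.3331)
```

The point of the approach: it needs nothing from the drone's API, only the pixels it already sends, which is why it works even when the GPS feed is locked down.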
And those are only three of its eyes.
Tiled inspection for the small things. Pose-aware attention for unusual postures. Tracking that holds a half-second glimpse across frames. Search by description, when you can name what you’re looking for.
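Tiled inspection can be sketched in a few lines: split each frame into overlapping tiles so a small object fills more of the detector's input instead of shrinking to a few pixels. The tile size and overlap below are illustrative defaults (it assumes frames at least one tile wide and tall), not Huginn's settings.

```python
def tile_frame(width, height, tile=640, overlap=0.2):
    """Return (x, y, w, h) tiles covering a frame, with overlap.

    Overlap keeps a small object (a jacket, a hand) from being split
    across a tile boundary and missed. Assumes width, height >= tile.
    """
    step = int(tile * (1 - overlap))  # stride between tile origins
    xs = list(range(0, width - tile + 1, step))
    ys = list(range(0, height - tile + 1, step))
    # Make sure the right and bottom edges are covered.
    if xs[-1] + tile < width:
        xs.append(width - tile)
    if ys[-1] + tile < height:
        ys.append(height - tile)
    return [(x, y, tile, tile) for y in ys for x in xs]

print(len(tile_frame(1920, 1080)))  # → 8 overlapping 640x640 tiles
```

Running the detector per tile costs a few extra inferences per frame, but it is the difference between spotting an orange jacket at 40 m and flying right over it.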
You’ve got a 5-person crew, a Mavic, and a county the size of Rhode Island.
Wildland surveys, missing-person callouts, post-storm checks. One drone, no software budget.
Search ops without a $40,000-a-year enterprise contract.
Runs on hardware your team already owns: any RTMP-capable drone, any tablet, and a backend you can self-host wherever your other services already run. No proprietary hardware. No four-figure subscriptions.
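For a sense of how little plumbing that implies, here is an illustrative ingest fragment: a stock ffmpeg command that pulls a drone's RTMP feed and samples frames for analysis. The endpoint name and sampling rate are assumptions for the example, not Huginn's configuration.

```shell
# Pull the drone's RTMP stream and write 2 frames/second as JPEGs
# for downstream detection. Endpoint and rates are illustrative.
ffmpeg -i rtmp://ground-station.local/live/drone1 \
       -vf fps=2 -q:v 2 \
       -f image2 frames/frame_%05d.jpg
```

Anything that can publish RTMP, from a DJI app's built-in livestream to a third-party relay, can feed a pipeline like this.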
Huginn is in private beta with select SAR teams. Request access for your unit.
Request access