// case-study · in active build
BUILDING · Late-stage prototype · industrialising for pilot

FishTech Precision Feeding System

Precision feeding for African pond aquaculture. A closed loop that sees fish, decides the dose, and dispenses it.

Role · Founding engineer
Timeline · Active build, pilot 2026
Status · Late-stage prototype
// apparatus · side-view · conceptual rendering
Pond-edge installation · IP66 · ~1.8m mast · 1m scale
ChArUco · ultrasonic
AI camera · IMX500
Pi 5 · Hailo-8L · OLED
IP66 enclosure
Auger feeder
Geared DC · 3D-printed
Camera
Sony IMX500
Compute
Pi 5 · Hailo-8L
Vision
YOLO-Pose
Actuation
Auger feeder
// section-01

Smallholders feed by guess.

Fish farming runs on tight margins: feed is 60 to 70 percent of operating cost. Zimbabwean smallholders, including the thousands of ponds in the Presidential Community Fisheries Scheme, feed their fish by guess. The result is overfeeding, which wastes feed and degrades water quality, and underfeeding, which slows growth. Norwegian salmon operations have precision feeding systems, but they are engineered for industrial cages at prices African smallholders cannot reach. The price tier that fits African pond aquaculture has, until now, had nothing in it.

// section-02

A closed loop, not a monitoring tool.

FishTech Precision Feeding System is a closed-loop AI instrument. An overhead AI Camera, a Sony IMX500 with an on-sensor neural accelerator, streams the pond to a Raspberry Pi 5 with a Hailo NPU for accelerated vision inference. A custom-trained YOLO model, currently moving to a keypoint architecture for true geodesic length measurement, detects each fish. A floating ChArUco fiducial plus an ultrasonic depth sensor give per-frame auto-calibration, so the camera's pixel measurements convert to centimetres correctly at any installed height. A species-specific length-to-weight relationship from FishBase converts size to mass.

The biomass estimator stratifies: the farmer owns the population count, the camera owns the average size, and the system reports the whole-pond biomass with an explicit confidence interval. From biomass, a precision feeding engine grounded in published tilapia, catfish, and carp biology returns a feed dose in grams, with hard guards for water temperature, feeding hours, and minimum inter-feed interval.

When the engine emits a dose, the Pi drives a geared DC motor and motor driver, which rotates an auger to dispense the dose into the pond. The OLED display shows the number. The status LED ring goes green. The feed lands. The next analysis cycle reads the new state, and the loop closes.
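The length-to-weight step is a standard allometric relation. A minimal sketch, with illustrative placeholder parameters; the real system pulls the species-specific values from FishBase:

```python
def length_to_weight_g(length_cm: float, a: float = 0.0126, b: float = 3.06) -> float:
    """FishBase-style allometric relation W = a * L^b.

    The a and b defaults are illustrative tilapia-range placeholders,
    not the system's actual species parameters.
    """
    return a * length_cm ** b

# A 20 cm fish comes out at roughly 120 g under these placeholder values.
weight = length_to_weight_g(20.0)
```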

// section-03

The closed loop, end to end.

Eight stages, one device. Detection runs on the Hailo accelerator. Calibration runs every frame against a floating ChArUco fiducial and an ultrasonic depth reading. Biomass and dose are computed in Python on the Pi. The auger fires. The cycle repeats.
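The eight stages reduce to one cycle function. A hedged sketch of the control flow only; every component interface here (`capture`, `detections`, `dispense`) is a hypothetical stand-in, not the real codebase:

```python
def run_cycle(camera, calibrate, estimate_biomass, feeding_engine, auger):
    """One pass of the closed loop. All interfaces are assumed names."""
    frame = camera.capture()                             # 01-03: capture + on-device inference
    fish = frame.detections                              # 04: keypoints per fish
    px_per_cm = calibrate(frame)                         # 05: ChArUco + ultrasonic scale
    biomass_g, ci_g = estimate_biomass(fish, px_per_cm)  # 06: stratified estimate
    dose_g = feeding_engine(biomass_g)                   # 07: dose in grams (0 if guarded)
    if dose_g > 0:
        auger.dispense(dose_g)                           # 08: mechanical actuation
    return dose_g
```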

// closed-loop · pipeline
camera → dose · runs on-device
01
Overhead camera
Sony IMX500
02
Raspberry Pi 5
8 GB · edge host
03
Hailo-8L NPU
13 TOPS · HEF
04
YOLO-Pose
4 keypoints / fish
05
ChArUco + ultrasonic
px → cm calibration
06
Biomass · CI
stratified · ±σ
07
Feeding engine
7-multiplier dose
08
Auger feeder
geared DC · dispense
loop closes · next cycle
fish · 0.94L ≈ 24cm
fish · 0.88L ≈ 21cm
// what the camera sees

YOLO-Pose returns four keypoints per fish: snout, dorsal origin, peduncle, tail-tip. Length is the sum of geodesic segments, not a bounding-box diagonal, which makes it orientation-invariant and far more accurate for fish at an angle. ChArUco + ultrasonic give the pixel-to-centimetre conversion per frame, so the camera can sit at any installed height.
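The segment sum itself is a few lines. A minimal sketch, assuming the keypoints arrive as ordered pixel coordinates and the per-frame scale is already known:

```python
import math

def geodesic_length_cm(keypoints_px, px_per_cm):
    """Body length as the sum of snout→dorsal→peduncle→tail-tip segments.

    keypoints_px: four (x, y) pixel tuples in body order.
    px_per_cm: per-frame scale from the ChArUco + ultrasonic calibration.
    """
    total_px = sum(math.dist(a, b) for a, b in zip(keypoints_px, keypoints_px[1:]))
    return total_px / px_per_cm

# A fish at an angle: the segment sum follows the body line, where a
# bounding-box diagonal would cut across it.
angled = [(100, 100), (160, 110), (220, 125), (260, 140)]
length = geodesic_length_cm(angled, px_per_cm=8.0)  # ≈ 20.7 cm
```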

// section-04

The innovations

01
Stratified biomass with confidence intervals

The camera sees a sample of the pond, not the whole pond. We estimate average per-fish weight from what we see and multiply by what the farmer records: stocking minus mortality. The result is reported with a 1.96-standard-error (95%) confidence interval, shown explicitly on the dashboard. Honest scientific methodology in a domain where most systems hide their uncertainty.
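The split of responsibilities can be sketched directly; names and numbers below are illustrative, not the production code:

```python
import statistics

def pond_biomass_g(sample_weights_g, population):
    """Stratified estimate: the camera supplies per-fish weights for a
    sample, the farmer supplies the headcount (stocking minus mortality).

    Returns (total_g, ci_half_width_g) for a 95% (1.96 SE) interval.
    """
    mean = statistics.mean(sample_weights_g)
    se = statistics.stdev(sample_weights_g) / len(sample_weights_g) ** 0.5
    return population * mean, population * 1.96 * se

total, half = pond_biomass_g([180.0, 210.0, 195.0, 220.0, 205.0], population=1200)
# total = 242.4 kg, reported as 242.4 kg ± ~16.0 kg
```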

02
Multi-multiplier precision feeding engine

Feed equals biomass times size-tier rate times temperature multiplier times time-of-day multiplier times prior-response multiplier, divided by feeds per day. Every multiplier is grounded in published aquaculture biology, with hard guards for temperature window, feeding hours, and minimum inter-feed interval. Configurable per pond, per species, in YAML.
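The formula with two of its hard guards, sketched with placeholder multiplier values and guard bounds (the real tables are species-specific and YAML-configured; the minimum inter-feed-interval guard is omitted here):

```python
def feed_dose_g(biomass_g, tier_rate, temp_mult, tod_mult, response_mult,
                feeds_per_day, water_temp_c, hour,
                temp_window=(20.0, 32.0), feeding_hours=(6, 18)):
    """One dose in grams. Guard bounds and multipliers are illustrative
    placeholders, not the published species tables."""
    if not temp_window[0] <= water_temp_c <= temp_window[1]:
        return 0.0  # hard guard: water outside the safe temperature window
    if not feeding_hours[0] <= hour < feeding_hours[1]:
        return 0.0  # hard guard: outside feeding hours
    daily_g = biomass_g * tier_rate * temp_mult * tod_mult * response_mult
    return daily_g / feeds_per_day

# 242.4 kg of fish at a 3% tier rate, neutral multipliers, 3 feeds/day:
dose = feed_dose_g(242_400, 0.03, 1.0, 1.0, 1.0, 3, water_temp_c=26.0, hour=9)
# dose = 2424 g per feed; a 15 °C reading would return 0.0 instead
```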

03
Hailo-accelerated keypoint vision

Moving from bounding-box detection to YOLO-Pose keypoint detection compiled to Hailo HEF format. Snout, dorsal origin, peduncle, and tail-tip keypoints give true geodesic length measurement that beats the bounding-box approach on angled fish. This is the architectural upgrade that unlocks serious length and biomass accuracy.

04
Automatic per-frame calibration

A floating ChArUco fiducial plus an ultrasonic depth sensor give self-correcting pixel-to-centimetre calibration on every frame. The system is plug-and-play across ponds of any depth. No manual recalibration on installation.
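The geometry behind the per-frame scale, as a hedged sketch: the floating fiducial gives the pixel size of a known physical length at the surface, and the ultrasonic reading supplies the camera-to-surface distance. The pinhole correction below, like the function names, is an assumption about the approach, not the system's actual model:

```python
def surface_px_per_cm(marker_px, marker_cm):
    """Scale at the water surface, from the floating ChArUco fiducial
    whose physical square size is known."""
    return marker_px / marker_cm

def px_per_cm_at_depth(surface_scale, camera_to_surface_cm, target_depth_cm):
    """Simple pinhole correction: a target below the surface is farther
    from the camera, so the same physical length spans fewer pixels."""
    d = camera_to_surface_cm
    return surface_scale * d / (d + target_depth_cm)

scale = surface_px_per_cm(marker_px=40.0, marker_cm=5.0)  # 8.0 px/cm at the surface
corrected = px_per_cm_at_depth(scale, 180.0, 20.0)        # 7.2 px/cm, 20 cm down
```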

05
Integrated automated auger feeder

The precision engine emits a dose in grams. A geared DC motor and motor driver rotate a 3D-printed auger to dispense it. The decision flows directly into mechanical actuation. Each feed event is a visible, quantified moment on the OLED, the LED ring, and the pond surface.
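Grams-to-rotation is a bench-calibration problem. A minimal sketch; both calibration defaults are assumed values, not measurements from the actual auger:

```python
def auger_run_seconds(dose_g, grams_per_rev=2.5, rpm=60.0):
    """Motor run time for a dose. grams_per_rev comes from bench
    calibration of the 3D-printed auger; both defaults are assumptions."""
    revolutions = dose_g / grams_per_rev
    return revolutions / rpm * 60.0

run_s = auger_run_seconds(50.0)  # 20 revolutions → 20 s at 60 rpm
```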

06
Edge-only, sovereign-by-design

Every farm runs its own Pi at the pond edge. SQLite on the microSD card is the source of truth. The dashboard is served by the Pi's own Wi-Fi hotspot, accessible by QR scan from any phone on the farm. No cloud, no recurring bills, no farmer data leaving Zimbabwe. Aligned with the National AI Strategy's data sovereignty pillar.
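The persistence pattern in miniature; only SQLite and WAL mode come from the text, while the table and column names here are hypothetical:

```python
import sqlite3

con = sqlite3.connect("pond.db")
# WAL lets dashboard reads proceed while the analysis loop writes.
con.execute("PRAGMA journal_mode=WAL")
con.execute("""CREATE TABLE IF NOT EXISTS feed_events (
    id      INTEGER PRIMARY KEY,
    ts      TEXT DEFAULT CURRENT_TIMESTAMP,
    dose_g  REAL NOT NULL
)""")
con.execute("INSERT INTO feed_events (dose_g) VALUES (?)", (2424.0,))
con.commit()
```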

// section-05

The stack

Vision and Edge Compute
  • Python 3.11 · Ultralytics YOLOv8 (custom-trained)
  • OpenCV with ArUco / ChArUco modules
  • Migration in flight to YOLO-Pose on Hailo-8L (HEF compile)
  • Raspberry Pi 5 (8GB) · Raspberry Pi AI Camera (Sony IMX500)
  • Raspberry Pi AI Kit (Hailo-8L, 13 TOPS on M.2 HAT+)
  • JSN-SR04T ultrasonic · MPU6050 IMU · DS18B20 temperature
Backend, Persistence, Actuation
  • Flask + flask-cors (FastAPI migration on the roadmap)
  • SQLite with WAL mode, five-table per-pond schema
  • PyYAML config layer (one YAML per deployment)
  • N20 12V geared DC motor · L298N motor driver
  • 3D-printed auger and hopper
  • Powder-coated welded steel mast, IP66 ABS enclosure
Dashboard
  • Vite · React 19 · TypeScript 5
  • TanStack Router · TanStack Query (offline-first)
  • Tailwind CSS v4 · shadcn/ui · Radix
  • Motion · Recharts · React Hook Form + Zod
  • PWA, served from the Pi's local Wi-Fi hotspot
// section-06

The system sees fish, decides the dose, and is being industrialised.

The end-to-end pipeline runs on a real Raspberry Pi 5 with the AI Camera today. Detection, length extraction, calibrated pixel-to-centimetre conversion, species-specific length-to-weight, stratified biomass with confidence intervals, and the multi-multiplier precision feeding engine are all live. The dashboard is wired to the live backend across Live, Pond, Growth, Mortality, and Settings routes. The persistence layer captures every analysis cycle, every feed event, and every mortality entry. The product engineering is locked: welded steel mast specifications, IP66 enclosure layout, status LED ring and OLED faceplate, branded demo plinth and acrylic tank. The Hailo accelerator, keypoint vision pipeline, automated auger, and field enclosure are in active build for pilot deployment in 2026.

// section-07

What's next

  • YOLO-Pose keypoint detection, compiled to Hailo HEF
  • Hailo-8L AI Kit integration on the Pi 5
  • Automatic per-frame calibration via ChArUco plus ultrasonic
  • Auger feeder mechanical assembly and dispense calibration
  • Productised mast, IP66 enclosure, faceplate, and branded demo unit
  • Pilot pond deployment, target 30-day live farm run
  • FastAPI migration and a WebSocket live layer
  • SMS bridge in Shona for feature-phone farmers
// similar problem? let's talk

Have a CV-on-edge or IoT product you're trying to ship?

FishTech is the proof. The same playbook works for fleet telemetry, precision agriculture, retail inventory vision, and any other on-device AI instrument that has to survive without a cloud.

Start a conversation
Related
Live site
FishTech Consultancy
fishtech.co.zw →
Foundation
FishTech dissertation submission
NUST Zimbabwe · Nov 2025