pipoint

michaelh@juju.net.nz 2017-03-15 https://juju.net.nz/michaelh/project/pipoint/

Overview

An automatic camera pointer that keeps a rover like a model aircraft in the center of the frame.

Background

Some time ago I set up a camera, pointed it at the sky, and recorded as I flew my model plane about. It was quite cool, but the plane ranged over so much area that most of the video was of empty blue sky.

Requirements

The system shall be able to keep the following rovers in frame:

  • A 1 m wingspan model aircraft flying at 100 km/h at 50 m to 200 m range
  • A 1/10th scale model car driving at 40 km/h at 20 m to 100 m range
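
As a rough sizing check derived from these numbers (not from measured servo limits), the worst case is the rover passing at full speed and closest range: 100 km/h ≈ 27.8 m/s at 50 m, or 40 km/h ≈ 11.1 m/s at 20 m, both of which peak at about 0.56 rad/s ≈ 32°/s of pan, so the servos and control loop need to track comfortably more than that.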

The system shall support a standard field configuration, where the rover stays to the left, in front of, or to the right of the operator but never goes behind them.

The pointer shall support pointing a GoPro-class camera, especially a ~60 g Turnigy HD ActionCam with a 170° lens.

To minimise development time, prefer re-using existing hardware and software platforms.

To minimise hardware integration time, prefer a system with fewer parts.

Doing a round trip to test the system takes some time. To minimise the round trips:

  • In-field setup shall be minimised
  • The hardware, software, and parameters shall be readily modifiable
  • The system shall support near real time debugging and tuning

The system shall use my standard tooling, which is Go, Git, Ansible, Prometheus, and Buildbot for CI.

Implementation

The system shall consist of:

  • A PixFalcon Micro and GPS on the rover
  • A 433 MHz telemetry link
  • A Raspberry Pi 3 based base station
  • A Lynx B servo based Pan and Tilt Kit
  • A Linux laptop for display and control

Overview

The components are:

  • A Wifi access point used as the link when in the field
  • A Wifi client used as the link at home
  • mavlink as the protocol
  • mavproxy to bridge between serial and UDP
  • gobot as the framework
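
As a sketch of how the gobot side could be wired together (the pin names, the trivial work loop, and the 50 ms update rate below are illustrative assumptions, not the actual pipoint code):

    package main

    import (
        "time"

        "gobot.io/x/gobot"
        "gobot.io/x/gobot/drivers/gpio"
        "gobot.io/x/gobot/platforms/raspi"
    )

    func main() {
        r := raspi.NewAdaptor()
        pan := gpio.NewServoDriver(r, "12")
        tilt := gpio.NewServoDriver(r, "35")

        work := func() {
            // Placeholder loop: centre both servos until real
            // telemetry-driven angles are wired in.
            gobot.Every(50*time.Millisecond, func() {
                pan.Center()
                tilt.Center()
            })
        }

        robot := gobot.NewRobot("pipoint",
            []gobot.Connection{r},
            []gobot.Device{pan, tilt},
            work,
        )
        robot.Start()
    }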

The modes are:

  • Locate. The base station uses the rover’s GPS and compass to determine the base station’s own location.
  • Run. The base station receives the rover’s location, calculates the camera angle, and sends commands to the servos.
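
A sketch of the Run-mode geometry (the names and the flat-earth approximation are mine, not the actual pipoint code): the pan angle is the bearing from the base to the rover, and the tilt angle follows from the height difference over the ground distance, which is accurate enough at 20 m to 200 m ranges.

    package main

    import (
        "fmt"
        "math"
    )

    // Position is a GPS fix in degrees and metres.
    type Position struct {
        Lat, Lon, Alt float64
    }

    // PanTilt returns the pan (bearing from north) and tilt (elevation)
    // angles in degrees from the base to the rover, using an
    // equirectangular approximation that is adequate at short range.
    func PanTilt(base, rover Position) (pan, tilt float64) {
        const earthRadius = 6371000.0 // metres

        dLat := (rover.Lat - base.Lat) * math.Pi / 180
        dLon := (rover.Lon - base.Lon) * math.Pi / 180

        // Metres north and east of the base.
        north := dLat * earthRadius
        east := dLon * earthRadius * math.Cos(base.Lat*math.Pi/180)

        ground := math.Hypot(north, east)
        up := rover.Alt - base.Alt

        pan = math.Atan2(east, north) * 180 / math.Pi
        tilt = math.Atan2(up, ground) * 180 / math.Pi
        return pan, tilt
    }

    func main() {
        base := Position{Lat: -43.5, Lon: 172.6, Alt: 10}
        rover := Position{Lat: -43.4995, Lon: 172.601, Alt: 60}
        pan, tilt := PanTilt(base, rover)
        fmt.Printf("pan=%.1f° tilt=%.1f°\n", pan, tilt)
    }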

The support systems are:

  • Fast runtime configuration. The settings can be edited without interrupting the app. Idea: use spf13/viper
  • Commands (like to switch modes) are sent over REST.
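
A minimal sketch of those two support pieces, assuming spf13/viper for the live-reloaded settings and plain net/http for the command endpoint (the file name pipoint.yaml, the pan.offset key, the /mode path, and port 8080 are placeholders, not the actual configuration):

    package main

    import (
        "log"
        "net/http"

        "github.com/fsnotify/fsnotify"
        "github.com/spf13/viper"
    )

    func main() {
        // Reload tuning parameters whenever the config file changes,
        // without interrupting the app.
        viper.SetConfigFile("pipoint.yaml")
        if err := viper.ReadInConfig(); err != nil {
            log.Printf("no config yet: %v", err)
        }
        viper.WatchConfig()
        viper.OnConfigChange(func(e fsnotify.Event) {
            log.Printf("config changed: %s, pan.offset=%v",
                e.Name, viper.Get("pan.offset"))
        })

        // Commands such as mode switches arrive over REST.
        http.HandleFunc("/mode", func(w http.ResponseWriter, r *http.Request) {
            mode := r.URL.Query().Get("set")
            log.Printf("switching to mode %q", mode)
        })
        log.Fatal(http.ListenAndServe(":8080", nil))
    }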

The core abstraction is the parameter. Parameters have an age and fire events when updated. A set of parameters may be updated as a group, with the events firing at the end. Parameters have the same basic API as spf13/viper.
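
A minimal sketch of that abstraction, assuming a listener-callback event mechanism (the names are illustrative; group updates would batch the Set calls and fire the listeners once at the end):

    package param

    import (
        "sync"
        "time"
    )

    // Param is a named value that records when it was last set and
    // notifies listeners on update.
    type Param struct {
        mu        sync.Mutex
        name      string
        value     interface{}
        updatedAt time.Time
        listeners []func(name string, value interface{})
    }

    // Set stores a new value, stamps the update time, and fires the
    // listeners.
    func (p *Param) Set(value interface{}) {
        p.mu.Lock()
        p.value = value
        p.updatedAt = time.Now()
        listeners := p.listeners
        p.mu.Unlock()

        for _, l := range listeners {
            l(p.name, value)
        }
    }

    // Age returns how long ago the parameter was last updated.
    func (p *Param) Age() time.Duration {
        p.mu.Lock()
        defer p.mu.Unlock()
        return time.Since(p.updatedAt)
    }

    // Listen registers a callback fired after each update.
    func (p *Param) Listen(l func(name string, value interface{})) {
        p.mu.Lock()
        defer p.mu.Unlock()
        p.listeners = append(p.listeners, l)
    }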

Servo control is via PWM. The PWM pins are:

  • PWM0: GPIO18 / pin 12
  • PWM1: GPIO19 / pin 35

Instructions on enabling these via DTB are here.
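
A minimal sketch of driving one of those channels through the Linux sysfs PWM interface that the overlay exposes (the 1–2 ms pulse range and the 0..180° mapping are assumptions about the servos, not measured values):

    package main

    import (
        "fmt"
        "os"
    )

    // writeSysfs writes a value to a sysfs attribute, e.g.
    // /sys/class/pwm/pwmchip0/pwm0/period.
    func writeSysfs(path, value string) error {
        return os.WriteFile(path, []byte(value), 0644)
    }

    // SetServo drives PWM channel 0 (GPIO18 / pin 12) to the given angle,
    // mapping 0..180° onto a 1..2 ms pulse in a 20 ms frame.
    func SetServo(angleDeg float64) error {
        const base = "/sys/class/pwm/pwmchip0"
        const period = 20000000 // 20 ms in nanoseconds

        // Export the channel; the write fails harmlessly if it is
        // already exported.
        writeSysfs(base+"/export", "0")

        pulse := 1000000 + int(angleDeg/180*1000000) // 1..2 ms in ns

        if err := writeSysfs(base+"/pwm0/period", fmt.Sprint(period)); err != nil {
            return err
        }
        if err := writeSysfs(base+"/pwm0/duty_cycle", fmt.Sprint(pulse)); err != nil {
            return err
        }
        return writeSysfs(base+"/pwm0/enable", "1")
    }

    func main() {
        if err := SetServo(90); err != nil {
            fmt.Fprintln(os.Stderr, err)
        }
    }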

Test plan

The initial at-base checks are:

  • Check that the battery packs can run the electronics and servos
  • Check connectivity via the access point
  • Check the rover manual control
  • Check the remote battery
  • Check that rover telemetry is received by the base
  • Check that the GPS can lock and is received by the base
  • Check the camera battery and storage

The in-field checks are:

  • Repeat at-base
  • With the rover at the base, mark the base location
  • Save this location as default
  • Manually drive the rover ~10 m away at 0° relative to the base
  • Adjust the pan offset until the camera points at the rover

Deliverables

The deliverables are:

  • TODO

Milestones

TODO

  Milestone   Week   Result
  M1          W1     All foos have bars barred
  M2          W3     All bazs are frobbed

Risks and mitigations

TODO

Alternatives considered

Do nothing…

Appendix

Issues

  • GPS seems to be sampled at 2.5 Hz, and sometimes arrives earlier or later.