SAParm, aka Robot Arm for Video Screen Manipulation
The goal of the SAParm Project was to produce a proof-of-concept robotic video conferencing system that conveys the gestures of a remote conference participant. It could mimic human gestures such as nodding and shaking its head; show interest or strong feeling by leaning forward; and indicate disinterest by leaning back. It also turned toward a speaker in a natural way, moving its head first and then its body.
I designed the electronics and software that controlled the motion of the robot arm. Technically speaking, the SAParm electronics were quite advanced for their day: the system featured custom 32-bit motor controller boards with their own RAM and flash buses, and the embedded firmware performed real-time gravity compensation and PWM feathering.