Users of wearable computers often desire subtle, silent control of their devices during meetings, on public transportation, and in other social situations. In addition, environments such as those encountered in aviation, military operations, and emergency response are often too noisy for speech recognition. We address these problems through silent speech recognition and gesture control, capturing movements associated with speech as well as intentional gestures of the tongue and jaw. The system has two components: the Tongue Magnet Interface (TMI), which uses the 3-axis magnetometer aboard Google Glass to measure the movement of a small magnet glued to the user's tongue, and the Outer Ear Interface (OEI), which measures the deformation of the ear canal caused by jaw movements using proximity sensors embedded in a set of earphones. We present encouraging results on these gesture-recognition interfaces as well as potential side benefits, such as detecting heart rate and whether the user is speaking.
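To make the TMI sensing principle concrete, the following is a minimal illustrative sketch (not the authors' implementation) of how tongue-magnet movement might be detected from a 3-axis magnetometer stream: the field magnitude is calibrated while the tongue is at rest, and samples whose magnitude deviates beyond a threshold are flagged as candidate gesture events. All values, the threshold, and the function names are hypothetical.

```python
import math

def magnitude(sample):
    """Euclidean norm of a 3-axis magnetometer sample (x, y, z)."""
    return math.sqrt(sum(v * v for v in sample))

def detect_events(samples, baseline, threshold=5.0):
    """Return indices where the field magnitude deviates from the
    calibrated baseline by more than `threshold` (units depend on the
    sensor, e.g. microtesla) -- a crude proxy for magnet movement."""
    return [i for i, s in enumerate(samples)
            if abs(magnitude(s) - baseline) > threshold]

# Baseline calibrated while the tongue is at rest (hypothetical values).
rest = [(30.0, 10.0, 40.0)] * 5
baseline = sum(magnitude(s) for s in rest) / len(rest)

# A short stream with one simulated tongue movement at index 2.
stream = [(30.0, 10.0, 40.0), (30.5, 10.2, 39.8),
          (45.0, 25.0, 60.0), (30.1, 9.9, 40.2)]
print(detect_events(stream, baseline))  # -> [2]
```

A real recognizer would classify the time series of deviations (e.g. with a continuous gesture recognizer) rather than simple thresholding, but the calibrate-then-deviate structure is the core of magnetometer-based sensing.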