Brain-machine interfaces that translate attempted speech directly from the speech motor areas could change the lives of people with complete paralysis. However, exactly how speech production is encoded in cortex remains uncertain, and a better understanding could substantially improve brain-machine interface design. In particular, it is unclear to what extent the different levels of speech production are represented in motor cortex: phonemes (individual speech sounds) and articulatory gestures (the coordinated movements of the articulator muscles). Using electrocorticographic (ECoG) electrodes on the cortical surface, we recorded neural activity from speech motor and premotor areas during speech production and decoded both gestures and phonemes from the neural signals. Overall classification accuracy was higher for gestures than for phonemes. Moreover, gestures were better represented in the primary sensorimotor cortices, whereas phonemes were better represented in more anterior areas.
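The abstract does not specify the decoding method; as a minimal illustrative sketch only, per-trial neural feature vectors (e.g. band-power per electrode) could be classified into gesture or phoneme labels with a nearest-centroid decoder. All function names and data below are hypothetical, not from the paper:

```python
def train_centroids(features, labels):
    """Average the feature vectors belonging to each class label
    (e.g. each gesture or phoneme) to form one centroid per class."""
    sums, counts = {}, {}
    for x, y in zip(features, labels):
        acc = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            acc[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in acc] for y, acc in sums.items()}

def classify(x, centroids):
    """Assign a new feature vector to the class whose centroid is
    nearest in squared Euclidean distance."""
    def dist2(c):
        return sum((a - b) ** 2 for a, b in zip(x, c))
    return min(centroids, key=lambda y: dist2(centroids[y]))
```

Classification accuracy under such a scheme can then be compared between gesture labels and phoneme labels on held-out trials, and per-electrode to localize where each representation is strongest.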