Face perception is a complex process involving a network of brain structures that dynamically process information to support judgments about a face (e.g., familiarity, identity, and expression). Here we introduce an analysis methodology that makes it possible to study this information processing directly in the brain, using spatially and temporally resolved magnetoencephalographic signals. We apply our methodology to 2 face categorization tasks, gender and expressiveness, and track the processing of 3 key visual features that underlie behavioral performance, over time and throughout the cortex. We find correlates of information processing beginning 90 ms after stimulus onset, when features are processed in isolation in occipital extrastriate regions. Over time, occipitotemporal regions process successively more features and feature combinations, with visual information processing peaking at the well-established face-selective M170 component at 170 ms. Later still, around 250-400 ms, cortical activity responds significantly more to task-specific features and their complex combinations. These results indicate that face perception involves complex visual information processing, with face parts processed in isolation at very early stages and task-specific processing of feature combinations taking place within 300 ms. Crucially, our approach establishes which information in the visual stimulus the brain signal responds to, and how this varies with time, cortical location, and task demands, enabling a more precise tracking of information processing mechanisms in the cortex during face perception.
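The time-resolved analysis described above can be illustrated with a minimal sketch, under the assumption that "information processing" is quantified as mutual information between a stimulus feature and the recorded signal at each time point. This is a simplification for illustration only: the feature labels, effect size, and the plug-in histogram estimator here are all hypothetical, and the estimator used in the actual study may differ.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in histogram estimate of MI (in bits) between a discrete
    label x and a continuous signal y. Simple and biased upward for
    small samples; shown only to illustrate the analysis concept."""
    edges = np.histogram_bin_edges(y, bins=bins)
    y_binned = np.digitize(y, edges[1:-1])          # values in 0..bins-1
    joint = np.zeros((len(np.unique(x)), bins))
    for xi, yi in zip(x, y_binned):
        joint[xi, yi] += 1
    joint /= joint.sum()                            # joint probability table
    px = joint.sum(axis=1, keepdims=True)           # marginal over labels
    py = joint.sum(axis=0, keepdims=True)           # marginal over signal bins
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

# Simulated experiment: 400 trials, 100 time samples per trial.
rng = np.random.default_rng(0)
n_trials, n_times = 400, 100
labels = rng.integers(0, 2, n_trials)               # hypothetical binary feature
signal = rng.standard_normal((n_trials, n_times))   # noise-only MEG-like signal
# Inject a feature-dependent response in samples 40-49 (an assumed latency
# window, standing in for an evoked component such as the M170).
signal[:, 40:50] += labels[:, None] * 1.5

# MI time course: where it peaks reveals when the feature is "processed".
mi_t = np.array([mutual_information(labels, signal[:, t])
                 for t in range(n_times)])
peak = int(np.argmax(mi_t))
```

Running the same computation per cortical source rather than per sensor would yield the spatially and temporally resolved maps of feature processing described above.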