The time required to observe changes in participant evaluation of continuing medical education (CME) courses in surgical fields is unclear. We investigated how long such changes took to become evident for an orthopaedic course after educational redesign, using aggregate course-level data from 1359 participants who attended one of 23 AO Davos Courses over a 5-year period between 2007 and 2011. Participants evaluated courses using two previously validated 5-point Likert scales assessing content and faculty performance, and we compared results between courses that underwent educational redesign incorporating serial needs assessment, problem-based learning, and faculty training initiatives (Masters Courses) and those that did not (Non-Masters Courses). Average scores for the usefulness and relevance of a course and for faculty performance were significantly higher for redesigned courses (p < 0.0001), and evaluations improved significantly in both groups after faculty training was formalised in 2009 (p < 0.001). In summary, educational redesign incorporating serial needs assessment, problem-based learning, and faculty training initiatives was associated with improvement in participant evaluation, but these changes required 4-5 years to become evident.