The Statistical Reasoning Assessment (SRA) is one of the first objective instruments developed to assess students' statistical reasoning. Published in 1998 (Garfield, 1998a), it became widely available after the Garfield (2003) publication. Empirical studies applying the SRA by Garfield and co-authors brought forward two intriguing puzzles: the "gender puzzle" and the puzzle of "non-existing relations with course performance." Moreover, those studies find a much less puzzling country effect. The present study aims to address these three empirical findings. Our findings suggest that both puzzles may be at least partly understood in terms of differences in the effort students invest in studying: students with strong effort-based learning approaches tend to have lower correct-reasoning scores, and higher misconception scores, than students with other learning approaches. In contrast with earlier studies, we administered the SRA at the start of our course; the measured reasoning abilities, correct as well as incorrect, are therefore to be interpreted unequivocally as preconceptions, independent of any instruction in our course. Implications of the empirical findings for statistics education are discussed.