In today’s Chronicle of Higher Education, Marc Dunkelman questioned the way that numerical data now drive scientific inquiry.
Over the past several decades, the value we once put in simple observations has been eclipsed by the power of statistics. It’s not that anyone believes that data should be ignored. In many cases, hard figures offer us valuable insights and empower us to hold individuals and organizations to account. … But it’s come now to the point where hard data aren’t just the means to a conclusion—data have emerged as the parameter for our questions. …
The point isn’t that we ought not to be rigorous in testing and challenging suppositions. It’s not that we should ignore hard facts when they’re available. But the world of scholarship—whether inside the academy or out—can’t be limited to the boundaries of measurable data. There are too many important questions for which we can’t compute answers. The absence of numerical evidence shouldn’t discourage an investigation. Quite the opposite: If a question is worth answering, then the underlying issue should be considered worthy of simple and sustained observation.
As I tell my graduate students, quantitative research is important in the quest for knowledge, but it is only a piece of the research puzzle. Qualitative research is the other piece, and it is just as important. Among other things, qualitative research is necessary for generating the hypotheses that can later be tested using quantitative methods. Novel hypotheses tend to emerge from “simple and sustained observation” (Dunkelman, 2014).
Perhaps William Bruce Cameron (1963) said it best:

It would be nice if all of the data which sociologists require could be enumerated because then we could run them through IBM machines and draw charts as the statisticians do. However, not everything that can be counted counts, and not everything that counts can be counted.
Cameron, W. B. (1963). Informal sociology: A casual introduction to sociological thinking. New York, NY: Random House.
Dunkelman, M. J. (2014, August 19). What data can’t convey. The Chronicle of Higher Education. Retrieved from http://chronicle.com/blogs/conversation/2014/08/19/what-data-cant-convey/?cid=pm&utm_source=pm&utm_medium=en