
Beyond the Page: Reclaiming the "So What?" in Academic Research

  • Roger Morrad
  • Mar 23

As the clock runs into a new day, as I turn the metaphorical pages of the latest assignment submission and as the rain lashes the windowpane, a familiar question echoes through the quiet contemplation: "So what?"


This seemingly simple query, which we so often tell students to articulate in their essays, reports, research proposals and dissertations, holds a profound and unsettling mirror to the broader academic research enterprise itself. While we diligently train the next generation to explain the significance of their findings, it is increasingly imperative that researchers at all levels, and the institutions we inhabit, turn this critical lens inward. If we are to be the example, shouldn't we, as a collective academic body, consistently ask ourselves the "so what" when we embark on, execute, and disseminate our own research?


The uncomfortable truth is that a significant portion of academic output, despite its rigorous methodology and exhaustive literature reviews, struggles to answer this fundamental question convincingly. This is not to diminish the dedicated efforts of countless scholars, nor to discount the value of foundational or theoretical work. Rather, it is a call to critically evaluate systemic pressures that can dilute the ultimate impact and relevance of our collective intellectual endeavours.


Are We at Risk of Creating an Ecosystem of "Meaningless" Outputs: Where Does the "So What?" Get Lost?


The challenges preventing research from fully addressing the "so what" are multi-faceted.


The "Replication Crisis" and the Erosion of Trust:


At the heart of much research credibility lies reproducibility. The persistent "replication crisis" across various fields (Ioannidis, 2005; Open Science Collaboration, 2015) reveals that many published findings are difficult, if not impossible, to independently verify. If our findings cannot be reliably reproduced, the "so what" becomes profoundly ambiguous. What meaningful implication can we draw from a finding that may not hold true beyond its initial context? This systemic fragility undermines the cumulative nature of knowledge and erodes trust in our outputs. The "so what" of such research often evaporates upon closer scrutiny, leaving behind a void where robust evidence should stand.


"Publish or Perish" and the Tyranny of Quantity:


The prevailing "publish or perish" culture incentivises volume over genuine contribution in many fields. Researchers are often judged by the sheer number of their publications, leading to what some term the "natural selection of bad science" (Smaldino & McElreath, 2016). This pressure can foster incremental or trivial research findings, published to satisfy contractual obligations or to pad our CVs rather than to advance understanding. The "so what" of these micro-contributions often becomes a whisper, lost amidst a cacophony of similar findings, rarely coalescing into a coherent, impactful narrative. When quantity trumps quality, the deep, transformative "so what" is frequently sacrificed for superficial metrics.


Hyper-Specialisation and Disciplinary Silos:


Academic advancement often necessitates deep dives into highly specialised niches. While specialisation can foster profound insights, it frequently creates disciplinary silos, rendering research inaccessible or irrelevant to those outside very narrow sub-fields (Fire & Finfgeld-Connett, 2020). The language, methodologies, and even the conceptual frameworks become so esoteric that cross-pollination is stifled. The "so what" for a broader audience, or even for adjacent disciplines, often gets buried under layers of jargon and hyper-specific context. If only a handful of individuals globally can truly understand your contribution, what is its broader "so what" for society?


The Chasm of Translational Impact:


Perhaps the most direct challenge to the "so what" lies in the disconnect between academic findings and their practical application. Research, particularly in fields with clear societal relevance, frequently remains confined within journal paywalls and academic conferences, failing to translate into tangible solutions, improved policies, or public understanding (Sarewitz, 2016; MacLeod et al., 2014).


The "so what" then becomes a hypothetical: "This could change things, if only it reached the right people, in the right format, at the right time." Without a concerted effort to bridge this translational chasm, the potential "so what" of much research remains frustratingly unrealised.


Reclaiming the "So What": A Call for Purpose-Driven Scholarship:


To truly address the "so what," we must foster a research environment that values impact as highly as novelty, and relevance as much as rigor. This requires a systemic shift in how we conceive, conduct, and evaluate academic work.


Prioritising Problem-Driven Research:


Instead of starting solely with a theoretical gap, researchers should increasingly begin with real-world problems or societal challenges. Framing research questions around actionable insights naturally forces the "so what" into the foreground. This does not mean abandoning fundamental research but rather cultivating greater awareness of its potential pathways to impact.


Embracing Interdisciplinarity and Public Engagement:


Breaking down disciplinary walls is crucial for uncovering complex "so what" questions that transcend single fields. Furthermore, engaging with stakeholders outside academia, such as policymakers, industry leaders, community groups, and the general public, from the outset of a project can directly inform research questions, ensuring their relevance and enhancing pathways for impact and dissemination (Nowotny et al., 2003).


Reforming Evaluation Metrics:


A genuine shift requires a re-evaluation of how researchers are assessed. Moving beyond simple publication counts to include metrics of societal impact, public engagement, data sharing, and methodological transparency would signal a commitment to quality and relevance over mere quantity (Hicks et al., 2015). Rewarding rigorous replication and robust, open science practices would directly address the replication crisis.


Cultivating a Culture of Reflexivity:


Researchers must consistently ask themselves: Who benefits from this research? What problem does it solve? How will its findings be communicated beyond academic peers? What would happen if this research were never done? This internal reflexivity fosters a deeper sense of purpose and helps pre-emptively answer the "so what" for diverse audiences.

The "so what" is not merely a question for students; it is the animating force of meaningful scholarship. By collectively confronting the systemic issues that dilute our impact and by consciously re-orienting our efforts towards purpose-driven, publicly engaged, and rigorously reproducible research, we can reclaim the profound "so what" that truly defines the value of academic inquiry.


The future of knowledge, and its ability to shape a better world, depends on it!



References


Alberts, B., Kirschner, M. W., Tilghman, S. M., & Varmus, H. (2014). Rescuing US biomedical research from its systemic flaws. Proceedings of the National Academy of Sciences, 111(16), 5773–5777. https://doi.org/10.1073/pnas.1404402111


Fire, M., & Finfgeld-Connett, D. (2020). The problem of hyper-specialization in scientific research: A network perspective. Journal of Informetrics, 14(4), 101077. https://doi.org/10.1016/j.joi.2020.101077


Hicks, D., Wouters, P., Waltman, L., de Rijcke, S., & Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520(7548), 429–431. https://doi.org/10.1038/520429a


Ioannidis, J. P. A. (2005). Why most published research findings are false. PLoS Medicine, 2(8), e124. https://doi.org/10.1371/journal.pmed.0020124


MacLeod, M. R., Michie, S., Roberts, I., McKee, M., Chalmers, I., & Clarke, M. (2014). Biomedical research: Increasing value, reducing waste. The Lancet, 383(9912), 101–104. https://doi.org/10.1016/S0140-6736(13)62329-6


Nowotny, H., Scott, P., & Gibbons, M. (2003). Re-Thinking Science: Knowledge and the Public in an Age of Uncertainty. Polity Press.


Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716. https://doi.org/10.1126/science.aac4716


Sarewitz, D. (2016). The pressure to publish pushes science into irrelevance. Nature, 533, 147. https://doi.org/10.1038/533147a


Smaldino, P. E., & McElreath, R. (2016). The natural selection of bad science. Royal Society Open Science, 3(9), 160384. https://doi.org/10.1098/rsos.160384
