Beyond Productivity: Considering Systemic Effects alongside Outcome Measures

In the proposed common standards for research and development developed by the US Department of Education's Institute of Education Sciences (IES) and the National Science Foundation (NSF), two phrases are used to characterize research results: learning outcomes and educational outcomes. While these ways of assessing educational productivity are necessary, they are not, on their own, sufficient to cover the range of federal systemic investments in education. A proposed modification to these evidence standards adds the phrase systemic effects (or similar wording) for areas where outcome or productivity measures are not optimal or appropriate. This modification would help with innovations, often technologies, that have theoretical connections to productivity but whose evidentiary linkages can be complex and/or take many years to become manifest, as well as with policy areas that direct systemic behavior independent of productivity.

Systemic effects with technology that operates at systemic scale

Systemic effects become important with technologies that operate at systemic scales. Because these tools can affect many parts of an organization or system at the same time, productivity measures that typically focus on transitional data points become weak indicators. A well-known example is distance education, where research has shown dramatic growth in the industry but modest improvements in learning (Means et al., 2010). The focus on productivity has obscured how these same innovations can impact teaching, organizations, and markets. This kind of information could be especially helpful as interest grows in applying these technologies to K-12 settings. Other technologies, such as those from the federally funded State Longitudinal Data Systems (SLDS) program, the Learning Registry, and the National Instructional Materials Accessibility Standard (NIMAS), are all intended to operate across broad systemic spans (Piety, 2013). Many of these are motivated by expectations of systemic realignment; increased productivity is expected, but the main thrust is systemic change. Even new learning standards efforts, notably the Common Core and Next Generation Science Standards, can be seen as systemic technologies designed to change and realign activities in ways that may alter how productivity is measured and compared.

A focus on systemic effects would allow researchers to investigate the pathways of technology adoption more broadly, without narrowly constraining their studies to production. As researchers in organizational and technology studies have found, technologies can impact teams, markets, and systems (Ciborra, 1993) in different ways and over different time spans. There are adoption and adaptation processes that have at times been called diffusion (Rogers, 2010). Research focusing on individual or group productivity will likely miss these important evolutionary processes and the lessons that can be learned from them.

Policy direction for systemic effects

In a number of areas, systemic effects can be directly tied to policy goals that are neutral in terms of productivity. For example, the Individuals with Disabilities Education Act (IDEA) specifies that students with disabilities should be taught in the 'least restrictive environment' (Public Law 108-446). There are also various regulations and laws specifying that students should have access to instructional materials and educational opportunities despite physical impairments. While access to these educational tools may result in greater learning and educational outcomes, comparable outcomes may also be achieved with lower costs and/or less effort borne by other stakeholders. A focus on systemic effects would allow researchers to investigate these technologies and the complex ways they are used across various practice areas aligned to policy goals.

Interdisciplinary opportunities and challenges with evidence of systemic effects

Developing a robust evidence base of systemic effects introduces both opportunities and challenges. The opportunities include being able to "see" educational systems in action and to draw together different kinds of evidentiary artifacts that can simultaneously represent many ecosocial levels of activity, from learning through administration (Bransford et al., 2005; Lemke & Sabelli, 2008). This systemic vision will likely be like other depictions of education: imprecise and influenced by context. However, by signaling its importance through the language that categorizes educational research, IES and NSF will help to promote a needed discussion in the field about what these effects can be and how they can be reliably measured.

There will also be challenges to moving beyond outcome measures. Measuring productivity and effectiveness can be done according to methods that have been endorsed in policy and in national consensus publications, including Scientific Research in Education (NRC, 2002), and that are supported by large professional communities that have agreed on basic tenets of what counts as evidence and rigorous analysis. Moving away from these paradigms will entail engaging the broad and loosely connected qualitative research traditions, where evidence standards become more diverse and rigor less clear. Design-based research (DBR), which conceptually aligns with new technologies even though it has largely been used in the study of classroom models (Cobb, Confrey, Lehrer, & Schauble, 2003; Dede, 2004; Sandoval & Bell, 2004), will likely become part of the conversation.

Systemic effects are also an area with opportunities to learn from and adapt methods that other fields have developed to study organizational change processes. Ultimately, systemic effects will lead to important questions of rigor and of valuing a range of evidence types that vary not only in form but in their ability to support high-quality inferences, as new work by Behrens, Mislevy, Piety, and DiCerbo (2013) explores. Focusing on systemic effects presents the field with an opportunity not only to enlarge its own methodological toolkit, but also to benefit from connections to other domains.


Behrens, J., Mislevy, R., Piety, P., & DiCerbo, K. (2013). Inferential foundations for learning analytics in the digital ocean. Draft report to the Learning Analytics Work Group. H-Star Institute, Stanford University.

Bransford, J., Barron, B., Pea, R. D., Meltzoff, A., Kuhl, P., Bell, P., … & Sabelli, N. (2005). Foundations and opportunities for an interdisciplinary science of learning. The Cambridge handbook of the learning sciences, 39-77.

Brynjolfsson, E., & Hitt, L. (2000). Beyond computation: Information technology, organizational transformation and business performance. The Journal of Economic Perspectives, 14(4), 23-48.

Ciborra, C. (1993). Teams, Markets, and Systems: Business Innovation and Information Technology. Cambridge: Cambridge University Press.

Cobb, P., Confrey, J., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9-13.

Dede, C. (2004). If Design-Based Research is the Answer, What is the Question? A Commentary in the JLS Special Issue on Design-Based Research. Journal of the Learning Sciences, 13(1), 105-114.

Lemke, J. L., & Sabelli, N. H. (2008). Complex systems and educational change: Towards a new research agenda. Educational Philosophy and Theory, 40(1), 118-129.

Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2010). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Washington, DC: U.S. Department of Education, Office of Planning, Evaluation, and Policy Development.

National Research Council (NRC). (2002). Scientific Research in Education. Washington, DC: National Academy Press.

Piety, P. J. (2013). Assessing the educational data movement. New York, NY: Teachers College Press.

Public Law 108-446 (2004). Individuals with Disabilities Education Improvement Act.

Rogers, E. M. (2010). Diffusion of innovations. New York, NY: Free Press.

Sandoval, W. A., & Bell, P. (2004). Design-based research methods for studying learning in context: Introduction. Educational Psychologist, 39(4), 199-201.


Dr. Piety is a national expert in educational data. He is on the faculty of the Robert H. Smith School of Business at the University of Maryland where he teaches information and database technologies, cloud computing, and social media. He is the author of Assessing the Educational Data Movement, a book on using data to improve school success.
