

While Part I outlined some of what researchers take to be true about learning, and argued that learning analytics can make important contributions to the methodology of modern learning research, in this posting I describe how learning analytics might contribute to conducting design-based research (DBR), sometimes also referred to as design experiments.

DBR has the goal “…to use the close study of learning as it unfolds within a naturalistic context that contains theoretically inspired innovations, usually that have passed through multiple iterations, to then develop new theories, artifacts, and practices that can be generalized to other schools and classrooms” (Barab, 2014, p. 151). Design-based research is a ‘natural’ fit between the learning sciences and learning analytics because DBR shares with learning analytics the goal to provide solutions to practical problems. At the same time, these solutions are expected to be grounded in a theory of learning, hence applying the solution can be seen as a (partial) test of the theory, and improving the solution incrementally over time can be seen as contributing to advancing theory over time. 

In design-based research, theory is essential for generalization because design experiments mostly do not use a control-group logic, but are structured as within-subjects, repeated-measurements designs: a baseline is observed, an intervention is performed (e.g., a change in teaching style, a different curriculum, a new or different technology), and the effects of the intervention are gauged in terms of changes to the baseline. Design-based research often makes use of qualitative methods, frequently in combination with quantitative methods. This increases its value for informing the (re-)design of the intervention, and its value for theory building. The main difference between design experiments and standard control-group experiments is that in design experiments context is seen as part of the treatment, thus acknowledging the situated nature of learning; context variables are not seen as ‘interfering’, but as providing the resources through which theoretically expected learning processes become realized in a specific learning situation. This does not mean that DBR lacks a concept of interference, but it is not context ‘variables’ that are seen as potentially interfering; instead, other mechanisms active in the same context can interfere. The basis for generalization is provided by keeping the mechanisms that cause learning analytically separate from the context; this analytical distinction allows one to formulate expectations about how the mechanisms might play out in other contexts, and is hence the basis for the form of generalization most prevalent in design-based research: analytical generalization (Ercikan & Roth, 2014; Maxwell, 2004). The DBR methodology is in this respect similar to the methodology of case studies (Yin, 2003): generalizing is performed by relating the specific case to theories with explanatory value.
The specific case observations are not taken as applying in an identical manner to a “population”, but are related to similar processes and/or more abstract types of processes. It is not the specific participants in the study who are seen as instances of a (statistically meaningful) ‘population’; instead, the specific observation is treated as “an instance of” something more abstract and, in this sense, more general (Reimann, 2013).
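The within-subjects logic described above can be sketched in code. The following is a minimal, purely illustrative example: the student identifiers, the weekly engagement scores, and the notion of a single pre/post phase boundary are all invented for illustration, and a real design experiment would of course use richer analytics data and more careful change measures.

```python
from statistics import mean

# Hypothetical data: weekly engagement scores for three students,
# observed before (baseline) and after (intervention) a change in
# teaching style. All names and numbers are illustrative only.
baseline = {
    "s1": [3.0, 3.2, 3.1],
    "s2": [2.5, 2.4, 2.6],
    "s3": [4.0, 3.9, 4.1],
}
intervention = {
    "s1": [3.8, 4.0, 4.1],
    "s2": [3.0, 3.2, 3.1],
    "s3": [4.2, 4.5, 4.4],
}

def within_subject_change(before, after):
    """Per-student difference between intervention and baseline means.

    Each student serves as their own control, mirroring the
    within-subjects, repeated-measurements logic of design
    experiments: effects are gauged as changes to the baseline,
    not as differences from a separate control group.
    """
    return {s: mean(after[s]) - mean(before[s]) for s in before}

changes = within_subject_change(baseline, intervention)
print(changes)
```

The point of the sketch is the structure of the comparison, not the statistics: each student's intervention-phase observations are read against that same student's baseline, which is exactly the logic that makes context part of the treatment rather than a nuisance variable.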

In more concrete terms, theory enters into design-based research in the form of conjectures, which mainly take the form of learning trajectories and design claims. A learning trajectory describes how learning develops in the absence of the intervention—humans, like any organism, cannot not learn—and how learning changes under the influence of the intervention, in particular its theory-informed aspects. Learning trajectories specify expectations about the form of change, perhaps its extent (‘size’), and should say something about its temporal aspects: When will the effect materialize? For how long? Design claims are conjectures about how specific aspects of the intervention affect students’ learning and understanding. Like expectations about learning trajectories, design claims focus mainly on those aspects of the pedagogical and/or technical design that are related to relevant theory.

Cobb and Gravemeijer (2008) provide a good example of the role of theory in design-based research. Their study focuses on middle-school statistics and describes a number of design cycles for creating computational representations that help teachers introduce notions such as center, skewness, spread, and relative frequency coherently from the concept of a mathematical distribution. Based on the statistics education literature and classroom observations, the authors identify as an important step in the learning trajectory that students will initially need to learn to appreciate the difference between numbers and data. Therefore, tasks and computer-generated graphical representations intended to make students aware of the fact that they are analyzing data need to be developed. As a theoretical framing, the specific learning trajectory is contextualized in the wider context of mathematical reasoning, in particular learning about data generation and about developing and critiquing data-based arguments. The authors developed three computational tools, with different but synergistic representational notations, that in concert with capable teachers began to move students’ conceptions of distribution in a mathematically fruitful direction.

The potential for synergies between design-based research and learning analytics is obvious. DBR could greatly profit from data on students that are gathered unobtrusively, trace learning on multiple levels, and span longer stretches of time. It could further profit from making these data rapidly, if not continuously, available to teachers and students. Teachers are an essential part of most curricular activity systems (Roschelle, Knudsen, & Hegedus, 2010), and students have to learn how to monitor and steer their own learning (Bull, Johnson, Masci, & Biel, 2016). Learning analytics, for its part, would become more experimental, more interventionist. I see this as a good development to the extent that pedagogical and technical interventions have the goal to improve upon teaching, to innovate. This is preferable to the use of advanced analytical methods for reinforcing current practices, among them practices that might be pedagogically dubious. Along with becoming more experimental, learning analytics would also become more engaged in the advancement of theory via the testing of hypotheses (e.g., the testing of design claims and of conjectures about learning trajectories). This is not an alternative to learning analytics as a methodology for applied research (Pardo & Dawson, 2016), but adds a dimension that can benefit teaching and learning.

Since learning analytics, in combination with educational data mining, is very comprehensive in terms of the methods it encompasses, the shift I am suggesting is not a radical one. The two main ‘moves’ needed are, firstly, a closer alignment of learning analytics with interventionist types of educational research, such as design-based research, and with the emerging educational improvement science (Bryk, 2015). Secondly, learning analytics researchers and practitioners would need to engage more with the development and testing of learning theories, broadly conceived. I would consider it particularly valuable if learning analytics added to learning research--and to educational research in general--methods that go beyond the already well-established applications of the General Linear Model (mainly regression models and analysis of variance). Methods such as social network analysis, pattern learning, and others that allow one to analyze the structures and properties that emerge from the relations between entities are potentially more interesting for theory building than linear modelling methods, which might nevertheless be useful for practical purposes. This would not only add incrementally to the method repertoire of learning research, but could transform to some extent how learning research is done: from a discipline that mainly describes and orders phenomena and findings with qualitative and statistical methods to a discipline that develops causal-explanatory accounts of learning-in-context.
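The contrast with linear modelling can be made concrete with a small sketch of a social-network measure. The forum-reply data and student names below are entirely invented; the point is only that the quantity computed is relational—it describes a student's position in an emergent interaction structure, not an attribute measurable on that student in isolation.

```python
from collections import defaultdict

# Hypothetical forum-reply data: (replier, original poster) pairs,
# of the kind that could be extracted from discussion logs.
# All names are illustrative only.
replies = [
    ("ana", "ben"), ("ben", "ana"), ("cal", "ana"),
    ("dee", "ana"), ("cal", "ben"), ("dee", "cal"),
]

def degree_centrality(edges):
    """Number of distinct interaction partners per student.

    Treats replies as undirected ties. The resulting measure emerges
    from relations between students, unlike an individual attribute
    (such as a test score) that would feed a regression model.
    """
    partners = defaultdict(set)
    for a, b in edges:
        partners[a].add(b)
        partners[b].add(a)
    return {student: len(p) for student, p in partners.items()}

print(degree_centrality(replies))
```

Even this toy measure illustrates the methodological shift: the unit of analysis is the structure of the interactions, which is exactly the kind of emergent property that General Linear Model approaches, operating on per-individual attributes, do not capture directly.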

An additional transformative potential of learning analytics for educational research concerns the distribution of analytical work: At least in technical terms, it is a small step from gathering data comprehensively to making them available openly. Issues of data protection and privacy aside, there lies a huge innovation potential in making learning data available publicly, in usable formats, because educational challenges are truly too big for any single researcher or research team to solve (Weinberger, 2011). 


Barab, S. A. (2014). Design-based research: A methodological toolkit for engineering change. In R. K. Sawyer (Ed.), Cambridge Handbook of the Learning Sciences (2nd ed., pp. 151-170). New York: Cambridge University Press.
Bryk, A. S. (2015). 2014 AERA Distinguished Lecture: Accelerating how we learn to improve. Educational Researcher, online first.
Bull, S., Johnson, M. D., Masci, D., & Biel, C. (2016). Integrating and visualising diagnostic information for the benefit of learning. In P. Reimann, S. Bull, M. Kickmeier-Rust, R. Vatrapu & B. Wasson (Eds.), Measuring and visualizing learning in the information-rich classroom (pp. 167-180). New York, NY: Routledge.
Cobb, P., & Gravemeijer, K. (2008). Experimenting to support and understand learning processes. In A. E. Kelly, R. A. Lesh & J. Y. Baek (Eds.), Handbook of design research methods in education (pp. 68-95). New York: Routledge.
Ercikan, K., & Roth, W. M. (2014). Limits of generalizing in education research: Why criteria for research generalization should include population heterogeneity and uses of knowledge claims. Teachers College Record, 116(May), 1-28.
Maxwell, J.A. (2004). Using qualitative methods for causal explanations. Field Methods, 16, 243-264.
Pardo, A., & Dawson, S. (2016). Learning analytics: How can data be used to improve learning practice? In P. Reimann, S. Bull, M. Kickmeier-Rust, R. Vatrapu & B. Wasson (Eds.), Measuring and visualizing learning in the information-rich classroom (pp. 41-55). New York, NY: Routledge.
Reimann, P. (2013). Design-based research - designing as research. In R. Luckin, S. Puntambekar, P. Goodyear, B. Grabowski, J. D. M. Underwood & N. Winters (Eds.), Handbook of design in educational technology (pp. 44-52). New York: Taylor & Francis.
Roschelle, J., Knudsen, J., & Hegedus, S. (2010). From new technological infrastructures to curricular activity systems: Advanced designs for teaching and learning. In M. J. Jacobson & P. Reimann (Eds.), Designs for learning environments of the future (pp. 233-262). New York: Springer.
Weinberger, D. (2011). Too big to know: Rethinking knowledge now that the facts aren't the facts, experts are everywhere, and the smartest person in the room is the room. New York, NY: Basic Books.
Yin, R. K. (2003). Case study research: Design and methods (3rd ed.). Thousand Oaks, CA: Sage.

About the Blog

Research by the University's Centre for Research on Learning and Innovation (CRLI).