

Work in progress

While Part I outlined some of what researchers take to be true about learning, and argued that learning analytics can make important contributions to the methodology of modern learning research, in this posting I describe how learning analytics might contribute to conducting design-based research (DBR), sometimes also referred to as design experiments.

DBR has the goal "…to use the close study of learning as it unfolds within a naturalistic context that contains theoretically inspired innovations, usually that have passed through multiple iterations, to then develop new theories, artifacts, and practices that can be generalized to other schools and classrooms" (Barab, 2014, p. 151). Design-based research is a 'natural' point of connection between the learning sciences and learning analytics because DBR shares with learning analytics the goal of providing solutions to practical problems. At the same time, these solutions are expected to be grounded in a theory of learning; hence applying the solution can be seen as a (partial) test of the theory, and improving the solution incrementally can be seen as contributing to advancing theory over time.

In design-based research, theory is essential for generalization because design experiments mostly do not use a control-group logic, but are structured as within-subjects, repeated-measurements designs: a baseline is observed, an intervention is performed (e.g., a change in teaching style, a different curriculum, a new or different technology), and the effects of the intervention are gauged in terms of changes to the baseline. Design-based research often makes use of qualitative methods, frequently in combination with quantitative methods. This increases its value for informing the (re-)design of the intervention, and its value for theory building. The main difference between design experiments and standard control-group experiments is that in design experiments context is seen as part of the treatment, thus acknowledging the situated nature of learning; context variables are not seen as 'interfering', but as providing the resources through which theoretically expected learning processes become realized in a specific learning situation. This does not mean that DBR has no concept of interference, but it is not context 'variables' that are seen as potentially interfering; instead, other mechanisms that are active in the same context can interfere. The basis for generalization is provided by keeping the mechanisms that cause learning analytically separate from the context; this analytical distinction allows one to formulate expectations about how the mechanisms might play out in other contexts, and is hence the basis for the form of generalization most prevalent in design-based research: analytical generalization (Ercikan & Roth, 2014; Maxwell, 2004). The DBR methodology is in this respect similar to the methodology of case studies (Yin, 2003): generalizing is performed by relating the specific case to theories with explanatory value. The specific case observations are not taken as applying in an identical manner to a "population", but are related to similar processes and/or more abstract types of processes. It is not the specific participants in the study who are seen as instances of a (statistically meaningful) 'population'; instead, the specific observation is treated as "an instance of" something more abstract and, in this sense, more general (Reimann, 2013).
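To make the repeated-measurements logic concrete, here is a minimal sketch, in Python with invented scores, of how a baseline-versus-post-intervention comparison within the same students might be run. The data, the test chosen, and the variable names are illustrative assumptions, not part of any particular DBR study.

```python
# Hypothetical within-subjects comparison: each student is measured at
# baseline and again after the intervention; the 'effect' is the change
# relative to each student's own baseline, not relative to a control group.
from scipy.stats import wilcoxon

# Invented scores for eight students (baseline, post-intervention)
baseline = [0.42, 0.55, 0.38, 0.61, 0.47, 0.50, 0.33, 0.58]
post     = [0.51, 0.60, 0.45, 0.66, 0.49, 0.62, 0.41, 0.57]

# Per-student change scores: the unit of interest in a repeated-measures design
changes = [p - b for b, p in zip(baseline, post)]
print("mean change:", sum(changes) / len(changes))

# A non-parametric paired test; small samples are common in design experiments
statistic, p_value = wilcoxon(baseline, post)
print("Wilcoxon signed-rank:", statistic, "p =", round(p_value, 3))
```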

In more concrete terms, theory enters into design-based research in the form of conjectures, mainly in the form of learning trajectories and design claims. A learning trajectory describes how learning develops in the absence of the intervention—humans, like any organism, cannot not learn—and how learning changes under the influence of the intervention, in particular its theory-informed aspects. Learning trajectories specify expectations about the form of change, perhaps its extent ('size'), and should say something about its temporal aspects: When will the effect materialize? For how long? Design claims are conjectures about how specific aspects of the intervention affect students' learning and understanding. Like expectations about learning trajectories, design claims focus mainly on those aspects of the pedagogical and/or technical design that are related to relevant theory.

Cobb and Gravemeijer (2008) provide a good example of the role of theory in design-based research. Their study focuses on middle school statistics and describes a number of design cycles for creating computational representations that help teachers introduce notions such as center, skewness, spread, and relative frequency coherently from the concept of a mathematical distribution. Based on the statistics education literature and classroom observations, the authors identify as an important step in the learning trajectory that students will initially need to learn to appreciate the difference between numbers and data. Therefore, tasks and computer-generated graphical representations need to be developed that make students aware of the fact that they are analyzing data. As a theoretical framing, the specific learning trajectory is contextualized in the wider context of mathematical reasoning, in particular learning about data generation and about developing and critiquing data-based arguments. The authors developed three computational tools, with different but synergistic representational notations, that in concert with capable teachers began to move students' conceptions of distribution in a mathematically fruitful direction.

The potential for synergies between design-based research and learning analytics is obvious. DBR could greatly profit from data on students that are gathered unobtrusively, trace learning on multiple levels, and cover longer stretches of time. It could further profit from making these data rapidly, if not continuously, available to teachers and students. Teachers are an essential part of most curricular activity systems (Roschelle, Knudsen, & Hegedus, 2010), and students have to learn how to monitor and steer their own learning (Bull, Johnson, Masci, & Biel, 2016). Learning analytics, for its part, would become more experimental, more interventionist. I see this as a good development to the extent that pedagogical and technical interventions have the goal to improve upon teaching, to innovate. This is preferable to the use of advanced analytical methods for reinforcing current practices, amongst them practices that might be pedagogically dubious. Along with becoming more experimental, learning analytics would also become more engaged in the advancement of theory via the testing of hypotheses (e.g., the testing of design claims and of conjectures about learning trajectories). This is not an alternative to learning analytics as a methodology for applied research (Pardo & Dawson, 2016), but adds a dimension that can benefit teaching and learning.

Since learning analytics, in combination with educational data mining, is very comprehensive in terms of the methods it encompasses, the shift I am suggesting is not a radical one. The two main 'moves' needed are, firstly, a closer alignment of learning analytics with interventionist types of educational research, such as design-based research and the emerging educational improvement science (Bryk, 2015). Secondly, learning analytics researchers and practitioners would need to engage more with the development and testing of learning theories, broadly conceived. I consider it particularly valuable if learning analytics would add to learning research--and to educational research in general--methods that go beyond the already well-established applications of the General Linear Model (mainly regression models and analysis of variance). Methods such as social network analysis, pattern learning, and others that allow researchers to analyze the structures and properties that emerge from the relations between entities are potentially more interesting for theory building than linear modelling methods, which might nevertheless be useful for practical purposes. This would not only add incrementally to the method repertoire of learning research, but could transform to some extent how learning research is done: from a discipline that mainly describes and orders phenomena and findings with qualitative and statistical methods to a discipline that develops causal-explanatory accounts of learning-in-context.

An additional transformative potential of learning analytics for educational research concerns the distribution of analytical work: at least in technical terms, it is a small step from gathering data comprehensively to making them available openly. Issues of data protection and privacy aside, there lies a huge innovation potential in making learning data publicly available, in usable formats, because educational challenges are truly too big for any single researcher or research team to solve (Weinberger, 2011).

References:

Barab, S. A. (2014). Design-based research: A methodological toolkit for engineering change. In R. K. Sawyer (Ed.), Cambridge Handbook of the Learning Sciences (2nd ed., pp. 151-170). New York: Cambridge University Press.
Bryk, A. S. (2015). 2014 AERA Distinguished Lecture: Accelerating how we learn to improve. Educational Researcher, online first.
Bull, S., Johnson, M.D., Masci, D., & Biel, C. (2016). Integrating and visualising diagnostic information for the benefit of learning. In P. Reimann, S. Bull, M. Kickmeier-Rust, R. Vatrapu & B. Wasson (Eds.), Measuring and visualizing learning in the information-rich classroom (pp. 167-180). New York,NY: Routledge.
Cobb, P., & Gravemeijer, K. (2008). Experimenting to support and understand learning processes. In A. E. Kelly, R. A. Lesh & J. Y. Baek (Eds.), Handbook of design research methods in education (pp. 68-95). New York: Routledge.
Ercikan, K., & Roth, W. M. (2014). Limits of generalizing in education research: Why criteria for research generalization should include population heterogeneity and uses of knowledge claims. Teachers College Record, 116(May), 1-28.
Maxwell, J.A. (2004). Using qualitative methods for causal explanations. Field Methods, 16, 243-264.
Pardo, A., & Dawson, S. (2016). Learning analytics: How can data be used to improve learning practice? In P. Reimann, S. Bull, M. Kickmeier-Rust, R. Vatrapu & B. Wasson (Eds.), Measuring and visualizing learning in the information-rich classroom (pp. 41-55). New York,NY: Routledge.
Reimann, P. (2013). Design-based research - designing as research. In R. Luckin, S. Puntambekar, P. Goodyear, B. Grabowski, J. D. M. Underwood & N. Winters (Eds.), Handbook of design in educational technology (pp. 44-52). New York: Taylor & Francis.
Roschelle, J., Knudsen, J., & Hegedus, S. (2010). From new technological infrastructures to curricular activity systems: Advanced designs for teaching and learning. In M. J. Jacobson & P. Reimann (Eds.), Designs for learning environments of the future (pp. 233-262). New York: Springer.
Weinberger, D. (2011). Too big to know: Rethinking knowledge now that the facts aren't the facts, experts are everywhere, and the smartest person in the room is the room. New York, NY.: Basic Books.
Yin, R. K. (2003). Case study research: Design and methods (3rd ed.). Thousand Oaks, CA: Sage.

Learning analytics is a young field of research (Baker & Siemens, 2014a; Baker & Yacef, 2009) that, along with educational data mining, has grown rapidly, driven by the availability of (large) sets of data on students' learning and the interest in analysing these data for the purpose of improving students' learning and learning experience. I do not make much of the difference between learning analytics and educational data mining here, but it is worth keeping in mind that there are differences between the two fields, even though they are closely related and draw on very much overlapping research communities. Siemens and Baker (2012) identify the following differences:

  • EDM researchers are more interested in automated methods for discovery, while LA researchers are more interested in human-led, mixed-initiative methods for exploring educational data;
  • EDM is more construct-oriented, while LA researchers emphasize a more holistic view of learning and learners;
  • Researchers in EDM develop methods for automatic adaptation of instruction, whereas LA researchers are developing applications that inform teachers, educators, and students. Hence the strong interest in LA in learning visualisations.

The focus of this paper is on the relation between learning analytics (and EDM) and learning research, in particular the kind of learning research practiced in the Learning Sciences (Sawyer, 2014). My intention is thus similar to that of Baker and Siemens in their contribution to the second edition of the Cambridge Handbook of the Learning Sciences (Baker & Siemens, 2014b): to contribute to a stronger tie between learning analytics and learning (sciences) research. However, different from Baker and Siemens, I believe that important contributions from learning analytics to learning research are still a matter of the future. I argue that while the potential is there, it is far from realized, even far from being realized. In terms of Pasteur's Quadrant (Stokes, 1997), I see learning analytics as currently falling into the category of pure applied research, whereas the learning sciences can be seen as use-inspired basic research, in which the focus is on advancing "the frontiers of understanding but also inspired by considerations of use" (Stokes, 1997, p. 74).

The main strategy I am following here is to develop some suggestions for how to make LA more relevant for foundational research on learning. I argue that the methods used in learning analytics (and EDM) have the potential to contribute to the applied as well as the foundational objectives of learning research. I further suggest that a more theory-oriented learning analytics can be more than an 'addition' to the 'toolbox' of the learning researcher, and that the 'import' could be more profound: it could change to a certain extent how we think about research methodology in the learning sciences.

The potential of Learning Analytics in learning research

The potential of learning analytics for the advancement of learning research can, in my opinion, unfold along four dimensions: (i) data quantity, (ii) longitudinal data, (iii) data from multiple levels, and (iv) data from many locations. In this section, I map these characteristics of data in learning analytics to modern conceptions of learning and main findings from learning research.

Quantity of Data

The size of data sets is the primary argument for the value of LA: "One of the factors leading to the recent emergence of learning analytics is the increasing quantity of analysable educational data (…) Papers have recently been published with data from tens of thousands of students," write Baker and Siemens (2014a, p. 254). Size is not only measured in the number of students; the number of data points per student (captured in log files of learning applications and platforms, for instance) is another quantitative dimension. The Pittsburgh Science of Learning Center DataShop (Koedinger et al., 2010), for instance, stores detailed recordings of students' interactions with carefully designed tutor software that records step-by-step problem-solving operations.

There are a number of reasons why size is considered to matter. One is that the number of students is taken as useful for establishing the generalizability of findings—a statistical argument. Another is that the more data, the more 'patterns' can be found. The flip side of this is that the number of possible relations between variables increases exponentially with the number of variables included in the analysis (National Research Council, 2013). More is needed than just data to 'discover' meaningful relations.
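As a rough illustration of why more data alone does not yield meaning, the following sketch (with arbitrary numbers of my own choosing) counts how quickly the space of candidate relations grows as variables are added: pairwise relations grow quadratically, and the number of possible multi-variable combinations grows exponentially.

```python
from math import comb

# Illustrative only: how the search space of candidate 'patterns' grows
# with the number of variables included in an analysis.
for p in (10, 20, 50):
    pairwise = comb(p, 2)        # possible pairwise relations
    subsets = 2 ** p - p - 1     # possible combinations of two or more variables
    print(f"{p} variables: {pairwise} pairwise relations, "
          f"{subsets} variable combinations to screen")
```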

A third argument for the value of large data sets is that they allow us to identify 'rare' events: events or patterns that occur in only small numbers of students or only sporadically (e.g., Sabourin, Rowe, Mott, & Lester, 2011). This is particularly interesting if the rare events are defined a priori: events that theory predicts, but that seldom occur spontaneously, or are seldom observable because of interactions with other processes (or because of measurement issues). The inverse is interesting as well: theory might not allow certain events to happen; if they do happen, their appearance is interesting because this might not just be a measurement error, or due to 'chance', but might indicate a limitation of the theory; it might even render the theory downright wrong.

While all three aspects of data quantity are beneficial to learning research, the third aspect—rare event detection—deserves more attention. It is the one least often considered, but it can contribute to making the learning sciences more theory-guided, and it can help to bridge the gap between qualitative and quantitative learning research. In qualitative research, the frequency with which an event occurs is not automatically identified with the importance of the event; in many cases, important events are rare. An example from learning research is conceptual change, which occurs rarely, but when it occurs has profound effects on students' understanding (diSessa, 2006).
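A minimal sketch of the simplest version of rare-event screening: given a per-student event log, flag event types that occur for only a small fraction of students. The event names and the threshold are invented for illustration; in practice the events of interest would be defined from theory, as argued above.

```python
from collections import defaultdict

# Hypothetical event log: (student_id, event_type) pairs
log = [
    ("s1", "hint_request"), ("s1", "answer_submit"),
    ("s2", "answer_submit"), ("s2", "self_explanation"),
    ("s3", "answer_submit"), ("s3", "hint_request"),
    ("s4", "answer_submit"),
]

# For each event type, record which distinct students produced it
students_per_event = defaultdict(set)
for student, event in log:
    students_per_event[event].add(student)

n_students = len({s for s, _ in log})
RARE_THRESHOLD = 0.3  # illustrative: events seen in fewer than 30% of students

rare_events = {e: len(studs) / n_students
               for e, studs in students_per_event.items()
               if len(studs) / n_students < RARE_THRESHOLD}
print(rare_events)  # e.g. {'self_explanation': 0.25}
```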

Longitudinal Learning Data

Learning needs time. Learning in schools and universities often requires multiple skills—such as mathematical and writing skills—to master complex, hierarchically structured subject matter. In science education, for instance, the hierarchical nature of the subject knowledge also leads to the subject being an intricate association of concepts, where deep learning of some basic concepts requires comprehension of other basic concepts (Fergusson-Hessler & de Jong, 1987). Theoretical accounts of the depth, and the time, it takes to comprehend scientific concepts have been suggested from a cognitive psychology perspective and from a socio-cultural perspective. From the cognitive psychology perspective, one line of argument is that learning science can be seen as developing a form of expertise, and that any form of real expertise in cognitively demanding areas requires years of learning (the magic number is 10 years, plus/minus 2), as evidenced by novice-expert research; see (K. A. Ericsson, Charness, Feltovich, & Hoffman, 2006) for a comprehensive overview. The best elaborated cognitive model of expertise development in the cognitive tradition is currently probably Ericsson's deliberate practice theory (K. A. Ericsson, Krampe, & Tesch-Römer, 1993). The reason why learning takes long in this model is the incremental nature of the underlying cognitive learning/change mechanisms (chunking, proceduralization).

Another cognitive account, and one more specific to science education than general models of expertise development, is Chi and Slotta's ontology shift theory (e.g., Chi, Slotta & de Leeuw, 1994). On this account, learning scientific concepts is hard, and everyday concepts are resistant to change, because scientific understanding requires in many cases a change in ontological category. A classical example is the concept of heat, which students often see as a property of matter, whereas in physics it is seen in process terms, as the average velocity of particles. In this theory, the reason that learning often stretches over longer times is that while the ontology change itself can be fairly rapid, it often needs extended time (under current conditions of science learning) before students become sufficiently aware of the limitations of the initial ontology and are ready to accept an alternative one.

Tracking learning that stretches over months and years—another example would be the development of second language skills—is very rarely done in learning research. One reason is the cost, and the logistics, of performing such research. But the costs are being substantially lowered as learning analytics methods find their place in schools and universities. It would be of tremendous benefit if such data could be made available to researchers, and their acquisition planned in coordination with research projects. Methods for process mining are particularly relevant in this context (Reimann, 2009). Not only would this help to conduct specific projects that study long-term learning, it would also change the way we think about the nature of projects in learning research: from short-term interventions with immediate effects assessment to longer-duration interventions with continuous, long-duration monitoring of effects (and side-effects!). We see a variant of this kind of research developing with improvement research (Bryk, 2015) and the continuous use of data for decision making (Mandinach, 2012).
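To hint at what process mining adds for longitudinal data, here is a minimal sketch, with invented activity names and no process-mining library assumed, of one of its basic building blocks: the directly-follows relation, which counts how often one learning activity is immediately followed by another within each student's time-ordered trace.

```python
from collections import Counter

# Hypothetical ordered traces: one list of activities per student, in time order
traces = {
    "s1": ["read", "quiz", "revise", "quiz"],
    "s2": ["read", "revise", "quiz"],
    "s3": ["quiz", "read", "quiz"],
}

# Directly-follows counts: how often activity a is immediately followed by b
dfg = Counter()
for activities in traces.values():
    for a, b in zip(activities, activities[1:]):
        dfg[(a, b)] += 1

# The resulting graph is the starting point for discovering process models
for (a, b), n in sorted(dfg.items(), key=lambda kv: -kv[1]):
    print(f"{a} -> {b}: {n}")
```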

Data from Learning on Multiple Levels - Learning is Complex

Learning does not only take place over long durations; on other levels of analysis it happens within seconds and even milliseconds. Nathan and Alibali (2010) distinguish between learning in milliseconds and below (biological), seconds (cognitive), minutes to hours (rational), days to months (sociocultural), and years and beyond (organizational). This can be seen as an expression of strictly different kinds of learning, but more productively it may be seen as an expression of the fact that learning takes place at multiple levels at the same time. We can see learning 'events' as being produced by a complex, multi-layered system, with minimally three levels: a biological stratum with neurophysiological processes, a cognitive stratum (rational thinking, knowledge), and a socio-cultural stratum (tools, practices). These strata, or levels, are set in relation to each other by processes of emergence (Sawyer, 2005).


The concept of emergence as used here is relational: it refers to the phenomenon that wholes (entities, agents, organisms, organisations) have properties that cannot be found in any of their parts. An emergent property "is one that is not possessed by any of the parts individually and that would not be possessed by the full set of parts in the absence of a structuring set of relations between them" (Elder-Vass, 2010, p. 17). A key aspect of (relational) emergence is therefore the organization of the parts: how the parts are set in relation to each other, how the whole is structured. Not all properties of an object are emergent; some will be resultant properties. For instance, most objects have mass, which is a resultant property: the mass of the whole is the sum of the parts' masses. Some objects have colour, which is an emergent property; it is dependent on the organization of the object's parts.


If we conceive of learning as a complexity phenomenon (Kapur et al., 2007), then learning needs to be studied not only at multiple levels, but the analysis of the relation between the levels—the nature of the emergence—must take center stage. This requires us not only to ask what affects learning over time, but also how learning is constituted at each moment in time: which configurations of neural, cognitive, motivational, emotional, social and contextual processes and elements give rise to a 'learning event'? Answering the latter question requires appropriate instrumentation and appropriate analytical methods. The methods cannot be (only) variants of the General Linear Model (e.g., regression models, including so-called 'structural' or 'causal' variants), amongst other reasons because these are not appropriate for non-linear complex systems, for systems that transform themselves or get transformed. Instead, methods for the analysis of non-linear systems will be needed (e.g., van Geert, 1998), as well as methods that can be used to describe relations between parts, in particular graph-theoretical methods such as social network analysis (Burt, Kilduff, & Tasselli, 2013). Learning analytics and educational data mining can play a key role in advancing the learning sciences by bringing about such methodological advances and by making them usable for learning researchers. These include, but should not be confined to, methods for recording bio-signals, learning behavior and the cognitive-motivational processes causing it, as well as the social dimension of learning, in great detail, with high precision, repeatedly and frequently, if not continuously.
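As one example of the relational, graph-theoretical methods mentioned here, the sketch below uses hypothetical interaction data and the networkx library (one widely used option, not a tool prescribed by this text) to build a small who-replied-to-whom network and compute two structural properties: density, a property of the whole, and degree centrality, a property of each learner's position within the emerging structure.

```python
import networkx as nx

# Hypothetical reply relations in a class discussion forum
replies = [("ana", "ben"), ("ben", "ana"), ("ana", "carl"),
           ("dee", "ben"), ("carl", "dee")]

G = nx.Graph()
G.add_edges_from(replies)

# Density: how much of the possible connectivity is actually realized
print("density:", round(nx.density(G), 2))

# Degree centrality: each learner's relative position in the network
for node, c in sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1]):
    print(node, round(c, 2))
```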

Data from Learning in Many Contexts - Learning is Distributed 

The methods being developed in learning analytics and educational data mining to capture aspects of students' behaviour—and the physiological and emotional parameters that go along with behaviour—not only over time, but also across locations, are tremendously valuable for research. This is because learning is situated: it is highly dependent on the resources available to the learner in specific contexts. Not only does learning happen (quasi-)synchronously across multiple levels, it is also distributed over the socio-physical environment—the situation—the learner finds herself in (Sawyer & Greeno, 2009). As Greeno and others have argued, any analysis of learning will be incomplete if it does not (also) conceptualise learning as a socio-cultural practice, as an activity system that stretches far beyond the somato-physical boundaries of the cranium and the body.


Such an understanding of learning practices is necessary for theoretical as well as pedagogical purposes. For the purpose of theory development, an understanding of the socio-material practices around knowledge objects contributes to de-mystifying the process of learning—how is it possible to learn something genuinely new?—and of idea and knowledge creation more generally (Prawat, 1999). As the entanglement of cognitive work with physical, symbolic and social resources becomes ever better documented and understood—in general (e.g., Clark, 2011) and for specific areas such as scientific research (e.g., Latour & Woolgar, 1986)—it becomes clear that a theory of learning, creativity and idea generation will need to be grounded not only in psychology, but also in sociology, organization science, and semiotics. Any specific study will need to capture knowledge practices in a comprehensive sense.


The fact that, with learning analytics methods, behavioural, interactional, and increasingly even some physiological parameters of students' 'learning' activities can be captured across locales and contexts constitutes an essential prerequisite for researching learning-in-context at scale. Learning analytics methods will need to become substantially more sophisticated to become really useful for studying learning-in-context, though. It is not sufficient to keep track of students' activities (and related parameters) alone; in addition, the context needs to be described and logged as well. This is easier said than done; just think of the many artefacts and tools that students use on a typical day of a semester: at school or university, at home, while commuting. Along with technical advancements for capturing aspects of students' behaviour and experience, a main focus of research in learning analytics should therefore be to develop languages, and standards, for describing the context within which behaviour and experience arise, and for describing the relation between learners and the social, physical and symbolic aspects of the learning context.
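One way to think about 'logging the context as well' is to make context an explicit, structured part of every recorded event, loosely in the spirit of activity-stream specifications such as xAPI. The sketch below is an assumed, minimal schema of my own devising, not a standard or an implementation proposed in this text; all field names and values are illustrative.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class LearningEvent:
    """A single observed activity, recorded together with its context."""
    learner: str
    verb: str                      # what the learner did, e.g. "annotated"
    object: str                    # what it was done to, e.g. a reading
    timestamp: str
    # Context is first-class: where, with which tools, with whom
    context: dict = field(default_factory=dict)

event = LearningEvent(
    learner="student-042",
    verb="annotated",
    object="week3-reading.pdf",
    timestamp=datetime.now(timezone.utc).isoformat(),
    context={
        "location": "library",          # physical setting
        "device": "tablet",             # tool mediating the activity
        "course": "BIO1001",            # institutional setting
        "co-present": ["student-051"],  # social setting
    },
)
print(asdict(event))
```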

Summary


In summary, I argue that there lies a huge potential in learning analytics to advance learning research, and that in order to realize this potential learning analytics researchers should devote more attention to (finding) rare learning events, focus more on long-term learning, make more of the fact that learning can be recorded on multiple levels of a complex system (the human learner), and develop methods for capturing the context in which learning activities occur. None of this can be done without building on theory, on conceptualizations of learning and cognition. Theory is essential, and it is important to repeat what two of the field's key researchers write: "The theory-oriented perspective marks a departure of EDM and LA from technical approaches that use data as their sole guiding point…" (Baker & Siemens, 2014b, pp. 256-257). Suggestions such as that made by Anderson (2008), that big data will render the scientific method obsolete, not only express a deep misunderstanding of what the method is about; they also commit the logical (and ethical) error of using descriptions of the past as prescriptions for the future.

 

References

Anderson, C. (2008). The end of theory: The data deluge makes the scientific method obsolete. Wired Magazine. Retrieved 14 December, 2015, from http://www.wired.com/2008/06/pb-theory/

Baker, R., & Siemens, G. (2014a). Educational data mining and learning analytics. In R. K. Sawyer (Ed.), Cambridge Handbook of the Learning Sciences (2nd ed., pp. 253-274). New York: Cambridge University Press.

Baker, R., & Siemens, G. (2014b). Learning analytics and educational data mining. In R. K. Sawyer (Ed.), Cambridge Handbook of the Learning Sciences (2nd ed., pp. 253-272). New York: Cambridge University Press.

Baker, R., & Yacef, K. (2009). The state of educational data mining in 2009: A review and future vision. JEDM - Journal of Educational Data Mining, 1(1), 3-17.

Bryk, A. S. (2015). 2014 AERA Distinguished Lecture: Accelerating How We Learn to Improve. Educational Researcher.

Burt, R. S., Kilduff, M., & Tasselli, S. (2013). Social network analysis: Foundations and frontiers on advantage. Annual Review of Psychology, 64, 527-547.

Clark, A. (2011). Supersizing the mind. Embodiment, action, and cognitive extension. Oxford, UK: Oxford University Press.

National Research Council. (2013). Frontiers in massive data analysis. Washington, DC: The National Academies Press.

diSessa, A. A. (2006). A history of conceptual change research: Threads and fault lines. In R. K. Sawyer (Ed.), The Cambridge Handbook of the Learning Sciences. New York: Cambridge University Press.

Elder-Vass, D. (2010). The causal power of social structures. Cambridge, UK: Cambridge University Press.

Ericsson, K. A., Charness, N., Feltovich, P., & Hoffman, R. B. (Eds.). (2006). The Cambridge Handbook of Expertise and Expert Performance. New York: Cambridge University Press.

Ericsson, K. A., Krampe, R. T., & Tesch-Römer, C. (1993). The role of deliberate practice in the acquisition of expert performance. Psychological Review, 100, 363-406.

Fergusson-Hessler, M.G.M., & de Jong, T. (1987). On the quality of knowledge in the field of electricity and magnetism. American Journal of Physics, 55, 492-497.

Kapur, M., Hung, D., Jacobson, M. J., Voiklis, J., Kinzer, C. K., & Chen, V. D.-T. (2007). Emergence of learning in computer-supported, large-scale collective dynamics: A research agenda. In Proceedings of the International Conference on Computer-Supported Collaborative Learning (CSCL 2007). New Brunswick, NJ.

Koedinger, K. R., Baker, R. S. J. d., Cunningham, K., Skogsholm, A., Leber, B., & Stamper, J. (2010). A data repository for the EDM community: The PSLC DataShop. In C. Romero, S. Ventura, M. Pechenizkiy & R. Baker (Eds.), Handbook of educational data mining (pp. 43-56). Boca Raton, FL: Chapman & Hall/CRC.

Latour, B., & Woolgar, S. (1986). Laboratory life: The construction of scientific facts (2nd ed.). Princeton: Princeton University Press.

Mandinach, E. B. (2012). A perfect time for data use: Using data-driven decision making to inform practice. Educational Psychologist, 47(2), 71-85.

Nathan, M. J., & Alibali, M. W. (2010). Learning sciences. Wiley Interdisciplinary Reviews: Cognitive Science, 1(3), 329-345.

Prawat, R. S. (1999). Dewey, Peirce, and the Learning Paradox. American Educational Research Journal, 36, 47-76.

Reimann, P. (2009). Time is precious: Variable- and event-centred approaches to process analysis in CSCL research. International Journal of Computer-supported Collaborative Learning, 4, 239-257.

Sabourin, J., Rowe, J., Mott, B., & Lester, J. (2011). When off-task is on-task: The affective role of off-task behavior in narrative-centered learning environments. Paper presented at the 15th International Conference on Artificial Intelligence in Education.

Sawyer, R. K. (2005). Social emergence. Societies as complex systems. Cambridge, UK: Cambridge University Press.

Sawyer, R. K. (Ed.). (2014). The Cambridge Handbook of the Learning Sciences (2nd ed.). New York: Cambridge University Press.

Sawyer, R. K., & Greeno, J. G. (2009). Situativity and learning. In P. Robbins & M. Aydede (Eds.), The Cambridge handbook of situated cognition (pp. 347-367). New York, NY: Cambridge University Press.

Siemens, G., & Baker, R. S. J. d. (2012). Learning analytics and educational data mining: Towards communication and collaboration. Paper presented at the 2nd International Conference on Learning Analytics and Knowledge (LAK 2012).

Stokes, D.E. (1997). Pasteur's quadrant: Basic science and technological innovation. Washington, DC: Brookings Institution Press.

van Geert, P. (1998). A dynamic systems model of basic developmental mechanisms: Piaget, Vygotsky, and beyond. Psychological Review, 105, 634-677.


Are you an undergraduate student interested in nanotechnology, material science, chemistry, chemical engineering, or physics? Associate STL researcher Polly Lai is looking for students to participate in a learning activity forming part of a research project to gain an understanding of learning outcomes in undergraduate studies in nanotechnology, material science, chemistry, chemical engineering, or physics programs.


STL team members are working with a team from the Charles Perkins Centre on evaluating the use of innovative spaces for learning and teaching at the University. Further news on this collaboration was recently published in the University Staff Newsletter.

This multidisciplinary collaboration brings together researchers from various disciplines including spatial design for learning, e-learning, and science-subject domains.

On Monday 8th July, the University of Sydney is hosting a meeting on MOOCs and the student experience of blended learning. This blog entry is the starting point for an online discussion to supplement the presentations and debate in the meeting.

You can participate in the discussion by adding a Comment to this entry. Comments will be treated like 'letters to the editor' - they will be moderated, may be edited, should be expressed in concise and temperate language and will only be published if - in the view of the editor - they make a contribution to advancing the discussion. Be relevant and interesting. Anonymous comments will not be published. Please conclude your comment with your name and brief affiliation.

Peter Goodyear, Faculty of Education & Social Work, STL Network


Congratulations to STL lead researcher Associate Professor Janette Bobis who has been awarded a Thompson Fellowship for her project - Strengthening the evidence-base in support of ambitious teaching and learning of mathematics.


Improving the effectiveness of mathematics teacher education is at the heart of many recent government initiatives. Embedded in such initiatives is the need to establish an evidence-base of ambitious (or high-leverage) teaching practices, structures and ideologies that make a positive difference to the enhancement of mathematics teaching and learning. The need for a comprehensive evidence-base that can be used to reshape initial and ongoing programs of teacher education and teaching practices has been a central concern of my personal research agenda for more than a decade. It was a driving force behind my 'practice-based' program of research at the initial teacher education level (e.g., Bobis, 2007) and for my most recent ARC project (Empowering Teachers of Mathematics). The proposed work emanates from two ARC-funded research projects and will effectively work towards strengthening and expanding this agenda to an international level.

The Thompson Fellowships are named after Isola Florence Thompson, one of the first women graduates of the University of Sydney. The Fellowships aim to promote and enhance the careers of academic and research-only women at the University by providing opportunities to develop and strengthen their research.

CoCo PhD student Kathrine Petersen gave a short presentation on serious games for learning.

Overall, research into "serious games" for learning indicates that learning outcomes can be gained through the implementation of well-designed virtual environments that use curricular content and appropriate learning methods. "Serious games" research demonstrates that students find learning in virtual gaming environments enjoyable. Results show that students are both motivated by and engaged in learning tasks inside the learning environment. Further research indicates that various structure and design elements within the learning environment help improve students' ability to navigate complex problems. However, while some recent research into "serious games" looks at whether these advanced learning environments produce learning outcomes for science and mathematics, more research needs to be conducted that explores appropriate learning methods for teaching specific curricula in areas such as complex literature learning.

This study proposal is a prelude to exploring the use of serious games for teaching complex literature curricula through the development of an experimental virtual learning environment, in order to:
1) explore successful research outcomes from using serious games to learn curricular content in other learning domains, where this leads to positive academic learning outcomes;
2) explore existing serious games designed for teaching complex literature content;
3) use successful methods and design elements from the studies in 1 & 2 in the creation of a serious-games environment for teaching an ancient historical poetic text, with emphasis on its themes, metaphor and literary text in context; and
4) assess the qualitative and quantitative academic learning outcomes of graduate students who experience the virtual space.

This semester we are trying something new. As we now have 26 PhD students and three postdocs at CoCo, catching up at weekly meetings, seminars and in passing no longer gives us any real insight into each other's work. It also means that more formal opportunities to speak about our work are harder to organise. Enter the ten-minute thesis challenge – more than three minutes, less than eleven – and kicking it off was trickier than one would imagine.

How does one describe something that involved 549 hours of observation and is very much a work in progress – coherently, in less than eleven minutes? Well, I guess that's the point: having to gets one thinking.

The aim of my project is to explore the relationship between activity and environment in an innovative school on the outskirts of metropolitan Sydney. I conducted my observations in 2012 in a learning space that supports 180 year five and six students and their team of seven teachers.


Wondering what we're getting up to in 2013? One of the first projects we got into in the new year was setting up our new Interactive TableTop in the Design Studio.

David Ashe and Martin Parisio built the interactive tabletop for research in the sciences and technology of learning. The table is a focal point of ongoing research in our agenda, and you can see further information on tabletop research described on the Computer Human Adapted Interaction (CHAI) website at the School of Information Technologies.

An article in the Sydney Morning Herald's Digital Life News, published during the recent International Conference of the Learning Sciences (ICLS2012) hosted by us at the University of Sydney, highlighted wider interest in such technologies for education.


In August, members of the Laureate team facilitated a Design Day as part of the Water in the Landscape project. We have now produced the first in a series of working papers (available for download). Four groups participated, using a variety of tools (for more detailed information about the set-up see the Design Studio website). This paper focuses on one group of five high school students. They used a white-wall (see the Design studio) and a tablet computer (iPad).

The findings were the result of the work of a multidisciplinary team (Kate Thompson, David Ashe, Pippa Yeoman, Dewa Wardak and Martin Parisio) contributing to the analysis of multiple streams of data (video, audio, and photographs). These findings helped to provide understanding of learning by design, learning outcomes, the use of tools, and methods for analyzing the processes of learning.

Overall, learning by design was found to be effective, promoting higher-order skills such as collaboration, problem solving and creativity. An understanding of the intersections of the social interactions of students, the physical and digital tools, and the development of design ideas is vital to the ongoing design of learning-by-design projects.

In 2013 we will apply this analysis to the other groups and develop a framework for learning by design activities.

The working paper and an executive summary are available for download.

Congratulations to Dr Karl Maton and his fellow researchers who were recently awarded an ARC Discovery Project grant for their proposal, "Pedagogies for knowledge-building: investigating subject-appropriate, cumulative teaching for twenty-first century school classrooms".

To succeed in today's knowledge society, young people need to quickly grasp the organising principles for building different forms of knowledge. This interdisciplinary project explores how teachers marshal the resources of modern classrooms to apprentice students into subject-specific principles for knowledge-building in Science and History.

Research will start in 2013, and run for three years, building on the previous groundbreaking 'DISKS' project. For more information on Karl's research see here (offsite).

What should a space designed for learning about learning ideally include? If you'd like to see one possible answer, head over to our Design Studio website for a run-down on how and why we created this environment, what it has to offer, and how you can get involved.

The Educational Design Research Studio (EDRS, or the design studio for short) at the University of Sydney has been created as part of Professor Peter Goodyear's Australian Laureate Fellowship project. The design studio is equipped to support small teams of people working on existing or new educational design problems, using their own approaches and the design methods, tools and resources that we can make available.



Professor Michael Jacobson of the CoCo Research Centre discusses how innovative multi-user virtual environments (MUVEs) can be designed and used in Australian schools to enhance the learning of important scientific knowledge and inquiry skills, and the project team's experiences in designing and applying the educational MUVE of the Omosa Project.


As many of our heads are down, working hard on submissions for the upcoming CSCL conference (the sister conference to ICLS, which CoCo hosted this year), it's a nice time to reflect on the variety of work that we do here at CoCo, within this umbrella of the sciences and technologies of learning.


I am currently collaborating with sixteen different people on various pieces of work, within the Laureate team, CoCo, and among recent alumni. Topics that we are writing about vary from process mining of patterns of decision making, to analysis of some interesting networked learning environments, to the way that school students used our newly minted Design Studio. I'm often reminded of the opportunities that occur at CoCo for exciting discoveries based on conversations with our international guests, or encounters in the hallway. This month we will submit our first paper based on the data we collected as part of the Water in the Landscape project. In it, we will examine the combination of tool use, a design framework, and evidence of systems thinking in relation to water and sustainability.

A brief reminder about the call for papers for the BJET Special Issue on e-Research for Education that Peter Reimann and I are co-editing. Feel free to email us your questions.

Call for Papers for Special Issue "e-Research for Education: Applied, methodological and critical perspectives", The British Journal of Educational Technologies

This special issue aims to provide a comprehensive review of the emerging domain of ICT-enhanced research methods in educational research. It seeks contributions in the following broad categories: 1) methodological papers (e.g., learning analytics, collaborative video analysis, digital ethnography); 2) applied case studies of frontier e-research projects; 3) conceptual explorations of the implications of e-research. Guest editors:





Kate Thompson, David Ashe, Martin Parisio and I played host to a group of 17 school students aged between 12 and 17, and 13 school teachers and environmental educators on Friday 10th August, as part of the South Creek Project.  

The day was a facilitated design experience, hosted by the Laureate Program and Greening Australia. The aim of the project is to deliver an innovative fieldwork and multimedia framework for engaging students in water and land management. Through the process it is hoped that the students will develop skills in web design and development, research techniques, and links into the wider community.

The students had already participated in planning sessions, a site visit and a day of hands-on site restoration at the Creek. The day at CoCo was an opportunity for them to actively flesh out their ideas, propose possible formats, identify constraints and generate consensus upon which a brief for the multimedia designer could be written.

From my perspective it was a fascinating experience to see the Laureate program's new Design Studio in action, and to watch as the mix of people, space and tools transitioned from stiff, staccato interaction to fluid collaboration within a matter of hours. Untangling how that happened feeds into new research into learning by design, and we look forward to sharing our thoughts as we make sense of them.


Good news from Dorian Peters and Rafael Calvo - Dorian has a contract with New Riders/Peachpit Press for a book on "Interface Design for Learning". If you're interested in more information on the topic or updates, you can visit her blog at eLearningInterfaceDesign.com or give her a tweet at @dorian_peters.

Dorian and Rafael also have a new contract with MIT Press for a book titled "Positive Computing: Technology for a better world". For more on this emerging area of investigation, you can visit the resource hub at www.PositiveComputing.org or follow the feed: @Positive_Comp.

They are very interested in perspectives and feedback in these areas, so please feel free to contact them.

Congratulations to Fiona Chatteur, who has completed her PhD under the supervision of Andy Dong (with a little help from Peter Goodyear). Fiona's thesis is called "Design for Pedagogy Patterns for E-Learning".

Congratulations also to Shannon Kennedy-Clark, who has been awarded a PhD (with a few little emendations) for her thesis "Collaborative Game-Based Inquiry Learning in Science Education: An Investigation into the Design of Materials and Teacher Education Programs". Shannon's PhD was supervised by Michael Jacobson and Peter Reimann.

Nino Aditomo is about to submit his PhD thesis on "Preservice Primary Teachers' Science Epistemology: Coherence and Consistency of Beliefs Across Contexts" (supervised by Peter Reimann and Peter Goodyear),

and so is Karen Scott: "Change in University Teachers' E-Learning Beliefs and Practices" (supervised by Peter Goodyear and Mary Jane Mahoney).

On Wednesday 8th August, I had the opportunity to give the first CoCo seminar of the semester on socio-environmental synthesis education (to watch the Adobe Connect recording, click here). Two months ago, I was invited to give a plenary presentation at a workshop at SESYNC, the latest synthesis centre funded by the National Science Foundation. Synthesis centres bring interesting people together in order to learn about new ideas and establish networks and partnerships around a common theme. The first workshop on SES education was an intense two days including twelve presentations, three breakout sessions, and a poster session, followed by an additional day of project planning.

The CoCo seminar was a great opportunity for me to unpack and organize some of what I learned over those three days, and start to apply it to developments in the learning sciences, the STL network, and the laureate team. These included issues of data sharing (see how the ecologists make data sets available to anyone), actionable science, and planning for multi-disciplinary teams coming together to do research, in terms of structuring projects as well as the facilities which afford such collaborative work.

Some interesting projects that may come of this include my inclusion in the teaching study as an adviser, the inclusion of some special projects for Masters of Sustainability students on the education pathway, and a mapping exercise of research from the learning sciences as it applies to synthesis education.

Lucila Carvalho and I are editing a book on "The architecture of productive learning networks". Routledge, New York offered us a contract today, so we are now full steam ahead for completion of the book by the end of January. Further information about the book can be found here.

The book will report outcomes from Strands 1 and 2 of my ARC Laureate Fellowship program.

Lina Markauskaite and Peter Reimann are guest editing a special issue of the British Journal of Educational Technology on the theme of e-research for education. Full details of the call can be found here.

About the Blog

Research by the University's Centre for Research on Learning and Innovation (CRLI).