Join us on Monday 8th February for a Centre for Research on Learning and Innovation seminar with Dr Lennart Schalk, "Feasibility and benefits of introducing basic physics concepts in primary school: Preliminary results of a longitudinal study".

Basic physics concepts represent the fundamental building blocks of more advanced scientific concepts that are typically introduced in secondary science education. Dr Schalk will report on the preliminary results of an ongoing longitudinal study that was initiated recently in Switzerland, the so-called MINT study (MINT is an acronym created from the German words for "science", "technology", "engineering" and "mathematics"). The aim of the study is to implement curricula on basic physics concepts in primary school and monitor children’s learning in science education until they graduate.

In this study, more than 500 primary-school teachers have been trained to use evidence-based hands-on teaching materials and to scaffold students’ learning with these materials. The data examined in this talk have been gathered from more than 5000 primary-school students across different cohorts. Preliminary results indicate that early physics education is likely to prepare students for future learning in science, and that it is worth the effort to align science education directly from primary to secondary education.

Lennart Schalk is a senior lecturer at the ETH Zurich, Switzerland. He teaches in primary and secondary teacher-education programs. His research focuses on learning of relational categories and conceptual change in science education as well as the improvement of educational teaching and learning materials.

Event details
• When: 4.00pm to 5.00pm on 8 Feb 2016
• Where: Room 424, Education Building A35
• This seminar will not be available online or recorded.


While Part I outlined some of what researchers take to be true about learning, and argued that learning analytics can make important contributions to the methodology of modern learning research, in this post I describe how learning analytics might contribute to conducting design-based research (DBR), sometimes also referred to as design experiments.

DBR has the goal “…to use the close study of learning as it unfolds within a naturalistic context that contains theoretically inspired innovations, usually that have passed through multiple iterations, to then develop new theories, artifacts, and practices that can be generalized to other schools and classrooms” (Barab, 2014, p. 151). Design-based research is a ‘natural’ bridge between the learning sciences and learning analytics because DBR shares with learning analytics the goal of providing solutions to practical problems. At the same time, these solutions are expected to be grounded in a theory of learning; hence applying the solution can be seen as a (partial) test of the theory, and improving the solution incrementally over time can be seen as contributing to advancing theory over time.

In design-based research, theory is essential for generalization because design experiments mostly do not use a control-group logic; they are structured as within-subjects, repeated-measures designs: a baseline is observed, an intervention is performed (e.g., a change in teaching style, a different curriculum, a new or different technology), and the effects of the intervention are gauged in terms of changes to the baseline. Design-based research often makes use of qualitative methods, frequently in combination with quantitative methods. This increases its value for informing the (re-)design of the intervention, and its value for theory building. The main difference between design experiments and standard control-group experiments is that in design experiments context is seen as part of the treatment, thus acknowledging the situated nature of learning; context variables are not seen as ‘interfering’, but as providing the resources through which theoretically expected learning processes become realized in a specific learning situation. This does not mean that DBR has no concept of interference, but it is not context ‘variables’ that are seen as potentially interfering; instead, other mechanisms active in the same context can interfere. The basis for generalization is provided by keeping the mechanisms that cause learning analytically separate from the context; this analytical distinction allows researchers to formulate expectations about how the mechanisms might play out in other contexts, and is hence the basis for the form of generalization most prevalent in design-based research: analytical generalization (Ercikan & Roth, 2014; Maxwell, 2004). The DBR methodology is in this respect similar to the methodology of case studies (Yin, 2003): generalizing is performed by relating the specific case to theories with explanatory value. The specific case observations are not taken as applying in an identical manner to a “population”, but are related to similar processes and/or more abstract types of processes. It is not the specific participants in the study who are seen as instances of a (statistically meaningful) ‘population’; instead, the specific observation is treated as “an instance of” something more abstract and, in this sense, more general (Reimann, 2013).

In more concrete terms, theory enters into design-based research in the form of conjectures, which mainly take the form of learning trajectories and design claims. A learning trajectory describes how learning develops in the absence of the intervention—humans, like any organism, cannot not learn—and how learning changes under the influence of the intervention, in particular its theory-informed aspects. Learning trajectories specify expectations about the form of change, perhaps its extent (‘size’), and should say something about its temporal aspects: When will the effect materialize? For how long? Design claims are conjectures about how specific aspects of the intervention affect students’ learning and understanding. Like expectations about learning trajectories, design claims focus mainly on those aspects of the pedagogical and/or technical design that are related to relevant theory.

Cobb and Gravemeijer (2008) provide a good example of the role of theory in design-based research. Their study focuses on middle school statistics and describes a number of design cycles for creating computational representations that help teachers introduce notions such as center, skewness, spread, and relative frequency coherently from the concept of a mathematical distribution. Based on the statistics education literature and classroom observations, the authors identify as an important step in the learning trajectory that students initially need to learn to appreciate the difference between numbers and data. Tasks and computer-generated graphical representations therefore need to be developed that make students aware of the fact that they are analyzing data. As a theoretical framing, the specific learning trajectory is contextualized in the wider context of mathematical reasoning, in particular learning about data generation and about developing and critiquing data-based arguments. The authors developed three computational tools, with different but synergistic representational notations, that in concert with capable teachers began to move students’ conceptions of distribution in a mathematically fruitful direction.

The potential for synergies between design-based research and learning analytics is obvious. DBR could greatly profit from data on students that are gathered unobtrusively, trace learning on multiple levels, and cover longer stretches of time. It could further profit from making these data rapidly, if not continuously, available to teachers and students. Teachers are an essential part of most curricular activity systems (Roschelle, Knudsen, & Hegedus, 2010), and students have to learn how to monitor and steer their own learning (Bull, Johnson, Masci, & Biel, 2016). Learning analytics, for its part, would become more experimental, more interventionist. I see this as a good development to the extent that pedagogical and technical interventions have the goal of improving teaching, of innovating. This is preferable to the use of advanced analytical methods for reinforcing current practices, among them practices that might be pedagogically dubious. Along with becoming more experimental, learning analytics would also become more engaged in the advancement of theory via the testing of hypotheses (e.g., the testing of design claims and of conjectures about learning trajectories). This is not an alternative to learning analytics as a methodology for applied research (Pardo & Dawson, 2016), but adds a dimension that can benefit teaching and learning.

Since learning analytics, in combination with educational data mining, is very comprehensive in terms of the methods it encompasses, the shift I am suggesting is not a radical one. The two main ‘moves’ needed are, first, a closer alignment of learning analytics with interventionist types of educational research, such as design-based research and the emerging educational improvement science (Bryk, 2015). Second, learning analytics researchers and practitioners would need to engage more with the development and testing of learning theories, broadly conceived. I consider it particularly valuable if learning analytics would add to learning research—and to educational research in general—methods that go beyond the already well-established applications of the General Linear Model (mainly regression models and analysis of variance). Methods such as social network analysis, pattern learning, and others that allow researchers to analyze the structures and properties that emerge from the relations between entities are potentially more interesting for theory building than linear modelling methods, which may nevertheless be useful for practical purposes. This would not only add incrementally to the method repertoire of learning research, but could transform to some extent how learning research is done: from a discipline that mainly describes and orders phenomena and findings with qualitative and statistical methods to a discipline that develops causal-explanatory accounts of learning-in-context.

An additional transformative potential of learning analytics for educational research concerns the distribution of analytical work: At least in technical terms, it is a small step from gathering data comprehensively to making them available openly. Issues of data protection and privacy aside, there lies a huge innovation potential in making learning data available publicly, in usable formats, because educational challenges are truly too big for any single researcher or research team to solve (Weinberger, 2011). 

References:

Barab, S. A. (2014). Design-based research: A methodological toolkit for engineering change. In R. K. Sawyer (Ed.), Cambridge Handbook of the Learning Sciences (2nd ed., pp. 151-170). New York: Cambridge University Press.
Bryk, A. S. (2015). 2014 AERA Distinguished Lecture: Accelerating how we learn to improve. Educational Researcher, online first.
Bull, S., Johnson, M. D., Masci, D., & Biel, C. (2016). Integrating and visualising diagnostic information for the benefit of learning. In P. Reimann, S. Bull, M. Kickmeier-Rust, R. Vatrapu & B. Wasson (Eds.), Measuring and visualizing learning in the information-rich classroom (pp. 167-180). New York, NY: Routledge.
Cobb, P., & Gravemeijer, K. (2008). Experimenting to support and understand learning processes. In A. E. Kelly, R. A. Lesh & J. Y. Baek (Eds.), Handbook of design research methods in education (pp. 68-95). New York: Routledge.
Ercikan, K., & Roth, W.-M. (2014). Limits of generalizing in education research: Why criteria for research generalization should include population heterogeneity and uses of knowledge claims. Teachers College Record, 116(May), 1-28.
Maxwell, J.A. (2004). Using qualitative methods for causal explanations. Field Methods, 16, 243-264.
Pardo, A., & Dawson, S. (2016). Learning analytics: How can data be used to improve learning practice? In P. Reimann, S. Bull, M. Kickmeier-Rust, R. Vatrapu & B. Wasson (Eds.), Measuring and visualizing learning in the information-rich classroom (pp. 41-55). New York, NY: Routledge.
Reimann, P. (2013). Design-based research - designing as research. In R. Luckin, S. Puntambekar, P. Goodyear, B. Grabowski, J. D. M. Underwood & N. Winters (Eds.), Handbook of design in educational technology (pp. 44-52). New York: Taylor & Francis.
Roschelle, J., Knudsen, J., & Hegedus, S. (2010). From new technological infrastructures to curricular activity systems: Advanced designs for teaching and learning. In M. J. Jacobson & P. Reimann (Eds.), Designs for learning environments of the future (pp. 233-262). New York: Springer.
Weinberger, D. (2011). Too big to know: Rethinking knowledge now that the facts aren't the facts, experts are everywhere, and the smartest person in the room is the room. New York, NY.: Basic Books.
Yin, R. K. (2003). Case study research: Design and methods (3rd ed.). Thousand Oaks, CA: Sage.

Learning analytics is a young field of research (Baker & Siemens, 2014a; Baker & Yacef, 2009) that, along with educational data mining, has grown rapidly, driven by the availability of (large) sets of data on students’ learning and the interest in analysing these data for the purpose of improving students’ learning and learning experience. I do not make much of the difference between learning analytics and educational data mining here, but it is worth keeping in mind that there are differences between the two fields, even though they are closely related and draw on largely overlapping research communities. Siemens and Baker (2012) identify the following differences:

  • EDM researchers are more interested in automated methods for discovery, while LA is more interested in human-led, mixed-initiative methods for exploring educational data;
  • EDM is more construct-oriented, while LA researchers emphasize a more holistic view of learning and learners;
  • Researchers in EDM develop methods for automatic adaptation of instruction, whereas LA researchers are developing applications that inform teachers, educators, and students. Hence the strong interest within LA in learning visualisations.

The focus of this paper is on the relation between learning analytics (and EDM) and learning research, in particular the kind of learning research practiced in the Learning Sciences (Sawyer, 2014). My intention is thus similar to that of Baker and Siemens in their contribution to the second edition of the Cambridge Handbook of the Learning Sciences (Baker & Siemens, 2014b): to contribute to a stronger tie between learning analytics and learning (sciences) research. However, unlike Baker and Siemens, I believe that important contributions from learning analytics to learning research are still a matter of the future. I argue that while the potential exists, it is far from being realized. In terms of Pasteur’s Quadrant (Stokes, 1997), I see learning analytics as currently falling into the category of pure applied research, whereas the learning sciences can be seen as use-inspired basic research, in which the focus is on advancing “the frontiers of understanding but also inspired by considerations of use” (Stokes, 1997, p. 74).

The main strategy I follow here is to develop some suggestions for how to make LA more relevant for foundational research on learning. I argue that the methods used in learning analytics (and EDM) have the potential to contribute to the applied as well as the foundational objectives of learning research. I further suggest that a more theory-oriented learning analytics could be more than an ‘addition’ to the ‘toolbox’ of the learning researcher; the ‘import’ could be more profound: it could change to a certain extent how we think about research methodology in the learning sciences.

The potential of Learning Analytics in learning research

The potential of learning analytics for the advancement of learning research can, in my opinion, unfold along four dimensions: (i) data quantity, (ii) longitudinal data, (iii) data from multiple levels, and (iv) data from many locations. In this section, I map these characteristics of data in learning analytics to modern conceptions of learning and to main findings from learning research.

Quantity of Data

The size of data sets is the primary argument for the value of LA: “One of the factors leading to the recent emergence of learning analytics is the increasing quantity of analysable educational data (…) Papers have recently been published with data from tens of thousands of students”, write Baker and Siemens (2014a, p. 254). Size is not only measured in the number of students; the number of data points per student (captured in log files of learning applications and platforms, for instance) is another quantitative dimension. The Pittsburgh Science of Learning Center DataShop (Koedinger et al., 2010), for instance, stores detailed recordings of students’ interactions with carefully designed tutor software that captures step-by-step problem-solving operations.

There are a number of reasons why size is considered to matter. One is that the number of students is taken as useful for establishing the generalizability of findings—a statistical argument. Another is that the more data, the more ‘patterns’ can be found. The flip side is that the number of possible relations between variables increases exponentially with the number of variables included in the analysis (National Research Council, 2013). More than just data is needed to ‘discover’ meaningful relations.
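To make the combinatorial point concrete: with n variables there are n(n-1)/2 distinct pairwise relations, but 2^n - 1 non-empty variable subsets that could enter a model. A minimal Python sketch (the function names are mine, and the variable counts are purely illustrative):

```python
from math import comb

def pairwise_relations(n_vars: int) -> int:
    # Distinct variable pairs: n*(n-1)/2, i.e., quadratic growth.
    return comb(n_vars, 2)

def variable_subsets(n_vars: int) -> int:
    # Non-empty subsets of variables that could enter a model:
    # 2^n - 1, i.e., exponential growth.
    return 2 ** n_vars - 1

for n in (10, 20, 40):
    print(n, pairwise_relations(n), variable_subsets(n))
# Already at 40 variables there are over a trillion possible subsets,
# which is why sheer data volume alone cannot 'discover' meaningful relations.
```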

A third argument for the value of large data sets is that they allow us to identify ‘rare’ events: events/patterns that occur in only small numbers of students or only sporadically (e.g., Sabourin, Rowe, Mott, & Lester, 2011). This is particularly interesting if the rare events are defined a priori: events that theory predicts, but that seldom occur spontaneously, or are seldom observable because of interactions with other processes (or because of measurement issues). The inverse is interesting as well: theory might not allow certain events to happen; if they happen nevertheless, their appearance is interesting because it might not be just a measurement error or due to ‘chance’, but might indicate a limitation of the theory; it might even render the theory downright wrong.
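A rare-event screen of the kind described can be sketched in a few lines. The event codes, threshold, and log below are entirely hypothetical; the point is only that a-priori predicted events can be flagged separately from unanticipated anomalies:

```python
from collections import Counter

def find_rare_events(event_log, threshold=0.01, predicted=frozenset()):
    """Return event types whose relative frequency falls below `threshold`,
    together with a flag for whether theory predicted them a priori."""
    counts = Counter(event_log)
    total = sum(counts.values())
    rare = {e: c for e, c in counts.items() if c / total < threshold}
    return {e: (c, e in predicted) for e, c in rare.items()}

# Illustrative log: 1000 events, two rare event types.
log = ["read"] * 500 + ["solve"] * 490 + ["self-correct"] * 5 + ["anomaly"] * 5
print(find_rare_events(log, threshold=0.01, predicted={"self-correct"}))
# {'self-correct': (5, True), 'anomaly': (5, False)}
```

The predicted-but-rare events ('self-correct' here) would corroborate theory; the unpredicted ones ('anomaly') are candidates for measurement error or for a limitation of the theory.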

While all three aspects of data quantity are beneficial to learning research, the third aspect—rare event detection—deserves more attention. It is the one least often considered, but it can contribute to making the learning sciences more theory-guided, and it can help bridge the gap between qualitative and quantitative learning research. In qualitative research, the frequency with which an event occurs is not automatically identified with the importance of the event; in many cases, important events are rare. An example from learning research is conceptual change, which occurs rarely, but when it occurs has profound effects on students’ understanding (diSessa, 2006).

Longitudinal Learning Data

Learning needs time. Learning in schools and universities often requires multiple skills—such as mathematical and writing skills—to master complex, hierarchically structured subject matter. In science education, for instance, the hierarchical nature of the subject knowledge also leads to the subject being an intricate association of concepts, where deep learning of some basic concepts requires comprehension of other basic concepts (Fergusson-Hessler & de Jong, 1987). Theoretical accounts of the depth and extent of what it takes to comprehend scientific concepts have been suggested from a cognitive psychology perspective and from a socio-cultural perspective. From the cognitive psychology perspective, one line of argument is that learning science can be seen as developing a form of expertise, and that any form of real expertise in cognitively demanding areas requires years of learning (the magic number is 10 years, plus/minus 2), as evidenced by novice-expert research; see Ericsson, Charness, Feltovich, and Hoffman (2006) for a comprehensive overview. The best elaborated model of expertise development in the cognitive tradition is probably Ericsson’s deliberate practice theory (Ericsson, Krampe, & Tesch-Römer, 1993). The reason why learning takes long in this model is the incremental nature of the underlying cognitive learning/change mechanisms (chunking, proceduralization).

Another cognitive account, and one more specific to science education than general models of expertise development, is Chi and Slotta’s ontology shift theory (e.g., Chi, Slotta & de Leeuw, 1994). On this account, learning scientific concepts is hard and everyday concepts are resistant to change because scientific understanding requires, in many cases, a change in an ontological category. A classical example is the concept of heat, where students often see heat as a property of matter, whereas in physics it is seen in process terms, as the average velocity of particles. In this theory, the reason that learning often stretches over long times is that while the ontology change itself can be fairly rapid, it often takes extended time (under current conditions of science learning) before students become sufficiently aware of the limitations of the initial ontology and are ready to accept an alternative one.

Tracking learning that stretches over months and years—another example would be the development of second language skills—is very rarely done in learning research. One reason is the cost, and the logistics, of performing such research. But the costs are being substantially lowered as learning analytics methods find their place in schools and universities. It would be of tremendous benefit if such data could be made available to researchers, and their acquisition planned in coordination with research projects. Methods for process mining are particularly relevant in this context (Reimann, 2009). Not only would this help to conduct specific projects that study long-term learning, it would also change the way we think about the nature of projects in learning research: from short-term interventions with immediate effects assessment to longer-duration interventions with continuous, long-duration monitoring of effects (and side-effects!). We see a variant of this kind of research developing in improvement research (Bryk, 2015) and in the continuous use of data for decision making (Mandinach, 2012).
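As a minimal illustration of what process mining over longitudinal logs involves, the following sketch counts direct-succession transitions across event traces, a basic ingredient of many process-discovery algorithms. The trace data and event names are invented for the example:

```python
from collections import Counter

def transition_counts(traces):
    # Count direct-succession pairs (event a followed immediately by event b)
    # across all student event traces.
    counts = Counter()
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            counts[(a, b)] += 1
    return counts

# Hypothetical writing-task traces from three students.
traces = [
    ["plan", "draft", "revise", "submit"],
    ["plan", "draft", "submit"],
    ["draft", "revise", "revise", "submit"],
]
for pair, n in transition_counts(traces).most_common(3):
    print(pair, n)
```

Process-mining methods build on such succession relations to reconstruct and compare whole learning processes over longer time spans.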

Data from Learning on Multiple Levels - Learning is complex

Learning does not only take place over long durations; on other levels of analysis it happens within seconds and even milliseconds. Nathan and Alibali (2010) distinguish between learning in milliseconds and below (biological), seconds (cognitive), minutes to hours (rational), days to months (sociocultural), and years and beyond (organizational). This can be seen as an expression of strictly different kinds of learning, but more productively it may be seen as an expression of the fact that learning takes place at multiple levels at the same time. We can see learning ‘events’ as being produced by a complex, multi-layered system with at least three levels: a biological stratum with neurophysiological processes, a cognitive stratum (rational thinking, knowledge), and a socio-cultural stratum (tools, practices). These strata, or levels, are set in relation to each other by processes of emergence (Sawyer, 2005).


The concept of emergence as used here is relational: it refers to the phenomenon that wholes (entities, agents, organisms, organisations) have properties that cannot be found in any of their parts. An emergent property “is one that is not possessed by any of the parts individually and that would not be possessed by the full set of parts in the absence of a structuring set of relations between them” (Elder-Vass, 2010, p. 17). A key aspect of (relational) emergence is therefore the organization of the parts: how the parts are set in relation to each other, how the whole is structured. Not all properties of an object are emergent; some are resultant properties. For instance, most objects have mass, which is a resultant property: the mass of the whole is the sum of the parts’ masses. Some objects have colour, which is an emergent property; it depends on the organization of the object’s parts.


If we conceive of learning as a complexity phenomenon (Kapur et al., 2007), then learning needs not only to be studied at multiple levels, but the analysis of the relations between the levels—the nature of the emergence—must take center stage. This requires asking not only what affects learning over time, but also how learning is constituted at each moment in time: which configurations of neural, cognitive, motivational, emotional, social and contextual processes/elements give rise to a ‘learning event’? Answering the latter question requires appropriate instrumentation and appropriate analytical methods. The methods cannot be (only) variants of the General Linear Model (e.g., regression models, including so-called ‘structural’ or ‘causal’ variants), among other reasons because these are not appropriate for non-linear complex systems, for systems that transform themselves or get transformed. Instead, methods for the analysis of non-linear systems will be needed (e.g., van Geert, 1998), as well as methods that can describe relations between parts, in particular graph-theoretical methods such as social network analysis (Burt, Kilduff, & Tasselli, 2013). Learning analytics and educational data mining can play a key role in advancing the learning sciences by bringing about such methodological advances and making them usable for learning researchers. This includes, but should not be confined to, methods for recording bio-signals, learning behavior and the cognitive-motivational processes causing it, as well as the social dimension of learning, in great detail, with high precision, repeatedly and frequently, if not continuously.
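To illustrate the graph-theoretical point: network density is exactly the kind of relational, emergent property described above, a property of the whole configuration rather than of any single learner. A pure-Python sketch (the reply network is invented; a library such as networkx offers equivalent functions):

```python
def density(edges, n_nodes):
    # Fraction of possible undirected ties that are present:
    # a property of the whole network, not of any single node.
    return 2 * len(edges) / (n_nodes * (n_nodes - 1))

def degree_centrality(edges, nodes):
    # Normalized degree per node (actual ties / possible ties).
    deg = {v: 0 for v in nodes}
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    n = len(nodes)
    return {v: d / (n - 1) for v, d in deg.items()}

# Hypothetical forum-reply network among five students.
nodes = ["s1", "s2", "s3", "s4", "s5"]
edges = [("s1", "s2"), ("s1", "s3"), ("s2", "s3"), ("s4", "s5")]
print(density(edges, len(nodes)))        # 0.4
print(degree_centrality(edges, nodes))
```

No individual student "has" a density of 0.4; the value only exists at the level of the structured whole, which is what makes such measures candidates for capturing emergence.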

Data from Learning in Many Contexts - Learning is Distributed 

The methods being developed in learning analytics and educational data mining to capture aspects of students’ behaviour—and the physiological and emotional parameters that go along with behaviour—not only over time, but also across locations, are tremendously valuable for research. This is because learning is situated: it is highly dependent on the resources available to the learner in specific contexts. Not only does learning happen (quasi-)synchronously across multiple levels, it is also distributed over the socio-physical environment—the situation—the learner finds herself in (Sawyer & Greeno, 2009). As Greeno and others have argued, any analysis of learning will be incomplete if it does not (also) conceptualise learning as a socio-cultural practice, as an activity system that stretches far beyond the somato-physical boundaries of the cranium and the body.


Such an understanding of learning practices is necessary for theoretical as well as pedagogical purposes. For the purpose of theory development, an understanding of the socio-material practices around knowledge objects contributes to de-mystifying the process of learning—how is it possible to learn something genuinely new?— and of idea and knowledge creation more generally (Prawat, 1999). As the entanglement of cognitive work with physical, symbolic and social resources becomes ever better documented and understood—in general (e.g., Clark, 2011) and for specific areas such as scientific research (e.g., Latour & Woolgar, 1986)—it becomes clear that a theory of learning, creativity and idea generation will need to be grounded not only in psychology, but also in sociology, organization science, and semiotics. Any specific study will need to capture knowledge practices in a comprehensive sense. 


The fact that with learning analytics methods behavioural, interactional, and increasingly even some physiological parameters of students’ ‘learning’ activities can be captured across locales and contexts constitutes an essential prerequisite for researching learning-in-context at scale. Learning analytics methods will need to become substantially more sophisticated to become really useful for studying learning-in-context, though. It is not sufficient to keep track of students’ activities (and related parameters) alone; the context needs to be described and logged as well. This is easier said than done; just think of the many artefacts and tools that students use on any given day of a semester: at school/university, at home, while commuting. Along with technical advancements for capturing aspects of students’ behaviour and experience, a main focus of research in learning analytics should therefore be to develop languages, and standards, for describing the context within which behaviour and experience arise, and for describing the relation between learners and the social, physical and symbolic aspects of the learning context.

Summary


In summary, I argue that there lies huge potential in learning analytics to advance learning research, and that in order to realize this potential learning analytics researchers should devote more attention to (finding) rare learning events, focus more on long-term learning, make more of the fact that learning can be recorded on multiple levels of a complex system (the human learner), and develop methods for capturing the context in which learning activities occur. None of this can be done without building on theory, on conceptualizations of learning and cognition. Theory is essential, and it is important to repeat what two of the key researchers write: “The theory-oriented perspective marks a departure of EDM and LA from technical approaches that use data as their sole guiding point…” (Baker & Siemens, 2014b, pp. 256-257). Suggestions such as those made by Anderson (2008) that big data will render the scientific method obsolete not only express a deep misunderstanding of what the method is about; they also commit the logical (and ethical) error of using descriptions of the past as prescriptions for the future.

 

References

Anderson, C. (2008). The end of theory: The data deluge makes the scientific method obsolete. Wired Magazine. Retrieved 14 December, 2015, from http://www.wired.com/2008/06/pb-theory/

Baker, R., & Siemens, G. (2014a). Educational data mining and learning analytics. In R. K. Sawyer (Ed.), Cambridge Handbook of the Learning Sciences (2nd ed., pp. 253-274). New York: Cambridge University Press.

Baker, R., & Siemens, G. (2014b). Learning analytics and educational data mining. In R. K. Sawyer (Ed.), Cambridge Handbook of the Leaning Sciences (2nd ed., pp. 253-272). New York: Cambridge University Press.

Baker, R., & Yacef, K. (2009). The state of educational data mining in 2009: A review and future vision. JEDM - Journal of Educational Data Mining, 1(1), 3-17.

Bryk, A. S. (2015). 2014 AERA Distinguished Lecture: Accelerating How We Learn to Improve. Educational Researcher.

Burt, R. S., Kilduff, M., & Tasselli, S. (2013). Social network analysis: Foundations and frontiers on advantage. Annual Review of Psychology, 64, 527-547.

Clark, A. (2011). Supersizing the mind. Embodiment, action, and cognitive extension. Oxford, UK: Oxford University Press.

National Research Council. (2013). Frontiers in massive data analysis. Washington, DC: The National Academies Press.

diSessa, A. A. (2006). A history of conceptual change research: Threads and fault lines. In R. K. Sawyer (Ed.), The Cambridge Handbook of the Learning Sciences. New York: Cambridge University Press.

Elder-Vass, Dave. (2010). The causal power of social structures. Cambridge, UK: Cambridge University Press.

Ericsson, K. A., Charness, N., Feltovich, P., & Hoffman, R.B. (Eds.). (2006). The Cambridge Handbook of Expertise and Expert Performance. New York: Cambride University Press.

Ericsson, K. Anders, Krampe, Ralf Th., & Tesch-Römer, Clemens. (1993). The role of deliberate practice in the acquisition of expert performance. Psychological Review, 100, 363-406.

Fergusson-Hessler, M.G.M., & de Jong, T. (1987). On the quality of knowledge in the field of electricity and magnetism. American Journal of Physics, 55, 492-497.

Kapur, M., Hung, D., Jacobson, M.J., Voiklis, J., Kinzer, C. K., & Victor, Chen Der-Thang. (2007). Emergence of learning in computer-supported, large-scale collective dynamics: A research agenda Proceedings of the International Conference on Computer-supported Collaborative Learning (CSCL2007). New Brunswick, NJ.

Koedinger, K R, Baker, R S J D , Cunningham, K, Skogsholm, A., Leber, B., & Stamper, J. (2010). A data repository for the EDM community: The PSLC DataShop. In C. Robero, S. Ventura, M. Pechenizkiy & R. Baker (Eds.), Handbook of educational data mining (pp. 43-56). Boca Raton, FL.: Chapman&Hall/CRC.

Latour, B., & Woolgar, S. (1986). Laboratory life: The construction of scientific facts (2nd ed.). Princeton: Princeton University Press.

Mandinach, E. B. (2012). A perfect time for data use: Using data-driven decision  making to inform practice. Educational Psychologist, 47(2), 71-85.

Nathan, M.J., & Alibali, Martha Wagner. (2010). Learning Sciences. Wiley Interdisciplinary Reviews:Cognitive Science, 1(3), 329-345.

Prawat, R. S. (1999). Dewey, Peirce, and the Learning Paradox. American Educational Research Journal, 36, 47-76.

Reimann, P. (2009). Time is precious: Variable- and event-centred approaches to process analysis in CSCL research. International Journal of Computer-supported Collaborative Learning, 4, 239-257.

Sabourin, J., Rowe, J., Mott, B., & Lester, J. (2011). When off-task is on-task: The affective role of off-task behavior in narrative-centered learning environments. . Paper presented at the Proceedings of the 15th International Conference on Artificial Intelligence in Educatoin. 

Sawyer, R. K. (2005). Social emergence. Societies as complex systems. Cambridge, UK: Cambridge University Press.

Sawyer, R. K. (Ed.). (2014). The Cambridge Handbook of the Learning Sciences (2nd ed.). New York: Cambride University Press.

Sawyer, R. K., & Greeno, J.G. (2009). Situativity and learning. In P. Robbins & M. Aydede (Eds.), The cambridge handbook of situated cognition (pp. 347-367). New York, NY: Cambridge University Press.

Siemens, G., & Baker, R.S.J. d. (2012). Learning analytics and educational data mining: Towards communication and collaboration. Paper presented at the Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK 2012). 

Stokes, D.E. (1997). Pasteur's quadrant: Basic science and technological innovation. Washington, DC: Brookings Institution Press.

van Geert, Paul. (1998). A dynamic systems model of basic developmental mechanisms: Piaget, Vygotsky, and beyond. Psychological Review, 105, 634-677.


Starting in January 2016, the Sciences and Technologies of Learning research network will transform into a new research centre. The University has approved our proposal to set up a Centre for Research on Learning and Innovation, as a sustainable way of supporting the research collaborations that have been a feature of STL for the last five years. The new centre will have strong roots in Education, with substantial involvement from Engineering & IT, Science, Health Sciences and Medicine. As with STL, membership of the new centre will be open to all members of university staff, and postgraduate students, who have a serious interest in research in this area.

The primary disciplines involved in STL and CRLI have been recognised in the most recent national assessment of research quality (ERA2015) as showing ‘outstanding performance well above world standard’ (Rated 5 - the highest rating possible).

  • 1303 Specialist Studies in Education (including the Learning Sciences and Educational Technology and Computing) - 5
  • 16 Information & Computing Science - 5
  • 1702 Cognitive Sciences - 5

Further information about the transition to the new centre will be posted here over the coming weeks and into the new year.


The Sciences and Technologies of Learning (STL) Research Fest was held in the Charles Perkins Centre Hub on Thurs Nov 5th 2015. Posters and abstracts from the day are available online in a Dropbox folder at http://bit.ly/STLFest15files.

Sciences and Technologies of Learning Research Fest 2015

Dr David Ashe and Melinda J Lewis were very happy to receive the people’s choice award for their poster:

Context in Flux: An invitation to join a think‐aloud installation at Research Fest.

The dynamic and multi-modal poster was an opportunity to explore and expand the research poster genre into one of installation, inviting participation and interaction.

Melinda and David performed a think-aloud about their ideas on why or why not common understandings of context are relevant in their research. Visitors to the poster joined in, voicing their thoughts on context both verbally and in text that they shared directly onto the poster.

Context_in_Flux_sm.jpg

It is not possible to illustrate the dynamic modality of the poster in a static image; however, the image above provides a small insight into the experience of visitors to the Research Fest. The textual information on the poster scrolled, images changed, and additional text could be entered in real time. Wireless headphones were also supplied for visitors to listen to audio information.

If you would like further information about this dynamic and interactive poster, please contact David and Melinda.

Contacts:
David.Ashe@sydney.edu.au
Melinda.Lewis@sydney.edu.au

Congratulations to our poster winners at the Research Fest:

JUDGES’ CHOICE
Winner – Yobelli Jimenez and Sarah Lewis, Implementation of immersive virtual technology for radiation therapy education (link to external Dropbox file).
Runner-Up - Ling Wu, Enhancing Young Children’s Empathy Development through Purposely Designed Educational Tablet Games (link to external Dropbox file).

PEOPLE’S CHOICE
Winner - Dr David Ashe and Melinda Lewis, Context in Flux: An invitation to join a think-aloud installation at Research Fest (link to blog post).

Thank you to our poster judges: Sonya Corcoran and Julie King. Posters will be made available online in the next week. Details will be posted here, on our website, and to fest registrants.

Congratulations to Dr Patricia Thibaut Paez, who has been awarded a position by the National Commission for Scientific and Technological Research (CONICYT) through its FONDECYT program's Postdoctoral Contest 2016.

These positions are granted by the National Fund for Scientific and Technological Research (FONDECYT), which was created as an instrument to promote scientific and technological development in Chile. FONDECYT fosters the initiative of individuals and research groups by funding scientific and technological research projects in all fields of knowledge. Resources are allocated through annual public competitions, and projects are selected on the basis of their intrinsic quality and the merits of applicants, without distinction of field, institutional affiliation or gender. The aim of this competition is to stimulate the productivity and future scientific leadership of young researchers who hold a doctoral degree.

Patricia is a researcher at the Centre for Research on Computer Supported Learning and Cognition (CoCo) at the University of Sydney. She also completed her PhD at the Centre. Her research focuses on learning, literacy, and mobile technologies across formal and informal spaces.

On November 5, the STL Research Fest will bring together the wider community of researchers and practitioners in the sciences and technologies of learning to exchange ideas and form new collaborations.

Timetable

Start End Item
9:45 9:55 Registration
10:00 10:40 Opening and shorter plenary
10:40 11:00 Morning Tea
11:00 11:45 Parallel session 1
11:45 12:30 Poster showcase 1
12:30 13:15 Lunch
13:15 14:00 Poster showcase 2
14:00 14:45 Parallel session 2
14:45 15:00 Refreshments
15:00 16:00 Plenary and closing - Learning to work across boundaries - opportunities for research and innovation

Parallel sessions

ID Title Presenters/discussants
Parallel session 1 : 11.00-11:45am
1 Mind the gap Abelardo Pardo, Michael Jacobson, Peter Reimann, Kalina Yacef
2 Teaching how to work across boundaries Lina Markauskaite, Peter Goodyear, Marie Carroll, Tina Hinton, Philip Poronnik, Kim Bell-Anderson, Simon Poon
3 Coding, designing and networking Rob Saunders, Lucila Carvalho
Parallel session 2 : 14.00-14:45
4 Researching Innovative Learning Spaces Rob Ellis, Tina Hinton, Pippa Yeoman
5 Professional learning on-the-go Lina Markauskaite, James Edwards, Meg Phelps, Peter Goodyear
6 Cranking up a notch Adam Bridgeman, Wai Yat Wong, Rena Bokosmaty, Meloni Muir

Posters

Poster Session 1: 11.45-12.30. Posters 1-14

Poster Session 2: 13.15-14.00. Posters 15-27

  1. Undergraduates as App development partners: a case study from Botany & Computer Science. Alexander Ling, Ahmed Shadid, Michael Johnston, Xilin Huang, Woo Yang Baeg, Scott Dong, Se-Hyun Kevin Ahn, Caroline Cheung, Satyendra Sinha, Rosanne Quinnell
  2. Group Formation - How do students' characteristics and behaviour affect group work performance? Augusto Dias Pereira dos Santos, Kalina Yacef
  3. A proposal for redesigning problem-based learning in medical education: Contrasting student solutions and improving consolidation. Alisha Portolese, Michael Jacobson, Robbert Duvivier, Lina Markauskaite
  4. EQ Clinic: An Online Clinic for Medical Communication Enhancement. Chunfeng Liu
  5. Exercise motivation through fully-immersive gamified virtual reality experience. Crystal Yoo
  6. Investigating the development of scientific inquiry in undergraduate physics students. Gabriel Nguyen, John O'Byrne, Manjula Sharma
  7. Learning and Enactment in Techno-human ecosystem: Embodiment of sociomateriality in sensemaking process. Gilbert Importante, Dr. Lina Markauskaite, Prof. Peter Goodyear
  8. A Student ‘Vision Statement’ as a Catalyst for Educational Innovation in Navitas: Towards the ideal technology-enabled learning environment for English Language Students. Jonathan Hvaal
  9. What offline and online technologies do higher education students use to complete assessment tasks? Lynnette Lounsbury, Dr David Bolton, Dr Paula Mildenhall, Assoc. Prof. Maria Northcote
  10. Learning by enhanced tactile feedback - Montessori sandpaper extended, Michael Tang, Dr. Paul Ginns
  11. Visualising socio-material practices in knowledge creation. Natalie Spence
  12. Clinical development using reflective learning and ePortfolios: staff and student perceptions. Punyanit Rungnava
  13. A quantitative study of students’ experiences, needs and expectations around technology in their personal lives and study in Higher Education, VET and ELICOS contexts. Lucy Blakemore, Yindta Whittington
  14. Mirror, mirror: A pre-learning exercise enhances mathematical problem-solving efficiency. Eleni Smyrnis, Paul Ginns
  15. How collaborative successes and failures become productive: An exploration of emerging understanding and misunderstanding turning points in model-based learning with productive failure. Alisha Portolese, Lina Markauskaite, Polly Lai, Michael J. Jacobson
  16. Context in Flux: An invitation to join a think-aloud installation at Research Fest. Dr David Ashe, Melinda J Lewis
  17. “That thing would have been good for this” Multimodal Interaction Analysis. Dewa Wardak
  18. Invigorating Science Investigations using an Inquiry Oriented Pedagogical Instrument. Evan Hefer, Manjula Sharma, Louise Sutherland, Alexandra Yeung, Scott Kable
  19. Enhancing Young Children’s Empathy Development through Purposely Designed Educational Tablet Games. Ling Wu
  20. Learning at multidisciplinary team meetings leading innovation projects. Amanda Lacy
  21. Learning Nanotechnology with Agent-Based Models versus Animations: Gestures Differences in Problem Solving. Polly Lai
  22. Massive online open science. Dr Rebecca LeBard, Geoff Kornfeld, Dr Rosanne Quinnell, Scientia Professor Rob Brooks, Scientia Professor Brett Neilan, Emeritus Professor Brynn Hibbert
  23. Exploring EFL Teachers Competences in Synchronous Telecollaborative Intercultural Communication. Wissam Bin Siddiq
  24. Talking to oneself and others: How self-explanation affects group discussions. Sanri le Roux
  25. Implementation of immersive virtual technology for radiation therapy education. Yobelli Jimenez, Sarah Lewis
  26. A Mobile App in the 1st Year Uni-Life: A Pilot Study. Yu Zhao
  27. MOOClm: Open Learner Models in MOOCs to Guide and Coordinate. Ronny Cook


On November 5, the STL Research Fest will bring together the wider community of researchers and practitioners in the sciences and technologies of learning to exchange ideas and form new collaborations. Registration to attend is open until Oct 28th at bit.ly/FestReg15. Registration is free but needed for catering purposes.

Timetable

Start End Item
9:45 9:55 Registration
10:00 10:40 Opening and shorter plenary
10:40 11:00 Morning Tea
11:00 11:45 Parallel session 1
11:45 12:30 Poster showcase 1
12:30 13:15 Lunch
13:15 14:00 Poster showcase 2
14:00 14:45 Parallel session 2
14:45 15:00 Refreshments
15:00 16:00 Plenary and closing

The program is still being fleshed out, further details will be posted here, on our website, and emailed to registrants in advance of the Fest.

Parallel sessions

ID Title Presenters/discussants
Parallel session 1 : 11.00-11:45am
1 Mind the gap Abelardo Pardo, Michael Jacobson, Peter Reimann, Kalina Yacef
2 Teaching how to work across boundaries Lina Markauskaite, Peter Goodyear, Marie Carroll, Tina Hinton, Philip Poronnik, Kim Bell-Anderson, Simon Poon
3 Coding, designing and networking Rob Saunders, Lucila Carvalho
Parallel session 2 : 14.00-14:45
4 Learning space research Rob Ellis, Tina Hinton, Pippa Yeoman
5 Professional learning on-the-go Lina Markauskaite, James Edwards, Meg Phelps, Peter Goodyear
6 Cranking up a notch Adam Bridgeman, Wai Yat Wong, Rena Bokosmaty, Meloni Muir

Register now

Registration to attend is open until Oct 28th at bit.ly/FestReg15.

Join us on Weds 28th October for Getting interested, our final Research on Learning and Education Innovation seminar this year.

"Getting interested". Everyone implicitly understands it; everyone recognises its importance. It is clearly a part of learning, and thereby education. That said, where does “getting students interested” figure within teachers' course organisation? Do they consider it as important as the knowledge/skills development aspect of their teaching?

This seminar by Dr Luke Fryer will begin by reviewing the development of the academic understanding of "interest". The discussion will then turn to his research into the role of individual differences within interest. From this general test of interest development, Luke will present an interest model that has been explicitly designed to support instruction within secondary and tertiary education. Two initial tests – both currently under review – will then be discussed, followed by a preview of beta-software developed for the micro-analytic measurement of interest and an examination of future directions for the field, as well as Luke's own research program.

Luke Fryer is a Ewing Post-doctoral Research Fellow in the Faculty of Education and Social Work. His current research focuses on understanding why students do (or don't) study and, more recently, on what factors are involved in initiating their interest in a domain of study.

Event details
• When: 28 Oct, 11.00-12.30 (come at 10.45 for refreshments)
• Where: Room 612, Education Building A35
• This seminar will not be available online or recorded.
• More information here

There are still problems with problem-based learning: recent innovations and new directions

A Research on Learning and Education Innovation seminar with Alisha Portolese.

Problem-based learning (PBL) is widely used in universities, high schools, and even primary classrooms globally. It is considered by many to be the leading learning design for medical education, and has branched out to a wide variety of disciplines in the health sciences and beyond. Although widespread, PBL has components that are not adequately grounded in learning theory. In this presentation, PhD candidate Alisha Portolese will argue that PBL needs some specific tweaks to deliver a more efficient, effective and productive learning experience. The presentation will discuss how we can apply strong learning sciences research about how people learn to improve the design of PBL, highlight strengths and pitfalls, discuss recent improvements and innovations, and suggest future directions, speaking to PBL learning design at both a research and a teaching level.

Alisha Portolese is a PhD candidate at CoCo, researching integrating elements from productive failure and analogical encoding theory into problem-based learning in medical education.

Event details
• When: 21 Oct, 11.00-12.30 (come at 10.45 for refreshments)
• Where: Room 612, Education Building A35
• This seminar will not be available online or recorded.
• More information here


Do you want to make connections, showcase your work and find out more about recent innovations in learning and knowledge technology research? Register now for the STL Research Fest, our annual event bringing together the wider community of researchers and practitioners in the sciences and technologies of learning to exchange ideas and form new collaborations.

What to expect

We expect the Fest, which takes place this year on Thurs Nov 5th in the Charles Perkins Centre Hub at the University of Sydney, to attract about 150 people for a full day of activities. Our program depends on what our attendees want to see and show but you can expect: plenaries; parallel workshop, demonstration and roundtable sessions; poster sessions; and the opportunity to network over catered breaks.

Details will be posted here, on our website, and emailed to registrants in advance of the Fest.

Want to present?

If you would like to submit a poster or run a seminar, roundtable or workshop event, please register as soon as possible at bit.ly/FestReg15. The closing date for submission content is Oct 4th. Want to present but don't have results yet? Our poster sessions attract a diverse range of topics at various stages of research. It's a great chance to let others know about your research or present a research design, and to get useful feedback and contacts. Some of the posters from 2014 are available online at http://bit.ly/STLFest14files. If you, or someone you know, might be interested in presenting, please feel free to contact us or forward this information on.

Register now

Registration to submit posters, presentations and other content is open until Oct 4th. You can register to attend until Oct 21st. Registration is free but needed for catering purposes. Register at bit.ly/FestReg15 or below.


Traces on the Walls and Traces in the Air: Inscriptions and Gestures in Educational Design Team Meetings

Imagine having to explain to your colleagues during a face-to-face design meeting what your idea looks like. Designers in many fields such as architecture, engineering, and web design are trained in expressing their ideas using drawings and sketches. These designers are encouraged to learn how to draw and to avoid disposing of their sketches, even when they are just messy “scribblings”. A significant portion of the literature in Design Studies is dedicated to the study of visual representations and in particular to hand-drawn sketches produced during the initial ideational phase. In contrast, very little is known about how educational designers use drawing and sketching to support their communication in face-to-face design team meetings.

In this presentation I will describe the findings from my PhD study in which I investigated how five groups of educational designers created and used inscriptions in support of their design activities. Inscriptions are defined here as all types of drawings, sketches, and visual marks created in support of design activities.

A face-to-face design session often involves multimodal communication, thus requiring the analysis of other modes such as gestures. In this study, gestures were often used as an additional communicative channel. They functioned as complementary representational means through which the participants made sense of the inscriptions.

The results from this study contribute to our understanding of the multimodal nature of communication in face-to-face design and have implications for the design and function of next-generation design tools and design environments, as well as for the training of educational designers.

A Research on Learning and Education Innovation seminar with Dewa Wardak. Dewa Wardak is a Postdoctoral Research Associate at the Centre for Research on Computer Supported Learning and Cognition (CoCo), University of Sydney. Dewa's main research focuses on understanding the role of visual representations, in particular free-hand sketching, and their use by educational designers in design team settings. Her research interests include design for learning, design of online learning environments, learning by design, collaborative learning, online learning communities, and knowledge visualization.

Event details
• When: 16 Sept, 11.00-12.30 (come at 10.45 for refreshments)
• Where: Room 612, Education Building A35
• This seminar will not be available online or recorded.
• No need to RSVP, just come on the day.

Join us on 9 September for "Cognitive load theory" a Research on Learning and Education Innovation seminar with Professor John Sweller, Emeritus Professor of Educational Psychology in the School of Education, University of New South Wales.

Cognitive load theory uses our knowledge of human cognition to devise instructional procedures. The following aspects of human cognition are critical to instructional design.

First, based on evolutionary educational psychology, cognitive load theory assumes that most topics taught in educational and training institutions are ones that we have not specifically evolved to learn.
Second, these instructionally relevant topics require learners to acquire domain-specific, rather than generic, cognitive knowledge.
Third, while generic cognitive knowledge does not require explicit instruction because we have evolved to acquire it, the domain-specific concepts and skills that provide the content of educational syllabi do require explicit instruction.

These three factors interact with the well-known capacity and duration constraints of working memory to delineate a cognitive architecture relevant to instructional design. Because the ability to learn biologically secondary, explicitly taught, domain-specific skills is limited by the capacity of a person's working memory, cognitive load theory has been developed to provide techniques that reduce unnecessary working memory load when teaching these types of skill.

  • When: 11am–12.30pm
  • Where: Room 612, Education Building A35
  • More info available here
  • This seminar will not be available online or recorded.
Dr John Sweller is Emeritus Professor of Educational Psychology in the School of Education, University of New South Wales (UNSW). His research reputation is associated with cognitive load theory, an instructional theory based on our knowledge of human cognitive architecture. Professor Sweller initiated work on the theory in the early 1980s. Subsequently, “ownership” of the theory shifted to his research group at UNSW and then to a large group of international researchers. The theory is now a contributor to both research and debate on issues associated with human cognitive architecture, its links to evolution by natural selection, and the instructional design consequences that follow. It is one of the few theories to have generated a large range of novel instructional design effects based on human cognitive architecture. These include: goal-free; worked-example; split-attention; isolated-interacting elements; and collective working-memory effects. His work has been cited 10,000–20,000 times.

It’s that time of year again – time to welcome new readers! In this post we will demystify all the acronyms and tell you a little about us, our events and where to find us on social media.

Who are CoCo? We are the Centre for Research on Computer Supported Learning and Cognition (you can see why we shorten it), a University of Sydney research centre in the Faculty of Education and Social Work. Our core team of about 30 staff and students conducts research on the sciences and technologies of learning. We also offer postgraduate study options at the Masters and PhD levels.

CoCo is a core part of a larger research network at the University - STL (The Sciences and Technologies of Learning research network). This network includes other facilities and centres at the University, such as CHAI (Computer Human Adapted Interaction), LATTE (Learning & Affect Technologies Engineering), and the Design Studio. Our multidisciplinary research looks at enhancing the ability of all those involved in education - formal and informal - to create learning environments that help people develop the skills, knowledge and dispositions to make innovative contributions.

Our events include:

  • Seminars on learning and educational innovation, presented by local and international experts, on most Wednesdays in semester

  • STL Research Fest - an annual event inviting researchers and practitioners to exchange ideas, showcase work, and catch up on recent innovations. Our next fest is on 5 November 2015 in the Charles Perkins Centre Hub.

CONNECT WITH US
To hear about upcoming events, join our mailing list - http://bit.ly/cocolist
To attend our Research Fest on Nov 5th go to http://bit.ly/FestReg15
CoCo website - http://sydney.edu.au/education_social_work/coco/
CoCo Twitter - @CoCoCentre
STL website - http://sydney.edu.au/research/stl/
STL Twitter - @STLSydney
STL Blog - http://blogs.usyd.edu.au/stl/
YouTube - https://www.youtube.com/user/CoCoResearchCentre
For general information on CoCo or STL, email stl.info@sydney.edu.au

Join us on 26 August for Beyond carts and horses: issues in the design of advanced learning systems, a Research on Learning and Education Innovation seminar with Professor Michael Jacobson.

In this talk I consider three themes: what we learn with, what we learn, and how we learn. A recently completed ARC-funded research project is discussed, in which ninth-grade students used agent-based computer models to learn difficult scientific knowledge about complex systems relevant to understanding climate change. We investigated whether varying the sequencing of pedagogical structure (SPS) provided for the computer models would result in differential learning outcomes for the targeted complexity and climate concepts. The experimental condition used a low-to-high (LH) SPS sequence based on productive failure (Kapur & Bielaczyc, 2012), whereas the comparison condition was based on a teacher's suggestion to employ a more traditional teaching approach, classified as a high-to-low (HL) SPS sequence, for the classroom activities. The main results showed significant learning of ideas such as “greenhouse gases” and “carbon cycle” by both groups on the posttest. However, for the more conceptually challenging complex systems ideas, such as “self-organization” and “emergent properties,” only the LH experimental group demonstrated significantly higher performance on the posttest compared to the HL comparison condition. Theoretical implications of these findings for the design of advanced learning systems, such as schema abstraction, are considered. In terms of practical implications, I suggest that these research findings challenge many current edtech approaches, such as “flipped classrooms” and MOOCs, that continue to use HL SPS as the core of their pedagogical learning designs.

Peter Goodyear and Lina Markauskaite have been working in partnership with CSU, Deakin and UWS on a new OLT project "Enhancing workplace learning through mobile technology".

This project explores how students can make best use of personal digital devices in workplace learning to bridge different learning spaces (classroom, workplace and virtual), connect learning and work, and strengthen networked, collaborative, integrative communication processes between students, academics and workplace educators. The outcome of the project will be a mobile-learning capacity-building framework for workplace learning (WPL), with a specific focus on enhancing students' ability to create their personal learning environments. The framework will include a conceptual map, physical representations (exemplars) and action-oriented thinking tools.
The project team has already developed and launched the initial toolkit, named the “GPS for WPL”, aimed at helping students, academics and workplace educators to enhance professional learning experiences by making better use of mobile technology.

If you are interested in collaborating with Peter and Lina on trialling this toolkit within professional experience courses that you teach, or in helping the project team improve this resource by providing feedback and suggestions, please contact Lina or Peter.

For project updates, please visit the project’s blog.

The Network of Academic Programs in the Learning Sciences (NAPLeS) is currently finalizing a series of new interview and short presentation videos, today uploading Jim Pellegrino’s interview and talk on “Assessment and Evaluation in the Learning Sciences” at http://isls-naples.psy.lmu.de/intro/all-webinars/pellegrino_all/index.html.

This will be followed by contributions from Gerry Stahl (July 16), Michael Jacobson (July 23), Susan Goldman (July 30) and Baruch Schwarz (August 6).

NAPLeS is part of the educational mission of the International Society of the Learning Sciences: a network of PhD and master‘s programs in the Learning Sciences, founded at the 2012 ICLS meeting hosted here at the University of Sydney. The overall mission of NAPLeS is to foster high-quality Learning Sciences programs internationally through several mechanisms that support teaching and learning. More information is available at their website.

About Us

Find out more about our network and research at the STL website (offsite).

About the Blog

Research by the University's Centre for Research on Learning and Innovation (CRLI).
More