Our Centre for Research on Learning and Innovation (CRLI) Wednesday seminar series restarts next week when Dr Christine Preston presents "Toys for learning and teaching science".


Toys are widely recognised as being highly engaging to children, but formal research into the use of toys to support learning in primary science, and learning how to teach primary science, is sparse. This presentation provides an overview of pilot studies conducted by honours students in the faculty, as well as a summary of toy use in the Master of Teaching program with preservice teachers. Topics include primary students' responses when their toys incorporate discrepant events, and how musical toys change young children's explanations about sound. The qualitative research included individual interviews with primary students using think-aloud data collection techniques. Preliminary findings will be discussed along with the potential for further research in this area.

Dr Christine Preston has a unique teaching background, having taught science in NSW schools at both the secondary and primary levels. She has experience as a teacher-education lecturer in science for early childhood, primary and secondary settings. Her current research interests are primary children's interpretation of scientific diagrams, teaching science using toys, early-childhood science and teaching science education in higher education.

The CRLI Wednesday seminars (formerly CoCo and STL) run on most Wednesdays in semester and host local and international experts who present research on learning and educational innovation in an informal setting.

Event details
• When: 11.30am to 1.00pm on 23 Mar 2016. This is a brown bag event; you are welcome to bring your lunch.
• Where: Room 612 Education Building A35
• This seminar will not be available online or recorded.
• More information here

The Sydney Learning Analytics Research Group is excited to offer two conference travel grants of $3,500 each - the first to attend the 2016 Educational Data Mining (EDM) Conference, and the second to attend the 2017 Learning Analytics and Knowledge (LAK) Conference. The call for applications for the 2016 EDM Conference is now open, with the 2017 LAK Conference call to be announced at a later date.

Applicants must have a submission (of any type) accepted for presentation at EDM 2016, and be either a current staff member or current student of the University of Sydney. The call for submissions for EDM is now open - there are several deadlines, the last of which is 2 April 2016. For more information see the Sydney Learning Analytics Research Group (LARG) website.

LARG is a joint venture of the newly established Quality and Analytics Group within the Education Portfolio and the new Centre for Research on Learning and Innovation in the Faculty of Education and Social Work. The key purposes in establishing the new research group are: to build capacity in learning analytics for the benefit of the institution, its students and staff; to generate interest and expertise in learning analytics at the University and build a new network of research colleagues; and to build a profile for the University of Sydney as a national and international leader in learning analytics.

Join us on Tuesday 23rd February for a special Centre for Research on Learning and Innovation seminar with Dr Antonia Scholkmann, "The assessment of teacher supportive behaviour in open phases of school lessons by means of video analysis – new approaches and findings from Hamburg University".

Video analysis has previously shown its potential to shed light on learning processes in naturalistic and especially in open phases of instruction (Knigge, Siemon, Nordstrand, & Stolp, 2013). In its current research, the team of Professor Jens Siemon at Universität Hamburg seeks to assess teachers' supportive activities in the naturalistic setting of the classroom, and to describe every supportive event in a way that adequately considers micro-activities (on both the teacher's and the student's side) and process characteristics. For this purpose, existing approaches (van de Pol & Elbers, 2013; Wood, Bruner, & Ross, 1976) were extended with the video-based recording and assessment procedure MuVA (Siemon, Boom, & Scholkmann, 2015) and the new video analysis software Interact (cf. Mangold, 2006).

In her presentation, Dr Scholkmann, Senior Researcher on the team of Professor Jens Siemon, Universität Hamburg, will elaborate on the potential of these approaches for the analysis of teacher supportive behaviour. She will show examples of the current material as well as first results on the amount, quality, micro-activities and process characteristics of teachers' supportive behaviours inferred from their dataset.

Event details
• When: 11.30am to 1.00pm on 23 Feb 2016. This is a brown bag event; you are welcome to bring food and drink.
• Where: Room 612 Education Building A35
• This seminar will not be available online or recorded.
• More information here

I predict that we will see a kind of semiotic turn in CSCL, with a focus on materiality: a rising interest in the kinds of notational and representational systems that are used when people collaborate in particular practice fields. Semiotics is the study of sign systems, their symbolic as well as physical qualities (Eco, 1979). While there was a certain interest in the first phase of CSCL--the discussion forums, online forums--in semiotic aspects of collaboration, those first-generation semiotic devices were designed for the purpose of asynchronous communication and exchange ('discussion forum', 'thread'). They were not so much informed by people's practices and activities. In more recent years, we have seen a continued interest in these systems, and a surging interest in talk, in synchronous communication. A particularly active area that has yielded numerous ideas for representational notations has been research on computer-supported argumentation (Noroozi, Weinberger, Biemans, Mulder, & Chizari, 2012).

The new semiotic turn should focus on artifacts that are representative of people's practices, rather than artifacts designed specifically for the purposes of communication and learning: for instance, the blueprints that building engineers and architects use, the symbol systems that musicians use, and the specialized document types and codes that medical practitioners use. There has been more interest in practice-related notations and artifacts in CSCW than in CSCL (e.g., Turner, Bowker, Gasser, & Zacklad, 2006), and there is still comparatively little work in CSCL that engages with authentic artifacts and their role in collaboration and learning.

As an example of what CSCL research with a semiotic perspective could look like, think of Dan Suthers' early work on the guidance function of specific notational systems (e.g., Suthers & Hundhausen, 2003), but now with a focus on notations and artifacts that have a more discipline- or profession-specific grounding and are more practice-based.

I can see a number of benefits of the 'new semiotic turn'. For instance, content would become more important again; we are currently perhaps too focused on the analysis of the collaboration process (Reimann & Yacef, 2013), but without a concern for content, process remains hard to understand. Another benefit would be the development of stronger ties between CSCL and CSCW. Thirdly, CSCL would become more relevant for vocational and professional learning, because we would now be studying and supporting collaborative learning around a range of artefacts much wider than dedicated 'knowledge' artifacts such as concept maps and math equations. Furthermore, a semiotic perspective on collaboration could contribute to HCI research (de Souza, 2005) and to the development of task-related applications that support learning in (collaborative) practice, in addition to getting a task done (solving a problem).

A question I want to raise is what the reasons might be that practice-related artefacts still play such a small role in CSCL. Why are they left behind? Maybe it is because they require specialized knowledge, and most CSCL researchers are not at the same time engineers, doctors, musicians, or accountants? Maybe it is because these kinds of artefacts are difficult to analyze computationally? Maybe it is because we still make a strong distinction between learning and work, at least in K-12, and arguably even in studies that take place in the tertiary sector?


References: 
de Souza, C. S. (2005). The semiotic engineering of human-computer interaction. Cambridge, MA: MIT Press.

Eco, U. (1979). A theory of semiotics. Bloomington, IN: Indiana University Press.

Noroozi, O., Weinberger, A., Biemans, H., Mulder, M., & Chizari, M. (2012). Argumentation-Based Computer Supported Collaborative Learning (ABCSCL): A synthesis of 15 years of research. Educational Research Review, 7, 79-106.

Reimann, P., & Yacef, K. (2013). Using process mining for understanding learning. In R. Luckin, S. Puntambekar, P. Goodyear, B. Grabowski, J. D. M. Underwood & N. Winters (Eds.), Handbook of design in educational technology (pp. 472-481). New York: Taylor & Francis.

Suthers, D. D., & Hundhausen, C. D. (2003). An experimental study of the effects of representational guidance on collaborative learning processes. The Journal of the Learning Sciences, 12(2), 183-218.

Turner, W., Bowker, G., Gasser, L., & Zacklad, M. (2006). Information infrastructures for distributed collective practices. Computer Supported Cooperative Work (CSCW), 15, 93-110.


On 11 March 2016, Sydney Ideas presents A Scientific Approach to Teaching Science and Engineering with Nobel Laureate, Professor Carl Wieman.

This talk is co-presented with the Charles Perkins Centre Science of Learning Science Node.

Guided by experimental tests of theory and practice, science and engineering have advanced rapidly in the past 500 years. Guided primarily by tradition and dogma, science education meanwhile has remained largely medieval. Research on how people learn is now revealing much more effective ways to teach, learn, and evaluate learning than what is in use in the traditional science class.

The combination of this research with information technology is setting the stage for a new approach to teaching and learning that can provide, for all students, the relevant and effective science education needed for the 21st century. Although the focus of the talk is on undergraduate science teaching, where the data is the most compelling, the underlying principles come from studies of the general development of expertise and apply widely.

This talk is presented by Professor Carl Wieman, Department of Physics and Graduate School of Education, Stanford University and Nobel Laureate. More information and registration at this page.

Date: Friday 11 March, 2016
Time: 4.30 to 6pm
Venue: Charles Perkins Centre Auditorium, Johns Hopkins Drive, the University of Sydney.
Cost: Free and open to all with online registration requested
Registration: Register online at this page.

RSVP now to attend a March 4th Sydney Ideas talk with Professor George Siemens, Director of the LINK Research Lab at the University of Texas, Arlington, titled "Neuroscience and Learning Analytics: a historic leap in understanding learning?".

This talk is co-presented with the Deputy Vice-Chancellor (Education) Portfolio at the University of Sydney.

The past decade has solidified and advanced two important tracks in helping researchers understand learning: neuroscience and big data. Sophisticated imaging techniques allow insight into the functioning of the human brain that was until recently unimaginable. Small controlled studies are laying a foundation for a new science of learning.

In contrast, big data, often generated in technological environments, presents researchers with fuzzier and messier data than what is common in neuroscience. The large N, however, offers tantalising insights into the social, affective, and meta-cognitive aspects of learning as it happens in authentic work and school settings.

This presentation will explore the methodological differences that underpin the neuroscience and big data (learning analytics) frameworks of learning research and suggest ways in which they might contribute to future educational models.

This talk is free and open to all, with online registration requested.


  • When: March 4th, 3.00pm - 4.00pm

  • Where: Law School Foyer, Level 2, Sydney Law School, Eastern Avenue, The University of Sydney

  • More information and RSVP online


Join us on Monday 8th February for a Centre for Research on Learning and Innovation seminar with Dr Lennart Schalk, "Feasibility and benefits of introducing basic physics concepts in primary school: Preliminary results of a longitudinal study".

Basic physics concepts represent the fundamental building blocks of the more advanced scientific concepts that are typically introduced in secondary science education. Dr Schalk will report on the preliminary results of an ongoing longitudinal study recently initiated in Switzerland, the so-called MINT study (MINT is the German acronym for mathematics, informatics, natural sciences and technology, the German equivalent of STEM). The aim of the study is to implement curricula on basic physics concepts in primary school and monitor children's learning in science education until they graduate.

In this study, more than 500 primary-school teachers have been trained in how to use evidence-based hands-on teaching materials and scaffold students' learning with these materials. The data examined in this talk were gathered from more than 5000 primary-school students across different cohorts. Preliminary results indicate that early physics education is likely to prepare students for future learning in science, and that it is worth the effort to align science education directly from primary to secondary education.

Lennart Schalk is a senior lecturer at ETH Zurich, Switzerland. He teaches in primary and secondary teacher-education programs. His research focuses on the learning of relational categories and conceptual change in science education, as well as the improvement of educational teaching and learning materials.

Event details
• When: 4.00pm to 5.00pm on 8 Feb 2016
• Where: Room 424, Education Building A35
• This seminar will not be available online or recorded.
• More information here


While Part I outlined some of what researchers take to be true about learning, and argued that learning analytics can make important contributions to the methodology of modern learning research, in this posting I describe how learning analytics might contribute to conducting design-based research (DBR), sometimes also referred to as design experiments.

DBR has the goal "…to use the close study of learning as it unfolds within a naturalistic context that contains theoretically inspired innovations, usually that have passed through multiple iterations, to then develop new theories, artifacts, and practices that can be generalized to other schools and classrooms" (Barab, 2014, p. 151). Design-based research is a 'natural' fit between the learning sciences and learning analytics because DBR shares with learning analytics the goal of providing solutions to practical problems. At the same time, these solutions are expected to be grounded in a theory of learning; hence applying the solution can be seen as a (partial) test of the theory, and improving the solution incrementally can be seen as contributing to advancing theory over time.

In design-based research, theory is essential for generalization because design experiments mostly do not use a control-group logic, but are structured as within-subjects, repeated-measurements designs: A baseline is observed, an intervention is performed (e.g., a change in teaching style, a different curriculum, a new or different technology), and the effects of the intervention are gauged in terms of changes to the baseline. Design-based research often makes use of qualitative methods, frequently in combination with quantitative methods. This increases its value for informing the (re-)design of the intervention, and its value for theory building. The main difference between design experiments and standard control-group experiments is that in design experiments context is seen as part of the treatment, thus acknowledging the situated nature of learning; context variables are not seen as 'interfering', but as providing the resources through which theoretically expected learning processes become realized in a specific learning situation. This does not mean that DBR has no concept of interference, but it is not context 'variables' that are seen as potentially interfering; instead, other mechanisms that are active in the same context can interfere. The basis for generalization is provided by keeping the mechanisms that cause learning analytically separate from the context; this analytical distinction allows researchers to formulate expectations about how the mechanisms might play out in other contexts, and is hence the basis for the form of generalization most prevalent in design-based research: analytical generalization (Ercikan & Roth, 2014; Maxwell, 2004). The DBR methodology is in this respect similar to the methodology of case studies (Yin, 2003): Generalizing is performed by relating the specific case to theories with explanatory value. The specific case observations are not taken as applying in an identical manner to a "population", but are related to similar processes, and/or more abstract types of processes. It is not the specific participants in the study who are seen as instances of a 'population' (meaningful in a statistical sense); instead, the specific observation is treated as "an instance of" something more abstract and, in this sense, more general (Reimann, 2013).
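
To make this baseline-intervention logic concrete, here is a minimal sketch (entirely hypothetical data and variable names, not drawn from any study discussed here) of how a level and slope change relative to a baseline might be estimated with a simple interrupted time-series regression in Python:

```python
# Minimal sketch of gauging an intervention effect against a baseline in a
# within-subjects, repeated-measurements design. All data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(seed=1)
weeks = np.arange(20)                    # 20 weekly observations
intervention_week = 10                   # intervention introduced at week 10
phase = (weeks >= intervention_week).astype(int)

# Hypothetical scores: a gentle baseline trend plus a jump after the intervention
scores = 50 + 0.5 * weeks + 8 * phase + rng.normal(0, 2, size=weeks.size)

df = pd.DataFrame({
    "week": weeks,
    "phase": phase,                                       # 0 = baseline, 1 = post-intervention
    "weeks_since": np.clip(weeks - intervention_week, 0, None),
    "score": scores,
})

# 'phase' estimates the level change at the intervention;
# 'weeks_since' estimates the change in slope relative to the baseline trend.
model = smf.ols("score ~ week + phase + weeks_since", data=df).fit()
print(model.params)
```

In real design-based research, the estimated level and slope changes would of course be interpreted against the conjectured learning trajectory rather than taken at face value.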

In more concrete terms, theory enters into design-based research in the form of conjectures that mainly take the form of learning trajectories and design claims. A learning trajectory describes how learning develops in the absence of the intervention—humans, like any organism, cannot not learn—and how learning changes under the influence of the intervention, in particular its theory-informed aspects. Learning trajectories specify expectations about the form of change, perhaps its extent ('size'), and should say something about its temporal aspects: When will the effect materialize? For how long? Design claims are conjectures about how specific aspects of the intervention affect students' learning and understanding. Like expectations about learning trajectories, design claims focus mainly on those aspects of the pedagogical and/or technical design that are related to relevant theory.

Cobb and Gravemeijer (2008) provide a good example of the role of theory in design-based research. Their study focuses on middle-school statistics and describes a number of design cycles for creating computational representations that help teachers introduce notions such as center, skewness, spread and relative frequency coherently from the concept of a mathematical distribution. Based on the statistics education literature and classroom observations, the authors identify as an important step in the learning trajectory that students will initially need to learn to appreciate the difference between numbers and data. Therefore, tasks and computer-generated graphical representations needed to be developed that make students aware that they are analyzing data. As a theoretical framing, the specific learning trajectory is contextualized in the wider context of mathematical reasoning, in particular learning about data generation and about developing and critiquing data-based arguments. The authors developed three computational tools, with different but synergistic representational notations, that in concert with capable teachers began to move students' conceptions of distribution in a mathematically fruitful direction.

The potential for synergies between design-based research and learning analytics is obvious. DBR could greatly profit from data on students that are gathered unobtrusively, trace learning on multiple levels, and cover longer stretches of time. It could further profit from making these data rapidly, if not continuously, available to teachers and students. Teachers are an essential part of most curricular activity systems (Roschelle, Knudsen, & Hegedus, 2010), and students have to learn how to monitor and steer their own learning (Bull, Johnson, Masci, & Biel, 2016). Learning analytics, for its part, would become more experimental, more interventionist. I see this as a good development to the extent that pedagogical and technical interventions have the goal of improving teaching, of innovating. This is preferable to the use of advanced analytical methods for reinforcing current practices, among them practices that might be pedagogically dubious. Along with becoming more experimental, learning analytics would also become more engaged in the advancement of theory via the testing of hypotheses (e.g., the testing of design claims and of conjectures about learning trajectories). This is not an alternative to learning analytics as a methodology for applied research (Pardo & Dawson, 2016), but adds a dimension that can benefit teaching and learning.

Since learning analytics, in combination with educational data mining, is very comprehensive in terms of the methods it encompasses, the shift I am suggesting is not a radical one. The two main 'moves' needed are, firstly, a closer alignment of learning analytics with interventionist types of educational research, such as design-based research, and with the emerging educational improvement science (Bryk, 2015). Secondly, learning analytics researchers and practitioners would need to engage more with the development and testing of learning theories, broadly conceived. I consider it particularly valuable if learning analytics would add to learning research—and to educational research in general—methods that go beyond the already well-established applications of the General Linear Model (mainly regression models and analysis of variance). Methods such as social network analysis, pattern learning, and others that allow us to analyze the structures and properties that emerge from the relations between entities are potentially more interesting for theory building than linear modelling methods, which might nevertheless be useful for practical purposes. This would not only add incrementally to the method repertoire of learning research, but could transform to some extent how learning research is done: from a discipline that mainly describes and orders phenomena and findings with qualitative and statistical methods to a discipline that develops causal-explanatory accounts of learning-in-context.
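
As a small illustration of the kind of relational method mentioned above, the following sketch (hypothetical data) builds a network from who-replies-to-whom events in a course forum and computes two standard structural measures with the networkx library:

```python
# Minimal sketch: relations between learners as a directed graph.
import networkx as nx

# Hypothetical reply events: (author, replied_to)
replies = [
    ("ana", "ben"), ("ben", "ana"), ("ana", "chen"),
    ("dia", "ana"), ("chen", "ben"), ("dia", "chen"),
]

G = nx.DiGraph()
G.add_edges_from(replies)

# Degree centrality: how connected each learner is in the exchange structure.
print(nx.degree_centrality(G))
# Betweenness centrality: who brokers between otherwise unconnected learners.
print(nx.betweenness_centrality(G))
```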

An additional transformative potential of learning analytics for educational research concerns the distribution of analytical work: At least in technical terms, it is a small step from gathering data comprehensively to making them available openly. Issues of data protection and privacy aside, there is huge innovation potential in making learning data publicly available, in usable formats, because educational challenges are truly too big for any single researcher or research team to solve (Weinberger, 2011).

References:

Barab, S. A. (2014). Design-based research: A methodological toolkit for engineering change. In R. K. Sawyer (Ed.), Cambridge Handbook of the Learning Sciences (2nd ed., pp. 151-170). New York: Cambridge University Press.
Bryk, A. S. (2015). 2014 AERA Distinguished Lecture: Accelerating how we learn to improve. Educational Researcher, online first.
Bull, S., Johnson, M.D., Masci, D., & Biel, C. (2016). Integrating and visualising diagnostic information for the benefit of learning. In P. Reimann, S. Bull, M. Kickmeier-Rust, R. Vatrapu & B. Wasson (Eds.), Measuring and visualizing learning in the information-rich classroom (pp. 167-180). New York,NY: Routledge.
Cobb, P., & Gravemeijer, K. (2008). Experimenting to support and understand learning processes. In A. E. Kelly, R. A. Lesh & J. Y. Baek (Eds.), Handbook of design research methods in education (pp. 68-95). New York: Routledge.
Ercikan, K., & Roth, W.-M. (2014). Limits of generalizing in education research: Why criteria for research generalization should include population heterogeneity and uses of knowledge claims. Teachers College Record, 116(May), 1-28.
Maxwell, J.A. (2004). Using qualitative methods for causal explanations. Field Methods, 16, 243-264.
Pardo, A., & Dawson, S. (2016). Learning analytics: How can data be used to improve learning practice? In P. Reimann, S. Bull, M. Kickmeier-Rust, R. Vatrapu & B. Wasson (Eds.), Measuring and visualizing learning in the information-rich classroom (pp. 41-55). New York,NY: Routledge.
Reimann, P. (2013). Design-based research - designing as research. In R. Luckin, S. Puntambekar, P. Goodyear, B. Grabowski, J. D. M. Underwood & N. Winters (Eds.), Handbook of design in educational technology (pp. 44-52). New York: Taylor & Francis.
Roschelle, J., Knudsen, J., & Hegedus, S. (2010). From new technological infrastructures to curricular activity systems: Advanced designs for teaching and learning. In M. J. Jacobson & P. Reimann (Eds.), Designs for learning environments of the future (pp. 233-262). New York: Springer.
Weinberger, D. (2011). Too big to know: Rethinking knowledge now that the facts aren't the facts, experts are everywhere, and the smartest person in the room is the room. New York, NY.: Basic Books.
Yin, R. K. (2003). Case study research: Design and methods (3rd ed.). Thousand Oaks, CA: Sage.

Learning analytics is a young field of research (Baker & Siemens, 2014a; Baker & Yacef, 2009) that, along with educational data mining, has grown rapidly, driven by the availability of (large) sets of data on students' learning and by the interest in analysing these data for the purpose of improving students' learning and learning experience. I do not make much of the difference between learning analytics and educational data mining here, but it is worth keeping in mind that there are differences between the two fields, even though they are closely related and draw on largely overlapping research communities. Siemens and Baker (2012) identify the following differences:

  • EDM researchers are more interested in automated methods for discovery, while LA is more interested in human-led, mixed-initiative methods for exploring educational data;
  • EDM is more construct-oriented, while LA researchers emphasize a more holistic view of learning and learners;
  • Researchers in EDM develop methods for the automatic adaptation of instruction, whereas LA researchers are developing applications that inform teachers, educators, and students. Hence the strong interest in LA in learning visualisations.

The focus of this paper is on the relation between learning analytics (and EDM) and learning research, in particular the kind of learning research practiced in the Learning Sciences (Sawyer, 2014). My intention is thus similar to that of Baker and Siemens in their contribution to the second edition of the Cambridge Handbook of the Learning Sciences (Baker & Siemens, 2014b): to contribute to a stronger tie between learning analytics and learning (sciences) research. However, unlike Baker and Siemens, I believe that important contributions from learning analytics to learning research are still a matter of the future. I argue that while the potential is there, it is far from being realized. In terms of Pasteur's Quadrant (Stokes, 1997), I see learning analytics as currently falling into the category of pure applied research, whereas the learning sciences can be seen as use-inspired basic research, in which the focus is on advancing "the frontiers of understanding but also inspired by considerations of use" (Stokes, 1997, p. 74).

The main strategy I am following here is to develop some suggestions for how to make LA more relevant for foundational research on learning. I argue that the methods used in learning analytics (and EDM) have the potential to contribute to the applied as well as the foundational objectives of learning research. I further suggest that a more theory-oriented learning analytics can be more than an 'addition' to the 'toolbox' of the learning researcher; the 'import' could be more profound: It could change, to a certain extent, how we think about research methodology in the learning sciences.

The potential of Learning Analytics in learning research

The potential of learning analytics for the advancement of learning research can, in my opinion, unfold along four dimensions: (i) data quantity, (ii) longitudinal data, (iii) data from multiple levels, and (iv) data from many locations. In this section, I map these characteristics of data in learning analytics to modern conceptions of learning and main findings from learning research.

Quantity of Data

The size of data sets is the primary argument for the value of LA: "One of the factors leading to the recent emergence of learning analytics is the increasing quantity of analysable educational data (…) Papers have recently been published with data from tens of thousands of students," write Baker and Siemens (2014a, p. 254). Size is not only measured in the number of students; the number of data points per student (captured in log files of learning applications and platforms, for instance) is another quantitative dimension. The Pittsburgh Science of Learning Center DataShop (Koedinger et al., 2010), for instance, stores detailed recordings of students' interactions with carefully designed tutor software that records step-by-step problem-solving operations.

There are a number of reasons why size is considered to matter. One is that the number of students is taken as useful for establishing the generalizability of findings—a statistical argument. Another is that the more data, the more 'patterns' can be found. The flip side to this is that the number of possible relations between variables increases exponentially with the number of variables included in the analysis (National Research Council, 2013). More is needed than just data to 'discover' meaningful relations.
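
A back-of-the-envelope calculation makes the point: even counting only pairwise relations, let alone larger variable subsets, the number of candidate relations quickly dwarfs what any theory-free search could sensibly sift through.

```python
# Sketch: how the space of candidate relations grows with the number of variables.
from math import comb

for k in (10, 50, 100):
    pairs = comb(k, 2)          # possible pairwise relations
    subsets = 2**k - k - 1      # possible variable subsets of size >= 2
    print(f"{k} variables: {pairs} pairs, {subsets:.3e} subsets")
```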

A third argument for the value of large data sets is that they allow us to identify 'rare' events: events/patterns that occur in only small numbers of students or only sporadically (e.g., Sabourin, Rowe, Mott, & Lester, 2011). This is particularly interesting if the rare events are defined a priori: events that theory predicts, but that seldom occur spontaneously, or are seldom observable because of interactions with other processes (or because of measurement issues). The inverse is interesting as well: Theory might not allow certain events to happen; if they do happen, their appearance is interesting because it might not just be a measurement error, or due to 'chance', but indicate a limitation of the theory; it might even render the theory downright wrong.

While all three aspects of data quantity are beneficial to learning research, the third aspect—rare event detection—deserves more attention. It is the one least often considered, but it can contribute to making the learning sciences more theory-guided, and it can help to bridge the gap between qualitative and quantitative learning research. In qualitative research, the frequency with which an event occurs is not automatically identified with the importance of the event; in many cases, important events are rare. An example from learning research is conceptual change, which occurs rarely, but when it occurs has profound effects on students' understanding (diSessa, 2006).
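
In log-file terms, a-priori rare-event detection can be as simple as scanning a large event stream for a theoretically predicted pattern. The sketch below (hypothetical event codes and data) flags one such pattern: a 'self_correction' event directly following an 'error' by the same student.

```python
# Minimal sketch of a-priori rare-event detection in an event log.
import pandas as pd

log = pd.DataFrame({
    "student": ["s1", "s1", "s2", "s2", "s2", "s3"],
    "event":   ["error", "self_correction", "error", "hint", "error", "error"],
})

# The previous event for the same student, aligned row by row.
prev = log.groupby("student")["event"].shift(1)
rare = log[(log["event"] == "self_correction") & (prev == "error")]

print(f"{len(rare)} rare event(s) out of {len(log)} logged events")
print(rare)
```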

Longitudinal Learning Data

Learning needs time. Learning in schools and universities often requires multiple skills—such as mathematical and writing skills—to master complex, hierarchically structured subject matter. In science education, for instance, the hierarchical nature of the subject knowledge also leads to the subject being an intricate association of concepts, where deep learning of some basic concepts requires comprehension of other basic concepts (Ferguson-Hessler & de Jong, 1987). Theoretical accounts of the depth and extent of the learning it takes to comprehend scientific concepts have been suggested from a cognitive psychology perspective and from a socio-cultural perspective. From the cognitive psychology perspective, one line of argument is that learning science can be seen as developing a form of expertise, and that any form of real expertise in cognitively demanding areas requires years of learning (the magic number is 10 years, plus/minus 2), as evidenced by novice-expert research; see Ericsson, Charness, Feltovich, and Hoffman (2006) for a comprehensive overview. The best elaborated model of expertise development in the cognitive tradition is probably Ericsson's deliberate practice theory (Ericsson, Krampe, & Tesch-Römer, 1993). The reason why learning takes so long in this model is the incremental nature of the underlying cognitive learning/change mechanisms (chunking, proceduralization).

Another cognitive account, and one more specific to science education than general models of expertise development, is Chi and Slotta's ontology shift theory (e.g., Chi, Slotta & de Leeuw, 1994). On this account, learning scientific concepts is hard, and everyday concepts are resistant to change, because scientific understanding requires in many cases a change of ontological category. A classical example is the concept of heat, where students often see heat as a property of matter, whereas in physics it is seen in process terms, related to the motion of particles. In this theory, the reason that learning often stretches over longer times is that while the ontology change itself can be fairly rapid, it often needs extended time (under current conditions of science learning) before students become sufficiently aware of the limitations of the initial ontology and are ready to accept an alternative one.

Tracking learning that stretches over months and years—another example would be the development of second-language skills—is very rarely done in learning research. One reason is the cost, and the logistics, of performing such research. But the costs are being substantially lowered as learning analytics methods find their place in schools and universities. It would be of tremendous benefit if such data could be made available to researchers, and their acquisition planned in coordination with research projects. Methods for process mining are particularly relevant in this context (Reimann, 2009). Not only would this help to conduct specific projects that study long-term learning, it would also change the way we think about the nature of projects in learning research: from short-term interventions with immediate effects assessment to longer-duration interventions with continuous, long-duration monitoring of effects (and side-effects!). We see a variant of this kind of research developing in improvement research (Bryk, 2015) and in the continuous use of data for decision making (Mandinach, 2012).
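
As a minimal example of the process-oriented analyses such longitudinal logs would support, the sketch below (hypothetical event codes) estimates transition frequencies between coded learning events, the simplest kind of first-order process model; full process mining goes considerably further than this.

```python
# Sketch: first-order transition frequencies from per-student event sequences.
from collections import Counter

# Hypothetical event sequences gathered over a semester
sequences = [
    ["read", "practice", "error", "practice", "mastery"],
    ["read", "practice", "mastery"],
    ["read", "error", "read", "practice", "mastery"],
]

transitions = Counter()
for seq in sequences:
    transitions.update(zip(seq, seq[1:]))  # consecutive event pairs

for (a, b), n in transitions.most_common():
    print(f"{a} -> {b}: {n}")
```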

Data from Learning on Multiple Levels - Learning is Complex

Learning does not only take place over long durations; on other levels of analysis it happens within seconds and even milliseconds. Nathan and Alibali (2010) distinguish between learning in milliseconds and below (biological), seconds (cognitive), minutes to hours (rational), days to months (sociocultural), and years and beyond (organizational). This can be seen as an expression of strictly different kinds of learning, but more productively it may be seen as an expression of the fact that learning takes place at multiple levels at the same time. We can see learning 'events' as being produced by a complex, multi-layered system with minimally three levels: a biological stratum with neurophysiological processes, a cognitive stratum (rational thinking, knowledge), and a socio-cultural stratum (tools, practices). These strata, or levels, are set in relation to each other by processes of emergence (Sawyer, 2005).


The concept of emergence as used here is relational: It refers to the phenomenon that wholes (entities, agents, organisms, organisations) have properties that cannot be found in any of their parts. An emergent property "is one that is not possessed by any of the parts individually and that would not be possessed by the full set of parts in the absence of a structuring set of relations between them" (Elder-Vass, 2010, p. 17). A key aspect of (relational) emergence is therefore the organization of the parts: how the parts are set in relation to each other, how the whole is structured. Not all properties of an object are emergent; some will be resultant properties. For instance, most objects have mass, which is a resultant property: the mass of the whole is the sum of the parts' masses. Some objects have colour, which is an emergent property; it is dependent on the organization of the object's parts.


If we conceive of learning as a complexity phenomenon (Kapur et al., 2007), then learning must not only be studied at multiple levels, but the analysis of the relation between the levels—the nature of the emergence—must take center stage. This requires asking not only what affects learning over time, but also how learning is constituted at each moment in time: Which configurations of neural, cognitive, motivational, emotional, social and contextual processes/elements give rise to a 'learning event'? Answering the latter question requires appropriate instrumentation and appropriate analytical methods. The methods cannot (only) be variants of the General Linear Model (e.g., regression models, including so-called 'structural' or 'causal' variants), among other reasons because these are not appropriate for non-linear complex systems, for systems that transform themselves or get transformed. Instead, methods for the analysis of non-linear systems will be needed (e.g., van Geert, 1998), as well as methods that can be used to describe relations between parts, in particular graph-theoretical methods such as social network analysis (Burt, Kilduff, & Tasselli, 2013). Learning analytics and educational data mining can play a key role in advancing the learning sciences by bringing about such methodological advances and by making them usable for learning researchers. These include, but should not be confined to, methods for recording bio-signals, learning behavior and the cognitive-motivational processes causing it, as well as the social dimension of learning, in great detail, with high precision, repeatedly and frequently, if not continuously.
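
To give a flavour of the non-linear modelling referred to above, here is a minimal sketch of an iterated logistic growth equation of the general kind used in dynamic systems models of development (cf. van Geert, 1998); the parameter values are purely illustrative:

```python
# Sketch: a skill level L grows at rate r towards a capacity K set by the
# learner's resources; growth is non-linear (slow-fast-slow), not additive.
def logistic_growth(L0=0.05, r=0.3, K=1.0, steps=30):
    levels = [L0]
    for _ in range(steps):
        L = levels[-1]
        levels.append(L + r * L * (1 - L / K))  # discrete logistic growth step
    return levels

for t, L in enumerate(logistic_growth()):
    print(f"t={t:2d}  level={L:.3f}")
```

The characteristic S-shaped trajectory such a model produces is exactly the kind of pattern that linear models miss and that densely sampled longitudinal data would let us detect.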

Data from Learning in Many Contexts - Learning is Distributed 

The methods being developed in learning analytics and educational data mining to capture aspects of students' behaviour—and the physiological and emotional parameters that go along with behaviour—not only over time, but also across locations, are tremendously valuable for research. This is because learning is situated: It is highly dependent on the resources available to the learner in specific contexts. Not only does learning happen (quasi-)synchronously across multiple levels, it is also distributed over the socio-physical environment—the situation—the learner finds herself in (Sawyer & Greeno, 2009). As Greeno and others have argued, any analysis of learning will be incomplete if it does not (also) conceptualise learning as a socio-cultural practice, as an activity system that stretches far beyond the somato-physical boundaries of the cranium and the body.


Such an understanding of learning practices is necessary for theoretical as well as pedagogical purposes. For the purpose of theory development, an understanding of the socio-material practices around knowledge objects contributes to de-mystifying the process of learning—how is it possible to learn something genuinely new?— and of idea and knowledge creation more generally (Prawat, 1999). As the entanglement of cognitive work with physical, symbolic and social resources becomes ever better documented and understood—in general (e.g., Clark, 2011) and for specific areas such as scientific research (e.g., Latour & Woolgar, 1986)—it becomes clear that a theory of learning, creativity and idea generation will need to be grounded not only in psychology, but also in sociology, organization science, and semiotics. Any specific study will need to capture knowledge practices in a comprehensive sense. 


The fact that learning analytics methods can capture behavioural, interactional, and increasingly even some physiological parameters of students' 'learning' activities across locales and contexts constitutes an essential prerequisite for researching learning-in-context at scale. Learning analytics methods will need to become substantially more sophisticated to become really useful for studying learning-in-context, though. It is not sufficient to keep track of students' activities (and related parameters) alone; in addition, the context needs to be described and logged as well. This is easier said than done; just think of the many artefacts and tools that students use on a typical day of a semester: at school/uni, at home, while commuting. Along with technical advancements for capturing aspects of students' behaviour and experience, a main focus of research in learning analytics should therefore be to develop languages, and standards, for describing the context within which behaviour and experience arise, and for describing the relation between learners and the social, physical and symbolic aspects of the learning context.
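
What such a context-description language might minimally cover can be gestured at with a single hypothetical event record; every field name here is illustrative, not an existing standard:

```python
# Sketch: logging a learning event together with a description of its context.
event = {
    "timestamp": "2016-03-02T14:05:31+11:00",
    "learner": "student-4711",
    "activity": {"type": "problem_step", "id": "stats-task-3", "outcome": "correct"},
    "context": {
        "location": "library",                           # physical setting
        "device": "tablet",                              # tool in use
        "social": "pair_work",                           # social configuration
        "artefacts": ["textbook", "shared_whiteboard"],  # co-present resources
    },
}
print(event)
```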

Summary


In summary, I argue that there lies huge potential in learning analytics to advance learning research, and that in order to realize this potential learning analytics researchers should devote more attention to (finding) rare learning events, focus more on long-term learning, make more of the fact that learning can be recorded on multiple levels of a complex system (the human learner), and develop methods for capturing the context in which learning activities occur. None of this can be done without building on theory, on conceptualizations of learning and cognition. Theory is essential, and it is important to repeat what two of the key researchers write: "The theory-oriented perspective marks a departure of EDM and LA from technical approaches that use data as their sole guiding point…" (Baker & Siemens, 2014b, pp. 256-257). Suggestions such as Anderson's (2008), that big data will render the scientific method obsolete, not only express a deep misunderstanding of what the method is about; they also commit the logical (and ethical) error of using descriptions of the past as prescriptions for the future.

 

References

Anderson, C. (2008). The end of theory: The data deluge makes the scientific method obsolete. Wired Magazine. Retrieved 14 December 2015, from http://www.wired.com/2008/06/pb-theory/

Baker, R., & Siemens, G. (2014a). Educational data mining and learning analytics. In R. K. Sawyer (Ed.), Cambridge Handbook of the Learning Sciences (2nd ed., pp. 253-274). New York: Cambridge University Press.

Baker, R., & Siemens, G. (2014b). Learning analytics and educational data mining. In R. K. Sawyer (Ed.), Cambridge Handbook of the Leaning Sciences (2nd ed., pp. 253-272). New York: Cambridge University Press.

Baker, R., & Yacef, K. (2009). The state of educational data mining in 2009: A review and future vision. JEDM - Journal of Educational Data Mining, 1(1), 3-17.

Bryk, A. S. (2015). 2014 AERA Distinguished Lecture: Accelerating How We Learn to Improve. Educational Researcher.

Burt, R. S., Kilduff, M., & Tasselli, S. (2013). Social network analysis: Foundations and frontiers on advantage. Annual Review of Psychology, 64, 527-547.

Clark, A. (2011). Supersizing the mind. Embodiment, action, and cognitive extension. Oxford, UK: Oxford University Press.

National Research Council. (2013). Frontiers in massive data analysis. Washington, DC: The National Academies Press.

diSessa, A. A. (2006). A history of conceptual change research: Threads and fault lines. In R. K. Sawyer (Ed.), The Cambridge Handbook of the Learning Sciences. New York: Cambridge University Press.

Elder-Vass, D. (2010). The causal power of social structures. Cambridge, UK: Cambridge University Press.

Ericsson, K. A., Charness, N., Feltovich, P. J., & Hoffman, R. R. (Eds.). (2006). The Cambridge Handbook of Expertise and Expert Performance. New York: Cambridge University Press.

Ericsson, K. A., Krampe, R. T., & Tesch-Römer, C. (1993). The role of deliberate practice in the acquisition of expert performance. Psychological Review, 100, 363-406.

Ferguson-Hessler, M. G. M., & de Jong, T. (1987). On the quality of knowledge in the field of electricity and magnetism. American Journal of Physics, 55, 492-497.

Kapur, M., Hung, D., Jacobson, M. J., Voiklis, J., Kinzer, C. K., & Chen, D.-T. V. (2007). Emergence of learning in computer-supported, large-scale collective dynamics: A research agenda. In Proceedings of the International Conference on Computer-Supported Collaborative Learning (CSCL 2007). New Brunswick, NJ.

Koedinger, K. R., Baker, R. S. J. d., Cunningham, K., Skogsholm, A., Leber, B., & Stamper, J. (2010). A data repository for the EDM community: The PSLC DataShop. In C. Romero, S. Ventura, M. Pechenizkiy & R. Baker (Eds.), Handbook of educational data mining (pp. 43-56). Boca Raton, FL: Chapman & Hall/CRC.

Latour, B., & Woolgar, S. (1986). Laboratory life: The construction of scientific facts (2nd ed.). Princeton: Princeton University Press.

Mandinach, E. B. (2012). A perfect time for data use: Using data-driven decision making to inform practice. Educational Psychologist, 47(2), 71-85.

Nathan, M. J., & Alibali, M. W. (2010). Learning sciences. Wiley Interdisciplinary Reviews: Cognitive Science, 1(3), 329-345.

Prawat, R. S. (1999). Dewey, Peirce, and the Learning Paradox. American Educational Research Journal, 36, 47-76.

Reimann, P. (2009). Time is precious: Variable- and event-centred approaches to process analysis in CSCL research. International Journal of Computer-supported Collaborative Learning, 4, 239-257.

Sabourin, J., Rowe, J., Mott, B., & Lester, J. (2011). When off-task is on-task: The affective role of off-task behavior in narrative-centered learning environments. Paper presented at the 15th International Conference on Artificial Intelligence in Education.

Sawyer, R. K. (2005). Social emergence. Societies as complex systems. Cambridge, UK: Cambridge University Press.

Sawyer, R. K. (Ed.). (2014). The Cambridge Handbook of the Learning Sciences (2nd ed.). New York: Cambridge University Press.

Sawyer, R. K., & Greeno, J. G. (2009). Situativity and learning. In P. Robbins & M. Aydede (Eds.), The Cambridge handbook of situated cognition (pp. 347-367). New York, NY: Cambridge University Press.

Siemens, G., & Baker, R.S.J. d. (2012). Learning analytics and educational data mining: Towards communication and collaboration. Paper presented at the Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK 2012). 

Stokes, D.E. (1997). Pasteur's quadrant: Basic science and technological innovation. Washington, DC: Brookings Institution Press.

van Geert, P. (1998). A dynamic systems model of basic developmental mechanisms: Piaget, Vygotsky, and beyond. Psychological Review, 105, 634-677.


Starting in January 2016, the Sciences and Technologies of Learning research network will transform into a new research centre. The University has approved our proposal to set up a Centre for Research on Learning and Innovation, as a sustainable way of supporting the research collaborations that have been a feature of STL for the last five years. The new centre will have strong roots in Education, with substantial involvement from Engineering & IT, Science, Health Sciences and Medicine. As with STL, membership of the new centre will be open to all members of university staff, and postgraduate students, who have a serious interest in research in this area.

The primary disciplines involved in STL and CRLI have been recognised in the most recent national assessment of research quality (ERA 2015) as showing 'outstanding performance well above world standard' (rated 5 - the highest rating possible).

  • 1303 Specialist Studies in Education (including the Learning Sciences and Educational Technology and Computing) - 5
  • 16 Information & Computing Science - 5
  • 1702 Cognitive Sciences - 5

Further information about the transition to the new centre will be posted here over the coming weeks and into the new year.


The Sciences and Technologies of Learning (STL) Research Fest was held in the Charles Perkins Centre Hub on Thursday 5 November 2015. Posters and abstracts from the day are available online in a Dropbox folder at http://bit.ly/STLFest15files.

Sciences and Technologies of Learning Research Fest 2015

Dr David Ashe and Melinda J Lewis were very happy to receive the people’s choice award for their poster:

Context in Flux: An invitation to join a think‐aloud installation at Research Fest.

The dynamic and multi-modal poster was an opportunity to explore and expand the research poster genre into one of installation, inviting participation and interaction.

Melinda and David performed a think-aloud about their ideas on why or why not common understandings of context are relevant in their research. Visitors to the poster joined in, voicing their thoughts on context both verbally and in text that they shared directly onto the poster.

Context_in_Flux_sm.jpg

It is not possible to convey the dynamic modality of the poster in a static image; however, the image above provides a small insight into the experience of visitors to the Research Fest. The textual information on the poster scrolled, images changed, and additional text could be entered in real time. Wireless headphones were also supplied for visitors to listen to audio information.

If you would like further information about this dynamic and interactive poster, please contact David and Melinda.

Contacts:
David.Ashe@sydney.edu.au
Melinda.Lewis@sydney.edu.au

Congratulations to our poster winners at the Research Fest:

JUDGES’ CHOICE
Winner – Yobelli Jimenez and Sarah Lewis, Implementation of immersive virtual technology for radiation therapy education (link to external Dropbox file).
Runner-Up - Ling Wu, Enhancing Young Children’s Empathy Development through Purposely Designed Educational Tablet Games (link to external Dropbox file).

PEOPLE’S CHOICE
Winner - Dr David Ashe and Melinda Lewis, Context in Flux: An invitation to join a think-aloud installation at Research Fest (link to blog post).

Thank you to our poster judges: Sonya Corcoran and Julie King. Posters will be made available online in the next week. Details will be posted here, on our website, and to fest registrants.

Congratulations to Dr Patricia Thibaut Paez, who has been awarded a position by the National Commission for Scientific and Technological Research (CONICYT) through its FONDECYT program's Postdoctoral Contest 2016.

These positions are granted by the National Fund for Scientific and Technological Research (FONDECYT), which was created as an instrument to promote scientific and technological development in Chile. FONDECYT fosters the initiative of individuals and research groups by funding scientific and technological research projects in all fields of knowledge. Resources are allocated through annual public competitions, and projects are selected on the basis of their intrinsic quality and the merits of applicants, without distinction of field, institutional affiliation or gender. The aim of this competition is to stimulate the productivity and future scientific leadership of young researchers who hold a doctorate degree.

Patricia is a researcher at the Centre for Research on Computer Supported Learning and Cognition (CoCo) at the University of Sydney. She also completed her PhD at the Centre. Her research focuses on learning, literacy, and mobile technologies across formal and informal spaces.

On November 5, the STL Research Fest will bring together the wider community of researchers and practitioners in the sciences and technologies of learning to exchange ideas and form new collaborations.

Timetable

Start End Item
9:45 9:55 Registration
10:00 10:40 Opening and shorter plenary
10:40 11:00 Morning Tea
11:00 11:45 Parallel session 1
11:45 12:30 Poster showcase 1
12:30 13:15 Lunch
13:15 14:00 Poster showcase 2
14:00 14:45 Parallel session 2
14:45 15:00 Refreshments
15:00 16:00 Plenary and closing - Learning to work across boundaries - opportunities for research and innovation

Parallel sessions

ID Title Presenters/discussants
Parallel session 1: 11:00-11:45am
1 Mind the gap Abelardo Pardo, Michael Jacobson, Peter Reimann, Kalina Yacef
2 Teaching how to work across boundaries Lina Markauskaite, Peter Goodyear, Marie Carroll, Tina Hinton, Philip Poronnik, Kim Bell-Anderson, Simon Poon
3 Coding, designing and networking Rob Saunders, Lucila Carvalho
Parallel session 2: 14:00-14:45
4 Researching Innovative Learning Spaces Rob Ellis, Tina Hinton, Pippa Yeoman
5 Professional learning on-the-go Lina Markauskaite, James Edwards, Meg Phelps, Peter Goodyear
6 Cranking up a notch Adam Bridgeman, Wai Yat Wong, Rena Bokosmaty, Meloni Muir

Posters

Poster Session 1: 11.45 – 12.30. Posters 1-14      

Poster Session 2: 13.15-14.00. Posters 15-27     

  1. Undergraduates as App development partners: a case study from Botany & Computer Science. Alexander Ling, Ahmed Shadid, Michael Johnston, Xilin Huang, Woo Yang Baeg, Scott Dong, Se-Hyun Kevin Ahn, Caroline Cheung, Satyendra Sinha, Rosanne Quinnell
  2. Group Formation - How do students' characteristics and behaviour affect group work performance? Augusto Dias Pereira dos Santos, Kalina Yacef
  3. A proposal for redesigning problem-based learning in medical education: Contrasting student solutions and improving consolidation. Alisha Portolese, Michael Jacobson, Robbert Duvivier, Lina Markauskaite
  4. EQ Clinic: An Online Clinic for Medical Communication Enhancement. Chunfeng Liu
  5. Exercise motivation through fully-immersive gamified virtual reality experience. Crystal Yoo
  6. Investigating the development of scientific inquiry in undergraduate physics students. Gabriel Nguyen, John O'Byrne, Manjula Sharma
  7. Learning and Enactment in Techno-human ecosystem: Embodiment of sociomateriality in sensemaking process. Gilbert Importante, Dr. Lina Markauskaite, Prof. Peter Goodyear
  8. A Student ‘Vision Statement’ as a Catalyst for Educational Innovation in Navitas: Towards the ideal technology-enabled learning environment for English Language Students. Jonathan Hvaal
  9. What offline and online technologies do higher education students use to complete assessment tasks? Lynnette Lounsbury, Dr David Bolton, Dr Paula Mildenhall, Assoc. Prof. Maria Northcote
  10. Learning by enhanced tactile feedback - Montessori sandpaper extended, Michael Tang, Dr. Paul Ginns
  11. Visualising socio-material practices in knowledge creation. Natalie Spence
  12. Clinical development using reflective learning and ePortfolios: staff and student perceptions. Punyanit Rungnava
  13. A quantitative study of students’ experiences, needs and expectations around technology in their personal lives and study in Higher Education, VET and ELICOS contexts. Lucy Blakemore, Yindta Whittington
  14. Mirror, mirror: A pre-learning exercise enhances mathematical problem-solving efficiency. Eleni Smyrnis, Paul Ginns
  15. How collaborative successes and failures become productive: An exploration of emerging understanding and misunderstanding turning points in model-based learning with productive failure. Alisha Portolese, Lina Markauskaite, Polly Lai, Michael J. Jacobson
  16. Context in Flux: An invitation to join a think-aloud installation at Research Fest. Dr David Ashe, Melinda J Lewis
  17. “That thing would have been good for this” Multimodal Interaction Analysis. Dewa Wardak
  18. Invigorating Science Investigations using an Inquiry Oriented Pedagogical Instrument. Evan Hefer, Manjula Sharma, Louise Sutherland, Alexandra Yeung, Scott Kable
  19. Enhancing Young Children’s Empathy Development through Purposely Designed Educational Tablet Games. Ling Wu
  20. Learning at multidisciplinary team meetings leading innovation projects. Amanda Lacy
  21. Learning Nanotechnology with Agent-Based Models versus Animations: Gesture Differences in Problem Solving. Polly Lai
  22. Massive online open science. Dr Rebecca LeBard, Geoff Kornfeld, Dr Rosanne Quinnell, Scientia Professor Rob Brooks, Scientia Professor Brett Neilan, Emeritus Professor Brynn Hibbert
  23. Exploring EFL Teachers' Competences in Synchronous Telecollaborative Intercultural Communication. Wissam Bin Siddiq
  24. Talking to oneself and others: How self-explanation affects group discussions. Sanri le Roux
  25. Implementation of immersive virtual technology for radiation therapy education. Yobelli Jimenez, Sarah Lewis
  26. A Mobile App in the 1st Year Uni-Life: A Pilot Study. Yu Zhao
  27. MOOClm: Open Learner Models in MOOCs to Guide and Coordinate. Ronny Cook

27 posters will be on display at the Fest over 2 sessions: Poster Session 1 from 11.45-12.30 and Poster Session 2 from 13.15-14.00.

Poster Session 1, 11.45-12.30
1. Undergraduates as App development partners: a case study from Botany & Computer Science. Alexander Ling, Ahmed Shadid, Michael Johnston, Xilin Huang, Woo Yang Baeg, Scott Dong, Se-Hyun Kevin Ahn, Caroline Cheung, Satyendra Sinha, Rosanne Quinnell
2. Group Formation - How do students' characteristics and behaviour affect group work performance? Augusto Dias Pereira dos Santos, Kalina Yacef
3. A proposal for redesigning problem-based learning in medical education: Contrasting student solutions and improving consolidation. Alisha Portolese, Michael Jacobson, Robbert Duvivier, Lina Markauskaite
4. EQ Clinic: An Online Clinic for Medical Communication Enhancement. Chunfeng Liu
5. Exercise motivation through fully-immersive gamified virtual reality experience. Crystal Yoo
6. Investigating the development of scientific inquiry in undergraduate physics students. Gabriel Nguyen, John O'Byrne, Manjula Sharma
7. Learning and Enactment in Techno-human ecosystem: Embodiment of sociomateriality in sensemaking process. Gilbert Importante, Dr. Lina Markauskaite, Prof. Peter Goodyear
8. A Student ‘Vision Statement’ as a Catalyst for Educational Innovation in Navitas: Towards the ideal technology-enabled learning environment for English Language Students. Jonathan Hvaal
9. What offline and online technologies do higher education students use to complete assessment tasks? Lynnette Lounsbury, Dr David Bolton, Dr Paula Mildenhall, Assoc. Prof. Maria Northcote
10. Learning by enhanced tactile feedback - Montessori sandpaper extended, Michael Tang, Dr. Paul Ginns
11. Visualising socio-material practices in knowledge creation. Natalie Spence
12. Clinical development using reflective learning and ePortfolios: staff and student perceptions. Punyanit Rungnava
13. A quantitative study of students’ experiences, needs and expectations around technology in their personal lives and study in Higher Education, VET and ELICOS contexts. Lucy Blakemore, Yindta Whittington
14. Mirror, mirror: A pre-learning exercise enhances mathematical problem-solving efficiency. Eleni Smyrnis, Paul Ginns


Poster Session 2, 13.15-14.00
15. How collaborative successes and failures become productive: An exploration of emerging understanding and misunderstanding turning points in model-based learning with productive failure. Alisha Portolese, Lina Markauskaite, Polly Lai, Michael J. Jacobson
16. Context in Flux: An invitation to join a think-aloud installation at Research Fest. Dr David Ashe, Melinda J Lewis
17. “That thing would have been good for this” Multimodal Interaction Analysis. Dewa Wardak
18. Invigorating Science Investigations using an Inquiry Oriented Pedagogical Instrument. Evan Hefer, Manjula Sharma, Louise Sutherland, Alexandra Yeung, Scott Kable
19. Enhancing Young Children’s Empathy Development through Purposely Designed Educational Tablet Games. Ling Wu
20. Learning at multidisciplinary team meetings leading innovation projects. Amanda Lacy
21. Learning Nanotechnology with Agent-Based Models versus Animations: Gestures Differences in Problem Solving. Polly Lai
22. Massive online open science. Dr Rebecca LeBard, Geoff Kornfeld, Dr Rosanne Quinnell, Scientia Professor Rob Brooks, Scientia Professor Brett Neilan, Emeritius Professor Brynn Hibbert
23. Exploring EFL Teachers Competences in Synchronous Telecollaborative Intercultural Communication. Wissam Bin Siddiq
24. Talking to oneself and others: How self-explanation affects group discussions. Sanri le Roux
25. Implementation of immersive virtual technology for radiation therapy education. Yobelli Jimenez, Sarah Lewis
26. A Mobile App in the 1st Year Uni-Life: A Pilot Study. Yu Zhao
27. MOOClm: Open Learner Models in MOOCs to Guide and Coordinate. Ronny Cook

rf_discuss.jpg On November 5, the STL Research Fest will bring together the wider community of researchers and practitioners in the sciences and technologies of learning to exchange ideas and form new collaborations. Registration to attend is open until Oct 28th at bit.ly/FestReg15. Registration is free but required for catering purposes.

Timetable

Start End Item
9:45 9:55 Registration
10:00 10:40 Opening and shorter plenary
10:40 11:00 Morning Tea
11:00 11:45 Parallel session 1
11:45 12:30 Poster showcase 1
12:30 13:15 Lunch
13:15 14:00 Poster showcase 2
14:00 14:45 Parallel session 2
14:45 15:00 Refreshments
15:00 16:00 Plenary and closing

The program is still being fleshed out; further details will be posted here, on our website, and emailed to registrants in advance of the Fest.

Parallel sessions

ID Title Presenters/discussants
Parallel session 1: 11:00-11:45
1 Mind the gap Abelardo Pardo, Michael Jacobson, Peter Reimann, Kalina Yacef
2 Teaching how to work across boundaries Lina Markauskaite, Peter Goodyear, Marie Carroll, Tina Hinton, Philip Poronnik, Kim Bell-Anderson, Simon Poon
3 Coding, designing and networking Rob Saunders, Lucila Carvalho
Parallel session 2: 14:00-14:45
4 Learning space research Rob Ellis, Tina Hinton, Pippa Yeoman
5 Professional learning on-the-go Lina Markauskaite, James Edwards, Meg Phelps, Peter Goodyear
6 Cranking up a notch Adam Bridgeman, Wai Yat Wong, Rena Bokosmaty, Meloni Muir

Register now

Registration to attend is open until Oct 28th at bit.ly/FestReg15.

Join us on Wednesday 28th October for "Getting interested", our final Research on Learning and Education Innovation seminar this year.

"Getting interested". Everyone implicitly understands it; everyone recognises its importance. It is clearly a part of learning, and thereby education. That said, where does “getting students interested” figure within teachers' course organisation? Do they consider it as important as the knowledge/skills development aspect of their teaching?

This seminar by Dr Luke Fryer will begin by reviewing how the academic understanding of "interest" has developed. The discussion will then turn to his research into the role of individual differences in interest. Building on this work, Luke will present an interest model explicitly designed to support instruction in secondary and tertiary education. Two initial tests of the model – both currently under review – will then be discussed, followed by a preview of beta software developed for the micro-analytic measurement of interest and an examination of future directions for the field, as well as Luke's own research program.

Luke Fryer is a Ewing Post-doctoral Research Fellow at the Faculty of Education and Social Work. His current research focuses on understanding why students do (or don't) study and, more recently, on the factors involved in initiating their interest in a domain of study.

Event details
• When: 28 Oct, 11.00-12.30 (come at 10.45 for refreshments)
• Where: Room 612, Education Building A35
• This seminar will not be available online or recorded.
• More information here

There are still problems with problem-based learning: recent innovations and new directions

APpic.jpg
A Research on Learning and Education Innovation seminar with Alisha Portolese.

Problem-based learning (PBL) is widely used in universities, high schools, and even primary classrooms globally. It is considered by many to be the leading learning design for medical education, and has branched out to a wide variety of disciplines in health sciences and beyond. Although widespread, PBL has components that are not adequately grounded in learning theory. In this presentation, PhD candidate Alisha Portolese (pictured) will argue that PBL needs specific refinements to deliver the most efficient, effective and productive learning experience we can offer. She will discuss how strong learning science research on how people learn can improve the design of PBL, highlight strengths and pitfalls, review recent improvements and innovations, and suggest future directions. The presentation will address PBL learning design at both a research and a teaching level.

Alisha Portolese is a PhD candidate at CoCo, researching the integration of elements from productive failure and analogical encoding theory into problem-based learning in medical education.

Event details
• When: 21 Oct, 11.00-12.30 (come at 10.45 for refreshments)
• Where: Room 612, Education Building A35
• This seminar will not be available online or recorded.
• More information here


rf_discuss.jpg Do you want to make connections, showcase your work and find out more about recent innovations in learning and knowledge technology research? Register now for the STL Research Fest, our annual event bringing together the wider community of researchers and practitioners in the sciences and technologies of learning to exchange ideas and form new collaborations.

What to expect

We expect the Fest, which takes place this year on Thurs Nov 5th in the Charles Perkins Centre Hub at the University of Sydney, to attract about 150 people for a full day of activities. Our program depends on what our attendees want to see and show, but you can expect: plenaries; parallel workshop, demonstration and roundtable sessions; poster sessions; and the opportunity to network over catered breaks.

Details will be posted here, on our website, and emailed to registrants in advance of the Fest.

Want to present?

If you would like to submit a poster or run a seminar, roundtable or workshop event, please register as soon as possible at bit.ly/FestReg15. The closing date for submission content is Oct 4th. Want to present but don’t have results yet? Our poster sessions attract a diverse range of topics at various stages of research. It's a great chance to let others know about your research or to present your research design, and to get useful feedback and contacts. Some of the posters from 2014 are available online at http://bit.ly/STLFest14files. If you, or someone you know, might be interested in presenting, please feel free to contact us and forward this information on.

Register now

Registration to submit posters, presentations and other content is open until Oct 4th. You can register to attend until Oct 21st. Registration is free but required for catering purposes. Register at bit.ly/FestReg15 or below.


About Us

Find out more about our network and research at the STL website (offsite).

About the Blog

Research by the University's Centre for Research on Learning and Innovation (CRLI).