Thursday, April 7, 2016

Car versus train & machine learning

It was a fascinating Hangout (Week 5, Personal Learning MOOC #NRC01PL) with Stephen Downes's guest George Siemens. George works with the Technology Enhanced Knowledge Research Institute at Athabasca University.

Stephen started the Hangout by defining a Personal Learning Environment (PLE), in which students manage their own learning and interact with learning resources from a range of providers, and a Personal Learning Assistant (PLA), a tool attached to the PLE that facilitates access to learning resources and the rest of the learning environment. He suggested a clear analogy of car versus train.

In his elearnspace blog post, Open Learning Analytics, Siemens discussed the importance of developing an open platform for analysing learning data. He described education today as a data-centric world and highlighted the huge amount of algorithmic sorting that goes on behind the scenes.

In the Hangout, Siemens mentioned an article, No Time to Think: Reflections on Information Technology and Contemplative Scholarship, by David M. Levy. The article juxtaposes two facts: students today have far greater access to digital tools to assist their learning, while, at the same time, the hectic pace of life leaves educators little to no time for reflection and contemplative scholarship. The challenge is how we learn, interact and share ideas when we have no time to create and choose.

In response to a question on Learning Analytics, Siemens stated, “The best choice today is the one that gives you the most choices tomorrow.” He described much of current Learning Analytics as antithetical to this, since it serves the skills pipeline with its focus on employment. Such skill-based instruction has a short lifespan: you might train for a job in a particular field that becomes automated within a short timeframe. Education should instead focus on the skill sets the future will demand, such as the ability to think, function, use digital information environments, lead, be socially active, and be part of a functioning, distributed team.

He explained further that the big education “start-ups” of the future will adopt a machine learning approach: mapping and mining structured data, stitching together learner profiles from the activity students have generated across many different spaces, and making recommendations, including recommendations beyond the digital. “Recommendations make us more human, not more digital.”
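To make the idea of “stitching together learner profiles” concrete, here is a minimal sketch, entirely my own and not from the Hangout: activity tags from several hypothetical platforms are merged into one profile, and resources are then recommended on the topics the learner has engaged with least.

```python
from collections import Counter

def stitch_profile(activity_streams):
    """Merge per-platform activity logs (lists of topic tags) into one profile."""
    profile = Counter()
    for platform, topics in activity_streams.items():
        profile.update(topics)
    return profile

def recommend(profile, catalogue, n=2):
    """Suggest resources on the topics the learner has touched least often."""
    return sorted(catalogue, key=lambda topic: profile[topic])[:n]

# Hypothetical activity pulled from three separate spaces
streams = {
    "blog":    ["analytics", "PLE"],
    "forum":   ["analytics", "analytics", "MOOC"],
    "hangout": ["PLE", "machine-learning"],
}

profile = stitch_profile(streams)
print(recommend(profile, ["analytics", "PLE", "MOOC", "machine-learning"]))
# → ['MOOC', 'machine-learning']
```

A real system would of course work with far richer data than topic counts, but the shape is the same: aggregate across spaces first, then recommend against the combined profile.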

He stressed that we are meeting AI half-way: “dumbing down” our school system so that it can be automated, making us, the educators, replaceable, because we have made the education system something that can be modelled on AI principles rather than a creative learning space.

Siemens also cited the article The Unreasonable Effectiveness of Data (Alon Halevy, Peter Norvig, and Fernando Pereira). It deals with problems of human interaction and how these cannot be captured in concise, neat formulae. Halevy and his co-authors suggest that the complexity of these interactions can only be harnessed with the power of data: “if other humans engage in the tasks and generate large amounts of unlabeled, noisy data, new algorithms can be used to build high-quality models from the data”.
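The paper’s point, that scale can substitute for cleverness, can be illustrated with a toy example of my own (not from the paper): even when each individual human annotation is noisy, a trivially simple model, majority vote, recovers the truth once there is enough data.

```python
import random
from collections import Counter

random.seed(42)

def noisy_labels(truth, n, error_rate=0.3):
    """Simulate n human annotations, each wrong with probability error_rate."""
    alternatives = ["cat", "dog", "bird"]
    labels = []
    for _ in range(n):
        if random.random() < error_rate:
            labels.append(random.choice([a for a in alternatives if a != truth]))
        else:
            labels.append(truth)
    return labels

def majority_vote(labels):
    """The simplest possible model: pick the most frequent label."""
    return Counter(labels).most_common(1)[0][0]

# With a handful of noisy labels the estimate is unreliable;
# with hundreds, majority vote almost surely recovers the truth.
print(majority_vote(noisy_labels("cat", 5)))
print(majority_vote(noisy_labels("cat", 500)))
```

The algorithm never gets smarter; only the data gets bigger, which is precisely the effect the article describes.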

Siemens noted that the base structural architecture is already in place, and that this architecture serves as a foundation for creativity. The problem is that developers are trying to build architectures for social processes that run on top of the existing platforms. He suggested that, instead of regarding this as just another platform layer, developers need a different frame of reference. He sketched the contrast in a small diagram, summarised below:

If working at the platform layer (e.g. with W3C standards), you need structure and routine. The aim is to reduce choices so you do not mess things up and to clear away all ambiguity; things must be clear, concise, consistent and predictable.

If working with social systems, we need a completely different framework for designing that space. The social learning layer built on top of the platforms must have the opposite attributes: we want ambiguity, we want people to grapple with choice, and we want unpredictability, so that people can interact with one another and create something we had not anticipated.