This post is the second of a series where Aminah Ottosdotter Davidsson reflects on technology in education and educational leadership as part of her MBA studies in Educational Leadership at TAMK.
“Valuing what we measure or measuring what we value?” – Gert Biesta
With an unclear sense of the purpose of education and an overt belief in technology as the solution to its problems, I’d like to elaborate a bit more on possible reasons for the shift in value and measurement. In the Innovating Pedagogy Report (IPR) of 2012, learning analytics was mentioned as one of the emerging trends at the time. After that, analytics, in one form or another, was mentioned in the IPRs every year between 2013 and 2017. Learning analytics initially developed from the larger field of data analytics and followed the upswing of ‘big data’ collection in business, also contributing to the further development of business analytics. Subsequently, combined with virtual learning environments, it entered the scene of education. The impact of learning analytics on today’s education is vast, and I dare say that the extensive focus on data has contributed to the increase in valuing what we measure instead of measuring what we value. In this way, it also jeopardizes the purpose of education by shifting its focus and by turning education into business.
So how did this happen? How did this imbalance in measurement related to learning analytics occur? When reading and comparing the way data analytics is presented in the above IPRs, a gradual entry into education can be observed. Initially, it started off as a tool for analyzing learners’ virtual movements, without paying much attention to pedagogical aspects. Later, learning sciences and social network analysis were included, leading to a more theory-driven approach. Thereafter, to differentiate between the needs of the institution and the needs of the students, it split into two distinct fields: academic analytics and learning analytics. As more and more learning opportunities were offered online, a new focus emerged on using data to improve online course design and the learning experience. Along with this growth, more and more people started questioning the ethics behind the practice, and as a result a new field of student-oriented learning analytics emerged. Here, data collection was instead connected to seeing students as active agents and collaborators, offering them more transparency to support self-reflection. Over the past few years, the trend has continued in this direction, aiming to personalize the learning experience even further through innovative learning designs, analytics of emotions and student-led analytics. Today, as part of student-led analytics, the focus is on formative analytics, where analytics is used for learning rather than of learning: the data is used by students to reflect on their learning and to set and measure their own goals.
So, there has definitely been a shift in the purpose for which the data is used. However, the data is still there, and it is still used to measure proposed ‘learning’ in various ways. Whether that learning is actually happening has been increasingly questioned, and it has been claimed that learning analytics has failed to live up to its promises. Dawson et al. evaluated to what extent learning analytics “has impacted our understanding of learning and produced insights that have been translated to mainstream practice or contributed to theory.” Interestingly, their findings revealed a limited impact on theory, practice and frameworks. They hypothesized that this was “due to a continuing predominance of small-scale techno-centric exploratory studies that to date have not fully accounted for the multi-disciplinarity that comprises education.” So again, there seems to be the misalignment I discussed in my last post, where both the use of technology and the research related to technology fail to take a holistic view of Edtech. Technology, like education itself, does not exist in a vacuum, and thus in Edtech one cannot be considered without the other.
Hence, what has happened over the last two decades is a shift in disguise, with technology at the forefront, contributing to valuing the measurement of ‘learning’ through data more than measuring what we value as part of learning. It also connects to a shift from quality to quantity, influenced by business, as reflected in Dr. Mansoor Al Awar’s statement: “Today’s institutions are more focused on managing personnel, buildings and finance rather than the crucial task of managing learning.” Yet even though he has a point here, he also claims that “[a] smart learning service or provider is founded on three pillars: accessibility, flexibility and affordability”, again giving much weight to the tech and business side of things.
Going back to the ideas of quantity at the expense of quality, the narrative of technology as the solution to problems in education, and an excessive focus on measurement, I call for a ‘less is more’ approach based on up-to-date research. Does research show that students learn more by using the latest technology? Do students learn more when given more content? Are students more skilled when the data shows high scores on tests? Using the newest technology as much as possible doesn’t necessarily generate more and deeper learning. Using learning analytics to determine what comprises successful learning doesn’t automatically prove that our students have more knowledge and skills. Instead, there is a need for “more holistic and integrative systems-level research” in the field of learning analytics in order to understand and optimize learning and learning environments. Instead of chasing tomorrow and losing today, I call for more consistency in education: focusing on less, but doing it well and purposefully. I also call for revisiting our purpose from every angle of education, to evaluate whether we still stand where we want to be and whether we are equipped for the future in an ever-changing world. So again, we need to ask ourselves: are we valuing what we measure, or are we measuring what we value?
In the next post, I will dive into the story of the pandemic and education and how we live in the midst of the narrative. And in the middle of all that, what choices do we make?
Aminah Ottosdotter Davidsson
Al Awar, M. (n.d.). Pioneering smart learning. Ellucian Europe, Middle East, Africa, India, and Asia Pacific. https://www.ellucian.com/emea-ap/insights/pioneering-smart-learning
Biesta, G. (2014). Measuring what we Value or Valuing what we Measure? Globalization, Accountability and the Question of Educational Purpose. Pensamiento Educativo: Revista de Investigación Educacional Latinoamericana, 51, 46–57. https://doi.org/10.7764/PEL.51.1.2014.5
Dawson, S., Joksimovic, S., Poquet, O., & Siemens, G. (2019). Increasing the Impact of Learning Analytics (p. 455). https://doi.org/10.1145/3303772.3303784
Institute of Educational Technology. (2012–2017). Innovating Pedagogy Reports. The Open University. https://iet.open.ac.uk/innovating-pedagogy
Ioncica, D., Dona, D., & Militaru, M. (2018). What is lost when technology wins? A study on the benefits and drawbacks of a technology-centered approach to learning. https://www.researchgate.net/publication/329879656_WHAT_IS_LOST_WHEN_TECHNOLOGY_WINS_A_STUDY_ON_THE_BENEFITS_AND_DRAWBACKS_OF_A_TECHNOLOGY-CENTERED_APPROACH_TO_LEARNING