Here’s what I said to a group of doctoral students, the incoming president of SoLAR, and a handful of other researchers and practitioners at the Learning Analytics and Knowledge Conference:
In the 1970s, a philosopher named Hubert Dreyfus began criticizing some of his colleagues at MIT who were working on early versions of Artificial Intelligence- folks like Marvin Minsky and Seymour Papert. In his seminal “What Computers Can’t Do,” Dreyfus challenged what he called biological, psychological, and epistemological assumptions- basically any a priori claim about human consciousness and being-in-the-world. Dreyfus challenged the popular conception of the mind as functioning like a computer, the idea that humans are rational agents in rule-bound systems, and the self-rationalizing ouroboros of instrumentalism.
Ultimately, what he was challenging was a scientific way of thinking about people and the world that was largely formed during the Enlightenment, but that was explicitly applied to human behavior by folks like Edward Thorndike and, later, B.F. Skinner.
It is important to question the claims and ideologies present in any discourse, as a closer interrogation might reveal internal contradictions as well as broader social and structural forces. Specifically, the method of immanent critique seeks to probe discursive claims and how they relate to, or conflict with, other discourses.
Now, I’m not trying to start a philosophical argument. Oh wait, yes I am.
I am researching the discourses (texts, media, and other representations) of learning analytics and how they interplay with the discourses related to the purpose of higher education.
When learning analytics claim that they will provide insight into the learning process, I tend to go, “huh, what is the definition of ‘learning process’?”
When higher education discourses say they will advance knowledge creation, I wonder how that jibes with adaptive learning language promising to make the acquisition of knowledge more efficient. That latter statement especially makes me wonder whether efficiency is necessarily a good thing when it comes to learning, or whether it might represent broader economic motivations.
What I like about immanent critique is its subtlety. It is not intended to replace claims or value judgements with alternative facts; it is a method of interrogation and exposure. It makes me think of Matlock: “I’m just a small town country lawyer just trying to discover the truth.”
A more theoretical discourse analysis is also useful for analyzing meaning, both in terms of local or semantic concepts and in terms of structural or paradigmatic concepts.
The goals of learning analytics are to predict the likelihood of success or failure and to turn insight into action. The phases of learning analytics are describe, diagnose, predict, and prescribe. IBM is already floating trial balloons about how cognitive computing can predict a child’s “learning styles” (a term that makes education scholars cringe) and career predilections, and custom-tailor a personalized, individualized learning pathway from kindergarten to career! Mo Rocca does the voiceover in the YouTube video, though uncredited. I swear it’s Mo Rocca.
Knewton has a video that describes how to increase or decrease the level of student agency. . .
It may be problematic to leave assumptions unexamined, especially those that are validated by their own logic system- for example, the assumption that people are rational agents within formal structures who simply need to be nudged and prodded toward correct actions. Each of these concepts needs to be unpacked.
Not to sound alarmist, as if we’re heading toward some Orwellian, dystopian future, but language matters- it influences our beliefs and social practices. Language represents power and motivation. I don’t have some maxim or platitude to offer here; and, maybe I’m slow, but I’m just hanging out at the “describe” stage. And rather than confidently diving into the “what” or the “how” of learning and education, I’m still wondering about the “why.”
In his book “Good Education in an Age of Measurement,” Gert Biesta challenges notions such as learning outcomes and evidence-based practices, as these things are not devoid of power structures.
Suggesting a critical orientation, Biesta proposes freedom as a goal of education:
we should not think of freedom as sovereignty, that is, of freedom as just doing what you want to do [but] rather. . . a ‘difficult’ notion of freedom, one where my freedom to act, that is, to bring my beginnings into the world, is always connected with the freedom of others to take initiative, to bring their beginnings into the world as well so that the impossibility to remain ‘unique masters’ of what we do is the very condition under which our beginnings can come into the world.
Now I have to admit, framing my research from this critical or philosophical orientation has caused me some ambivalence. I’m an IT administrator- a true blue technocrat. I keep systems running, and we measure success by numbers of users. But it has also broadened my perspective to really think like a humanist and ponder the ineffable qualities of being human.
I grew up in a very small town in northern Wisconsin. We lived in a hand-built log cabin. For a while we didn’t have electricity or plumbing.
My dad was a high school dropout but later earned a college degree.
I’m part Native American but I check the box for white because that’s how I identify.
In high school, a career inventory test said I should become a puppeteer. My GPA took a nosedive, but I got several scholarships because of my involvement in clubs and organizations.
It took me six years to earn my bachelor’s degree. I changed majors four times, from Business to undeclared to Philosophy to English. I failed math three times. I took a bunch of music credits after a jazz instructor who saw me drum once asked if I would be the drummer for the college jazz band. I meandered through my undergraduate experience. I changed my mind. I questioned my own motives and identity.
Many university mission statements talk about personal exploration, creativity, freedom, exposure to a broad range of subjects, and critical thinking.
I breezed through a Master’s degree in Educational Technology.
Charles Eliot, a former president of Harvard, talked about the idea of “liberalization before professionalization” as he proposed a new requirement that students go through a traditional undergraduate experience in the liberal arts tradition before moving on to a professional degree track.
I’m not sure what Learning Analytics interventions would have been applied to me or how they would have altered my life. I did not stay on track nor did I finish on time. I left and came back. Now I’m earning a doctorate and taking longer than I had planned, but don’t really feel too bad about it.
I have no regrets about my education– its errancies or pace.
That being said, my wife uses data to reach out to first generation and minority students to engage them in mentorship programs and other support services. I use data to understand tool adoption and trends to target our training interventions and messaging.
I came across this video on Vimeo a while back. It’s about a woodworker who has a school for “at risk” kids (I put the term “at risk” in quotes because I don’t really agree with it- it’s a degrading term invented by a system that values the wrong things).
I’m not sure why I find this video so inspiring . . . maybe because of the message about making as a form of learning, maybe because I like the metaphor of society as a train that is scooping up and dumping off information, or maybe because the guy reminds me a little bit of my Dad. I don’t know. If you get a chance, watch up until at least the 3:45 mark.
There’s a part of the video that talks about the kids who are kicked out of public school and end up in remedial schools until they are placed in the “court ordered community school,” which is the woodworker’s shop. The shop owner says, “they’re great kids. There isn’t anything wrong with them- they’re just doers and not sitters.”
I think about this resignation letter that decries the overt emphasis on standards and summative assessment at the expense of “developmentally appropriate” curriculum.
I think about John Medina and his suggestion that our brains are designed to be solving problems outdoors, in constant motion, and in various environmental conditions.
I think about the credit hour and its colloquial name, “seat time.”
Class sizes, enrollments: “Butts in seats.”
Sit still. Be quiet. Listen. Eyes on your own test.
I used to think the value of learning technology was self-evident and obvious, but I realize that we all view the world through the lens that is most familiar to us. My lens has been colored by 10 years of pursuing both a career and an advanced education in learning technology, or educational technology, or whatever you want to call it.
I prefer the term learning technology, because “instructional technology” sounds too teacher-centric. It implies a model of education where there are discrete facts to be learned, and the role of the educator is to simply transmit those facts from one source (a textbook, a professor’s brain, etc.) into the student. The student should remember said facts for a certain amount of time. Perhaps long enough to reproduce the facts on a test.
“Educational technology” is very broad- to me it connotes all the technologies and systems required by an institution to function and operate.
Learning technology puts the focus on learning.
I often hear about how education, and IT in particular, is a business. It’s true. We have business processes and business needs. We are a business. Until we become more like Finland or Germany and can call education a public service or a right, we must acknowledge that there are financial transactions conducted in exchange for products and services.
But what is the business that we’re in? What is the product or service that we deliver? Is it an efficient financial aid request process? Is it a cohesive web portal? Is it computer virus removal? Password reset? Wi-fi?
The reason a student attends an educational institution, one presumes, is to learn.
So what is learning technology? When I was an academic applications administrator and support specialist, I thought I knew what learning technology was. Professors would ask me for help; I would walk them through some steps; I would find efficient ways to map rosters from the SIS to the LMS. I would write documentation, and I could pronounce the word pedagogy.
When I was an instructional designer/technologist, I thought I knew what learning technology was. I managed some open source systems; I evaluated clickers and lecture capture solutions; I had an advanced knowledge of the LMS; I could regurgitate things that I had read about constructivism. I began to teach.
When I earned my master’s degree in Educational Technology, I thought I knew what learning technology was. I could wax intellectual about cognitivism and connectivism. I became an evangelist for tech because I knew the “best practices.” I was cocky about the fact that PhD’s never had any teacher training, and I could tell them what good instruction was really about (“It’s the guide on the side; not the sage on the stage,” I would say). I was managing people and platforms. I was making friends in the industry.
Eventually, I began reading the critics, the anarchists, and the humanists. Learning should be an act of rebellion against the status quo, they claimed. Education was a system of indoctrination, they believed.
I re-read the classics: Vygotsky, Dewey, Gagne, and Bloom. I researched the history of the world and the strange tapestry that education wove across time and space. I read the brain scientists and the social scientists. I drew connections between biology and art.
I know now that there are many approaches to learning. I know that social interactions are often mediated by technology. I know that knowledge and meaning are nebulous terms. I know that transformations can be behavioral or spiritual. I know that technology can be wonderful or frustrating, boring or magical, tedious or seducing. I know that the interplays between people and things, people and information, and people and people have limitless variables.
It took me ten years of a career and an education in learning technology to begin to realize that I will never really know what learning technology is (or should be). But I will continue to learn and practice.
I still love reading the critics and the anarchists. I love reading the classics. I love reading about the early behaviorists and smirking at their modern recapitulations. I get excited when I learn about a new learning theory that changes my perspective. I feel like I’m meeting a movie star when I have dinner or drinks with an author or a thought leader. I gush with joy when my friends receive grants, get published, or are honored with awards.
I speculate about the future. Some days I worry that higher education is in its death throes; or that the sexy new startups will swindle the taxpayers with promises of “better student engagement!” “retention!” and “wait, there’s more!”
Some days I beam in my daydream of flexible, personalized systems that also promote creativity and agency– technologies that spark the imagination, trigger aha! moment after aha! moment, and challenge learners to strive for excellence and engage with knowledge and each other in deeper ways than they ever thought possible.
I dream of a world where students control the technology. They create with the technology and learn how to plan, produce, and fail. I dream of a world where students learn how to solve the world’s most complex and dire problems. I dream of a world where students form strong relationships, and they develop a deep and rich appreciation for all the diverse facets of culture and nature. I dream of a world where, empowered, students learn how to learn, mediated by technology or not, and they do it well, and they love it. After all, isn’t that the business we’re in?
I sat on a panel presentation today about how to prevent cheating. My wonderful colleagues talked about the tips, tricks, best practices, and technologies that could be used to subvert cheating on quizzes and exams. When my turn was up, I offered some sheepish disclaimers that what I had to say was a complete departure from the previous speakers, because I had woken up at 4am with these strange Jerry Maguire-esque thoughts about motivation and assessment. Here is what my half-awake brain produced on the topic of cheating:
Let’s go back in time a little bit to 1993. What happened in 1993? The World Wide Web was unveiled. Within just a few short years there were millions of users and millions of web sites on the Internet. Today, these numbers are in the billions. Some interesting things happened as use of the web increased and evolved: what was originally a tool to present information or to foster communication in an alternative mode began subsuming other industries. Think about how nearly every news, information, and entertainment industry has been completely shaken by the rapid growth of the Internet. So now we have a convergence of content accessible through a single portal. This is a good thing, right? It also blurs the lines between public and private, scholarly and non-scholarly, formal and informal.
Another pattern that challenged basic assumptions about ownership and attribution is the blurred line between stealing and sharing. With the click of a button, one can “share” a song, a video, some text, etc. Many web tools even encourage sharing, remixing, and collaborative contribution. And unlike in the past, when physical media shaped the costs and incentives of production, today bits and bytes have an almost negligible material cost. Also, in an increasingly globalized world, there are alternative perspectives about copying. In some Asian countries, for instance, quoting others is a form of flattery, and Western concepts of attribution are relatively new.
So, do these online media environments impact students’ understanding of cheating?
Although Harvard and Yale began experimenting with recorded test scores and performance ranking systems in the early 1800s, the first letter grade system resembling the ones so common today was implemented by Mount Holyoke in 1897- just over 100 years ago. Before that, letter grades didn’t exist (Durm, 1993).
The percentage of A’s given to college students rose from 15% in 1960 to just over 50% in 2010. So either students are getting smarter, we’re getting better at teaching, or grade inflation is a real thing. I like to think it’s the first two, but it is probably mostly the last one.
A recent ECAR study showed that one of the top reasons students use a Learning Management System was to immediately see their grades (Dahlstrom & Bichsel, 2014).
What is a grade, anyway? It is a form of summative assessment. Summative assessment is only useful for learning if it is used as part of formative assessment strategies. By itself, summative assessment is valueless to the student in terms of development, except that it serves as an external motivator. With the increase in A’s and the increasing speed at which students receive their grades, are we simply feeding a behavioral reward stimulus? Is it an addiction? Will people do unethical things to feed an addiction?
More concerning than the motivational element alone, are we feeding self-perpetuating incentive systems; super-structures of summative assessment? What happens when students feel not only the rush of endorphins when they log into the LMS and see that A, but also the panic of not maintaining a 4.0 in an increasingly competitive world?
So do we continue to perpetuate the importance of grades because they are easy ways to measure student learning?
. . .are we feeding self-perpetuating incentive systems; super-structures of summative assessment?
To truly subvert cheating, yes, we must educate students about proper citation and attribution. That’s the low-hanging fruit. I would also suggest that as educators we model this behavior and make sure we give attribution in all our lecture notes, PowerPoint images, etc. I would also make a case that well-designed assignments and activities can make it more difficult to cheat. Consider creating assignments that promote creativity, personal reflection, or original research. The more canned or common the questions, the more likely it is students can Google the answer.
The Case for Cheating
When so much information is available to us online, we should encourage the use of this resource to find answers to questions and to solve problems. Good questions can lead to deep thinking and exploration. Also, working in groups to solve problems can actually lead to better retention than solving problems in isolation. Research from the Indian education researcher Sugata Mitra has shown the potential of students working in groups and using the Internet to solve problems together. In a post-test compared against a control group, the students who worked in groups and were able to talk and use online resources scored better than those who could only study in isolation (Mitra & Dangwal, 2011).
So, should we consider letting students take tests in groups? Maybe. Should we de-emphasize grades in our courses? Perhaps.
There may be cases where de-emphasizing grades can be effective. Consider Finland, where pass/fail systems are more common, yet its students rank among the highest on international standardized tests. In my own class, I use a three-point scale where assignments are marked as not submitted, partially complete, or complete. I use multiple choice tests only as ungraded self-check tools. And I offer assignments that are completely ungraded, yet many students complete them.
Imagine for a second that we go back in time 150 years, before letter grade systems were created and before compulsory education- and with it the concept of grades- was diffused across the world. Would we design assessment systems differently? Would we focus on learner development rather than superficial measures?