Here’s what I said to a group of doctoral students, the incoming president of SoLAR, and a handful of other researchers and practitioners at the Learning Analytics and Knowledge Conference:
In the 1970s a philosopher named Hubert Dreyfus began criticizing some of his colleagues at MIT who were working on early versions of Artificial Intelligence- folks like Marvin Minsky and Seymour Papert. In his seminal “What Computers Can’t Do,” Dreyfus challenged what he called biological, psychological, and epistemological assumptions- basically, any a priori claim about human consciousness and being-in-the-world. Dreyfus challenged a popular conception of the mind as functioning like a computer, the notion that humans are rational agents in rule-bound systems, and the self-rationalizing ouroboros of instrumentalism.
Ultimately, what he was challenging was a scientific way of thinking about people and the world that was largely formed during the Enlightenment, but that was explicitly applied to human behavior by folks like Edward Thorndike and, later, B.F. Skinner.
It is important to question the claims and ideologies present in any discourse, as a closer interrogation might reveal internal contradictions as well as broader social and structural forces. Specifically, the method of Immanent Critique seeks to probe discursive claims and how they relate to or conflict with other discourses.
Now, I’m not trying to start a philosophical argument. Oh wait, yes I am.
I am researching the discourses (texts, media, and other representations) of learning analytics and how they interplay with the discourses related to the purpose of higher education.
When learning analytics claim that they will provide insight into the learning process, I tend to go, “huh, what is the definition of ‘learning process’?”
When higher education discourses say they will advance knowledge creation, I wonder how that jibes with adaptive learning language promising to make the acquisition of knowledge more efficient. Especially in that latter statement, I wonder if efficiency is necessarily a good thing when it comes to learning, or if it might represent broader economic motivations.
What I like about immanent critique is its subtlety. It is not intended to replace claims or value judgments with alternative facts; it is a method of interrogation and exposure. It makes me think of Matlock: “I’m just a small-town country lawyer just trying to discover the truth.”
A more theoretical discourse analysis is also useful for analyzing meaning, both in terms of local or semantic concepts and in terms of structural or paradigmatic concepts.
The goal of learning analytics is to predict the likelihood of success or failure and to turn insight into action. The phases of learning analytics are describe, diagnose, predict, and prescribe. IBM is already floating trial balloons about how cognitive computing can predict a child’s “learning styles” (a term that makes education scholars cringe) and career predilections, and custom-tailor a personalized, individualized learning pathway from kindergarten to career! Mo Rocca does the voiceover in the YouTube video, though uncredited. I swear it’s Mo Rocca.
Knewton has a video that describes how to increase or decrease the level of student agency...
It may be problematic to leave assumptions unexamined, especially those that are validated by their own logic system. Consider the assumption that people are rational agents within formal structures who simply need to be nudged and prodded toward correct actions. Each of these concepts needs to be unpacked.
Not to sound like an alarmist warning that we’re heading toward some Orwellian or dystopian future, but language matters- it influences our beliefs and social practices. Language represents power and motivation. I don’t have some maxim or platitude to offer here; and, maybe I’m slow, but I’m just hanging out here at the “describe” stage. Rather than confidently diving into the “what” or the “how” of learning and education, I’m still wondering about the “why.”
Gert Biesta, in his book “Good Education in an Age of Measurement,” challenges notions such as learning outcomes and evidence-based practices, as these things are not devoid of power structures.
Suggesting a critical orientation, Biesta proposes freedom as a goal of education:
we should not think of freedom as sovereignty, that is, of freedom as just doing what you want to do [but] rather. . . a ‘difficult’ notion of freedom, one where my freedom to act, that is, to bring my beginnings into the world, is always connected with the freedom of others to take initiative, to bring their beginnings into the world as well so that the impossibility to remain ‘unique masters’ of what we do is the very condition under which our beginnings can come into the world.
Now I have to admit, framing my research from this critical or philosophical orientation has caused me some ambivalence. I’m an IT administrator- a true-blue technocrat. I keep systems running, and we measure success by numbers of users. But it has also broadened my perspective to really think like a humanist and ponder the ineffable qualities of being human.
I grew up in a very small town in northern Wisconsin. We lived in a hand-built log cabin. For a while we didn’t have electricity or plumbing.
My dad was a high school dropout but later earned a college degree.
I’m part Native American but I check the box for white because that’s how I identify.
In high school I took a career inventory test that said I should become a puppeteer. My GPA took a nosedive, but I got several scholarships because of my involvement in clubs and organizations.
It took me six years to earn my bachelor’s degree. I changed majors four times, from Business to undeclared to Philosophy to English, and I failed math three times. I took a bunch of music credits after a jazz instructor who saw me drum once asked if I would be the drummer for the college jazz band. I meandered through my undergraduate experience. I changed my mind. I questioned my own motives and identity.
Many university mission statements talk about personal exploration, creativity, freedom, exposure to a broad range of subjects, and critical thinking.
I breezed through a Master’s degree in Educational Technology.
Charles Eliot, the former president of Harvard, talked about the idea of “liberalization before professionalization” when he proposed a new requirement that students go through a traditional undergraduate experience in the liberal arts tradition before going on to a professional degree track.
I’m not sure what Learning Analytics interventions would have been applied to me or how they would have altered my life. I did not stay on track nor did I finish on time. I left and came back. Now I’m earning a doctorate and taking longer than I had planned, but don’t really feel too bad about it.
I have no regrets about my education– its errancies or pace.
That being said, my wife uses data to reach out to first generation and minority students to engage them in mentorship programs and other support services. I use data to understand tool adoption and trends to target our training interventions and messaging.
My four-year-old is learning how to write. There is some concern at his preschool that this is not happening fast enough. My first reaction to this concern was one of dismay. Then I realized the complexity of the situation. For one, there is not much modeling happening at home: his mother and I are both highly educated, his sister is high-achieving and academically advanced, and we all spend a lot of time reading and writing, but these activities are performed on glowing electronic rectangles, not with pencils and paper. I wonder if we should skip the handwriting and teach my son how to type (he’s already quite adept at touch screens, on multiple OSes even- is that important? or measurable?).
I don’t want this to come across as hyperbolic tech evangelism (“books will become obsolete; therefore, iPads for all toddlers!”), nor do I want it to be neo-Luddite dismissal (“no glowing rectangles allowed in my home, because they rot your brain!”). And this is certainly not an indictment of my son’s preschool (they’re great). But I wonder if there are different ways to approach measurement and development.
Right now it seems like the unwritten rules of educational institutions are designed to be self-reinforcing:
There are right ways to do things and right ways to know things
The use of technology shall be strictly monitored and controlled to prevent disruptive or distracted behavior
What worked for previous generations works for current generations
The memorization of words and symbols shall be the primary objective of most learning activities
The categorization of individuals into year-based grades is too entrenched to reconsider
The categorization of learning into disciplinary subjects is too entrenched to reconsider
Uniformity is equitable
Everyone functions in a scheduled, structured environment
Introversion is a disorder
I don’t have a fully developed alternative model, but off the top of my head there are some additional assumptions that could be applied to the incumbent conventions of educational institutions, much like Creative Commons licenses are layered on top of traditional copyright to reimagine the concepts of ownership. Here’s what a sort of Creative Commons bolt-on set of rules for education might look like:
Behavior and learning should not be conflated
Students should have opportunities to demonstrate and to teach each other things they are good at
Our bodies and moods have lots to communicate; our environments should be accommodating by providing some freedom and flexibility (e.g., be alone or quiet if we feel like it; not eat if we’re not hungry, etc.)
Positive encouragement is good but should not be confused with rewarding apathy or inactivity (which is bad)
Creativity should be encouraged
Things have a tendency to resolve themselves naturally
Anything can be interesting
Passion and love are contagious
I know my son will learn how to write. And he will be great at it, if he wants to be.
Since coming to Boise State almost two years ago, I have been exposed to a different set of organizational and cultural approaches to technology management. These have had a very positive impact on my practice and on how I consider the holistic impact that new initiatives might have on a variety of institutional resources.
In my current capacity as Interim Director of Learning Technology Solutions (a new department charged with helping to bridge the cultures of IT and Academics), I have partnered with an individual who comes from a corporate project and service management background (Daniel Gold). It has been interesting for both of us to learn more about each other’s respective disciplines (Project Management and Education), as we are finding common theoretical foundations in psychology, sociology, and philosophy that have informed the methodologies and practices of these disciplines.
Together, we are developing what we are calling the PACE Framework. It is a method of evaluation that groups educational technology initiatives into four main categories:
Pilot, Assess, Compare, and Enhance
For each of these, there will be a scoring rubric that helps us prioritize resources and scope the complexity of the initiative. This rubric will include alignment with campus strategies, pedagogical needs, and estimates of cost and resource needs. If the appropriate initiative is a Pilot of new or emerging technologies, we will employ a template that integrates PMI methodologies with Instructional Design methodologies. In general, our approaches to educational technology management all reflect a kind of hybrid that blends PMBOK techniques with Instructional Design models such as ADDIE and Backward Design.
Each initiative will have checkpoints that allow the project participants to reflect on the process and progress on a kind of meta-level and determine how (or whether) to proceed.
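To make the rubric idea concrete, here is a minimal sketch in Python of how a weighted scoring rubric like this could work. The criteria names, weights, and 0-5 rating scale are hypothetical placeholders for illustration, not our actual PACE rubric:

```python
# Hypothetical PACE-style scoring sketch. Criteria and weights
# below are illustrative placeholders, not the actual rubric.
CRITERIA_WEIGHTS = {
    "strategic_alignment": 3,    # fit with campus strategies
    "pedagogical_need": 3,       # instructional value
    "cost_estimate": 2,          # lower cost rates higher
    "resource_availability": 2,  # staff time, infrastructure
}

def score_initiative(ratings):
    """Weighted sum of 0-5 ratings; a higher score means higher priority."""
    total = 0
    for criterion, weight in CRITERIA_WEIGHTS.items():
        rating = ratings.get(criterion, 0)  # unrated criteria count as 0
        if not 0 <= rating <= 5:
            raise ValueError(f"rating for {criterion} must be between 0 and 5")
        total += weight * rating
    return total

def prioritize(initiatives):
    """Sort a dict of {initiative name: ratings} by descending score."""
    return sorted(initiatives,
                  key=lambda name: score_initiative(initiatives[name]),
                  reverse=True)
```

The same weighted-checklist logic works just as well in a spreadsheet; the point is simply that each criterion gets rated, the ratings get weighted, and initiatives get ranked by total score so that resource conversations start from a shared baseline.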
We are excited at how these instruments and methodologies will be utilized here at Boise State University to help us establish a cohesive digital ecosystem that supports and promotes effective and innovative teaching and learning.
For samples of specific rubrics, templates, or our best practices in terms of governance, communication, etc., please feel free to contact me:
Daniel and I would be happy to discuss or present these topics.
I used to think the value of learning technology was self-evident and obvious, but I realize that we all view the world through the lens that is most familiar to us. My lens has been colored by 10 years of pursuing both a career and an advanced education in learning technology, or educational technology, or whatever you want to call it.
I prefer the term learning technology, because “instructional technology” sounds too teacher-centric. It implies a model of education where there are discrete facts to be learned, and the role of the educator is to simply transmit those facts from one source (a textbook, a professor’s brain, etc.) into the student. The student should remember said facts for a certain amount of time. Perhaps long enough to reproduce the facts on a test.
“Educational technology” is very broad- to me it connotes all the technologies and systems required by an institution to function and operate.
Learning technology puts the focus on learning.
I often hear about how education, and how IT in particular, is a business. It’s true. We have business processes and business needs. We are a business. Until we become more like Finland or Germany and can call education a public service or a right, we must acknowledge the fact that there are financial transactions conducted in exchange for products and services.
But what is the business that we’re in? What is the product or service that we deliver? Is it an efficient financial aid request process? Is it a cohesive web portal? Is it computer virus removal? Password reset? Wi-fi?
The reason a student attends an educational institution, one presumes, is to learn.
So what is learning technology? When I was an academic applications administrator and support specialist, I thought I knew what learning technology was. Professors would ask me for help; I would walk them through some steps; I would find efficient ways to map rosters from the SIS to the LMS. I would write documentation, and I could pronounce the word pedagogy.
When I was an instructional designer/technologist, I thought I knew what learning technology was. I managed some open source systems; I evaluated clickers and lecture capture solutions; I had an advanced knowledge of the LMS; I could regurgitate things that I had read about constructivism. I began to teach.
When I earned my master’s degree in Educational Technology, I thought I knew what learning technology was. I could wax intellectual about cognitivism and connectivism. I became an evangelist for tech because I knew the “best practices.” I was cocky about the fact that PhD’s never had any teacher training, and I could tell them what good instruction was really about (“It’s the guide on the side; not the sage on the stage,” I would say). I was managing people and platforms. I was making friends in the industry.
Eventually, I began reading the critics, the anarchists, and the humanists. Learning should be an act of rebellion against the status quo, they claimed. Education was a system of indoctrination, they believed.
I re-read the classics: Vygotsky, Dewey, Gagne, and Bloom. I researched the history of the world and the strange tapestry that education wove across time and space. I read the brain scientists and the social scientists. I drew connections between biology and art.
I know now that there are many approaches to learning. I know that social interactions are often mediated by technology. I know that knowledge and meaning are nebulous terms. I know that transformations can be behavioral or spiritual. I know that technology can be wonderful or frustrating, boring or magical, tedious or seducing. I know that the interplays between people and things, people and information, and people and people have limitless variables.
It took me ten years of a career and an education in learning technology to begin to realize that I will never really know what learning technology is (or should be). But I will continue to learn and practice.
I still love reading the critics and the anarchists. I love reading the classics. I love reading about the early behaviorists and smirking at their modern recapitulations. I get excited when I learn about a new learning theory that changes my perspective. I feel like I’m meeting a movie star when I have dinner or drinks with an author or a thought leader. I gush with joy when my friends receive grants, get published, or are honored with awards.
I speculate about the future. Some days I worry that higher education is in its death throes; or that the sexy new startups will swindle the taxpayers with promises of “better student engagement!” “retention!” and “wait, there’s more!”
Some days I beam in my daydream of flexible, personalized systems that also promote creativity and agency– technologies that spark the imagination, trigger aha! moment after aha! moment, and challenge learners to strive for excellence and engage with knowledge and each other in new and deeper ways they never thought possible.
I dream of a world where students control the technology. They create with the technology and learn how to plan, produce, and fail. I dream of a world where students learn how to solve the world’s most complex and dire problems. I dream of a world where students form strong relationships, and they develop a deep and rich appreciation for all the diverse facets of culture and nature. I dream of a world where, empowered, students learn how to learn, mediated by technology or not, and they do it well, and they love it. After all, isn’t that the business we’re in?
I sat on a panel presentation today about how to prevent cheating. My wonderful colleagues talked about the tips, tricks, best practices, and technologies that could be used to subvert cheating on quizzes and exams. When my turn came, I offered some sheepish disclaimers that what I had to say was a complete departure from the previous speakers, because I had woken up at 4am with these strange Jerry Maguire-esque thoughts about motivation and assessment. Here is what my half-awake brain produced on the topic of cheating:
Let’s go back in time a little bit, to 1993. What happened in 1993? The World Wide Web was unveiled. Within just a few short years there were millions of users and millions of web sites on the Internet. Today, these numbers are in the billions. Some interesting things happened as use of the web increased and evolved: what was originally a tool to present information or to foster communication in an alternative mode began subsuming other industries. Think about how nearly every news, information, and entertainment industry has been completely shaken by the rapid growth of the Internet. So now we have a convergence of content accessible through a single portal. This is a good thing, right? It also blurs the lines between public and private, scholarly and non-scholarly, formal and informal.
Another pattern that challenged basic assumptions about ownership and attribution is the blurred line between stealing and sharing. With the click of a button, one can “share” a song, a video, some text, etc. Many web tools even encourage sharing, remixing, and collaborative contribution. And, unlike in the past, when physical media shaped the costs and incentives for production, today bits and bytes have an almost negligible material cost. Also, in an increasingly globalized world, there are alternative perspectives on copying. In some Asian countries, for instance, quoting others is a form of flattery, and Western concepts of attribution are relatively new.
So, do these online media environments impact students’ understanding of cheating?
Although Harvard and Yale began experimenting with recorded test scores and performance ranking systems in the early 1800s, the first letter grade system resembling the ones so common today was implemented by Mount Holyoke in 1897- just over 100 years ago. Before that, grades didn’t exist (Durm, 1993).
The number of A’s given to college students rose from 15% in 1960 to just over 50% in 2010. So either students are getting smarter, we’re getting better at teaching, or grade inflation is a real thing. I like to think it’s the first two, but it is probably mostly the last one.
A recent ECAR study showed that one of the top reasons students use a Learning Management System was to immediately see their grades (Dahlstrom & Bichsel, 2014).
What is a grade, anyway? It is a form of summative assessment. Summative assessment is only useful for learning if it is used as part of formative assessment strategies. By itself, summative assessment is valueless to the student in terms of development, except that it serves as an external motivator. With the increase in A’s and the increasing speed at which students receive their grades, are we simply feeding a behavioral reward stimulus? Is it an addiction? Will people do unethical things to feed an addiction?
More concerning than the motivational element alone: are we feeding self-perpetuating incentive systems, super-structures of summative assessment? What happens when students feel not only the rush of endorphins when they log into the LMS and see that A, but also the panic of not maintaining a 4.0 in an increasingly competitive world?
So do we continue to perpetuate the importance of grades because they are easy ways to measure student learning?
To truly subvert cheating, yes, we must educate students about proper citation and attribution. That’s the low-hanging fruit. I would also suggest that as educators we model this behavior and make sure we give attribution in all our lecture notes, PowerPoint images, etc. I would also make a case that well-designed assignments and activities can make it more difficult to cheat. Consider creating assignments that promote creativity, personal reflection, or original research. The more canned or common the questions, the more likely it is students can Google the answer.
The Case for Cheating
When so much information is available to us online, we should encourage the use of this resource to find answers to questions and to solve problems. Good questions can lead to deep thinking and exploration. Also, working in groups to solve problems can actually lead to better retention than solving problems in isolation. Research from the Indian educator Sugata Mitra has shown the potential of students working in groups and using the Internet to solve problems together. In a post-test compared against a control group, the students who worked in groups and were able to talk and use online resources scored better than those who could only study in isolation (Mitra & Dangwal, 2011).
So, should we consider letting students take tests in groups? Maybe. Should we de-emphasize grades in our courses? Perhaps.
There may be examples where de-emphasizing grades can be effective. Consider Finland, where pass/fail systems are more common, yet students rank among the highest on international standardized tests. In my own class, I use a three-scale model where assignments are either non-existent, partially complete, or complete. I use multiple choice tests only as ungraded self-check tools. And I offer assignments that are completely ungraded, yet many students complete them.
Imagine for a second that we go back in time 150 years, before letter grade systems were created, before compulsory education was diffused across the world, and with it the concept of grades. Would we design assessment systems differently? Would we focus on learner development rather than superficial measures?
My dad once told me that for him, the problem with making art was that the artist must strive to create a piece that captures the entire world and gives it back in a meaningful way. This is both impossible and the greatest aim.
I don’t write because I know it will be largely unread, pedestrian, and straddling the line between self-deprecation and narcissism.
But maybe we should all try to capture the language that makes sense to us and our situation in time and place. Camus implores us to rail against torpidity. Rorty reminds us that language will change anyway. Vygotsky assures us that a multitude is a good thing. And maybe it gives something back.
I don’t write because I am so humbled by and in awe of the people I admire.
But, the ethos that has been a constant thread in my life is that everyone has an important voice if their message is told with sincerity, passion, and clarity. The cacophony of real human voices, of individuals revealing themselves in a morass of media that is otherwise dominated by the dominant, might just paint an authentic picture of humanity, and, one hopes, add meaning to all of them and to each of them.
So while I do not write, I do think everyone should.