The Rise and Fall of the Twitter EdChat

For over a decade, the hashtag #Edchat allowed educators using Twitter to gather, share, and commiserate. However, as the platform formerly known as Twitter transitioned into X, the landscape of this digital discussion space shifted dramatically. Drawing on recent research, this post explores how educators have used #Edchat over time, the stressors inherent in this kind of social media use, and the history of a community in transition. Researchers recognize that, for a variety of reasons, Twitter and Twitter chats are far less influential than they were a few years ago. They argue that studying the edchat phenomenon historically may hold lessons for social media platforms generally, and specifically for those hoping to involve educators.

The Golden Era of #Edchat: Purpose and Participation

In its prime, #Edchat provided a means for informal professional development. Research by Willet (2019) and later Willet and colleagues analyzed hundreds of thousands of tweets to understand exactly how and why educators were using the platform. The study identified several key types of engagement:

  • Resource Sharing: The most common use case, where teachers curated and distributed lesson plans, EdTech tools, links, and articles.
  • Pedagogical Debate: Scheduled weekly chats allowed for deep dives into specific topics, from classroom management to the integration of AI.
  • Social Support: Perhaps most importantly, it provided a space for “digital social support,” helping teachers feel less isolated in their professional struggles.

This era was defined by a sense of “augmented intelligence,” in which the collective knowledge of the network enhanced individual teachers’ expertise.

2008–2014: The Golden Era of Synchronous Twitter Connection

The #Edchat phenomenon began in October 2008. This early era was defined by the weekly Tuesday night chat, a highly structured synchronous event that became a must-attend for thousands of educators. The Tuesday night slot did not result from any official declaration, but once established, it became the default for those wanting to participate in a common chat.

I encourage anyone interested in the topic of teacher use of social media to read the two Willet references I provide here. These researchers had access to what I have heard described as the Twitter firehose, which was available until 2023, and downloaded over 15 million tweets carrying the #edchat hashtag across 15 years of use. Unlike researchers who implemented much smaller projects and relied on human raters to classify and quantify characteristics of such chats, Willet and colleagues used data analysis tools that quantified specific characteristics (questions, responses, links, secondary tags, retweets) and even tools that attempted to identify themes based on terms appearing in the tweets. These characteristics were then mapped against years to identify trends.

This approach allowed the researchers to address questions that could only be answered at this massive scale. What trends could be observed over the history of the edchat phenomenon that might otherwise have been overlooked in the data? For example, the way edchats evolved interacted with the capabilities of the Twitter platform. Tags were a user-applied innovation that was later integrated into the tool as a built-in capability. Tuesday night became the impromptu time for edchats, which took on a formalized approach. A chat leader would provide a series of prompts identified as Q1, Q2, Q…, and participants would respond using R1, R2, R…. Other tweets could be added within the rough synchronous time frame defined by the prompts. Because chats were stored, others could review a session at their leisure and add their own contributions.
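To make the automated coding concrete, here is a small sketch of my own (in Python, and far simpler than the researchers' actual pipeline) showing how individual tweets might be tagged for the surface features and chat markers just described:

import re

def code_tweet(text):
    # Tag one tweet with the kinds of surface features the analyses counted
    return {
        "is_retweet": text.startswith("RT @"),
        "is_reply": text.startswith("@"),
        "is_question": "?" in text,
        "has_link": "http" in text,
        "chat_marker": bool(re.search(r"\b[QR]\d+\b", text)),  # Q1/R1-style prompts
        "secondary_tags": [t for t in re.findall(r"#\w+", text)
                           if t.lower() != "#edchat"],
    }

print(code_tweet("RT @teacher: Q1 How do you handle late work? #edchat #grading"))

Counts of features like these, aggregated by day and year, are the raw material for the trend analyses described above.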

Just to be clear for those unfamiliar with the reason for this format, the experience was based on a kludge of sorts. By searching for #edchat, you could follow the sequence of questions, responses, and related comments in real time, separate from other Twitter chatter. The hashtag functioned as a sort of portal, focusing Twitter use on the chat content and turning Twitter from an asynchronous into a synchronous tool. Tools other than Twitter, such as TweetDeck (no longer available), even allowed users to create a multi-column display, with individual columns focused on specific tags and updated automatically. Only the #edchat contributions would then be displayed within one column. These tools became popular as an easy way to turn the Twitter feed into a synchronous experience unique to those using the #edchat hashtag.

Research shows that between 2009 and 2014, these Tuesday sessions saw significantly higher engagement than other days, characterized by genuine dialogue and a high volume of questions and replies. Teachers weren’t just “knowledge telling”; they were building communities of practice and exploring new pedagogical ideas in real-time.

The researchers had a special interest in the frequency of questions and replies, and in the ratio between the two, assuming these variables would be a good way to assess interaction. They also asked how these variables differed between Tuesday and other days, assuming any difference would reflect the higher likelihood of synchronous interaction on Tuesdays. Replies made up a higher proportion of tweets on Tuesdays and were significantly higher in the earlier years. My interpretation differs from the researchers'. They argued that there was a decrease in interaction in later years. My interpretation is that the chats drifted away from the formal structure of questions interspersed with participants' answers. Relying on massive scale and automated methods rather than human raters following the give and take of individual sessions may have led to these different interpretations.

2014–2018: The Shift from Dialogue to Broadcasting

At its peak in 2017–2018, #Edchat was a massive digital footprint, averaging 120,000 tweets per month and involving roughly 200,000 different users. However, beneath these impressive numbers, the nature of the interaction was shifting. Starting around 2014, Willet and other researchers observed a transition from authentic conversation toward broadcast-style communication.

Several key trends marked this transformation:

The Rise of the Link: While early chats focused on natural discussion, later years saw a sharp increase in posts containing hyperlinks to external content, suggesting the platform was becoming a repository for resource sharing rather than deep discussion.

Retweet Dominance: Retweets began to outnumber original posts, and the percentage of questions receiving replies dropped significantly. Individuals could use retweets to expose their own followers, who were not chat participants, to the chat content.

Exploitation: As the hashtag grew in popularity, it became a target for spam and self-promotion. By 2018, the community faced a spike in problematic content and a decline in “authenticity scores” as commercial interests exploited the tag for marketing.

The transition to less interaction and greater influencer dominance may also be related to the active/passive distinction that researchers have begun to study in social media activity. A focus on Twitter chats as a source of resources rather than dialogue is consistent with this line of research.

You can still find #edchat in use on X, Bluesky, and Mastodon instances. The tag is now typically used to flag educational content and is seldom used within a chat sequence. A few chats can still be found, though they now often use more idiosyncratic tags.

The Paradox of Digital Support

I wrote a series of posts beginning in 2013 focused on edchats, mostly questioning the information value of the process. I appreciated the camaraderie the chats offered, but Twitter's character limit, along with my reaction to the content included in such chats, led me to believe the experience was very inefficient, and I proposed tactics I thought would increase the professional development value of the experience.

Edchats were often included as one experience within the graduate technology course I taught, and I proposed, without success, that students might analyze the content of such chats as a potential thesis project. My suggestion at the time was that video-based systems (e.g., Skype, Zoom) would allow a much more productive approach.

2018–2023: Volatility, X, and Fragmentation

The decline of #Edchat accelerated after 2018, driven by platform volatility and the eventual transition of Twitter into X. Changes in leadership and algorithm priorities disrupted the organic reach of educational hashtags. As the environment became more polarized, many educators began to migrate to other platforms like Instagram, Mastodon, or niche, specialized communities that better served their specific needs.

By 2023, the once-unified #Edchat community had largely fragmented. This decline highlights a critical vulnerability: digital spaces on commercial platforms often lack the stability and continuity of traditional professional development. When profit extraction and algorithmic shifts override user experience, the community suffers.

Lessons for the Future

The history of #Edchat is a reminder that while platforms change, the human need for collaboration remains constant. The legacy of this 15-year experiment suggests that for future digital communities to succeed, they must:

1. Prioritize Active Participation: Moving beyond passive consumption is essential to avoid the stress of social comparison.

2. Foster Authentic Dialogue: Successful communities require mechanisms that encourage genuine interaction over simple content broadcasting.

3. Shift to Knowledge Building: The goal of any digital faculty lounge should be to move from merely “telling” knowledge to collaboratively building it.

Perhaps online interaction among educators isn’t gone; it is simply evolving. As educators move toward new tools, the story of #Edchat serves as both a testament to the power of digital connection and a cautionary tale about the challenges of sustaining authentic community in commercial environments.

I have tried to identify where those educators interested in online interaction with peers went. I could not find the type of quantified data provided by Willet, but other researchers (Greenhow and colleagues) have suggested that Facebook groups and Instagram have become favored sites for interaction.

Sources:

Greenhow, C., Galvin, S. M., Brandon, D. L., & Askari, E. (2020). A decade of research on K–12 teaching and teacher learning with social media: Insights on the state of the field. Teachers College Record, 122(6), 1-72.

Willet, K. (2019). Revisiting how and why educators use Twitter: Tweet types and purposes in #Edchat. Journal of Research on Technology in Education, 51(3), 273-289.

Willet, K., Carpenter, J., & Na, H. (2025). Ex-Edchat: Historic retrospective of X/Twitter #Edchat. Computers & Education, 241, 1-18.


Evaluating the Consequences of School Choice

The education of learners in the K-12 range is undergoing a noticeable shift as the concept of “school choice” has become a political wedge issue, and state-level legislative policy decisions have given parents and their kids greater control over where and under what conditions students are educated. While often framed from the perspective of the individual family, I have always felt this perspective is too narrow; decisions made for the benefit of one family also impact the experiences of others. In various ways, we are all stakeholders, whether as citizens living in a society dependent on an educated population, as state and local taxpayers who fund the schools, or as students and families involved in school experiences.

While I have followed the issue of school choice for years, this post was prompted by a recent EdWeek article – As School Choice Goes Universal, What New Research Is Showing. Before I comment on the article, here is some background on K-12 school choice.

School choice is the opportunity for parents to enroll their child in an elementary or secondary school other than the assigned school based on their home address. There are multiple variations. The optional school may be another public school in a different district or a public magnet or charter school. Funding for charter schools, which typically operate with a separate board from the public school district, may be public or private. Private schools can be further differentiated as parochial (religious emphasis) or independent. 

My personal interest is mainly in the impact of tax-based funding, as I see funding as a nearly zero-sum variable – schools compete for a fixed pot of money dependent on student enrollment. Private schools rely on tuition paid by families, contributions, and, increasingly, tax money collected and then distributed to the school in which the student is enrolled. When school choice sends tax money to a private school, the money may arrive as a voucher or through a family-controlled education savings account (ESA). A voucher might be thought of as the cut of tax money a school receives from the state – the per-pupil expenditure – though by my understanding it typically does not include the local revenue based on property taxes. An ESA provides funds parents can use for several forms of educational assistance – tutoring, textbooks, and private school tuition (source – Overview of Public and Private School Choice Options).

According to the EdWeek source I identified previously, 18 states now allow all students to use state funds to attend private schools, and approximately 1.5 million students participate across the 30 states that allow at least some students access to such programs. An example of the type of situation that accounts for the difference might be an allowance for students attending what are considered low-performing schools to use such resources.

My interest has been in the academic achievement of students who leave their designated public school and in the more nuanced issue of what happens to public schools when a sizeable number of students, and the funds associated with those students, leave. Despite all of the research on such topics, the EdWeek analysis notes that only one of the 18 states identified as providing all students access to state funds makes use of the same standardized achievement test in both public and private schools. Obviously, having large numbers of participants from both public and private institutions taking the same tests would offer the cleanest and most powerful comparison of academic impact.

Proposed advantages and disadvantages of each approach to K-12 education

At present, the political winds seem to be blowing toward greater parental choice. This is the case despite the lack of consistent findings on whether such choice benefits student achievement. Depending on the sample of students used and the method used to quantify achievement, studies generate every possible outcome. The EdWeek attempt at a summary concludes that preliminary studies of earlier iterations of these programs have shown “neutral to negative” effects on state test scores, though some programs, like Ohio’s EdChoice, have demonstrated positive outcomes for graduation and college-going rates.

What follows is my attempt to summarize the advantages and disadvantages based on two books by Diane Ravitch that I have read. Dr. Ravitch is a defender of public education, as you can probably tell from the titles of her books, but pros and cons are more about the arguments than the data. Ravitch’s work is heavily grounded in research findings, but as I have already indicated, the subissues are so complex that it is very difficult to offer general conclusions. The issue that interests me – what happens to the public schools when students leave (the final con described below under cons of private schools) – has not, to my knowledge, been a research focus, as an appropriate methodology would be very difficult to put together.

Pros of Public Schools:

  • Universal Service and Stability: Public schools are charged with serving all children, providing not just education but also essential social services like nutritious meals, medical care, and mental health counseling. 
  • Accountability and Transparency: Public schools operate under strict state regulations and testing mandates, ensuring a level of transparency regarding student progress and the use of taxpayer funds that can be absent in the private sector.
  • Professionalism: Public schools generally require higher standards for teacher certification and may provide due process (tenure) to protect academic freedom.

Cons of Public Schools:

  • Impact of Poverty and Segregation: The biggest “con” of the public system is often beyond its direct control. Concentrated poverty and racial segregation significantly drive lower academic performance, and schools alone cannot solve these structural societal issues.
  • Curriculum Narrowing: Due to high-stakes testing mandates, public schools may reduce time for the arts, history, and physical education to focus more on tested subjects like math and reading.
  • Bureaucracy and Funding Disparities: Public schools are often burdened by intrusive regulations and suffer from persistent underfunding, particularly in districts with low property tax bases.

Pros of Private Schools:

  • Autonomy and Flexibility: Private schools enjoy the freedom to design their own curricula and select their own testing methods, allowing them to cater to specific educational philosophies or religious values.
  • Personalization and Choice: Families can select schools that align with their specific needs or interests, whether through small religious schools or specialized academies. This “market-driven” approach appeals to values of freedom and optimism.

Cons of Private Schools:

  • Lack of Oversight: Some private schools receiving public funds are not accredited and are exempt from state accountability systems. This makes it difficult for the public to evaluate whether tax dollars are producing academic results.
  • Selective Enrollment: Unlike public schools, private institutions can be selective. Critics argue this leads to greater segregation and less equity, as schools may shun students with the highest needs or those who are the “toughest challenge” to teach.
  • Draining Public Resources: Every dollar diverted to a private school voucher is a dollar removed from the public school system, which still incurs fixed costs such as building maintenance and teacher salaries. This can lead to increased class sizes and program cuts in the remaining public schools.

Summary

Parents and their kids are being allowed greater control over where they attend school. School choice comes with multiple pros and cons, and these issues have been difficult to evaluate because public and private schools in most states are not required to use the same achievement tests. My personal interest is in what happens to the public schools that lose students to private schools when choice is allowed.

Sources for Pro and Con Section

Ravitch, D. (2014). Reign of error: The hoax of the privatization movement and the danger to America’s public schools. Vintage.

Ravitch, D. (2020). Slaying Goliath: The passionate resistance to privatization and the fight to save America’s public schools. Vintage.


RSS – Be Your Own Content Platform

Tim Wu, in his recent book, The Age of Extraction: How Tech Platforms Conquered the Economy and Threatened Our Future, examines the timeline of a variety of platforms and the manner in which they consistently morph from initially attractive, innovative, and genuinely helpful resources into systems that become confining, controlling, and ultimately draining for their users. Using examples that range from Facebook and Amazon to UnitedHealth, he argues that this transformation, from open utility to extractive gatekeeper, is not an accidental side effect but rather a predictable, structural characteristic of platform business models as they achieve scale and market dominance.

In considering the examples from Wu’s book, it occurred to me that while he emphasized the major players a wide variety of people use, the same issues apply to smaller platforms. Those of us who write and use platforms to share our work (e.g., Substack, Reddit, Medium) have likely experienced the same timeline.

Many authors who write books with a message similar to The Age of Extraction do a great job of explaining the problem and its history, but even when they make the effort, they offer little as a remedy. I have read many such books. I typically find myself contemplating but failing to generate suggestions that would augment what the author was able to offer.

Like Wu, I bought into the original promise of the Internet as a leveling platform that would give content creators, sellers, and the “little guy” in general greater opportunities. In the early days (2002), I started a blog and ran it from a server that was also my desktop computer (I worked at a university and had a dedicated IP address, which was more of the challenge than the server itself; any Mac could easily be used as one). Things change. I now pay a hosting company a couple hundred dollars a year to run blog software and the related backend database and to register my domain name. Still, as a hobby, once I pay for the space, I can function independently.

I believe the way we create and share content has changed. You can still do it, but it seems you have fewer and fewer regular readers. I have noticed a change suggesting more and more of my posts are found through search rather than by readers who regularly visit the blog. I track hits out of curiosity and find little immediate interest in most posts. When I check, say, six months or a year later, I find that some posts have been read hundreds of times. Logically, I interpret this to mean I have written something that people found through search. I shouldn’t complain about this, but the switch from pure search to the AI search now being developed by the big platforms means there will be far less attention to source material once an AI summary based on this homogenized and integrated material is made available. This is an emerging but, I think, obvious issue and a perfect example of what Wu means by platform extraction.

The big switch (the title of another book) toward extractive platforms has resulted from a) integrative platforms such as those I have already mentioned hosting multiple content creators, and b) a related move away from the use of RSS readers by individual consumers. I certainly understand the benefits of single-stop platforms that provide a convenient way to reach a wide audience. My complaint is based on the history of these platforms. The pattern of extraction is evident: start by offering a service in which the platform and content creators share the risk and the rewards, and once a critical mass for a network effect is achieved, reduce the benefits to the producers and the consumers. Wu suggests Amazon is a familiar example of this approach.

I do post my content to one of these community platforms and continue to post the same content to my own blog. Yes, this means I pay twice and I continue to be frustrated by this situation. One approach allows me to own my content and the other to reach a larger audience – for a price.

My solutions:

I do have suggestions for an alternative approach, but I understand that each requires an effort that most consumers are unwilling to invest. You can be your own platform with easy-to-use tools.

Use Google Alerts – Yes, Google is a big company, but it does offer some beneficial services. Google Alerts might be imagined as a periodic search process based on specific interests you specify. You provide a typical search request, then select how often you want to receive the results. Updates are sent to you in an email according to the time interval you request. I have multiple alerts that generate a weekly list of new content. In this approach, you are following a topic rather than specific content creators.

RSS is still around, and modern readers make the process easy to implement. With RSS, you designate the sources (e.g., specific blogs) you want to follow, and an RSS reader accumulates new content generated by these sources. You check in to your reader when you have time and see what is new. Some contend that Google’s abandonment of its very popular Reader in 2013 signaled the end of this tool category, but more modern alternatives have since emerged.
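To illustrate how little machinery RSS involves, here is a minimal sketch using Python and the third-party feedparser library; the feed URLs are placeholders for whatever sources you choose to follow:

import feedparser  # third-party: pip install feedparser

# Placeholder feed URLs; substitute the blogs you actually follow
feeds = [
    "https://example.com/feed",
    "https://example.org/rss.xml",
]

for url in feeds:
    parsed = feedparser.parse(url)  # fetch and parse the RSS/Atom feed
    print(parsed.feed.get("title", url))
    for entry in parsed.entries[:5]:  # five most recent items
        print("  -", entry.get("title", "untitled"), entry.get("link", ""))

A dedicated reader app adds conveniences on top of this same process – tracking what you have already read, polling on a schedule, and organizing feeds into folders.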

Yes, RSS readers do offer paid subscription tiers, and any provider realistically has costs. While the pro levels offer great features, most users will find the free level meets their needs.

My preference is for web-based readers – the service is accessed through a browser rather than standalone apps. Feedly is my recommendation. I like Inoreader and Reeder (Apple) as apps. 

I have written more detailed descriptions elsewhere (Feedly, Inoreader) and you could consult these sources if you need more information. 

Summary

I didn’t really intend this post as a book review, but Tim Wu’s book is interesting and informative. As I suggested, the book identifies the typical timeline of extraction consumers should recognize and use to guide their decision making. Again, solutions, should that be what you are seeking, are not easy to imagine.

I think we have a classic “chicken and egg” problem with platforms versus independent sources. Content creators will go where their content is more likely to be consumed. Tools for sharing will exist and be improved where there are content creators and content consumers. 

For the great majority of creators and consumers, the motivation of income is deceptive and a trap. Most writers would seem better off thinking of their goal as visibility rather than profit. Writing for a platform for the vast majority should be treated as a hobby, recognizing the reality of being trapped by the network effect. 


The Decline in Reading Activities – Does it matter?

Lately, I have encountered several essays that examine possible connections between the claims that “few people, including students, read books anymore” and “the declining percentage of individuals moving through our educational systems who are competent readers” (see links to several of these articles included at the end of this post). To attract a general audience, this decline in reading proficiency is often framed as a concern for future international economic competitiveness. What caught my attention in this unexpected batch of essays was the claim that researchers had failed to provide evidence that differences in time spent reading, and more specifically in reading lengthy works, resulted in improved reading skills and related intellectual benefits. In several of these articles, it was claimed that such relationships had not been proven. I had to move beyond these secondary sources to primary sources to understand the bases for these claims, and then to explore further on my own to determine whether I agreed. Is the lack of interest in books a serious problem?

The low rate of reading proficiency in the U.S.

Reading skills are declining, and this trend has been ongoing for years. The decline is evident in multiple tests used to assess reading skills, with the most attention likely given to the NAEP (National Assessment of Educational Progress), which tracks the performance of U.S. students over time; international assessments such as PISA compare U.S. students with students from other nations. Implications for economic productivity and international competitiveness are sure to catch the attention of the general public and politicians.

The pattern of decline among U.S. students is notable: over the years, the less proficient have declined year by year, whereas the most skilled have not shown this trend and score at roughly the same level on the tests used to track reading skill. Among other issues, this means that the range of competence educators must address within a single grade level is increasing, thereby complicating instruction. Multiple factors potentially associated with this trend have been identified and such claims have provided ideas for remediation. Most of you have likely heard of the “science of reading” controversy, which in recent years has prompted greater emphasis on phonics. Screen time is often cited as a culprit in many areas, including reading. Recently, absenteeism has risen at an alarming rate and it seems logical that missing school for many days means students miss instruction. There are likely multiple causes for a general academic trend and these causes interact in ways that defy simple solutions. 

[Image: Recent changes in reading proficiency, broken down by initial level of functioning.]

Reading Skill and Reading Activity

Daniel Willingham, an educational expert I often cite, proposes a more straightforward explanation, claiming that the “main differentiating factor” between strong and weak readers is “the volume of reading they do,” because that is how they build both reading skills and rich background knowledge. The mention of background knowledge should not be overlooked and requires a little more detail. Studies that differentiate background knowledge from reading skill demonstrate that background knowledge is the more critical predictor of differences in what readers understand from what they read. Put another way, if good and poor readers are crossed with what they already know about a topic (e.g., good readers with poor existing knowledge, poor readers with good existing knowledge), the difference in background knowledge is the better predictor of understanding and retention. So, what Willingham argues is that the amount of reading has a unique importance, possibly ignored in other discussions, for the development not just of reading skill but of the background knowledge that supports learning more generally.

It is not that the relationship between time spent reading and reading skill has been ignored (see Allington & McGill-Franzen). Attempts to track the amount of reading done by students and adults have been recorded with great diligence. How many minutes, how many books, whether the reading is on a digital device or paper, how many books are available in the home, how frequently an individual visits a library, whether a family subscribes to a newspaper, and similar variables have been quantified and related to reading skill and to potential moderating variables such as family income. These variables consistently relate to measures of reading skill, and because these measures of activity seem to be in decline, they have been proposed as possible causes of poorer reading skills.

Causal Relationship Between Activity and Performance

Allington and his colleague, like other scholars I have read recently, raise an interesting point about why the amount of reading has not received more attention. Allington makes an observation that blames reading researchers, which I find interesting. He claims that researchers have difficulty explaining the observed relationship between time spent reading and reading skill because this relationship has seldom been tested experimentally. Put plainly, the observed relationships are correlational. How can it be known whether more reading alone builds reading skill when the data could also be interpreted to argue that more capable readers enjoy reading and read more? Why might this matter? Allington notes that elementary school reading instruction devotes far more time to skill development than to extended reading. Extended reading, often called free reading, allows students to use school time to read independently and would include basal reader time and library books.

Just to be clear, there are many topics of interest to scientists in other fields that involve limited “manipulative” research. When investigating factors we believe have negative consequences, scholars seldom create the unfavorable circumstances to determine whether those harmful consequences follow. Sometimes, potential causal relationships are analyzed through correlational techniques because the causes cannot be manipulated or would be extremely expensive to manipulate. 

Allington does provide two counterexamples to the claim that the amount of reading, and in these examples the reading of books, cannot be manipulated. The first example involves the gifting of books to randomly selected children through existing programs (e.g., the Dolly Parton Imagination Library) and the subsequent administration of follow-up assessments. The second employed a similar approach, focusing on the “summer slide” in reading performance among children from low-income families: participants were again selected at random, gifted a home library for the summer, and compared with non-selected individuals on a follow-up assessment. In both cases, greater access to books was associated with higher reading proficiency scores. The random assignment of access to the gifted books allows for a more persuasive argument for causality.

Voluntary vs Required, Intensive vs Extensive

The topic I have presented here has a number of relevant subtopics, but examining these issues should probably be pursued in other posts. The voluntary vs. required distinction concerns reading set in a formal education setting versus reading pursued without direction outside the classroom. Voluntary reading at any age has consequences. The intensive vs. extensive distinction is relevant because reading for different purposes (research papers and technical manuals vs. books) differs in its potential benefits. When it comes to the amount read, does a number of shorter pieces generate the same impact as a book, or are there unique skills involved when persistence is required and the pieces necessary for understanding are spaced over greater units of time? This issue is relevant to the different type of reading most of us might do online.

Resources

Academic sources:

Allington, R. & McGill-Franzen, A. (2021). Reading volume and reading achievement: A review of recent research. Reading Research Quarterly, 56, S231-S238.

Abdurakhmonova, Z., & Pardayeva, R. (2025). Exploring intensive and extensive reading. International Conference on Culture & History, 1(3), 34-38.

Gioia, D. (2008). To read or not to read: A question of national consequence. Diane Publishing.

Willingham, D. T. (2017). The reading mind: A cognitive approach to understanding how the mind reads. John Wiley & Sons.

News sources addressing the decline in the popularity of reading books

Kids Rarely Read Whole Books Anymore 

Whole books or excerpts: Which does the most to promote reading ability

Novels vs. excerpts: What to know about a big reading debate 


Word processing: Desirable difficulty or opportunities get taken

When it comes to how we use technology, familiarity can limit analysis and exploration. I started thinking about this challenge when encountering the work of an academic who has examined the history of word processing. When I first wrote about word processing in the 1990s, the issues were similar to current topics of whether students should read from paper or screen and whether it was better to take notes on paper or on a laptop. There were comparisons of which method was more productive and efforts to account for the advantages that were identified. It was once similar with word processing. Should students learn to write on paper or using a computer? What were the advantages and disadvantages of each approach? How might the instruction of writing skills with computers be modified to take advantage of the unique capabilities of a digital approach? My thought is that many are now no longer aware of these questions and conclusions and that personal practice and instructional emphases may ignore key findings. This concern seems especially relevant given the new issues raised by the use of AI in writing and learning to write.

I decided to write again about this topic after listening to an interview with Matthew Kirschenbaum on the “This Week in Tech” network’s “Intelligent Machines” podcast, an episode focused on Kirschenbaum’s recent work on the role of AI in writing and learning to write. The podcast guest was a member of the Modern Language Association (MLA) panel generating what are likely to become several influential position papers on learning to write and AI. This interview, which makes up roughly the first half hour of the podcast, is worth the attention of any educator trying to make sense of how AI will impact schools and universities. As part of the brief introduction of the podcast guest, it was noted that Kirschenbaum had written “Track Changes: A Literary History of Word Processing.” As I suggested, word processing had been a personal interest, so I purchased and read the book.

It wasn’t that the book was poorly written, but I did struggle to get through it. The podcast’s focus led me to misunderstand the topic of the book. The history of the transition from writing on paper and typewriter was of some interest, because I lived through that transition, and the mention of the hardware, software, and required skills involved in writing with a computer brought back plenty of memories. I was less interested in which noted author made his or her transition from notepad or typewriter to word processor at which point in their career. The literary community’s concerns about how technology might influence literature likely offer insights into what some now think about AI. Perhaps a better analogy is how Bob Dylan’s fans reacted when he switched from acoustic to electric guitar. What I had falsely anticipated was that the author would examine how digital storage and revision changed writing and the teaching of writing.

I imagined I would encounter an analysis of changes in personal revision, educator feedback and learner revision, peer revision, and possibly even AI as a sounding board for a writer’s efforts. These are the topics in what I see as the evolution of writing and writing education. I decided to generate a post that would offer my own thoughts about the role of word processing in the writing process. The podcast and the book on word processing are still worth your time. 

Will digital tools change our writing?

I assume you complete many of the writing tasks you take on using a word processing application. Do you do this because you assume this approach makes you more efficient, or because you assume it makes you a better writer? Maybe you have never even thought about these questions. However, when functioning as a teacher and asking your students to engage in activities in a particular way, it may be helpful to consider why the approach you expect students to use will be productive. Often, to realize the full potential of an activity, the details matter, and some insight into why an approach is supposed to be productive may help in understanding which details to track and emphasize. The following comments summarize some ideas about the value of word processing and of learning to write using word processing applications.

In learning, as in other areas of life, you seldom get something for nothing. Still, a logical case has been proposed for how simply working with word processing for an extended period may improve writing skills and performance. Perkins (1985) calls this the “opportunities get taken” hypothesis. The proposal works like this. Writing by hand on paper has a number of built-in limitations. Generating text this way is slow, and modifying what has been written comes at a substantial price. To produce a second or third draft requires the writer to spend a good deal of time reproducing text that was fine the first time, just to change a few things that might sound better if modified. Word processing, on the other hand, allows writers to revise at minimal cost. They can pursue an idea to see where it takes them and worry about fixing syntax and spelling later. Reworking documents from the level of fixing misspelled words to reordering the arguments in the entire presentation can be accomplished without crumpling up what has just been painstakingly written and starting over.

With word processing, writers can take risks and push their skills without worrying that they may be wasting their time. The capacity to save and load text from some form of storage makes it possible to revise earlier drafts with minimal effort. Writers can set aside what they have written to gain new perspectives, show friends a draft and ask for advice, or discuss an idea with the teacher after class, and use these experiences to improve what they wrote yesterday or last week. What we have described here are opportunities—opportunities to produce a better paper for tomorrow’s class and, over time, opportunities to learn to communicate more effectively. 

Do writers take the opportunities provided by word processing programs and produce better products? The research evaluating the benefits of word processing (MacArthur, 2006; Wollscheid, Sjaastad, & Tømte, 2016) is not easy to interpret. Much seems to depend on the experience of the writer as a writer and familiarity with word processing, and on what is meant by a “better” product. If the questions refer to younger students, it also seems to depend on the instructional strategies to which the students have been exposed. It does appear that access to word processing is more beneficial for older learners and some even interpret this difference as having a neurological basis (Wollscheid, Sjaastad, & Tømte, 2016). General summaries of the research literature (e.g., MacArthur, 2006) seem to indicate that students make more revisions, write longer documents, and produce documents containing fewer errors when word processing. However, the spelling, syntactical, and grammatical errors that students tend to address and the revision activities necessary to correct them are considered less important by many interested in effective writing than changes improving document content or document organization. The natural tendency of most writers appears to be to address surface-level features. 

Writers appear to bring their writing goals and habits to writing with the support of technology. Beginning writers, and perhaps writers at many stages of maturity, may not have the orientation or capabilities to use the full potential of word processing, and their classroom instruction may also emphasize the correction of more obvious surface errors. Thus, there are typically improvements in the products generated when working with word processing tools, but the areas in which younger writers seem to improve are not necessarily the most important ones.

Many of the potential educational advantages of word processing appear only as students acquire considerable experience writing with the aid of technology, and some question whether using a keyboard is better than a pencil for young writers (Wollscheid, Sjaastad, & Tømte, 2016). Perkins’s (1985) argument that writing with word processing programs will improve writing skills because word processing allows students to experiment with their writing makes sense only in situations in which students have written a great deal and experimented with expressing themselves in different ways. The fact that most research evaluating the benefits of word processing has examined performance over a short period of time, with students having limited word processing experience, thus represents a poor test of the potential of word processing (Owston, Murphy, & Wideman, 1992). Research based on a three-year study following elementary students as they learned to write with and without access to word processing opportunities has demonstrated a significant advantage for students with ready access to technology (Owston & Wideman, 1997). A recent study (Yamaç et al., 2020) examining the benefits of consistent writing on laptops found a similar advantage over paper-and-pencil writing tasks for early elementary learners. These researchers point to social media activities such as blogs and multimedia writing with tablets as expanding the writing opportunities available in classrooms.

The National Assessment of Educational Progress (NAEP) demonstrates that, in the U.S., schools whose students have greater experience writing with technology tend to have more proficient writers (Tate, Warschauer & Abedi, 2016). Studies such as this are still controversial, as it is difficult to parse out other variables, such as the income levels of the majority of students in different schools, that may influence both access to technology and writing proficiency. Overall, the role of word processing in developing writing skills depends on the goals of the teacher and individual students, the social context provided for writing, and the amount of writing that students do with the assistance of word processing.

Summary

Many of the posts I write concern the cognitive processes involved in learning, thinking, and academic behavior. Often, I focus on how these processes are impacted, for good or bad, by involving technology. We seem to be past the point at which educators question writing on a computer, but the distinction I raised between opportunities get taken and desirable difficulty has yet to be resolved for writing. This is clearly the case when educators debate the role AI should play. My suggestion related to the opportunities-get-taken hypothesis would be that we examine whether the opportunities (often called affordances) of revision are actually employed. Do students get useful feedback from which they might learn to improve what they have written? Despite the likely benefit of revision, do students actually do much revision? Perhaps, like other ideals (tutoring, personalized learning) that are impractical for one reason or another (e.g., cost, teacher time), AI might find a productive role in guiding revision experiences.

References:

MacArthur, C.A. (2006). The effects of new technologies on writing and writing processes. In C.A. MacArthur, S. Graham, & J. Fitzgerald (Eds.), Handbook of Writing Research (pp. 248-262). New York: Guilford.

Owston, R., Murphy, S., & Wideman, H. (1992). The effects of word processing on students’ writing quality and revision strategies. Research in the Teaching of English, 26 (3), 249–276.

Owston, R., & Wideman, H. (1997). Word processors and children’s writing in a high-computer-access setting. Journal of Research on Computing in Education, 30 (2), 202–220.

Perkins, D. (1985). The fingertip effect: How information-processing technology shapes thinking. Educational Researcher, 14, 11–17.

Tate, T. P., Warschauer, M., & Abedi, J. (2016). The effects of prior computer use on computer-based writing: the 2011 NAEP writing assessment. Computers & Education, 101, 115-131.

Wollscheid, S., Sjaastad, J., & Tømte, C. (2016). The impact of digital devices vs. pen(cil) and paper on primary school students’ writing skills – A research review. Computers & Education, 95, 19-35.

Yamaç, A., Öztürk, E., & Mutlu, N. (2020). Effect of digital writing instruction with tablets on primary school students’ writing performance and writing knowledge. Computers & Education, 157, 1-19.


Before the web did anything, HyperCard did everything

“Before the web did anything, HyperCard did everything” was the subtitle of an Ars Technica article I cited and liked in 2019. I use it here to celebrate the life of Apple software genius Bill Atkinson following his recent passing. HyperCard played a significant role in my academic career, and while I have written about the personal connections in the past, one more mention seems appropriate. HyperCard, often called a software erector set, was innovative and accurately foretold many future capabilities we now take for granted.

Among the unique features were:

  • Hypertext/hypermedia
  • Object-oriented programming
  • Low bar / high ceiling software

I will attempt to call attention to these features as I offer a brief description. I spent hundreds of hours developing educational applications in Hypercard and would love to display some screen captures of my work. Unfortunately, Apple last updated HyperCard in 1998. I do have some old discs with my original work, but neither an appropriate drive nor an Apple operating system suited to run HyperCard. For a few images, I used my phone to capture from a textbook I wrote with my wife in 1996. One more comment. While I did work on HyperCard stacks for hundreds of hours, that was 25 or so years ago, so I may be fuzzy on some of the details. 

The analogy – a stack of cards

Card, field, button

HyperCard was based on a hierarchy of objects. The most encompassing element was the stack. A stack was made up of cards. Each card had a background and a foreground to which the user could add elements such as text, images, text fields, and buttons. The difference between background and foreground mattered when a card was duplicated. Typically, cards were intended to look the same, with only key elements changing (e.g., a unique image, text information). Rather than create each card from scratch, it was possible to duplicate a card to retain the background and then add the unique features.

The following image shows a card from a series of cards built to provide information about the birds most commonly found in North Dakota. There are several text fields and buttons that would be common across all cards, and a unique image and descriptive text for each species. HyperCard, in this case, was used as a type of database. 

Adding Action – HyperTalk was the language of HyperCard

Nearly anything done with HyperCard, e.g., opening a stack or clicking a button, generated what was called an event. Events had no consequence unless a handler appropriate to that event was available. A handler was essentially a segment of code executed when the relevant event was encountered. So, for example, a mouseUp handler would respond to the release of a mouse click:

on mouseUp
  go to next card
end mouseUp

This handler, added to a button, would advance the visible stack to the next card.

A unique feature of HyperCard, consistent with the philosophy of low bar, high ceiling, was that HyperCard would write many of the basic event handlers for you. You could do many things without coding, but you could also use the powerful scripting language, HyperTalk, to do much more.

HyperCard came with a large assortment of symbols, many appropriate as buttons (first image). However, when added to a card, the symbols were inert unless scripted. It was easy to find and add a button to a card, and HyperCard was then willing to help make the button do something (Button Actions 1 and 2). The process was to select an icon from the button choices and then select the desired action.

Some of the button choices

Button Actions 1

Button Actions 2

If you opened the script for a button created in this fashion, you would see the script I included above as an example. Of course, once you learned HyperTalk, the scripting language, you could simply type such a script from the keyboard.

Scripts could be added to most HyperCard objects and created actions originating from the object to which the script was attached. There was no single program, but rather a collection of programs associated with the various objects from which a stack was assembled (object-oriented programming). Events associated with one object (say, a card) might trigger a handler somewhere else, where it made more sense. I think of this as an event looking for a home. If an event does not find a handler on the object that generated it, it falls through the HyperCard hierarchy to the next level. Because so many actions generate an event – every mouse click, the opening of a card, the opening of the stack, etc. – many find no handler, and nothing happens. If you forget to assign a “go to the next card” script to the mouseUp associated with a button, nothing happens.
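For readers who never scripted HyperCard, a toy model of this message path may help. This is a loose sketch in Python rather than HyperTalk, with invented object names, just to show how an event falls through the hierarchy until a handler catches it or it disappears:

class HCObject:
    # Toy model of HyperCard's message path: button -> card -> stack
    def __init__(self, name, parent=None, handles=()):
        self.name = name
        self.parent = parent
        self.handles = set(handles)  # event names this object has handlers for

    def send(self, event):
        if event in self.handles:      # a handler exists at this level
            print(self.name, "handles", event)
        elif self.parent is not None:  # otherwise fall through to the next level
            self.parent.send(event)
        # no handler anywhere: the event quietly disappears

stack = HCObject("stack", handles={"openStack"})
card = HCObject("card 1", parent=stack)
button = HCObject("next button", parent=card, handles={"mouseUp"})

button.send("mouseUp")    # handled by the button itself
card.send("openStack")    # falls through the card to the stack
card.send("mouseDown")    # no handler anywhere; nothing happens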

I have tried, without luck, to find an old online manual that might provide complete coverage of HyperTalk. The description of HyperTalk on Wikipedia includes enough sample scripts to suggest just how powerful the language could be.

What I did with HyperCard

I used HyperCard for a variety of things. Here are a couple of examples.

The North Dakota Wild clipart collection

I used my tech skills outside of my role as a college professor. I had an interest in technology applications in science education and was interested in project-based learning. North Dakota Game and Fish had a program called OWLS (outdoor wildlife learning sites) and gave small grants to schools to support the development of small sections of land (think native plant gardens) associated with schools. I became involved in attempting to create activities and develop the sharing of project ideas between schools using technology. One of the projects involved the creation of HyperCard stacks of North Dakota Wildlife. The image used here was from the Bird stack. The idea was to provide students with images that the students could use in their own creations. The entire collection of images is still available from my server. Again, to imagine how this resource might be used, you have to put yourself back in time more than 20 years. 

Technology Enhanced Study

Much of my academic research concerned study effectiveness, and I was particularly interested in large-lecture introductory college courses. The freshman lecture course represents an interesting case. It places the least experienced students in the most impersonal and isolating setting of their college experience – heavy reading loads, long lectures, and the expectation that they process the inputs without guidance to take high-consequence exams likely unlike those they took in high school. My interest was in helping students identify their areas of weakness as they studied and providing an efficient way to address any weaknesses identified. You don’t need to review the entire chapter again. Study the poorly understood sections or ask a colleague for assistance with the topics you don’t understand. Don’t just mindlessly go over the same material again and again.

The course I had in mind was Introduction to Psychology. I had access to the test item database provided by the textbook company (approximately 200 multiple-choice questions for each chapter). The items from the test bank included associated page numbers from the book and what I would call topics (maybe 20-25 per chapter). Topics were simply identified by number and designated a group of questions addressing the same topic and section of the chapter. These questions provided the basis for the digital study environment I provided to students. Half of the questions for each chapter were used in the study tool, and the other half were reserved for possible use on examinations.

The HyperCard study tool took me most of three months during a sabbatical to write. A study experience from the perspective of a student would work like this. Opening the stack on a Mac would offer the student the option of working on one of several chapters (usually 1-4) covered on the next examination. Once a chapter was selected, HyperCard would display a randomly selected multiple-choice item from that chapter. The student would respond, and the computer would then indicate whether the response was correct and, if not, the page numbers in the textbook associated with that question. The idea was that the student could use their textbook to review the page or so associated with that question and possibly take a note in their notebook or highlight the textbook. If the response was incorrect, the stack would then select another item at random from the other items with the same topic number. This would not guarantee a direct test of their understanding of the specific issue they found difficult, but it would at least address the same topic. If correct, the system would select another topic at random from the chapter. As the student worked, the stack recorded the date and time of each response and whether the response was correct or incorrect. Aside from general performance, the data allowed consideration of when during the interval between examinations the student used the system (was the student cramming or studying systematically) and the time delay between questions. The delay following incorrect responses was regarded as particularly important because longer median delays were assumed to indicate targeted rereading, notetaking, or some other form of reflection. Students who showed little difference following correct and incorrect responses were using the system in a more passive manner, possibly assuming that responding to the questions alone was beneficial.
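Reduced to its core, the selection-and-logging logic looked something like the sketch below. This is my reconstruction in Python rather than the original HyperTalk, and the items, topics, and page numbers are invented for illustration:

import random, time

# Invented sample items: each has a topic number and associated textbook pages
ITEMS = [
    {"q": "Sample question A?", "topic": 1, "pages": "12-13"},
    {"q": "Sample question B?", "topic": 1, "pages": "12-13"},
    {"q": "Sample question C?", "topic": 2, "pages": "20-21"},
]

log = []  # one record per response, mirroring the data the stack saved to disk

def next_item(previous=None, was_correct=True):
    # After a miss, stay on the same topic (excluding the missed item);
    # after a hit, pick a new item from the chapter at random.
    if previous is not None and not was_correct:
        pool = [i for i in ITEMS if i["topic"] == previous["topic"] and i is not previous]
        if pool:
            return random.choice(pool)
    return random.choice(ITEMS)

def record_response(item, correct):
    # Timestamps allowed later analysis of cramming vs. spaced study
    # and of the pause students took after incorrect answers.
    log.append({"time": time.time(), "topic": item["topic"], "correct": correct})
    if not correct:
        print("Incorrect - review textbook pages", item["pages"])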

The process of preparing the HyperCard study tool was supported by other HyperCard utilities. One took a flat file of questions and built the full study tool card by card, adding one card for each line of text in the flat file (each line representing a single question). A procedure built into the study tool stack also exported the data and added an identifier for the student who had used that specific stack.
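A sketch of that build step, again with a hypothetical column layout (the original utility read one question per line and produced one card per line):

```python
import csv

def load_flat_file(path):
    """Read a tab-delimited flat file, one question per line, into records.
    The column order (id, topic, pages, prompt, answer) is my assumption."""
    questions = []
    with open(path, newline="") as f:
        for qid, topic, pages, prompt, answer in csv.reader(f, delimiter="\t"):
            questions.append({
                "id": int(qid),
                "topic": int(topic),
                "pages": pages,
                "prompt": prompt,
                "answer": answer,
            })
    return questions
```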

Making this system work also required some concessions. I could not assume that students had their own Macintosh computers, and for the research to be meaningful, access to the tool had to be standardized. At the beginning of each course segment, I took trays containing 200 disks to the reserve desk of the library, which had a Mac lab always available to students. Each disk was assigned an identification number, and each student was assigned their own disk with a unique bar code, so the disks were checked out in the same manner as other material made available for a specific course. After each exam, I would collect the disks, dump the data for analysis, and prepare the same disks with the questions appropriate to the next examination.

Readers who are familiar with such topics as comprehension monitoring, retrieval practice (the testing effect), and distributed versus massed practice should be able to identify the theoretical bases for the tool and how I intended it to be used. What may not surprise educators is this finding: study opportunities are typically designed to address recognized study challenges, yet when released into the wild, they are not used as intended. The tool was great for investigating actual student study behavior, but the assumed benefits are more difficult to generate than one might think. Capable students who did not really need support flocked to the study tool. They used it more frequently and as intended. When answering questions incorrectly, their post-question delays were longer, indicating they were making some effort to act on this awareness. This is not to say that students who were more likely to need help received no benefit, but the difference in who used the system and who used it most appropriately was one of the more challenging findings.
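Given the log kept by the earlier sketch, the delay comparison I describe takes only a few lines; a hypothetical illustration:

```python
from statistics import median

def median_delays(log):
    """Median seconds between each response and the next, split by whether
    the earlier response was correct or incorrect (log fields as above)."""
    delays = {"correct": [], "incorrect": []}
    for prev, nxt in zip(log, log[1:]):
        key = "correct" if prev["correct"] else "incorrect"
        delays[key].append(nxt["time"] - prev["time"])
    return {k: median(v) if v else None for k, v in delays.items()}
```

A student whose incorrect-response median is not noticeably longer than their correct-response median is probably clicking through rather than rereading.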

Eventually, I moved on from HyperCard to create online versions of the technology-enabled study environment using a small server, PHP, and MySQL. Students could work from their dorm rooms or any location with Internet access using their own equipment and any browser. The record of actual student study behavior was more authentic when using the tool did not require a trip to the library. Still, HyperCard was a way to get started and provided a sophisticated, user-friendly experience. For anyone who finds this type of research interesting, my published studies based on these study environments can be found in the list of my publications provided by Google Scholar.

One final Bill Atkinson product

When Bill retired, he began working on one final vision. He lamented the total focus on online image collections and wondered what physical objects would remain to preserve personal memories. His passion project, PhotoCard, was a tool for creating personal postcards. These cards could be sent over the Internet, but Bill really wanted you to pay a small fee to have him print the cards you created and mail them for you.

I corresponded with Bill about PhotoCard and explained I had been a fan since HyperCard. I ended up with a mental image of him personally printing these postcards on his high-quality printer, making small adjustments to get things just right, and then heading off to the post office with a box of cards to mail. As seemed true of his general approach, this was not about the money. It was about making great things that people would use.

Bill Atkinson will always be one of my technology heroes. 

The Concept of Disruption Applied to Education

I think of Clayton Christensen as a theorist and writer mainly focused on business, particularly on the challenges of change. Two core ideas have stuck with me. The first is the innovator's dilemma, which describes the difficulty of moving on from a successful approach when faced with new circumstances: why does success at one point limit future success? The second considers the opportunities and risks that arise when a field encounters great disruption. These two big ideas are interrelated, as great disruptions may create circumstances that challenge an existing successful approach.

Big ideas are not necessarily specific to a given field (e.g., business), and Christensen made the effort to apply his theories to education. This effort resulted in his 2008 book Disrupting Class, written with Michael Horn and Curtis Johnson. For those interested in K12 education, it should not be difficult to anticipate how Christensen's core ideas might apply. The methods of K12 teaching are often criticized as unchanging; critics note that someone from a century ago would not be lost if suddenly dropped into a classroom of today. The long-standing model involves age-based classes of students reporting to a teacher standing at the front of a room, engaging the class with presentations and discussions covering the isolated subjects of math, science, history, and language arts, with performance evaluated by tests at regular intervals. This model has persisted in pretty much the same form for a long time and seems very resistant to change.

While stable, the K12 model faces significant challenges. Students are often unmotivated, and absentee rates are high. Despite the significant resources invested in the U.S., performance as measured by standardized tests lags behind that observed in other countries. The COVID crisis, during which students could not meet in the traditional face-to-face classroom, appears to have created problems that remain. Despite the mantras of "all students succeed" and "no child left behind," student performance has grown increasingly variable as students move through the grade levels, bringing many questions about meeting the needs of a diverse population of learners into focus.

These issues exist within an environment of educator complaints about low pay and a lack of public support, combined with pressure to address an ever-increasing number of goals. Why aren't students more capable of understanding the basics of money management? Why are kids spending so much time on their phones and reading so little? Why are kids questioning their sexual orientation? Why can't K12 schools do a better job of preparing students who are interested in a job after high school rather than pursuing higher education? Why do kids have so many psychological problems?

Rereading Disrupting Class

After a delay of a decade or so, I just reread Christensen's Disrupting Class, which turned out to be an interesting experience in a couple of ways. I don't reread many books because there always seems to be something new I want to read, but the investment of time may be beneficial with a certain type of book. Rereading books that gave you important insights into an emerging or changing situation can be quite informative. The delay between reads allows the accumulation of relevant experiences that permit further reflection on your original conclusions and insights. These intervening experiences can also guide rereading to discover nuances that did not initially stand out. There is more. Some writers who explore an emerging trend are willing to predict specific accomplishments and related time frames. What actually happened? Were the predictions realized, and if not, what factors prevented the predicted outcomes?

I became interested in Christensen's analysis of the challenges of K12 education because I thought the challenges I have already outlined could be addressed through the use of technology to individualize learning opportunities in ways that address differences in aptitude, existing knowledge, and preferences for different approaches to learning (Christensen used Howard Gardner's theory of multiple intelligences as a justification for alternate learning experiences). When translating these suggestions into my own way of thinking about instructional alternatives, I equated them with an updated way of implementing mastery learning that takes advantage of online technology. This was not the term used in his book, but Christensen did mention k12.org as one of his examples, and I would describe that as an example of online mastery instruction.

Christensen observed that disruptive practices tend to emerge in the margins with mainstream practices being very resistant to disruption. Established mainstream practices fall prey to the innovator’s dilemma, with efforts limited to doing a better job of what is already being done. He predicted that home schooling, credit recovery experiences (a way to address failed or incomplete courses), and unique courses that cannot draw enough students in an individual school would provide the earliest exposure for online, individualized learning experiences.

While online, digital approaches have succeeded on the fringe, these successes were expected to encourage more mainstream adoption. This is what I mean by having the opportunity to test predictions: implementation has not progressed as anticipated. For example, Christensen predicted that by 2019 about 50% of high school courses would be delivered online. This prediction proved overly optimistic, and I would guess that any movement in that direction was reversed by the negative experiences associated with the forced, fully online efforts of the COVID years. There are other failed predictions. One concerns the alternative educational materials that would be available to address the multiple intelligences Christensen identified. Rather than a single, common textbook, he predicted that technology would mature to the point that educators and student peers could create learning materials. Course management systems (CMSs) were emerging, and these systems would store, sequence, and provide options generated by commercial providers as well as by educators themselves. Some attempts have been made in this direction. I would point to concepts such as Google app-based HyperDocs and teacher-based content production marketplaces such as Teachers Pay Teachers (TPT) as consistent with this prediction, but the scale of adoption has been small.

I spent some of my time teaching in a graduate program focused on Instructional Design, and I did find the concept of teachers as instructional designers interesting. To this point, educators are not prepared as designers, and instructional design is different from lesson planning. Teachers would certainly have the background to become instructional designers, but like so many things educators could do, there is always the issue of how their time should be allocated.

Is technology a tool for disruption or will it be limited to isolated and focused improvements?

Perhaps technology is not a tool that will result in educational disruption but will instead play a more focused role in improving specific areas of what we already do. The alternative view is that technology provides the only reasonable means to break free from an approach that seems to be floundering in complexity, inefficiency, and frustration. Perhaps Christensen simply succumbed to Amara's law. You may have heard the idea without knowing to whom it is attributed: Roy Amara proposed that we tend to overestimate the impact of a technology in the short term and underestimate its impact in the long term. Perhaps we are too impatient.

I understand that one criticism of those who advocate alternatives to the status quo is that when an alternative does not seem to generate much change, the reaction is to claim “you didn’t do it right”. There is certainly some legitimacy in the claim that people are reluctant to abandon good ideas when the ideas do not result in practical successes. Keeping this in mind in combination with the limited success of current approaches, I still believe there are some areas of potential.

Here are some examples:

Teachers can create instructional materials efficiently with AI. Specialized tools (e.g., eduaide.ai, Diffit) help educators produce materials customized to students who prefer different types of learning experiences.

AI offers the opportunity for personalized tutoring. Christensen refers frequently to the impact of tutoring, but he uses the term to refer to instructional tutorials. It is true that there are certain ways in which tutorials provide individualized learning experiences. When I think of tutoring, a topic I have always found interesting, I tend to think of a one-to-one interaction with another individual. AI can approximate this experience and is more accessible and economical than traditional tutors. I have written several posts focused on AI tutors, and I encourage educators to explore this possibility. The common complaint that technology is isolating is worth considering; as an adult independent learner, I would counter that all of us end up learning in isolation. Having suggested that developing the skills of independent learning is essential, I do note that Christensen lists two requirements that must be met to motivate K12 learners: a way to feel competent as a learner and social connections. I agree, and I see peer engagement within an AI tutoring environment as a very reasonable approach. This strategy offers a combination of AI and peer tutoring.

Individualized learning opportunities: I am not ready to give up on online, individualized learning experiences. These experiences might best be used as a supplement to other classroom experiences (i.e., as an alternative to a textbook) or assigned for enrichment or remedial purposes. Khan Academy offers a flexible environment and set of tools that can be assigned by educators or accessed by individual learners.

I am not ready to dismiss the role technology can play in K12 education. I am waiting for those who point to limitations to offer reasonable options.

