Class Disrupted

This is a recommendation for the Class Disrupted podcast, now in its 4th season and hosted by Diane Tavenner and Michael Horn. I follow several podcasts that focus on mastery learning in K-12 settings, and this wide-ranging podcast frequently discusses mastery learning topics.

Aside from mastery learning, I am fascinated with this series because it offers a lived historical record for consideration. The podcast started in response to the COVID disruption. Rather than first listening to the most recent episodes or to episodes with an interesting description, I would recommend listening from the first episode on. Perhaps you have an interest in technology and the now related topic of how many educators reacted to the forced use of technology during the “learn from home” years, or you are intrigued by what must have been one of the most disruptive periods in our lives and the long-term impact of this disruption. Either way, there is plenty to consider by working your way through the episodes from the beginning. What were our expectations? What were our plans? How are things working out?

InsertLearning Update

InsertLearning is one of my most frequent recommendations for K-12 classrooms. It is one of the examples I highlight in my Kindle Primer – Designing Instruction Using Layering Services. The InsertLearning environment allows a teacher to add elements to existing web pages and then assign this combination to students. It is a great tool for teachers wanting to “ditch the textbook” or wanting to emphasize OER. The elements include not only the comments and highlights common among tools of this type, but also questions and discussion prompts. I call InsertLearning an environment because it also includes a management system that allows educators to assign resources prepared in this fashion and to view individual student responses to questions. The original service was a bargain at $8 a month; the price has since increased to $100 a year. For what this service can do, that is still reasonable.

What had confused me about the service was that when I attempted to check for updates, all of the content my searches surfaced ended a few years ago. I finally contacted one of the founders, and he explained that the service had been purchased by fan.school. This company has been acquiring educational services originating in Minnesota and provides a more stable home for the continuation of several smaller companies. InsertLearning works as a source for learning about the program, and fan.school provides an access point specific to this product.

As far as I can tell, my videos describing InsertLearning still offer an adequate overview of the service, though my price quote is no longer accurate.

Addressing middle school math

Educators are likely familiar with the learning challenges students experienced during the COVID years when face-to-face instruction became impractical. The concern for student achievement during this period has been documented in declining NAEP scores. Math achievement appears to have been particularly hard hit, and the middle school years, which set students up for the study of algebra, represent a unique problem area.

I decided to focus on a resource that called attention to this problem mostly because it proposes that one productive response would be to make more frequent use of mastery instructional strategies in middle school math classes. Readers who follow this blog will understand that mastery learning is one of the topics I have spent time writing about. [Other, more detailed posts about mastery instruction can be found by selecting the “mastery” category in the left-hand column of this blog.]

Mastery instruction individualizes learner experiences, meeting students at the level of their understanding and advancing them as individuals once understanding has been achieved. It is an approach concerned with ensuring that learning goals are met, whereas group-based instruction tends to move ahead, leaving some students missing skills that are prerequisite to new material.

Among the other recommendations for addressing the middle school math challenge is doubling up on math class frequency. This would be a second way to provide additional time to assure the mastery of essential skills, but it is more of a group-based approach than the individual-learner emphasis of mastery learning.

I was wrong about ChatGPT

Like so many, I tested the capabilities of the new AI tool ChatGPT and offered a personal assessment. While I immediately saw that the tool would have significant consequences for classroom practice, especially for educators who assign writing-to-learn tasks, I proposed that a fix would involve requiring careful citations in the work students submitted. The samples I had ChatGPT generate did not provide citations, and my naive understanding was that the corpus of content the AI was drawing on did not include citations.

I was wrong.

It took multiple attempts to find how to phrase questions so that the material produced would contain citations, and then how to actually view the full citations for the work mentioned in a generated product. The following example involved two requests and is a modified version of the topic I used as an example in my first post. I have highlighted the mention of sources.

Me: How can a semantic network explain the tip of the tongue phenomenon and list citations that support this explanation?

The tip of the tongue phenomenon, also known as TOT, is a common experience in which a person knows that they know a particular word or piece of information, but they are temporarily unable to retrieve it. A semantic network can help explain this phenomenon by providing a framework for understanding how words and concepts are related to one another in the brain.

In a semantic network, words and concepts are represented as nodes, and the relationships between them are represented as edges or links. When a person is trying to retrieve a word or piece of information, they search through their semantic network to find the relevant node. If the person is unable to retrieve the word, it may be because they are unable to find the appropriate node in their semantic network, or because the node is not well-connected to other nodes in the network.

One study that supports the idea that the tip of the tongue phenomenon is related to the organization of semantic networks is a 2013 study by J. K. Stanley and colleagues, which used functional magnetic resonance imaging (fMRI) to investigate the neural basis of TOT. The study found that during TOT, the brain exhibits increased activity in the temporal lobe, which is known to be involved in language processing and the organization of semantic networks.

Another study that supports the semantic network explanation of TOT is a 2001 study by J. M. McRae and colleagues, which used a computational model of semantic networks to investigate the retrieval of words during TOT. The study found that the model was able to reproduce many of the characteristics of TOT, including the feeling of knowing a word but being unable to retrieve it, and the ability to retrieve the word after a delay.

Overall, the evidence suggests that the tip of the tongue phenomenon is related to the organization of semantic networks in the brain, and that semantic networks can provide a useful framework for understanding this phenomenon.

Me: Provide citations that explain how the theory of a semantic network can explain the tip of the tongue phenomenon.

  1. Stanley, J. K., Marslen-Wilson, W. D., & van Hell, J. G. (2013). The tip of the tongue phenomenon: An fMRI study. Journal of Cognitive Neuroscience, 25(6), 891-900.
  2. McRae, J. M., de Sa, V. R., & Seidenberg, M. S. (2001). On the tip of the tongue: What causes word finding failures in young and older adults? Journal of Memory and Language, 44(4), 542-579.

Instructional responsibility and independent learning skills

I started posting about student highlighting, note-taking, and annotation as part of my observation that reading with a digital device offers some lifetime advantages over reading paper resources. The issue of screen vs. paper comprehension concerns only the initial stage of reading; the opportunities for layering personal comments on primary sources, and for the storage, organization, and retrieval of these comments, offer advantages for studying and for written projects based on ideas from multiple sources.

Once you begin exploring digital annotation and note-taking, you can move into many different subtopics. For educators, claims related to digital annotation and related activities (note-taking, highlighting) typically involve short-term student uses of these skills (e.g., preparing for an exam, generating a multi-source written product).

There are many investigations and analyses related to these ends. If I were to recommend a single source relevant to this topic, it would be one that is a couple of decades old and was conducted with college students. Simpson and Nist made several key points in their introduction that I believe continue to be of great importance. Even though, and perhaps because, they worked with college learners, I think their most important observation was that treatments involving a skill such as highlighting or annotation require assurance that learners applying these study strategies do so effectively. They raise the same argument I have made multiple times: have learners ever received guidance in how they should go about strategies such as highlighting and annotation? If those investigating study techniques in college learners are concerned with what is often called treatment fidelity, a similar concern should be raised when investigating annotation, highlighting, and pretty much any study strategy with younger learners.

The Simpson and Nist study was conducted over a three-week period using the type of content college students would encounter in their classes (i.e., an extended period of time and authentic content). Students were exposed to sample annotated materials, provided examples of specific types of annotations (e.g., summaries, questions of understanding, potential examples), and offered samples of their own attempts at annotation for review by peers and the instructor. Performance on multiple-choice questions covering each target source was compared between the annotation group and a group asked to study and generate possible exam questions. The annotation group demonstrated a significant advantage on the MC exams.

This study made use of paper source material, but it offers a realistic example of what it takes to apply the treatment strategies in a way that generates an advantage. It cannot be assumed that common study techniques such as highlighting or annotation are applied in a cognitively generative way simply because some within a peer group use such strategies. It is very possible students have never been involved in a conversation focused on the use of such strategies or practiced them under supervision. Students read and write in most classes, but responsibility for developing proficiency in such skills in secondary and lower-level college classes may be one of those things assumed to be someone else’s job.

Simpson, M. L., & Nist, S. L. (1990). Textbook annotation: An effective and efficient study strategy for college students. _Journal of Reading_, _34_(2), 122-129.

AI writing is here – teachers be aware

I intend this post as a public service announcement of sorts for educators at all levels. If you have not personally explored AI-assisted writing, you need to do so, and I will explain how. I say this because your students will soon explore AI services that will generate written products for them, and you will have to adjust how you think about writing assignments as a consequence.

My suggestion is that you give a service called ChatGPT a try. Sign up for a free account. The service is very easy to use: once you have an account, you simply enter a description of what you want into a text box and wait for a reply. I think the sophistication of what is generated is likely to shock most educators, and this reaction should cause you to consider what access to a service like this means for the tasks you assign in your classroom.

Here are a couple of examples related to topics I might raise in a graduate educational technology course. These are challenges I might offer students, first to see if they understood a topic we had covered, and then to see if they could connect this topic to other cognitive phenomena.

Me: What is a semantic network?

ChatGPT: A semantic network is a graph-based representation of a knowledge domain. It is a way of representing structured knowledge by modeling the relationships between concepts. In a semantic network, concepts are represented as nodes in the graph, and the relationships between concepts are represented as edges that connect the nodes. This allows for the representation of complex ideas and their relationships in a clear and intuitive manner. For example, a semantic network could be used to represent the relationships between different types of animals, their characteristics, and the ways in which they interact with each other.

Me: How can a semantic network explain the tip of the tongue phenomenon?

ChatGPT: The tip of the tongue phenomenon, also known as “TOT,” is a common experience in which a person knows that they know a word or a piece of information, but they cannot recall it at the moment. One possible explanation for this phenomenon is that the information is present in the person’s semantic network, but it is not easily accessible at the moment. In a semantic network, information is typically represented as nodes in the graph, and the connections between these nodes encode the relationships between different pieces of information. When a person experiences the tip of the tongue phenomenon, it may be because the information they are trying to recall is present in the network, but the path to access it is not easily accessible. In other words, the person knows that the information is there, but they cannot retrieve it easily because the connections between the relevant nodes in the network are not strong enough.

Me: Can the semantic network explain the testing effect?

ChatGPT: Yes, the semantic network can help explain the testing effect, which is the phenomenon whereby taking a test on material improves subsequent recall of that material compared to simply studying the material. The testing effect is thought to occur because taking a test forces a person to retrieve information from their memory, which strengthens the connections between the relevant nodes in their semantic network. This makes it easier to access the information in the future.

——–
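For readers who want something more concrete than prose, the node-and-edge structure ChatGPT describes can be sketched in a few lines of Python. This is my own toy illustration, not an implementation from any cited study: the concept names, link weights, and the recall threshold are all invented for the example.

```python
# A minimal sketch of a semantic network: concepts are nodes,
# weighted links are edges, and retrieval follows links that are
# strong enough to traverse.
semantic_network = {
    "animal": {"dog": 0.9, "cat": 0.9, "aardvark": 0.2},
    "dog": {"animal": 0.9, "bark": 0.8},
    "cat": {"animal": 0.9, "meow": 0.8},
    "aardvark": {"animal": 0.2},
}

def retrieve(cue, threshold=0.5):
    """Return (recalled, blocked) concepts linked to a cue.

    Links below the threshold model a tip-of-the-tongue state:
    the node exists in the network, but the connection to it is
    too weak to traverse at the moment.
    """
    links = semantic_network.get(cue, {})
    recalled = [c for c, w in links.items() if w >= threshold]
    blocked = [c for c, w in links.items() if w < threshold]
    return recalled, blocked

recalled, blocked = retrieve("animal")
print(recalled)  # strongly linked concepts come back easily
print(blocked)   # weakly linked concepts are known but unreachable
```

In this sketch, “aardvark” is in the network but falls below the retrieval threshold, which is one way to picture knowing that you know a word while being unable to produce it.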

Just so you appreciate what you are reading, try this. Take the final question, “Can the semantic network explain the testing effect?”, and enter it as a Google search. You will generate plenty of hits, but the hits are not really answers to the question. The hits at the top of the list I received were explanations of what a semantic network is and represented content I would have to read to generate an understanding. I did find an explanation of the testing effect in another hit further down the list, but I did not find a resource near the top of the list covering a possible connection between the two. Educators may be familiar with the testing effect, but why it works so successfully is fairly speculative. I explain it by connecting the concepts of a networked associative model of memory and the cognitive mechanisms of the retrieval process, which is the kind of answer I would be looking for if I asked the question of students who had discussed these concepts with me. I was impressed that the AI put it together in a similar way.

So, AI writing is not simply retrieving what someone else has written (search); it is building from a huge collection of content at a more abstract level.

So? Well, students can “write” impressive products by giving an AI bot instructions. They may do this because they want to read an informative response the AI bot generates (note: search may soon be replaced by this type of AI process because it answers questions rather than just providing resources you can use to answer them yourself). AI can be your teacher.

However, if you are a teacher using writing to evaluate learning or writing skill, or if you are a teacher using writing as a way to encourage student learning (writing across the curriculum), the “writing” generated by AI is a challenge for you. It does not require much from the student in terms of content review, application of writing skills, or personal understanding. Like the math teachers who figured out that students could take a picture of an assigned problem and have an online service solve it and show a solution process, you are now dealing with a new reality. Extended writing is also much more difficult to monitor. You can watch students write a theme in class, but a research paper is a different beast.

Let’s start with this. Take some time to explore ChatGPT. You really need this personal experience to understand present capabilities. Don’t start by thinking about how you might ban access; this is not realistic, and it ignores the positive opportunities this type of tool offers. Are there opportunities you can think of? Now, what about limitations? What about citations? This service strikes me as a very intelligent colleague who can spout off on nearly any topic but leaves me with little I can read and consider myself. How could I, as a writer, even use content a service like this might generate in something I would write? How would I justify the statements made? My first reaction is that educators should review what they know about the Flower and Hayes writing process and consider what tangibles they might ask students to provide at each stage of the process. For example, what are the notes students generated from the sources they intend to cite in their final product? How did they put these notes together in getting to the final product?

One more thing. I wrote a previous post about ChatGPT for Medium in which I included ChatGPT’s response to a question I asked about how AI writing might limit what students learn from writing themselves. I thought the response was quite useful and honest (now I am anthropomorphizing). Honesty is not always something you get with human authors.
