Notes and the Translation Process

I recently read a research article (Cohen and colleagues, 2013) about students' notetaking in college lectures that included interesting observations about the challenges students face. First, information comes at students quickly, and deciding what to record and then manually recording it is so demanding that students can do little more than get something down on paper or screen. The second challenge was what I found most interesting. The researchers proposed that students experience a linear flow of information that does not contain much of the structure the instructor is trying to communicate. The article argued that students must try to recreate that structure after they leave the lecture hall and proposed one approach for doing so.

This comment got me thinking about a more general model of learning from textbooks and presentations. Lecturers and authors must generate a product that is experienced linearly – i.e., presentations and books. Content creators have a structure in mind that guides the creation of what they produce, but with the exception of headings and subheadings in written material, that structure is difficult to share. I read elsewhere a suggestion that a presentation should flow from an outline and that the presenter should refer back to the outline from time to time to try to communicate this structure.

Thinking about the process perhaps at an even deeper level, I came up with the following representation. By increasing the complexity a bit, it might be possible to identify points of intervention.

So, this graphic is intended to suggest that the knowledge of a content creator is present as a cognitive network. To create a practical product for communication, the content creator has to transform aspects of this knowledge network into a hierarchically focused structure. I think an outline (physical or conceptual) is a good way to understand this transition step. This structured representation is then transformed into a linear representation that is shared in one way or another with an audience. As I suggested, a physical form of this outline may also be shared in some cases (the outline itself, or headings and subheadings). The learner then processes this input and, from this processing, perhaps consisting of several steps, attempts to generate their own network of personal understanding.
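The middle steps of this model can be illustrated with a toy example (the outline content below is my own invention, not from the article): a creator's hierarchical outline, stored as a nested structure, is flattened by a depth-first walk into the linear sequence a lecture or book actually delivers to the learner.

```python
def flatten(outline, depth=0):
    """Depth-first walk: turn a nested outline into a linear sequence,
    keeping only a depth number as a trace of the lost hierarchy."""
    linear = []
    for heading, subtopics in outline.items():
        linear.append((depth, heading))
        linear.extend(flatten(subtopics, depth + 1))
    return linear

# An invented fragment of a creator's hierarchical structure
outline = {
    "Memory": {
        "Encoding": {"Attention": {}},
        "Retrieval": {},
    }
}

for depth, topic in flatten(outline):
    print("  " * depth + topic)
```

The learner receives only the linear order; the depth values stand in for the structural cues (headings, outline references) a creator may or may not choose to share.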

The initial notetaking or perhaps highlighting would be a basic process, and perhaps many students decide this will be sufficient. However, those who propose study skills or personal knowledge management strategies focus on what other activities might be added to improve retention and understanding.

What other activities can be added to recreate the structure intended by the content creator or formed in a more personalized way by the learner? Some of these “post-processing” activities may be familiar. For example, creating concept maps, sketchnoting, the left-hand column and summary of Cornell notes, and the proposal that students take class notes on the left-hand page of their notebook and save the right-hand page for follow-up recollections and additions would all fit. All of these tactics involve at least basic connections if not hierarchical relationships.

For those interested in translating this processing of information into the perspective of personal knowledge management, you can substitute a “smart note” for a node in the concept map strategy and consider the similarity of the links created among notes by tags and forward and backward links. The sharing of this structure, as Obsidian makes possible with Obsidian Publish, offers a way to share both information and the more complex structure externalized by a content creator.

I have a book club colleague, History Professor Dan Alosso, who is building something like this for his U.S. History class. The idea is not to replace lectures but to offer related content as organized by the lecturer. Dan writes and offers videos through Substack.

So, what are the points of intervention I mentioned? Certainly, study strategy advocates have many ideas about the processing stage of the model I suggest. The sharing of a structure during or after the exposure of students to content is less frequently explored.

Reference

Cohen, D., Kim, E., Tan, J., & Winkelmes, M. A. (2013). A Note-Restructuring Intervention Increases Students’ Exam Scores. College Teaching, 61(3), 95-99.


AI in Readwise

How do most Readwise users use the service? Is it the central location into which you suck in the notes and highlights from the multiple tools you use to read the multiple categories of content you consume, so you can review and work with that content? Or is it a relay station between these sources and the tools you use to store, organize, expand on, and apply this content? I can’t really remember what I was thinking when I first paid the subscription price, but for the majority of the time I have used Readwise, it has mainly been a relay station.

For those who have never tried Readwise, it may be unclear why you would want to pay the price of a subscription. The first paragraph of this post may have meant little to you even though I think it represents a reasonable description of the ways Readwise is used. Consider this example. I have made use of Kindle for years and have a collection of more than 300 books. I highlight a lot while I read and add occasional annotations. Most of this content is nonfiction and the source for what I write about. All of these highlights and annotations are out there somewhere, but how do I locate what might be helpful when it is scattered across so many sources, many of which I might have read years ago? Readwise accepts the highlights from each of these books as they are automatically output from Kindle, and this entire body of material ends up in Readwise, where it can be searched.

Now, somewhere along the way, Readwise added Readwise Reader, and this addition became a major tool. With Reader I found a “read it later” tool I used mainly to collect web content I could highlight and annotate and then send the content I added or identified through the relay system to Readwise, or export it, depending on my whim of the moment.

Without describing other content sources, I hope you get the idea. Readwise allows the collection of highlights and notes from many different content sources. 

AI Chat within Readwise

Now, like many other digital tools, Readwise has added AI. This makes sense, as the AI can be used to chat with all of the accumulated content or, if you choose, certain designated content. The AI is easy to use, similar to other AI chats, and is powered by OpenAI’s GPT-4o model. If you are a Readwise user, you may not have noticed this recent addition (see the red square enclosing the small word Chat at the top of the following image). I have also used a red box to call your attention to Import. I will get to an important import issue at another point, but wanted to make certain you see how to get to the import options.

Selecting chat will bring you to the following page. Here you find the typical request for a prompt and some suggestions. The suggestions will change as you make use of this feature.

As an example, I entered a prompt related to a recent topic I have been exploring. I don’t generate my posts using AI, but I sometimes ask for something written in a format I might use as a model. Within the content the AI generated, you will see links (in blue). Selecting a link will show the highlight or note within Readwise that prompted that part of what the AI wrote (the second of the two images appearing below). You can get the full set of content stored on Readwise from the displayed snippet of text by selecting the snippet.

If you are a Readwise user, I assume you can easily explore the AI chat just by following the simple process I have outlined. This is not intended to be a full Readwise tutorial, but many can be found by searching online.

One additional comment

Most of what I write is not based on books, but rather on journal articles. I am an academic and this is typical of how we work. We read articles from many journals, and for the last 10+ years I have read nearly entirely from pdfs of journal articles. This is what I can access through my university library, and it is more suited to my work than even getting up from my desk and walking across my office to pull a journal off a shelf. I don’t want to highlight on paper because I want highlights and notes in a digital format.

I could have included the highlights from the hundreds of journal articles I have read in Readwise to create a massive collection of content I could explore via chat. However, I have not found a pdf reader that generates highlights Readwise will accept when I try to import the pdfs. This appears to be a common problem, judging from my exploration of the issue online. I will first note that you can highlight pdfs within the Readwise/Reader environment, but this has not been part of my workflow. I have found a way to fix the problem, which I will describe here, but it is unlikely I will now import my large collection of highlighted and annotated pdfs to Readwise one by one. I will explain the hack I have discovered for others who may want to do so.

Recall that at the beginning of this post I pointed out the import link for Readwise. This link will bring up the many import options. I automatically import from Reader and Kindle. There is also an option to import from pdfs. It is a one-pdf-at-a-time approach and requires that the pdfs with the highlights have been stored in the correct format. The import options are shown below.

I have multiple tools to highlight and annotate pdfs. Most recently, I have used Bookends and Highlights. Both are software for the Apple environment and work great on an iPad with an Apple Pencil. Unfortunately, the storage format is not acceptable to Readwise.

However, I found that I can open my highlighted and annotated pdfs in Preview, the universal Mac tool for opening many different data files. It turns out you can export from Preview in multiple PDF formats, and the first one I tried created a file that could be read by Readwise.

So, there is a way for those frustrated with the specific demands of Readwise.

Summary

Readwise is a powerful tool that stores the highlights and notes that have been added to a wide variety of content sources (e.g., web pages, Kindle books, Apple Books, pdfs). Recently, an AI chat capability has been added to Readwise that can be used to interact with the content stored there. Because the quantity of this content is immense and represents what a user has found interesting or useful, being able to ask questions of this content offers very interesting possibilities. The AI chat capability is easy to explore and may even represent a selling point for those considering paying the subscription to use Readwise and Readwise Reader.


Twitter Chats Now On BlueSky

Ten years ago or so, when I was still involved in teaching graduate courses in instructional design, a kludged way of using Twitter, popularly referred to as Twitter Chats, emerged and became popular within the education community. It is fair to say that I was not a fan, but in keeping with the charge for my course I spent a lot of time in such chats and exposed my students to the experience through class assignments.

I tried without luck over several years to get a student to do a thesis focused on these chats. I proposed creating a system based on research studies from the past analyzing classroom interaction. How much time was devoted to teacher talk and to student talk? Who initiates questions and who responds? Who responds to responses? Does the teacher rephrase requests for participation based on categories the teacher could be asked to provide about learner characteristics – e.g., male/female, advanced/struggling? What proportion of classroom interaction was devoted to maintenance, content, discipline, socialization?
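The kind of classification system I had in mind can be sketched with a small, invented example (the coded transcript below is hypothetical): each utterance is tagged with a speaker role and a category, and simple tallies begin to answer the who-talks and what-is-talked-about questions.

```python
from collections import Counter

# Hypothetical transcript, already coded as (speaker role, category) pairs
coded_transcript = [
    ("teacher", "content"),
    ("student", "content"),
    ("teacher", "maintenance"),
    ("student", "socialization"),
    ("teacher", "content"),
]

# Who does the talking?
speaker_counts = Counter(role for role, _ in coded_transcript)

# What proportion of interaction falls in each category?
category_counts = Counter(cat for _, cat in coded_transcript)

total = len(coded_transcript)
for cat, n in category_counts.items():
    print(f"{cat}: {n / total:.0%}")
```

A real study would of course need a defensible coding scheme and inter-rater reliability checks; the point here is only that a coded transcript makes such questions countable.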

An observation of my own regarding chats was that they were extremely inefficient in comparison to other technology tools – discussion boards, group video interactions. So what was the point? My proposal to students was that a classification system of chat transcripts would be a way to investigate questions related to chat behavior. 

I never did get a grad student interested in my proposal and then Twitter Chats seemed to fade away. Until now that is. I have switched from being a Twitter (X) user to BlueSky and see that many other educators have as well. I just saw that the old Twitter Chat procedures are now being promoted on BlueSky. This encouraged me to search for something I wrote years ago about my suggestions for improving these chats even though I thought other tools offered educators better learning and communication experiences. What follows is that content minimally modified to be more timely. I have left the original use of Twitter as the focus, but replacing Twitter with BlueSky would be legitimate as the chat techniques are identical. 

***

Many educators have taken to using Twitter as a tool for “discussions”. Among participants, these discussions are more commonly described as chats and may be used as a way for students to share content, but more commonly seem a way for educators and their colleagues to interact.

Twitter chats, often called edchats when used in education, tend to follow a particular format partly to take advantage of characteristics of Twitter and partly because the approach is an efficient way to impose a synchronous approach on a tool not necessarily designed to be used in the way it has come to be used. Twitter was developed to share comments with followers. An edchat does not require that participants follow each other.

The essential feature of a Twitter chat is a common hashtag. All comments during a chat must contain the same hashtag. A Twitter hashtag is the symbol # followed by some series of letters or numbers; e.g., #grabechat. Participants in a chat actually search for the designated hashtag rather than watch their Twitter feed.
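A minimal sketch of what that search amounts to, using invented tweets: keep only the posts that contain the designated hashtag, matched case-insensitively.

```python
def in_chat(tweet, hashtag="#grabechat"):
    """Case-insensitive check for the chat's common hashtag."""
    return hashtag.lower() in tweet.lower()

# Invented sample of a Twitter feed during a chat
stream = [
    "Great point about feedback! #GrabEChat",
    "Unrelated musing about lunch",
    "Q1: How do you use exit tickets? #grabechat",
]

chat_tweets = [t for t in stream if in_chat(t)]
```

Everything without the hashtag, including posts from people a participant follows, simply falls outside the chat.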

Following a series of tweets containing a common hashtag during a chat works best with a tool that automatically updates itself so the user does not have to repeat the search over and over again. My tool of choice is Tweetdeck (see image that follows). This tool allows an on-going search to be established based on a designated phrase (e.g., #ndedchat) and will keep this search current.

The other “rules” for a Twitter chat are conventions, i.e., made up rules. To have a synchronous chat, participants need to be online at the same time – e.g., Wednesday at 9 P.M. A variant called a slow chat, uses many of the same techniques but relies on an asynchronous approach – participants connect when they can over a greater amount of time.

The most common approach for a Twitter chat is a question and answer format based on a theme. A “moderator” may generate the questions for the week or participants may share responsibility for this task. Posting the questions before the chat allows participants to prepare. Some participants may even generate answers in advance and then paste them into the chat tool when the questions are presented. This reduces the time these participants must spend typing and allows them to spend the time thinking about what others have to say. This approach is uncommon, but would seem to lead to greater reflection (see my criticism of the typical chat that follows this description).

Another convention is used to deal with other typical challenges of an online discussion. Because real-time chats involving many participants have the potential to become disjointed, questions and answers are often numbered; e.g., Q1, Q2, … and A1, A2, … . The appropriate label is added to each question or answer. This approach allows individuals to make clear how their responses match with a specific question or earlier replies from other individuals. A typical hour-long chat seems to be based on 8-10 questions. Note that the inclusion of a hashtag and the indicator for a given question reduces the space remaining in any given tweet.
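The numbering convention is what makes it possible to re-thread a disjointed transcript after the fact. A sketch, using invented tweets: a Q or A label at the start of each tweet assigns it to a question thread.

```python
import re

# Invented chat transcript in posting order
tweets = [
    "Q1 What tools do you use for notes? #edchat",
    "A1 Mostly paper, moving to digital #edchat",
    "Q2 How do you review notes? #edchat",
    "A2 Weekly summaries #edchat",
    "A1 Cornell format works for me #edchat",
]

# Group tweets by question number based on the leading Q/A label
threads = {}
for t in tweets:
    m = re.match(r"([QA])(\d+)\b", t)
    if m:
        threads.setdefault(int(m.group(2)), []).append(t)

# threads[1] now holds Q1 and both A1 replies, in posting order
```

Without the labels, the interleaved answers would be very hard to reattach to their questions.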

Critical analysis and suggestions

I have participated in and viewed many edchats. These experiences have resulted in criticisms both of the technical tool and the way chats tend to unfold (the tactics).

I have fallen into analyzing educational technology experiences in terms of tools and tactics, and this approach may be useful here. The idea is to separate the consideration of the potential and actual perceived value of the tool (the specific service or application) and the tactics (the strategies of use). My assumption in the comments that follow is that the general goal for an edchat is professional development – the acquisition by professionals of new knowledge and skills. The existing tool is Twitter and the tactic is participant responses to a series of approximately 10 questions within an hour-long block of time.

Assumed advantages of tool (Twitter) – free, easy to learn, large installed base of users

Assumed advantages of tactic – educators are familiar with a question and answer format and can participate with little preparation

Issues

A general issue with social media is that once a platform (tool) has attracted a user base, new and better tools fail to gain participants because individuals are reluctant to migrate for fear their social connections will be lost. I think this is the case with Twitter in the education community. I think Twitter has inherent issues because of the brief comments it allows. This limitation and the time to enter comments from a keyboard or screen, in my opinion, leads to rather shallow interactions. It may be a great way to learn about new things via links, but it is not a tool suited to meaningful, synchronous discussion.

The edchat format (the tactic) has taken hold and it seems popular to have such chats. There is a certain momentum here. There is also the issue of doing it like everyone else does. Conformity seems to limit a consideration of both tool and tactic.

I tend to look at this setting as if it were a class I was facilitating. As educators, does the typical edchat generate the type of interaction you would want to see in your class? What would you change?

How to improve edchats – some ideas:

Prepare beyond the generation of a lengthy series of questions. Either come up with 2-3 questions of greater depth or offer a common preparation task (read this post, read this book, etc.). Perhaps the moderator for the week should either find a resource or write a position statement.

I find the questions and topics to be too general. As an academic frequently described as being abstract and never getting to the level of actual application, I understand this may seem a strange concern coming from me, but review some chats and see what you think. I try to check my own possible biases here by looking at the responses the questions generate. The questions seem to generate few specific suggestions or examples.

I see very little interaction. Put more bluntly – the discussions are seldom discussions. Sometimes a response from another participant is praised, but there are few reactions, counter examples, requests for clarification, etc. If this were a face-to-face classroom, the typical edchat would be similar to choral responding rather than a discussion. I would propose these limitations are the result of both the tool (lack of room for depth) and the tactic (too many questions and responding without preparation).

Blogging before discussing might be helpful. Taking a position on an issue before interacting can be productive. Give some thought to your position before you are tainted by what others have to say. Offer an example. Process your own experiences and externalize a position for others to consider. Post before you participate. A moderator and other participants might then use these comments to request clarification or note differences of opinion.

Some comments on tools.

I admit at this point that it is difficult to isolate tool and tactics. I think moving beyond Twitter would be helpful.

I think it is time to consider other tools. I have always had access to discussion tools, and I see greater opportunity for depth in synchronous commenting and responding when using these tools.

I understand that folks enjoy the social experience of Twitter chats, but I think it important to consider whether group socializing is the primary goal.

I am not familiar with all of the tools available to educators. Does the state or school offer a general set of tools (a discussion option, a blogging option)? What about Zoom or Teams?

Twitter chats may be the “in thing” but it may be time to think through the tool and the tactics and either make adjustments or move on to a better tool and improved tactics.

Summary:

1) Reduce the number of questions and give more thought to the type of questions used

2) Have a pre-session expectation for preparation of some type. I think expecting a product related to this preparation is always helpful. Somehow, the popularization of “flipping” various education experiences should apply here. Prepare before you participate should be the expectation.

3) The moderator needs to encourage more give and take rather than limiting “discussion” to call and response. As I have already suggested, existing position statements that can be contrasted would be a great place to start. I understand the concern with how stating a different position will be received, but the generic positive reactions add little.

4) Consider other technology tools.

5) Generate a discussion summary (perhaps the moderator or a designated discussant). Did the summarizer learn anything?

Given these observations, I encourage you to form your own opinions. I wish Twitter chats had been analyzed more empirically, but to my knowledge this had not been done at the time this content was generated. It is easy enough to explore on your own.

The following video summarizes some of these ideas.


The Space Between Encountering Information and Application

One way to characterize Personal Knowledge Management (PKM) is to suggest it involves the analysis of actual and potential tactics applied between encountering information and the application of that information. I came to this topic with a background in the development and evaluation of technology tools for academic studying which involves considerable overlap with PKM. I think it fair to say that studying offers an advantage to interested parties because it has a superior theoretical framework and a large volume of theory-driven research. PKM seems to have developed within a framework I could describe as logical rather than research-based, but it is related to methods of considerable longevity (e.g., commonplace books, note-taking within procedural systems such as the Zettelkasten). 

This post was prompted by the announcement and availability of a new version of Mem.ai. There are many digital note-taking tools available, but for some time I have concentrated on two – Obsidian and Mem.ai. My rationale has been that I wanted to invest sufficient time in creating and using a personal knowledge management system so that I could offer credible comments on the tools I use and the tactics that are recommended and that I have employed. Part of this involves building a significant collection of notes over an extended period of time. Many recommended practices cannot really be evaluated with a small body of material used for a short period of time. 

When I started using Mem it was because I wanted to explore how AI could be applied within a PKM system. With time, Obsidian extensions allowed several different ways to add AI to Obsidian so there was no longer a unique difference, but I have continued to use both nonetheless. 

Comparing Obsidian and Mem.AI

When comparing how Obsidian and Mem serve writers between reading and writing, each platform takes a distinct approach to facilitating the transition from note-taking to writing.

Obsidian

Obsidian is known for its flexibility and emphasis on linking notes to create a network of ideas. It supports a bottom-up approach to writing, where notes are interconnected through backlinks and tags, allowing users to discover relationships between ideas organically. This method aligns with the slip-box or Zettelkasten approach, which encourages the creation of permanent notes that can stand alone and be easily integrated into future projects. Obsidian’s use of markdown files and its ability to handle large volumes of notes make it a powerful tool for writers who prefer a structured yet flexible environment for developing their ideas.

Mem

Mem, on the other hand, focuses on enhancing the linking capability through AI-driven suggestions. It extends beyond manual tagging and keyword searches by proposing related ideas and documents, which can come from the user’s own mems or those shared by team members. This AI-driven approach aims to improve the retrieval and linking of information, making it easier for writers to access relevant content and insights. Mem’s design is centered around the concept of a “second brain,” where storage, retrieval, and linking are optimized to support the writing process.

Key Differences

  • Linking and Organization: Obsidian relies on manual linking and tagging, while Mem uses AI to suggest connections.
  • Flexibility vs. Automation: Obsidian offers more flexibility in how notes are organized and linked, whereas Mem provides automated suggestions to enhance the linking process.
  • User Experience: Obsidian’s interface is more suited to users who prefer a hands-on approach to organizing their notes, while Mem’s AI features cater to those who appreciate automated assistance in discovering connections.

Both platforms offer unique advantages depending on the writer’s preferences and workflow. Obsidian is ideal for those who enjoy a more manual and customizable approach, while Mem provides a more automated and AI-enhanced experience. 

When is the process the product?

Part of the marketing for the original Mem.ai made the argument that the AI capabilities freed users from some of the process requirements of other note-taking tools. The differentiation of notes into folders and the connecting of notes by manual links was not necessary. You could search and chat with your notes to accomplish your goals. Such capabilities were there (@ in Mem to create a link instead of the [[]] in Obsidian), but were claimed to be unnecessary.
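The two manual-linking notations can be seen in a toy note (the note text is invented, and the @ pattern below is only a rough stand-in for Mem's mention-style linking, which in the app resolves to a selected note title):

```python
import re

# Invented note text using both linking notations
note = "Connecting [[generative processing]] to @spaced repetition ideas."

# Obsidian-style wiki links: anything between double brackets
obsidian_links = re.findall(r"\[\[([^\]]+)\]\]", note)

# Crude approximation of an @-mention: the word after the @ sign
mention_links = re.findall(r"@([\w-]+)", note)
```

Either way, a human has to decide that a link belongs there; the Mem pitch is that AI-suggested connections make much of this manual effort optional.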

“The AI can do it for you” is what concerns practitioners in some domains for some purposes. Educators may be concerned that students use AI to complete homework assignments. Writing assignments can easily and plausibly be completed by giving an AI tool a prompt. With writing there are two interrelated problems. As a skill, writing needs to be learned, so the subskills (procedures) involved in skilled writing go unpracticed when the work is done by the AI. A separate concern is that writing is a way to process the content that is the focus of the assigned writing task, and this processing does not happen when the AI provides and assembles the content. There are counters to these concerns, as AI can contribute in different ways, allowing some subskills to be ignored so that others can be emphasized, but this possibility is making my example unnecessarily complicated.

With note-taking, I think the argument for what I am calling the manual approach is based on the assumed value of generative cognitive processing. I describe a generative activity as an external task that is likely to increase the probability of an internal (cognitive) process. When proposing an example of a generative activity, I use questions. In theory, connecting new concepts with experiences is an important learning process. Individuals may or may not do this on their own. If I request that they provide an example of concept XXX, it is fairly likely they will think and come up with something. Hence, questions function as generative activities.

The organization of notes into folders or categories and the searching for connections to be made permanent with links involves thinking and decision-making that is less likely without the commitment to tasks that require such thinking. These actions may also serve generative functions. While educational researchers have proposed and evaluated many manual processing activities associated with note-taking as part of studying, to my knowledge such research does not exist for some of the procedures recommended by recent, digital note-taking gurus (see an earlier post on the lack of such research). So, unlike the abundant research on the benefits of provided and self-generated questions, the specific activities associated with digital (and manual) note-taking skills are largely untested. This is partly the reason I continue to duplicate my collection of notes within both Obsidian and Mem. Personal experience is a weak research tool, but better than nothing. 

This is what I mean by questioning whether the processing requirements of the various note-taking tools strongly contribute to the eventual application. The recent development of systems such as Obsidian and Mem seems more likely driven by the long-term use of information in comparison to what might be associated with academic studying. Purpose and the length of exposure to these processes may be important differentiators. What is interesting about Mem is that it has come out with the argument that AI can eliminate many of the activities focused on and debated by Obsidian users.

Summary

This post attempts to identify and differentiate two note-taking and note-using approaches that can be associated with two specific products. While both systems can now be used in the same ways, the proposed differences are interesting. How important are the manual actions AI can eliminate? I will offer one observed advantage of the AI capabilities that can be applied with either system: with the large collection of notes I have now accumulated, I have found that AI prompts surface useful notes I would not have identified based on the manual links I had created. I suppose there might have been benefit in continued exploration through links, tags, and search, but I must deal with the reality that I could not necessarily make the effort. Perhaps continuing to use both and adding links to connections identified by AI makes the most sense.
