
Center for Excellence in Teaching and Learning

University of Mississippi


Teaching Resources

Recap: Teaching in the Age of AI (What’s Working, What’s Not)

· Sep 21, 2023 ·

by Derek Bruff, visiting associate director

A robot writing on a sheet of paper on a cluttered desk, as imagined by Midjourney

Earlier this week, CETL and AIG hosted a discussion among UM faculty and other instructors about teaching and AI this fall semester. We wanted to know what was working when it came to policies and assignments that responded to generative AI technologies like ChatGPT, Google Bard, Midjourney, DALL-E, and more. We were also interested in hearing what wasn’t working, as well as questions and concerns that the university community had about teaching and AI.

We started the session with a Zoom poll asking participants what kinds of AI policies they had this fall in their courses. There were very few “red lights,” that is, instructors who prohibited generative AI in their courses and assignments. There were far more “yellow lights” who permitted AI use with some limitations. We had some “green lights” in the room, who were all in on AI, and a fair number of “flashing lights” who were still figuring out their AI policies!

Robert Cummings and I had invited a few faculty to serve as lead discussants at the event. Here is some of what they said about their approaches to AI this fall:

  • Guy Krueger, senior lecturer in writing and rhetoric, described himself as a “green light.” He encourages his writing students to use AI text generators like ChatGPT in their writing, perhaps as ways to get started on a piece of writing, ways to get additional ideas and perspectives on a topic, or tools for polishing their writing. Guy said that he’s most interested in the rhetorical choices that his students are making, and that the use of AI, along with some reflection by his students, generated good conversations about those rhetorical choices. He mentioned that students who receive feedback on their writing from an instructor often feel obligated to follow that feedback, but they’re more likely to say “no, thanks” to feedback from a chatbot, leading to perhaps more intentional decisions as writers. (I thought that was an interesting insight!)
  • Deborah Mower, associate professor of philosophy, said she was a “red light.” She said that with generative AI changing so rapidly, she didn’t feel the time was right to guide students effectively in their use or non-use of these tools. She’s also teaching a graduate level course in her department this fall, and she feels that they need to attend to traditional methods of research and writing in their discipline. Next semester, she’ll be teaching an undergraduate course with semester-long, scaffolded projects on topics in which her students are invested, and she’s planning in that course to have them use some AI tools here and there, perhaps by writing something without AI first and then revising that piece with AI input. (It sounds like she’s planning the kinds of authentic assignments that will minimize students’ use of AI to shortcut their own learning.)
  • Melody Xiong, instructor of computer and information science, is a “yellow” light in the 100-, 200-, and 400-level courses she’s teaching this fall. She teaches computer programming, and while she doesn’t want students to just copy AI output for their assignments, she is okay with students using AI tools as aids in writing code, much like they would have used other aids before the advent of AI code generators. She and her TAs offer a lot of office hours and her department provides free tutoring, which she hopes reduces the incentive for her students to shortcut the task of learning programming. She also knows that many of her students will face job interviews that have “pop quiz” style coding challenges, so her students know these are skills they need to develop on their own. (External assessments can be a useful forcing function.)
  • Brian Young, scholarly communication librarian, also described himself as a “green light” on AI. In his digital media studies courses, generative AI is on the syllabus. One of his goals is to develop his students’ AI literacy, and he does that through assignments that lead students through explorations and critiques of AI. He and his students have talked about the copyright issues with how AI tools are trained, as well as the “cascading biases” that can occur when the biases in the training data (e.g. gender biases in Wikipedia) then show up in the output of AI. One provocative move he made was to have an AI tool write a set of low-stakes reading response questions for his students to answer on Blackboard. When he revealed to his students that he didn’t write those questions himself, that launched a healthy conversation about AI and intellectual labor. (That’s a beautiful way to create a time for telling!)
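Melody Xiong’s point about interview “pop quiz” coding challenges can be made concrete with a small example. The problem below is a hypothetical illustration of that genre, not one of her actual assignments: a short, self-contained function that candidates would be expected to write on the spot, without AI assistance.

```python
# Hypothetical interview-style coding challenge: return the first
# character of a string that appears exactly once, or None if every
# character repeats.

from collections import Counter

def first_unique_char(s: str):
    """Return the first non-repeated character in s, or None."""
    counts = Counter(s)  # count occurrences of each character
    for ch in s:
        if counts[ch] == 1:
            return ch
    return None

print(first_unique_char("swiss"))  # -> w
print(first_unique_char("aabb"))   # -> None
```

Questions like this are short enough to pose live in an interview, which is exactly why students benefit from practicing them unaided rather than delegating them to a code generator.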

Following the lead discussants, we opened the conversation to other participants, focusing on what’s working, what’s not, and what questions participants had. What follows are some of my takeaways from that larger conversation.

One faculty participant told the story of a colleague in real estate who is using generative AI to help write real estate listings. This colleague reports saving eight hours a week this way, freeing up time for more direct work with clients. This is the kind of AI use we’re starting to see in a variety of professions, and it has implications for the concepts and skills we teach our students. We might also find that generative AI can save us hours a week in our jobs with some clever prompting!

Several faculty mentioned that students are aware of generative AI technologies like ChatGPT and that many are interested in learning to use them appropriately, often with an eye on the job market they’ll enter when they graduate. Faculty also indicated that many students haven’t really used generative AI technologies to any great extent, so they have a lot to learn about these tools. One counterpoint: Deborah Mower, the “red light,” said that her graduate students have been content not to use AI in their course with her.

Course policies about AI use vary widely across the campus, which makes for a challenging learning landscape for students. I gather that some departments have leanings one way or another (toward red or green lights), but most course policies are determined by individual instructors. This is a point to emphasize to students, that different courses have different policies, because students might assume there’s a blanket policy when there is not.

This inconsistency in policy has led some students to have a fair amount of anxiety about being accused of cheating with AI. As AIG’s Academic Innovation Fellow Marc Watkins keeps reminding us, these generative AI tools are showing up everywhere, including Google Docs and soon Microsoft Word.

Other students have pushed back on “green light” course policies, arguing that they already have solid processes for writing and inserting AI tools into those processes is disruptive. I suspect that’s coming from more advanced students, but it’s an interesting response. And one that I can relate to… I didn’t use any AI to write this blog post, for instance.

A few participants mentioned the challenge of AI use in discussion forum posts. “Responses seemed odd,” one instructor wrote. They mentioned a post that clearly featured the student’s own ideas, but not in the student’s voice. Other instructors described posts that seemed entirely generated by AI without any editing by the student. From previous conversations with faculty, I know that asynchronous online courses, which tend to lean heavily on discussion forums, are particularly challenging to teach in the current environment.

That last point about discussion posts led to many questions from instructors: Where is the line between what’s acceptable and what’s not in terms of academic integrity? How is it possible to determine what’s human-produced and what’s AI-produced? How do you motivate or support students in editing what they get from an AI tool in useful ways? How can you help students develop better discernment for quality writing?

One participant took our conversation in a different direction, noting the ecological impact of the computing power required by AI tools, as well as the ethical issues with the training data gathered from the internet by the developers of AI tools. These are significant issues, and I’m thankful they were brought up during our conversation. To learn about these issues and perhaps explore them with your students, “The Elements of AI Ethics” by communication theorist Per Axbom looks like a good place to start.

Thanks to all who participated in our Zoom conversation earlier this week. We still have a lot of unanswered questions, but I think the conversation provided some potential answers and helped shape those questions usefully. If you teach at UM and would like to talk with someone from CETL about the use of AI in your courses, please reach out.  We’re also organizing a student panel on AI and learning on October 10th. You can learn more about this event and register for it here. And if you’d like to sign up for Auburn University’s “Teaching with AI” course, you can request a slot here.

Update: I emailed Guy Krueger, one of our lead discussants, and asked him to expand on his point about students who have trouble starting a piece of writing. His response was instructive, and I received his permission to share it here.

I mentioned that I used to tell students that they don’t need to start with the introduction when writing, that I often start with body paragraphs and they can do the same to get going. And I might still mention that depending on the student; however, since we have been using AI, I have had several students prompt the AI to write an introduction or a few sentences just to give them something to look at beyond the blank screen. Sometimes they keep all or part of what the AI gives them; sometimes they don’t like it and start to re-work it, in effect beginning to write their own introductions.

I try to use a process that ensures students have plenty of material to begin drafting when we get to that stage, but every class seems to have a student or two who say they have real writer’s block. AI has definitely been a valuable tool in breaking through that. Plus, some students who don’t normally have problems getting started still like having some options and even seeing something they don’t like to help point them in a certain direction.

Recap: What Instructors Need to Know When Working with Neurodivergent Students

· Sep 13, 2023 ·

by Liz Norell, associate director of instructional support

In our August 8 blog, we shared a preview of our September 8 workshop on supporting neurodivergent students, including the following definitions of key terms:

  • Neurodivergent: a person with a brain that processes information in a way different from most individuals.
  • Neurotypical: a person with a brain that processes information in a way typical of most individuals.
  • Neurodiverse: a group of people with diverse ways of processing information, including those considered typical.

Because any learning space (or any group gathering, for that matter) includes people with diverse ways of knowing, processing information, and learning, all of these spaces will be neurodiverse. Meeting the needs of the neurodivergent, though, requires some awareness and intentionality.

Neurodivergent describes those who have some condition that impacts how their brains work; this might be a learning disability, attention deficit or anxiety disorder, obsessive-compulsive disorder, Tourette’s syndrome, bipolar disorder, and more.

In our workshop, we talked about the two most common neurodivergent conditions that show up in college classrooms: autism and attention-deficit/hyperactivity disorder (ADHD).

But first, a few notes about language. Some disability advocates prefer to use what’s called person-first language (e.g., a person with ADHD, a person with autism), whereas others prefer identity-first language (e.g., an autistic person, an ADHD person). Recent research suggests that in the United States, autistic adults largely prefer identity-first language (“I am autistic”) over person-first language (“I have autism”), whereas friends, family members, and professionals often expressed a preference for person-first language.

Within disability advocacy networks, the terms Asperger’s syndrome and high-/low-functioning are no longer used. We also use bipolar disorder to describe the condition, rather than the outdated manic-depressive terminology.

Autism

Many of us have a stereotype in our minds about what autism looks like. It’s usually a young boy who is somewhat withdrawn, doesn’t make eye contact, makes repetitive movements, and is hyper-focused on a few specific interests. There is some truth to that stereotype, but it’s far from a complete picture of how this neurodivergence manifests in all autistic people.

Things you might notice among students and colleagues who are autistic:

  • Repetitive movements, like rocking, tapping, or picking at things
  • Unusual eye contact (too little or too much)
  • Overly eager desire to share information
  • Anxiety during transitions or unexpected events
  • “The lights are too loud”
  • Needing something explained multiple times
  • Meltdowns (including crying) for no apparent reason
  • Fixation on minor details
  • Seemingly distracted gaze out a window or at something on the wall

Autism is formally known as “autism spectrum disorder,” or ASD, because there is a wide range of symptoms that people with autism experience, both in type and severity. People with autism can be very successful in educational and professional contexts, particularly if those settings are supportive of neurodivergent people. There are many often-overlooked benefits to having an autistic brain, such as the capacity to hyperfocus, unusual attention to detail, and creativity. The film Autism Goes to College and its companion podcast, which shares stories of neurodivergent college students, are terrific resources for those in higher ed.

ADHD

We also have a stereotype about what ADHD looks like. It’s usually (once again) a young boy who cannot sit still, talks quickly, and interrupts often. While this may be true, ADHD looks different across those who have this kind of neurodivergent brain. You might notice:

  • Cannot seem to talk fast enough to get ideas out
  • Bursts of speech
  • Paralyzing reaction to negative feedback
  • “A racecar brain with bicycle brakes” (a phrase coined by Dr. Edward Hallowell)
  • Crave novelty, excitement, and challenge
  • Work better when around someone else (“body doubling”)
  • Appear to be “lazy” (but this is a mischaracterization)
  • Difficulty sitting still–often fidgeting or needing to move
  • Easily distracted by something interesting

One of my favorite writers about ADHD, a British woman who writes as “Authentically Emily” online, says that ADHD brains have an “interest-based nervous system,” as opposed to a neurotypical’s “importance-based nervous system.” Emily says that ADHD brains need at least one of the following four components to focus their attention on a specific task:

  1. Novelty: The task is new and exciting in some way.
  2. Interest: The person is deeply interested in the task or topic.
  3. Challenge: A sense of competition or inherent difficulty motivates attention.
  4. Urgency: The task needs to be done right now.

If a task lacks all four of these components, it might be impossible for a person with ADHD to focus their attention on it long enough to complete it.

This is why Karen Costa, who is working on a book for educators about ADHD, urges instructors to provide structure and deadlines for their students: “Many ADHDers need more structure, not less. We need more deadlines, not less.” (Listen to Karen Costa on the Teaching in Higher Ed podcast to learn more.)

Tangible steps to support neurodivergent students

Although neurodivergence shows up differently for each person, there are many things you can do as an instructor to help give your neurodivergent students equitable access to learning and success. Here are a few ideas:

  • Openly welcome students of all abilities and neurotypes. Something as simple as this statement can help: “If you are disabled, I welcome a conversation to discuss your learning needs. I want to make sure you succeed in our course.” Don’t assume every neurodivergent student has a diagnosis or formal accommodations; the testing process to obtain a diagnosis is lengthy and costly, and the process of requesting formal accommodations is also time-consuming. Look for ways to design your course so students need as few accommodations as possible, and every student will benefit. (There are some good language examples here.)
  • Manage the sensory input in your spaces. Look for ways to dampen the sensory overload many neurodivergent students struggle to manage. For example, can you dim the lights? Use indirect or diffuse lighting? If you have a whole-class discussion, discourage cross-talk (multiple people talking at once) to limit the auditory noise. Allow students to rearrange the furniture to meet their needs, such as physically separating their chair/desk to a corner or sitting on the floor if they like. Explicitly welcome things like doodling, knitting/crocheting, coloring, using fidget toys, or occasionally moving around the room.
  • Communicate clearly and regularly. Share your expectations and directions multiple times and explicitly. Don’t assume your students can pick up on subtle cues or social norms. Limit your use of sarcasm or inside jokes—and if you use them, be over-the-top in signaling that you’re doing so. Define any subjective or unclear terminology in concrete terms (e.g., professional, collaboration, effectiveness, etc.). Provide written complements to any verbal directions and provide other ways to access important material outside of class.
  • Ensure social interactions are intentional and clear. Normalize stimming behaviors in and out of class—by which I mean make it explicitly allowable, even encouraged. Clearly define the purpose and expected roles in any interpersonal interactions (such as think-pair-share or groupwork activities in class). Whenever possible, offer an option to work alone. Provide students with alternative ways (other than verbally) to communicate or interact with one another and with you.
  • Create (flexible) structures. Hide or remove any unnecessary or unused features in Blackboard. Whenever a specific skill is needed (such as time management or completing a multi-stage project/assignment), provide how-to directions (or link to them). Scaffold assignments to create regular and meaningful deadlines. Assume good intentions and allow your students to make mistakes without debilitating penalties. (For more advice on this point, listen to this interview with instructional designer Cathryn Friel.)

If you want to learn even more, you can download the presentation deck or reach out to Liz in CETL.

Student Disability Services can also provide meaningful support for faculty who have neurodivergent students in their classes—as most of us do.

Policies and Practices for Generative AI in Fall Courses

· Aug 15, 2023 ·

by Derek Bruff, visiting associate director

Last Friday, CETL co-sponsored an online workshop titled “Generative AI on the Syllabus” with our parent organization, the Academic Innovations Group (AIG). Bob Cummings, executive director of AIG, and I spent an hour with 170 faculty, staff, and graduate students exploring options for banning, embracing, and exploring generative AI tools like ChatGPT, Bing Chat, and Google Bard in our fall courses.

After starting the session with a brief overview of the current landscape of generative AI tools, I asked participants to provide a few words describing how they were feeling about generative AI and its impact on teaching and learning. The resulting word cloud (seen below) is full of words like excited, curious, and intrigued, but also apprehensive, concerned, and overwhelmed. It was clear to me that the instructors present on Friday hadn’t completely figured out their approach to generative AI for the fall.

Bob and I then shared the CETL Syllabus Template, a document that has suggested syllabus language, information about UM policies, and links to course design resources. My CETL colleague Emily Donahoe led a small team of us this summer in developing a new section of that template focused on generative AI. If you’re still working on your AI policy for the fall, I highly recommend opening that document and scrolling to page 9 for some thoughtful options to consider. (Please note: We have released the AI section of the Syllabus Template under a Creative Commons license so that those outside of the University of Mississippi can use and adapt it as they like.)

For example, if you’re leaning toward prohibiting the use of AI text generators in your course, you might use the suggested syllabus language under the heading “Use of Generative AI Not Permitted”:

Generative AI refers to artificial intelligence technologies, like those used for ChatGPT or Midjourney, that can draw on a large corpus of training data to create new written, visual, or audio content. In this course, we’ll be developing skills that are important to practice on your own. Because use of generative AI may inhibit the development of those skills, I ask that you refrain from employing AI tools in this course. Using such tools for any purposes, or attempting to pass off AI-generated work as your own, will violate our academic integrity policy. I treat potential academic integrity violations by […]

If you’re unsure about whether or not a specific tool makes use of AI or is permitted for use on assignments in this course, please contact me.

You’ll need to fill in those ellipses with your own words, but this is a good start on a conversation with students about use of generative AI in their learning process in your course.

If you’re more open to student use of generative AI tools in your course, then read the section titled “Use of Generative AI Permitted (with or without limitations).” That section offers language to talk about the ways AI tools can support or hinder learning, appropriate uses of generative AI tools in your course, and options for disclosing one’s use of an AI tool on an assignment (e.g. the APA’s newly issued recommendations). That section also lists a number of AI tools as a reminder that we’re not just talking about ChatGPT here.

The recommendation section of the syllabus template expands on that idea:

As you craft your policy, please keep in mind that students may encounter generative AI in a variety of programs: chatbots like ChatGPT; image generators like DALL-E or Midjourney; writing and research assistants like Wordtune, Elicit, or Grammarly; and eventually word processing applications like Google Docs or Microsoft Word. Consider incorporating flexibility into your guidelines to account for this range of tools and for rapid, ongoing developments in AI technologies.

There’s also a caution about the use of AI detection tools:

Please be aware, too, that AI detection tools are unreliable, and use of AI detection software, which is not FERPA-protected, may violate students’ privacy or intellectual property rights. Because student use of generative AI may be unprovable, we recommend that instructors take a proactive rather than reactive approach to potential academic dishonesty.

After discussing the CETL Syllabus Template and its new language about AI, I shared a few ideas for revising assignments this fall in light of the presence of generative AI tools. I walked through an “assignment makeover” for an old essay assignment, a makeover that I detailed on my blog Agile Learning last month. In that post, I suggest six questions to consider as you rethink your assignments for the fall:

  1. Why does this assignment make sense for this course?
  2. What are specific learning objectives for this assignment?
  3. How might students use AI tools while working on this assignment?
  4. How might AI undercut the goals of this assignment? How could you mitigate this?
  5. How might AI enhance the assignment? Where would students need help figuring that out?
  6. Focus on the process. How could you make the assignment more meaningful for students or support them more in the work?

There were two big questions that emerged from the Q&A portion of the workshop. One, is there any practical way to determine if a piece of student work was ghostwritten by ChatGPT? Answer: No, not really. All the AI detectors are unreliable to one degree or another. Two, how might we teach students about the limitations of AI tools, like the fact that they output things that are not true? Answer: One approach is to have students work with these tools and critique their outputs, like these divinity school faculty did in the spring. (I think that example is my favorite pedagogical use of ChatGPT that I’ve encountered thus far.)

What’s next for this topic? Bob and I shared a few possibilities for UM instructors:

  • Auburn University has opened its online, asynchronous “Teaching with AI” course to faculty across the SEC, which includes UM faculty. This course is a time commitment (maybe 10 to 15 hours), but it’s full of examples of assignments that have been redesigned for an age of AI. Reach out to Bob Cummings if you’re interested in taking this course.
  • Marc Watkins, lecturer in writing and rhetoric and now also an academic innovation fellow at AIG, has also built a course available to UM instructors. It’s called “Introduction to Generative AI,” and it offers a lot for faculty, staff, and students interested in building their AI literacy, something Marc recommended in his recent Washington Post interview. To gain access, just contact me.
  • This past summer, the UM Department of Writing & Rhetoric offered an AI summer institute for teachers of writing. At some point this fall, they’re planning to offer the institute again for UM faculty, but with a broader scope. Keep an eye out for announcements about this second offering.
  • Here at CETL, our staff of teaching consultants are available to talk with UM instructors about course and assignment design that integrates or mitigates the use of generative AI. You’re welcome to contact me or any of our staff with questions.
  • Finally, we’re planning two more events in this CETL/AIG series on teaching and AI, both on Zoom later this semester. “Teaching in the Age of AI: What’s Working, What’s Not” is scheduled for September 18th, and “Generative AI in the Classroom: The Student Perspective” is scheduled for October 10th. See our Events page for details and registration.

A reading list on active learning in STEM courses

· May 5, 2023 ·

by Derek Bruff, visiting associate director

This spring CETL hosted a faculty learning community on the topic of active learning in large STEM courses. Over a dozen faculty from biology, chemistry, mathematics, physics, and other departments met every other week, mostly on Zoom, to discuss shared challenges in teaching large courses, particularly introductory courses. I organized and facilitated the learning community, and one of the fun parts of that work was selecting the readings for each of our meetings. I thought I would share the reading list here on the new CETL blog, in case it’s useful to other educators or educational developers.

Session 1: Introductions

For our first meeting, we shared introductions and challenges and discussed a couple of modern classics of the STEM education research literature:

  • “Active learning increases student performance in science, engineering, and mathematics,” Freeman et al (2014), https://www.pnas.org/doi/full/10.1073/pnas.1319030111
  • “Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math,” Theobald et al. (2020), https://www.pnas.org/doi/10.1073/pnas.1916903117

Session 2: The Coverage Challenge

By far, the most common challenge shared by learning community members was the need to cover a lot of content in these large courses. We tackled that challenge head-on in our second meeting using these readings:

  • “The tyranny of content: ‘Content coverage’ as a barrier to evidence-based teaching approaches and ways to overcome it,” Peterson et al. (2020), https://www.lifescied.org/doi/10.1187/cbe.19-04-0079
  • “Worried about cutting content? This study suggests it’s OK,” Supiano (2022), https://www.chronicle.com/newsletter/teaching/2022-07-07
  • “Inclusive and active pedagogies reduce academic outcome gaps and improve long-term performance,” Dewsbury et al. (2022), https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0268620

Session 3: The Flipped Classroom

One response to the coverage challenge is to shift some of the learning outside of the rather limited class time we have with students. The flipped classroom was thus the topic of our third discussion, using these readings:

  • Flipped Learning: A Guide for Higher Education Faculty, Talbert (2017). You can read most of the first chapter online. See also his final definition of flipped learning.
  • “114 studies on flipped classrooms show small payoff for big effort,” Barshay (2020), https://hechingerreport.org/proof-points-114-studies-on-flipped-classrooms-show-small-payoff-for-big-effort/
  • “The impact of a flipped classroom model of learning on a large undergraduate statistics class,” Nielson et al. (2018), https://iase-web.org/ojs/SERJ/article/view/179.

Session 4: Student Reactions

How do students respond to active learning instruction? It’s a mixed bag. See these readings:

  • “Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom,” Deslauriers et al. (2019), https://www.pnas.org/doi/full/10.1073/pnas.1821936116
  • “What I wish my instructor knew: How active learning influences the classroom experiences and self-advocacy of STEM majors with ADHD and specific learning disabilities,” Pfeifer et al. (2022), https://www.lifescied.org/doi/10.1187/cbe.21-12-0329

Session 5: Active Learning Classrooms

One reason for hosting this faculty learning community is the new science building on campus set to open in fall 2024. It’s full of active learning classrooms!

  • “Separated by spaces: Undergraduate students re-sort along attitude divides when choosing whether to learn in spaces designed for active learning,” Ralph et al. (2022), https://journals.sagepub.com/doi/10.1177/14697874221118866 or https://libkey.io/libraries/1153/articles/533235795/full-text-file?utm_source=api_597
  • “Transformation of classroom spaces: traditional versus active learning classroom in colleges,” Park & Choi (2014), https://link.springer.com/article/10.1007/s10734-014-9742-0

Session 6: Show and Tell

We didn’t have any readings for our sixth meeting. Instead, participants were invited to share a lesson plan or activity they had used in their courses.

Session 7: Group Work

Since active learning often involves group work, we took a deep dive into organizing and facilitating groups in our seventh session.

  • “Evidence-based teaching guide: Group work,” Wilson, Brickman, & Brame (2017), http://lse.ascb.org/evidence-based-teaching-guides/group-work/

Session 8: Exams and Evaluation

Our final meeting fell on the last week of classes, which made it an appropriate time to talk about evaluation.

  • “Science exams don’t have to be demoralizing,” Clements & Brame, https://cft.vanderbilt.edu/science-exams/

We will continue the faculty learning community in some form in the fall. If you’re a STEM instructor at the University of Mississippi and are interested in participating, please let me know!


