Improvement of Academic Intern Experience and Performance in Introductory Computer Science (CS)


Categories: GSI Online Library | Teaching Effectiveness Award Essays

by Tiffany Perumpail, Electrical Engineering and Computer Sciences

Teaching Effectiveness Award Essay, 2018

Academic Interns (AIs) are former CS61A students who help current students in office hours and labs. Any student who passed CS61A can go through training and volunteer as an AI.

In Spring 2017, our students’ Final Survey responses showed that the helpfulness of CS61A AIs and the quality of their instruction could be improved. Additionally, staff members reported that some AIs were providing partial solutions to students, which was hindering student development. A former CS61A GSI dubbed this issue “GPS Syndrome,” because some students were becoming reliant on step-by-step guidance in office hours instead of gradually building their own intuition and independence. Furthermore, some CS61A staff members received anecdotal feedback from AIs themselves that they did not always feel confident helping particularly struggling students or tackling challenging problems.

In Fall 2017, I modified the existing AI training regimen to move away from lecture-style training and toward hands-on teaching activities in which AIs are assigned different student and teacher roles and gain practice teaching difficult topics. I then released lab and project guides targeted at new teachers. Instead of detailing solutions, these guides emphasized tips for explaining concepts and advice on helping students discover common bugs. The goal of these resources was to show AIs how to help students build a foundation to succeed in computer science, rather than narrowly focusing on each problem.

Additionally, I made weekly AI Piazza posts in which I introduced the new topics and assignments and discussed different analogies that AIs could use for explaining historically challenging topics, such as Tree Recursion, OOP Nuances, and Interpreters. I also linked AIs to existing teaching resources for particularly difficult topics. Posting regularly encouraged AIs to share their own experiences and to ask teaching-related questions on Piazza, something that had not happened the previous semester.

In order to receive more direct input from AIs, I created an Anonymous Feedback form, which was open throughout the semester, and a Mid-Semester Feedback form, which asked specific questions about the Fall 2017 changes and about which resources AIs felt were the most helpful. By the end of the semester, the Anonymous Feedback form garnered over 45 responses on issues ranging from feeling isolated in section to feeling inadequate after explaining a problem poorly. Over the course of the semester, I tried my best to address new feedback as it was submitted and to tailor the advice I gave in the weekly posts accordingly.

Finally, to change the culture surrounding the AI role, which is often viewed as merely a “stepping stone” to becoming a paid staff member, I created the Outstanding Academic Intern award. Based on staff nominations, student feedback, and tutoring ratings, four AIs were chosen to receive this award and were celebrated at an award ceremony with CS faculty at the end of the semester.

To assess the effectiveness of these changes to the AI experience, I examined the Mid-Semester feedback surveys we received. In Spring 2017, in response to “Overall, have AIs been able to provide you sufficient help in office hours/homework? (On a scale of 1 to 5, 1 being insufficient and 5 being excellent)”, the mean of 778 student responses was 3.48 with a standard deviation of ~1.01. In Fall 2017, in response to the same question, the mean of 1,271 responses was 3.81 with a standard deviation of ~0.93. A two-sample heteroscedastic t-test (Welch’s t-test) on the two sets of survey responses yielded a p-value of 0.0064. Additionally, I received anonymous qualitative survey responses from staff members attesting that AIs were less likely to provide partial solutions and appeared more focused on building student independence in Fall 2017 than in Spring 2017.

By releasing teaching guides and reducing the emphasis on assignment solutions, our staff expanded our AIs’ approach to instruction. By opening more avenues for AI feedback, actively responding to AI concerns, and rewarding AIs for their excellent contributions, we showed our AIs that they are valued. As a result, we improved AI performance in our course and positively impacted the experience of our students.
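For readers unfamiliar with the test mentioned above, the short Python sketch below shows how a two-sample heteroscedastic (Welch’s) t-test is typically computed with scipy. The rating lists are placeholders standing in for the actual Spring 2017 and Fall 2017 survey responses, which are not reproduced here; only the method is illustrated.

# A minimal sketch of a two-sample heteroscedastic (Welch's) t-test.
# The rating lists below are illustrative placeholders, not the actual
# Spring 2017 / Fall 2017 survey data.
from scipy import stats

spring_ratings = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]   # placeholder 1-5 ratings
fall_ratings   = [4, 5, 3, 4, 4, 5, 3, 4, 4, 5]   # placeholder 1-5 ratings

# equal_var=False requests Welch's t-test, which does not assume
# the two samples share the same variance ("heteroscedastic").
t_stat, p_value = stats.ttest_ind(spring_ratings, fall_ratings, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")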