Socially Sourced Feedback: The Experiment

This semester has been quite a challenge – I have really pushed myself (in addition to my students) in CEP 820. Quick background – CEP 820: Teaching K-12 Students Online is one of the required courses needed to receive the NP Endorsement from MSU, and it can also be taken as an elective in the online MAED program. The main culminating project in the course is the creation of a complete online course module. The creation of the module is scaffolded throughout the semester: students must evaluate and choose a CMS, and they are highly encouraged to use content from their professional practice to create the online or hybrid course (or unit).

I have been involved in educational technology for about 12 years now; based on my past experience, one might categorize me as an expert (or at least highly proficient) in assisting with technology integration in cross-curricular and multi-disciplinary educational settings. If you’re familiar with the TPACK model, I have the T and P down, and I help people who have the “C” (subject matter expertise in science, special education, composition, etc.). Therefore, to give my students a rich, authentic assessment of their final projects, I wanted to connect them to people who had the “total package” (a play on TPACK): people with experience in online/hybrid course development (T), veteran teachers (P), and subject matter experts (C).

I am so lucky to have a large and diverse Personal Learning Network (PLN). This network consists of professors, MAET alumni, PhD students, practicing teachers, online learning experts, and on and on. Back in November I sent out an innocuous email to a cohort of people in my PLN. <Here is the text of the invite> In short, I was asking friends and colleagues to access a student course and use Jing + Screencast.com (both free tools) to record their reactions, feedback, and suggestions for improvement. I specifically targeted people in my network who I thought would match well with my current students.

I was able to recruit 16 brave souls willing to participate in the experiment. I sincerely want to thank the following people for giving the gift of their precious time to assist:

Now that I had my reviewers in place, the next task was to gather the details for 40 student courses and send them on to the external reviewers in individual emails. I had been giving feedback on the development of the online modules all the way through, so I was very familiar with the content. I also know my PLN very well, so I did my best to pair students with reviewers in their content areas, created a “form email” <see it here>, and shipped each reviewer 2 modules to evaluate.

Keep in mind that we were not working from any prior model or case study — this was a new venture for everyone involved. I did my best to allay the fear and trepidation on both the student and reviewer sides, putting forth a brave front and knowing that this would somehow work out. It could not fail with such a talented and adventurous group of students and colleagues!

By now I’m sure you’re interested in seeing what actually happened. Here are a few highlights (links open in new windows; make sure your audio is up!):

Troy’s review of Erin’s “6 Traits of Writing” unit on Weebly
Jessica’s review of Emily’s “American History” Moodle Unit
Sean’s review of Marc’s “Jared Diamond unit for World History” in Blackboard

So… what did I think of the whole process?

The downsides –

  • I am risking relying too heavily on my PLN. At the end of the semester, something like this is the icing on the stress cake. I purposely timed the submission of the final project 4 weeks before the official end of the semester to try to alleviate some of this stress.
  • This was incredibly tedious to manage. I wish I could have figured out a way to make the reviewer/reviewee process a little more automatic (see the sketch after this list), but it was all done manually. This required minute attention to detail: making sure the right hyperlinks were going to the right people and making sure no one fell through the cracks. With everyone using different course management systems, keeping track of all the access points was a bit of a challenge.
  • Not everyone turned in their assignment on time. I’m an understanding instructor and I know things happen (all of my students are working adults, many carrying 2 or more courses plus a full-time job), but it is not fair to the external reviewers to send them late submissions (as they too have full plates). Laeeq (my teaching assistant), Jess Knott (who graciously volunteered), and I handled the reviews of any late submissions.
  • I was not able to match everyone up with the “perfect” person; I had to stretch a bit when it came to content experts. (All of my reviewers were experts in online/hybrid course creation and pedagogy.)
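
On that tedium point: here is a rough sketch of the kind of matching-and-mail-merge script I wish I had written. Everything in it is hypothetical (the reviewer names, content areas, capacities, links, and email text are made up), and it only prints draft emails rather than sending anything:

```python
# Hypothetical sketch of automating the reviewer/module matching and "form email" step.
# Names, areas, links, and the template are placeholders; nothing is actually emailed.

from collections import defaultdict

# Each reviewer lists the content areas they can speak to and how many
# modules they are willing to take (two was my limit this semester).
reviewers = [
    {"name": "Reviewer A", "email": "a@example.com", "areas": {"writing"}, "capacity": 2},
    {"name": "Reviewer B", "email": "b@example.com", "areas": {"history"}, "capacity": 2},
]

# Each student module records its content area, CMS, and access link.
modules = [
    {"student": "Student 1", "area": "writing", "cms": "Weebly", "link": "http://example.com/1"},
    {"student": "Student 2", "area": "history", "cms": "Moodle", "link": "http://example.com/2"},
    {"student": "Student 3", "area": "history", "cms": "Blackboard", "link": "http://example.com/3"},
]

EMAIL_TEMPLATE = """To: {email}
Subject: CEP 820 module review

Hi {name},

Thanks again for volunteering! Here are the modules I've matched to you:
{assignments}

Please record your reactions with Jing and share the Screencast.com link with me.
"""


def assign_modules(reviewers, modules):
    """Greedy matching: prefer a reviewer whose content area matches the module,
    then fall back to anyone with capacity so no module falls through the cracks."""
    load = defaultdict(list)
    for module in modules:
        match = next(
            (r for r in reviewers
             if module["area"] in r["areas"] and len(load[r["name"]]) < r["capacity"]),
            None,
        ) or next((r for r in reviewers if len(load[r["name"]]) < r["capacity"]), None)
        if match is None:
            print(f"Unassigned (need another reviewer): {module['student']}")
            continue
        load[match["name"]].append(module)
    return load


def draft_emails(reviewers, load):
    """Print a draft form email for each reviewer with their assigned modules."""
    for reviewer in reviewers:
        assigned = load.get(reviewer["name"], [])
        if not assigned:
            continue
        lines = "\n".join(
            f"  - {m['student']} ({m['area']}, {m['cms']}): {m['link']}" for m in assigned
        )
        print(EMAIL_TEMPLATE.format(email=reviewer["email"], name=reviewer["name"], assignments=lines))
        print("-" * 60)


if __name__ == "__main__":
    draft_emails(reviewers, assign_modules(reviewers, modules))
```

Even something this simple would have kept the pairings, access links, and emails in one place instead of in my head and a pile of drafts.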

The upsides –

  • I have connected my students to AMAZING people.
  • My network was able to experience an alternative form of assessment in a “low-risk” setting. Low-risk in the sense that they could try it out with the 820 students. (We took on the high risk!)
  • The external reviewers appreciated learning how to use screencasting as a form of assessment/evaluation.

If I do this again…

  • I will be more explicit with my reviewers and give them some tips on microphone and recording techniques. (I have a degree in audio production… I can’t help but be picky about production value!) I provided this support structure to the students (screencasting was one of the technologies we focused on during the semester), so it’s only fair I do the same for the reviewers!
  • I will only ask external reviewers to perform one review each.
  • I will try to move the final project due date one week earlier to avoid the end-of-semester crunch.

Help me continue the evaluation of the experiment
There is so much more to talk about! With the evidence presented, I need to hear from the reviewers and students. I am on the fence about doing this again next semester (leaning more toward doing it again; I thought it was quite successful… but I am wavering a bit). CEP 820 students, was this a valuable experience for you? Evaluators, was this too much to ask? Others, are you intrigued to try this, or have you tried it and have tips to share or ways I can improve the process? Do you want to be on the reviewer list next semester!?

I look forward to continuing the conversation!

8 thoughts on “Socially Sourced Feedback: The Experiment”

  1. Great post! I thought the project was great, and not at all too much to ask. My only regret is that my last two reviewees got boring Jings instead of the Camtasia goodness due to my being sick. :P I do agree that a guideline of what you want us to tell them would be good; I was never sure I got as deep as expected, but they’re also new to the tech thing, so I tried to balance that. Super fun, overall!

  2. Hi Leigh,

    Your comments and insights in this reflection are wonderful — I wondered what the management of this task looked like from your perspective, and I appreciate your honesty here as I think about doing something like this in my own teaching.

    As a reviewer, I have to say that doing one review felt very manageable, and I would be happy to do it again. If, however, there were a way to get a peer-review process in place — with teachers from different institutions — that would be great, as we could then teach our students how to be responders and use the tools, too. (That could be a bit TOO much work, however!)

    Also, in the introductory screencast, I would encourage your teachers to provide a “tour” of their module. That would be much more robust than the static screenshot with a voice over that I watched as an introduction.

    Overall, I think that your idea of using socially sourced feedback was innovative and worthwhile, and I hope to see how it develops in the future.

    Thanks for including me in the conversation.

    Troy

  3. I must admit it took much longer than I anticipated to explore the courses via a screencast, complete the assessment rubric (hoping that it would be useful), and record a purposeful screencast response. I enjoyed it all. I love to play around with technology and be part of the new ideas you come up with. Plus, it keeps me connected with the MSU EduTech community.

    I agree with Troy; doing one is very manageable. However, I did like having two to compare, along with links to some of the others, just to know the possibilities. A rubric for reviewers, or perhaps a webinar, would help us offer feedback in a consistent way. I never felt trepidation, as it was understood that our feedback was just one piece of the pie and the final assessment would come from you. For that same reason, did the CEP 820 students appreciate the feedback from outsiders?

    Yes, you should do this again and so will I, if you’d like. I’m proud to be in your PLN!

  4. Hi Leigh,

    This was a valuable experience for me because I gained a lot of experience with Moodle and online learning in general. If I were a reviewer, I would also prefer to see and hear the “tour” of the students’ modules that Troy mentioned. It would give the reviewers a better overall “picture” of the modules being reviewed, and I think it would benefit your students (teachers) as well. As a student, I did appreciate the outside feedback.

  5. Leigh…
    I really liked having an “outsider” review my module. It made the whole experience feel more real, knowing that actual people were going to look at my course. It forced me to pay close attention to detail and not just slap something together that I thought my instructor would be looking for. The whole experience was great and I would definitely continue with it. It’s a valuable piece of the class… and WAY better than simply having other class members provide feedback. We are all just learning, and it was great to hear true experience from our reviewers. LOVED the class. Thanks.
