The iSchool’s First MOOC: Lessons Learned

I recently worked with a team that offered a four-week massive open online course (MOOC) here at the School of Information Studies at Syracuse University: “An Introduction to Data Science with R.” I’m an iSchool doctoral student and eScience Fellow, and I felt fortunate to have an opportunity to get my feet wet teaching data science research methods and to improve my presentation and video production skills in this new educational medium.

For the iSchool this MOOC had two purposes:

  1. Adding to knowledge about MOOCs and contributing to SU’s and the iSchool’s experience in this area, and
  2. Marketing our new data science certificate of advanced study (CAS). The MOOC built awareness of the program and gave prospective students an opportunity to “try before they buy.”

Two ways this MOOC advanced knowledge about MOOCs were our focus on group work for learning technical material (coding in R) and our strategy for implementing that group work, which prioritized a smooth, seamless user experience. I was pleasantly surprised to see that students participated in the group work and discussion forums, especially early on. Participation dropped off a bit later, perhaps because the course shifted to more technical material that didn’t lend itself to discussion. Unlike some other MOOCs, where students form their own groups (a process that can take weeks), our MOOC placed students into groups immediately. The MOOC team’s focus on technical support and user experience paid off: by the end of the course there were fewer than 30 emails about technical problems from a class of more than 800 students.

My role 


I made videos for each week of the course sharing additional online resources to supplement the course lectures and textbook. This was my first time making videos, and it was a great learning experience. One of the most challenging aspects was getting the content right for an audience that was harder to know because of its size. It turns out I’m not the only one with this problem; course content that’s too easy or too hard is among the reasons people don’t finish MOOCs.

Factors I considered when selecting extra resources:

  • Keeping weekly cognitive load manageable for students by avoiding concepts not addressed in that week’s material,
  • Elegance in explanation,
  • Unique explanation that deepened my own understanding of the concepts,
  • Accessibility (for people with disabilities and for platform diversity—e.g., I avoided sources that used Java for this reason),
  • Meeting the needs of novice learners, 
  • Meeting the needs of advanced learners,
  • Getting at why students should care about X, not just what it is or how to do it.

My biggest lessons learned 

  1. Video production can take a long time. It took longer for me to produce my weekly videos than it would have taken for me to deliver the same material face to face. Some of my favorite video bloggers give the impression that they flip on the camera, start talking, and finish in one take; I now suspect this isn’t the case, at least not for those who are new to the medium. Producing quality video content takes practice, practice, practice.
  2. There is a great deal of logistical overhead in running a MOOC. I was struck by how much work went into planning and running the mechanics of the MOOC. Is this because of the newness of the medium, or is it an inherent feature of the medium? I suspect it’s a bit of both. Just as regular online classes are more work than face-to-face courses, MOOCs are more work than regular online courses. Our MOOC team had an array of course instructors and platform specialists. Someone was available during the day and in the evening (including the weekend, when the discussion forums were busiest) to troubleshoot and answer questions.

MOOCs offer interesting research opportunities

There is much that still needs to be figured out about MOOCs. Some areas for further learning are how to:

  • Use assessment to improve learning in online environments. Improving the feedback loop between learner and instructor maximizes learning. Having students answer questions can be not just a method of assessing learning but also a way to learn the material. When we shorten the time between learning and assessment, we can prevent mistakes from solidifying in students’ minds. Our course had weekly quizzes to let students test their learning. Ideally, assessment would be even more frequent and more deeply integrated into the learning experience.
  • Support social learning in MOOCs in a way that’s manageable for instructors. We had some success with this, though a significant amount of work went into making sure there was always someone available to respond to questions, monitor the discussion forums, and grade group homework submissions. This degree of instructor and platform-specialist involvement could be too intensive for some instructors and educational institutions.
  • Make MOOCs work for novice learners. Though some view MOOCs as an instructional medium for novice students (e.g. the Cal State system may be using MOOCs to meet demand for introductory courses), there is evidence that suggests successful MOOC completers are often advanced learners who are already in the workforce or who have advanced degrees. Sixty-one percent of our participants said they were taking the course for professional development.
  • Improve the efficacy of various online features for learning by running A/B experiments within MOOCs. Experiments often aren’t feasible in educational settings, and most online courses do not have enough participants for inferential statistical analysis. A MOOC, with hundreds of users returning every day to an educational environment, offers an opportunity to test online features for improving learning.
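
The kind of A/B analysis described above can be sketched with a simple two-proportion z-test comparing, say, completion rates between two randomly assigned cohorts. This is a minimal illustration, not anything our MOOC actually ran; the group sizes and counts below are invented for the example.

```python
# Hypothetical sketch: comparing completion rates between two randomly
# assigned MOOC cohorts with a two-sided two-proportion z-test.
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(success_a, n_a, success_b, n_b):
    """Return (z, p_value) for the difference between two proportions."""
    p_a = success_a / n_a
    p_b = success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)  # pooled proportion
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value
    return z, p_value

# Invented numbers: variant A, 120 of 400 students completed;
# variant B, 150 of 400 completed.
z, p = two_proportion_z_test(120, 400, 150, 400)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With hundreds of participants per arm, even modest differences in completion rates become detectable, which is exactly the statistical power that smaller online courses lack.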