What Harvard Business School Has Learned About Online Collaboration From HBX

From Harvard Business Review by Bharat Anand, Jan Hammond, and V.G. Narayanan
URL: https://hbr.org/2015/04/what-harvard-business-school-has-learned-about-online-collaboration-from-hbx#


In June 2014, Harvard Business School launched HBX, its new online education initiative. At the time, the increasingly common approach in online courses was to create a "lean back," individualized experience in which students primarily watched streamed video lectures centered on experts. Stimulating lectures with star professors, perhaps, but lectures nonetheless, with a passive, individualized learner experience. We wanted to change that.

To that end, we designed a program that would:

  • Focus on solving real-world business problems. Videos capturing real managers discussing real problems would anchor the course offerings, to help students understand the applicability of even the most abstract and esoteric concepts.
  • Encourage active learning. Students would engage with the material in “lean forward” mode, rather than passively watching video lectures. Students would not spend more than 3-5 minutes on the platform before being required to interact with the material.
  • Foster social and collaborative learning. Students would engage meaningfully and regularly with others on the platform. We believed that such collaborative learning would not only make it more engaging, but would draw participants more deeply into a process of discovery.

Here are some of the most important things we’ve learned since launching HBX about creating a social, collaborative experience online:

Collaboration doesn’t occur in a vacuum. An important premise behind our efforts was that students must first know each other before they can engage with each other. To do this, we helped participants “meet” virtually: everyone was required to first submit a personal picture and complete a publicly viewable profile before they could view any course content. This simple requirement had meaningful results: on the first day of the program, everyone who uploaded personal profiles viewed the profile information of, on average, 40 other participants. Typical online courses showcase course materials on the first page; we showcased the students and their locations around the world. This simple modification created the conditions for engaging collaboration later on.

Incentives matter. Collaboration doesn’t just occur by getting people together. You need to trigger it. To do this, we tied participation and online collaboration to course grades. Students responded. More than half asked a question of others, and roughly 75% answered others’ questions at some point during the program. In comparison, most discussion boards in online courses follow a typical power law: estimates are that fewer than 10% of participants are responsible for over 90% of contributions, while the rest barely engage. (We are grateful to Professor Andrew Ho for this data.)

Collaboration doesn’t require large incentives, though. Ours were framed in terms of requiring a “basic level” of participation — after which the momentum of the process itself took over.


Fruitful collaboration comes as much from asking questions as it does from answering them. We rewarded both. We even encouraged students to categorize the type of question they were asking — rudimentary, esoteric, abstract, practical, and so on — effectively giving them license to ask questions on a broad set of issues related to a learning topic.

Capability, commitment, and continuity are important to collaboration. When we assemble teams in the workplace, we rarely pick a random assortment of people. The same principle holds online.

To judge candidates’ capability and commitment, we required everyone to submit an application for HBX CORe (denoting the Credential of Readiness), our first program, which comprised three courses: Business Analytics, Economics for Managers, and Financial Accounting. In addition to requesting past academic information, the application required each candidate to write an essay. There’s no doubt that the product’s price also played a role in selecting people who would be committed: it cost $1,500 per participant, for a program that was spread over 10 weeks and required 10-15 hours of work per week. (Financial aid was also available for those in need.)

Two of our design principles were also critical to fostering ongoing collaboration. First, we set out to create an engaging experience, platform, and curriculum. Second, we required “rough synchronization” in CORe. Specifically, we released material and set module due dates to ensure that participants were roughly synchronized with one another. Thus, although participants were given flexibility in when they completed course materials between deadlines, all participants were required to meet module deadlines. This approach ensured that participants were never more than a few days ahead of or behind other participants. Online courses are increasingly gravitating toward greater flexibility, asynchrony, and “non-linear” personalized pathways through a course. Flexibility is desirable; but shared experiences trigger peer conversations and collaboration.

Our selection, motivation, and design choices affected the “stickiness” of the learner cohort from week to week — and, as a result, affected ongoing collaboration. When large numbers of participants disengage or participate poorly, conversation with, and learning from, peers suffers. In contrast, participants in HBX CORe reported enthusiasm at learning from their peers, some reporting spending more time on non-required peer-help discussions than on the required components of the course.

These findings are relevant to completion rates in online courses. There may be good reasons not to over-emphasize such metrics. But when drop-out rates exceed 90% (as they do on average), conversation and collaboration will inevitably be undermined.

Norms of online collaboration can be shaped. Online interactions typically go one of three ways: they never take off at all, they get better over time, or they converge to the “lowest common denominator”. One sees this on many websites too: uncivil or disrespectful comments drive out good ones. Other times, however, it’s the reverse. But which outcome occurs need not be left to chance. Online norms can be shaped, too. At the program outset, we tried to actively encourage certain behaviors, to discourage others, and to clarify standards for online conversation. We encouraged participants to disagree with others — but to do it with respect. Rude, uncivil, or arrogant behaviors were not welcome. Those who violated these and other policies faced warnings, and possibly expulsion from the program.

The biggest benefit to being clear about norms at the outset is that, once you’ve created the broad conditions and expectations, students end up monitoring each other. We saw this on our platform on several occasions. In the words of game theorist Tom Schelling, one often just needs a focal point to coordinate behaviors around. Leave it to chance, and you could be unpleasantly surprised.

Social learning can substitute for expert knowledge. Most education — even online education — is based on the belief that experts drive learning. This may be true when it comes to creating the content (indeed, each of our courses was created over several months, painstakingly, by a team of faculty and course associates). But it’s not the case when it comes to the student’s understanding of the material, learner retention, or learning by discovery. Have the expert intervene too early, and it can crowd out discovery.

Social learning was one of the major bets we made at HBX. It also yielded some of our most profound lessons. When students asked a question on the platform, we resisted the urge to jump in, instead leaving it to peers to do so. When students struggled with a concept, we resisted (even more) the urge to jump in and correct the group, relying instead on peers to do so. The results were remarkable (and somewhat humbling if you’re an expert): in more than 90% of cases, questions were precisely and accurately answered by the peer group. One of our HBX CORe students had previously been the head teaching assistant (TA) for one of the most popular MOOCs (massive open online courses). He noted that a typical approach to intervention in online courses was to amass larger numbers of TAs, so that some “expert” was ready to intervene quickly on any question as it arose. One unintended consequence? “Soon, everyone expected the TAs to answer questions. No one took it upon themselves to do so.”

“Trust the students,” we preach in our classrooms. It’s one of the hardest axioms to follow. The temptation for an expert, or a teacher, is to help at the first sign of confusion. But letting it simmer can aid learner discovery. Indeed, the power of collaboration comes when you trust the group so that they are strongly encouraged — forced, even — to resolve problems on their own. Let an expert intervene, and you could undermine collaboration itself.

Online collaboration can overcome certain biases and behaviors that arise in face-to-face environments. Clearly there are advantages in face-to-face interaction. You can engage more; it’s personal; it’s real. However, online collaboration also offers certain advantages. The lack of face-to-face interaction might help overcome certain biases and behaviors that arise — often unintentionally — in our traditional classrooms.

One of the salient differences we discovered concerned gender. In our first cohort of participants, women were more likely both to ask questions of others and to answer others’ questions — nearly twice as likely as men. Higher female participation online is the opposite of what we typically encounter in our residential classrooms. These outcomes merit a far deeper exploration of what drives such differences and biases, and of whether they persist in different settings.

So what are the results of the first HBX CORe program? Completion rates, student satisfaction, and engagement are the most common metrics used to evaluate online courses. In these respects, the results were notable. The completion rate, typically in single digits for MOOCs, was just over 85% for HBX CORe. Satisfaction scores were high too: 80-90% of participants rated the program content and teaching either a 4 or 5 on a 5-point scale. Engagement with course materials and with peers was extremely high: all participants who completed the program regularly submitted the required analyses and reflections, and a majority participated in content discussions with peers. Similar numbers are being seen in follow-on versions of the CORe program offered by HBX.

These were some of the most important things we’ve learned and discovered in our first HBX CORe program. There were others too. Make it fun, for example. And, make the groups as diverse as possible, because diversity creates the conditions for students to listen more carefully to others (perhaps there’s something you don’t know that others will) and fosters a greater willingness to learn. We are exploring these differences, and others, in ongoing programs through HBX.

As we continue to experiment and explore the potential of online learning, HBX has made a conscious decision to embrace social learning as one of the principles to anchor its online programs around. Online education in general has not yet recognized the great potential of social, collaborative learning. But it should.
