It’s all about “me”. Due to my busy schedule, I have limited time to provide feedback on design discussions, project plans and other technical documentation that I’m asked to review. Simplifying my review tasks means that my feedback will be more timely and of higher quality.
When I author documents, I want to maximize the value of the feedback I get back. So, it’s all about them: the easier I can make it for reviewers to provide high-quality feedback, the faster I can finalize my document’s content. Which means it’s also about “me”, the author. What aspects of a collaborative review help achieve those goals, then?
First, let’s make the distinction between collaborative review and collaborative authoring. The collaborative review process needs to be addressed separately if it is to be optimized. Table 1 summarizes the major differences between these two collaborative processes.
Aspect | Collaborative Authoring | Collaborative Review |
---|---|---|
Who can change content? | All participants | Author only |
Scope of distribution | Small – a working team | Tens to hundreds |
When are document content updates visible? | Immediately | When the author publishes a version |
What document history can be retrieved by reviewers? | Undo of each edit | Any version |
In collaborative authoring, there is less distinction between a reviewer role and an author role. Even though an individual will ultimately be responsible for content, all participants are encouraged to actively create content and sometimes even edit existing content.
Pure collaborative authoring tools and environments provide capabilities that best support the middle column in the table. These tools include features like shared, real-time editing and immediate change propagation to recipients.
The collaborative review process is typically more controlled than collaborative authoring. For business documents, a collaborative review phase will often follow a collaborative authoring phase. The content is more stable than in the collaborative authoring phase and the scope of distribution and feedback can be much wider. Using collaborative authoring in this later stage would be counterproductive. With a larger number of reviewers, there would likely be too much content churn for readers to successfully follow along.
In a collaborative review, reviewers do not change the document content directly to provide feedback. Instead, they mark up the document with additional information for the author and other reviewers. A reviewer selects the text she wants to refer to and then adds a comment into a text box that overlays the document content. This approach offers a dual advantage: the document content is never altered, yet the author can accept a reviewer’s suggested change with a single click.
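To make that concrete, here’s a minimal sketch of what such a markup layer might store. The names and fields are hypothetical, not any particular product’s; the essential point is that a comment is a record about a selection, kept alongside the content rather than inside it:

```python
from dataclasses import dataclass

@dataclass
class Comment:
    """A reviewer comment anchored to a text selection. The comment
    lives alongside the document; the content itself is untouched."""
    reviewer: str
    selected_text: str    # the exact text the reviewer highlighted
    start: int            # character offsets of the selection
    end: int
    body: str             # the note shown in the overlay text box
    suggestion: str = ""  # optional replacement text
    resolved: bool = False

def accept_suggestion(content: str, c: Comment) -> str:
    """The author's one-click accept: splice the suggested text
    over the recorded selection."""
    return content[:c.start] + c.suggestion + content[c.end:]

doc = "The rain in Spain stays mainly in the plain."
c = Comment("jane", "Spain", 12, 17, "this should say France", "France")
print(accept_suggestion(doc, c))  # The rain in France stays mainly ...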
Over the past few decades, many types of systems have been built to address specific aspects of collaborative review. Table 2 outlines the characteristics of these system types. The key capabilities of each type of system are highlighted, although they may also appear within the other kinds of support systems. For example, some file synch and share services provide versioning.
ECMs | File Synch and Share | Document Review and Comment |
---|---|---|
Versioning, central shared storage, access control | Active distribution, file access tracking | Time limited, version specific |
Seems great. So why are collaborative reviews still such a pain for reviewers and authors? As a reviewer, my time is precious. Although the above tooling is significantly better than using email with cut-and-paste of document content, it still lacks several capabilities needed to guarantee productive and timely reviews. In particular:
Leveraging a Reviewer’s Existing Knowledge
Reviewers are often already familiar with the content being sent their way, usually because they have read previous versions of the same document or related documents. Most systems don’t leverage that prior document knowledge, and they largely fail to recognize a reviewer’s interactions with a document’s earlier versions.
Properly Resolving Feedback Issues
A reviewer should be able to quickly understand a new version’s changes by comparing it against the content they last read. A tool or system can improve a reviewer’s productivity by presenting changed content in this personalized way, guiding each reviewer through the changes, saving them time and improving the overall review turnaround.
For example, Jane reads version 3 and then spends 4 days at a customer site. When she returns, she discovers that 5 new versions of the document have been published for review. She wants to see the latest content and how it has changed relative to version 3. She will also likely want to focus on the pages where she provided feedback via comments to the author.
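Serving up that view requires nothing more than remembering the last version each reviewer read. Here’s a rough sketch of the idea using Python’s difflib, with a hypothetical version store keyed by version number:

```python
import difflib

# Hypothetical version store: version number -> published text
versions = {
    3: "The rain in Spain stays mainly in the plain.",
    8: "The rain in France stays mainly in the plain.\nA new closing section.",
}

def changes_since(last_read: int, latest: int) -> str:
    """Show a reviewer only what changed between the version they last
    read and the latest one, however many were published in between."""
    old = versions[last_read].splitlines()
    new = versions[latest].splitlines()
    return "\n".join(difflib.unified_diff(
        old, new, fromfile=f"v{last_read}", tofile=f"v{latest}", lineterm=""))

# Jane catches up on five published versions in a single pass.
print(changes_since(3, 8))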
In most review and comment systems, a document version is circulated for comment with a review close date. Reviewers add their comments, which might result in a comment thread being captured before the review closes. Once the review version is closed, the author creates a new version with updated content and publishes it. In most of these systems, comments remain attached to the document version they were created for: a comment on version 3 does not automatically move to version 4. Moving comments forward into subsequent versions requires additional work by the author or reviewer. If the author doesn’t cut and paste unresolved comments into the new version, reviewers are forced to go back and find them in the previous version, then check whether the most recent content addresses the issues they raised. This is all very time consuming.
A better solution would be to automatically carry a comment forward into new document versions until the author or a reviewer marks it as addressed. Reviewers can immediately see whether their issues have been dealt with in the newly changed content. And reviewers can always go back to previous versions of the document to see the original text that the comment referred to.
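One plausible implementation of that carry-forward, sketched below with hypothetical types: when the author publishes a new version, every unresolved comment rides along, and marking a comment resolved stops the propagation:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Comment:
    body: str
    resolved: bool = False

@dataclass
class Version:
    number: int
    content: str
    comments: List[Comment] = field(default_factory=list)

@dataclass
class Document:
    versions: List[Version] = field(default_factory=list)

def publish_version(doc: Document, new_content: str) -> Version:
    """Publish a new version, carrying every unresolved comment
    forward so reviewers can see at a glance whether their issues
    were addressed."""
    prev = doc.versions[-1]
    new = Version(number=prev.number + 1, content=new_content,
                  comments=[c for c in prev.comments if not c.resolved])
    doc.versions.append(new)
    return new

doc = Document([Version(3, "draft", [Comment("this should say France")])])
v4 = publish_version(doc, "updated draft")
print([c.body for c in v4.comments])  # Jane's open issue is still visible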
As for authors, their time spent in reviews is equally precious, if not more so:
Tracking Reviewer Feedback When Content Changes
Not only should unresolved comments be automatically moved forward into new versions; their location should also remain as close as possible to the original content. Let’s say Jane highlights “the rain in Spain” on page 2 and adds a comment that says “this should say France”. Arthur, the document author, doesn’t address Jane’s issue but changes the rest of the document drastically. Now “the rain in Spain” is on page 4. The system needs to keep Jane’s comment associated with her original text selection as much as possible. If Arthur changes the text “under” Jane’s comment to “the hail in Spain”, the system still needs to track the location of Jane’s original comment. Systems should do their utmost to preserve reviewer feedback in the face of content changes. If the underlying text is completely deleted but the comment’s issue hasn’t been resolved, the comment should remain on the page of the version where the selected text last appeared.
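Keeping Jane’s comment pinned through that kind of churn usually comes down to fuzzy matching: look for the original selection in the new text, fall back to the closest surviving variant, and give up only when the text is truly gone. A brute-force sketch using difflib’s SequenceMatcher; the similarity threshold is an assumption:

```python
import difflib

def reanchor(selected_text: str, new_content: str, min_ratio: float = 0.6):
    """Find where a comment's original selection (or its closest
    surviving variant) lives in the new version. Returns a character
    offset, or None if the text was deleted outright."""
    # Exact match first: the selection may simply have moved.
    pos = new_content.find(selected_text)
    if pos != -1:
        return pos
    # Otherwise slide a window over the new text and keep the best
    # fuzzy match, so "the rain in Spain" still catches
    # "the hail in Spain". O(n*m), purely for illustration.
    n = len(selected_text)
    best_pos, best_ratio = None, min_ratio
    for i in range(max(1, len(new_content) - n + 1)):
        ratio = difflib.SequenceMatcher(
            None, selected_text, new_content[i:i + n]).ratio()
        if ratio > best_ratio:
            best_pos, best_ratio = i, ratio
    return best_pos

print(reanchor("the rain in Spain", "pages later... the hail in Spain falls"))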
Determining the Progress of the Review
Lastly, authors need to know whether review activity is happening. Although some systems will track a document download or open, few track reviewer interaction within the document: time spent reading or commenting on specific pages, as well as the extent of the review feedback provided. Although these measurements cannot determine reader understanding, they can help an author decide whether to reach out to particular reviewers.
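Even a coarse interaction log is enough to surface that. Here’s a sketch of the kind of aggregation I mean; the event shape is invented for illustration:

```python
from collections import defaultdict

# Hypothetical event stream: (reviewer, page, seconds_on_page, comments_added)
events = [("jane", 2, 140, 1), ("jane", 3, 5, 0), ("raj", 1, 0, 0)]

def review_progress(events):
    """Aggregate per-reviewer reading time and feedback volume so the
    author can spot who may need a nudge before the review closes."""
    totals = defaultdict(lambda: {"seconds": 0, "comments": 0, "pages": set()})
    for reviewer, page, seconds, comments in events:
        t = totals[reviewer]
        t["seconds"] += seconds
        t["comments"] += comments
        if seconds > 0:
            t["pages"].add(page)
    return totals

for reviewer, t in review_progress(events).items():
    print(f'{reviewer}: {t["seconds"]}s across {len(t["pages"])} pages, '
          f'{t["comments"]} comments')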
Doing it Better
Although existing collaborative document review systems are a great improvement over email-based review, they remain inadequate in today’s fast-paced, resource-constrained enterprise environments. What’s really needed are systems that serve up changed content to reviewers in a personalized way, and that keep valuable feedback history readily available rather than throwing it away with each new version. Ignoring history and reviewer knowledge will continue to produce missing feedback, missed deadlines, poor comments, misunderstandings and lost information in document reviews.
It’s time to raise the bar on collaborative reviews. The people I work with are doing it.