I found this week’s reading to be quite interesting. I really liked how the book described the revision process as “re-visioning,” which made me realize that you should gain a new perspective on your document through this process. In addition to the material on revising, I found the reading on usability testing to be quite practical, though I did find some of its claims somewhat questionable.
While reading the assigned material, the section on document cycling gave me pause. We become biased from working with our own documents, but I feel that co-workers and supervisors may become biased in this process as well. Many times these documents are developed alongside those audiences, so they too may lose their fresh perspective. These audiences may also be very knowledgeable in the subject you’re writing about; after all, you work in the same department. What I’m trying to say is that even though the reading recommends draft cycling with co-workers and supervisors, I think it’s imperative to assess their preconceived notions of the document before relying on them completely. On another note, I believe usability testing is a much stronger approach to assessing your document because of the different audiences it reaches.
The reading describes many different types of usability tests, and one of them immediately stuck out to me. Quite simply, the text states that performance tests are used to test instructions and procedures. Well, there you have it! Most of us are writing instructions and procedures, so this seems quite appropriate. There is a caveat to performance testing, though, or really any kind of testing that requires monitoring subjects: pointing a video camera at someone or sitting there taking notes can significantly alter the results of a test. So I’m a little lost as to how one should best go about this. Putting someone, such as a novice, on the spot like this can really pressure them and change how they would normally use the document. I think it would be very different if they were sitting at home using the document, rather than being observed “through cameras and a one way mirror,” as the text puts it. That just sounds intimidating to me! I do want to answer the questions the text poses, such as “Did they seem to find the document easy to use?” and “Where did they stumble or get frustrated?” I think the instructor’s blog really pinpointed my problem here and has led me to a decent solution for now. Still, I think it’s imperative to pay attention to how you monitor your subjects. I plan on doing the closest thing to a performance test that I can from the choices outlined in the instructor’s blog.
The closest option to a performance test I could find was the document markup, “where users read your documentation as they perform the task and mark any places where they get confused.” I think this will let me see how easily my different audiences can actually use the document while they critique its usability as they go. It’s almost as if they’re monitoring themselves by marking up the document. I think this will make for a much more comfortable experience than a full-fledged performance test with mirrored glass and video cameras.
Hopefully, using this test along with the revision process will lead to a much more useful document. It all comes back to the recurring idea that we need to understand our audience and put ourselves in their shoes as much as possible! I think the processes and steps we’ll be taking are the closest we can get to actually being in their shoes!
The Laboratory
Matt, I think you are right in pointing out that some forms of usability testing can themselves "get in the way" of usability. This is an issue, however, with any kind of testing. I was listening to "Mike and Mike in the Morning" a few days back and they were discussing a poll conducted by ESPN and a university I can't remember. The survey was attempting to gauge sports fans' attitudes about steroids in baseball. The survey was as follows:
How much do you care about steroids in baseball?
See the problem? For starters, the poll was self-selecting: it was not sent to a random sample of fans, but was completed by fans who sought out the survey. Additionally, it offered a limited range of responses. It also didn't assess how these attitudes manifest themselves. For instance, does caring a lot lead fans to stop watching baseball, or to watch more of it? This is clearly an issue for usability. You don't just want to know whether something is confusing; you want to know what the confusion will do to or for the user.
This is why we need to think carefully about our usability tests. As you point out, the context of a test can affect the outcome. And, just as importantly, so can the questions themselves. Every question suggests a certain kind of answer.
Questions Suggesting Answers
Classes I've taken at Purdue, from political science to statistics to data mining, have all highlighted this notion that every question suggests an answer. What this means for the pollster is that results need to be interpreted carefully, and questions that produce excessive bias need to be thrown out. A professor in an Information Systems class I've taken also pointed out that when researching a user's needs for a new information system, observing the user work with the old system and identifying his or her issues is a great starting place. However, when people know they are being observed, they follow the correct procedures more carefully, and you might never notice potential problems.
Lab testing
I think you bring up a really interesting point that I hadn't considered before. The more I think about it, though, the more I think I would be very intimidated if someone told me to sit in a room with a camera and people watching and to use a set of instructions to see how well they work. That is definitely a lot different than when I go home after purchasing something, pull out the instruction manual, and read it at my desk. So while you're right that it is important to monitor users while they try the instructions to see where they stumble or get confused, I think the document markup is a good usability test. Because users will be doing this at home, at their own pace, it seems like it will give you a much better representation of how most users would actually be using your instructions.
Patrick Griffin
pgriffin@purdue.edu
Document Markup
For my usability tests, I will also be using the document markup. I think this method provides the most benefit to those of us writing instructions on very technical topics. In the past, instructions I have written have seemed perfect to me when I reviewed them. Only after someone pointed something out did I realize I had been subconsciously relying on my previous insight. While that might work for expert audiences, it is imperative that we make sure we are explaining things to our more novice audiences. This test will help ensure that we are adequately explaining details to both of our audiences.
Is document markup the way to go?
I understand your point that document markup is a very effective format for evaluating a set of instructions. However, is it the best format for this class? Unless you are writing a set of instructions on how to use the editing notes within Microsoft Word, I would rather evaluate your document by summarizing what I thought I would do differently. To me, that is the summary usability test, not the markup test. I agree that we are trying to get the most effective feedback possible, and many times we have oversights in our work; I just don't think this class allows us to use the markup test as effectively as we could if we had printed documents in front of us.
Mike Sheridan
It might be a bit hard
I do agree with you that the markup test will be harder to do in a digital format than it would be with a physical copy, especially since we have to submit our files as PDFs. Personally, I have no experience creating comments in PDF files. Perhaps those of us who hope to use the document markup test will be able to also submit our files as Microsoft Word documents. If so, I do not think the document markup test will be too hard to do; in the past I have had good experiences with Microsoft Word’s commenting feature. I do not think the summary test would be very effective for my instructions, because all of the steps are equally important: if one step is omitted, the outcome will be affected. I hope to stress this as I write my instructions.
Mark-ups with Personality
For mark-ups, I've found that as an editor it's usually easiest to look at a physical copy and make written remarks. If I have a thought, I can write it down instantly on a hard copy, whereas I might lose it while fumbling around trying to get it into an electronic format. Plus you can viciously red-line elements and draw expressive pictures to make points.
I believe many of Purdue's labs have scanning capabilities, and if you can't take advantage of that it shouldn't be too hard to find a friend with a scanner. You could also digitally recreate comments off of a written mark-up, but it seems like that's a lot of wasted effort if the mark-ups are legible.
I do agree this will be a bit of a challenge
Mike and Ben, you both make very valid points. I do agree that submitting these documents will be a challenge. I had already taken this into consideration a little, though: I was thinking about using a scanner in the DLC to scan a marked-up document to PDF for submission. I think as long as it's legible, this will suffice for the class. I do understand where you're coming from when you say a summary test would be sufficient. However, I would like direct feedback on which sections of the document work and which don't, and I think the markup usability test will best provide me with that feedback.