CGT 512 - Reading Reflection - Week 4
Cooper – Ch 2
Cooper identifies three models in this reading: the implementation model, the mental model, and the represented model. The represented model belongs to the designer, though. It is how the designer chooses to represent the way the program works, and it needs to match the mental model the users have. This makes a lot of sense. When would a designer not want to design for their users? Never.
I liked how Cooper talked about how we limit ourselves to what is already there. It's true. In the '90s we were stuck in Web 1.0 and the stupid animated GIF era. Why were we there so long? We didn't want to change, and we limited ourselves. I feel this is a great example.
SNIF-ACT: A Cognitive Model of User Navigation on the World Wide Web
This seems to be the first paper I am having a good amount of trouble understanding. I'm thinking it has to do with how links are laid out on a web page and how users follow them. It's based on scent… or something like that. I am going to read my fellow bloggers' pages to see what they think. Hopefully they can help me figure out this paper.
So I re-read part of it and I think I figured some of it out. It studies a model used to predict user navigation on the Web and to understand the cognitive approach users take while navigating. I was correct about information scent: it is the relevance of a link's text to the user's goal of finding information. Scent ended up being a better predictor of navigation than link position.
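To make the scent idea concrete for myself, here is a crude sketch in Python. SNIF-ACT actually derives scent from word co-occurrence statistics, so plain word overlap is my own simplification, and the goal and link texts are made up.

```python
def scent(goal, link_text):
    """Score a link by the fraction of the goal's words its text shares.

    This is a toy stand-in for SNIF-ACT's statistical scent measure.
    """
    goal_words = set(goal.lower().split())
    link_words = set(link_text.lower().split())
    return len(goal_words & link_words) / len(goal_words)

goal = "cheap flights to chicago"
links = ["Hotel deals", "Flights and fares", "Cheap flights finder", "Contact us"]

# The model predicts the user follows the highest-scent link.
best = max(links, key=lambda text: scent(goal, text))
print(best)  # Cheap flights finder
```

Even this toy version shows why link text beats link position as a predictor: the score depends only on the words, not on where the link sits on the page.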
They mentioned satisficing in this article. I barely remembered what it was about, but that's what they make Google for, right? So satisficing is a strategy that attempts to meet criteria for adequacy rather than to identify an optimal solution. Is this link something that could be what I'm looking for? Okay, I'm going to click it instead of looking around a little more to find what I actually need. That's just a small example of what it could be.
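The difference between satisficing and optimizing can be sketched in a few lines. This is my own illustration, not anything from the paper; the link list, relevance scores, and the 0.5 "good enough" threshold are all made up.

```python
links = [
    ("About Us", 0.2),         # (link text, relevance to my goal, 0-1)
    ("Pricing FAQ", 0.6),
    ("Plans & Pricing", 0.9),
]

def optimize(options):
    """Examine every option and return the single best one."""
    return max(options, key=lambda pair: pair[1])

def satisfice(options, threshold=0.5):
    """Return the first option that is 'good enough' and stop searching."""
    for text, score in options:
        if score >= threshold:
            return (text, score)
    return None  # nothing met the adequacy criterion

print(optimize(links))   # ('Plans & Pricing', 0.9)
print(satisfice(links))  # ('Pricing FAQ', 0.6) -- stopped early
```

The satisficer clicks "Pricing FAQ" because it clears the adequacy bar first, even though a strictly better link exists further down the page, which is exactly the behavior I described above.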
Heuristic Evaluation (link)
This link explains what a heuristic evaluation is, which is a good thing because I only had a general idea. It states a heuristic evaluation "is [a] discount usability engineering method for quick, cheap, and easy evaluation of a user interface design."
To explain more, it is a way for people to test the usability of a site. Usually there is a small group of examiners who test the site for usability compliance. The goal is to identify usability problems so they can be fixed during the iterative process of design.
Jakob Nielsen goes deeper into it, and after following the links I saw just how much is involved.
- How to conduct a heuristic evaluation
This article goes into how to actually conduct a heuristic evaluation. I liked how they emphasize isolation: only after all evaluators have finished their evaluations are they able to talk to one another and compare findings. This is really a good idea. I think it actually helps a lot because no one is influencing you. I also found out that heuristic evaluations are not intended to find fixes. They state, "Heuristic evaluation does not provide a systematic way to generate fixes to the usability problems or a way to assess the probable quality of any redesigns." I like how they identify the problems and then someone else has to figure out how to fix them. Sounds like more job openings to me.
- Ten Usability Heuristics
The 10 usability heuristics are listed below. I re-summarized each of them.
- Visibility of system status: Feedback should always be given by the system as to what’s going on.
- Match between system and the real world: Real world language conventions should be used with the user instead of system oriented terms to make it appear natural and logical.
- User control and freedom: A clearly marked emergency exit is needed when users choose a function by mistake, so they can leave the unwanted state without a long dialogue.
- Consistency and standards: Users should not have to wonder whether different words, situations, or actions mean the same thing.
- Error prevention: Fix the problem before it even causes a problem through good design. Eliminate error-prone conditions and tell users before they commit that error.
- Recognition rather than recall: Make everything visible. Do not rely on users to remember information from step to step. Instructions should be easily visible or accessible.
- Flexibility and efficiency of use: Give users the option to tailor and speed up their frequent actions. These shortcuts, called accelerators, are unseen by the novice user but let the expert user work faster.
- Aesthetic and minimalist design: No irrelevant information should be included.
- Help users recognize, diagnose, and recover from errors: Error messages in plain language that indicate the problem and suggest possible solutions.
- Help and documentation: Documentation is a necessary evil.
- Severity Ratings for Usability Problems
This article gives the severity ratings for usability issues. A specific number is assigned to each problem based on its severity, on a 0-4 scale. Severity is based on three factors.
- Frequency: common or rare
- Impact: will the problem be easy or hard for users to overcome?
- Persistence: is it a one-time problem that users can get past, or will it bother users repeatedly?
- Uptake in History
This is a paper where the researchers wanted to see how heuristic evaluation was being used in industry. They sent questionnaires to companies asking about their use of usability methods. They found that user testing was rated the most beneficial, with heuristic evaluation a close second. They also studied the relationship between a method's rated benefit and the number of times the method had been used. I would assume the more a method was used, the better its rated benefits, and I think that is the conclusion they came to.
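The 0-4 severity scale above can be sketched in code. Nielsen suggests averaging the independent ratings from several evaluators to get the final severity; the scale labels below are his, but the three evaluator ratings are made-up example data.

```python
# Nielsen's 0-4 severity scale for usability problems.
SCALE = {
    0: "not a usability problem",
    1: "cosmetic problem",
    2: "minor usability problem",
    3: "major usability problem",
    4: "usability catastrophe",
}

# Made-up example: three evaluators independently rate the same problem.
ratings = [3, 4, 3]

# Combine by taking the mean, then round to look the label back up.
mean = sum(ratings) / len(ratings)
print(round(mean, 2), "->", SCALE[round(mean)])  # 3.33 -> major usability problem
```

Averaging is why the independent-evaluation rule matters: if evaluators compared notes first, the ratings would not be independent and the mean would be biased.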
I got this error, so I could not read it: "Generic Exception - Invalid permissions to execute action." I found some interesting ones online, though, that looked pretty cool.