Friday, November 12, 2010

What a Difference a UI Designer Makes

I use an audio editing program called Audacity for various editing tasks. Most recently I used it to edit a sound file I extracted from a screencast I made using Camtasia Studio. I needed to be able to listen to a sound bite offline via my iPod.

I created the file but when I went to export and save the mp3 file, I was confronted with this:



I added a "Title" and an "Artist", but was overwhelmed when I had to decide a genre. I did eventually find "Vocal" (it was an interview from the web that I saved to listen to later). I had to scroll three times and read each and every genre name. Shouldn't this be in alphabetical order?

Once the file was complete I imported it into iTunes to add it to my iPod. When I checked the iTunes information I saw this:



All the genres in alphabetical order!

Should I really be this critical? Should I really compare open source software to software created by a multi-million dollar company? Well...yes. As far as any user is concerned, both products are "free" - one can use them without having to pay for them. (Although a donation to the open source developers is always accepted. If you use open source software, make a donation in time or money. I do.)

If open source software developers want their software to be taken seriously they need to pay attention to the details. Looking for the genre in Audacity is hard; looking for a genre in iTunes is easy.

Wednesday, November 3, 2010

One Check-Box - Thousands of Help Desk Calls

"Its just a check-box, that should be easy." This was a rallying cry from some of my fellow UI designers. "Oh, yeah. We can fix that in the user preferences. We'll just add a check-box." It never turned out that way. One check-box turned into two which turned into another tabbed panel, which turned into a wizard with a link to the company's web page... Well you get the idea. Design creep. If we had just stuck with the original check-box which was, actually, the best solution to the problem for this iteration of the design.

I've found that some of the most puzzling usability errors (for the user, not the designers) come from the simplest, but least thought-out, designs. Take, for example, the image below:



This log in screen is from a much anticipated and publicized portal I was required to use for an employer. We had been hearing for months how wonderful it was going to be, how useful, the answer to our prayers and then some. Trouble was, nobody could access it.

On the day it went live, we all got an email bright and early that morning encouraging us to log in, complete our profiles, add a picture, and explore its features. Thousands of employees tried. And failed. I don't know firsthand, as I wasn't in either of those departments, but I suspect the HR department and Help Desk phones lit up like Christmas trees. I imagine the calls went something like this:

"It won't let me type my D# in."
"I can't type anything. I tried to click in the box but nothing happens."
"I can't log in."

About an hour later every HR department in the company had to send out an email explaining how to log into the new employee portal. Really? Anyone who uses a computer logs into dozens of web sites, applications, banks, etc. One can ask the generic question, "How do you log into a computer?" The most likely answer would be, "You type in your name and then your password." Simple enough. But that is not how the interface pictured above works.

Before we could type in a "D#" (user name/employee number), we first had to agree to the terms of use. Why would we assume that the last item, in what appears to be a list of items to complete, should be the first item we must complete? And don't forget that the first two items do not look disabled. They look as if someone could actually start typing in them.

And it gets worse. We had to agree to the terms of use every time we logged into the employee portal. Not just the first time we used the portal. Not just the first time each day we used the portal. Every time we used the portal. This is an application that was connected to and sponsored by the Human Resources department. Didn't they know we agreed yesterday? They know our job grades, our salaries, and our managers' names, but they can't record that we agreed to the terms of use just five minutes ago?
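Recording that agreement once is trivial. Here is a minimal sketch in Python of the idea (all names are hypothetical, not the portal's actual code): keep one flag per user per version of the terms, and only show the agreement when the flag is missing.

```python
# Sketch: remember each user's terms-of-use acceptance so the agreement
# is shown once per terms version, not on every single login.
# All names here are hypothetical, not the portal's actual code.

TERMS_VERSION = "2010-11"     # bump this when Legal changes the terms
accepted_terms = {}           # user_id -> terms version the user accepted

def needs_terms_prompt(user_id):
    """Show the agreement only if this user hasn't accepted the current terms."""
    return accepted_terms.get(user_id) != TERMS_VERSION

def accept_terms(user_id):
    """Record that this user accepted the current version of the terms."""
    accepted_terms[user_id] = TERMS_VERSION
```

A user is prompted on the first login, never again until the terms themselves change - exactly the behavior anyone would expect from a system that already knows your salary and your manager's name.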

I also do not know how many persons with disabilities worked for the company, but they would have had an even more difficult time using the system. There is no logical tab order; in fact, some controls don't have tab stops at all (the "Enter" button), and the controls are not identified textually for screen reader users.

Guess what happened to the portal of our dreams? Very few people actually used it for its intended purpose. (The log-in wasn't the only usability issue, nor was the content so fantastic that people just ignored the usability issues.)

How much time was wasted by the Tech Support explaining that one must check the check-box first? How much time was wasted because the HR department had to write, then distribute the explanation email? How many customer calls were dropped because the phone system was over-loaded by employee calls?

Just a check-box. Indeed.

Monday, November 1, 2010

There Are No Stupid Users, Only Stupid Systems

I went to a local coffee shop this evening. I was going to get a cuppa joe, then sit outside and read my book. But first, I needed to use the restroom.

That's the problem. For the second time in four days, both the women's and the men's rooms were closed for cleaning. I understand that; the restrooms need to be cleaned. However, there was only one person cleaning both restrooms. Like me, she can only multitask in the virtual world. She cannot split herself in two and clean both rooms at the same time. So why were both restrooms closed? (If the women's room is closed or occupied, I have no trouble using the men's. I just ignore the extra fixture.)

Well, long story short, one of the staff got a little snippy with me and I decided to send a note to the corporate headquarters of the coffee shop. I went to their website and started filling out the form. Name, no problem. Email, no problem. Address, no problem. Then I ran into this:


"date of visit" Two things wrong here. Exactly what format should I be entering the date in? Is it "mm/dd/yyyy" or "mm/dd/yy"? Secondly, why am I "cleaning" your data for you? If I enter 8/16/10 and your database requires 08/16/2010, why can't you just parse the input text and format it for me? It is possible to do that - I've written code that parses text myself, and I'm not even a code monkey. (I'm a User Experience Professional - we don't write code, we just draw pictures of software. Hence the usability rant here.)

"time of visit" I'm not in the military, but I respect those who are. I admire them and not just because they know what time 21:00 is. I always have to do the math, and I don't do math well in my head. I always get it wrong. So why should I enter the time in military time?

I'm already rippin' mad, don't tick me off any more with your poor usability.

Accessibility - Part 1

Note: This blog post was originally published on my personal blog in July of 2010.




As you have probably read from my profile, I work for the online division of a large university. (I don't know if I can say which publicly, so I'll let you guess.) One of my jobs is to determine how accessible our online courses are for persons with disabilities.

I fell into this job; it isn't something I would have selected as a specialty. When I was a student at DePaul University in the Human Computer Interaction (HCI) program (note: it appears DePaul no longer offers an undergraduate degree in HCI - interesting), I debated taking the "Designing for Disabilities" course, but it didn't fit into my schedule when it was offered, so I passed. I finished the program, graduated in 1999, and was off to the work force with my newly minted degree.

In my first job as a User Interface Designer at SPSS Inc. (now an IBM company), I heard things in team meetings about "accessibility", "section 508", "government contracts", etc., but it wasn't part of my job. The lead UI designer on my team, Chuck P, was the contact person for any accessibility questions. I have to give Chuck P a shout-out. He was the one who supplied me with the best defense when co-workers (or anyone) asked, "Why do we have to do this? Does it really matter? How many blind students do we have anyway? It can't be that many." (I myself asked those questions. I hang my head in shame now at the memory of it.) Chuck P's answer was: "Because it's the law. And because it's the right thing to do!" He was the go-to guy for this. Until he left the company. Then I got a call from the marketing department wanting to know if I could check one of our products for accessibility.

Now, at the time, I didn't know much about accessibility, neither did my department manager. We had to learn. Fast. Very fast. Extremely fast, as the product was already about 60% complete.

Accessibility Tip #1: The time to think about accessibility is at the start of the project, not at the end.

Long story short: I learned fast, finished the project, became a victim of the tech downturn and got laid off, went to grad school, took the "Designing for Disabilities" course, graduated, got a crummy job, then got a better job (the one I have now).

So now I determine if our online courses are accessible.

So how does a person who has a visual impairment use a computer? He or she uses assistive technology, in this case either a screen reader or a screen magnifier in addition to some of the built-in features of the operating system the person is using.

Accessibility Tip #2: If the person using the computer cannot see the display, or cannot see it very well, he or she is probably not using a mouse.

People with visual impairments ("users" - the official term we UX people use) have a lot in common with "power users": power users tend to use keyboard shortcuts instead of mouse-centered commands like menus and toolbars. This article is from The Onion, but I know it was written by a usability person: http://www.theonion.com/articles/area-man-knows-all-the-shortcutkeys,1566/ .

I created the video below using JAWS, the screen reader I use at work. In it I show how a person could create a Word document, complete with font changes.

Some things to note while watching the video.

My mouse pointer never moved. I'll reiterate Tip #2: people with visual impairments rarely use the mouse. I worked on a project in which I had to look at the accessibility of an eBook reader. The developer insisted that the screen reader would read the menu bar of the application. I found it didn't work. The problem: the menus were only read when the user hovered over the menu items with the mouse. I know this because I was testing the application with only the keyboard. Ask a visually impaired user how to copy something from one document to another and he will say: "Hold down the shift key and use the arrows to select the text you want to copy, then hit control and c. Then alt-tab to the document you want to paste into and hit control v."

Screen reader users are honorary computer geeks. The screen reader tells the user she is in the "font dialog" on the "font page". A user who doesn't use a screen reader and who hasn't done any computer programming may not know what a dialog box is. (Trust me: I was helping a friend once, a pretty novice computer user, via the phone, and I told him to "Just close that dialog box." There was a silence, then he said, "Close Internet Explorer?") A screen reader user knows a dialog box is a window, a check box is a window, a list box is a window. (They even know what a list box is.) Technically, they are all windows, as they all inherit from the same base window class. If you are writing Help files for screen reader users, go ahead and call it a dialog box; they know what you are talking about. Oh, and include that "Accessibility" help topic. Many software companies do, including my former employer SPSS.

Sloppy interface programming is inexcusable. There are standards. Both Microsoft and Apple publish them online for Windows and Mac developers. I haven't read them all, but I'm sure there are Java, Unix, Linux, and Web standards as well. Use them. The colon at the end of a label means something. The ellipsis does as well. A tab order is important. Text boxes are not buttons. It is amazing to me to see software that is so sloppily written. Would we buy a book full of typos? Would you buy a CD full of badly recorded tracks? The eBook reader I tested followed no known programming standard.

This is Verdana in RGB (255, 0, 0). Don't make assumptions about what someone with little or no vision can do. Don't make assumptions about what any person with any disability can do. Heck, don't make assumptions about what any user can and cannot do. I repeatedly hear things like:
  • "Well, a blind person can't take a graphic design class." They can. Open up any Adobe Illustrator document in notepad. You can create images by editing the postscript.
  • "Would you go to a doctor who was dyslexic and couldn't read?" Well, actually I would. I know she isn't going to stop and check her old textbooks while I'm on the table in the ER.
  • "How can a deaf person edit a sound file?" Open up a sound file in an audio editing program and get back to me.

In the video I was able to change the font face (I'm sighted and even I don't know what Adobe Caslon Pro looks like), change the font color thanks to accessibility information programmed into Microsoft Word (RGB values), and change the style to "Small Caps".





Monday, October 18, 2010

Top 10 Things CIOs Need to Know About Accessibility

I found this list in a group discussion on LinkedIn.com: Top 10 Things CIOs Need to Know About Accessibility

I particularly like numbers 2 and 3.

"Designing in accessibility is much more cost effective than trying to “fix” deployed resources, so accessibility must be a part of the planning process for new and updating existing IT resources and services."

I have been called on by development teams to "check the accessibility" of a product, feature, or service after it has been built, after it has been incorporated into a product, after the ink has dried on the contract. The development team expected a completed checklist filled with check-marks in the "passed" column and nothing in the "failed" column. But checking items off a list never ensures anything. (Winberg) What becomes of the features that did, in fact, fail? I have had to be the one to request that a beautifully designed and engineered feature be removed from a product because it just couldn't be made accessible.

Accessibility must be a included in purchasing requirements and RFPs. Accessibility testing must be integrated into the evaluation of products for purchase. Products will only become more accessible when vendors are held accountable to accessibility standards. (Emphasis mine)

I heard an interesting story during a tutorial at a conference. A university was considering a service for the purchase of tickets to the university's sporting events. When the service was tested for accessibility, it failed, so the web master rejected the proposal to use the service until it was made accessible. The web master took a lot of heat for that decision, but, because the accessibility was added globally, all users of that service benefited.

Enjoy the list.


Reference:
Winberg, Fredrik. (1999). Discount Accessibility Engineering: Haven’t we Met Before. INTERACT 99 Workshop: Making Designers Aware of Existing Guidelines for Accessibility.

Wednesday, October 13, 2010

Accessibility Heuristics


Introduction


Note: this post was created from research and work done while I was a:


  • User Experience Specialist at DeVry University

  • Student at DePaul University

  • User Interface Designer at SPSS Inc., an IBM company


“The situation today with the existing guidelines for making new technology and information accessible (for example World Wide Web Consortium) reminds much about the state of HCI or Usability Engineering in the late 1980s”. (Fredrik Winberg)


Current accessibility guidelines are difficult for user experience designers and engineers to use. They need a simple, plain-English set of guidelines they can use during the development process to ensure the applications they are developing are accessible.

I first became exposed to accessibility while at SPSS Inc. My team lead there was the accessibility expert. When he left the company I started getting calls from developers and other designers with questions about accessibility. I needed to learn fast; there was a product that was about 60% complete and it needed an accessibility expert to look at it. I learned, and a few months later we had a fully accessible, drag-and-drop, data mining product. And it was written in Java. (The engineers on the product were amazing. They learned JAWS, explored the accessibility features in Java, and then programmed the application.)


As a graduate student at DePaul University, I expanded on the knowledge I gained at SPSS. Accessibility was my topic of choice in independent study courses. In my capstone course I wrote a set of accessibility heuristics that I will publish and expand on here in this post and in future posts.



The Heuristics


Honor display settings – A user sets the display properties for a reason. Make sure your application knows what they are and displays all its interface elements with those properties.


Provide a tab order -- Provide a well-thought-out tab order and include every actionable interface element. Do not include un-actionable items, for example disabled controls. For users who use the mouse, the tab order is not as important as it is to users who navigate via the keyboard. When there is a potential conflict, choose the tab order that makes the most sense to the keyboard user.
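The heuristic can be sketched in a few lines of Python (the control names below are hypothetical and tied to no particular toolkit): keyboard focus visits enabled controls only, in a deliberate order, and wraps around at the end.

```python
# Sketch of the tab-order heuristic: focus traversal skips disabled
# controls entirely and wraps at the end of the list. The control
# names are hypothetical, not from any real toolkit.

def next_focus(controls, current):
    """controls: list of (name, enabled) pairs in the intended tab order.
    Returns the name of the control that should receive focus next."""
    order = [name for name, enabled in controls if enabled]
    if current not in order:
        return order[0]  # focus landed on something un-tabbable; recover
    return order[(order.index(current) + 1) % len(order)]
```

Note that a disabled control never appears in the traversal at all - the keyboard user is never trapped on something she cannot act on.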


Indicate focus and selection -- Every interface element must indicate that it has focus, and, if applicable, which items are selected within that element. This information needs to be indicated onscreen and passed textually to the Operating System. Additionally, only the user has the right to change the current focus.


Provide textual information -- Provide text information for every interface element, even if that information doesn’t appear in the display. A Screen Reader cannot describe an interface element that it knows nothing about.


Follow the standard -- Use standard interface elements as they are meant to be used. There is no such thing as a single button radio group. Pattern custom control behavior on standard control behavior and identify them as such.


Use the keyboard – Every interface element must be actionable via the keyboard. Take advantage of standard keyboard shortcuts to provide functionality. Provide application-specific keyboard shortcuts, but never override standard shortcuts.


Color is for enhancement only -- Don’t use color as a sole indicator of functionality, state or mode.


Identify images -- When using images and animation, be consistent, provide textual information, and avoid flashing.


Test, then test again -- Test to assure the application works with the assistive technology included in the Operating System as well as a third-party Screen Reader.

References


Below is a collection of references I've used in past projects.


Barnicle, Kitch. Usability Testing with Screen Reading Technology in a Windows Environment. Arlington, Virginia, USA. (CUU’00).


Henry, Shawn Lawton. (2007) Just Ask: Integrating Accessibility Throughout Design (lulu.com)


Kinash, Shelly. (2006) Seeing Beyond Blindness (IAP-Information Age Publishing)


Mueller, John Paul. (2003) Accessibility for Everybody: Understanding the Section 508 Accessibility Requirements. (apress).


Nielsen, J. Ten Usability Heuristics. http://www.useit.com/papers/heuristics/heuristic list.html.


Nielsen, Jakob. (1994) Heuristic Evaluation. In J. Nielsen & R.L. Mack (Eds.), Usability Inspection Methods (pp 25-62), New York: John Wiley & Sons Inc.


Paddison, Claire & Englefield, Paul. (2003). Applying Heuristics to Perform a Rigorous Accessibility Inspection in a Commercial Context. Vancouver, British Columbia, Canada. (CUU’03)


Seale, Jane K. (2006) E-Learning and Disability in Higher Education (Routledge)


Section 508. http://www.section508.gov.


Sloan, David & Gregor, Peter & Rowan, Murray & Booth, Paul. (2002). Accessible Accessibility. Arlington, Virginia, USA. (CUU’03).


Theofanos, Mary Frances & Redish, Janice. (2003). Guidelines for Accessible - and Usable - Web Sites: Observing Users Who Work with Screen Readers. Interactions, 10(6), 31-51.


Web Content Accessibility Guidelines http://www.w3c.org/WAI/


Winberg, Fredrik. (1999). Discount Accessibility Engineering: Haven’t we Met Before. INTERACT 99 Workshop: Making Designers Aware of Existing Guidelines for Accessibility.

Monday, October 11, 2010

Lonely Data

I once read a beautiful story by Sherman Alexie in his book "Ten Little Indians". The story was about a woman who checks out an old book from the library and finds that it has never been checked out before. She wonders how the book must feel: neglected on the shelf, passed over by so many potential readers. I, too, felt for the book and its author.

What about data that is collected and never used? Every survey that I have either taken or designed has had the "Is there anything else..." open-ended text question at the end. I know some of those open-ended responses are rarely, if ever, analyzed.

Sure, one can fill Excel, SAS, Access, and SPSS with rows of quantitative data, then slice and dice it. But open-ended, qualitative responses are different. If one collects 100 responses, it's a manageable task to sort through them. Even 1000 is manageable with a couple of people helping, although that presents its own challenges. But what about 10,000 responses? Do those responses just wait for some ambitious analyst to download and analyze them?

That is exactly what I found. Thousands of open-ended text responses collected every few weeks from customers. Thousands of potential solutions to the customer pain points just waiting, while the customers asked, "Why don't they listen to us?"

The survey showed, via questions using a Likert scale, that a product "scored" a 2 out of 5. "That is bad; we need to fix this," said the product managers. They put together a team to assess the situation, came up with recommendations, implemented them, and the product still scored a 2 out of 5. "What did we do wrong?" they asked.

They didn't listen to the users. Quantitative data will tell us there is a problem. Yes, 2 on a scale of 5 is a red flag. However, qualitative data, e.g., open-ended text responses, will tell us what the problem is and how to solve it.

Let's propose a hypothetical situation based on one I encountered: we have a store that specializes in women's clothing, specifically evening dresses. They sell the type of dress a woman may wear on a night out. The store manager realizes that a lot of her customers bring along their mates (spouses, significant others) for feedback on potential purchases. Eventually, these men are milling around the front of the store glassy-eyed, or getting in the way of the female customers. She has even seen some arguments break out, with the customer leaving the store empty-handed. This is not good.

She designs a survey and has the salespeople administer it. The survey contains only three questions, much like an NPS survey:

  • If male, ask "How likely would you be to recommend our store to your friends as a place where they should take their girlfriend or wife?"
  • If female, ask "How likely would you be to recommend our store to your friends?"
  • Ask both female and male, "Why did you give that answer?"
The results are not surprising. The men would not recommend the store, but the women would. "I know," the manager says, "I'll make it more comfortable for the men. I'll put in a TV and set it to the sports station. That should make them more comfortable." And it does. A few weeks later she repeats the survey, but now the ratings have flipped; the women would not recommend the store while the men would. And the manager has noticed more arguments between the women and the men. What went wrong?

She didn't read the comments. One would assume that with an NPS-style survey, the manager would read these comments. However, she didn't. Some of the comments she missed:

  • "There is not enough space around the changing room. I don't have anywhere to wait while my wife changes then comes out to show me the dress."
  • "Its like a fashion show, we are all moving around trying to get out of the way so we can see our girlfriends."
  • "I can't talk to my wife while she is putting on the dress. I saw a dress I'd like her to think about on another person, but I couldn't tell her to try it."
  • "Maybe like a fashion show would be cool. The ladies could come out and model the dresses and we could see. If I see one I like I could tell my wife to look at it."
  • "My girlfriend is kind of shy. She won't come out and show me if a lot of people are around. It would be nice if I could see her without everyone seeing her. Maybe with a video camera like on that one TV show."
Although this is a hypothetical situation, it is clear what recommendations I would make. Clearing the area around the changing room and adding chairs would be an inexpensive way to increase satisfaction. The ideas of a "fashion show" and a "private screening" are interesting - I would search the data for more responses like these and perhaps make a recommendation based on those ideas. Communication seems to be a theme. Perhaps there is a way to get messages to the women from their husbands and boyfriends.
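Even a crude first pass over open-ended text can surface themes like these. Here is a sketch in Python - the coding scheme below is my own invention for this hypothetical store, and real text-analytics tools do far more than substring matching:

```python
from collections import Counter

# Crude first pass at categorizing open-ended survey responses.
# The theme-to-keyword scheme is invented for this hypothetical store;
# real tools (e.g., SPSS Text Analytics) do far more than substring matching.

THEMES = {
    "space": ["space", "room", "crowded"],
    "communication": ["talk", "tell", "message"],
    "privacy": ["shy", "private", "everyone seeing"],
}

def tag_themes(comment):
    """Return the set of themes whose keywords appear in a comment."""
    text = comment.lower()
    return {theme for theme, words in THEMES.items()
            if any(word in text for word in words)}

def theme_counts(comments):
    """Count how many comments touch each theme."""
    counts = Counter()
    for comment in comments:
        counts.update(tag_themes(comment))
    return counts
```

Even this naive pass over the comments above would flag "space" and "communication" as themes that the Likert scores alone never hinted at.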

Open-ended text can be a difficult data type to analyze but it can be a useful source of information. There are many tools available to help with this analysis. I used SPSS Text Analytics for Surveys. In future posts I will share some insights I had from using the tool.