Monday, October 18, 2010

Top 10 Things CIOs Need to Know About Accessibility

I found this list in a group discussion on LinkedIn.com: Top 10 Things CIOs Need to Know About Accessibility

I particularly like numbers 2 and 3.

"Designing in accessibility is much more cost effective than trying to “fix” deployed resources, so accessibility must be a part of the planning process for new and updating existing IT resources and services."

I have been called on by development teams to "check the accessibility" of a product, feature, or service after it has been built, after it has been incorporated into a product, after the ink has dried on the contract. The development team expected a completed checklist full of check-marks in the "passed" column and nothing in the "failed" column. But checking items off a list never ensures anything. (Winberg) What becomes of the features that did, in fact, fail? I have had to be the one to request that a beautifully designed and engineered feature be removed from a product because it just couldn't be made accessible.

"Accessibility must be included in purchasing requirements and RFPs. Accessibility testing must be integrated into the evaluation of products for purchase. Products will only become more accessible when vendors are held accountable to accessibility standards." (Emphasis mine)

I heard an interesting story during a tutorial at a conference. A university was considering a service for the purchase of tickets to the university's sporting events. When the service was tested for accessibility, it failed, so the webmaster rejected the proposal to use the service until it was made accessible. The webmaster took a lot of heat for that decision, but, because accessibility was then added globally, all users of that service benefited.

Enjoy the list.


Reference:
Winberg, Fredrik. (1999). Discount Accessibility Engineering: Haven’t we Met Before. INTERACT 99 Workshop: Making Designers Aware of Existing Guidelines for Accessibility.

Wednesday, October 13, 2010

Accessibility Heuristics


Introduction


Note: this post was created from research and work done while I was a:


  • User Experience Specialist at DeVry University

  • Student at DePaul University

  • User Interface Designer at SPSS Inc., an IBM company


“The situation today with the existing guidelines for making new technology and information accessible (for example World Wide Web Consortium) reminds much about the state of HCI or Usability Engineering in the late 1980s”. (Fredrik Winberg)


Current accessibility guidelines are difficult for user experience designers and engineers to use. They need a simple, plain-English set of guidelines they can apply during the development process to ensure the applications they are developing are accessible.

I first became exposed to accessibility while at SPSS Inc. My team lead there was the accessibility expert. When he left the company I started getting calls from developers and other designers with questions about accessibility. I needed to learn fast; there was a product that was about 60% complete, and they needed an accessibility expert to look at it. I learned, and a few months later we had a fully accessible, drag-and-drop data mining product. And it was written in Java. (The engineers on the product were amazing. They learned JAWS, explored the accessibility features in Java, and then programmed the application.)


As a graduate student at DePaul University, I expanded on the knowledge I gained at SPSS. Accessibility was my topic of choice in independent study courses. In my capstone course I wrote a set of accessibility heuristics that I will publish and expand on here in this post and in future posts.



The Heuristics


Honor display settings – A user sets the display properties for a reason. Make sure your application knows what they are and displays all its interface elements with those properties.


Provide a tab order -- Provide a well-thought-out tab order and include every actionable interface element. Do not include un-actionable items, for example disabled controls. For users who use the mouse, the tab order is not as important as it is to the user who navigates via the keyboard. When there is a potential conflict, choose the tab order that makes the most sense to the keyboard user.


Indicate focus and selection -- Every interface element must indicate that it has focus, and, if applicable, which items are selected within that element. This information needs to be indicated onscreen and passed textually to the Operating System. Additionally, only the user has the right to change the current focus.


Provide textual information -- Provide text information for every interface element, even if that information doesn’t appear in the display. A Screen Reader cannot describe an interface element that it knows nothing about.


Follow the standard -- Use standard interface elements as they are meant to be used. There is no such thing as a single-button radio group. Pattern custom control behavior on standard control behavior and identify custom controls as such.


Use the keyboard -- Every interface element must be actionable via the keyboard. Take advantage of standard keyboard shortcuts to provide functionality. Provide application-specific keyboard shortcuts, but never override standard shortcuts.


Color is for enhancement only -- Don’t use color as a sole indicator of functionality, state or mode.


Identify images -- When using images and animation, be consistent, provide textual information, and avoid flashing.


Test, then test again -- Test to ensure the application works with the assistive technology included in the Operating System as well as with a third-party Screen Reader.

References


Below is a collection of references I've used in past projects.


Barnicle, Kitch. Usability Testing with Screen Reading Technology in a Windows Environment. Arlington, Virginia, USA. (CUU’00).


Henry, Shawn Lawton. (2007) Just Ask: Integrating Accessibility Throughout Design (lulu.com)


Kinash, Shelly. (2006) Seeing Beyond Blindness (IAP-Information Age Publishing)


Mueller, John Paul. (2003) Accessibility for Everybody: Understanding the Section 508 Accessibility Requirements. (apress).


Nielsen, J. Ten Usability Heuristics. http://www.useit.com/papers/heuristics/heuristic_list.html.


Nielsen, Jakob. (1994) Heuristic Evaluation. In J. Nielsen & R.L. Mack (Eds.), Usability Inspection Methods (pp 25-62), New York: John Wiley & Sons Inc.


Paddison, Claire & Englefield, Paul. (2003). Applying Heuristics to Perform a Rigorous Accessibility Inspection in a Commercial Context. Vancouver, British Columbia, Canada. (CUU’03)


Seale, Jane K. (2006) E-Learning and Disability in Higher Education (Routledge)


Section 508. http://www.section508.gov.


Sloan, David & Gregor, Peter & Rowan, Murray & Booth, Paul. (2002). Accessible Accessibility. Arlington, Virginia, USA. (CUU’03).


Theofanos, Mary Frances & Redish, Janice. (2003). Guidelines for Accessible – and Usable – Web Sites: Observing Users Who Work with Screen readers. Interactions. 10, (6), 31-51.


Web Content Accessibility Guidelines. http://www.w3.org/WAI/


Winberg, Fredrik. (1999). Discount Accessibility Engineering: Haven’t we Met Before. INTERACT 99 Workshop: Making Designers Aware of Existing Guidelines for Accessibility.

Monday, October 11, 2010

Lonely Data

I once read a beautiful story by Sherman Alexie in his book "Ten Little Indians". The story was about a woman who checks out an old book from the library and finds that it has never been checked out before. She wonders how the book must feel: neglected on the shelf, passed over by so many potential readers. I, too, felt for the book and its author.

What about data that is collected and never used? Every survey that I have either taken or designed has had the "Is there anything else..." open-ended text question at the end. I know some of those open-ended responses are rarely, if ever, analyzed.

Sure, one can fill Excel, SAS, Access, and SPSS with rows of quantitative data and then slice and dice it. But open-ended, qualitative responses are different. If one collects 100 responses, it's a manageable task to sort through them. Even 1,000 is manageable with a couple of people helping, although that presents its own challenges. But what about 10,000 responses? Do those responses just wait for some ambitious analyst to download and analyze them?

That is exactly what I found. Thousands of open-ended text responses collected every few weeks from customers. Thousands of potential solutions to the customer pain points just waiting, while the customers asked, "Why don't they listen to us?"

The survey showed, via questions using a Likert scale, that a product "scored" a 2 out of 5. "That is bad; we need to fix this," said the product managers. They put together a team to assess the situation, came up with recommendations, implemented them, and the product still scored a 2 out of 5. "What did we do wrong?" they asked.

They didn't listen to the users. Quantitative data will tell us there is a problem; yes, a 2 on a scale of 5 is a red flag. However, qualitative data, e.g., open-ended text responses, will tell us what the problem is and how to solve it.

Let's propose a hypothetical situation based on a situation I encountered. We have a store that specializes in women's clothing, specifically evening dresses; they sell the type of dress a woman might wear on a night out. The store manager realizes that many of her customers bring along their mates (spouses, significant others) for feedback on potential purchases. Eventually, these men are milling around the front of the store glassy-eyed, or getting in the way of the female customers. She has even seen arguments break out, with the customer leaving the store empty-handed. This is not good.

She designs a survey and has the salespeople administer it. The survey contains only three questions, much like an NPS survey. The survey is as follows:

  • If male, ask "How likely would you be to recommend our store to your friends as a place where they should take their girlfriend or wife?"
  • If female, ask "How likely would you be to recommend our store to your friends?"
  • Ask both female and male, "Why did you give that answer?"

The results are not surprising. The men would not recommend the store, but the women would. "I know," the manager says, "I'll make it more comfortable for the men. I'll put in a TV and set it to the sports station. That should make them more comfortable." And it does. A few weeks later she repeats the survey, but now the ratings have flipped; the women would not recommend the store while the men would. And the manager has noticed more arguments between the women and the men. What went wrong?
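Ratings like these are typically scored the way an NPS survey is scored: on a 0 to 10 scale, respondents answering 9 or 10 count as promoters, 0 through 6 as detractors, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch, with invented response values:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical responses, split by the two versions of the question.
men = [3, 5, 2, 6, 4, 7, 2]      # mostly detractors
women = [9, 10, 8, 9, 10, 7, 9]  # mostly promoters

print(nps(men))    # -86: the men would not recommend the store
print(nps(women))  # 71: the women would
```

The score captures how likely each group is to recommend the store, but nothing in it says why; that is what the third, open-ended question is for.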

She didn't read the comments. One would assume that with an NPS-style survey, the manager would read these comments. However, she didn't. Some of the comments she missed:

  • "There is not enough space around the changing room. I don't have anywhere to wait while my wife changes then comes out to show me the dress."
  • "It's like a fashion show, we are all moving around trying to get out of the way so we can see our girlfriends."
  • "I can't talk to my wife while she is putting on the dress. I saw a dress I'd like her to think about on another person, but I couldn't tell her to try it."
  • "Maybe like a fashion show would be cool. The ladies could come out and model the dresses and we could see. If I see one I like I could tell my wife to look at it."
  • "My girlfriend is kind of shy. She won't come out and show me if a lot of people are around. It would be nice if I could see her without everyone seeing her. Maybe with a video camera like on that one TV show."

Although this is a hypothetical situation, it is clear what recommendations I would make. Clearing the area around the changing room and adding chairs would be an inexpensive way to increase satisfaction. The ideas of a "fashion show" and a "private screening" are interesting; I would assess the data for more responses like these and perhaps make a recommendation based on those ideas. Communication seems to be a theme. Perhaps there is a way to get messages to the women from their husbands or boyfriends.
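Assessing the data for recurring ideas starts with sorting each comment into categories. A rough, keyword-based sketch of that idea (the categories and keywords here are invented for illustration; a real tool such as SPSS Text Analytics for Surveys builds its categories from the data rather than from a hand-written list):

```python
# Invented categories and keywords for the hypothetical dress-store comments.
CATEGORIES = {
    "changing-room space": ["changing room", "space", "wait"],
    "fashion show idea": ["fashion show", "model"],
    "communication": ["talk", "tell her", "messages"],
}

def categorize(comment):
    """Return every category whose keywords appear in the comment."""
    text = comment.lower()
    return [cat for cat, keywords in CATEGORIES.items()
            if any(k in text for k in keywords)]

comments = [
    "There is not enough space around the changing room.",
    "Maybe like a fashion show would be cool.",
    "I can't talk to my wife while she is putting on the dress.",
]
for c in comments:
    print(categorize(c))
```

Counting how many comments land in each category then shows which themes, like communication, come up often enough to act on.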

Open-ended text can be a difficult data type to analyze, but it can be a useful source of information. There are many tools available to help with this analysis. I used SPSS Text Analytics for Surveys. In future posts I will share some insights I had from using the tool.

Thursday, October 7, 2010

My Resume

Professional Experience


DeVry University Online 2007 – 2010


User Experience Specialist



  • Performed user testing on various online courses and online course materials including a live electronics workshop lab in which students could view lab procedures and receive assistance with lab issues. Testing was performed in-house and remotely using online meeting technologies.

  • Audited online courses and course materials for accessibility to students with disabilities. Made recommendations to course development group and student advising group.

  • Created and maintained surveys for DeVry Academics department.

  • Categorized open-ended comments from student surveys with text analytic software. Created a report of categorized comments for use by the Academics group that allowed them to easily scan comments for student issues.

  • Created content for faculty training seminar to educate faculty on the accommodations for students with disabilities.

  • Presented training sessions to Instructional Designers, Program Deans, and DeVry personnel on how students with disabilities use DeVry Online Courses.


American Mediconnect, Chicago, IL 2005 – 2007


Project Lead



  • Led a project in which text-based call center software was replaced with an interactive scripted interface.

  • Successfully converted approximately 200 customer accounts to the interactive system.

  • Communicated with software vendor regarding bugs in interactive system.


DePaul University, Chicago IL 2004 – 2005


User Experience



  • Performed content editing and page layout on the CTI’s Student Technology Guide.
  • Performed approximately 60 Contextual Interviews with the students, faculty, and staff of the School of Computer Science, Telecommunications, and Information Systems (CTI) for a re-design of the School’s web sites.
  • Performed a card-sorting activity with CTI students and compiled results into a taxonomy for the new web site.

SPSS Inc., Chicago IL 1998 – 2003


User Interface Designer



  • Wrote design specifications for many SPSS products including SPSS versions 10 – 12 and SPSS’s charting package DeltaGraph. The design specs were the last step in a process which included competitive analysis, customer feedback, prototyping, and product team feedback.

  • Participated in various usability tests for SPSS products. My participation in those tests included: recruiting participants, preparing testing materials, managing equipment resources, facilitating tests, and compiling results.

  • Co-authored an SPSS company-wide standard for assuring accessibility in new and existing desktop products. The standard was based on a set of accessibility heuristics I developed. As a student at DePaul, I expanded this set of heuristics.

  • Produced 22 computer based training films for the SPSS product “Surveys With Confidence” using Lotus ScreenCam.

  • Created new dialogs for Mixed Models procedure in SPSS 11.0. These new dialogs allowed, for the first time, users to perform highly requested “nesting” operations from a GUI.

  • Designed and prototyped a data-enhancement wizard, in cooperation with Acxiom Corporation, which would allow SPSS users access to valuable demographic data for use in statistical calculations.

  • Created example graphics for the “Restructure Data” Wizard in SPSS 11.5, which allows users to easily select the restructure scenario they need. See: http://distdell4.ad.stat.tamu.edu/spss_1/restructure.html “Data Manipulation -- Restructure Data” for an example.

  • Designed approximately 50 – 100 application icons for use in various SPSS applications.


Tribune Media Services, Chicago IL 1991 – 1997


Accounting Assistant



  • Implemented a spreadsheet front-end onto an existing database to automate royalty payments to Tribune Media Services Cartoonists and Columnists. This decreased the time to prepare checks from three days to one.

  • Redesigned existing forms to match system input flow eliminating approximately 50% of data input errors, and subsequent corrections.

  • Assisted in gathering requirements for accounting system upgrades.


Education



  • MS—Human Computer Interaction—DePaul University Chicago, IL—2006

  • BS with Honors—Human Computer Interaction—DePaul University Chicago, IL—1999.

  • Fine Arts—University of Illinois at Chicago—1979-1983.


Awards and Public Speaking



  • Recipient of Ron Taylor award for performance July 2010

  • Tutorial accepted for 2003 Usability Professionals’ Association conference: “The DADA Movement: Designing Accessible Desktop Applications”