Accessibility in Second Life

Cloud created by:

Carolyn Hunt
1 May 2011

Working with learners in higher education who have disabilities, I am very conscious of the dual effect that new technologies have on people with disabilities. If they are created well they can enhance learning opportunities, but if not they can present frustrating barriers.

I have created this cloud to share some resources on the work being done to ensure that SL is accessible to people with disabilities, because we cannot use it in education unless it is accessible to all our learners.

Extra content

An estimated 50 million to nearly 200 million people use virtual worlds like Second Life (the range can be attributed to an overlap of users among sites and a differentiation between registered and active users). We don’t have hard numbers regarding how many users have disabilities, but statistics on video gaming offer insight. As many as 20 percent of the estimated 300 million to 400 million video gamers are disabled, a 2008 survey by PopCap revealed. Considering that roughly 15 percent of the U.S. population is disabled, people with disabilities are overrepresented in the gaming market. Those surveyed reported more significant benefits from playing video games than their nondisabled counterparts.

(Springer, 2009, Speech in a Virtual World - see references)

Programs have been designed specifically to integrate assistive technologies with SL so that people with disabilities can participate. Two of these are TextSL and Max, the Virtual Guide Dog.

TextSL, a free download, works with the JAWS screen reader from Freedom Scientific to give visually impaired users access to SL. TextSL supports commands for moving one's avatar, interacting with other avatars, and getting information about one's environment, such as the objects or avatars in the vicinity; it will also read out the text in the chat window. The program was created by Eelke Folmer, an assistant professor of computer science and engineering at the University of Nevada, Reno, and runs on Windows, Mac OS, and Linux.
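To give a feel for what a command-driven text interface of this kind involves, here is a minimal sketch in Python. The command names and world data are illustrative assumptions of mine, not TextSL's actual vocabulary; the point is simply that every action and every piece of world information passes through plain text that a screen reader such as JAWS can speak.

# Minimal sketch of a command-driven text interface in the spirit of TextSL.
# Commands and world data are illustrative only, not TextSL's real command set.

WORLD = {
    "objects": ["bench", "fountain"],
    "avatars": ["Ada", "Grace"],
}

def handle(command: str) -> str:
    """Turn a typed command into a plain-text reply a screen reader can speak."""
    verb, _, arg = command.strip().partition(" ")
    if verb == "look":
        return "You see: " + ", ".join(WORLD["objects"] + WORLD["avatars"])
    if verb == "move":
        return f"Walking {arg or 'forward'}."
    if verb == "say":
        return f"You say: {arg}"
    return f"Unknown command: {verb}"

if __name__ == "__main__":
    for cmd in ("look", "move north", "say hello everyone"):
        print(handle(cmd))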

Max, the Virtual Guide Dog, was created as a proof of concept to show that SL could be made accessible to people with all types of disabilities. Max attaches to one's avatar, and its radar helps the user move around and interact with objects. Max can tell a user what she can reach out and touch, printing the information in the chat window. Max can also help a user find a person or place and transport the user to a desired location. If a device or object has a .WAV file associated with it, Max can play the audio file as well.

(Springer, 2009, Speech in a Virtual World II - see references)
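The underlying idea of a scripted guide like Max - periodically scanning for nearby objects and reporting them as chat text, with audio where a .WAV is attached - can be sketched roughly as follows. This is a conceptual illustration only; the object names, the 10-metre reach and the function names are my own assumptions, not Max's actual script.

# Rough sketch of a "virtual guide dog" radar: report what the avatar can reach,
# as chat-window text, and note any audio file attached to an object.
# All names, distances and data here are illustrative assumptions.
import math
from dataclasses import dataclass
from typing import Optional

@dataclass
class WorldObject:
    name: str
    x: float
    y: float
    sound: Optional[str] = None  # optional .WAV associated with the object

def report_reachable(avatar_xy, objects, reach=10.0):
    """Print, as chat text, each object within reach of the avatar."""
    for obj in objects:
        if math.dist(avatar_xy, (obj.x, obj.y)) <= reach:
            line = f"You can touch: {obj.name}"
            if obj.sound:
                line += f" (audio available: {obj.sound})"
            print(line)

report_reachable((0.0, 0.0), [
    WorldObject("wooden door", 3.0, 4.0, "door_creak.wav"),
    WorldObject("statue", 50.0, 50.0),  # too far away to be reported
])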

Carolyn Hunt
15:24 on 1 May 2011

Restricted Upper Limb Mobility

Those with limited or no upper limb functionality can use one of the Dragon products, such as Dragon NaturallySpeaking, to move their avatar, interact with objects and create new objects.

A good guide can be found on the Second Life Wiki - see links, or click Dragon.
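Voice control of this kind generally works by mapping a recognised spoken phrase onto the keystrokes the SL viewer already understands (for example, W to walk forward). The sketch below illustrates that mapping idea only; the phrase list, key bindings and the send_keys placeholder are hypothetical assumptions, not part of Dragon or the wiki guide.

# Sketch of the phrase-to-keystroke idea behind voice-driven avatar control.
# Phrases, key bindings and the send_keys placeholder are illustrative assumptions.

PHRASE_TO_KEYS = {
    "walk forward": "W",
    "turn left": "A",
    "turn right": "D",
    "start flying": "Home",
}

def send_keys(keys: str) -> None:
    # Placeholder: a real setup would inject the keystroke into the SL viewer,
    # e.g. via an OS-level automation or macro tool.
    print(f"[keystroke sent] {keys}")

def on_phrase_recognised(phrase: str) -> None:
    """Called with the text of a recognised phrase; fires the mapped keystroke."""
    keys = PHRASE_TO_KEYS.get(phrase.lower().strip())
    if keys:
        send_keys(keys)
    else:
        print(f"No mapping defined for: {phrase!r}")

on_phrase_recognised("Walk forward")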

Carolyn Hunt
15:32 on 1 May 2011 (Edited 15:33 on 1 May 2011)
