ngorgi edited this page Feb 29, 2012 · 6 revisions

LockPick is an Android game created for the 2012 SS12 competition. It uses the phone’s sensors along with haptic feedback to simulate the act of picking a lock (or at least something close to it). Since the main game component doesn’t rely on visual feedback, it can be played by normal, blind, and deaf-blind users. Because it can be a fun game for disabled players, it was built from the ground up with accessibility in mind. You can find it on the Android Market by searching for SS12 Lockpick or by going to https://market.android.com/details?id=com.Norvan.LockPick.

Accessibility Features

3 different modes with fundamental differences -- The first thing that the user does when they start the game for the first time is to select the “user type”, which can be normal, blind, or deaf-blind. This selection determines how almost every aspect of the app behaves afterwards.

The first difference is the layout of the user interface. For normal users, the game looks like any other app, with plenty of margins and other design touches to make it visually appealing. However, since that only leads to confusion when you can’t actually see the screen, a different layout scheme is used for blind and deaf-blind mode. The main menu screen is divided into 4 quadrants, each representing a different navigation button. The game screens are set up the same way, but when their buttons are clicked they instead read out different stats about the game in progress. The game also takes full advantage of the volume buttons, which (as of Android 4.0) are the only hardware buttons that applications can use. It is much easier to press a real, tactile button than to find a virtual one on a screen you can't see and hope you didn't just press something else.
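As a rough illustration, the quadrant lookup for blind and deaf-blind mode can be sketched as a pure function. The class and method names here are made up for illustration; the app's actual layout code is not shown on this page.

```java
// Hypothetical sketch of mapping a touch position to one of the four
// menu quadrants used in blind and deaf-blind mode.
public class QuadrantMapper {
    // Returns 0 = top-left, 1 = top-right, 2 = bottom-left, 3 = bottom-right.
    public static int quadrantFor(float x, float y, int screenWidth, int screenHeight) {
        int column = (x < screenWidth / 2f) ? 0 : 1;
        int row = (y < screenHeight / 2f) ? 0 : 1;
        return row * 2 + column;
    }

    public static void main(String[] args) {
        System.out.println(quadrantFor(100f, 100f, 480, 800)); // top-left -> 0
        System.out.println(quadrantFor(400f, 700f, 480, 800)); // bottom-right -> 3
    }
}
```

Because each quadrant is a quarter of the whole screen, a user who cannot see can still hit the right button reliably by feel alone.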

The second difference is how the application communicates instructions and feedback to the user. All these events are first sent to the AnnouncementHandler object, which checks the user type and performs the appropriate actions. If the user is blind, it sends the data to the TTSHandler object, which passes it to Android’s Text-to-Speech engine to be spoken out loud. If the user is deaf-blind, it transforms the data into a Morse code pattern using MorseCodeConverter and then sends it to the VibrationHandler object to pulse out through the phone’s vibrator.
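The routing logic can be sketched roughly as follows. The enum and method names are hypothetical stand-ins, and the real AnnouncementHandler also composes the actual spoken and vibrated messages, which is omitted here.

```java
// A minimal sketch of AnnouncementHandler-style dispatch, assuming a simple
// enum for the user type. Channel names are illustrative only.
public class AnnouncementSketch {
    enum UserType { NORMAL, BLIND, DEAF_BLIND }

    // Picks the feedback channel for one game event based on the user type.
    static String channelFor(UserType type) {
        switch (type) {
            case BLIND:      return "tts";        // spoken aloud via TTSHandler
            case DEAF_BLIND: return "vibration";  // Morse code via VibrationHandler
            default:         return "screen";     // normal visual feedback
        }
    }

    public static void main(String[] args) {
        System.out.println(channelFor(UserType.BLIND));       // tts
        System.out.println(channelFor(UserType.DEAF_BLIND));  // vibration
    }
}
```

Centralizing this decision in one object means the Activities never need to know which user type is active; they just report events.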

Here is a brief overview of the accessibility features incorporated into the app:

  • Blind support through the use of the TalkBack screen reader (with more support added in places where the screen reader falls short).

  • Deaf-Blind support through Morse code vibration output. This is not the most elegant solution, but it’s the only option on Android since there is no built-in support for braille devices.

  • Targeted for Android 4.0 (Ice Cream Sandwich) - While older versions of Android had accessibility features, they were the bare minimum. With 4.0, the developers at Google made HUGE improvements to Android’s accessibility features and this app takes full advantage of them.

  • Easy to follow tutorial - One of the biggest early issues with the game was that nobody could figure out how to play it. It now has a tutorial which actively guides the user through the gameplay mechanics, so users won’t be turned away just because they couldn’t get through the big wall of text that served as the old instructions.

  • The game itself - Touch screens are meant for eyes, plain and simple. They are small objects that are designed to present visual information and to take in precision input. Instead of being a game that you play ON your phone, LockPick is a game that you play WITH your phone. It turns your body into the controller, using the phone’s sensors to take in input and the vibrator and speaker to provide feedback.
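To give an idea of the Morse code approach, here is a minimal sketch in the spirit of MorseCodeConverter and VibrationHandler. The timing constants and the two-letter table are assumptions, not the app's actual values; Android's real `Vibrator.vibrate(long[] pattern, int repeat)` expects an alternating off/on duration array like the one built below.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative Morse conversion: text -> dots and dashes -> vibration timings.
public class MorseSketch {
    static final Map<Character, String> TABLE = new HashMap<>();
    static {
        TABLE.put('s', "...");
        TABLE.put('o', "---");
        // ...full alphabet omitted for brevity
    }

    // Assumed unit lengths in milliseconds; the app's real values may differ.
    static final long DOT_MS = 100, DASH_MS = 300, GAP_MS = 100;

    // Encodes text as dots/dashes, letters separated by spaces.
    static String encode(String text) {
        StringBuilder out = new StringBuilder();
        for (char c : text.toLowerCase().toCharArray()) {
            String code = TABLE.get(c);
            if (code != null) {
                if (out.length() > 0) out.append(' ');
                out.append(code);
            }
        }
        return out.toString();
    }

    // Builds the {off, on, off, on, ...} array that Android's
    // Vibrator.vibrate(long[], int) consumes.
    static long[] toVibrationPattern(String morse) {
        int pulses = morse.replace(" ", "").length();
        long[] pattern = new long[pulses * 2];
        int i = 0;
        for (char c : morse.toCharArray()) {
            if (c == ' ') continue;
            pattern[i * 2] = GAP_MS;                            // pause before pulse
            pattern[i * 2 + 1] = (c == '.') ? DOT_MS : DASH_MS; // pulse length
            i++;
        }
        return pattern;
    }

    public static void main(String[] args) {
        System.out.println(encode("sos")); // ... --- ...
    }
}
```

A deaf-blind user holding the phone feels the resulting pulse train in place of the spoken announcement.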

Technologies Used

  • Android 4.0.3 - LockPick targets the latest version of Android and takes full advantage of all the new accessibility features provided by it.

  • IntelliJ IDEA - The IDE that it was developed on. IntelliJ IDEA is light years ahead of the standard IDE for Android development, Eclipse. In fact, if I had been using Eclipse, I doubt I would have been able to finish the project on time.

  • Git, Github, and Dropbox - A combination of the 3 was used for version control. The project files were stored in a local Git repository which was placed in a shared Dropbox folder. This meant that I had some of the best version control features along with portability between computers thanks to Dropbox. To top it all off, the files were shared with the world using GitHub.

  • Google Analytics - Google Analytics was used in the early builds of the game to gather info about different aspects of user interaction within the game. Apparently, over 70% of players couldn’t even get past the first level, so I toned down the difficulty and added the improved tutorial mode.

Application Diagram

Component Diagram

Components

  • MainActivity - Where it all begins. It is the entry point to the app and serves as the main menu from which the other Activities are accessed.
  • FirstRunActivity - This is where the user type is selected (normal, blind, deaf-blind).
  • SurvivalGameActivity, TimeTrialGameActivity, and TutorialActivity - These are the user interfaces for the three game modes (the tutorial is technically a game, just a very short and simple one). They interact with the game handlers and return feedback to the user.
  • SurvivalGameHandler, TimeTrialGameHandler, and TutorialHandler - These take in UI input from the Activities and sensor input from SensorHandler, process them, and return the appropriate output. They are essentially state machines and their inputs determine what state they are in.
  • LevelHandler - Creates and stores the game's "playing field". Levels are algorithmically generated and the difficulty level is scaled based on what level the user is on. It takes in tilt readings from the game handlers and returns what action they should take.
  • AnnouncementHandler - Takes in an event and provides the appropriate feedback based on the user type. Take for example the event of winning a level. If the user type is normal, it will congratulate them. If the user type is blind, it will congratulate them and also read out some stats (since they can't read them on the screen). If the user type is deaf-blind, it will output all of this as Morse code vibrations.
  • SensorHandler - Abstracts out all the functions related to communicating with the phone's sensors into a small package.
  • VibrationHandler - Abstracts out all the special vibration functions that I had to design, such as variable vibration intensity and a VibrationCompletedInterface.
  • ScoreHandler - Nothing fancy here, deals with matters relating to the player's score and high score.
  • TimingHandler - Provides more advanced timing features for Time Trial mode.
  • SharedPreferencesHandler - Used to store and retrieve data from the phone's storage.
  • TTSHandler - Used to simplify interaction with Android's Text-to-Speech service.
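To make the LevelHandler idea concrete, here is a speculative sketch of how a level's hidden target angle and feedback decision might look. The difficulty formula, angle range, and thresholds are invented for illustration; only the overall shape (algorithmic generation, difficulty scaling with level, tilt reading in, action out) follows the description above.

```java
import java.util.Random;

// Speculative LevelHandler-style sketch: each level hides a target tilt
// angle, and the handler maps the current reading to a feedback action.
public class LevelSketch {
    enum Action { UNLOCKED, CLOSE, FAR }

    final float targetAngle; // hidden "pin" position for this level
    final float tolerance;   // how close the tilt must be to unlock

    LevelSketch(int level, Random rng) {
        targetAngle = rng.nextFloat() * 180f - 90f;  // somewhere in [-90, 90)
        tolerance = Math.max(2f, 10f - level);       // higher levels: tighter window
    }

    // Maps the current tilt reading to the feedback the game handler should give.
    Action actionFor(float tiltAngle) {
        float distance = Math.abs(tiltAngle - targetAngle);
        if (distance <= tolerance) return Action.UNLOCKED;
        if (distance <= tolerance * 3) return Action.CLOSE; // e.g. faster pulses
        return Action.FAR;
    }
}
```

The game handlers would poll this on every sensor update, turning CLOSE/FAR into hotter/colder vibration or audio cues until the player lands on UNLOCKED.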

You can also refer to the Javadoc comments in the source code itself. Any methods in the game components that are not self-descriptive have proper explanations, allowing anyone to use them in their own applications (since I had to write a lot of things that Android doesn't include). The Activities, however, are not as thoroughly documented since they are mostly user interface boilerplate.

If you have any further questions, feel free to email me at nourvan @at@ gmail .dot. com.