I usually don't announce apps in development until they are ready to ship, approved by Apple, and available in the App Store. But I recently demonstrated an early version of an app, on stage, at the close of the iOSDevCamp 2011 weekend Hackathon. So I might as well write about the idea behind an app still very much under construction.
Since adding support for Apple's VoiceOver Accessibility to a few of my apps, I've received enough customer feedback to realize that iPhones and iPads are more popular among blind (or otherwise vision-impaired) users than I might otherwise have suspected. By user request, I even developed an app specifically for this market segment, the HotPaw Talking Tuner.
Now, to segue into a bit of computer history: On December 9, 1968, Douglas Engelbart, in what has been called the "Mother of All Demos", demonstrated on stage several innovations including the computer mouse, hypertext linking, online collaboration, and some precursors to bitmapped graphical user interfaces. He also demonstrated something less well known: the chord keyboard, a one-handed keyboard used to enter text by pressing combinations of buttons. Chord keyboards haven't become popular because it takes time to learn the key combinations. But if a person needs to enter text with one hand, and takes the time to learn the chords, it is a fairly fast input method.
One current problem with iPad text entry is the lack of tactile feedback when tapping on keyboard icons and other GUI buttons. This makes it difficult to enter text "eyes-free", that is, without looking at the display. Apple's VoiceOver Accessibility technology provides one solution, by giving synthesized voice feedback about where the graphical keys are located. But it currently seems difficult to make VoiceOver work for chorded multitouch input positioning.
My prototype solution is to have an app dynamically place 5 keyboard buttons wherever a person places all five fingers on the display, then record actions, not by tapping on buttons, but by lifting fingers up off the multitouch display. These finger actions can be converted into a "chording" input method by using the combination of fingers one lifts to select a character for input. Combinations of lifting touches and swiping gestures can be used to expand the potential vocabulary. Since this method is actuated, not by touching, but by lifting fingers up, I am calling this input method:
UnTouchType
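To make the chording idea concrete, here is a rough Swift sketch of how the touch handling might work. It is my own after-the-fact illustration, not the prototype's actual code: the class name ChordPadView, the sample chord table, and the 0.15-second settling interval are all hypothetical.

    import UIKit

    // A sketch of lift-to-type chording: five touches anchor the "keyboard"
    // wherever the hand lands, and characters are selected by which fingers lift.
    final class ChordPadView: UIView {

        private var fingers: [UITouch] = []   // the five anchored touches, left to right
        private var liftedMask: UInt8 = 0     // bit i set = finger i has lifted
        private var commitTimer: Timer?

        // Hypothetical chord layout; bit 0 is the leftmost finger. A real layout
        // would need tuning for speed and learnability.
        private let chordTable: [UInt8: Character] = [
            0b00001: "e", 0b00010: "t", 0b00100: "a",
            0b01000: "o", 0b10000: "n", 0b00011: "s", 0b00110: "r",
        ]

        var onCharacter: ((Character) -> Void)?

        override init(frame: CGRect) {
            super.init(frame: frame)
            isMultipleTouchEnabled = true
        }

        required init?(coder: NSCoder) { fatalError("not used in this sketch") }

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            // The keyboard forms wherever the fingers land: once all five are
            // down, sort them by x so each finger keeps a stable index.
            fingers.append(contentsOf: touches)
            if fingers.count == 5 {
                fingers.sort { $0.location(in: self).x < $1.location(in: self).x }
                liftedMask = 0
            }
        }

        override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
            // Input is actuated by lifting, not touching: record which fingers
            // came up, then commit after a short settling interval so
            // near-simultaneous lifts count as a single chord.
            guard fingers.count == 5 else { return }
            for touch in touches {
                if let i = fingers.firstIndex(of: touch) {
                    liftedMask |= 1 << i
                }
            }
            commitTimer?.invalidate()
            commitTimer = Timer.scheduledTimer(withTimeInterval: 0.15, repeats: false) { [weak self] _ in
                self?.commitChord()
            }
        }

        private func commitChord() {
            if let character = chordTable[liftedMask] {
                onCharacter?(character)   // e.g. append to the text being entered
            }
            liftedMask = 0
            // A full version would let lifted fingers re-anchor on the next
            // touch-down; this sketch keeps the original five anchor points.
        }
    }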
In a day or so of coding at the Hackathon, I got an app working well enough to demonstrate to a few people, and they thought it was an interesting enough idea for me to demo. Unfortunately, it didn't work for me on stage at the conclusion of the iOSDevCamp Hackathon. (Playing with the app and accidentally turning on the mute switch just before my presentation slot didn't help. Live and learn.)
If the kernel of this idea is a new innovation, consider it to now be in the public domain. Some experimental work remains on how fast this input method is compared to other eyes-free input methods on touchpads, and which finger/chord/letter combinations might be faster and/or easier to learn. Stay tuned.
Monday, July 18, 2011
Dance Metronome/BPM
This month, HotPaw Productions introduced a new iOS app, Dance Metronome/BPM, now available from Apple's iTunes App Store. This app has both a tap BPM (beats per minute) meter and a very accurate metronome. The BPM meter is designed for ballroom dancers, as it indicates whether a particular tempo is within strict tempo range. All the International Standard, Latin, and American Style Smooth and Rhythm dance tempos are included. The metronome in the app can play beats, on time, to sub-millisecond accuracy. (Given an audio sample rate of 44100 Hz, there are 44.1 samples per millisecond, and the Dance Metronome/BPM app calculates the nearest exact sample at which to start playing each on-time beat.)
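For the curious, here is a minimal sketch of that sample-accurate scheduling idea; the names and the example tempo are mine, not the app's actual code. The point is to compute each beat's start from the beat index, rather than adding a rounded interval per beat, so rounding error never accumulates into audible drift:

    // Sample-accurate beat scheduling (an illustration, not the app's source).
    let sampleRate = 44_100.0                      // audio samples per second
    let bpm = 87.0                                 // hypothetical tempo
    let samplesPerBeat = sampleRate * 60.0 / bpm   // about 30413.79 samples at 87 BPM

    for beat in 0..<4 {
        // Round the exact (fractional) position to the nearest whole sample;
        // each beat then starts within half a sample (about 11 microseconds)
        // of its ideal time, with no cumulative drift.
        let startSample = Int((Double(beat) * samplesPerBeat).rounded())
        print("beat \(beat) starts at sample \(startSample)")
    }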