Advances in human-computer interaction technologies have progressively narrowed the gulf between user and computing device, making it easier to accomplish tasks by means that feel increasingly natural. Early in the history of computing, for example, communicating with a computer was cumbersome, consisting of plugging wires into the patch panels of circuits. The arrival of programming languages and the ability to communicate with computers through symbolic commands empowered users to think at a more natural level. Further progress came with graphical user interfaces, where users can point and click on labeled screen objects such as menu items and buttons instead of taxing their memory to recall the exact syntax of various commands.
But even today's WIMP-based GUIs (WIMP stands for Windows, Icons, Menus, and Pointers) may be sub-optimal for certain tasks. If you've watched the movie Minority Report or CNN's election reporting, where screenfuls of information slide in and out of view with a flick of a finger, images are enlarged or reduced with the simultaneous movements of two fingers, or multiple people change the contents of different parts of a screen at the same time using only gestures, you'll know what kind of tasks I'm talking about. With more and more devices capable of recognizing touch and gestures, the next evolutionary step in software application development will surely involve the seamless integration of these new input modalities into the design of natural user interfaces (NUIs). This book will help you understand what you need to know to get started with such an endeavor.
The book begins with a discussion of what the authors mean by a natural user interface, qualities to look for in such interfaces, and computing niches where touch- and gesture-responsive NUIs will have an important role to play.
The authors emphasize repeatedly that when integrating new modalities into user interfaces, one should resist the temptation to simply copy old paradigms. For example, some early graphical menuing systems mimicked the way command-centric applications worked, requiring users to first select an operation before selecting the object to be acted upon; they failed miserably because that interaction style was not well suited to a GUI. Equating touches with mouse clicks would similarly fail, because there are important differences between the two kinds of input.
The authors amply discuss the similarities and differences among touch, gesture, mouse click, and mouse movement. They also offer plenty of guidelines on how to handle touch and gesture inputs; how to provide effective feedback so users know whether their inputs are being received and interpreted correctly by the system, and if not, potentially why; and how to compose interaction patterns that are easy to learn. The book concludes with suggestions on how to test the learnability of those proposed interaction patterns.
Recommended readings are provided at the end of each chapter.
Overall, I found the book well written, if a bit dry. The information it provides is practical and valuable.