Anna Ostberg

User Experience Researcher

Force Input on Touchscreens
A series of experiments to understand the basics of force input on touchscreens and to determine which force-sensing features are most important to user experience and usability.
Force input on touchscreens was growing rapidly in the market around 2015, and Synaptics had created a type of force sensor that could be manufactured to meet the hardware requirements and tight cost constraints of smartphone components. To understand how force sensing might realistically be used, I ran several experiments on the basics of force input, including force perception, the effect of hand grip, force levels, and force modulation. I also developed several other interaction concepts for force input and implemented them for demos and usability concept testing.
To support the force input experiments, I created several Android applications that displayed the testing task and collected data. The tasks were quite simple, consisting of basic taps and presses on the force sensor. The early studies focused on force data collection and did not include ratings or qualitative data. The final study evaluated a force transfer function (a mathematical function that translates raw force data into output that gives the user better control and feedback), built from the results of the previous studies. In this study, participants rated and commented on the new force interactions compared to the baseline interactions.
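To illustrate what a force transfer function looks like in code, here is a minimal sketch. The actual function and constants from the study are not published, so the class name, thresholds, and curve shape below are all invented for the example; a power curve with an exponent below 1 is one common way to give users finer control at light forces.

```java
// Hypothetical sketch of a force transfer function. The noise floor,
// saturation point, and exponent here are illustrative only, not the
// values from the study.
public class ForceTransfer {
    static final double MIN_RAW = 50.0;    // noise floor: readings below map to 0
    static final double MAX_RAW = 1000.0;  // saturation: readings above map to 1
    static final double GAMMA = 0.5;       // exponent < 1 expands the low-force range

    // Map a raw sensor reading (arbitrary units) to a normalized level in [0, 1].
    static double transfer(double raw) {
        double clamped = Math.max(MIN_RAW, Math.min(MAX_RAW, raw));
        double normalized = (clamped - MIN_RAW) / (MAX_RAW - MIN_RAW);
        // The power curve gives finer resolution at light forces, where people
        // are most precise, and compresses the hard-to-control high end.
        return Math.pow(normalized, GAMMA);
    }

    public static void main(String[] args) {
        System.out.println(transfer(50.0));    // 0.0 (at the noise floor)
        System.out.println(transfer(1000.0));  // 1.0 (at saturation)
    }
}
```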
The results of my research directly informed our engineering team about which force features actually mattered for user interaction and which aspects of the implementation needed improvement. For example, the finding that force interaction varied widely with how the person held the device surprised much of the engineering team and was an important consideration for the overall functioning of the sensor. Even more significantly, the results supported our sales team in discussions with our OEM customers. Customers were demanding very high granularity of force reporting across a large range of forces, which was not feasible with the type of sensor we were working on (it could only be achieved with much costlier sensors). My results showed that people could not comfortably control high forces, could not differentiate forces unless the differences were large, and could not make practical use of a large number of force levels. Together, these results showed that there was no user experience need for high-resolution force sensing across a large force range. The studies were so impactful within the organization that colleagues contacted me several years later to learn more or refresh their understanding of the results.
This screenshot shows the task and data collection application. Participants applied force to the "force button", and the red bar showed the real-time force. When the force fell within the target range (black bar), the background turned green for additional feedback.
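The feedback logic described here can be sketched in a few lines. This is not the app's actual code; the class and method names are hypothetical, and the force values are examples.

```java
// Illustrative sketch of the task app's target-range feedback: the red bar
// tracked the live force, and the background turned green when the force
// fell inside the target band.
public class TargetFeedback {
    // True when the current force is inside the target range (green background).
    static boolean inTarget(double force, double targetLow, double targetHigh) {
        return force >= targetLow && force <= targetHigh;
    }

    // Fraction of the bar to fill for the real-time force display, in [0, 1].
    static double barFraction(double force, double maxForce) {
        return Math.max(0.0, Math.min(1.0, force / maxForce));
    }

    public static void main(String[] args) {
        System.out.println(inTarget(2.5, 2.0, 3.0)); // true  -> green background
        System.out.println(inTarget(1.2, 2.0, 3.0)); // false -> neutral background
    }
}
```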
This graph shows applied forces for different interactions and device grips. An important finding was that people used much higher forces during one-handed grip with thumb input (green).
This chart shows an example of a force modulation and dwelling trial, with the orange areas showing dwell within a target boundary, and the blue region showing the modulation period between the two targets.
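Dwell detection of the kind shown in this chart can be sketched as a small state machine: the timer starts when the force enters the target boundary, resets if it leaves, and the dwell completes after a required duration. The duration and force values below are invented for illustration, not the study's parameters.

```java
// Hypothetical sketch of dwell detection for a force target boundary.
public class DwellDetector {
    private final double low, high;  // target force boundary
    private final long dwellMs;      // time required inside the boundary
    private long insideSince = -1;   // timestamp when force entered the band

    DwellDetector(double low, double high, long dwellMs) {
        this.low = low; this.high = high; this.dwellMs = dwellMs;
    }

    // Feed one force sample; returns true once the force has stayed inside
    // the boundary continuously for the full dwell duration.
    boolean update(double force, long timestampMs) {
        if (force < low || force > high) {
            insideSince = -1;  // left the band: reset the dwell timer
            return false;
        }
        if (insideSince < 0) insideSince = timestampMs;
        return timestampMs - insideSince >= dwellMs;
    }

    public static void main(String[] args) {
        DwellDetector d = new DwellDetector(2.0, 3.0, 500);
        System.out.println(d.update(2.5, 0));    // false: just entered the band
        System.out.println(d.update(2.5, 500));  // true: dwelled 500 ms
    }
}
```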