In computer science, an adaptive system is an interactive system that adapts its behavior to individual users based on information acquired about its users, the context of use, and its environment. Although adaptive systems have long been discussed in academia and have been an aspiration for computer scientists and researchers, there has never been a better time than today to realize the potential of future interaction with computer systems. Today’s networked information technologies can create rich, immersive, personalized experiences, track interactions, and aggregate and analyze them in real time. Together with the data collected by the sensors we carry in our smart devices, these abilities give us an unprecedented opportunity to design adaptivity and ultimately offer a better user experience, one that is both unobtrusive and transparent.
This article explains the fundamental concepts for utilizing smart device technologies and sensor data to understand context, and introduces “adaptive thinking” into the UX professional’s toolset. I will demonstrate the importance of context when designing adaptive experiences, offer ideas on how to design adaptive systems, and perhaps inspire designers to consider how smart devices and context-aware applications can enhance the user experience with adaptivity.
Examples of Adaptive Systems
An early example of an adaptive feature is found in GPS navigational devices. Using the device, a user can easily locate and navigate to any location he can drive to. When the sun sets or the car enters a tunnel, the system automatically changes the interface color to a dark “night mode” so as not to blind the driver with bright light from the device. The system knows the user’s exact location and the position of the sun, and by combining these two factors it maintains a safe driving environment by adapting to the user’s needs.
GARMIN Zumo 660 Day and Night Interface
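The day/night switch described above boils down to a small piece of context logic. Here is a minimal sketch; the function name, the sun-elevation input, and the use of GPS signal loss as a tunnel proxy are my assumptions for illustration, not how any particular device implements it:

```python
def choose_theme(sun_elevation_deg, gps_signal_lost=False):
    """Pick the interface theme from context.

    sun_elevation_deg: angle of the sun above the horizon at the
    device's current position (negative after sunset), which the
    device can compute from its location and the clock.
    gps_signal_lost: a rough proxy for driving through a tunnel.
    """
    # Switch to night mode after sunset or when the sky is hidden.
    if gps_signal_lost or sun_elevation_deg < 0:
        return "night"
    return "day"
```

The point is that neither input comes from the user: location, time, and signal quality are enough to adapt the interface without a single tap.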
Adaptive design is about listening to the environment and learning user patterns. Combining smart device sensor data, network connectivity, and analysis of user behavior is the secret sauce behind creating an adaptive experience. By combining these capabilities, we not only understand the context of use; we can also anticipate what the user needs at a particular moment. Google Now is an interesting example of an adaptive application that gives users answers to questions they’ve thought of rather than typed. Through a series of smart cards that appear throughout the day on the user’s mobile phone, Google Now tells you today’s weather before you start your day, how much traffic to expect before you leave for work, when the next train will arrive as you’re standing on the platform, or your favorite team’s score while they’re playing. It does this by recording and analyzing your preferences while you use your phone. For example, updates on your favorite sports team are based on your web browsing and search history. Similarly, by analyzing your current location, previous locations, and web history, Google Now presents a card with traffic conditions en route to your next likely destination.
As UX professionals, we understand that mobile users are not keen on using the virtual keyboard, and we try to avoid requiring it as much as possible. By using the user’s personal behavior as a sensor, together with smart device capabilities and voice commands (similar to iOS’s Siri), Google Now creates an adaptive experience that helps users avoid the virtual keyboard, further adapting to the mobile user’s needs and helping him quickly get the information he requires on the go.
Adaptive systems are not limited to mobile devices. Ubiquitous computing (UbiComp) describes an environment of smart devices and networked digital objects that are carefully tuned to offer us unobtrusive assistance as we navigate through our work and personal lives. Similarly, ambient intelligence (AmI) refers to digital environments that are sensitive and responsive to the presence of people.
Nest, The Learning Thermostat
Nest, the Learning Thermostat, is a great example of an adaptive system integrated into the home environment. Using a variety of sensors for temperature, humidity, touch, near-field activity, far-field activity, and even ambient light, it can detect whether anyone is home and how active the home is at any time. In doing so, it can automatically cut up to 20% off a home’s heating and cooling bills.
When no one is around, Nest learns to turn the heat down. When you come home from work, it knows that the heat should go back up. After the first few weeks, it learns when you come home from work and can turn the heat up before you arrive so that you come home to a warm house.
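The “learns when you come home” behavior can be pictured as a tiny scheduling model. The sketch below is my own illustration of the idea, not Nest’s actual algorithm; the warm-up window, temperatures, and the use of a simple median over past arrival times are all assumptions:

```python
from statistics import median

def predicted_arrival(arrival_history):
    """Predict tonight's arrival time (minutes after midnight)
    from a few weeks of observed arrival times."""
    return median(arrival_history)

def target_temperature(now, arrival_history, warmup_minutes=30,
                       comfort=21, setback=16):
    """Hold a low 'away' temperature, but start heating early
    enough that the house is warm when the user walks in."""
    if now >= predicted_arrival(arrival_history) - warmup_minutes:
        return comfort
    return setback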
In 1991 Mark Weiser wrote,
“The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.”
Nest is a great example of UbiComp and how technology can disappear into our surroundings until only the user interface remains perceivable by users.
These devices combine sensor and user data to provide a superior user experience by anticipating what the user might need before the need is expressed, and this is the future of UX design.
In contrast to traditional desktop systems, mobile devices are used in many different situations. However, mobile applications today rarely utilize the context of their use and hence are only usable for very specific purposes. For example, a city maps application for local businesses is used in different contexts: walking through town, sitting at home, or with no network connectivity. Today’s users can customize their device through preferences and settings or by choosing the application that works best for their needs. But even a user-centered design process, which assures a certain degree of user acceptance and yields a richer understanding of the context, cannot anticipate the requirements of all users and map them to a single best or optimal system configuration.
“Adaptive thinking” is a mindset that provides the tools necessary to significantly improve the user experience and enhance the intended purpose of the product by utilizing the technology that is readily available in every pocket. It is about learning the environment and the user and adapting to his current needs and situation. Therefore, designers should first design for the context of use and then design the set of functions that are triggered in relevant situations.
Here is an instructive case where adaptive thinking was used to create a mobile app for a bike sharing program. Bicycle sharing systems, also known as bike rental, are becoming an integral part of major cities around the world. Bicycle sharing helps reduce traffic congestion and air pollution and encourages locals to maintain a healthy lifestyle. A user looking to rent a bike will turn to a mobile application to find the nearest rental station with available bikes. The user will then use the application to navigate to the rental station (if he is unfamiliar with the city); this is the core functionality of the application. An adaptive system will realize when the user has arrived at the rental station and automatically offer additional options, i.e., adapt to the current situation. For example, it may offer him a quick way to rent a bike, a feature that was not available before arriving at the station. During the rental period, the system will anticipate the user’s needs, suggest nearby stations with available parking spots, and show him the current balance for the rental time.
By using assisted GPS capabilities, network connectivity, and an understanding of the user’s story at any given point in the product lifecycle, adaptive design will make the mobile application a reliable extension of the bike rental program.
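The bike-sharing scenario above is essentially a state machine in which available features depend on context. A minimal sketch, with made-up function names, a crude planar distance check standing in for real geofencing, and an arbitrary radius:

```python
from math import hypot

def nearby(user, station, radius=0.05):
    """Crude distance check on (lat, lon) pairs; a real app
    would use haversine distance and a tuned geofence radius."""
    return hypot(user[0] - station[0], user[1] - station[1]) < radius

def available_actions(user_location, station_location, renting):
    """Expose only the features that make sense in the current context."""
    if renting:
        # Mid-ride: anticipate the return, not the rental.
        return ["show_balance", "find_return_station"]
    if nearby(user_location, station_location):
        # At the station: surface the rent action that was hidden before.
        return ["rent_bike", "navigate"]
    return ["navigate"]
```

Designing the context triggers first (far from station, at station, mid-rental) and only then the feature set for each, as the article suggests, keeps the interface small at every moment.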
Adaptive and Responsive Design
An adaptive system is one that adapts automatically to its users according to changing conditions. Responsive design is a subset of adaptive design: an approach to web design in which a site is crafted to provide an optimal viewing experience across a wide range of devices. In my UX Magazine article “The Multiscreen Ecosystem,” I discuss how responsive design can also be adaptive by understanding the context of using a mobile device and by designing contextual paths.
Context for Adaptability
I quote below from the 2007 book The Adaptive Web, which discusses the importance of context for adaptive mobile guides. It frames adaptivity in mobile systems as context-aware computing, i.e., the ability to use information in the current context to adapt the interaction and the presentation of information to the user’s current situation.
“Understanding the context is an important prerequisite for the adaption process. Context is not just the location, but encompasses also information like the ambient noise or lighting level, the network connectivity or bandwidth, and even the social circumstances of the user. Furthermore, systems have to anticipate the user’s goals and intentions, which might be inferred from their actions or from physiological sensors and appropriate environmental sensors (e.g. light, pressure and noise sensors).
One prerequisite for adaptive systems is the proper assessment of the user’s situation. For this purpose, systems need to rely on a representation of relevant situations. Depending on the supported task, situations can be characterized by many different attributes. Therefore, designers of suitable adaptation for mobile devices need to look at a variety of spatial, temporal, physical and activity related attributes to provide effective assistance. For example, a mobile application that assists users in a shop needs to know about the current spatial environment of the users (e.g. which products are nearby), the temporal constraints of the user (e.g. how much time is available for shopping), the general interests of the users and their preferences (e.g. if the user prefers red or white wine with tuna), details about the shopping task itself (e.g. which items are on the shopping list and for which purpose the products are needed) and maybe even about the physiological and the emotional state of users (e.g. whether users are enjoying the shopping or not).”
That being said, understanding the locational context and the user’s story is now easier than ever. We can utilize the fact that we carry our phones wherever we go. These phones are packed with technology and information about the user that designers can use to understand context. The sophisticated technology in our pockets not only lets designers detect whether the user is walking, standing, or in a loud or quiet environment; it can also pinpoint a person’s location within a department store, down to a specific aisle.
AislePhone, an Israeli startup currently in beta, is developing a platform for precise in-store positioning that can determine a person’s exact position down to the specific aisle. With this technology, shopping with your mobile phone at hand will become a common experience, as mobile apps for supermarkets and other large retail stores utilize locational and user data to enhance the shopping experience, much like a personal shopping assistant in your pocket.
Google Indoor Maps allows users to view and navigate floor plans of several kinds of commercial locations such as airports, department stores or malls within Google Maps.
This technology understands not only your indoor location but also which floor you’re on. Depending on the data available, the map will show notable places in the building you’re currently viewing, including stores, restrooms, or the nearest food court.
With this type of technology, “you are here” directory maps will become obsolete. When you enter a mall or a department store, you will be able to locate and orient yourself using a smartphone, and this experience will adapt to your specific needs. For example, apps will offer you relevant discounts as you walk through the mall or highlight shops based on your gender and age.
Designing Adaptive Systems
Adaptive design integrates both subtle and obvious features. Often, adaptive qualities can be very subtle and unobtrusive: sometimes a seemingly small adaptive feature can greatly improve the overall experience. For example, did you ever notice that Google Search can read your mind? When you start typing, Google autocomplete (Google Instant) knows what you’re thinking after you’ve entered only three letters. It can do this because Google Search considers and records all search queries within a session in order to better understand the user’s intent. When a user searches for “The Beatles,” Google treats it as part of a research session and will help you quickly discover Ringo Starr or Paul McCartney as you enter the first three letters of their names; it understands the context of your search and compares it with other similar, popular searches. Another subtle feature that enhances the user experience could be a testing system for students that adjusts the difficulty of questions according to prior answers, or a music discovery app that looks at your current playlist, adapts to your taste, and helps you discover additional music you may like.
Google Instant understands the context of your search
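Session-aware ranking of the kind described above can be sketched in a few lines. This is only an illustration of the principle, not Google’s algorithm; the toy `RELATED` table stands in for the query-log statistics a real search engine would mine:

```python
# Toy relatedness data standing in for a search engine's query-log statistics.
RELATED = {
    "ringo starr": {"the beatles"},
    "ring sizes": set(),
}

def rank_suggestions(prefix, candidates, session_context):
    """Rank autocomplete candidates for a prefix, boosting those
    related to topics seen earlier in the same search session."""
    matches = [c for c in candidates if c.lower().startswith(prefix.lower())]

    def score(candidate):
        # One point per session topic associated with the candidate.
        return sum(1 for topic in session_context
                   if topic in RELATED.get(candidate, set()))

    return sorted(matches, key=score, reverse=True)
```

With “the beatles” in the session, typing “rin” surfaces “ringo starr” ahead of generic completions; without that context, plain popularity would decide the order.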
Although the experience should always be unobtrusive, adaptive interfaces need to be obvious so users understand the context for the adaptation and always feel in control. For a better experience, applications should allow users to manage adaptive features. For example, if at nighttime the interface changes to a darker night mode (like in navigational devices), the user should always be able to change it back manually. Or, if entering a shopping mall triggers a different experience, the user must understand the context for this adaptivity and want to embrace the added functionality.
Charles Darwin wrote,
“It is not the strongest of the species that survives, nor the most intelligent that survives. It is the one that is the most adaptable to change.”

As human beings, we adapt to our surroundings naturally; it is the key to our survival. As designers, we can draw on this inherent ability, using our physical senses and our brains to analyze what we would do in adaptable situations. For example, to communicate in a loud environment, we adapt by raising our voice to be heard. Similarly, an adaptive system will raise a device’s volume. In an even louder environment, we use hand gestures to get attention and focus our eyes on the other person’s mouth to try to read his lips. However, unlike computers, which can process multiple layers of data, human beings have limited sensory resources and limited cognitive capacity.
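The raised-voice analogy maps directly onto code. A minimal sketch of noise-adaptive volume; the thresholds and the 2%-per-decibel slope are invented for illustration, and a real implementation would smooth the microphone readings over time:

```python
def adapt_volume(ambient_db, base_volume=0.4):
    """Raise playback volume (0.0-1.0) as ambient noise grows,
    the way a speaker raises their voice in a loud room."""
    if ambient_db <= 50:          # quiet room: keep the base level
        return base_volume
    # Add 2% of full scale per dB above the quiet threshold, capped at 1.0.
    return min(1.0, base_volume + 0.02 * (ambient_db - 50))
```

Like the human speaker, the system hits a ceiling: past a certain noise level it cannot get any louder, and a truly adaptive design would switch modality (vibration, on-screen cues) instead.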
In today’s world, a person carries in one pocket more advanced technology than was ever before possible. An intelligent device like a smartphone is embedded with highly sophisticated sensors. These sensors, together with advanced computing power and network connectivity, can help us analyze and understand the context of use. The smart device’s ability to analyze the context of use in real time, together with an understanding of the user’s story, creates opportunities to provide an even greater user experience by adapting to the user’s needs.
Analyzing User Behavior
As in the Google Now example, analyzing the user’s behavior and his interaction with the digital world can yield a great understanding of the user’s context. Analyzing the user’s search patterns or which applications he downloads can tell us about his preferences and hobbies. Tracking current location and location history can reveal the user’s surroundings and the physical boundaries of his life, so we can learn which subway station he takes to work or where he likes to eat lunch. Note that when this is done without the user’s knowledge, it may be considered a breach of privacy and is illegal in many countries.
Here is a practical example of how analyzing the user’s behavior could help create an adaptive system. In the now-famous Google Glass video, we follow the user throughout his morning as he eats breakfast and then leaves his house heading for the subway. Upon arriving at the subway, the user receives a message that subway service is suspended and is offered a walking route. As useful as this may be, a truly adaptive system would analyze the user’s behavior as he gets up and warn him ahead of time that subway service is suspended.
Understanding the user’s behavior (whether he takes the subway or walks to work) and connecting it with information available online allows us to understand and adapt to his needs. Most of the time, a single data source is not enough; combining the technologies (network connectivity, user behavior, and sensor data) is the only way to understand context. For example, we can gauge the outside weather by combining the user’s current location with online weather information, and with this data offer numbers for nearby cab companies in addition to a walking route, assuming he may not wish to walk to work in the rain.
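The commute example combines three signals into one recommendation. A minimal sketch under my own assumptions (the signal names and the priority order are invented; a real system would weigh many more inputs):

```python
def commute_suggestion(subway_running, raining, usually_takes_subway=True):
    """Combine transit status (network data), weather (network data),
    and a learned habit (behavior data) into one recommendation,
    rather than relying on any single source."""
    if usually_takes_subway and subway_running:
        return "subway"          # the learned default still works today
    if raining:
        return "cab"             # a walking route would be unpleasant in rain
    return "walk"
```

Drop any one input and the suggestion degrades: without the habit we guess the wrong default, without transit status we send the user to a closed station, and without weather we propose a walk in the rain.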
Making Use of the User’s Story
Behavioral targeting, or personalization, refers to a range of technologies used by online publishers and advertisers to increase the effectiveness of their campaigns by capturing data generated by website and landing page visitors and adapting to their needs. Personalization technology enables the dynamic insertion, customization, or suggestion of content in any format that is relevant to the individual user, based both on the user’s explicitly provided details and on his implicit behavior and preferences.
Another aspect of personalization is the increasing prevalence of open data on the Web. Many companies make their data available via APIs, web services, and open data standards. For example, Pipl is a search engine designed to locate people’s information across the web. Pipl uses identity resolution algorithms to aggregate information and cross-link various sources before delivering an online profile containing a summary of everything that’s publicly available for each individual. Pipl makes all that wealth of information available to developers via an API. For example, by running an API request for an email address, one can determine the user’s gender, age, location, and interests, and provide an adaptive experience tailored to the individual user.
Understanding the user’s story requires a network connection. However, network connectivity is not only important for understanding the user and his online record; it is the vital instrument that connects all the other technologies together. Cloud computing, local weather, traffic conditions, or even the type of connection itself (Wi-Fi or 3G) can help us understand context. Ultimately, the possibilities inherent in understanding and designing for the user’s story, his context, are built upon the collection of sensor data and user data via the network.
A sensor for adaptive systems is any technology that allows a device to understand and evaluate context. It includes the built-in accelerometer in smart devices, a camera, a clock, or even a microphone. We can use the various sensors embedded in smart devices to better understand the user’s environment. For example, the built-in accelerometer can be used to gauge if a user is walking or running.
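The walking-versus-running idea reduces to looking at how much the accelerometer signal fluctuates over a short window. A rough sketch; the thresholds are illustrative guesses, and a production classifier would use step frequency or machine learning rather than a single dispersion statistic:

```python
from statistics import pstdev

def classify_activity(accel_magnitudes):
    """Very rough activity guess from a short window of accelerometer
    magnitude samples (in g). Thresholds are illustrative only."""
    variation = pstdev(accel_magnitudes)
    if variation < 0.05:
        return "still"      # near-constant 1 g: device at rest
    if variation < 0.4:
        return "walking"    # moderate periodic swing
    return "running"        # large impacts each stride
```

Feeding the result into the adaptive layer (e.g., larger touch targets while running) is where the sensor data turns into experience design.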
There are two main scenarios for using sensors. In the first, everyday objects transmit data such as temperature or noise level to other devices; for example, iGrill is a cooking thermometer and app that communicates with smart devices via a secure, long-range Bluetooth connection. In the second, smart device applications utilize the built-in sensors to receive, process, and output data to the user. By using these sensors and combining the other technologies discussed above, we can often obtain powerful information about the context of use and utilize it to create adaptive systems.
iGrill Cooking Thermometer
Sensors can be a powerful design tool of the future. For example, with the aid of sensors, e-commerce checkout could become as easy as logging into our bank account with no password. Here is an example of using four layers of sensor data to establish the user’s identity with a reasonable degree of certainty, creating a password-less login that presents the user with a “light” version of his bank account so he can quickly check his balance. Imagine a user at home, browsing to his bank account on his tablet computer. The first layer of security is the username associated with the tablet. The second is the location sensor, which gives us greater certainty that the user is in his home vicinity, cross-checked against his registered address with the bank. The third layer is the Wi-Fi network the user is on (its MAC address, a unique identifier assigned to the network). For the fourth layer, we can check for other nearby Wi-Fi networks (the neighbors are sure to have their own unique MAC addresses), which can also serve as verification. If these bits of data are consistent across several password logins, the system can adapt and allow the user to enter without a password.
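The four-layer check described above amounts to comparing observed context against an enrolled profile. A minimal sketch, with invented field names and an all-layers-must-match policy chosen purely for illustration (a real bank would use risk scoring, not exact matching):

```python
def risk_check(observed, enrolled):
    """Count how many contextual signals match the user's enrolled
    profile; allow a password-less 'light' session only when all
    four layers agree."""
    layers = ["username", "home_area", "wifi_mac", "neighbor_macs"]
    matched = sum(1 for layer in layers
                  if observed.get(layer) == enrolled.get(layer))
    # Any mismatch (new device, new place, new network) falls back
    # to the normal password flow.
    return "light_session" if matched == len(layers) else "password_required"
```

Note the graceful degradation: the adaptive shortcut only removes friction in the familiar context, and the conventional login remains the fallback everywhere else.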
To learn more about adaptive design and how to get from sensors to context, I highly recommend Albrecht Schmidt’s paper on context-aware computing in the Interaction Design Foundation Encyclopedia.
Today, we’re just starting to see the potential of using sensors and technology to connect devices and people. The term “Internet of Things” refers to uniquely identifiable, network-connected objects; for example, a smart flowerpot that sends a signal when it’s time to water the flowers. There is no doubt that adaptive design will play a key role in making future devices and functional user interfaces that give users intuitive control over their environments in any situation or context.