Never Make the User Feel Foolish

What the heck happened to the Xbox UX?

One morning I, and 250 million other Xbox users, woke to a new Xbox UX. Leaving aside strategic-business questions about the wisdom of a late-night imposition of radical UI change, the change itself violated so many basic tenets of UX best practice that I was, frankly, gob-smacked!

How could this thing have been green-lighted? What UX-designer preconceptions, focus, or habits led to such inappropriate decisions?   How did experienced and genuinely top-notch designers come to such wrong conclusions, and how on Earth did those conclusions make it into production and release?   What unconscious bias appeared in the PM and UX preparation of user-testing criteria that allowed these designs to succeed in user studies?   What misconceptions led to this radical degradation of an already adequate, albeit less-than-perfect, UX?   

Negative feedback and telemetry from that prior less-than-perfect UX obviously drove the effort at improvement. Certainly, all the best of intentions were applied to the correction of the identified faults. It’s even obvious, from the changes addressed by the update, where the perceived issues were and how that checklist of objectives was arrived at.

Unfortunately, there’s a gulf between the problems and the solutions deployed.

Empathic UX, focused on the user’s state of mind, feelings, and reactions evoked by the activities and environment of the interaction, not to mention a good deal of established UX Best Practice, could have bridged this gap with any number of successfully architected solutions. Instead, it’s almost as if individual bridge segments were built and dropped into position, each responding to an identified need in a vacuum, without a holistic, unifying structure.

Kind of like trying to cross a bridge made up only of patches. Not only would it not stand up by itself, but any attempt to cross this impossible structure would end in failure as users fall through the cracks!

Expectations, Habituation, and Reflex

First off, previous generations of the Xbox User eXperience have trained users to expect that the controller’s Home Button brings them Home, to the Home Screen. While learning the Home Button’s multiple uses, and the many features that the various controller buttons activate, might be less than totally intuitive, once learned they become rapidly ingrained and anticipated. Sadly, as a result of this recent update, the Home Button now brings the user not to Home, but to a new menu paradigm on the left side of the screen, which is now, at best, an additional two actions away from their intended target: the Home Screen.

If that isn’t counterintuitive enough, after the Home Button brings the user to a menu instead of a Home Screen, its focus lands on the mid-point of the menu, not the top, a universal habituated expectation across languages and cultures, ingrained since learning how to read. To add to this confusing eXperience, while on the secondary flyout submenu, the goal, the Home target, is at the top, nowhere near the user’s point of focus. This result is totally unexpected, and, to the casual user (i.e., nearly everyone using an entertainment device), it leads to several disruptive interactions.


One reflex is to immediately flick up to the highlighted link. While this is obviously not what the design intended, it is still instinctive, a result of that primary entrenched expectation of starting at the top. Of course, this moves the user to a different left-menu item, losing focus on Home entirely. Whoops: the user feels stupid, this (and every) time a physical reflex takes over from an intentional, but casual, act.

UX Best Practice is to co-locate the current point of focus with anticipated targets of sub-navigation, and for good reason. The new, highlighted Xbox logo “home” button anchors the user’s focus to mid-screen left, but the corresponding secondary menu has nothing highlighted, leaving the user’s eye without an immediate anticipated target. The user is then obliged to scan the screen, trying to find the actual way to their goal: the Home Screen. This unnecessary friction leaves the user feeling momentarily foolish because they can’t find where to go. Then, if the user has avoided the physical reflex noted in the preceding paragraph, they will (as clearly anticipated in the new design) navigate to the sub-menu on the right, either because they’ve found the Home menu item at the top, or simply because that secondary menu invites exploration and might aid in their search.

Once the user has navigated right, the top menu item, Home, highlights. This either successfully draws the user’s gaze, or is missed entirely because of its distance from the persistent point of focus at mid-left, the epitome of the reasoning behind sub-navigation co-location. Either way, this adds another speed bump that requires the user to consciously process what’s happening, because their goal is so far from that initial point of focus.

What used to be a one-touch controller operation has now become, at best, multiple potentially confusing touches, and the user still hasn’t reached their expressed goal.

Once the user has navigated right to the secondary menu, if they are a gamer they will probably see the highlighted Home menu item and hit the A button, one more touch, but thankfully the final one, and arrive at their goal: the Home Screen. If they are a more casual user, perhaps focused on using the Xbox as an entertainment center to access media like Netflix and YouTube, they may, unnecessarily but reflexively, attempt to use the controller’s joystick to navigate up to the Home menu item. They will now lose the highlight on the Home menu item, but will quickly recover by cycling up through the menu items until they refocus on Home and activate it. It should take only one such mistake to learn and adopt this interaction, but that learning process has, yet again, unnecessarily made the user feel foolish.

None of the above is a smooth, intuitive interaction. As Apple once said in their design guide, circa 2005, “never make the user feel stupid.” Many parts of this interaction are not something users are likely to ever get used to, because they run counter to very strong reflexes. The expected, reflexive behavior, established before this update, is constantly reinforced by nearly all games and other apps within the Xbox environment, which only emphasizes the incongruity of this new, counterintuitive interaction.

Beyond the hunt for Home

If this overcomplicated trip to “Home” were the only problem, it would be nearly intolerable on its own, but it’s only one of several annoyances introduced by the new menuing paradigm.

My second “favorite” is the left-to-right joystick navigation between the two menus. As nearly anyone habituated to a controller would attest, joystick movement is purposely somewhat imprecise, with valuable slop in its action that is essential to the fluid play of interactive games, the principal use case for the Xbox. That slop, and the studied control thereof, beneficial in-game, immediately becomes the enemy of the left-to-right use cases in this new navigator.

For most users, navigating these new menus is a casual, distracted, necessary-but-tolerated interstitial action, whereas gaming itself is a concentrated, highly intentional activity, which makes the two entirely different. This leads to constant, inevitable missed targeting, causing the user to ‘accidentally’ over-navigate to the right, one step too far. That introduces a high probability of additional user error, dismissing all menus prematurely and taking the user back to where they started: the very place they attempted to leave when all this surplus navigating began. Xbox has made the user feel incredibly foolish, again! Now the user has to re-initiate the interaction, sometimes over and over, only to find themselves frustrated and back where they started.

I have a suspicion as to how this menuing came to be. It is very closely related to tablet and touch navigation: a finger swipe sliding in from a point off screen. While such touch navigation is just as different from traditional PC-style navigation, it is not at all the same interaction environment as console/controller navigation. In other words, if designers attuned to the needs of tablet UI were assigned to the Xbox UI without proper redirection to study the interaction habits and profiles of console users (essentially consistent across Xbox, PlayStation, and Wii), this UI train wreck could be the result.

Consider this an educated guess, as I wasn’t a member of the Xbox team, but I did spend over 11 years at Microsoft, which leads me to paint this as a possible picture.

The best-laid plans oft go awry

Obviously, earnestly working towards the goal of accelerating direct access to additional applications from the new menu, essentially attempting to supplant the need for a Home Screen at all, a most-recently-used “Recents” section and a “Pins” section were added. A logical concept… but… if the user’s target proves not to be on the secondary-menu “Pins” list, there is an oblique “more content” icon which, instead of responding intuitively by adding more items to the list, in place, under the user’s point of focus, dismisses the entire new menu system and returns the user to the old Home Screen, navigating them vertically down to their Home Screen pinned applications. This, however, is right where the user would have been umpteen clicks ago had nothing ever changed, which is sure to make users wonder why Xbox made the change at all. This is not only a reversion; by falling back to the old menu upon failure, it shines an aggressive spotlight on the inadequacies and redundancy of the new menuing system!

There are always alternatives

I understand full well that these were most probably deemed hot fixes, required to be accomplished with minimal impact on the development team, but many of these problems could have been solved more elegantly, with less disruption to the user experience (and probably less dev time). I’ll give just a few examples; there are many more alternatives.

The secondary menu, off the primary Logo menu, is already sectioned into multiple subgroups:

  • Menu Items (Home, Games, Store)
  • Recents (redundant to what’s already on the home screen)
  • And finally, the utterly redundant Pins section, which requires scrolling off screen to discover it (“below the fold”)

Rather than remain married to the top-to-bottom menu structure even when the entry point is in the center, one could have the secondary menu fan out from that center, so that Home is directly co-located with the Logo button. Then place Games just above and Store just below, with Recents more prominent above, and Pins more prominent (in fact, visible) below.
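To make the geometry concrete, here is a minimal sketch, in TypeScript, of what such a center-anchored layout could look like. Everything here (the function name, the row numbering, the item order) is a hypothetical illustration of the fan-out idea, not anything from the actual Xbox codebase.

```typescript
/**
 * Lay out menu items so the first item ("Home") sits on the anchor row
 * (the row the Logo button occupies), with the remaining items alternating
 * just above and just below that center.
 */
function fanOutLayout(items: string[], anchorRow: number): Map<string, number> {
  const rows = new Map<string, number>();
  let offsetAbove = 1;
  let offsetBelow = 1;

  items.forEach((label, index) => {
    if (index === 0) {
      rows.set(label, anchorRow); // Home: co-located with the Logo button
    } else if (index % 2 === 1) {
      rows.set(label, anchorRow - offsetAbove++); // Games, then Recents: above
    } else {
      rows.set(label, anchorRow + offsetBelow++); // Store, then Pins: below
    }
  });
  return rows;
}

// Example: with a hypothetical mid-screen anchor row of 5, Home lands on the
// anchor row itself, so one rightward flick plus "A" reaches the Home Screen.
const layout = fanOutLayout(["Home", "Games", "Store", "Recents", "Pins"], 5);
// => Home: 5, Games: 4, Store: 6, Recents: 3, Pins: 7
```

The point of the sketch is simply that the most frequent target ends up under the user’s existing point of focus, with everything else radiating outward by decreasing priority.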

As for the over-flick-to-the-right problem, one quick ’n’ dirty approach could be an activation delay (200 ticks, at a guess) which would allow the user to flick back onto the menu, where they’d intended to be, before losing focus. Alternatively, an equivalent of greying out the non-active surface could be deployed, requiring the user to actively click the A button to intentionally choose a return to the prior application, or allowing them to merely flick back onto the menu, facilitating recovery from their inadvertent over-flick without navigational consequences.
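A rough sketch of the activation-delay idea follows, again in TypeScript with entirely hypothetical names and a guessed-at 200 ms grace window: dismissal is deferred briefly so that a corrective flick back to the left simply restores menu focus, while an explicit A press commits the exit.

```typescript
type FlickDirection = "left" | "right";

// Guards against accidental over-flicks off the right edge of the menu by
// holding the dismissal for a short grace window.
class OverFlickGuard {
  private pendingDismiss: ReturnType<typeof setTimeout> | null = null;

  constructor(
    private dismissMenu: () => void,      // commit: hand focus to the prior app
    private restoreMenuFocus: () => void, // recover: put focus back on the menu
    private graceMs = 200                 // illustrative "200 ticks" guess
  ) {}

  onFlick(direction: FlickDirection, focusIsOnMenuEdge: boolean): void {
    if (direction === "right" && focusIsOnMenuEdge) {
      // Focus just left the menu: don't dismiss yet, start the grace window.
      this.pendingDismiss = setTimeout(() => {
        this.pendingDismiss = null;
        this.dismissMenu();
      }, this.graceMs);
    } else if (direction === "left" && this.pendingDismiss !== null) {
      // The user flicked back in time: cancel the dismissal, keep the menu.
      clearTimeout(this.pendingDismiss);
      this.pendingDismiss = null;
      this.restoreMenuFocus();
    }
  }

  // The explicit-confirmation variant: only an intentional A press while the
  // dismissal is pending actually returns to the prior application.
  onAButton(): void {
    if (this.pendingDismiss !== null) {
      clearTimeout(this.pendingDismiss);
      this.pendingDismiss = null;
      this.dismissMenu();
    }
  }
}
```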

By adding a white translucence over the screen body, it becomes a tertiary navigation object. If the user over-flicks off of the menuing system, the screen body highlights, letting the user know that they’ve overcompensated and signaling that the prior application is now the navigation target. The user can then recover by navigating back to the left or, if their (unlikely but possible) intent was actually to return to the previous application, click on the screen body to confirm.
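Modeled as a tiny focus state machine, that alternative could look something like the sketch below; as before, the names and structure are assumptions for illustration only.

```typescript
type FocusTarget = "primaryMenu" | "secondaryMenu" | "screenBody";

interface NavState {
  focus: FocusTarget;
  screenBodyHighlighted: boolean; // white translucence when the body is the target
}

function onNavigateRight(state: NavState): NavState {
  switch (state.focus) {
    case "primaryMenu":
      return { focus: "secondaryMenu", screenBodyHighlighted: false };
    case "secondaryMenu":
      // Over-flick: the body highlights instead of everything dismissing.
      return { focus: "screenBody", screenBodyHighlighted: true };
    default:
      return state;
  }
}

function onNavigateLeft(state: NavState): NavState {
  switch (state.focus) {
    case "screenBody":
      // Recovery: one flick back and the menu is still there.
      return { focus: "secondaryMenu", screenBodyHighlighted: false };
    case "secondaryMenu":
      return { focus: "primaryMenu", screenBodyHighlighted: false };
    default:
      return state;
  }
}

function onAButton(state: NavState, dismissMenus: () => void): NavState {
  if (state.focus === "screenBody") {
    // Only an explicit A press on the highlighted body returns to the prior app.
    dismissMenus();
  }
  return state;
}
```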

Where there are these two or three alternatives, there are probably many, many more.   

What does it mean in practice to empathize with the target users?

Awareness of interaction context, objectively collecting user feedback, and openness to the inevitability of those pesky users rejecting the first few ideas are essential to effective UX design. While time pressures can lead to a decision to bypass essentials of the process, by respecting those essentials of best practice, user intent, user habituation, and user reflex, better outcomes can still be accomplished rapidly, without harming the experience of 250,000,000 loyal users, many of whom might very well be thinking about buying PlayStations when the next generation of consoles ships. That is the power of Empathic UX: the designer(s) placing themselves in the “moccasins” of their end users, attempting to feel what they feel, embracing their users’ goals, kinesthetically attempting to place themselves in the mindset and physicality of their users, and ultimately attempting to empathize with and feel those users’ emotional state.