Apple has released its new MacBook Pro, and the internet is abuzz with speculation about what it means for the future of the company.
The biggest new addition is what Apple calls the Touch Bar: a touchscreen strip that sits above the keyboard in place of the function keys and dynamically displays controls based on which app is running on the primary screen.
Much of the commentary I’ve heard so far holds that this bar is a sign that, eventually, more and more of the keyboard area will become a screen. The logic is that if a tiny screen is good, a larger one must be better.
I’m not so sure.
When thinking about a new product like this, I like to consider the question or thinking behind it. As in Jeopardy!, we already have the answer; what was the question? In my imagination, it was something to the effect of, “How do we integrate touch into our laptop line?” Or, to put it another way, “How can we make the Mac feel more like iOS?”
The answer to this question can be quite complicated. The first and most obvious answer is to make the screen touchable. I don’t think this is something that’s going to happen anytime soon. For starters, macOS isn’t designed with touch targets in mind. Interface elements are far too small and close together to provide good usability. A change like this would require a complete redesign of the operating system to accommodate touch to the level that iOS does.
The other issue is ergonomics. Apple has already argued that holding one’s arm out to touch the screen for long periods doesn’t work well. Other companies have side-stepped this issue by making their machines convertible. This could be a future product for Apple, but having two different tablets that run two different operating systems would bring a lot of confusion to its product line.
If the screen isn’t getting touch, the only other place to add it is the keyboard area. The trackpad has been enlarged quite a bit; perhaps it could become dynamic as well? Or the entire keyboard area? Maybe we could just type on a flat, capacitive surface that changes completely based on what we’re doing?
The problem here is one of focus. Imagine opening an application and watching the entire keyboard area change: the keys slide down, and little preview windows open above them. As you start interacting with the application, it would be too easy to focus on what’s happening under your fingers rather than on the screen in front of you. This approach would be powerful, but it would also nearly negate the need for a second “main” screen if all of your interaction happens with the content under your fingers. Why split focus between the two?
In application development it is common to have a toolbar: a row of icons that provides quick access to commonly used functionality within the application. A toolbar is distinct from the content area it acts on.
When one imagines a large dynamic display that sits underneath another display, the line between what is a toolbar and what is the content becomes quite blurry. However, a screen no taller than a keycap and no longer than the keyboard blends into the existing interaction area. It remains a toolbar.
I think the insight behind the Touch Bar is that it is meant to be overlooked. It is not meant to steal focus from the content; it is only meant to enhance and augment the modes of interaction that already exist.
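To make this concrete, it’s worth looking at how developers actually target the strip. Apple exposes it through AppKit’s NSTouchBar API: an app declares a small set of items, and the system decides how to present them alongside its own controls. Below is a minimal sketch in Swift; the identifier string, view controller, and toggleBold action are placeholders of my own invention, not Apple sample code.

```swift
import AppKit

// App-defined identifier for one Touch Bar item; the string is a placeholder.
extension NSTouchBarItem.Identifier {
    static let boldButton = NSTouchBarItem.Identifier("com.example.bold")
}

class EditorViewController: NSViewController, NSTouchBarDelegate {

    // AppKit walks the responder chain and asks for a Touch Bar when
    // this controller has focus; we describe ours declaratively.
    override func makeTouchBar() -> NSTouchBar? {
        let bar = NSTouchBar()
        bar.delegate = self
        bar.defaultItemIdentifiers = [.boldButton]
        return bar
    }

    // The system calls back here to build each declared item on demand.
    func touchBar(_ touchBar: NSTouchBar,
                  makeItemForIdentifier identifier: NSTouchBarItem.Identifier) -> NSTouchBarItem? {
        guard identifier == .boldButton else { return nil }
        let item = NSCustomTouchBarItem(identifier: identifier)
        item.view = NSButton(title: "Bold", target: self, action: #selector(toggleBold))
        return item
    }

    @objc private func toggleBold() {
        // App-specific: apply bold to the current selection.
    }
}
```

Note how the shape of the API reinforces the toolbar framing: the app supplies discrete commands, while the layout and the system-level controls remain under macOS’s control. Nothing about it invites an app to treat the strip as a second content surface.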
The problem Apple faces now is that, having decided this is the preferred way forward, it has to bring everybody else along. Many developers feel the new Touch Bar is a solution to a problem that doesn’t exist, so the feeling is that Apple is losing touch with its own community.
From a marketing perspective, I completely understand why the main configurations of this new pro laptop all include the Touch Bar: it forces the community and its users to have and use it, and it assures developers that there is a market of machines with the feature.
Down the road, it gives Apple more opportunities to expand the functionality. It has been revealed that the bar is essentially an iOS device, complete with its own processor. As the MacBook Pro continues to evolve, I imagine we’ll see more of this hybrid-device approach. One day we’ll reach a point where so much of the device already runs iOS that some type of unified computing platform will emerge. The interface to that platform will rely heavily on touch, but we could continue to see hardware keyboards, styli, and pointing devices as well.
I admit to being confused when I first saw this device, but the more I consider its implications, the more I see it as a step on the road to more powerful and interactive interfaces. So for now, Apple has kept this pro interested.