Text Highlighting in iBooks 2.1

The announcement of Apple’s new iPad brought along a number of updates to many existing iOS apps, among them iBooks. One particular new feature in iBooks immediately caught my attention:

Use your finger as a highlighter when swiping over text

I was particularly curious about this new feature because I’ve always felt that the poor text selection mechanism on iOS is a major hurdle that prevents me from getting any meaningful, serious work done on these devices. Maybe I’m an oddball (though I doubt it, having observed how some of my colleagues work), but text selection and extensive copy-pasting between apps is a common part of many of my workflows, whether it’s preparing a PowerPoint slide deck, programming, doing administrative stuff, writing reports and papers, or even something as mundane as writing an email. The promise of a new text highlighting feature in iBooks got me excited that Apple was exploring new means of text selection on their touchscreen devices. (Let me note that I’m aware that text highlighting is functionally different from text selection, but the two are pretty similar as far as interaction style is concerned.)

So I downloaded the new iBooks version and set out to give the new highlighting feature a try, immediately running into a problem: I couldn’t figure out how the text highlighting was supposed to work from Apple’s one-line description in the release notes. I swiped across sentences and paragraphs, left to right, only to find myself leafing through pages. I tried swiping top to bottom, thinking that maybe Apple was trying to avoid gesture overload by distinguishing swipe directions, but with no result. After a minute or two I gave up and turned to Google in the hope of finding a description of how the feature was supposed to work. So much for the intuitiveness of gestural interfaces…

When I couldn’t find any helpful tips or instructions I returned to iBooks, determined to figure it out myself. After a few more minutes of experimentation I finally discovered how text highlighting in iBooks works: before you can highlight text with a finger swipe, you need to initiate a very short long press on the text. If you begin your swipe gesture immediately, nothing happens; but if you first rest your finger on the screen for a very short while (I haven’t been able to time this, but it’s probably much less than half a second) and then swipe, the highlighting works.
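For what it’s worth, this “rest briefly, then swipe” mechanic maps quite naturally onto UIKit’s standard gesture recognizers. Here’s a minimal sketch of how an app could implement it; the 0.3-second threshold and the extendHighlight helper are my own guesses for illustration, not anything Apple has documented:

    import UIKit

    final class HighlightDemoViewController: UIViewController {
        let textView = UITextView()

        override func viewDidLoad() {
            super.viewDidLoad()
            view.addSubview(textView)
            // A long-press recognizer fires only after the finger has rested
            // on the screen for minimumPressDuration. A swipe that starts
            // moving right away never satisfies the press and falls through
            // to the page-turn gesture instead.
            let press = UILongPressGestureRecognizer(target: self,
                                                     action: #selector(handlePress(_:)))
            press.minimumPressDuration = 0.3 // guessed: "much less than half a second"
            press.allowableMovement = 10     // a little drift still counts as resting
            textView.addGestureRecognizer(press)
        }

        @objc func handlePress(_ gesture: UILongPressGestureRecognizer) {
            // Once the press is recognized, further finger movement arrives
            // as .changed events, which can drive the highlight.
            if gesture.state == .changed {
                extendHighlight(to: gesture.location(in: textView))
            }
        }

        func extendHighlight(to point: CGPoint) { /* hypothetical highlight logic */ }
    }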

The text highlighting works pretty great once you’ve figured it out, but it introduces another interesting new problem (and an inconsistency between iBooks and pretty much every other app on iOS). Normally on iOS, a long press on a body of text (where you touch the screen with your finger and leave it there motionless for a short period of time) opens a magnifying glass to start a text selection operation. While your finger remains on the screen, you can drag it around to move the magnifying glass and change your text selection, a very useful feature in case you positioned your finger ever so slightly incorrectly, which, let’s be honest, can easily happen given the size of a typical finger compared to the size of a single word on a tiny iPhone screen.
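For contrast, here’s a rough sketch of that standard magnifier interaction in the same terms; the loupe itself has no public API, so the plain UIImageView below is just a stand-in:

    import UIKit

    final class LoupeDemo: NSObject {
        let textView = UITextView()
        let loupe = UIImageView() // stand-in; the system magnifier is private

        @objc func handlePress(_ gesture: UILongPressGestureRecognizer) {
            let point = gesture.location(in: textView)
            switch gesture.state {
            case .began:
                loupe.isHidden = false // immediate feedback: the loupe appears
                loupe.center = point
            case .changed:
                loupe.center = point   // drag to fix a slightly-off touch point
            case .ended, .cancelled:
                loupe.isHidden = true  // lifting commits the word under the loupe
                selectWord(at: point)
            default:
                break
            }
        }

        func selectWord(at point: CGPoint) { /* hypothetical selection logic */ }
    }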


[Image: The magnifying glass is gone in iBooks 2.1]

Of course, now that iBooks hijacks the long press to initiate text highlighting, this couldn’t possibly work anymore. Instead, text selection in iBooks 2.1 works like this: you start a long press on the screen, and iBooks waits for you to move your finger (in case you want to highlight text), all the while providing no visual feedback whatsoever. If you don’t move your finger but lift it off the screen in place, the text selection operation is triggered, with the word your finger rested on selected. Gone is the magnifying glass for text selection. Compared to how text selection used to work in iBooks (and continues to work in other iOS apps), three major drawbacks immediately come to mind; a sketch of the resulting logic follows the list:

  • Because the magnifying glass is gone, you don’t get any immediate feedback while touching the screen that you are about to perform a text selection operation triggered by the long press. You also don’t get any feedback for your text highlighting operation until you move your finger. I guess that’s why it took me so long to figure text highlighting out…
  • You can no longer correct your initial text selection in case the position of your finger was a little off. Instead you have to manually adjust the selection using the two small handles to the left and right of the selection marker (a particularly onerous and annoying operation in my opinion), or you keep long-press-stabbing at the screen until you hit the right word.
  • Because people have a reasonable expectation of how text selection is supposed to work on iOS from prior experience with pretty much every other app, it becomes all the more difficult to figure out why nothing is happening when you long press on the text body in iBooks.
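To make the inconsistency concrete, here’s what iBooks 2.1’s long-press handling appears to reduce to, reconstructed purely from the behavior described above (the moved flag and both helper methods are hypothetical):

    import UIKit

    final class IBooksStylePressHandler: NSObject {
        let textView = UITextView()
        var moved = false // did the finger move during this press?

        @objc func handlePress(_ gesture: UILongPressGestureRecognizer) {
            let point = gesture.location(in: textView)
            switch gesture.state {
            case .began:
                // Drawback 1: nothing happens here. No loupe, no highlight,
                // no hint that the press has even been recognized.
                moved = false
            case .changed:
                // The finger moved, so this press is a highlight gesture.
                moved = true
                extendHighlight(to: point)
            case .ended:
                // Lifted in place: select the word under the finger.
                // Drawback 2: there's no way to correct the touch point
                // before committing, because movement already means
                // "highlight" rather than "reposition".
                if !moved {
                    selectWord(at: point)
                }
            default:
                break
            }
        }

        func extendHighlight(to point: CGPoint) { /* hypothetical */ }
        func selectWord(at point: CGPoint) { /* hypothetical */ }
    }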

I think it would have been preferable to keep this text highlighting feature as part of the action menu that pops up after a long press (of which it’s a part anyway) instead of introducing this confusing new shortcut.

Admittedly, text selection is a pretty arcane affair on iOS anyway, marred by inconsistencies across different apps and obscure shortcuts. Did you know that you can select a word in Notes by double tapping it? Of course that won’t work in Safari, because a double tap will trigger the context-aware zoom…
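That Safari/Notes mismatch is a nice illustration of why a small gesture vocabulary gets overloaded: each view can hang only one meaning off a double tap, and any competing recognizers have to be ordered explicitly. A minimal sketch, assuming a hypothetical view that wants both a single-tap and a double-tap action:

    import UIKit

    final class TapDemoViewController: UIViewController {
        override func viewDidLoad() {
            super.viewDidLoad()

            let doubleTap = UITapGestureRecognizer(target: self,
                                                   action: #selector(handleDoubleTap))
            doubleTap.numberOfTapsRequired = 2

            let singleTap = UITapGestureRecognizer(target: self,
                                                   action: #selector(handleSingleTap))
            // Without this, every double tap would also fire the single-tap
            // action; the single tap must wait for the double tap to fail.
            singleTap.require(toFail: doubleTap)

            view.addGestureRecognizer(doubleTap)
            view.addGestureRecognizer(singleTap)
        }

        // A double tap can mean "select word" (Notes) or "zoom" (Safari),
        // but one view can't give it both meanings at once.
        @objc func handleDoubleTap() { /* select a word, or zoom; not both */ }
        @objc func handleSingleTap() { /* e.g. toggle the reading controls */ }
    }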

More importantly though, this tiny issue got me thinking that we’re now at a point where we’re running into the expressive boundaries of multitouch gestural interaction, which inevitably leads to small inconsistencies like this one as we try to get the most out of a limited gesture vocabulary. Gestural interaction increasingly epitomizes a style of interaction devoid of consistency, affordances and intuitiveness. Look no further than Lukas Mathis’s excellent critique of iPhoto on iOS to see that this is no isolated case.

When I think back to how bright the future of gestural interaction looked in 2007, when the first iPhone launched, I can’t help but wonder at which point we lost the magic. Maybe we’re just trying too hard, trying to do too much with too limited a vocabulary. And maybe buttons and menus weren’t such a bad idea in the first place.

See also: Two Kinds of Gestures by Phillip Bowden.

CleanReader v3

In case you’re one of the probably two or three people (at best) who are using my CleanReader NetNewsWire style and also following this weblog: there’s a new version available. New features include dynamic image resizing (surprisingly easy to do in CSS if you spend five minutes on Google), saner margin handling and improved decluttering of widget cruft. No major changes, but I believe it’s well worth an update, especially if you’re using it on a smallish screen.

Aaron Straup Cope:

We did a lot of stuff wrong during my time at Flickr but if I had to highlight one thing we fucked up it was somehow creating an environment where people started to believe that their photos were not good enough for Flickr. I mean, really, how did we ever let that happen? I was speechless the first time a friend said that to me and for the record: It was never part of the plan. How did we ever let people think that there is one measure of photography? How did we let people imagine that a medium which gave the world both Ansel Adams and Garry Winogrand (a photographer who died with a reported 10,000 rolls of undeveloped film in his studio and who said that every time you take a picture you are hopefully risking failure) and everyone else in between was about anything other than the joy and the discovery of the possible, foofy equipment and technique and measures of “good”-iness be damned?

Bits as a Service

Tim Bray:

I’ll grant that the distinction between bits-as-bits and bits-as-a-service may not always be obvious. But it’s crucial, because people will pay for only one of the two. What seems crucial in making the distinction is that in the case of a service, I’m paying ahead, for a predictable response to the unpredictable future.

TV Is Broken

Patrick Rhone watches TV with his four-year-old daughter Beatrix for the very first time:

“Why did you turn the movie off, Daddy?”, Beatrix worriedly asks, as if she has done something wrong and is being punished by having her entertainment interrupted. She thinks that’s what I was doing by rushing for the remote.

“I didn’t turn it off, honey. This is just a commercial. I was turning the volume down because it was so loud. Shrek will come back on in a few minutes,” I say.

“Did it break?”, she asks. It does sometimes happen at home that Flash or Silverlight implode, interrupt her show, and I have to fix it.

“No. It’s just a commercial.”

“What’s a commercial?”, she asks.

“It is like little shows where they tell you about other shows and toys and snacks,” I explain.

“Why?”

“Well the TV people think you might like to know about this stuff.”

“This is boring! I want to watch Shrek.”

“I know, honey. It will be on in a bit. Just be patient.”

Android’s Broken Software Buttons

Alan Zeino:

I’ve spent seven days with my Galaxy Nexus and I still find it hard to believe that Google ships a mobile operating system with such a broken navigation system, centered around that one back button. Let’s begin with the downright obscene. The back button never, ever tells you if you’ve gone ‘back’ as far as you can go. Tap on that button ad infinitum until every screen/app in the device’s history disappears and you’ll find yourself at the home screen. But the back button will not dim, nor will it disable. You can tap on it to your heart’s content, and you won’t move (though your device will vibrate and chime).

It doesn’t get better from there. This resonates with how I’ve always felt about the Android back button, even before it was implemented as a software button: it’s a lazy design that doesn’t scale to the complexity of multiple navigation stacks.

We artists can only go so far as the people can follow us. We are not alone, we are part of the system. We can take risks, but if you want to go to the peak of your consciousness, you may very well find yourself alone. Even if you know how to translate what you see, maybe only ten people will be able to understand what you tell. But, if you have faith in your vision, and retell it again and again, you will start noticing that, after a time, more people will begin to catch up with you.

~Jean Giraud / Moebius / Gir

via.