Microsoft recently introduced the Surface Pro 3. Just like its predecessors, it strikes me as a solidly engineered, high-quality product that delivers something no one really asked for. This time around Microsoft was careful to position the Surface Pro 3 as an alternative not to Apple’s iPad, but to the MacBook Air (and obviously other laptops as well, though Microsoft could never admit to that without disgruntling its hardware partners).
I remain unconvinced that there’s significant demand for a tablet-laptop hybrid, especially in a form like the Surface Pro (if tablet-laptop hybrids are to succeed, I think something like Lenovo’s IdeaPad Yoga has a better chance). Most laptops today are much better at doing laptop-y things than the Surface. iPads are undoubtedly much, much, MUCH better at doing tablet-y things than the Surface. This tension is well exemplified by how Panos Panay, Corporate Vice President of Surface Computing, stressed the improvements made to the kickstand hinge, enabling more flexible viewing angles: the original Surface Pro kickstand supported only one viewing angle, the Surface Pro 2 kickstand supported two, and the brand-new Surface Pro 3 supports flexible viewing angles. This would be a solid improvement, except that laptops have supported flexible viewing angles for as long as I can remember. It took Microsoft two iterations and more than 18 months to fix a weakness they designed and created themselves.
I will admit that I wouldn’t mind the Surface’s aspirations as a full laptop replacement if I didn’t believe that these aspirations compromise the Surface’s qualities as a tablet. As it is, the Surface Pro in all its iterations strikes me as a tablet that’s trying so hard to be a laptop that it becomes a worse tablet for it.
You chose to use Helvetica for Businessweek. How do you perceive and use modernism?
I chose Helvetica because it felt like the wrong thing to do at the time. And because I was lazy. I am lazy. So I think in some ways I use modernism just like everyone else – lazily. I perceive modernism now (at 3:34pm on a rainy Tuesday afternoon) to be the result of design by committee, everyone trying to solve different problems, unable to agree on anything except a grudging mutual acceptance of what they hate least. An autocratic ideology where content is always king, the product is always the star, about a design system that gets out of the way. A simple, reductive form of slippery group-think. I see modernism being a fear of personality. Or rather personality that bends to the demands of the machine. Clean lines and an absence of mistakes. Tasteful and compliant. Easy to navigate. Fearful and elegant. Unarguably average. Confidently minimal.
032c Interviews (Former) Businessweek Creative Director RICHARD TURLEY.
100 years from now someone will have to explain why all the written artifacts from 2014 are stored as images. I would love to be around to hear the explanation.
Scripting News: [A rant re Twitter]
Gofor: An interesting piece of design fiction imagining the potential for a drone rental service.
Two Sentences About Getting Older and Working on the Web by Frank Chimero:
Now is the time to come clean: Github is confusing, Git is confusinger, pretty much everything in a modern web stack no longer makes sense to me, and no one explains it well, because they assume I know some fundamental piece of information that everyone takes for granted and no one documented, almost as if it were a secret that spread around to most everyone some time in 2012, yet I somehow missed, because—you know—life was happening to me, [...]
This hit a little closer to home than I like to admit. I’ve never used a CSS preprocessor in my life (mostly because I hate fiddling with toolchains and avoid them whenever possible) and AngularJS gives me headaches (even though I can appreciate its utility).
A few weeks ago I went to a local Mobile Monday event on the topic of wearable computing. Attendance was quite good; wearable computing apparently captures the current technology zeitgeist well.
Philipp Nagele of Wikitude was one of the presenters and panelists. Wikitude is an Austrian augmented reality company that jumped on the mobile augmented reality bandwagon a couple of years ago when it was all the rage. They offer both a consumer-oriented app for all major smartphone platforms and an SDK with developer tools. Naturally, Wikitude’s main interest in wearable computing is currently focused on wearable displays, HUDs and augmented reality glasses – eyewear similar to Google Glass.
Philipp shared three interesting things that stuck in my head:
First, he doesn’t use Google Glass personally because he doesn’t find it sufficiently pleasant and useful at this point. Which struck me as a curious thing to hear from someone working in a professional capacity on augmented reality applications :)
Second, he shared that most of their (current) business and income comes from business- and developer-oriented offerings, not their consumer-oriented smartphone apps. I didn’t find this particularly surprising, but it was good to have that hunch confirmed.
Third, he predicted that most of the early demand for Google Glass and similar applications will come from specialized industries and professions in vertical markets. Applications tailored for narrow and specific niches, to be used by highly-trained professionals in specialized domains (or maybe also to reduce the need for highly-trained professionals in these specialized domains). Like the solutions offered by companies such as Augmedix and Wearable Intelligence.
This makes sense: I’ve always had a hard time imagining what I could use Google Glass for. I already spend most of my day in front of a PC. When I’m not in front of the PC, my tablet or smartphone is always within reach. Pretty much the only use case I can see for myself right now is cooking, because I’m a terrible cook and getting cooking instructions while keeping both hands free seems somewhat useful to me. Maybe that will be the first wave of successful, useful apps for wearable displays: apps for people who don’t sit at a desk and who work with their hands (unfortunately that target demographic doesn’t include me).
As far as Google Glass’ appeal is concerned: I’m far more intrigued by the prospect of a wearable camera than a wearable display.
DeVaul insists that it’s often just as easy, or easier, to make inroads on the biggest problems “than to try to optimize the next 5% or 2% out of some process.” Think about cars, he tells me. If you want to design a car that gets 80 mpg, it requires a lot of work, yet it really doesn’t address the fundamental problem of global fuel resources and emissions. But if you want to design a car that gets 500 mpg, which actually does attack the problem, you are by necessity freed from convention, since you can’t possibly improve an existing automotive design by such a degree. Instead you start over, reexamining what a car really is. You think of different kinds of motors and fuels, or of space-age materials of such gossamer weight and iron durability that they alter the physics of transportation. Or you dump the idea of cars altogether in favor of a substitute. And then maybe, just maybe, you come up with something worthy of X.
The Truth About Google X: An Exclusive Look Behind The Secretive Lab’s Closed Doors
Invisible design propagates the myth that technology will ‘disappear’ or ‘just get out of the way’ rather than addressing the qualities of interface technologies that can make them difficult or delightful.
Intentionally hiding the phenomena and materiality of interfaces, smoothing over the natural edges, seams and transitions that constitute all technical systems, entails a loss of understanding and agency for both designers and users of computing. Lack of understanding leads to uncertainty and folk-theories that hinder our ability to use technical systems, and clouds the critique of technological developments.
As systems increasingly record our personal activity and data, invisibility is exactly the wrong model.
No to NoUI – Timo Arnall.