We had some reading to do for class. It wasn’t only that I took exception to the dated text; I get persnickety when it comes to presenting ideas about art, design, history, politics, and technology to artists. Then again, some say I am persnickety in general. Anyway, the first reading was Don Norman...
I was bored to tears by the insipid, clever whining of Don Norman about design. Reading Norman was reminiscent of someone trying to dodge accountability for keyboard-entry errors (born of a lack of understanding) by calling the anomalies computer “glitches.” His tone is not dissimilar to that of the individuals who detonated ALL of the fireworks at once for the 2012 Fourth of July display in San Diego (call it a case of premature detonation) and blamed it on a “glitch.”
“Glitches” went out as an excuse for failure among all but the most non-technical illiterati after the release of MS-DOS 2.0, right after “the dog ate my homework.” Though a case might be made for persistent glitches in Windows operating systems right up to XP Service Pack 2.
As usual, I Googled the writer. On his UCSD teaching site, Professor Norman had this to say: “Today’s technology, especially that of the Personal Computer, is too complex. But the potential is enormous.” You would have thought that an MIT EE & Computer Science graduate would have been a bit more forward-thinking. Professor Norman was certainly correct about the potential of the PC, even if he failed to grasp what was so unique about the protean nature of microprocessors.
Donning the mantle of cognitive scientist, he was able to focus on exploiting the insecurities of his own generation and of his seniors. He made a career out of being a whiner; inventing problems was a necessity of the job:
“Academics get paid for being clever, not for being right.” – Don Norman, 2011
Professor Norman’s quote is a good lead-in to what I thought while reading Bret Victor’s “A Brief Rant on the Future of Interaction Design.” Yes, I agree that the hand, with its accompanying gestures, is a possible interface.
Let us bourrée beyond our Cro-Magnon ancestry, beyond Bret Victor and his hammer. What of speech as an interface? I talk to the machine, and it talks back to me (ideally telling me that it did what I asked). If Apple quit being so Microsoft these days, maybe they would fix Siri, and we would be one step closer to a better interface for everyday uses.
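Even that loop is easy to sketch. Here is a minimal toy version in Python, assuming the third-party SpeechRecognition and pyttsx3 packages and a working microphone; the “open notes” command is purely hypothetical:

    # Listen for a spoken command, act on it, and confirm aloud.
    import speech_recognition as sr  # speech-to-text
    import pyttsx3                   # text-to-speech

    recognizer = sr.Recognizer()
    voice = pyttsx3.init()

    def say(text):
        voice.say(text)
        voice.runAndWait()

    with sr.Microphone() as mic:
        recognizer.adjust_for_ambient_noise(mic)  # calibrate for room noise
        say("Listening.")
        audio = recognizer.listen(mic)

    try:
        command = recognizer.recognize_google(audio).lower()
    except sr.UnknownValueError:
        say("Sorry, I did not catch that.")
    else:
        # Hypothetical dispatch; a real assistant would trigger real actions.
        if "open notes" in command:
            say("I opened your notes.")  # the machine tells me it did what I asked
        else:
            say("I heard " + command + ", but I do not know how to do that yet.")

The point is the shape of the loop, not the libraries: the machine listens, acts, and reports back.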
Or how about multiple manners of human-machine interface? Why must there be one exclusive interface? Whatever happened to choice? In my studio I have eleven hammers, each one for a different job. I cannot use a sledgehammer to drive a tack (accurately, anyway), any more than I can use a ball-peen hammer to drive a wedge to split a log: match the tool to the task.
While the future is never what anyone says it’s going to be, the seeds of tomorrow are all around us now. Engineers and designers will continue to experiment and innovate. Today’s machines will evolve in order to compete successfully in the global marketplace. Ultimately it will be the customer who determines the evolutionary paths of products.
About a year ago I was complaining to an Apple technician about not having control over the system font size on the iPhone. He patronizingly told me that Apple was planning on making smaller devices, not larger ones, and that I should look to another vendor for a phone. Now Apple is planning to roll out four- and six-inch phones in response to the pounding it is taking from Samsung and others.
Bummer that Apple has become so much like Microsoft. But this is the fast-moving near-present, not the future.