Tech and the Tyranny of the Familiar

I’m going to indulge in a rare bit of tech nostalgia, but not out of sentimentality. It’s for a good purpose!

Early Silicon Valley was exciting. It was also rough-and-tumble in both cultural and corporate ways. I thought Apple’s “90 Hours a Week and Loving It” (or other number) t-shirts were insulting and, because they were doled out by the company, manipulative.

The times were defining how personal computers would work, look, and operate, and for what kinds of things. Those decisions, or accidents of history, are fundamental to the familiar devices we use today.

The times were also establishing an overwork culture that still typifies much of the tech industry.

It is easy for today’s users to assume this is just the way computers are: Mouse and windows and keyboards. Disk drives, wires, and wi-fi passwords. Email and notifications and alerts. Software updates and licensing.

In how many ways has it all become intrusive? We don’t need to count them. We know our brains used to be occupied with other things, better or worse. Looking at it one way, much intrusiveness is a side-effect of the consumer side of the technology being very new. Immature, that is.

Growing beyond the ruling paradigms requires great innovation, and great innovation requires original thought.

Creators of the next evolution in tech must question their way back to the utmost fundamentals. Why a screen? Must we be seated in order to be productive? Could we get rid of pointing devices and select with a look and a blink — also fixing the widespread dry-eye problem?

Must a computer as a discrete device even be a thing?

We are not stuck with the familiar.

I wish we could resurrect some important presentations by Jef Raskin. Apart from his significant history at Apple, he’s easily remembered as the inventor of the Canon Cat. It was an also-ran contender when “word processors” still lined retail shelves and manufacturers were learning how to be marketers.

The Cat might have been brilliant, and I’ll leave the story of its history-and-demise to others.

Jef must have been disappointed it didn’t set the world on fire. His concepts of interface were novel and deep, made exquisite sense, and conferred impressive amounts of power on the user. Perhaps unparalleled. I never asked, but I wonder if he thought Apple’s “computers for the rest of us” Macintosh brand campaign was demeaning. Did it dumb-down PCs rather than elevate users?

What if a new approach to computing devices eliminated the need for 99% of all tech support? What if the clarity of a novel approach to interface — one that, who knows, allows users to transcend windows and menus and cursors — were so great that, instead of command-keystrokes and nested navigation and drag-dropping, people had more “It’s right here!” moments?

Users would be empowered in a way they would experience as confidence born of fluency. They would devote less attention to the device and more to their actual task. Device interaction would no longer be an intermediate layer between users and the object of their intentions.

If we don’t imagine it, we won’t conceive it.
If we conceive it but don’t test and share it…

Much specific expertise like Jef’s is lost. Some was so conceptual it would have been difficult to document. Some seemed frivolous or castle-in-the-sky thinking.

Innovation at the bedrock can be a costly process that has uncertainty as a core requirement. Some early work was deemed not predictably profitable enough, soon enough to invest in.

Forgotten specifics of the guiding philosophies from the early days of personal computing might no longer be pertinent. We can’t know that now. But even the generalized lessons learned or proposed then are lost. Lost too soon, I believe, to have sufficiently seeded the collective body of tech wisdom.

Tech’s next-ten-generations evolution will need thinkers and explorers who are willing to take a turn into truly novel territory. Individuals who are willing to risk exposing themselves to the (very) unfamiliar — and to be perceived by some peers as being out in left field — and committed to holding the human-machine interface paramount.

(Speaking too broadly, we suspect it is unlikely one can surpass current paradigms while coding within commercial production frameworks.)

Adventurers are needed if tech is to evolve so significantly as to be unrecognizably better. Such an evolution could unlock even greater potential in these devices for larger swaths of the population. Future users could wield much more power, much more fluently and effortlessly, if we will open ourselves to greater possibilities and nurture the visionaries among us.

We’ve barely scratched the surface.