Silicon Valley is doing what it always does: reinventing the future.
The PC future gave us power and capabilities previous generations could not have imagined. But it also chained us to our desks, and destroyed our penmanship.
The Internet future gave us thousands of Libraries of Congress at our fingertips, but it also gave us privacy intrusions, cyber-crime and lazy academic researchers who just go to Wikipedia instead of seeking out primary sources.
The social media future reconnected us with our high school friends and enabled us to stay in touch with far more people than we ever could before. But it also gave us new brands of trolling, stalking and privacy invasion.
Now, Silicon Valley is ushering in three new futures: the future of personal virtual assistants, the future of customized results and streams, and the future of information appliances.
Each of these futures brings with it unique benefits. But all will share the same bad result: They will take away our control.
Personal virtual assistants
Apple's Siri (http://www.apple.com/iphone/features/siri.html) is a small leap forward from the Android Voice Actions technology available on Google's mobile platform.
Siri and Android Voice Actions represent the dawn of a new era, one in which artificial intelligence agents will "understand" what we mean when we speak normal, everyday language.
They’ll place our words in context. They’ll learn. And most importantly, they’ll do things that go far beyond search results. They’ll buy things for us. They’ll make reservations. They’ll reschedule our meetings, and communicate with our friends, family and colleagues.
Apple’s Siri will mainstream this technology. Millions will be using it daily by the end of next year. Within five years, pretty much every person in every industrialized country will use personal virtual assistants as one of the main ways they interact with computers, the Internet and each other.
Over the months and years ahead, each new version of this virtual assistant technology will prove more capable of doing more things for us. We’ll tell our virtual assistants, “from now on, pay my electric bill.” “Go ahead and delete all email from Joe except for the important ones.” “When someone publishes an article on the upcoming presidential election you think I might enjoy, tell me about it.”
The more advanced this technology becomes, the bigger the decisions we'll rely on it to make for us. Choices we now make ourselves will be "outsourced" to an unseen algorithm. We'll voluntarily place ourselves at the mercy of thousands of software developers, and of blind chance.
We will gain convenience, power and reliability. But we will lose control.
Customized Results and Streams
Another big trend driven by Silicon Valley companies creates what author Eli Pariser calls a "Filter Bubble."
In order to improve the signal-to-noise ratio in our online information gathering and social networking, companies like Google and Facebook have been customizing results using algorithms that pay attention to user attributes and actions.
In Google's case, the company looks at dozens of user "signals" that affect your results when you search. The simplest example: search for the word "bank," and Google will use your location to offer local branches on the first page of results.
But, according to Pariser, Google also takes into consideration whether or not you travel, what your prior searches were, who your friends are, what your zip code says about you and even what your choice of browsers says about you. So when you search for things more interesting than “bank” -- say, “religion,” “sex” or “politics” -- your results may reflect all kinds of personal, individual “signals” that Google has collected on you.
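The kind of signal-based personalization Pariser describes can be sketched as a toy scoring function. Everything below — the signal names, weights, and data shapes — is invented for illustration; Google's real ranking uses far more signals and is not public.

```python
# Toy sketch of signal-based result personalization.
# The signals and weights are hypothetical, not Google's actual algorithm.

def personalization_score(result, user):
    score = 0.0
    # Location signal: boost results near the user.
    if result.get("region") == user.get("zip_region"):
        score += 2.0
    # History signal: boost topics the user has searched before.
    if result.get("topic") in user.get("past_search_topics", []):
        score += 1.5
    # Social signal: boost pages the user's friends engaged with.
    if result.get("url") in user.get("friend_liked_urls", []):
        score += 1.0
    return score

def personalize(results, user):
    # Note: this SORTS the full result list by personal relevance.
    # Nothing is removed; every result is still returned.
    return sorted(results,
                  key=lambda r: personalization_score(r, user),
                  reverse=True)
```

Two users issuing the identical query would get the same set of results here, just in different orders — which is exactly why "the results" quietly become "your results."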
You think you’re getting “the results” when in fact you’re getting “your results.”
In its defense, Google uses signals not only to give you what it thinks you want, but also to deliberately show you the opposite, whatever that may be. And Google doesn't "filter" results, as Pariser charges. It merely sorts them. The difference is that with sorting, you still get all the results; they're just prioritized.
Facebook, on the other hand, censors. The company's "EdgeRank" algorithm prevents most of your friends' posts from ever reaching your News Feed. Software is deciding what you see and don't see.
That same software judges your own status updates, too, choosing to censor them so that some of your family and friends never see them.
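The sorting-versus-censoring distinction can be made concrete with a toy feed builder. EdgeRank was publicly described as a product of affinity, edge weight, and time decay; the specific weights, decay rate, and cutoff threshold below are invented for illustration, not Facebook's real values.

```python
import math

# Toy sketch of an EdgeRank-style feed filter. The affinity * weight *
# time-decay shape follows public descriptions of EdgeRank; every number
# here is hypothetical.

def edge_score(affinity, edge_weight, age_hours, decay_rate=0.05):
    # Older posts decay exponentially toward zero.
    time_decay = math.exp(-decay_rate * age_hours)
    return affinity * edge_weight * time_decay

def build_feed(posts, threshold=0.5):
    # Unlike sorting, this FILTERS: posts scoring below the threshold
    # never appear in the feed at all.
    scored = [(edge_score(p["affinity"], p["weight"], p["age_hours"]), p)
              for p in posts]
    scored.sort(key=lambda t: t[0], reverse=True)
    return [p for score, p in scored if score >= threshold]
```

A fresh post from a close friend clears the threshold; a days-old post from a weak tie is silently dropped — the reader never learns it existed, which is the crux of the control problem.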