More than 20 years ago, I met an interesting guy named Jack Ganssle; interesting as in "He sailed solo across the Atlantic in a small boat" interesting.
Ganssle had an electronics firm that specialized in equipment for testing chips and other electronic gear. Like many people, he had started the business in a small corner of his house; by the time his family was living in one corner of the living room and the business had taken over the rest of the house, he realized it was successful enough to move into "real" leased space.
He had just started an Internet service provider called Softaid. (Please, no Viagra jokes; this was pre-Viagra.) He showed me an interesting new idea called the Internet, which was then strictly a playground for nerds and electronics types.
Still, his prediction stuck with me: “This is going to be the revenge of the liberal arts majors, because what you need is content.”
Of course, everyone now talks about content and how you have to be constantly updating your website with new stuff that appeals to many markets — and hopefully attracts a following.
Also, of course, much of that talk comes from people who would be delighted to write it for you, for a small fee (of course). And in our short-attention-span world, either you have interesting stuff or people click away within 20 seconds. So, Jack was way ahead of his time.
Talk to Me
What brought this to mind was an article in the Washington Post about the rise of virtual assistants and how this has created a market for people such as poets, historians and screenwriters to craft the responses to queries and requests that people make of their phones or computers.
Apple’s Siri was the first (or at least the first well-known) of the bunch, but now Microsoft has Cortana, and Google, Amazon and Facebook are joining the fray. What started as a vehicle for simple requests (“Remind me of a meeting at 10 a.m.”) has morphed into demands for research and answers on the meaning of life.
Several things have driven this trend. Artificial intelligence and machine learning have enabled more complex tasks: witness IBM’s Watson mastering Jeopardy!, and before that, Deep Blue conquering chess. And improving speech recognition makes it more likely that your phone will actually hear “Order roses for Kay’s mom” as just that, and not “Order fuses for a bomb” (Hello, NSA …).
The new versions of these programs are distinguished by their ability to chat, which requires more natural, flowing responses. They can even joke.
Because of this, the writers and techies creating new programs spend much of their time developing a “personality” for the app.
Should it be thoughtful? Occasionally flippant? Does it address you by your first name, or is that too informal? The personality type is then used to shape answers. It also has to handle political questions and the other stunts we humans try to pull, such as baiting it into repeating vulgarities.
Ask Microsoft about that. It launched (and quickly shut down) a “chatbot” named Tay on Twitter that started sounding like a Nazi because it was repeating things people said, including a few folks who were deliberately sabotaging it.
Then there are specialized programs, such as medical applications, that remind you to take your meds at certain times and ask you how you’re feeling. They can often refer you or pass your responses on to your real doctor if they recognize a problem.
People seem to expect more of a program (or app) with a personality than of one that strictly does tasks and responds robotically. There is also a catch: when an app seems too human, many people find it a bit creepy.
Anyway, virtual assistants are big business, spawning numerous startups that scooped up more than $35 million in venture capital last year (and who knows how much internally from the likes of Google). So, while most of the information dispensed could be found by doing your own Internet search, people seem increasingly willing to outsource the job to a phone.
Next Vulgarity, Please
My fellow curmudgeon, John Dvorak at PC Magazine, has also raised the question of how long until these applications get hacked. Tay, after all, had to be taken down; what happens when hackers get into Cortana and program it to answer you with a blistering series of obscenities? Many hackers would consider that humor of the highest sort. However, until robots can make really bad jokes or stupid plays on words, I know I’m safe. This column was written by a (semi-)human. That is all.
Cliff Feldwick is owner of Riverside Computing, and does PC troubleshooting and network setups for small businesses, when not trying to get his phone to answer his door and bring him a cup of tea. He can be reached at 410-880-0171 or at [email protected]. Older columns are available online at http://feldwick.com.