There’s an unformed thought I’ve had bouncing around in my head since I first tried out Siri on my 4S. I’d have liked to develop it into something more robust, but it’s a pretty simple idea so I’ll just leave it here.
You can’t say to Siri, “I guess I’d like to do something later like maybe a movie or just something laid back like coffee in the area or probably not a bar because they’re so noisy—maybe one of those coffee places with lots of books and definitely not Jitters because that place smells like underarm” and expect Siri to help you in any way.
You learn quickly that Siri has certain expectations, certain limitations, and must be spoken to with a certain cadence reflecting a certain pattern of thought. Speaking to Siri is a lot like speaking to someone whose English isn’t so strong. It works better if you naturally pre-diagram your sentences and order them rudimentarily.
From Siri’s acceptance or rejection of our commands and requests comes a feedback loop that trains us to constrain our thoughts to the crucial data.
As we learn to speak to Siri, we’ll learn more about how we formulate ideas into words, and how to express those ideas so they may be understood with a smaller margin of error, ultimately shortening the gap between intention and comprehension.
It’s safe to assume that as we learn to talk to Siri, Siri learns to listen to us. So we’re not simply assimilating with the robot culture, we’re fostering a new understanding between our vastly different types of intelligence.
Which is to say, Siri will teach us how to talk to Siri, but maybe more importantly, how to talk to each other.