The idea came to me while on holiday with my wife; I wrote it up on our plane trip home so I could submit it to an internal "Think-Week" call, but unfortunately I missed the submission deadline by 30 minutes. So the blogosphere can now vet my idea 🙂
Respectfully submitted to the world,
How to Interpret Lassie
When Lassie comes running up to the farm barking, Timmy’s adoptive parents, Paul and Ruth Martin, could begin crunching all of the data about the nearby wilderness and create a massive search party to meticulously canvass the entire county (big-data approach). Instead, they use less data and concentrate more on context, behavior, and emotion to determine Timmy is most likely in some form of trouble down at the abandoned mine. [Incidentally, of all his mishaps, Timmy was never stuck in a well.]
Context ranges from local observation (coal dust on Lassie’s coat) to greater breadth (Timmy’s questions about the mine at breakfast).
Behavior comes from the parents knowing how their child reacts to a specific directive: "Timmy, I don’t want you playing near the abandoned mine, it’s not safe" is too often an open invitation to a 7-yr. old.
Emotion shows its evidence in Timmy’s concern for the baby raccoons he mentioned seeing near the abandoned mine after their mother was killed by a wolf (disobedience under good intentions).
Only four to five data points under the umbrella of context, behavior, and emotion are required to inform the Martins with the greatest accuracy that Lassie’s presence and barking (more contextual data) means they need to get over to that abandoned mine, pronto.
The above illustration forms the basis of this thesis: modern computing systems need to stop out-thinking themselves, meticulously searching through mountains of data only to present smaller, prioritized mountains of options a consumer must further digest. Instead, at the base-class level, they must factor in cues from contextual interpretation, behavioral observation, and emotional interpolation, working together to inform computational behavior.
This thesis is not about putting humanity into computational systems; rather, it’s about computational devices and services reading the humanity around them to simply do what is being asked of them.
Big Data is Dead
Perhaps a better way to phrase this section is, "I most likely will be dead before we learn how to analyze big-data to gain insight in real-time."
There is no question there is value in the pursuit of insight from the many data sources we create and interact with. What’s missing, unfortunately, are actual outcomes from any insights!
For years I’ve checked the option to send data anonymously to Microsoft and other application providers under the "promise" of product improvement. I am still awaiting improvements from the many insights I’ve provided through my usage, and even more so from the specific comments I provide when overly frustrated with computational behaviors.
I can only assume that the millions of data sources providing a continuous stream of "insight-data" back to application providers simply overwhelms any hope of gaining actual meaningful insight.
On the surface, it seems as though our computational systems are spending more and more time searching for what they will do next and less time finding what the user/consumer really wants.
From a public search engine to an enterprise portal, simply extracting context, behavior, and emotion from successive search strings can build greater insight into the desired results than any neural network or AI algorithm. Another illustration:
To a conventional search engine, each successive search string is atomic and independently requires the same excessive computational load because the engine treats every request equally. The application is simply doing what it’s asked and has no concept of computational failure.
A Context/Behavior/Emotion (CBE) aware search engine, however, would quickly realize it is fast approaching a state of computational failure thanks to its acute observation of the humanity in successive search strings. Less data, more accurate interpretation. Now adjust the computational loads accordingly.
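As a sketch of what "reading the humanity in successive search strings" might look like, the snippet below treats near-rewordings of the previous query as a behavioral cue that the prior results failed. The function names, similarity measure, and threshold are all my own illustrative choices, not part of any existing engine:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Rough lexical similarity between two search strings (0..1)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def frustration_level(history: list[str], threshold: float = 0.6) -> int:
    """Count successive queries that are near-rewordings of the one
    before them -- a behavioral cue that earlier results failed and
    the engine should change strategy rather than repeat itself."""
    level = 0
    for prev, curr in zip(history, history[1:]):
        if similarity(prev, curr) >= threshold:
            level += 1
        else:
            level = 0  # topic changed; the user has moved on
    return level

# Three near-identical queries: a conventional engine treats each as
# atomic; a CBE-aware one sees mounting failure and adapts.
queries = ["lassie 1954 cast",
           "lassie 1954 cast list",
           "lassie 1954 full cast list"]
```

A real implementation would fold in dwell time, click-backs, and the other contextual cues discussed above, but even this toy shows how little data is needed to detect computational failure.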
Context – thankS for Nothing!
Multiple levels of context can be utilized together for greater CBE insight. In the Lassie story, the coal dust on Lassie’s coat was an immediate, localized cue. The conversation at breakfast was an additional cue.
Let’s map out a more concrete example using the Touch Keyboard and Handwriting Panel (tabtip.exe) that I’m using to write this. Below are two messages with concentric circles illustrating the multiple levels of context from which the software [should] draw:
The word "thanks" was interpreted with advanced handwriting analysis, stored data, and corrective history, yet it still came out wrong. Let’s see how contextual insight increases the accuracy for "thankS":
This is obviously an oversimplification for illustrative purposes, however, the point clearly remains: context eliminates the search of isolated data points and allows software to find the desired result in very few iterations.
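To make that concrete in code, here is a minimal sketch of a recognizer re-ranking its candidates using two of those concentric circles: the user’s own correction history (inner circle) and the position in the message (outer circle). The weights, word list, and function are invented for illustration; this is not how tabtip.exe actually works:

```python
def pick_word(candidates: dict[str, float],
              correction_history: dict[str, int],
              closing_a_message: bool) -> str:
    """Re-rank raw recognizer candidates with two contextual circles."""
    CLOSINGS = {"thanks", "regards", "cheers", "best"}

    def score(word: str) -> float:
        s = candidates[word]                                # raw confidence
        s += 0.4 * min(correction_history.get(word, 0), 2)  # inner circle
        if closing_a_message and word in CLOSINGS:          # outer circle
            s += 0.5
        return s

    return max(candidates, key=score)

# The recognizer slightly prefers the garbled "thankS", but context
# overrules it without searching any additional data.
best = pick_word({"thankS": 0.55, "thanks": 0.50},
                 correction_history={"thanks": 3},
                 closing_a_message=True)
```

Note that the contextual layers only re-rank a handful of existing candidates; no new search over isolated data points is needed.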
Behavior and Emotion
A small vocabulary is all that’s required to categorize computational observation. The observation itself is less complicated than it sounds if you factor in and utilize the many sensors at our immediate disposal. Yes, if there’s a user-facing camera and microphone, the base CBE class should be watching and listening to report the state of behavior and emotion.
My Xbox One uses facial recognition to log me in. When it fails to recognize me (often, unfortunately), it apologizes and says it will try harder next time. Nice, putting humanity into the user experience; however, it is still not getting any better at recognizing me. By the time I grab the controller and log in manually, the sensors in the controller reporting hostility and my facial expression/body language reporting frustration should clearly be interpreted as "epic-fail".
A study of "The Platinum Rule" provides comprehensive guidance for interpreting behavior and emotion. This can then be used to craft the most effective computational behavioral response to contextual inputs.
Behavior and emotion paired with context provide the most direct and profound insight into a user/consumer’s immediate needs, the kind of insight from which a direct software response can evoke the "find" elation most sought.
It’s not about "Her"
A recent movie relating the connection between human emotion and artificial intelligence, based on device sensorial receptivity paired with compute-anywhere technology, has raised awareness of how near or far we are from bringing humanity into software. It and systems like Apple’s Siri strive to bring an emotional connection to everyday computing by mimicking human-like behavior in their responses.
This is not what CBE strives to do.
CBE is more focused on reading the user/consumer directly and responding with a focused computational (non-human) behavior.
Call to Action
Now is the time to break some development cycles away from traditional big-data, cloud-calculated insights and begin experimenting with a CBE framework. A CBE framework can give simple applications unprecedented insight into human behavior through both passive and active (sensorial) observation.
Today, these simple apps may come in the form of CBE-based word suggestions on your phone, where how hard you press influences which words are offered.
This then grows into audio cues influencing word selection, and later facial recognition directing an appropriate form of response.
Placing the basics of these observations into a base-class CBE framework means applications need only make a call to ascertain the three values of CBE and respond appropriately.
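Sketched in code, such a base class might look like this. Every name here is hypothetical; the point is only that the application’s side shrinks to a single read of the three CBE values:

```python
from dataclasses import dataclass

@dataclass
class CBEReading:
    context: str    # e.g. "login_screen"
    behavior: str   # e.g. "retrying"
    emotion: str    # e.g. "frustrated"

class CBEFramework:
    """Hypothetical base class: the platform keeps the reading current
    through passive and active (sensorial) observation; applications
    only ever call read()."""
    def __init__(self) -> None:
        self._reading = CBEReading("unknown", "idle", "neutral")

    def _update(self, context: str, behavior: str, emotion: str) -> None:
        # In a real system this would be driven by sensors and history,
        # not called directly as it is in this sketch.
        self._reading = CBEReading(context, behavior, emotion)

    def read(self) -> CBEReading:
        return self._reading

# An application adapts with one call and no raw sensor handling --
# e.g. the Xbox login falling back before "epic-fail" is reached.
cbe = CBEFramework()
cbe._update("login_screen", "retrying", "frustrated")
r = cbe.read()
action = ("offer manual login"
          if r.behavior == "retrying" and r.emotion == "frustrated"
          else "proceed")
```

Whether the three values are strings, enums, or richer objects matters less than the shape of the contract: observation lives in the framework, response lives in the app.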
The final question is not how to make technology search for answers on your behalf, but rather how we teach technology to actually listen to and observe you, to find where it is you want to be.
[Wikipedia, 16 February 2014] Lassie (1954 TV series), http://en.wikipedia.org/wiki/Lassie_(1954_TV_series)
[english-at-home.com, 2013] English words that describe behaviour, http://www.english-at-home.com/vocabulary/words-that-describe-behaviour/
[english-at-home.com, 2013] English words that describe emotion, http://www.english-at-home.com/vocabulary/english-word-for-emotions/
[Mars Cyrillo, CI&T, 16 February 2014] The world we see in the movie Her isn’t far off, http://venturebeat.com/2014/02/16/the-world-we-see-in-the-movie-her-isnt-far-off/
[Ilya Gelfenbeyn, Speaktoit, 15 February 2014] After Her: Why our love of technology will remain unrequited, http://gigaom.com/2014/02/15/after-her-why-our-love-of-technology-will-remain-unrequited/
[Alessandra, Tony and O’Connor, Michael J., 14 December 2008] The Platinum Rule: Discover the Four Basic Business Personalities and How They Can Lead You to Success