The Death Of The Search Engine – The Birth Of Personalized Digital Assistants

KIMI - My Personal Digital Assistant

The Days Of Googling Everything Will Soon Be Gone
—Replaced By A Digital Assistant That Delivers Only What You Need To See, Before You Need To See It—
And That Day Is Unimaginably Close!

Who is going to deliver this amazing new way of using our machines you ask?

Well actually, it may well be Amazon… (more on that in a minute)

What applications will be successful in a world where you are less tethered to your phone?

Applications that are not Android, iPhone, Facebook, Amazon or Microsoft device-specific. The applications that will dominate the second wave of Google Now, Apple’s SIRI, Facebook VR Worlds, Amazon A9 and CORTANA will take the reins from the big five and make it less about them and even more about YOU!

Immersive at home and in the office, light and hands-free while mobile via WEARable devices, this second wave of computer-learning algorithms will train the machines, and hence the cloud apps that alert you. Knowing when something must be seen immediately, how important it is to you, and when it is not, will be key. But more importantly, predictive applications will know it must be seen by you now, before all others…

My predictions

Your personal assistant will not be named SIRI, or CORTANA, or Google. Your personal digital friend? It will bear a name you give it, or perhaps even choose its own name.

Your personal assistant will not be a blue girl or a Bieber boy. It will behave uniquely and have a personality, look, style, accent, gender, or even be genderless, shaped by algorithms that mirror your needs. In some cases, you won’t get to pick. Your own personality, needs, wants, desires: they will shape this intelligence.

Very immersive, personal, private environments in the home will be customized to the people in the room. Imagine having dinner with the family as the Facebook pictures Grandma just posted float on the photovoltaic glass walls around the dining room table.

No need to touch anything while you eat. Gesture control is only a year away now.

Business- and data-based immersive environments in the workplace: again, photovoltaic glass, with the data on it customized to the particular user. Extremely narrow for the worker drone in the cubicle, more expansive for team leaders, and broader yet for the various rungs of management on up through department heads and finally chief officers.

Wife at Victoria’s Secret, husband at work: as she (or he) tries on various pieces of lingerie, she could share the virtual image of different pieces and colors privately with her other half in the office. Talk about a promise of what’s awaiting you when you get home from work, to get you through a bland work day.

So Who Owns What Augmented Technologies?


Google is calling its latest technology WEAR. So let’s take a look at what Google released at Google I/O just over two weeks ago.

These new wearable devices have technologies built specifically for them, and the watches will only pair with Android 4.3 and 4.4 phones, currently 23% of Android devices. Right now almost no apps take advantage of these watches, as most tech writers will tell us, but new apps for WEARables are being built as we speak.

UPDATE: Healthcare giant Novartis, parent company of the contact lens manufacturer Alcon, will join forces with Google to bring its smart lens technology to market. Through the partnership announced Tuesday, Alcon will license the technology and co-develop the lenses with Google for a variety of ocular medical uses.

And so, with the Novartis alliance, we get one step closer to combining Google WEAR, Google Glass and the contact lens into one PLENZES style ocular device.


Amazon led the mobile frontier with the KINDLE and its predictive A9 search engine, and it has continued to compete well with the iPad and Android tablets; the KINDLE Fire was again the #1 best-selling Amazon Christmas item in 2013.

UPDATE: Amazon Hires The Father Of Wearable Computing -Babak Parviz- From Google

“Google Glass is one answer to that question; it’s not necessarily the definitive answer.” – Babak Parviz, Wearable Technologies Conference. Source: CNET

Now before you stop reading this because you think Amazon is nothing but the Walmart of the Internet, remember that my friend David Amerland achieved his best-seller status on Amazon with Google Semantic Search, and Jeff Walker made the New York Times Best Seller List at #1 last week via his Amazon sales of LAUNCH.

Here is the video that inspired “PLENZES” and “PLENZES – Birth Of The Machines,” starring… You guessed it, Babak Parviz, in February 2012, just after or just before leaving Microsoft and moving to Google.


We have all been waiting to see what Facebook would do to enter this new world of augmented and virtual reality, and OCULUS RIFT is the answer.

OCULUS RIFT could produce Facebook virtual worlds, where we see facets of subjects, surfing highly customized 360-degree UIs (user interfaces). Something my friend Boomhauer and I originally envisioned as “The Diamond” UI in 2008, now known as “faceted search” and first put to heavy use on Amazon.
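The faceted-search pattern itself is easy to sketch: each facet is a filter, and stacking filters narrows the result set. Here is a toy version in Python; the product data and field names are purely hypothetical.

```python
# Toy faceted search: narrow a result set by stacking facet filters,
# the pattern Amazon popularized. Product data here is hypothetical.
products = [
    {"name": "Kindle Fire", "brand": "Amazon", "color": "black"},
    {"name": "iPad Air", "brand": "Apple", "color": "white"},
    {"name": "Nexus 7", "brand": "Google", "color": "black"},
]

def facet_search(items, **facets):
    """Keep only items that match every selected facet value."""
    return [item for item in items
            if all(item.get(k) == v for k, v in facets.items())]

black_tablets = facet_search(products, color="black")                  # 2 hits
amazon_black = facet_search(products, color="black", brand="Amazon")   # 1 hit
```

Each added facet only shrinks the candidate list, which is what makes the “diamond” style of drilling into a subject feel natural to the user.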


Of the big five, Apple is the one I expect the most from, and soonest. Why?

Because there has been no major update to SIRI.

Because there has been no release on the iWatch front. Mike Elgan makes great points here on what is right with the Apple Watch, what is wrong and why he believes Apple knows more about WEARables than Google.

Just one of the great points Mike makes on how the Apple Watch becomes part of you:

The Apple Watch not only brings your senses of touch, hearing and sight into a unified experience, it also uses your skin for authentication. You can use the fingerprint sensor on the connected iPhone to authenticate the Apple Watch to make purchases via its NFC chip and Apple Pay. It will remain authenticated as long as the bottom of the watch remains in contact with your skin. Once contact is broken, the watch is de-authenticated (until you repeat the authentication process). In other words, the watch can buy stuff for you as long as it’s part of you. As soon as it’s no longer part of you, it loses its purchasing power.
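Mike’s description is essentially a small state machine: authentication is granted via the phone and held only while skin contact persists. A toy model of that flow, with class and method names that are my own and purely illustrative:

```python
# Toy model of the skin-contact authentication flow described above:
# the watch can pay only while it is authenticated AND on your wrist.
class Watch:
    def __init__(self):
        self.on_skin = False
        self.authenticated = False

    def put_on(self):
        self.on_skin = True

    def authenticate_via_phone(self):
        # Stands in for the iPhone fingerprint step in the quote.
        if self.on_skin:
            self.authenticated = True

    def remove(self):
        self.on_skin = False
        self.authenticated = False   # contact broken: de-authenticated

    def can_pay(self):
        return self.authenticated and self.on_skin

w = Watch()
w.put_on()
w.authenticate_via_phone()   # the watch can now buy things for you
w.remove()                   # and now it cannot, until re-authenticated
```

The design point is that the security check is continuous (skin contact) rather than a one-time login, which is exactly what makes the device feel like “part of you.”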

But Apple did release SWIFT, a sweeping, hugely embraceable programming language, possibly replacing Objective-C on Apple devices.

Apple’s possible partnership with IBM, using WATSON (the machine that won Jeopardy) as its core digital personal assistant, is probably the future. Read how WATSON is being tested in the boardroom here…

Apple’s overall lack of new presence on the machine learning / artificial intelligence front tells me something very, very big is looming. So don’t let the fanboy in you deal them out just yet. OK?

As annoying or revolutionary as SIRI was (you pick), I expect Apple to hold it all back until they can deliver something that is not the next thing in personal digital assistants, but the only thing…

So What About The Sleeping Giant Microsoft?

Like Apple, Microsoft has said they don’t just want to wow you, they want to blow you away. Considering the bashing Microsoft has taken lately, they have no choice but to deliver. Here are the pieces of what they are putting together.

Microsoft’s Marcus Ash and Rob Chambers sat down with Danny Sullivan, Search Engine Land’s founding editor, for a keynote conversation at SMX Advanced earlier this month. They talked about how they feel digital assistants, specifically Microsoft’s CORTANA, make people do more searches, as well as the company’s broader thoughts on CORTANA, their personal digital assistant.

Next up in the Microsoft arsenal is their new asynchronous distributed computing model, called ADAM… [read every word of this – Chris]

Gesture and voice recognition on PCs in your home? KINECT is breaking away from XBOX: You can see what it does and preorder the KINECT V2 Sensor here and download the SDK for developers there. If you want to see what developers are doing with KINECT V2 the video gallery is here…

Just What Will These Devices Have To Do Next
Before They Sell Themselves To The Cool Kids?
The Moms?
The You That Lives On Your Phone?

I myself have no interest in seeing Google+, Facebook and Twitter updates on our Android watches, Google Glass and WEARables yet to be named. Instead I suggest WEAR will be much more useful if we could see:

  • Predictions and recommendations based on Google Places reviews, Yelp and Pinterest Pins, letting us know when good times, great food and new pleasurable events in the real world are nearby to discover (the term “real world” being most important here).
  • If you are young and in the nightlife dating game, a PLENZES-style prediction engine would alert you to which clubs and bars nearby have a higher population of the opposite sex at times when you are out to meet and flirt.
  • Or say that girl or guy you have been waiting to hang out with shares a location where they are, and you just happen to drop in. Now that is an opportunity alert I want to get on my watch!
  • You will be able to whitelist contacts that you want to see pushed to a wearable / watch device: especially your children, significant others and those very close to you…
  • The bad news is that developers / programmers have very little imagination when it comes to the social life of a 20-something guy or girl, as is evidenced by the fact that, in the video above, Google seems to think people still answer the phone. I think text alerts should be front line and phone call alerts optional.
  • So the marketing department had best be young and connected to what twenty- and thirty-year-olds do with their devices, and keep the developers on their toes, for these kinds of apps to bring WEARables to life for us all!
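The whitelist idea above is straightforward to sketch: alerts from whitelisted contacts go to the watch, texts ahead of calls, and everything else stays on the phone. A toy filter, with contact names and routing rules that are entirely hypothetical:

```python
# Hypothetical sketch of the whitelist idea above: only whitelisted
# contacts get pushed to the watch, with text alerts front line and
# phone-call alerts optional (off by default here).
WHITELIST = {"Mom", "Kara", "Boss"}   # contacts you chose to push
PUSH_CALLS = False                    # phone-call alerts are opt-in

def route_alert(sender, kind):
    """Return 'watch' or 'phone' for an incoming alert."""
    if sender not in WHITELIST:
        return "phone"                # not whitelisted: stays on the phone
    if kind == "text":
        return "watch"                # texts are front line
    if kind == "call" and PUSH_CALLS:
        return "watch"
    return "phone"
```

With these rules, a text from Kara lands on your wrist, while a call from a stranger never interrupts you there, which is the “front line text, optional call” priority argued for above.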

Now That You See Where We Are Headed Today
Let’s Take A Look At Whether The Big 5
Android – iPhone – Facebook – Amazon – Microsoft
Can Actually Bring Their Machines To Life?

Let’s Set A Few Boundaries And Leave The Science Fiction Out First

There are a number of words and phrases tossed around as the big corporations like Google and Microsoft –soon to be followed by Apple and Facebook– continue trying to convince you living machines are just around the corner and that they (these corporations) will be the first to deliver your living personal digital assistant.

Simply put, here goes:

Computer Learning – Machine self-learning is not that much different from how we learn. An event occurs; we either act on it or we do not. If { WeDoAct } then { RateTheSuccess }, then finally store the event in a database so it can be referenced should the same event occur at a future date.
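That If { WeDoAct } then { RateTheSuccess } loop can be sketched in a few lines. The event names, the neutral starting score and the averaging rule here are all my own illustrative choices:

```python
# Toy event -> act -> rate -> store loop from the description above.
experience = {}   # stands in for the database of past events

def handle_event(event, act, rate_success):
    """Act on an event if past experience looks good, then learn from it."""
    past = experience.get(event, 0.5)      # unseen events start neutral
    if past >= 0.5:                        # If { WeDoAct }
        result = act(event)
        score = rate_success(result)       # then { RateTheSuccess }
        experience[event] = (past + score) / 2   # store for next time
        return result
    return None                            # learned not to act

outcome = handle_event("calendar_alert",
                       act=lambda e: "shown to user",
                       rate_success=lambda r: 1.0)   # user engaged
```

Repeated poor ratings drive an event’s stored score below the action threshold, so the machine stops acting on it: the same “reference the past event next time” behavior described above.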

Artificial Intelligence – Far from a living machine, artificial intelligence is the act of imitating thinking: making decisions autonomously and using machine learning to gather the knowledge to do so.

Consciousness – Consciousness has been described as the difference between being sedated during surgery and a non-sedated state. Anesthesia, in simplistic terms, shuts down our consciousness while leaving the autonomic parts of the brain to operate.

Sentience – Described as knowledge of oneself, knowledge of God, knowledge of others. This was demonstrated well at the beginning of Prometheus, as the sentient robot David groomed himself to look like Lawrence of Arabia as portrayed by Peter O’Toole.

Four very different concepts, and I believe we are very, very far away from true consciousness, and certainly sentience, in machines. But…

Here is Ray Kurzweil speaking at Google I/O 2014 about how close he feels we are to high functionality and thinking machines.

Amoebas VS The Machines

So, while futurist and visionary Ray Kurzweil feels we are very close to what some call “The Singularity,” I want to posit that we are not, just by comparing a machine of the future to an amoeba.

An amoeba can move around its surroundings, find and eat naturally occurring food to sustain itself, and could even find mates in the ancient past, before the amoeba we know today began reproducing asexually. Think about this. A single-celled creature is able to do things no machine is going to do anytime soon.

The machines of today and the next decade are much like humans are today: completely dependent on outside, unnaturally occurring supplies of electricity, treated water, chemically synthesized manufactured foodstuffs, and gasoline that has a shelf life of between 45 days and a few months.

Depending on how gasoline is stored, the idea that after an apocalyptic event we will still be driving cars is best left to TV shows like The Walking Dead. If machines are going to be truly alive one day, they will have to have more autonomy than my Comcast Internet connection does when I forget to pay the bill.

One of the few large machine complexes that could remain running is the new Facebook datacenter in Luleå, Sweden. It gets its cheap power from a hydroelectric plant and is able to cool itself with free low-temperature northern air. A great place to start with a living machine, since our end-user devices will most certainly depend on the machines in the cloud for quite some time to come.

So as I begin to paint my vision of the next decade of computing, I want you to be sure I am not talking about living machines. We are at best talking about pairing machines with you. Machines that are able to mimic you by learning your habits, schedules, location, interests and preferences depending on the time of day. One side keeps the other going, and vice versa.

But Wait, There Is A 6th Contender —Bring It

The US Military

How Far Along Is The US Military With WEAR?

Right now, probably the only ones with the droids I am looking for are the US Air Force and US Army.

While we are just getting to know our machines through voice commands and the first commercially available Android WEAR devices, I believe there is another WEAR at work: Warfighter E-tactical Augmented Reality, as I dubbed it in 2012 in PLENZES – AR / VR.

While we know little about the US military’s WEARable future, I do personally know they are heavily developing both contact lens-based augmented reality and voice command / computer-voiced equipment. The military has also invested heavily in the HMD (Helmet Mounted Display), typically and somewhat inaccurately called a HUD, as far back as 1985.

Image of F35 helmet mounted display

Courtesy of Wikipedia and The Marines

While the predecessor to the HMD is the HUD, you have really been seeing military augmented reality at work in every movie pilot scene since Top Gun. The black visor was always much more than sunglasses. In fact, the HMD prior to the latest one above had huge computer-generated capabilities and allowed pilots in Gulf War F-16s to look through the cockpit and see the ground below, via cameras mounted fore and aft on the planes.

Currently The Military, With Its Drone Programs, Is Not Allowed To Fly Drones Via Autonomous, Artificially Intelligent, Algorithmically Logical Flight Computers.

But that by no means brings me to believe that they do not have the capability, the resources and the aim to do so. Just take a look at what they have in the air right now…

Because the drones will become the droids… -Chris Lang

All images are real, all come from defense contractor sites.

While F-15s and F-16s, F-22 Raptors and F-35s fly two to three times faster than any drone, and no drone built yet is configured for air-to-air combat… think about how a pack of wolves takes on a 600-pound bear by harassing it from every angle, or how lion prides hunt. Even a 500-pound male lion can be reduced to a defensive position by a hyena pack, all members working together.

So, if the US Military is not allowed by military mandate
to use supercomputers to fly these incredible drones
you just showed me, Chris, why do you say the US government has no choice BUT to build a SKYNET?

Right now all this is very new, but 40 other countries have their own drone air forces right now too. It’s only a matter of time before the US gets into a fight with a country that has the smaller Predator and Reaper drones you saw in the video above. Right now, Israel has more drones than anyone else.

When two drone air forces finally do fight —and that is much more likely when there is no military loss of life to be had in a drone war— then it is going to take a swarm of drones driven by a hive mind of supercomputer mainframes to win that air war.

While no drones fly now via machine minds, that does not mean there are not machines already waiting somewhere below the mountains outside Denver, where I live, that are just spoiling for a fight.

Not to mention there is a lot of money in it. Just what we know about now, which I showed you above, is predicted to generate $8.2 billion in US spending between 2014 and 2018.

Want to know more? Of course you do…

My new novel, PLENZES – Birth Of The Machines, spells it all out while you take a wild ride through the machine world with our hero, Stuart Pauls.

As Stuart flees his own software firm, the military and his own conscience to go off grid, hiding for years… you may not be able to wait to see what he builds in response to just some of what you have seen above.

An application that not only serves all hardware, but all mankind as well… that is what Stuart Pauls sets out to do. However, as time wears on, it seems he may have spent too much time alone with his machines. Or… the machines, too much time alone with him…

Click Here For Your $9.00 Off Pre-Release Instant Coupon
PLENZES – Birth Of The Machines
Due September 1st — For A Limited Time Only…

Cheers All! Hope you enjoyed reading this as much as I enjoyed writing it! – Chris