What do Google's D-WAVE, Blippar, and Machine Vision have to say about where media is heading?

So why is a computer being able to see so darn important? I'll tell you why: so advertisers can know exactly who you are, wherever you are, and send a highly targeted product advertisement at you through some pretty clever channels.

Let's imagine a far-future scenario, like 16 months from now, where you are walking down the street and an HD digital display screen, which is actually the glass of a storefront you happen to be passing, calls out to you specifically: "Hey Jeremy! I love the Ralph Lauren jacket you're wearing. We just got in the 2017 bold cut of that jacket in dark green, take a look." The display screen proceeds to show you what you would look like by overlaying the new jacket on top of the old one, keeping the street view behind you, using augmented reality. The display then says, "If you come in and buy one right now, we'll offer you a 10 percent discount." You say to yourself, damn, I look great in this jacket, and walk right on in. SUCKER!!!!

Honestly scenarios like this are in development right now. Not as cohesive as the picture I am trying to paint, but all the separate pieces are being made as we speak. And don't think for one second advertisers won't be doing their very best to get you to spend those dollars by using advanced technology, your aesthetics and ego against you.

Vision is a primary sense; we live in a visual-centric world. For machines to relate to humans and provide the support we need, it is critical that they can observe and interact in the visual environment. Getting our devices to understand what they are seeing is a huge challenge, and seeing the world isn't as simple as building an algorithm to parse through data.

The reason vision is so complex is that it requires experience and an understanding of real situations that allow us to respond accordingly. Machine vision, a rapidly growing branch of AI (artificial intelligence) that aims to give machines sight like our own, has made massive strides over the past few years thanks to researchers applying neural networks to help machines identify and understand images from the real world.
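To make the neural-network idea concrete, here is a minimal sketch of image recognition with a pretrained convolutional network from torchvision. The model choice, the photo.jpg file name, and the library assumptions (torch, a recent torchvision, Pillow) are mine for illustration only, not details from Blippar or Google.

```python
# Minimal sketch: label one photo with a pretrained convolutional network.
# Assumes torch, torchvision, and Pillow are installed; "photo.jpg" is a
# placeholder path, not anything referenced in the article.
import torch
from torchvision import models, transforms
from PIL import Image

# Network pretrained on ImageNet (roughly 1,000 everyday object categories).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()

# Standard ImageNet preprocessing: resize, crop, convert to tensor, normalize.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

image = Image.open("photo.jpg").convert("RGB")
batch = preprocess(image).unsqueeze(0)          # add a batch dimension

with torch.no_grad():
    probs = torch.softmax(model(batch)[0], dim=0)

confidence, class_index = probs.max(dim=0)
print(f"Predicted ImageNet class {class_index.item()} "
      f"with confidence {confidence.item():.2f}")
```

The point is only that a pretrained network already gives a device a rough sense of what it is looking at; the hard part described here is making that judgment reliable in messy, real-world scenes.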

Starting back in 2012, computers began to recognize images on the web, including faces, but this has now moved into the playing field of autonomous drones and object recognition systems. Robots and drones face a myriad of obstacles that may be out of the norm, and figuring out how to overcome these difficulties is a priority for those looking to really bank on the AI revolution.

Ambarish Mitra of Blippar is building a next-generation app that acts like a Wikipedia of the physical world. By just pointing the app at everyday objects, it identifies and categorizes nearly everything in the frame. The app launches in two weeks in an effort to become a visual browser.

Blippar's app currently lets you photograph a product and instantly get information like price, nutrition information, and where you can buy it. Soon the app will let a user just point the camera at a wide range of items and identify them in real time.
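As a toy sketch of that point-and-identify flow, the pattern is simply "recognize the item, then look up its details." The recognize() placeholder and the catalog entries below are hypothetical stand-ins, not Blippar's actual API or data.

```python
# Toy sketch of "photograph a product, get its details": a recognizer's label
# keyed into a small catalog. All names and values here are made up.
PRODUCT_CATALOG = {
    "granola_bar": {"price_usd": 1.99, "calories": 190, "sold_at": ["CornerMart"]},
    "sports_drink": {"price_usd": 2.49, "calories": 80, "sold_at": ["MegaGrocer"]},
}

def recognize(image_path: str) -> str:
    """Placeholder for an image-recognition call such as the classifier above."""
    return "granola_bar"

def lookup_product(image_path: str) -> dict:
    label = recognize(image_path)
    return PRODUCT_CATALOG.get(label, {"error": "product not recognized"})

print(lookup_product("shelf_photo.jpg"))
```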

The critical challenge is making augmented reality apps work every time; as Mitra explains, the reason people use Google is that it works every time.

Now, since we are speaking about Google, they have just promoted their D-Wave 2X quantum computer, which they operate alongside NASA at the U.S. space agency's Ames Research Center in California. This Google machine works with quantum bits, or qubits, instead of more conventional bits. The superposition of these qubits enables the machine to make huge computations simultaneously, making a quantum computer highly desirable for Big Data number crunching.
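For a rough sense of why superposition matters for number crunching, here is a toy illustration: n qubits in an equal superposition are described by 2**n amplitudes, so the state space doubles with every qubit added. This is a textbook-style illustration only, not a model of how the D-Wave 2X (a quantum annealer) actually operates.

```python
# Toy illustration: n qubits in equal superposition need 2**n amplitudes.
import numpy as np

n_qubits = 10

# One qubit in an equal superposition of |0> and |1>.
single = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Build the joint state with repeated tensor products.
state = single
for _ in range(n_qubits - 1):
    state = np.kron(state, single)

print(f"{n_qubits} qubits -> state vector of length {state.size}")      # 1024
print(f"probability of any one measurement outcome: {abs(state[0])**2:.6f}")
```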

Well, I just read in a recent VentureBeat article that this kind of processing "could lead to speed-ups for things like image recognition, which is in place inside of many Google services."

Now think about what could happen if you took a machine vision app like Blippar and combined it with the processing power of Google's D-Wave processor. There are already a good number of companies building systems that replicate the way a brain works, so with the continued adoption of technologies like neural networks and specialized machine vision hardware, we are rapidly closing the gap between human and machine vision.

This could be the cream of the advertising crop for sure. Any camera or screen could be capable of picking you out of the crowd and telling whether your Fitbit is a cheap knockoff or not. What's worse is that it would be able to tell if you were exhausted and keep pushing Red Bulls or 5-hour Energy drinks at you everywhere you go.

Get used to it: soon everyone and everything will be looking at you.

Just in case you are curious, D-Wave has also sold quantum computers to Lockheed Martin and the Los Alamos National Laboratory.
