Jul 14, 2009

At Semicon 2009, the keynote speaker from Intel apparently said that Intel is working towards the Internet becoming an “immersive connective experience,” or ICE web.

Intel’s laboratories have also been investing in visual computing research, using computers in conjunction with the cameras and GPS in a smartphone. For example, users could take a picture of a sign with their smartphones, and the handset would check GPS to see what country the user was in, get a translation of the sign’s meaning, and overlay directions from a mapping application.
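The pipeline described there (photo → GPS lookup → translation → overlay) can be sketched in a few lines. This is a toy, self-contained illustration: every function below is a stub standing in for a real service (reverse geocoding, OCR/visual search, translation), and all names are hypothetical, not any actual Intel or handset API.

```python
# Toy sketch of the keynote's visual-search pipeline. All functions are
# stubs with canned data; real versions would call device/network services.

def lookup_country(gps):
    # Stand-in for reverse geocoding a GPS fix to a country code.
    lat, lon = gps
    return "FR" if 41 < lat < 51 and -5 < lon < 10 else "??"

def recognize_sign(photo):
    # Stand-in for OCR / visual search on the photographed sign.
    return photo["sign_text"]

def translate(text, source):
    # Stand-in for a translation service keyed by source country.
    phrasebook = {("FR", "Sortie"): "Exit"}
    return phrasebook.get((source, text), text)

def augmented_view(photo, gps):
    # Chain the steps; a real device would draw this over the camera feed.
    country = lookup_country(gps)
    sign = recognize_sign(photo)
    meaning = translate(sign, source=country)
    return f"{sign} ({meaning}), country {country}"

print(augmented_view({"sign_text": "Sortie"}, (48.85, 2.35)))
```

The point of the sketch is only the shape of the flow: each stage (locate, recognize, translate, overlay) is an independent service the phone composes, which is why the keynote frames it as a chip-and-connectivity problem rather than a single application.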

He said that applications like Second Life were merely the first generation of virtual worlds and the situation was going to get more immersive. Intel has been using software modelling techniques to render 3D more effectively, including making computer generated environments obey physical laws of movement and building in behavioural intelligence.

— Intel outlines the next generation ‘reality web’ – Technology – News – CRN Australia

None of this sounds particularly off the Metaverse Roadmap, honestly. The interesting thing is the dates.

He estimated that camera-based visual search, using a photo to look up data about the photographed object, would come online in 2010, with information overlaid on camera views by 2012 and 2D and 3D visual overlays available by 2014.

Naturally, this matters to Intel because all of it will need more powerful chips… especially in mobile devices.

Of all the parts of the Metaverse Roadmap, it’s the augmented reality quadrant that is moving the fastest (once you train yourself not to look for goggles and instead look for phones).