Intel has given a glimpse of the software it's building so mobile computers can detect and communicate with a variety of nearby devices, reducing the size of the machine you need to lug about.
Lester Memmott, a senior architect in Intel's software pathfinding and innovation group, has revealed the existence of an experimental context-aware computing engine that could provide the framework for this next generation of mobile devices and applications.
Memmott said his team has built a running prototype with a plug-in architecture that can accept data from a variety of sources. The data schema is extensible, so third-party developers can add further data sources. The context-aware computing engine prototype includes a data collection mechanism (called an aggregator) and a programmable analyzer that can make decisions based on the context data.
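To make the aggregator/analyzer split concrete, here's a minimal sketch of how such a plug-in architecture might hang together. This is purely illustrative: Intel has not published its API, so every class, function and data-source name below is a hypothetical stand-in, not the prototype's actual interface.

```python
# Illustrative sketch of a context-aware engine, loosely modelled on Intel's
# description: plug-in data sources feed an aggregator, and a programmable
# analyzer makes decisions from the collected context. All names are
# hypothetical -- this is not Intel's actual API.

from typing import Callable, Dict, List, Optional

class Aggregator:
    """Collects context data from registered plug-in sources."""

    def __init__(self) -> None:
        self._sources: Dict[str, Callable[[], dict]] = {}

    def register(self, name: str, source: Callable[[], dict]) -> None:
        # Third parties can plug in extra data sources at runtime.
        self._sources[name] = source

    def collect(self) -> Dict[str, dict]:
        # Poll every source and return the combined context snapshot.
        return {name: source() for name, source in self._sources.items()}

class Analyzer:
    """Applies programmable rules to the aggregated context data."""

    def __init__(self) -> None:
        self._rules: List[Callable[[Dict[str, dict]], Optional[str]]] = []

    def add_rule(self, rule: Callable[[Dict[str, dict]], Optional[str]]) -> None:
        self._rules.append(rule)

    def decide(self, context: Dict[str, dict]) -> List[str]:
        # A rule returns a decision string, or None if it doesn't apply.
        return [d for d in (rule(context) for rule in self._rules) if d is not None]

# Example: a hypothetical WiFi source reports a nearby HD screen, and a rule
# decides to reformat pictures for it.
aggregator = Aggregator()
aggregator.register("wifi", lambda: {"displays": ["hd_screen_1080p"]})

analyzer = Analyzer()
analyzer.add_rule(
    lambda ctx: "format_pictures_1080p"
    if "hd_screen_1080p" in ctx.get("wifi", {}).get("displays", [])
    else None
)

decisions = analyzer.decide(aggregator.collect())
print(decisions)  # ['format_pictures_1080p']
```

The point of the split is that the aggregator knows nothing about what decisions get made, and the analyzer knows nothing about where the data came from, which is what lets third parties extend either side independently.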
The idea is that if you walked into a coffee bar with WiFi access and an HD screen, your mobile device could detect the screen, format data such as pictures to suit it, and then transmit and display those pictures.
Intel's so-called Carry Small, Live Large (CSLL) vision for mobile computing is built on context-aware concepts developed over the last decade, which it hopes will drive the idea of small yet powerful portable devices capable of working with keyboards, mice, audio systems, high-definition televisions and other external digital resources. Intel used last week's Intel Developer Forum in Shanghai, China, to describe CSLL.
Elsewhere, Intel is exploring a concept called Dynamic Composable Computing to enable mobile devices to link to available resources easily.
The big worry is that some folk might use this technology as a modern equivalent of the slide projector and bore us all stupid with their holiday snaps in bars and restaurants.