As an engineer I have worked in and observed the mobile world for over 18 years. The first mobile product I worked on was the AT&T EO 440 Personal Communicator, which could be called the original smartphone. It was based on the Hobbit processor from AT&T and used a standard analog mobile phone to provide the wireless communication. To give some time perspective, it came out a little after the announcement of the Apple Newton (another revolutionary mobile product, powered by the first ARM processor).
The EO 440 ran an operating system called PenPoint from GO Corporation, which provided a pen-based gesture input system in which documents could be embedded, faxes could be sent and received, and the analog phone service could be accessed. The EO 440 provided all the basic building blocks for the future but lacked content and all-day usage on a single battery charge. Both the EO 440 and the PenPoint operating system pushed the technology boundaries of their time. It was an interesting product, but one which has long been forgotten. Up to now, including the EO 440, user interfaces have been modeled on digital equivalents of paper diaries and calendars, with the addition that applications can be downloaded from a service provider. I want a device that goes far beyond this concept: a device that learns about me and provides feedback in a new form.
Mobile meets Cloud
Meta-self is an abstraction of me, where the premise of the device is built around expert systems and learning rather than digital equivalents of paper-based technologies. This device learns about you, the owner, and doesn't necessarily require manual input of data. The backend, the hidden thinking part of meta-self, occurs in the cloud (using the computational scale of the data center), while the mobile device becomes the personal, all-day visual input and output device. This means that the cloud and mobile computing worlds become interleaved to form a self-aware device. This meta-self can privately track where I go and record what I do and when I do it, and I can query the device and ask questions beyond the scope of a simple calendar entry. For example, by automatically recording GPS and location information, the data can be merged into the cloud-based calendar, which in turn provides instant directions to your next appointment without manual input.
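To make that example concrete, here is a minimal sketch of the merge step, assuming a device-side GPS log and a cloud-side calendar of appointments. The type names, helper functions, and directions query format below are illustrative assumptions of mine, not any real service's API.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import List, Optional

    @dataclass
    class GpsFix:
        time: datetime
        lat: float
        lon: float

    @dataclass
    class Appointment:
        title: str
        start: datetime
        lat: float
        lon: float

    def next_appointment(calendar: List[Appointment], now: datetime) -> Optional[Appointment]:
        # Pick the earliest appointment that has not yet started.
        upcoming = [a for a in calendar if a.start > now]
        return min(upcoming, key=lambda a: a.start) if upcoming else None

    def directions_hint(fix: GpsFix, calendar: List[Appointment]) -> Optional[str]:
        # Merge the latest automatically recorded GPS fix with the cloud calendar:
        # build a directions request from where the owner is now to where they
        # need to be next, with no manual input required.
        target = next_appointment(calendar, fix.time)
        if target is None:
            return None
        # Hypothetical maps query; a real service would take the same two points.
        return "directions?from={},{}&to={},{}&arrive_by={}".format(
            fix.lat, fix.lon, target.lat, target.lon, target.start.isoformat())

The point of the sketch is the division of labour: the device only contributes its position, while the calendar, the merge, and the route planning all live in the cloud.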
A collection of advanced technologies is starting to align in the mobile world: all-day battery life (always connected), cloud computing (distributed data), the internet (Web 2.0), ultra-efficient multiprocessors (Cortex-A9), multi-touch, advances in AI, and smartbooks. With these in place, the concept of having a device that is an extension of one's self starts to become a possibility. Today we are just starting to see newer ways to enter data via GPS, voice recognition, and multi-touch, and I believe this will only accelerate. Check out the application Sherpa by Geodelic for an insight into how this technology is emerging. Once the data is collected, data centers can analyze and sort the information into a new form, which fundamentally morphs my mobile device into an advisor and companion, i.e. a meta-self.
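As an illustration of the kind of cloud-side analysis meant here, the following sketch reduces a raw location log to a simple habit summary (the places the owner visits most often). The grid-cell bucketing and function names are my own assumptions, not a description of any existing system.

    from collections import Counter
    from datetime import datetime
    from typing import List, Tuple

    def place_key(lat: float, lon: float, cell: float = 0.001) -> Tuple[float, float]:
        # Bucket nearby fixes into the same rough "place" (roughly 100 m grid cells).
        return (round(lat / cell) * cell, round(lon / cell) * cell)

    def habit_summary(log: List[Tuple[datetime, float, float]]) -> List[Tuple[Tuple[float, float], int]]:
        # Count visits per place so the device can surface the owner's usual spots.
        counts = Counter(place_key(lat, lon) for _, lat, lon in log)
        return counts.most_common(5)

In practice this is where data-center scale matters: the same log, joined with calendar and other sensor data, is what would let the device behave as an advisor rather than a diary.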
What do you want your phone to learn from you?
Andrew Sloss is a Consultant Engineer at ARM. He is interested in future software technologies and trends; in particular, Andrew looks at how software can make use of low-power devices in new and innovative ways. Andrew is an author, a Fellow of the British Computer Society, and currently holds the chair of the ARM Bindings Sub Team for UEFI.