Speedcuber was a very popular demonstration at MWC so it was natural to consider demonstrating it again. However, someone (I won’t name the “guilty party”) suggested that we should switch to more relevant, contemporary technology; after all, the Nokia N95 phone controlling the first robot was launched almost four years ago in 2006. It was proposed that we show the robot again in April at the Embedded Systems Conference in Silicon Valley (time to forget relaxing in my spare time for another couple of months!?!).
We decided to use an Android-based phone, and I set about downloading the Android SDK and acquainting myself with the unfamiliar APIs. The ARM Solution Center for Android (SCA) provided a good starting point for this. The application on the Nokia phone was written in Java using MIDP 2.0, so I hoped that the port to an Android application using Java would be relatively painless. I managed to borrow an HTC Hero for my initial work. Using one of the many example applications available via the ARM SCA, I was quickly able to construct a skeleton application and user interface from a “Hello World” example and port the cube solver code.
For me, one of the coolest things about developing for Android was the ability to download and debug an application directly on the device, using features such as breakpoints, code stepping and variable inspection that you would expect in an embedded software development environment. While developing for the N95, I was able to debug the application running in an emulator, an option which is also available with Android. This allowed me to debug the main solving algorithm, but didn’t allow me to debug the Bluetooth and Camera interfaces. From my previous experience, I expected that writing code to interface from the phone to the LEGO NXT controller over Bluetooth would be the tricky part, but it worked without a hitch on the Motorola DROID.
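The NXT accepts “direct commands” over its Bluetooth serial link, each message framed with a two-byte little-endian length prefix. As a minimal sketch of that framing in plain Java (independent of the Android Bluetooth APIs; the `NxtPacket` class name and the PlayTone example are mine for illustration, not taken from the Speedcuber code):

```java
import java.io.ByteArrayOutputStream;

/** Builds LEGO NXT direct-command packets for sending over a Bluetooth stream. */
public class NxtPacket {

    /** Frame a command payload with the NXT's 2-byte little-endian length prefix. */
    public static byte[] frame(byte[] payload) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        out.write(payload.length & 0xFF);        // length LSB
        out.write((payload.length >> 8) & 0xFF); // length MSB
        out.write(payload, 0, payload.length);
        return out.toByteArray();
    }

    /** PLAYTONE: 0x80 = direct command (no reply requested), 0x03 = opcode. */
    public static byte[] playTone(int frequencyHz, int durationMs) {
        byte[] payload = {
            (byte) 0x80, (byte) 0x03,
            (byte) (frequencyHz & 0xFF), (byte) ((frequencyHz >> 8) & 0xFF), // UWORD, LSB first
            (byte) (durationMs & 0xFF), (byte) ((durationMs >> 8) & 0xFF)
        };
        return frame(payload);
    }
}
```

On Android, bytes like these would be written to the `OutputStream` of a `BluetoothSocket` opened with the standard Serial Port Profile UUID (`00001101-0000-1000-8000-00805F9B34FB`).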
But it was all going too smoothly! Accessing the camera was more of a challenge. On the N95, I was only able to use the camera in snapshot mode rather than in streaming video mode. This limited the overall speed of the demonstration, since the initial scan took several seconds to capture each of the images for the six faces of the Rubik’s cube. I was hoping to use the live preview image stream provided by the android.hardware.Camera class to significantly reduce the time required to capture each image in the new demonstration. It took a while to find enough information to convert the YUV images returned in the preview callback into RGB format so that I could use my existing colour recognition algorithm. However, once this was achieved, the Android version of Speedcuber soon sprang into life with its first solve!
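The preview callback delivers frames in the NV21 (YUV420SP) layout, and the conversion to RGB is usually done with an integer approximation of the BT.601 formula. A plain-Java sketch of that conversion (the `YuvToRgb` class name is mine; this is the widely circulated fixed-point variant, not necessarily the exact code used in Speedcuber):

```java
/** Converts an NV21 (YUV420SP) camera preview frame into packed ARGB pixels. */
public class YuvToRgb {

    public static int[] decodeNV21(byte[] yuv, int width, int height) {
        int frameSize = width * height;
        int[] rgb = new int[frameSize];
        for (int j = 0, yp = 0; j < height; j++) {
            int uvp = frameSize + (j >> 1) * width; // interleaved V/U plane, one row per 2 Y rows
            int u = 0, v = 0;
            for (int i = 0; i < width; i++, yp++) {
                int y = (0xFF & yuv[yp]) - 16;
                if (y < 0) y = 0;
                if ((i & 1) == 0) {                 // one V/U pair per 2x2 pixel block
                    v = (0xFF & yuv[uvp++]) - 128;
                    u = (0xFF & yuv[uvp++]) - 128;
                }
                int y1192 = 1192 * y;               // fixed-point BT.601 coefficients
                int r = y1192 + 1634 * v;
                int g = y1192 - 833 * v - 400 * u;
                int b = y1192 + 2066 * u;
                r = Math.min(Math.max(r, 0), 262143);
                g = Math.min(Math.max(g, 0), 262143);
                b = Math.min(Math.max(b, 0), 262143);
                rgb[yp] = 0xFF000000 | ((r << 6) & 0xFF0000)
                        | ((g >> 2) & 0xFF00) | ((b >> 10) & 0xFF);
            }
        }
        return rgb;
    }
}
```

The resulting ARGB array can be fed straight into a colour classifier, or into `Bitmap.createBitmap` for display.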
The other significant change to the demonstration was to add a large LED display to show messages and solve times. Luke from the “demo team” in Cambridge worked very hard to create this from scratch using a Cortex-M3 based microcontroller, the LM3S811 from Luminary Micro, and its evaluation kit. Of course, the software development environment supports debugging the embedded application while it is running on the microcontroller. Luke had to develop code to allow the LEGO NXT controller to communicate with the Cortex-M3 in the LED display over an I2C interface, so I wasn’t the only person who got to play with LEGO at work this time!
Overall, I had a very positive experience porting the application to Android. The developer resources are generally clear and comprehensive. Since I was interacting directly with hardware interfaces to external devices, the ability to debug the application directly on the phone was a definite win for me.
You can see the Android version of the ARM Powered LEGO Speedcuber live on ARM’s stand #1308 at the Embedded Systems Conference in San Jose until 29th April (assuming it survived the flight this time!)
Back to my day job... and maybe I can find some time now to relax a little?!?
David Gilday, Principal Engineer, ARM. It all started with LEGO. When asked what I was going to do when I grew up, I gave the answers that everyone expected of me…"train driver" or "fireman". But secretly I dreamed of a job where I could spend all my time building things with my favorite toy... Well, thanks to ARM's cool technology, it has come true (in-between my regular day job, anyway)! As I got older, I became interested in puzzle solving, electronics and programming, and my "expectations" gradually changed from train driver to chip designer. However, my hobby and dream have remained essentially the same, and I continue to enjoy dabbling in construction, electronics, mathematics and software programming in my spare time.