Just a quick update concerning my SenseCam project. I have finally gotten around to porting my SenseCam software from the old J2ME version on my k850i to an Objective-C version for the iPhone 4.
I effectively had to do a complete rewrite from the ground up, but the development work only took about two weeks, thanks to a much friendlier and more feature-rich SDK (yay for iOS 4!) and the fact that I had already done a lot of the hard work of figuring out what I wanted in my previous attempt for my Sony Ericsson.
I’m currently in the process of adding some new features: basic image recognition; automatic background uploading of images, as they are taken, to a “SenseCam Server” written in PHP (part of the way through that development); and, of course, making sure the data files output by the iPhone version of my SenseCam software are compatible with the .NET version of my desktop application for browsing the catalogue of images. I’m currently having problems with the GPS format stored in the image data for some reason. I’m not sure if my math is off or if I’m writing the image metadata incorrectly, and I’m too tired to figure it out tonight.
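For what it’s worth, a classic source of exactly this kind of GPS mismatch is that EXIF-style image metadata stores coordinates as degrees/minutes/seconds rather than the signed decimal degrees most GPS APIs hand back. A minimal sketch of the conversion math (illustrative only, not the actual SenseCam code) looks like this:

```python
def decimal_to_dms(coord):
    """Convert a signed decimal-degree coordinate to (degrees, minutes, seconds).
    The sign is handled separately (EXIF uses N/S and E/W reference tags)."""
    value = abs(coord)
    degrees = int(value)
    minutes_full = (value - degrees) * 60
    minutes = int(minutes_full)
    seconds = (minutes_full - minutes) * 60
    return degrees, minutes, seconds

def dms_to_decimal(degrees, minutes, seconds, negative=False):
    """Convert (degrees, minutes, seconds) back to signed decimal degrees."""
    value = degrees + minutes / 60.0 + seconds / 3600.0
    return -value if negative else value
```

Round-tripping a coordinate through both functions and comparing it against what ends up in the file is a quick way to tell whether the math or the metadata-writing step is at fault.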
A nice feature that only took me an hour or so to implement was automatically rotating the images to the correct angle directly on the iPhone, based on its very sensitive accelerometers. The angle detection in the iPhone 4 is far superior to what it was on my old Sony Ericsson k850i or K790, and there is no perceptible lag as the iPhone moves around. Now I don’t have to post-process each image to rotate it based on what the angle sensor reported at capture time. It’s all done on the very fast ARM CPU in the phone.
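The underlying idea is simple: read the gravity vector from the accelerometer at the moment of capture and snap it to the nearest right angle. A rough sketch of that mapping (my own illustration of the technique, with an assumed axis convention of x to the right and y toward the top of the screen, so portrait-upright reads roughly (0, -1)):

```python
import math

def rotation_for_gravity(ax, ay):
    """Map the accelerometer's gravity components along the device's x and y
    axes to the rotation, in degrees, needed to bring the captured image
    upright. Portrait-upright (ax, ay) ~ (0, -1) maps to 0 degrees."""
    angle = math.degrees(math.atan2(ax, -ay))  # 0 when held portrait-upright
    # Snap to the nearest 90 degrees so the image is rotated by a right angle.
    return int(round(angle / 90.0)) % 4 * 90
```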
Next on my ever-growing TO DO LIST is writing some automatic “anti-blurry cam” software to remove the ghosts and streaks that frequently occur when wearing the SenseCam under artificial lights.
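One common heuristic for spotting blurred frames (not necessarily what I’ll end up using) is the variance of the Laplacian: sharp images have strong edges and therefore a high-variance Laplacian response, while motion blur flattens it out. A minimal sketch:

```python
import numpy as np

def blur_score(gray):
    """Variance of a simple 3x3 Laplacian response over a grayscale image
    given as a 2-D array. A low score suggests the frame is blurred and
    could be flagged for discarding or de-ghosting."""
    g = np.asarray(gray, dtype=float)
    lap = (-4 * g[1:-1, 1:-1]
           + g[:-2, 1:-1] + g[2:, 1:-1]
           + g[1:-1, :-2] + g[1:-1, 2:])
    return lap.var()
```

In practice you would calibrate a threshold against a handful of known-sharp and known-blurry SenseCam captures rather than picking a magic number.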
The GPS system in the iPhone 4 is also superior to my previous solution, which relied on an external GPS logger. Each SenseCam image is automatically tagged with the GPS location as it is taken. No more post-processing!
Battery life is a concern right now: the software kills the iPhone dead in just a few hours. One thing to note is that I am running debug builds, so I am hoping that switching over to release builds, and optimizing some rather hacky code that I just threw together to solve a particular problem with the image rotation, will at least mitigate what is basically a battery sucker.
Also on that same TO DO LIST is looking into running the SenseCam as a background process. This would let me do other things with the phone instead of just running it as a SenseCam all of the time. This was a trivial problem to solve in J2ME on the k850i, but I am not sure how easy or hard it is going to be here, never having used the new iOS background sub-system before. I might be pleasantly surprised, or I might just be surprised.
I am wondering if it is worth actually releasing my iPhone SenseCam software on the App Store at some point in the future. It needs vast improvement, but it might be worth doing.
Question to self: “Would there be any market for this software?”
Of course, alongside the iPhone application there is the accompanying .NET desktop application (Windows-only at this time), which can be used for quickly browsing and tagging the day-to-day life-blogging feed of images.