Since many users want LogicMail to be usable on the BlackBerry Storm, I've been working on improving the UI for touch-screen devices. While I still have a long way to go, I've finally put all the necessary hooks into the code to make this work possible. So, here are some pretty pictures, before I discuss some of the gory details:
Quick question: Has anyone actually seen an example of a tree widget that works well on a touch-screen device? Particularly one like the LogicMail folder tree, where each node can be expanded/collapsed as well as clicked?
One of my constant side-projects that has finally come to fruition is support for building LogicMail against multiple versions of the BlackBerry API. The challenge is keeping a single code base while continuing to adopt new features as they become available. The initial motivation, of course, is offering a better UI on the BlackBerry Storm (OS 4.7 and up).
To achieve this goal, I initially investigated two approaches: using the RAPC preprocessor to separate chunks of code, or using a fancier approach based on multiple libraries and class inheritance. While the preprocessor approach was tempting up front, I wound up discarding it due to its sheer inelegance and the quirkiness of Eclipse's support for it. What I've implemented instead is a variation on the library approach: the code for different API versions is kept separate when working in Eclipse, but merged together into multiple full builds when Ant compiles LogicMail on the build server. Got it? :-)
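For the curious, the merged build might look roughly like the following sketch of an Ant fragment using the bb-ant-tools `<rapc>` task. The target names, directory layout, and properties here are all hypothetical, not LogicMail's actual build file:

```xml
<!-- Hypothetical fragment: one target per API level, each compiling the
     shared source plus that level's platform-specific source directory. -->
<target name="build-4.5">
    <rapc destdir="build/4.5" output="LogicMail"
          jdehome="${jde.4.5.home}">
        <src>
            <fileset dir="src/common"/>
            <fileset dir="src/platform-4.5"/>
        </src>
    </rapc>
</target>

<target name="build-4.7">
    <rapc destdir="build/4.7" output="LogicMail"
          jdehome="${jde.4.7.home}">
        <src>
            <fileset dir="src/common"/>
            <fileset dir="src/platform-4.5"/>
            <fileset dir="src/platform-4.7"/>
        </src>
    </rapc>
</target>
```

The key idea is that Eclipse sees the platform directories as separate projects, while each Ant target compiles the union of directories appropriate for its target OS version.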
My first use of this approach was in some of the UI screens. I wanted to offer specific functionality on touch-screen devices when necessary, while sharing as much code as possible with the base implementation. Of course, the API to even find out whether the device has a touch screen doesn't exist until OS 4.7, so that posed a bit of a challenge. What I essentially did was completely refactor how screens are created, moving from an inheritance-based approach to a composition-based one. This had the effect of decoupling the screen implementations from the RIM API. A somewhat confusing UML diagram attempts to explain this:
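In code form, the composition-based structure might look something like the sketch below. All of these names (`ScreenProvider`, `ScreenFactory`, and so on) are hypothetical illustrations of the pattern, not LogicMail's actual classes, and the RIM screen classes themselves are omitted so the sketch stands alone:

```java
// The screen's content and behavior, kept independent of any RIM screen
// class; a thin platform wrapper would host one of these via composition.
interface ScreenProvider {
    String getTitle();
    void initFields();  // would populate the host screen's UI fields
}

// Base implementation shared by all API levels.
class StandardMailboxProvider implements ScreenProvider {
    public String getTitle() { return "Mailbox"; }
    public void initFields() { /* trackball-oriented layout */ }
}

// Touch-specific variant, which would be compiled only into the 4.7 build.
class TouchMailboxProvider extends StandardMailboxProvider {
    public void initFields() { /* larger touch targets, touch gestures */ }
}

// Factory that picks the provider; in the multi-library build, the 4.7
// library would supply a touch-aware factory instead of a runtime flag.
class ScreenFactory {
    private final boolean touchSupported;

    ScreenFactory(boolean touchSupported) {
        this.touchSupported = touchSupported;
    }

    ScreenProvider createMailboxProvider() {
        return touchSupported
                ? new TouchMailboxProvider()
                : new StandardMailboxProvider();
    }
}
```

The payoff is that screen logic lives entirely in the providers, so only the small wrapper and factory layer ever touches version-specific RIM APIs.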
P.S. I've also updated the development page with instructions on how to use this new structure when working with the source code.