Mozilla HTML5 evangelist Chris Heilmann posted an article dispelling some HTML5 ‘myths’ that devs often cite when rejecting the technology in favour of native code, which on mobile means Objective-C (iOS) or Java (Android).
The post itself made some good points, I thought, except when he makes outright misleading statements like this:
“Native applications need to be written for every single device and every new platform from scratch”
… er, what? Anyway, moving along…
I found it through Reddit’s /r/programming, and as with many such discussions it was dominated by those who felt very strongly one way or the other. You know the stuff: the native camp decrying the crappy performance or tools, the HTML5 camp citing standardisation and write-once-run-anywhere type mantras.
All valid points I’m sure. But heavily polarised discussions like this seem to miss the obvious middle ground.
You Don’t Have to Choose
It’s a false dichotomy.
You really can have your cake and eat the hell out of it. I do it all the time. It tastes delicious.
It works like this. All platforms provide ways to embed HTML5 content in your native app: iOS uses UIWebView, Android uses android.webkit.WebView, and I’m sure WP7/8 have an equivalent. How your content is split between HTML5 and native is entirely up to you, but I tend to split it along these lines:
- Native code: UI elements that need to be responsive or look like familiar OS UI items? Use the native platform APIs. Tables, navigation bars, etc.
- HTML5: Content that needs to be highly customisable, visually identical across all platforms, or needs to change frequently? Use HTML5 embedded in a web view.
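On the Android side, the embedding step is only a few lines. Here’s a minimal sketch (the class name and URL are placeholders of my own, and of course it needs an actual Android project to build):

```java
import android.app.Activity;
import android.os.Bundle;
import android.webkit.WebSettings;
import android.webkit.WebView;
import android.webkit.WebViewClient;

public class HybridActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        WebView webView = new WebView(this);
        // Here the web view fills the screen; a real hybrid app would
        // usually mix it into a layout alongside native UI elements.
        setContentView(webView);

        WebSettings settings = webView.getSettings();
        settings.setJavaScriptEnabled(true); // HTML5 content generally needs JS

        // Keep link taps inside the web view instead of
        // launching the external browser.
        webView.setWebViewClient(new WebViewClient());

        // Hypothetical URL: this is where your frequently-changing
        // HTML5 content would live.
        webView.loadUrl("https://example.com/app/");
    }
}
```

The iOS equivalent is much the same shape: create a UIWebView, add it to your view hierarchy, and call loadRequest: with the URL of your hosted content.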
The strength of a mix-and-match approach is that it is entirely adjustable during development: you’re not bound to any particular path. Once you see how much flexibility there is in keeping app content online, you may choose to move even more stuff out of native into HTML5. If something is a bit ‘chuggy’, slow to load, or just looks crap in the platform’s web view, you might decide it’s worth investing the time to do separate native implementations for each of your platforms.
But you know what I like most? At some stage your client points to another app and says “make it like that”! When that time comes you will have options, not limitations, and you can avoid lame sheepish excuses like the following:
- “<arbitrary technology framework> doesn’t support that”,
- “it’s kind of supported, but it’s buggy”,
- “you’re not thinking the HTML5/Java/iOS way” (please never say stuff like this to a client, just don’t).
Instead, you can give a response framed around factors your client actually cares about, like user experience.
Requirements and designs are going to change. Locking yourself into a particular framework will limit your future options. So why limit yourself at all?
Don’t Make me Learn
Of course, the catch with this approach is that you need to know (or have researched) both HTML5 and as many native technologies as your platforms dictate: Objective-C, Java, .NET, etc.
Oh the horror!
I’m sure some people would consider this a major drawback. I don’t. The more you know, the more options you have, and the better your products will be. You might not get to know everything in depth, but nobody does. Technology choices should be driven by client and product needs, and shouldn’t be constrained by what you already know or are comfortable with.
Just think – you get to learn! And you’ll get better across everything! How cool is that?
So, horses for courses really. You can easily marry HTML5 with as much or as little native code as you desire. Both approaches have advantages from which you can cherry-pick the best bits on a feature-by-feature basis.
Or don’t. By all means, go purely with one or the other if your project requirements justify it, but don’t feel that you have to commit to one path just because that’s the way the sideline commentators want to frame the options for us.