
The Super Super Secret Process

Recently, @abitofcode asked me to do a post about my super secret use of tools, frameworks, and shortcuts while developing apps for iOS. So, here goes.

If you’re not up for reading a long post, here is the short answer: I invest a lot of my effort into physics simulations and procedural animations which, when executed correctly, result in organic-feeling worlds that appear to have a mind of their own.

The long answer is, … well, a little longer.

I find that good-looking and well-polished apps share two characteristics:

  1. The obvious animations and interactions appropriate to the functioning of the app and
  2. The more subtle movements and interactions that are non-essential but make the app appear seamless, comforting, and alive.

I tend to spend a lot of time working on point #2, perhaps too much, but to me that’s where the juice is. I also like to use fuzzy words such as comforting and empowering because I do believe that apps are much more about how they make you feel rather than what they actually do.

Slightly off topic - Disney's 12 Principles of Animation

I recently found this super useful article on principles of effective animation. If you aren’t familiar with the basics of animation, it’s a good and quick read. Here is an excerpt:

“In the real world, the basic laws of physics were first described by Sir Isaac Newton and Albert Einstein. In the world of animation, however, we owe the laws of physics to Frank Thomas and Ollie Johnston. Working for the Walt Disney Company in its heyday of the 1930s, these two animators came up with the 12 basic principles of animation, each of which can be used to produce the illusion of characters moving realistically in a cartoon.”

Read the full article here…

Now, there are many ways to approach this second point. My preferred way is to inject physics simulation and procedural animation into elements that move on the screen. This can be an animated character, a button, or a simple label.

In the case of a character, a few springs and pivots can add a whole slew of beautiful behaviors for free. Well, “free” may actually be a little too strong a term, especially since rigging up a character can be a time-consuming task, but in certain cases it will save you a ton of time down the road.

Bobo the Robot was a great example. I took a static (non-simulated) body that I moved around the screen parametrically (i.e., moved it from point A to point B over a duration of C seconds, easing in and out of each position). To that body I attached a second body, this one simulated using physics. I added a pivot joint and three springs to hold it in place. On the static body I placed an image of Bobo’s wheels, on the dynamic body I placed an image of Bobo’s bulbous green head and blue torso, and I let the physics engine do its magic. When Bobo moved across the screen, his head ended up swaying so naturally from side to side that he immediately jumped out of the screen as alive. Magic.
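If you have never set up a rig like this, here is a minimal sketch of the idea against Chipmunk’s C API (Chipmunk 7-style names – older versions spell a few of these functions differently, and the header path varies too). This is not Bobo’s actual rig, just an illustration, and every number here would need tuning:

// A Bobo-style rig: a kinematic "wheels" body that the game code moves around
// parametrically, plus a dynamic "head" body attached to it with a pivot joint
// and a pair of damped springs so that it sways on its own.
#import "chipmunk/chipmunk.h"   // header path differs between Chipmunk versions

static cpBody *addBoboRig(cpSpace *space) {
    // The wheels are kinematic: the physics engine treats them as immovable,
    // and your tween/easing code sets their position every frame.
    cpBody *wheels = cpSpaceAddBody(space, cpBodyNewKinematic());
    cpBodySetPosition(wheels, cpv(160.0f, 100.0f));

    // The head/torso is a real dynamic body that the engine simulates.
    cpFloat mass   = 5.0f;
    cpFloat moment = cpMomentForCircle(mass, 0.0f, 30.0f, cpvzero);
    cpBody *head   = cpSpaceAddBody(space, cpBodyNew(mass, moment));
    cpBodySetPosition(head, cpv(160.0f, 160.0f));

    // A pivot joint keeps the head attached to the wheels...
    cpSpaceAddConstraint(space, cpPivotJointNew(wheels, head, cpv(160.0f, 110.0f)));

    // ...and two damped springs pull it back upright, so it sways naturally
    // whenever the wheels speed up, slow down, or change direction.
    cpSpaceAddConstraint(space, cpDampedSpringNew(wheels, head,
        cpv(-40.0f, 0.0f), cpv(-20.0f, -40.0f), 30.0f, 80.0f, 2.0f));
    cpSpaceAddConstraint(space, cpDampedSpringNew(wheels, head,
        cpv(40.0f, 0.0f), cpv(20.0f, -40.0f), 30.0f, 80.0f, 2.0f));

    return head;   // each frame: move the wheels, call cpSpaceStep(),
                   // then copy the body positions/rotations onto the sprites
}

That really is the whole trick – the physics engine produces the sway, and all the game code does is slide the wheels around.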

And as I said before, you are not limited to applying this principle to just characters. Buttons, labels, and dialogs can all benefit from a similar approach. Just know that it’s possible to overdo it. When in doubt, ask a passer-by for an opinion.

But I’m getting ahead of myself. First, the frameworks.

For my rendering and physics needs I rely on two open source frameworks – Cocos2D and Chipmunk2D.

Many, many years ago, at least five or so, some very smart and very dedicated people created Cocos2D for iOS. You probably already know this, but if you don’t, here is a very brief overview.

What is this Cocos2D anyway?

When you create a new iOS app, you get a lot of goodies from Apple for free in the form of the iOS SDK. I’m talking about ways to manipulate data, show UI elements (buttons, tables, and such), talk over the network, etc. All of the visual goo (UIKit, part of the iOS SDK) is built on top of OpenGL, a standardized way to talk to graphics chips to display individual pixels on the screen. That’s cool when you have a couple of buttons to deal with or if you’re using the optimized widgets that ship with UIKit directly. However, for more visually intensive experiences, such as what you might want to build in a game, you will want to talk to OpenGL directly to get better performance.

OpenGL is pretty low-level and requires you to deal with frame buffers and matrices and encoding types and whatnot, which is fine, but if you work with OpenGL directly, you end up spending more of your time making the API do your bidding and less time developing your game.

That’s where Cocos2D comes in.

Cocos2D

With useful and highly optimized abstractions such as sprites (moving images), animations, and batches of particles, it takes care of a lot of the OpenGL stuff for you. It also gives you entry points all along the way, so if you want to hack into OpenGL you can, but you don’t have to. Awesome!
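To give you a taste, here is a tiny sketch of what working with Cocos2D looks like, written against the 2.x-style Objective-C API (v3 renamed several of these classes, e.g. CCMoveTo became CCActionMoveTo). The layer class and the sprite file name are made up:

// Load a sprite and slide it across the screen with an eased move action.
// No OpenGL in sight – Cocos2D handles the rendering underneath.
#import "cocos2d.h"

@interface MenuLayer : CCLayer
@end

@implementation MenuLayer
- (void)addBobo {
    CCSprite *bobo = [CCSprite spriteWithFile:@"bobo.png"];   // made-up asset name
    bobo.position = ccp(100, 160);
    [self addChild:bobo];

    // Move to a new position over 2 seconds, easing in and out of the motion.
    id move = [CCMoveTo actionWithDuration:2.0f position:ccp(380, 160)];
    [bobo runAction:[CCEaseInOut actionWithAction:move rate:2.0f]];
}
@end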

Now, in iOS 7 Apple introduced something called SpriteKit, which is also part of the iOS SDK and is basically Apple’s version of Cocos2D. That’s cool, so why should you use Cocos2D, you ask? Maybe you shouldn’t, but I can tell you that Cocos2D is a much more mature framework than SpriteKit, at least right now, which means you can do a lot more with it straight out of the box. With some recent efforts, Cocos2D v3 came out which, I believe, allows you to port your games onto Android fairly easily. I haven’t actually tried this, though, so don’t take my word for it. Finally, Cocos2D is an indie effort, which is inherently cool, but more importantly you can hack into its codebase, meaning it’s easy to tweak the framework to suit your needs. While I’m sure the purists in the crowd are giving me the evil eye right now, I do it …uhm… all the time. Another secret leaked…

Then, some other very smart people, or perhaps just one person – I’m not 100% sure about the full origin story, forgive me Scott – got together and created Chipmunk Physics, also open source and also free. Once again, this is likely old news, but in case it isn’t, below you will find more.

Chipmunk Who?

Chipmunk2D

Chipmunk Physics, or Chipmunk2D as it has recently been renamed, is a free-ish, portable 2D physics engine written in C. It is the brainchild of Scott Lembcke and his Howling Moon Software company. It’s lean, it’s fast, it’s well written, it’s predictable, it’s extensible – it’s actually kind of awesome. It handles rigid-body collisions, springs, motors, hinges, and a slew of other stuff. I would recommend you pay a little bit of money to get the Pro version as well. It comes with a few extra goodies. More importantly, though, you’d be supporting Scott in his super awesome physics efforts, so you should definitely do that.

So, then, to make something interesting, you need to stitch the two together. There have been a couple of efforts to bridge the two in some standardized way, but each fell short in its own way until Cocos2D v3.0 came along, which brought Chipmunk2D and Cocos2D together in one unified framework. All hail the folks involved in that effort! Going forward, you should definitely consider investing time to learn the ways of v3.0 because it will simplify your work on games and other Cocos2D apps significantly. However, since Cocos2D v3.0 is still a relatively new effort and wasn’t around back in my day, I ended up creating my own, home-brewed solution, which taught me a couple of things:

Physics Simulations Take Time to Set Up Correctly

That’s another way of saying that physics simulations take in a lot of variables which, if not well balanced, can lead to unstable outcomes – i.e., your physics rig explodes. There is no easy way around this, but here are some steps that make the process less frustrating:

  1. Read the documentation – Many a time I have struggled with a particular component of the simulation only to realize that there is an errorBias property somewhere that lets me adjust the very thing that’s unstable. Chipmunk has decent documentation on its website and you should definitely read through it. Also, look through and understand the tutorial code posted there. You will discover simpler ways of doing whatever it is that you need to do. If all else fails, dig into the framework code itself and read through the comments.
  2. Create the simplest rig possible – Chances are it will be good enough. It will also simulate quicker, you will understand it better, and you will minimize potential places for things to go wrong. Can you fake the more complicated bits with a pre-rendered animation or some visual overlay? Do it!
  3. Ask questions on forums – Both Cocos2D and Chipmunk2D have great forums (here and here, respectively), and if you post a clear, thoughtful, and complete question, you will most likely receive a clear, thoughtful, and complete answer. The converse is also true. I often encounter questions of the type “my code is broken. why?” with little other information offered, which makes answers very difficult to come up with. You will get more applicable responses if you clearly state your issue, your expectation, and your actual outcome, and ask why the two don’t match. Posting a small snippet of code can also be helpful. Just don’t dump your entire project into the post, unless you want people to roll their eyes and move on. Finally, once you get your bearings on how to use a particular feature, go back to the forums and pay your karmic debt by answering questions for other people.
  4. Test your rigs on actual devices – Getting physics to feel “right” means that you need to test it in the same conditions as those of your final product. If you tweak some constants on the simulator running at 30 FPS and then play your game on a device running at 60 FPS, what felt natural might now feel too fast, and you’ll need to go back to the drawing board.
  5. Be patient – Tweaking takes time and often you will have to try several approaches to find the one that works best. When I was working on Axl and Tuna, for example, I found that Axl was gliding along a track fairly well at slow speeds, but tended to bounce off the ground and not make much contact with it at higher speeds. I tried a few things to fix this behavior: I tried intercepting and modifying velocities during Axl-track collision callbacks, I tried adding an invisible wheel underneath the track connected to Axl’s rig by a loose spring, etc., but none of these looked quite right. In the end, I simulated magnetic behavior by applying a small amount of force to Axl, pushing him towards the track surface along its normal, whenever Axl was within some threshold distance of it. That approach finally did the trick (see the sketch right after this list), but it took several head-scratching sessions to get there.
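For what it’s worth, that “magnetic” trick boils down to a few lines of per-frame code. Here is a rough sketch against Chipmunk’s C API (Chipmunk 7-style names – older versions spell some of these differently), where closestTrackPointAndNormal() is a made-up stand-in for your own track-geometry query and both constants are numbers you would have to tune:

// Hypothetical per-frame "magnetism" update: if the character is close enough
// to the track, push it gently toward the track along the surface normal.
// Call this right before cpSpaceStep(); Chipmunk resets forces every step.
extern void closestTrackPointAndNormal(cpVect p, cpVect *outPoint, cpVect *outNormal);

static void applyTrackMagnetism(cpBody *axl) {
    const cpFloat kSnapDistance = 40.0f;   // made-up threshold, in points
    const cpFloat kMagnetForce  = 600.0f;  // made-up force magnitude

    cpVect pos = cpBodyGetPosition(axl);

    // Your own geometry query: nearest point on the track plus the track's
    // outward surface normal at that point.
    cpVect trackPoint, trackNormal;
    closestTrackPointAndNormal(pos, &trackPoint, &trackNormal);

    if (cpvdist(pos, trackPoint) < kSnapDistance) {
        // Push toward the track, i.e. against the outward normal.
        cpVect force = cpvmult(cpvneg(trackNormal), kMagnetForce);
        cpBodyApplyForceAtWorldPoint(axl, force, pos);
    }
}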

This brings me to my next point, which is…

Editors Are Your Friends

Now, don’t get me wrong. I love tweaking constants in code and recompiling and re-tweaking and recompiling as much as the next guy, but if your game or app is reasonably complex, doing this over and over is a major pain in the butt, especially if there are twenty different things to tweak and you don’t know where to start.

Fortunately for you, there are some editors such as R.U.B.E. and SpriteBuilder out there which, as I understand it, allow you to build CCNode hierarchies and plug them into the underlying physics mechanics. I’ve never actually used either because they are still fairly new tools, but they both look promising, especially because they appear extensible and seem to have a solid-looking visual interface that allows you to tweak values quickly and intuitively.

The extensibility component is very important because, inevitably, you’ll come up with some cool idea that the tools won’t support natively and extending the existing tools, rather than reinventing the wheel and building your own from scratch, will be your only path to salvation.

Unfortunately for me, when I began app development, some of these tools didn’t exist and I had to resort to building my own.

My approach was to bake an editor directly into the app I was building, and that worked fairly well. Here are a couple of examples:

(Screenshots of the in-app editor.)

It started as a necessity to lay out text for the interactive books I was working on, but then, with a few extra tweaks, I started editing physics shapes, simulation constants, sprite placement, and the works. It was very helpful to momentarily pause a game (or a page in a book), tweak some values, and then restart it without having to recompile the code. I also found this setup to be a great debugging tool that allowed me to quickly dive into complex bugs just by swiping my finger across the iPad screen. Sadly, there is a downside…

Editors Are Your Enemies

It turns out that when you invest your time into an editor, you don’t spend that time working on your game. Who knew, right? And if you are like me and, in the process of creating a crude editing environment for your game, you discover cool ways to constantly rewrite your framework to make it “easier to use”, you will get lost in your own rathole with no end in sight. In other words, sometimes it can be difficult to break yourself away from creating the tool and spend time creating the app.

It’s a balance. Editors can save time and frustration, but they also take time to build (and debug), so I find it useful to constantly ask myself – can I achieve what I need to achieve with the tools that I already have? If you are like me and building tools is exciting for you, the previous question is a good one to write in permanent marker above your monitor.

However, always consider the power of the editors you already have. Will SpriteBuilder work for you? Can you export coordinates from a photo-editing program? Design your physics setup in Inkscape? The image on the right, for example, is the design of a character rig for Axl. Use whatever tools you already have whenever you can.

The other problem is that editors tend to be project-specific. They will likely share a common infrastructure from one project to the next, but I find that each game or app has its own needs that require at least some form of a custom editing experience. In the past, I always ended up tweaking and rewriting my editors ever so slightly as I moved on to new apps.

So, Editors?

While working on Axl and Tuna, I asked myself whether I could create a run-time editor that was truly universal without having to spend a year writing the most flexible and extensible framework ever. Was there a compromise that delivered minimal, but necessary editing capabilities for a wide range of scenarios, one that was simple to use and integrate into any project?

I’m happy to say that I found an answer that worked for me. I’m sure I’m not the first one to have thought of this, but the solution I came up with is relatively easy to construct but still powerful enough to do what it needs to do.

What I’ve done is create a very simple editing framework with a corresponding editor that allows me to edit primitive values (ints, floats, vectors, etc.) organized into arbitrary hierarchies right within the app itself. Using a simple macro, any class can expose properties for editing. These can be backed by actual variables or just by named constants. If, during the app’s execution, the editor is invoked, I create a visual layer over the entire screen that scours a given hierarchy of objects, looking for and exposing any and all properties marked as editable. The editor displays the value for each property and, if you select one, you can use your finger to change its value directly on the iPad or iPhone screen. If no property is selected, the touches are passed through to the scene underneath for normal app execution.

Once you find the right values for the properties you care about, you can either copy those values back into the code manually or dump the property tree into a plist that can be read in and applied to your app during its next execution.
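To make the idea a bit more concrete, here is a heavily simplified sketch of what such a property registry could look like, using key-value coding to read and write the exposed values and a plist dump at the end. All the names here are hypothetical, and my real framework does quite a bit more (hierarchies, touch editing, type metadata), but the skeleton is roughly this:

// Hypothetical, minimal property registry: objects register editable key paths,
// the editor reads/writes them via KVC, and the whole tree can be dumped to a
// plist for the next run. (A real version would want weak object references.)
#import <Foundation/Foundation.h>

@interface TweakRegistry : NSObject
+ (TweakRegistry *)shared;
- (void)exposeKeyPath:(NSString *)keyPath ofObject:(id)object named:(NSString *)name;
- (id)valueForName:(NSString *)name;
- (void)setValue:(id)value forName:(NSString *)name;
- (void)dumpToPlistAtPath:(NSString *)path;
@end

@implementation TweakRegistry {
    NSMutableDictionary *_entries;   // name -> @{ @"object": ..., @"keyPath": ... }
}

+ (TweakRegistry *)shared {
    static TweakRegistry *shared;
    static dispatch_once_t once;
    dispatch_once(&once, ^{ shared = [TweakRegistry new]; });
    return shared;
}

- (instancetype)init {
    if ((self = [super init])) _entries = [NSMutableDictionary dictionary];
    return self;
}

- (void)exposeKeyPath:(NSString *)keyPath ofObject:(id)object named:(NSString *)name {
    _entries[name] = @{ @"object": object, @"keyPath": keyPath };
}

- (id)valueForName:(NSString *)name {
    NSDictionary *e = _entries[name];
    return [e[@"object"] valueForKeyPath:e[@"keyPath"]];
}

- (void)setValue:(id)value forName:(NSString *)name {
    NSDictionary *e = _entries[name];
    [e[@"object"] setValue:value forKeyPath:e[@"keyPath"]];
}

- (void)dumpToPlistAtPath:(NSString *)path {
    NSMutableDictionary *tree = [NSMutableDictionary dictionary];
    for (NSString *name in _entries) tree[name] = [self valueForName:name];
    [tree writeToFile:path atomically:YES];   // read back with dictionaryWithContentsOfFile:
}
@end

A class that wants to be editable then registers itself somewhere in its init, e.g. [[TweakRegistry shared] exposeKeyPath:@"swayAmplitude" ofObject:self named:@"bobo.swayAmplitude"], and the editor layer simply walks the registry.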

Very crude, but very effective because it applies to a wide array of scenarios. So, there you have it – another secret exposed!

Procedural Behaviors

Remember that organic feeling for apps I was talking about earlier? A lot of that comes from animation. I talked about physics-based animation already, but there is more you can do.

Sadly for me, I’m not an animator. I also don’t have one working with me. So I have to tackle animations programmatically.

This approach can be a work-intensive way to add movement to your apps. However, it can breathe unexpected life into them in a way you wouldn’t be able to achieve otherwise. Let me give you an example.

Bobo, my favorite robot character, has two mechanical arms. He uses his right arm to help you, the user, pull down pages from the top of the screen when you tap on their heading. Being curious and all, Bobo sometimes gets interested in whatever gizmo happens to be on a given page and may end up using his right arm to do something else for a moment (pull on a switch, reach up to tickle a monkey, etc.). If at that same moment you, the user, tap on a pull-down menu and Bobo’s right arm is occupied, he will simply switch and use his left arm to pull down the page instead. If Bobo were a traditionally animated character, with predefined keyframes, this type of interaction would either be impossible or it would result in one animation being abruptly cut off while the other played itself out. However, because Bobo monitors his whereabouts and can make simple decisions about how to behave in a given situation, he doesn’t always do the same thing to achieve a given result. Instead, he dynamically changes his behavior based on the circumstances and exhibits a much more varied array of movements, emotions, and animations.

To make the development of this type of interactivity easier and to avoid the pitfall of a whole bunch of spaghetti code, I invested a little bit of time to create a behavior system. Basically, a character (or a menu button, for that matter) can perform a certain set of behaviors. Take Bobo as an example again. Bobo knows how to blink, how to look around, how to look at the user, how to sing a song, how to move to a requested location, along with a bunch of other things. A given behavior can be either ON or OFF, and often several behaviors are ON at the same time. Some behaviors have higher priority (the user directing Bobo to go somewhere) than others (Bobo checking out a location on his own). Some behaviors deactivate others. Bobo singing and Bobo saying “Ouch!” are mutually exclusive, and because “Ouch!” has a higher priority it will overshadow and automatically deactivate the singing behavior.
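If you are curious what the bones of such a system might look like, here is a stripped-down, hypothetical sketch – far simpler than anything Bobo actually runs. Each behavior carries a priority and a set of behaviors it excludes, and the controller keeps the active set consistent:

// A hypothetical, stripped-down behavior system: each behavior has a priority
// and a set of behaviors it excludes; activating one deactivates the others,
// but only if nothing more important stands in its way.
#import <Foundation/Foundation.h>

@interface Behavior : NSObject
@property (nonatomic, copy) NSString *name;
@property (nonatomic, assign) NSInteger priority;
@property (nonatomic, strong) NSSet *excludes;   // names of behaviors this one cancels
@property (nonatomic, assign) BOOL active;
- (void)update:(float)dt;                        // per-frame work while active
@end

@implementation Behavior
- (void)update:(float)dt { /* blink, sing, walk… – overridden by subclasses */ }
@end

@interface BehaviorController : NSObject
- (void)addBehavior:(Behavior *)behavior;
- (void)activate:(NSString *)name;
- (void)update:(float)dt;
@end

@implementation BehaviorController {
    NSMutableDictionary *_behaviors;   // name -> Behavior
}

- (instancetype)init {
    if ((self = [super init])) _behaviors = [NSMutableDictionary dictionary];
    return self;
}

- (void)addBehavior:(Behavior *)behavior {
    _behaviors[behavior.name] = behavior;
}

- (void)activate:(NSString *)name {
    Behavior *incoming = _behaviors[name];

    // A behavior can't interrupt a higher-priority behavior it conflicts with.
    for (NSString *otherName in incoming.excludes) {
        Behavior *other = _behaviors[otherName];
        if (other.active && other.priority > incoming.priority) return;
    }

    // Otherwise, switch off whatever it excludes and turn it on.
    for (NSString *otherName in incoming.excludes) {
        Behavior *other = _behaviors[otherName];
        other.active = NO;
    }
    incoming.active = YES;
}

- (void)update:(float)dt {
    for (Behavior *b in [_behaviors allValues]) {
        if (b.active) [b update:dt];
    }
}
@end

With something like this in place, the “Ouch!” behavior simply lists “sing” among its exclusions and carries a higher priority, so activating it shuts the singing down automatically.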

Anyway, you throw all these rules together, each one defined and governed by an isolated piece of code, and (if you are lucky) you get an experience that feels spontaneous and real and gives you a huge variety of responses to a set of conditions.

Parting Words

Before I go, here are a few final lessons I learned that you might find helpful in your own projects.

  1. Code structure – whatever you do, structure your code well and refactor it as you go along. Building code in isolated components makes testing, bug fixing, refactoring, and maintenance not only easier but possible. If you make your code structure bullet-proof, good code will follow.
  2. Bugs – fix your bugs early, as soon as you encounter them, even if you are in the middle of something else. I constantly interrupt my work because I notice that something is not happening the way it should. Waiting all the way until the end means that you will find yourself facing a mountain of issues a week before you want to go live and that you will end up shipping a product that will fail in your users’ hands.
  3. Profiling – do it throughout your development to understand where your code is spending most of its time and which operations are costly. That practice will help you come up with design decisions that won’t corner you into an app that runs at 10 frames/sec. However, I would suggest not optimizing your app until the end. That way you won’t waste time perfecting code that you might not end up using in the final product.
  4. User feedback – get it early and get it often. Stand back and watch people get frustrated with your app without offering a single word of advice. That will take nerves of steel, but it will allow you to identify the parts of your app with which people struggle.
  5. Have your app be playable from day 1 – even if most of your app’s behavior is initially faked, seeing the final product in your hands early will help tremendously in guiding your design decisions going forward.

Finally, whatever you do, work on something that you love. While it’s possible to mess up a project that you really believe in, it’s nearly impossible to make a project you don’t believe in successful.

Now go and code your hearts out.


It’s a Sharp, Sharp World…

…or Some Tips on How to Bring Your Big iPad App to the Even Bigger Retina Display

I’ve just spent the past several weeks updating Bobo Explores Light for the iPad’s new retina screen.  It was a tricky problem to get right, but I learned a few things along the way that you might find useful.  If so, read on…

The Problem

For vector-based or parametric iOS apps, ones that rely on 3D models or that perform some clever run-time rendering of 2D assets, the retina conversion is pretty straightforward.  All they need to do is introduce a 2x multiplier somewhere in the rendering path and the final visuals will take advantage of the larger screen automagically.  There might still be a couple of textures and heads-up-display images to update, but the scope of these changes is quite small.

My problem was different.

The Bobo app contains well over 1400 individual illustrations and image components that look pixelated when scaled up.  It also features several bitmap fonts that don’t scale well.  When I introduced the 2x multiplier over the entire scene, the app technically worked as expected, but it appeared fuzzy:

(Screenshot: the fuzzy, scaled-up result.)

My first impulse was to replace all of the illustrations and fonts with their 2x resampled, sharp equivalents.  This line of thinking, however, presented two immediate challenges:

1) I needed to manually upscale 1400 images.  That’s a lot!

Even though the illustrator behind the project, Dean MacAdam, kept high-res versions of all the images in the app, the process of creating the individual retina assets was very tedious:

  • Open every low-res (SD) image in Photoshop
  • Scale it to 200%
  • Overlay it with an appropriately scaled, rotated, and positioned high-res version of that same image
  • Discard the layer containing the original SD image
  • Save the new high-res image (HD) with a different filename
  • Repeat 1399 times

If each image were to take 5 minutes to convert, and that’s pretty fast, this conversion alone would take well over three weeks.  Yikes!

2) I needed to keep the size of the binary in check.

Bobo Explores Light already comes with a hefty 330MB footprint.  Not all of it is because of illustrations, since the app includes a number of videos, tons of sounds and narration, etc.  But a good 200MB is.

Now, the retina display includes 4x as many pixels as a non-retina display.  If I were to embed an HD image for every SD image used in the app, the size of the Bobo binary would exceed 1GB (130MB for non-image content + 200MB for all SD images and 4 x 200MB for all HD images).  That just wasn’t an option.

The saving grace

When I calculated the above numbers, I reached the conclusion that, in the case of Bobo, retina conversion was a futile effort.  Nonetheless, I got myself the latest iPad and did some experimenting.  My secret hope was that I could mix SD images with HD images and come up with an acceptable hybrid solution.  My secret fear, however, was that the few HD images would only highlight the pixelation of the SD images still on the screen and that it would be an all-or-nothing type of scenario.

I uploaded a few mock-up images onto the new device, iterated over several configurations, and was pleasantly surprised.  Not all, but some combinations of SD and HD images actually worked beautifully together.  In certain cases, the blurry SD images even added a sense of depth to the overall scene, resulting in a poor man’s depth-of-field effect.

I was excited because these results helped me address both of the problems I outlined above.  By being selective about which images I needed to convert, the total number of retina assets shrank to 692.  Still a large number, but less than half of the original.  Also, the ballooning of the binary size would be diminished.  That problem would not be solved, mind you, but it would certainly help.

Text

Text was the number one item in the app that screamed “I’m pixelated!”.  The native iOS code renders such beautifully sharp text on the new iPad that any text pixelation introduced in the Bobo app stuck out like a sore thumb.  This part was easy to fix, though.  By loading a larger font on retina devices, all of the text that was dynamically laid out suddenly snapped to focus.  Unfortunately for me, not all of the text in the app was dynamically laid out.
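For the text that was laid out dynamically, the fix really was just a couple of lines.  A sketch of the idea, assuming Cocos2D 2.x-style bitmap-font labels, made-up .fnt file names, and that the snippet lives inside a CCLayer subclass (how much you then compensate the label’s scale depends on how your scene applies the 2x multiplier):

// Pick the bitmap font atlas based on the screen we are running on.
BOOL retina = [[UIScreen mainScreen] scale] > 1.0f;
NSString *fntFile = retina ? @"bobo-font-hd.fnt" : @"bobo-font.fnt";   // made-up names

CCLabelBMFont *label = [CCLabelBMFont labelWithString:@"Thomas Edison" fntFile:fntFile];
[self addChild:label];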

Bobo features well over 100 pages of text with images in the form of side articles and interesting factoids.  For the sake of saving time when we worked on v1.0 of the app, we baked some of that text and images together and rendered the entire page as a single image.  This approach really helped us streamline the creation process and push the app out in time.  All in all, these text-images amounted to about 80MB of the final binary, but given the time it saved us, it was the right approach at the time.  Now, however, it presented a problem.

If we were to re-sample all these text-images for the retina display, we would gain ~80MB x 4 = ~320MB of additional content from the text alone.  That was way too much.  But we *needed* to render sharp text.  So, we bit the bullet, separated the text from its background, and dynamically laid out all the text at run-time.

This conversion took well over two weeks, but it was worth the effort.  The text became sharp without requiring any more space.  At the same time, we were able to keep all the photographs interleaved with the text as SD images.  Because these were photographs that were visually fairly busy and because they were positioned next to sharp text that drew the attention of the eyes, the apparent blurring from the pixelation was minimal.  Additionally, without any baked text the background images compressed into much smaller chunks, giving us about 50MB worth of savings.  That was not only cool, but very necessary.

Home-Brewed Cocos2D Solution

Bobo is built on top of the open-source Cocos2D framework (an awesome framework with a great community of developers – I highly recommend it!).  Out of the box, Cocos2D supports loading retina-specific images using a naming convention.  However, this functionality is somewhat limited.  If all of the images in an app are either HD or SD, it works great.  But my needs were such that I required mixing and matching of the two, often without knowing ahead of time which images needed upscaling until I tried it out.  I needed a solution that would allow me to replace HD images with SD images on a whim without having to touch the code every time I did so.

Way back when, when I was working on The Little Mermaid and Three Little Pigs, I created an interactive book framework where I separated the metadata of each page (text positioning, list of images, etc.) from the actual Cocos2D sprites and labels that would render them on the screen.  This is a fairly common development pattern, but I can never remember what it’s officially called (View-Model separation maybe?).  Anyway, I used this separation to my advantage in Three Little Pigs to create the x-ray vision feature.  Render the metadata one way and the page appears normal; render that same data another way and you are looking at the x-ray version of that page.  Super simple and super effective.

With this mechanism in place, I was able to modify a single point in the rendering code to load differently scaled assets based on what assets were available.  In pseudo-code, the process looked something like this:

Sprite giveMeSpriteWithName(name) {
    if (retina && nameHD exists in the application bundle) {
        // An HD version exists – load it and show it at its native size.
        sprite = sprite with name(nameHD);
        sprite.scale = 1;
    }
    else {
        // No HD version – fall back to the SD image and scale it up on retina.
        sprite = sprite with name(name);
        sprite.scale = retina ? 2 : 1;
    }
    return sprite;
}

It got a little more complicated because of parenting issues (if SD and HD images belonged to different texture atlases, they each needed their own parents), but this was the core of it.  What this meant was that all of the pages, by default, took SD images and scaled them up.  Apart from appearing pixelated, the pages looked and behaved correctly.  Then I could go in and, image by image, decide which assets needed to be converted to HD, testing these incremental changes on a retina device as I went along.

There was some tediousness involved, for sure.  However, I quickly got a sense of which portions of which pages needed updating, and I came up with the following rough rules that hopefully come in handy for you as well.

Things That Scream “I’m pixelated!”

1) Type

At the very least, convert all your fonts, whether they are baked into images or laid out dynamically.  Your eye focuses almost instantly on any text on the screen, and the fuzzy curves on letters become immediately noticeable.  By that same token, convert *all* of your fonts – don’t skimp by converting only the main font that you use in 90% of the cases.  The other fuzzy 10% would essentially nullify the entire effort.

2) Small parts that are the focus of attention

When converted to HD, cogs, wheels, pupils, and tiny components all make a huge difference in giving the app the *appearance* of fine detail even if the larger images, however bright and prominent, are still in SD.  Moreover, because these smaller images are … uhm… small, scaling them up doesn’t take that much extra space, so it’s a win-win setup.

3) High-contrast boundaries

Bobo’s head is a perfect example.  Most of the time, Bobo moves across dark colors with his bright green bulbous head in sharp contrast with the background.  Even though Bobo’s head was relatively large, it begged for a razor-sharp edge on most pages.

Things That You Can Probably Ignore

1) Action sequences

This one can sometimes go either way, but it’s still worth mentioning.  If something pixelated moves across the screen, the movement will mask that pixelation enough so that no one will really care.  However, if you have an action sequence that draws the attention of the eye and the sequence contains at least some amount of stillness, the pixelation will show.

2) Shadows, glows, and fuzzy things

All of these guys *benefit* from pixelation – definitely don’t bother with them.  If anything, downscale them even for the SD displays and no one will be the wiser.  Seriously, this is a great trick.  Anything that has a nondescript texture without sharp outlines (either because the outlines should be fuzzy or because the outlines are covered with other images), store it as a 50% version, and scale it up dynamically in code to 200% on non-retina displays and 400% on retina displays.  The paper image behind all side articles in Bobo Explores Light is a perfect example.  The texture itself is a little fuzzy, but because it is lined with sharp metal edges and overlaid with sharp text, nobody cares.

When All Else Fails…

A few times I found myself in situations where the SD image was too fuzzy on the retina display, but the HD image took way too much space to store efficiently.  What I ended up doing in those cases was to create a single 150% version of the image and scale it down to 66% for SD displays and 133% for HD displays.  The results were perfectly passable in both cases.
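The scale factors fall straight out of the arithmetic: the asset is authored at 1.5x the SD size, so showing it at its intended SD size means a scale of 1 / 1.5 ≈ 0.66, and on retina 2 / 1.5 ≈ 1.33.  In Cocos2D-style code, with a made-up file name and an assumed isRetina flag, that might look like:

// One 150% asset serves both screens: scale down to ~66% on SD displays,
// up to ~133% on retina, where the extra source pixels actually pay off.
CCSprite *glow = [CCSprite spriteWithFile:@"glow-150.png"];   // hypothetical asset
glow.scale = isRetina ? (2.0f / 1.5f)    // ≈ 1.33
                      : (1.0f / 1.5f);   // ≈ 0.66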

Final Tallies

When all was said and done and my eyes were spinning from some of the more repetitive tasks, I was very curious to see how much the binary had expanded.  I kept an ongoing tally as I went through this process, but for various reasons it wasn’t super accurate.  When I compiled the finished version, I discovered that not only did the binary not expand, it *shrank* by a whopping 50 MB!  This whole process took one freakishly tedious month to complete, but in the end the retina-enabled version of the app was significantly smaller than its non-retina original.

I don’t know whether that says more about my initial sloppiness or the effectiveness of the retina conversion.  I’ll leave that as a question for the reader.  Nonetheless, the results were exciting and Bobo Explores Light looks, if I dare say, pretty darn sharp on the new iPad.  Check it out!


Q&A with GeekWire

I recently had a chance to connect with John Cook from GeekWire who asked me about Game Collage, Bobo, and the startup world in general.  He just posted our conversation online – check it out at GeekWire’s Startup Spotlight.

One for the Kids!

Way back when, when I was still a wee-little iOS developer about yay high, I created my first interactive book of Hans Christian Andersen’s fairy tales called “The Little Mermaid and Other Stories“.  It came out only a few weeks after the iPad was released, and I remember working round-the-clock on the project to get the book out.  Because it was my first interactive book, I needed to spend a lot of energy developing the basic framework – general code structure, run-time page layout, dynamic loading and unloading of assets, etc.  I was working alone on this project so, in addition to the coding, I spent well over half of my time creating illustrations, designing sound effects, editing the content, and, at the very end, creating a short video trailer – definitely not my forte.  Nonetheless, I was happy with the results.  I submitted the app to the App Store and, once it was approved, I watched it go live one sunny Friday morning.

And then I paced.

I was very tired from the effort of the previous six weeks; I had squeezed out every ounce of creative juice I had in me. I was excited, and very, very nervous.  I don’t even know why.  I do know, however, that I spent most of the afternoon pacing – I just couldn’t stand still.

Yet nothing happened.

The week just before the app went live, I had sent dozens of emails to newspapers, review websites, and popular bloggers announcing the upcoming launch, but every time I refreshed my inbox, I was greeted with a message telling me that I had “no new messages”.  At that point, a wise man suggested I should step away from the computer and go for a walk.  Before I did, I literally hit the refresh button one more time and was rewarded with the following message:

We love your apps.  Thank you!
Zoe age 5,
William age 8 


I almost lost it.  Unbeknownst to them, Zoe and William had given me the justification that I so desperately needed.  I peeled myself away from my laptop to go for the walk and I couldn’t stop smiling.  Mission accomplished.

Since that time, I’ve heard from many more kids and parents and grandparents enjoying the apps and offering me their feedback and suggestions.  I’m always thrilled.  Each time I receive such an email, I’m excited to read it and respond as quickly as I can.  For me, it’s one of the most rewarding parts of doing what I do.  And both Zoe and William have a very special place in my heart, because they were the first to let me know that they cared.

Today, out of the blue, I sent an email to Zoe and William’s dad and let him know how much the original email from his kids meant to me.  Soon, he replied:

Thank you so much for the email. William just turned 10 and Zoe is now 7. Each of them now has their own iPad with your app installed. They even take turns reading your app to their little brother Levi (2). Now you have three customers for life. I will make sure they check out your new app [Bobo Explores Light], and I will send you their review (in their own words).
I think you have also inspired William to start working on his own app. He is currently working on his artwork and game rules. I will let you know how he progresses.
Keep up the great work. You have provided hours of enjoyment to William, Zoe and now Levi through a mixture of technology, art and science.


Once again, I am grinning ear to ear and want to send a huge thank you to William and Zoe and Levi and their dad Scott.  You guys rule and inspire me every day!  And, William, I look forward to seeing what you come up with.  Keep it up!

 


A rare look at Bobo’s code base

I came across a very cool online tool created by Jonathan Feinberg that produces beautiful word clouds from random chunks of text.  I was curious to see what it would do with a piece of code and so I dug up the base class for Bobo and pasted it in.  To give you some context, the class controls most things Bobo, from his singing and face animation to movement, interactivity, and physics.  It’s a hefty beast of over 1300 lines of code and seemed like a great candidate.  So, if you ever wondered what Bobo was made of, here is your answer:


The Importance of Not Guessing

Yesterday, I went to Apple’s iOS Tech Talk held in Seattle of all places (how could I not?) and was excited to meet a slew of other developers with whom I had previously only interacted online or via their apps.  It was quite a trip – I guess I should climb out from underneath my “rock” more often.

Besides all the socializing, I sat through a number of lectures delivered by the Apple folks on the wonders of the iOS technology.  One of the more interesting sessions centered on profiling, whose overarching message was “Don’t guess – measure!”.

If you’ve ever written code with tight performance requirements, “thou shalt measure” is a well-known commandment.  What I recently discovered, however, is that measuring is equally important in the development of interactive books and apps in general.  Specifically, knowing how your customers use your product is paramount to figuring out which features are important and which ones are not.

Let me give you an example.

When I was writing Bobo, certain pages seemed more important to me than others.  For example, the book wouldn’t seem complete if it didn’t mention Edison at some point.  In addition, certain other pages appealed to me personally more than others.  For example, I was in love with the imagery of the Jungle / Photosynthesis page and spent a good four days tweaking countless little details – from the blooming flowers to the swaying vines, paying particular attention to dynamically recreating jungle sounds from a collection of animal calls, avoiding repetitiveness while mimicking the overall impression of vibrant life.  In short, I really geeked out.

Bobo Jungle

Dean was similarly enamored with the Bioluminescence page.  It all started with him sketching a beautiful yet menacing-looking angler fish.  From that point on, he wouldn’t rest until I finally caved in and spent four days on that page as well.  There was plenty to keep me occupied – animated fins on the fish, several particle systems, colors fading with water depth, Bobo’s swimming movement (which was unlike his movement on any other page), bubbles, water sounds, a chomping angler fish… you name it.  It was another point where we geeked out because of our sheer excitement (mostly powered by Dean) about the topic at hand.

Bobo Bioluminescence

Once we released the book into the wild, however, we were surprised to find that our users’ page preferences were completely different from ours.  The Introduction to Photosynthesis (a.k.a. the Tomato page), for example, is among the more popular pages in the book, even though we slapped it together in a single day to provide a much-needed transition between some of the other topics in the book.

If I order all the pages by how much each of the topics appealed to me as a developer/user, I get the list in the left column.  If I order them by how much time I spent creating each, the list looks a little different, but not entirely dissimilar (middle column).  However, if I order them by popularity among users from all around the world, the list looks completely different (right column):

My preference

  • MOST EXCITING
  • Photosynthesis
  • Bioluminescence
  • Auroras
  • Disco
  • Fireworks
  • Glow in the Dark
  • Reflection
  • Sunset / Night / Sunrise
  • Lightning
  • Edison
  • RGB
  • Eyeball
  • Sun
  • LaserIntro
  • Caveman
  • Refraction
  • Telescopes
  • Photosynthesis Intro (Tomatoes)
  • LEAST EXCITING

Code Complexity

  • MOST COMPLEX
  • Reflection
  • Glow in the Dark
  • Photosynthesis
  • Bioluminescence
  • Disco
  • Sun
  • LaserIntro
  • Eyeball
  • Auroras
  • RGB
  • Fireworks
  • Edison
  • Sunset / Night / Sunrise
  • Lightning
  • Refraction
  • Caveman
  • Telescopes
  • Photosynthesis Intro (Tomatoes)
  • LEAST COMPLEX

User Preference

  • MOST EXCITING
  • Sun
  • Sunset / Night / Sunrise
  • Auroras
  • Lightning
  • Photosynthesis Intro (Tomatoes)
  • LaserIntro
  • RGB
  • Disco
  • Caveman
  • Photosynthesis
  • Fireworks
  • Reflection
  • Glow in the Dark
  • Eyeball
  • Edison
  • Bioluminescence
  • Telescopes
  • Refraction
  • LEAST EXCITING

The Tomato page says it all.

The moral of the story is that the time we spent on the different parts of the book is incongruous with the amount of time people spend using them, and, had we paused and collected some of this data during development, we would probably have adjusted our internal schedules and priorities accordingly.  Live and learn but, most importantly, don’t guess – measure often and repeatedly.

The Hall of Fame

I’m super excited to report that our little Bobo was spotted in Apple’s Hall of Fame.  Check it out and happy holidays!

Hall of Fame

Bobo Awarded the KAPi Award

Today, the Children’s Technology Review and Living in Digital Times announced their third annual Kids at Play Interactive Award winners, which included our interactive book Bobo Explores Light!

KAPi Award

Children’s Technology Review and Living in Digital Times award the KAPi Awards to recognize “the most innovative games, software, devices and apps for engaging, entertaining and educating today’s children.”  This year, 8 products received the award out of over 600 that were nominated.  We are very excited to have received this honor.  You can read more about the awards, as well as read about the other winners, in the official posting here.

Essentials for your iPad

My mom just got an iPad for herself and, naturally, I was bombarded with a barrage of questions:

  • How can I delete an email?
  • Can I call you on Skype?
  • Why does the screen turn dark when I don’t touch it for a while?
  • How can I read books on this?
  • Why do the icons shimmer when I hold them?  etc.

All were delightful, of course, and made me appreciate the device all over again.  It really is a sleek little box that, for the most part, just works – even with my mom driving.

At some point during the Q&A, it became apparent that we needed to install some apps that no device should be without.  I went to the App Store, searching my memory for all the goodies that she would likely find helpful.  It turns out that Apple beat me to the punch, because among the main features of the US store this week is the “App Starter Kit”:

Starter Kit

I followed the link and was delighted to find out that Bobo Explores Light shows up among the selected apps, right on the front page.  Check it out!

Bobo Starter

Who Says Three Little Pigs Are Only For Kids?!?

Yesterday, I was excited to receive the following email from Judith in Clermont, FL:

I have been delighted and intrigued since my first meeting with your pigs. I teach a conversation class for adult learners who use English as their second language. I brought in my iPad and passed it around for the learners to take turns reading each page. We discussed the meaning of each part of the story as many words are difficult for them. I can say no one was in the least bored. They couldn’t wait for their turn to play and read. Believe it or not, with all the discussions, etc. we spent almost a full hour on this very enjoyable lesson.

I’ve heard of our interactive books being used in a variety of classroom settings before, but this is definitely the first story involving adults and Three Little Pigs.  The iPad is really a remarkable device that has the potential to inspire people in all sorts of situations.  Do you have stories from your classroom as well?  Let me know!