One weekend in the middle of February, we ran a hack session to make a first usable prototype of our book for the Library of Lost Books – a talking, gesture-responsive book.
Our aim was to put together the elements we’ve each been working on: story and audio; LilyPad Arduino; gesture detection; getting iPhone and Arduino talking to each other to share data – and to combine them in a physical book.
We’re using an old green cloth-covered copy of Treasure Island (pub. Cassell) for the prototype, working out how to solve some of the problems – such as how to hide the components inside the book – and testing whether the story works. Some of the conversion work has involved copying, cutting, gluing and sticking key pages from the Picturegoer’s Who’s Who into Treasure Island, to test the audio story.
On the Saturday, I started work adding a new story script into Mo’s storyengine plist in Xcode. Mo has built an engine which means I can control the narrative and specify the gestures I want to match to key moments in the story. It also means I can play with hidden audio stories too.
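I won’t reproduce Mo’s actual plist schema here, but to give a flavour, a single story beat in a gesture-driven plist might look something like this – the keys and values below are a hypothetical sketch, not the engine’s real format:

```xml
<!-- Hypothetical structure only; the real storyengine plist may differ. -->
<dict>
    <key>sceneId</key>   <string>spyglass-hill</string>
    <key>audioFile</key> <string>spyglass-hill.m4a</string>
    <key>gesture</key>   <string>tilt-left</string>
    <key>nextScene</key> <string>stockade</string>
</dict>
```

The nice thing about keeping the narrative in a plist like this is that I can redraft scenes and swap gestures without touching the Xcode side at all.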
I also wrote up a second story which is hidden within the book’s pages and will run alongside the main audio story for curious readers who explore further.
Mo is working on the 3D gesture data – how to interpret movements and gestures of the book – a complex problem in its own right.
Dave is working on the problem of getting data from the Arduino to the iPhone – using XBee, and also trialling a BLEbee shield that Dr Michael Kroll kindly sent over, to transfer data through Bluetooth on the iPhone. Dave also worked on adding Twitter capabilities to the story engine, so we can trigger some extra content when the book is picked up and moved.
Parts of the story have been redrafted, and will continue to be – writing, recording and listening to the story triggers a need to rework and refine, to make each draft better than the last.
On the Arduino side, I’ve been pulling together the different components, working on the code to make everything play nicely, and working out how to place this in the book.
We also now have page detection working, so we’ll know if a reader has opened the book at a specific page during the story – and if so, the book can respond in a number of ways depending on whether the reader chooses to look at a page or not.
We have a second weekend lined up in the next few weeks to finish off this book – I’ll post a video when we have this first prototype of the talking book ready to show.