Tuesday, 12 May 2015

Working Wonders

Early last week, it was full steam ahead. After collecting all the appropriate content for the audio, my Module Leader, Dr Sara Perry, went to site whilst I stayed in York writing up the scripts for each individual stop. By doing it this way, I could email the scripts to her, and Sara would then read them aloud to check how long the breaks between paragraphs should be, whether the directions I was giving were relevant and easy to follow, and so on. Because I wasn't present on site, it was a lot harder for me to orientate myself! Sara then emailed the scripts back for me to record. I decided not to have the Chapel as a stop, purely because I could not find enough content to support the audio, which was a shame, as no one seems to know an awful lot about it. This back-and-forth process, with one of us on site and one back in the lab at York, was actually really effective, and highlighted the importance of getting the audio and other content right in terms of its connection to the landscape.

Mind map for the app title
(photo by author)
Back at King's Manor, the whole first draft of the app had to be completed. Once again, Simon Davis was on hand to assist in editing the three scripts in the audio software Audacity. When I was recording the audio in my room on campus, I was conscious of getting the right tone: not sounding too enthusiastic, but not sounding monotone and uninterested either, so finding my 'voice', if you like, took a while. Listening back to the audio, there were background noises (shuffling paper, breathing etc.), unnecessarily long gaps, and some stumbles and restarted sentences, all of which obviously had to be erased. You could spend hours on one recording to get it spot on, but we had two hours to ideally do three recordings, so it got quite repetitive listening to the same bits over and over again! I learnt that I had to be brutal in erasing the pauses; again, not being on site and not knowing how long the user would need to pause for (to look for certain features etc.) made this tricky. So after two hours, I had one recording cut down from six minutes to about two and a half: the process feeling very long, but the outcome being very short!

A screen shot of deleting background noise in Audacity (photo by author)
Screen shot of home page of mobile app (photo by author)

I was then able to edit the other two scripts back on campus, ready for the following Monday, when both Sara and I would test the app on-site. Following on from this, in the afternoon we met up with Tom Smith once again, so we could construct and finalize the app's format in LiveCode. We began by designing the home page: I'd done a black and white sketch (well, a pencil sketch), which we scanned into the Drive, and then I drew an outline of the huts that would have been the focal point of the site. We thought these huts could be the buttons to lead you from one stop to the next, but we ended up lining them up next to each other, to create the illusion of how they would have looked 100 years ago. My next task was to learn how to import audio into the app. On each card I inserted two buttons, a 'play' and a 'pause' one, which would correspond to playing and pausing the audio. By selecting a button and clicking on 'code', I was able to insert a short script telling the button what to do once I clicked on it. I then repeated this method for the next two cards.
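For anyone curious, button scripts like these can be sketched roughly as below in LiveCode script. This is a minimal sketch under my own assumptions, not our exact code: the clip name "stop1" is hypothetical, and note that LiveCode's "play stop" halts an audio clip rather than truly pausing it; resuming mid-clip would need a player control instead.

```
-- Sketch of a 'play' button handler
-- (assumes the audio was imported as an audio clip; the name "stop1" is hypothetical)
on mouseUp
   play audioClip "stop1"
end mouseUp

-- Sketch of a 'pause' button handler
-- Note: "play stop" halts playback rather than pausing it;
-- a player control would be needed to resume from the same point
on mouseUp
   play stop
end mouseUp
```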

Screen shot of importing audio (photo by author)

After a whole day of basically making the entire app, making sure the visual imagery, the audio, the content and the design of the app were (at least) usable, I was able to continue the process over the weekend: completing the editing of the audio, making sure the code was in the right place and took you to the right page, and perfecting any other small details. Then hopefully, it would all work out on the Monday!
