At Innovative Learning Week this year we worked with students to develop board games using images from the CRC Flickr account as inspiration. Their challenge was to design a game which used at least three images from Open Educational Resource sites, one of which had to come from the CRC collection. The games also had to include at least three different game mechanics, be openly licensed and have a full set of rules.
Apocalypse Later is a card game in which players cooperate to overcome challenges ranging from volcanic eruptions to a zombie apocalypse, drawing and playing cards to gain advantages and advance in the game. One character is secretly a ‘mole’, whose sole purpose is to prevent the team from winning the game! The game features images from Anton Koberger’s German bible, the seal of Robert the Bruce and a decorated page from the Hours of the Blessed Virgin Mary from the CRC image collection.
In this art-themed board game, players take control of larvae hunting for works of art in various locations across the University. The larvae are highly cultured beings and need inspiration from art works in order to stay alive! Players draw cards representing different types of art (e.g. painting, sculpture) and have to decide whether to play them immediately for in-game bonuses / penalties or retain them for scoring at the end of the game. The player with the highest art value at the end is the winner. Cultured Ai (Arts for Ai) uses CRC plans of McEwan Hall, the Medical School and Glencoe Ballachulish for the game board.
In The Mouse Hunt, players compete in two teams vying for domination of an 18th century Edinburgh tenement! On one side, a team of mice attempts to drive the human inhabitants mad by digging tunnels and making a lot of noise. On the other side, humans set traps and try to rid the house of the rodent infestation! The house in which the game is set was inspired by historical images of Edinburgh from the CRC collection.
In Mythical Continents, players sail the seven seas fighting monsters and collecting relics hidden across the globe. Movement is governed by a wind dial (modelled on the Kalendar and Astronomical Tables from the CRC collection) and players compete to bring all treasures back to Nessie, drawing event and monster cards along the way!
Location-based intelligence is a growing area of importance in the academic library environment (as identified in the most recent NMC Horizon Report) and we’ve been exploring how Bluetooth beacons can be used to deliver information and content to users based on their location in the library space.
More recently, we’ve started to explore ways in which beacons can be used to provide tours of the library building itself. There are several potential use cases for this, such as a tour for new undergraduates showing them where key services are located, or a tour of the paintings on display in the main library for art enthusiasts, but we decided to create a tour of the building for the general public in order to tie in with Doors Open Day 2015. Our library was designed by the British architect Sir Basil Spence and A-listed in 2006: its history is of real interest to our visitors.
Working with colleagues from across Information Services, we developed a tour app (available from the Apple App Store and Google Play) which uses beacons to tell the story of the library building and service. Beacons were set up at seven locations, and users who had installed the app on their phone were sent a notification whenever they came into proximity of one – tapping the notification provided the user with a short (1–2 minute) video about the area they were in, such as this general introduction to the building:
We had originally hoped to use beacons to create a form of ‘internal GPS’ to show the user their location in the library space (much like the blue ‘you are here’ dot on Google Maps), but we found that the beacons’ ranging error, often more than three metres, made it impossible to trilaterate a position accurately enough.
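To make concrete why a ranging error of a few metres defeats trilateration, here is a minimal two-dimensional sketch. The beacon positions and error values are purely illustrative, not our actual layout:

```python
import math

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Estimate (x, y) from three beacon positions and measured ranges.
    Subtracting the first circle equation from the other two gives a
    linear 2x2 system, which we solve directly."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

beacons = [(0, 0), (10, 0), (0, 10)]
true_pos = (3.0, 4.0)
ranges = [math.dist(b, true_pos) for b in beacons]

# Exact ranges recover the true position...
print(trilaterate(*beacons, *ranges))  # ≈ (3.0, 4.0)

# ...but ranging errors of a couple of metres per beacon throw the
# computed fix off by several metres, useless for an indoor 'blue dot'.
noisy = [r + e for r, e in zip(ranges, (3.0, -2.0, 1.5))]
print(math.dist(trilaterate(*beacons, *noisy), true_pos))
```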
Around 50 people downloaded the tour over the weekend and the feedback was extremely positive. We learned some important lessons from this application, which will inform future uses of technology in this way.
Have a backup content delivery mechanism: make sure the content can be accessed manually through the app if the beacons don’t work. This also allows visitors to access the videos once they have left the building.
Have staff on hand to help people download the app: many visitors needed assistance to access the Wi-Fi network and download the app from the relevant online store.
Make sure Wi-Fi is available: provide Wi-Fi so that visitors don’t need to use their data connection to download the app, particularly as apps can be quite large (our app was 50MB). We set up a Wi-Fi hotspot for people who didn’t have access to the network.
Provide some basic signage in the physical space: let people know when they are in a beacon zone and provide QR codes linking to the app stores in order to assist with the download.
Bluetooth: make sure users switch it on!
The content is more important than the medium: we got good feedback for our experiment but, ultimately, a beacon is just a delivery mechanism and it was crucial that we provided high quality content. It took in the region of 50 staff hours to create seven two-minute videos.
Having to download an app is a huge barrier: the need to download an app prevented many visitors from engaging with the tour.
We’re continuing to explore the use of beacons in the library space and recently secured funding to see how Google’s new Eddystone beacon can be used to provide information and updates to library users throughout the building. We are especially keen on exploring the potential for Eddystone to bypass the need to download an app and will blog more as the project progresses!
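Eddystone's promise of bypassing the app download comes from its URL frame type, which broadcasts a compressed web address that compatible browsers and services can surface directly. A minimal sketch of the frame encoding, following Google's published Eddystone-URL format (the URL and TX-power value below are illustrative):

```python
# Eddystone-URL frame: frame type byte, TX power at 0 m, scheme prefix
# byte, then the URL with common suffixes compressed to single bytes.
SCHEMES = {"http://www.": 0x00, "https://www.": 0x01,
           "http://": 0x02, "https://": 0x03}
EXPANSIONS = {".com/": 0x00, ".org/": 0x01, ".edu/": 0x02, ".net/": 0x03,
              ".info/": 0x04, ".biz/": 0x05, ".gov/": 0x06,
              ".com": 0x07, ".org": 0x08, ".edu": 0x09, ".net": 0x0A,
              ".info": 0x0B, ".biz": 0x0C, ".gov": 0x0D}

def encode_eddystone_url(url, tx_power=-20):
    frame = bytearray([0x10, tx_power & 0xFF])  # frame type, TX power
    for prefix, code in SCHEMES.items():
        if url.startswith(prefix):
            frame.append(code)
            url = url[len(prefix):]
            break
    else:
        raise ValueError("unsupported scheme")
    while url:
        # Longer expansions (".com/") are listed before shorter (".com"),
        # so dict insertion order gives greedy matching.
        for text, code in EXPANSIONS.items():
            if url.startswith(text):
                frame.append(code)
                url = url[len(text):]
                break
        else:
            frame.append(ord(url[0]))
            url = url[1:]
    if len(frame) > 20:  # spec caps the whole frame at 20 bytes
        raise ValueError("URL too long to encode")
    return bytes(frame)

print(encode_eddystone_url("https://www.ed.ac.uk").hex())
```

Because the URL must squeeze into roughly 17 bytes after the header, real deployments usually broadcast a shortened link.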
Gavin Willshaw (Library & University Collections), Ben Butchart (Edina), Sandy Buchanan (Edina), Claire Knowles (Library & University Collections)
Following on from the success of our Pop Up Library Metadata Games sessions during Innovative Learning Week, this week we took Metadata Games to the City of Edinburgh Council’s Central Library, the first time the game has been taken off campus since its initial pilot back in August. We battled through the inclement weather and set up our stall in the library’s ground floor foyer area then, poised with laptops and lollipops, set about recruiting people to take part in our tagging game.
We were keen to spread awareness of our work – and the university’s collections – beyond the confines of campus, and we were also interested to see if there was any noticeable difference between the sorts of tags that members of the public contributed in comparison to those provided by students, academics and staff (that analysis is yet to be done!). Over the course of the two hour session, 15 players provided us with 776 tags (more than 50 per person) – an impressive total considering there was no free coffee on offer for participants on this occasion!
These tags will now be moderated and then uploaded to our image database, ultimately helping to make our collections more discoverable online. You can see an example of how the tags we have harvested from the game have directly contributed to the improved description of one of our iconic items, Rashid al-Din’s History of the World (http://collections.ed.ac.uk/iconics/record/51419). The crowdsourced tags ‘horses’ and ‘knights’, which were harvested from the game, complement the existing formal descriptive metadata showing author, date, shelfmark, etc.
We enjoyed taking the game on tour: many thanks to Bronwen Brown, Fiona Myles and all the staff at Central Library for all their help with the event, and to all players who contributed their time to help us improve the description of our images.
On Friday we trialled the use of Bluetooth beacons in our exhibition space, using Google Glass and the Guidigo app to provide an immersive tour of the Something Blue exhibition. Beacons work by emitting a small Bluetooth signal which activates content installed on visitors’ mobile devices. Four of these were placed at locations throughout the exhibition and, when users came within range, music, videos and voice recordings relating to specific exhibits were activated on the Glass headsets.
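Proximity triggering of this kind typically rests on received signal strength (RSSI): the weaker the signal, the further away the beacon is assumed to be. A rough sketch of the standard log-distance path-loss model, with illustrative calibration values rather than figures from our setup:

```python
def estimate_distance(rssi, tx_power=-59, n=2.0):
    """Rough distance in metres from RSSI via the log-distance model.
    tx_power is the calibrated RSSI at 1 m; n is the path-loss exponent
    (~2 in free space, higher indoors). Values here are illustrative."""
    return 10 ** ((tx_power - rssi) / (10 * n))

def zone(rssi):
    """A simple near/far decision like an exhibition tour might use."""
    d = estimate_distance(rssi)
    return "immediate" if d < 0.5 else "near" if d < 3 else "far"

print(estimate_distance(-59))  # 1.0 m at the calibrated power
print(zone(-75))
```

The model's sensitivity to reflections and bodies absorbing the signal is exactly why beacons placed close together, or surrounded by a crowd, delivered erratic results.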
Users standing near the Blob 05 (Blue) exhibit, for example, were able to access an interview with Art Curator Neil Lebeter talking about the painting, while those in close proximity to the Vienna Horn could watch a video of Curator Sarah Deters playing the instrument.
There was a strong novelty factor as many people had not tried out Google Glass before, but on the whole it was felt that using the technology with the beacons in this way was an effective way of delivering content. The exhibition room is a relatively narrow space and because of space restrictions, some of the beacons were situated very close together. As a result, the signals from different beacons often interfered with each other, meaning content delivery was sometimes quite erratic. On more than one occasion someone standing next to one beacon received content from another one located several metres away on the other side of the room. As well as this, when too many people were standing close to a beacon the signal could be blocked or dulled.
In order to combat this for future sessions, it would be more effective to spread the beacons evenly throughout the space and have specific signs on the floor or walls saying something like “stand here to hear an interview with the curator”. Aside from these issues, the Google Glass worked really well: the Guidigo app overcame many of the well-known problems associated with the technology (poor battery life, overheating, and headaches) by putting Glass into sleep mode whenever the user was outside the beacons’ range. On the whole, it was an interesting experiment to take part in and we hope to have a more public trial of the technology at our next exhibition, so please do get in touch if you would like to be involved!
We are also exploring further ways of using beacons with other mobile devices to provide self-guided library tours: watch this space for further updates.
The aims of the sessions were to show off items from our art collection, get students and staff to try out the latest version of our metadata game, and to raise awareness of the importance, and ubiquity, of descriptive metadata, particularly for digital objects.
Players of the game were given laptops on which were displayed a series of digital images from the art collection. They were asked to ‘say what they saw’ by tagging these and then voting on the quality of other players’ tags. Points were awarded for the best descriptive tags, and the leaderboard was displayed on TV screens and projected onto the Holopro above the Main Library Helpdesk, thus creating a healthy sense of competition amongst players. If you took part, have a look at the leaderboard below to see how you did!
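The tag-and-vote scoring could be sketched roughly as follows; the point values and data shapes here are hypothetical illustrations, not the game's actual rules:

```python
from collections import defaultdict

def leaderboard(tags, votes, upvote_bonus=2):
    """tags: (player, image_id, tag) tuples entered by players;
    votes: (voter, image_id, tag, +1/-1) quality votes from other players.
    One point per tag, plus a bonus when a tag's net votes are positive."""
    net_votes = defaultdict(int)
    for voter, image_id, tag, value in votes:
        net_votes[(image_id, tag)] += value
    scores = defaultdict(int)
    for player, image_id, tag in tags:
        scores[player] += 1
        if net_votes[(image_id, tag)] > 0:
            scores[player] += upvote_bonus
    return sorted(scores.items(), key=lambda kv: -kv[1])

tags = [("ana", 1, "horse"), ("ben", 1, "knight"), ("ana", 2, "castle")]
votes = [("ben", 1, "horse", 1), ("cat", 1, "horse", 1), ("ana", 1, "knight", -1)]
print(leaderboard(tags, votes))  # [('ana', 4), ('ben', 1)]
```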
STUDENTS: TOP 10
STAFF: TOP 10
Alongside the game, we displayed original works of art next to their digital surrogates in order to contextualise the tagging game: the digital images in our collections are representations of physical items and the information and details that can be seen in the digital object is often quite different to that which can be seen in the original. For digital collections, it is important that items are tagged correctly so that they can be found both in search engine results and within the image database itself.
The sessions were well attended, with almost 3,500 tags entered by over 50 staff and students. The tags obtained from these sessions, once moderated, will be uploaded to our image database and used to improve the discoverability of our digital image collection. Why not play the game for yourself on the new Edinburgh Library Labs blog?
The Imperial War Museum ran an experiment to see how its First World War Galleries could be enhanced with the use of Google Glass and invited heritage professionals to try out the technology. Information Services at the University of Edinburgh have recently acquired a few sets of Google Glass and announced a competition to see how students could use it to improve their learning, so I was keen to see how it could be used in a heritage setting. The concept was actually very simple: a Glass ‘tour’ had been uploaded to the device and, whenever a wearer approached one of several beacons installed throughout the exhibition, the user was fed additional relevant content onto their Glass screen. For example, one of the exhibits was an early tank – when I came within range, a short 1916 propaganda film appeared on my screen describing how the new invention would “bring an end to the war”.
I felt the museum did a good job of providing enough additional content through the Glass to complement existing exhibits without overwhelming the user with too much additional information. The device was surprisingly comfortable and the screen wasn’t overly intrusive. This experiment showed that Google Glass can work in a museum setting: there is definitely scope for using it in one of the Library’s exhibition spaces to provide another dimension to showcasing our collections.
Digital Conversations at the British Library
There have been some fantastic initiatives recently in using heritage content as inspiration for video games – this event, part of the British Library’s Digital Conversations series, brought heritage professionals and games designers together for the formal launch of the 2015 ‘Off the Map’ competition for students to design games inspired by the BL’s collections. The theme for the competition, ‘Alice’s Adventures off the Map’, relates to next year’s 150th anniversary of the publication of Alice’s Adventures in Wonderland. The Library provides asset packs for games designers and facilitates access to original collections; the designers use these materials to create exciting and innovative computer games. Previous winners have included an underwater adventure through the long demolished, but now digitally restored, Fonthill Abbey, and a fully immersive 3D version of London from before the Great Fire of 1666.
There were also some really interesting talks at the event about the launch of the National Videogame Arcade in Nottingham, a discussion about how the V&A’s designer in residence built a successful mobile app using items from the museum’s collections, and a demonstration of how the British Museum used Minecraft to engage users with the building and its collections. The range of ideas on display gave food for thought – how can the University take inspiration from initiatives such as these to enhance access to and use of our own collections?
Earlier this week we ran a 1980s-inspired crowdsourcing ‘Metadata Game’ to enhance the description of items within our online image collection http://images.is.ed.ac.uk. The good folk of Library and University Collections rallied to the cause and ten hardy volunteers surrendered their lunch breaks to take part in the session, sustained only by coffee, biscuits and a will to succeed.
Each participant was provided with a series of random images and received one ‘crowdsourcing point’ for every person, object or location they tagged. Can you spot a familiar face in one of the randomly-generated images below?
Once the tagging was complete, the group then moderated the submitted terms and further points were awarded for approved tags. In the course of the session, 264 images were tagged a total of 631 times and, although time ran out during the moderation phase because of issues with load on the software, we learned a lot about the programme and received useful feedback for future sessions.
Norman Rodger from the Projects & Innovations team ran out the clear winner on this occasion, with 221 points, but he was pushed all the way by Sandi Phillips and Clare Button, who finished second and third with 155 and 146 points respectively.
We’ll be setting up the game again at the Pop Up Library during Freshers’ Week so come along and find us if you’re interested in having a go for yourself! Or if you’re anxious to have a shot now, please email email@example.com or firstname.lastname@example.org and we’d be happy to send you the link. Maybe you can even set a new high score…