Hack The Museum

Early on a bright Tuesday morning, around 60 coders, designers, creators and writers arrived at the hackathon hosted by the Science Museum, ready to make brilliant things happen.

A hackathon can be a fascinating experience for everyone involved. These events can be anything from manic, chaotic and thoughtful to ingenious and brilliant, and it depends entirely on the people in the room.

We were a mix of people of all ages, with a variety of skills and from different backgrounds and companies, joining together to find creative uses of the Science Museum’s open API and showcase the collection in a unique way.

Challenge 

“What would you do if the Science Museum and its data were your creative playground for two days?”

The first half hour involved forming teams. It wasn’t long before I found myself around a table with four lovely humans. Sam and Olivia are both PhD candidates at the Royal College of Art (researching interactive data visualisation), Céline is a digital artist and designer, and Manchi leads digital education projects at the Institute of Physics.

On the first morning we started sketching ideas based on our skills, resources and thoughts about what we could do with the Science Museum’s open API, which gave us access to all of the images and data from the Museum’s collection.

Our team decided to go for a very creative and visual approach to answering a question neatly posed by Manchi:

“What do you do when you have over 20,000 objects/images stuck in a database and no way to explore the magnitude of the data beyond a conventional – and at times scary – Search Box of Doom?”

We settled on the idea of a visual and creative discovery installation that would make use of all of the images in the Science Museum’s collection. Our plan was to pull images from the collection at random and project them so that they fell down the screen like a beautiful waterfall. The waterfall would have a timeline at its source, with objects towards the left of the fall coming from further in the past and objects towards the right from more recent times.

That afternoon and the next day became a blur of coffee, biscuits and sweets. Our desk transformed into a chaotic mess of scribbled notes on paper, drawings and sketches, laptop cables and chargers. There were surreal chats about timeline waterfall positions, image physics and GitHub stickers, whilst beautiful visual backgrounds, animations and music were made to further enhance our creation.

Midway through day two, the Science Museum gave us all the toys we’d need to put our installation together: a mighty projector, a screen, some speakers and a million cables. Sam and Olivia frantically worked through to the final few seconds, until finally it was time for every team to demonstrate their creations.

Solution

Our project queries the API every tenth of a second and randomly snatches an object from the collection… mysteriously manifesting it upon our waterfall. Each object is liberated from the collection. A simple timeline straddles the top of the fall indicating the era of each object’s apparition.
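For anyone curious about the mechanics, here is a rough sketch of how such a loop could look in the browser. Everything specific (the endpoint URL, the response shape and the timeline range) is a placeholder for illustration, not the real Science Museum API:

    // Minimal sketch of the waterfall loop. API_URL and the response shape
    // ({ imageUrl, year }) are placeholders, not the real Science Museum API.
    const API_URL = "https://example.org/collection/random-object";

    interface FallingObject {
      img: HTMLImageElement;
      x: number; // horizontal position, derived from the object's date
      y: number; // vertical position, increases each frame as the object falls
    }

    const TIMELINE_START = 1750;                   // assumed left edge of the timeline
    const TIMELINE_END = new Date().getFullYear(); // right edge: the present day
    const canvas = document.querySelector("canvas")!;
    const ctx = canvas.getContext("2d")!;
    const falling: FallingObject[] = [];
    let fallSpeed = 4; // pixels per frame

    // Every tenth of a second, fetch a random object and drop it onto the
    // waterfall at the point on the timeline matching its date
    // (older objects further left, newer objects further right).
    setInterval(async () => {
      const res = await fetch(API_URL);
      const obj = await res.json(); // assumed shape: { imageUrl: string, year: number }
      const img = new Image();
      img.src = obj.imageUrl;
      const t = (obj.year - TIMELINE_START) / (TIMELINE_END - TIMELINE_START);
      falling.push({ img, x: t * canvas.width, y: 0 });
    }, 100);

    // Render loop: move each image down the waterfall and redraw the frame.
    function render() {
      ctx.clearRect(0, 0, canvas.width, canvas.height);
      for (const f of falling) {
        f.y += fallSpeed;
        ctx.drawImage(f.img, f.x, f.y, 80, 80);
      }
      requestAnimationFrame(render);
    }
    render();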

A mesmerising experience, this visualisation of the collection enables watchers to serendipitously encounter objects from across the whole swathe of human endeavour.

The idealised version of the waterfall would have the following additional features:

  • Greater user interaction: the user could mouse over the waterfall, slowing down time so that they could click on an object, and a pop-up would appear with further details of the object (we were quite close here – maybe 30 minutes away from this being a reality!). A rough sketch of this interaction follows this list.
  • The waterfall imagery would move more dynamically.
  • Objects would be drawn from other online collections too, such as the V&A and the Natural History Museum.
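As a rough sketch of that mouse-over idea, building on the hypothetical waterfall loop above, hovering could simply lower the fall speed and a click could test which image sits under the cursor:

    // Hypothetical extension of the earlier waterfall sketch: hovering over
    // the canvas slows the fall, and clicking an image would open its details.
    canvas.addEventListener("mouseenter", () => { fallSpeed = 0.5; }); // slow down time
    canvas.addEventListener("mouseleave", () => { fallSpeed = 4; });   // back to normal

    canvas.addEventListener("click", (e) => {
      // Find the image under the cursor (images are drawn at 80 x 80 pixels).
      const hit = falling.find(f =>
        e.offsetX >= f.x && e.offsetX <= f.x + 80 &&
        e.offsetY >= f.y && e.offsetY <= f.y + 80);
      if (hit) {
        // In the idealised version this would open a pop-up with the object's
        // full details fetched from the API; here we just log the hit.
        console.log("Clicked falling object", hit);
      }
    });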

Impact

#Waterfall won the “Creative Prize” for most artistic, beautiful, and creative use of the Science Museum’s open API.

#museumegg won the prize “The Next Big Thing”. Their Museum Egg prototype identifies your favourite objects (by dwell time) so you can print out info about them at the end of a visit.

#scimu9000 won the “Punk Prize”. Their voice interface robot answers questions about objects at the Science Museum.

Later that evening we demonstrated all of the creations to the public at the Science Museum’s Lates event. Hundreds of curious people came to chat about what we’d made and why. When people realised that the Science Museum’s API is open to the public and that everyone has access to this content, they were often impressed and excited by the possibilities.

Here’s to a successful two days! It was an absolute pleasure to “Hack The Museum”.

Greg Felton
Media Editor and Music Lead at Amphio