Last month I was at the Citizen Cyberscience Summit in London. This is a yearly event that brings together scientists and IT experts from different areas in order to discuss and advance citizen science (public participation in scientific research).
Citizen science is something that’s really taken off in the last few years. The natural sciences seem to have benefited most from it so far, through wildlife reporting projects. Mass public participation via the internet has proven especially useful where large volumes of data need to be eyeball-checked: Snapshot Serengeti is one example, and Plankton Portal is a similar project in the marine sciences. One of the most successful mass participation projects involved classifying astronomical data.
Royal Geographical Society, where part of the summit was held (photo credit: _sarchi, flickr)
If the research doesn’t involve animals or outer space, researchers need other strategies to get the public involved. Gamification is a very popular approach – neuroscientists mapping the human brain and researchers developing quantum computing have both had success using online games as a way to get the public to participate.
Other talks discussed: citizen science in environmental activism; the importance of building and keeping a user base; the creation of citizen science apps for (Android) mobile phones; the difficulties of citizen science in extreme environments and areas of low literacy and connectivity.
Some of the more evangelical citizen science advocates believe the field should move beyond crowdsourcing data towards greater public involvement, giving citizen scientists more say in research objectives and outcomes (as one person put it, “if it’s just collecting data, it’s not citizen science, it’s just science”).
One negative was that there was almost no mention of *metadata* during the two days.
Organised by Integrated Earth Data Applications (IEDA) and Elsevier Research Data Services, the award was created to help improve preservation of and access to research data, particularly dark data (information that organisations collect, process and store during regular business activities but don’t use for other purposes). Participating organisations were encouraged to discuss the varied ways that these data are being processed, stored and used.
“The IMDIS series of Conferences promotes the meeting of different communities working in informatics, data management, research, environmental protection, etc. It is focused on online access to data, metadata and products, communication standards and adapted technology to ensure platforms interoperability. IMDIS 2013 aims at providing an overview of the existing information on marine environmental data, and showing the progresses on development of efficient infrastructures for managing large and diverse data sets.”
The poster discussed the production of Open Educational Resources (OERs) for historical sea level data from tide gauges and explained our motives for creating the resources.
“We wanted to raise awareness in the wider community of the existence of the new data and to help users understand the data, so we worked with three Ocean Sciences undergraduates to create OERs from the digitised and scanned resources. The students were able to incorporate their own learning styles and methods in the OERs, and use their experience to show us where observational data could be used in their courses. The OERs are documents which use the scanned and digitised data to explain and test understanding of an idea.”
Of using the data, the students said:
“Any time you use real data, it just feels more applicable… it’s real data so you’re doing more real experiments.”
“It makes the theory more real.”
Recently we began filming our end of project case study video. We decided to start by interviewing the students who worked on the Open Educational Resources about their experiences. We also spoke to their supervisor, Senior Lecturer Dr Harry Leach from the University of Liverpool. We asked him to tell us why he thought saving historic sea level data was important.
We put our newly developed filming skills to use – we used two cameras, a video camera and a digital single-lens reflex (DSLR), to capture the interviews from two different angles.
Our colleagues at the National Oceanography Centre had filmed in a room in our building recently and created a diagram for others to use, showing where to put the camera, lights, interviewer and interviewee in order to make the best use of the natural light. This made it quick and easy for us to set up. We had to deviate from their instructions slightly as they’d designed a setup for filming one person, while one of our interviews was with three people.
We’d planned our interview questions beforehand and managed to film about 15 minutes of footage, which should give us plenty to edit down into a five-minute video. The only problem we had was some very noisy seagulls outside, but I think they’ll lend a suitably maritime atmosphere to the piece!
We would like to announce a forthcoming workshop on Major Research Topics in Sea Level Science. It’s being held to commemorate the 80th anniversary of the Permanent Service for Mean Sea Level (PSMSL) and will take place at the University of Liverpool’s Victoria Gallery & Museum on the 28th and 29th of October. The workshop is open to all, but free registration is required by the 16th of September. The workshop will focus on aspects of the Intergovernmental Panel on Climate Change’s Fifth Assessment Report (Working Group I), but there will also be a session on Data Archaeology, including talks and a poster session, where we will highlight the work we carried out for this JISC project.
Victoria Gallery, Liverpool (photo credit: Ian-S, flickr)
This workshop precedes the Global Sea Level Observing System (GLOSS) Group of Experts (GE) meeting, where we will present a progress report on techniques for the digitisation of archived mareograms (tide level recordings).
This week I attended the Jisc Final Programme Meeting at the Jisc offices in London. The attendees were mainly from the Strand B Mass digitisation projects, but there were also representatives from the other strands there.
Discovery was a big topic of conversation, and we discussed the results of the Jisc discovery survey. We talked about the methods we each employed for data discovery, the strengths and weaknesses of those methods and also the mechanisms we used to measure the success of those methods.
It was nice to discover that a search for ‘historic UK sea levels’ returned our Jisc and BODC project pages as the top hits – something we need to build on.
There were presentations from some of the projects about what the future holds, and a couple of them mentioned crowdsourcing and citizen science. One in particular discussed the creation of generic crowdsourcing toolkits, which could be of use to us, so we will be contacting the speakers to see if they can offer advice.
Kelvin’s first tide-predicting machine, Science Museum
And finally, when I was in London, I had time to pop into the Science Museum and see Lord Kelvin’s first tide-predicting machine. It is a lovely bit of engineering and I would recommend popping in to have a look at it!
Today I attended a meeting of the Sea Level and Ocean Climate group at the National Oceanography Centre, Liverpool, where Professor Philip Woodworth gave a short talk about tide-predicting machines.
Prof Woodworth started with a brief history of tide-predicting machines. The concept was demonstrated by Sir William Thomson (later Lord Kelvin), but Prof Woodworth argued that Edward Roberts also deserves the title of ‘father of tide-predicting machines’. Roberts was a mechanical engineer who built many of the machines put into use. Only around 25 machines were ever built, and 20 of them were made in Britain. The very first machine is now in the Science Museum in London.
One of the more interesting aspects of the talk was a bit of detective work that Prof Woodworth and his colleagues had been involved in. I blogged recently about a film of a tide-predicting machine that we’d had digitised; this machine was initially thought to be the Roberts-type machine currently in storage in Liverpool. However, upon closer inspection (including carefully counting the gears) it turned out to be a mystery machine: the Roberts machine in Liverpool could analyse 42 tidal constituents, but the machine in the video had only 30 wheels. After a lot of questions and emails, Prof Woodworth discovered that the machine in the video was actually a Roberts-type machine built for the Soviet Union, which ended up in Moscow.
Other tide-predicting machines are preserved in museums around the world, including the biggest machine ever built, in the Deutsches Museum in Munich.
Prof Woodworth is now investigating where the other missing machines are and if they are still in working order. He has started to contact the various agencies around the world who were known to operate them. We hope to get the Roberts or Doodson-Légé machine back out on display in the future.
The photo shows a functioning model of a tide-predicting machine, built by engineers at NOCL to illustrate the principles of the machines. The model was running at the talk.
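The principle the model demonstrates can be sketched in a few lines of code: a tide-predicting machine mechanically sums a set of harmonic constituents, each a cosine wave with its own amplitude, speed and phase. In the sketch below the constituent speeds are the standard astronomical values, but the amplitudes, phases and mean level are invented for illustration – real values vary from port to port and are determined by analysing observations.

```python
import math

# Standard constituent speeds (degrees per hour); the amplitudes (m)
# and phases (degrees) here are invented for illustration only.
CONSTITUENTS = [
    ("M2", 1.20, 28.984104, 110.0),  # principal lunar semidiurnal
    ("S2", 0.40, 30.000000, 150.0),  # principal solar semidiurnal
    ("K1", 0.15, 15.041069, 75.0),   # lunisolar diurnal
    ("O1", 0.10, 13.943036, 60.0),   # principal lunar diurnal
]

def predict_tide(t_hours, mean_level=5.0):
    """Sum the harmonic constituents, as the machine's gears and pulleys did."""
    height = mean_level
    for name, amplitude, speed, phase in CONSTITUENTS:
        height += amplitude * math.cos(math.radians(speed * t_hours - phase))
    return height

# Predict a day of hourly tide heights
tides = [predict_tide(t) for t in range(24)]
```

Each wheel in the machine corresponds to one term in the loop; the Roberts machine in Liverpool summed 42 such terms, the mystery machine in the film only 30.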
This week, a few of us attended an introductory course on filming and editing at Futureworks in Manchester. We spent four days learning how to plan, shoot and edit a short film.
Day one was spent learning how our video camera works, how to shoot video using DSLR cameras and the basic principles of lighting. On the morning of day two we learned how to plan and storyboard a shoot; in the afternoon we went to Manchester Art Gallery (on location, as I believe it’s called!) to film a short piece about a couple of the current exhibitions there.
Day three was spent in the editing suite where we learned how to import our clips, rearrange them, edit them and create a timeline. Day four was all about adding the finishing touches, audio, effects, titles, etc.
Having just completed the course, I feel we can now produce a short film about our JISC project ourselves. We’re all bursting with ideas for exciting content. I’m hoping we can get an intervalometer so we can do some time-lapse filming…
The National Oceanography Centre, Liverpool, holds a weekly seminar every Wednesday during university term time. Though the lab specialises in shelf sea and sea level science, the seminar series is broad in scope, covering all aspects of oceanography, climate science and geophysical fluid dynamics. Seminars are attended by Ph.D. students, researchers and senior scientists from NOC and the University of Liverpool.
The Joseph Proudman Building, home of the National Oceanography Centre, Liverpool. Image credit: Rept0n1x (CC BY-SA 3.0)
Next week, I will be giving the seminar, the title of which is “Marine Data Management: Past, Present and Future”.
The first section of the talk will focus on data archaeology and I hope to explain the importance of recovering, quality controlling and distributing historic data. I’ll be discussing the JISC Sea level project as an example. It will also help raise awareness of this project within the lab and at other organisations and let people know there will be more data available soon.
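As an illustration of the kind of quality control involved (this is a sketch, not the project’s actual procedure), a very simple automated check flags isolated spikes – values that jump away from both neighbours by more than some threshold – so a human can inspect them:

```python
def flag_spikes(series, threshold=0.5):
    """Flag points that jump more than `threshold` away from both
    neighbours in the same direction -- a crude spike test; the
    threshold (in metres here) is chosen arbitrarily."""
    flags = [False] * len(series)
    for i in range(1, len(series) - 1):
        prev_jump = series[i] - series[i - 1]
        next_jump = series[i] - series[i + 1]
        if abs(prev_jump) > threshold and abs(next_jump) > threshold \
                and prev_jump * next_jump > 0:
            flags[i] = True
    return flags

# An isolated 1 m spike at index 2 in an otherwise smooth record
heights = [5.0, 5.1, 6.1, 5.2, 5.3]
print(flag_spikes(heights))  # prints [False, False, True, False, False]
```

Real quality control of historic records involves much more than this, of course – checks against neighbouring stations, datum continuity and so on – but automated flagging of this sort is a common first pass.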
Last week I attended the European Geosciences Union (EGU) General Assembly in Vienna. We had a poster in the “Climate: Past, Present, Future” session. The session was very well attended and I fielded a number of enquiries; quite a few people told me how difficult it is to get things digitised and congratulated me on doing this work. Which was nice!
There was also a poster in our session, “Global and regional sea level change since 1900” by Jens Schröter and Manfred Wenzel, which discussed analysing historic tide gauge records to look at global mean sea level change.
And, once I’d finished work, there was time for a spot of cake…