OKCon 2010 thoughts

A brief post with a few thoughts about the OKCon conference in London last Saturday.

It was a wonderful meeting, and I came away with the feeling that the Open Knowledge Foundation is now a real power in the world of making information and resources available to everyone. It was well attended and the sessions were exciting and varied. The plenary opened with a State of the Nation talk led by Rufus, which has already been recorded and posted on Vimeo (http://vimeo.com/11220474; thanks to Jo Walsh).

There were presentations (see my previous blog post) on a wide variety of topics. In my own presentation I tried to show how science is critical to major decisions in the current world, such as on climate change. Access to data is essential, yet it is frequently difficult to know what data exist, or to obtain them even when their existence is known. For example, the IPCC copyrights its publications and requires formal permission to reproduce any material. This is unacceptable in a world where we increasingly require machines to discover information and bring it back to us. These machines cannot, and should not, be required to understand legal niceties, so the only reasonable way forward for the semantic web is for all public information to be categorically open. I reported on the work done by the OKF and Science Commons on creating protocols and licences for ensuring that scientific data are dedicated to the public domain and appropriately licensed. I believe that the OKF's support of the Panton Principles is an important milestone in open science.

There were many exciting presentations, but two that stood out were the work on Clear Climate Code and on OpenStreetMap. The presentation on climate code dealt with the FORTRAN that was used to generate the hockey stick graph. Some critics had claimed that this FORTRAN could not be compiled and was of such low quality that it could not have been used. The presenters showed that they could in fact compile the FORTRAN and reproduce the graph pretty well. Several other groups have written similar programs and generated very similar curves, which shows that the calculation is reproducible. They are now campaigning for all code used in this endeavour to be clean and public. Our group takes a similar view of chemoinformatics software.
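
To make concrete the kind of thing "clean" code means here, the following is a hypothetical Python sketch, entirely my own invention rather than the project's actual source, of the sort of calculation involved: annual temperature anomalies computed against a baseline mean.

    # Hypothetical illustration only, not the Clear Climate Code source.
    # Computes annual temperature anomalies relative to a baseline period,
    # the kind of calculation that underlies surface-temperature records.

    def baseline_mean(temps_by_year, start, end):
        """Mean temperature over the baseline years start..end inclusive."""
        baseline = [t for year, t in temps_by_year.items() if start <= year <= end]
        return sum(baseline) / len(baseline)

    def anomalies(temps_by_year, start, end):
        """Departure of each year's temperature from the baseline mean."""
        base = baseline_mean(temps_by_year, start, end)
        return {year: t - base for year, t in sorted(temps_by_year.items())}

    # Toy data: year -> mean temperature in degrees Celsius.
    temps = {1950: 13.9, 1960: 14.0, 1970: 14.0, 1980: 14.1, 1990: 14.3}
    for year, anomaly in anomalies(temps, 1950, 1980).items():
        print(year, round(anomaly, 2))

When code is this readable anyone can check the calculation for themselves, which is exactly the point of the campaign.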

The work by OpenStreetMap was also impressive. When the earthquake struck Haiti the existing maps were extremely poor and it was difficult for the rescue services to know where the roads had been. However, various companies and organisations made satellite images available, and in a remarkably short time volunteers created high-quality maps of Haiti, allowing the rescue services to know where buildings had been and how to get to them. In fact the maps of Haiti are now superior to those available before the earthquake. This is a major credit to OSM, which has reached this position in only about five years from its early beginnings mapping the streets of London from bicycle-courier GPS traces.

We finished with a session in the bar where we discussed what needs to be done to make climate-change data available and the computations on them more accessible. We agreed to set up a working party; Jonathan Gray at the OKF is the contact for anyone who wishes to explore how we can help this process.

I am proud to be a member of the OKF's advisory board and congratulate all those who organised the meeting, especially Rufus, Sara and Jo.

(In the dictation I am like Don Marquis' archy the cockroach, who cannot reach the shift keys of the typewriter. I shall learn.)


One Response to OKCon 2010 thoughts

  1. Nick Barnes says:

    Thanks for the Clear Climate Code mention, Peter. To enlarge a little on what you wrote about us:
    – We took the FORTRAN for GISTEMP – the GISS surface temperature record from 1880 to the present – not for “the hockey stick” (the reconstruction of surface temperature from proxies over the last 1000+ years, by Mann, Bradley, and Hughes 1998). They are related graphs.
    – As you write, we compiled and ran that FORTRAN, and matched the GISS results. But our real work began after that, when we started replacing the FORTRAN with equivalent newly-written Python. None of the original FORTRAN remains – our whole body of code is fresh Python, written with the specific goal of clarity, and we still match the GISS results.
    – Our goal is to encourage climate science software to be clear, so that the public has more faith in climate science results.
    It was interesting discussing this at OKCon, where a major focus is on open data. Our observation is that, in some fields, open data, and even open code, are not enough. The data relevant to GISTEMP is all open, and the code was published in full in 2007, but that did not diminish the doubt cast on the GISTEMP results.
