
OPEC Data Visualisation Portal

Update 27/04/2014: You can now find the project in its generic form as GISPortal on GitHub.

As many of you already know, I started working at Plymouth Marine Laboratory (PML) in July 2013. As I’ve now been here for about 6 months, I’ve decided it is time to write about the project I’m working on and to ask for some feedback, so please feel free to ask any questions you may have and share your opinions.

We are working on an open source web-based visualisation portal for marine data. The data (mostly generated by models) is produced by PML and other partners in the Operational Ecology (OPEC) project, which is being funded by the European Union. I joined halfway through development of the portal, which is due to be finished by the end of 2014.

The aim of the portal is to make the data accessible to a wide range of people from other scientists to the general public, with a focus on decision makers. We could just ask them to download complicated GIS software and send them massive files full of data, but that is pretty much useless! This way, we can just direct people interested in the data to a website (or web-app if you prefer) that will hopefully ‘just work’.

As I am sure you can imagine, this is quite a challenge. On one hand it needs to be so simple that anyone can go to the URL and produce something. But on the other, it needs to be advanced enough for it to be used by scientists that want to delve deep into the data. Oh, and at the same time we need to take massive amounts of data and make it small enough to be used on the web.

OPEC Portal

As the data is all geographic because it is about the sea, it makes sense that the main interface is a map. We use Open Layers 2 to manage the map, as it can handle both raster and vector layers as well as making it easy to convert from latitude and longitude to pixel coordinates and vice versa.
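As a rough illustration of the conversion Open Layers handles for us, the Web Mercator maths behind it can be sketched in plain JavaScript (the function names here are my own, not part of the Open Layers API):

```javascript
// Sketch of the Web Mercator projection maths that Open Layers 2
// performs internally when converting between longitude/latitude and
// pixel coordinates. Function names are illustrative, not OL2 API.
var TILE_SIZE = 256;

// Convert longitude/latitude (degrees) to pixel coordinates at a zoom level.
function lonLatToPixel(lon, lat, zoom) {
  var scale = TILE_SIZE * Math.pow(2, zoom);
  var x = (lon + 180) / 360 * scale;
  var sinLat = Math.sin(lat * Math.PI / 180);
  var y = (0.5 - Math.log((1 + sinLat) / (1 - sinLat)) / (4 * Math.PI)) * scale;
  return { x: x, y: y };
}

// Inverse: pixel coordinates back to longitude/latitude.
function pixelToLonLat(x, y, zoom) {
  var scale = TILE_SIZE * Math.pow(2, zoom);
  var lon = x / scale * 360 - 180;
  var n = Math.PI * (1 - 2 * y / scale);
  var lat = 180 / Math.PI * Math.atan(Math.sinh(n));
  return { lon: lon, lat: lat };
}

// At zoom 0 the whole world is a single 256px tile, so (0°, 0°)
// lands at the tile centre.
console.log(lonLatToPixel(0, 0, 0)); // { x: 128, y: 128 }
```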

One downside to using Open Layers 2 is that it renders each layer as a grid of 256px-square <img> tile elements. If it were a simple mapping application this wouldn’t be a problem, but as soon as you add multiple layers of data (each layer having its own grid covering the entire viewport) and start manipulating the map (for example, zooming), the number of elements soon adds up and drastically affects performance. In an ideal world we would be using Open Layers 3, which utilises hardware-accelerated technologies such as WebGL, but unfortunately that is still in beta and will not be stable for quite some time.

Okay, so we have a map… how do we add layers to it? As I said earlier, it needs to be easy for the general public to use. Seeing layers appear on the map is often quite enjoyable, especially for people who haven’t used the portal before. So we have a jQuery UI dialog box that appears straight away and lets the user choose which variables (such as Chlorophyll or Nitrate) they would like to add as layers to the map. Because there are so many different variables, there is a filter on the left that should make it much easier both for an experienced scientist to find the exact variable they are looking for and for somebody who just knows they want to see something from a particular region.
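The filtering itself is conceptually simple. A minimal sketch (the records and field names here are invented for illustration; the portal’s real data model is richer than this):

```javascript
// Minimal sketch of filtering a variable list by free-text search and
// region. The records and field names are invented for illustration;
// the portal's real data model differs.
var variables = [
  { name: 'Chlorophyll', region: 'North Sea' },
  { name: 'Nitrate', region: 'North Sea' },
  { name: 'Chlorophyll', region: 'Black Sea' }
];

function filterVariables(list, searchText, region) {
  var needle = searchText.toLowerCase();
  return list.filter(function (v) {
    var matchesText = v.name.toLowerCase().indexOf(needle) !== -1;
    var matchesRegion = !region || v.region === region;
    return matchesText && matchesRegion;
  });
}

console.log(filterVariables(variables, 'chlor', 'North Sea'));
// [ { name: 'Chlorophyll', region: 'North Sea' } ]
```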

I am not a fan of jQuery UI in general, for a number of reasons. It does not have great performance, and the dialog boxes currently use a default theme that is obvious to anybody who has used it before. We are in the process of looking for a designer, who will hopefully help improve the look and feel of the dialog boxes or perhaps remove them entirely.

After testing the variable selector, it soon became clear that a lot of the users had no idea what the data was; this was partially due to inconsistent scientific names, so we renamed all of the variables to be easier to understand. We also slightly changed how a variable is added. It used to have a click handler on the variable to show the description and a (+) icon to add it to the map. We changed this, subtly, so that clicking the variable name adds it to the map (which makes more sense) and there is now an arrow indicating more information. Once the arrow is clicked, it shows not just the description but also the region of the data, the latitude and longitude, the date range of the data and any other applicable information. I’m not sure it is quite obvious enough, though, that the arrow is clickable.

Once a variable has been added to the map, it appears in the Layers panel at the top left. Currently all variables are models. The difference between model and reference layers is basically that models are raster (areas of data on the map) and reference layers are vector (such as the path of a ship).

Currently there is a context menu, indicated by an arrow, that lets you manipulate the layer, such as changing its opacity or the colour scheme it uses. There is also the option to zoom to the data, which is a very useful tool when dealing with a lot of variables. We are going to need to move these at some point, rather than hiding useful features away in a drop-down menu!

Another slightly hidden feature is that you can drag the variables in the Layers panel to determine the order in which they are layered on the map. This is particularly useful when combined with opacity, or simply to change which layer is on top.
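Under the hood, reordering comes down to moving a layer within an ordered list and redrawing in the new order (Open Layers 2 exposes this through map.setLayerIndex). A self-contained sketch of the list move itself:

```javascript
// Sketch of the drag-to-reorder logic: move a layer from one index to
// another, and the map then draws the layers in the resulting order
// (in Open Layers 2 the actual reordering uses map.setLayerIndex).
function moveLayer(layers, fromIndex, toIndex) {
  var copy = layers.slice();               // don't mutate the original
  var moved = copy.splice(fromIndex, 1)[0]; // take the dragged layer out
  copy.splice(toIndex, 0, moved);           // put it back at the drop point
  return copy;
}

var order = ['Chlorophyll', 'Nitrate', 'Temperature'];
console.log(moveLayer(order, 2, 0));
// [ 'Temperature', 'Chlorophyll', 'Nitrate' ]
```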

Quite separate to the map is the graphing tools, found in the Analysis panel on the top right. You will see that there are currently four graphs:

  • Timeseries
  • Histogram
  • Latitude Hovmöller
  • Longitude Hovmöller

The graphing feature is overdue a large update: currently the timeseries uses Flotr2 while the others use D3. The plan is to move them all to D3 and use D3.Chart as a structure, so that there can be some inheritance rather than duplicating code to produce similar graphs. Because the portal will be used by scientists, it needs to produce scalable graphs for publication in a variety of places, including journals and posters. This sounds easy, as D3 is very good at producing SVGs. However, some graphs, such as the Hovmöllers, are not well suited to SVG. The current implementation uses SVG rectangles as if they were pixels… which does scale but is extremely inefficient, since every SVG rectangle is its own DOM element. This needs to change and we are looking into various solutions; it will probably end up being a raster format, or perhaps SVG but displayed as raster on a canvas for the web.
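To give an idea of the raster alternative, here is a sketch of filling a canvas-style RGBA pixel buffer straight from a data grid, one cell per pixel instead of one DOM element per cell (the greyscale ramp is purely illustrative; the portal’s real colour palettes differ):

```javascript
// Sketch of rasterising a Hovmöller-style data grid: instead of one SVG
// <rect> per cell, write each cell straight into an RGBA pixel buffer,
// the same layout as a canvas ImageData. The greyscale ramp is just
// for illustration.
function gridToRGBA(grid, min, max) {
  var height = grid.length;
  var width = grid[0].length;
  var pixels = new Uint8ClampedArray(width * height * 4);
  for (var y = 0; y < height; y++) {
    for (var x = 0; x < width; x++) {
      var t = (grid[y][x] - min) / (max - min); // normalise to 0..1
      var value = Math.round(t * 255);
      var i = (y * width + x) * 4;
      pixels[i] = value;     // red
      pixels[i + 1] = value; // green
      pixels[i + 2] = value; // blue
      pixels[i + 3] = 255;   // fully opaque
    }
  }
  return pixels;
}

// A 1x2 grid: min maps to black (0,0,0,255), max to white (255,255,255,255).
console.log(gridToRGBA([[0, 10]], 0, 10));
```

In the browser, the resulting buffer can be drawn with putImageData on a canvas 2D context, which is vastly cheaper than thousands of rectangle elements.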

Another useful but somewhat hidden feature is the ability to create range bars to indicate time ranges. Currently this is only used within the analysis tools (graphing and data export), so it sits under the heading of Area and Time of Interest. It has four buttons and a select box, which seems over the top, and I would like to simplify it at some point. Once you’ve created a range, it appears on the timeline at the bottom of the screen (along with any data layers that are already selected). You can then drag from one point to another along the new bar to create the range. The selected range populates the date fields, or for more precise control you can use the date picker by clicking on the fields.
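The drag-to-select interaction essentially boils down to linearly mapping pixel positions along the timeline onto dates. A simplified sketch (the portal’s actual timeline code is considerably more involved):

```javascript
// Sketch of converting a pixel offset along the timeline into a date by
// linear interpolation between the timeline's start and end dates.
// Much simplified compared to the portal's real timeline code.
function pixelToDate(px, timelineWidth, startDate, endDate) {
  var fraction = px / timelineWidth;
  var ms = startDate.getTime() +
           fraction * (endDate.getTime() - startDate.getTime());
  return new Date(ms);
}

var start = new Date('2013-01-01T00:00:00Z');
var end = new Date('2014-01-01T00:00:00Z');
// Dragging to the midpoint of a 1000px timeline selects mid-2013.
console.log(pixelToDate(500, 1000, start, end).toISOString());
```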

Finally, for the front end at least, there is functionality to save the state of the portal. You can log in with either Google or Yahoo (these are temporary choices; it uses OpenID so the sky’s the limit) by clicking the first icon on the right of the top bar. Once logged in, you are able to simply click save to get a URL of the current state. This stores the majority (if not all) of the configuration changes that have been made, such as which layers are being shown on the map and the range bars on the timeline. A feature that I’ve been working on, but not quite finished, is a history window that stores all of the states you’ve previously saved, as well as every graph that has been generated. This makes it especially useful to scientists who need more than just one session of looking at the data. It also makes it very easy to share with other people (a feature we’ve been using quite heavily for debugging on different machines).
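Conceptually, saving the state is just serialising the current configuration to JSON so it can be stored and handed back as a URL. A rough sketch (the shape of the state object and the function names are invented for illustration; the real portal stores rather more configuration than this):

```javascript
// Rough sketch of saving and restoring portal state. The state object's
// shape and the function names are invented for illustration; the real
// portal stores far more configuration than this.
function saveState(state) {
  return JSON.stringify({
    layers: state.layers,       // which variables are on the map, in order
    opacity: state.opacity,     // per-layer opacity settings
    timeRange: state.timeRange  // the selected range bar, as ISO strings
  });
}

function restoreState(json) {
  return JSON.parse(json);
}

var state = {
  layers: ['Chlorophyll', 'Nitrate'],
  opacity: { Chlorophyll: 0.5 },
  timeRange: ['2013-01-01', '2013-06-30']
};
var restored = restoreState(saveState(state));
console.log(restored.layers); // [ 'Chlorophyll', 'Nitrate' ]
```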

15 Responses to “OPEC Data Visualisation Portal”

  1. Alex Jegtnes says:

    @ShaneHudson Good writeup. :) Seems like a difficult but satisfying challenge!

    — via twitter.com

  2. Shane Hudson says:

    @jegtnes Yeah it is an interesting one :) Thanks for reading and the retweet!

    — via twitter.com

  3. @ShaneHudson @PlymouthMarine @UniKentComp Amazing what you can achieve in 6 months – 6 months left! ……

    — via twitter.com

  4. Juan Antonio Bermejo says:

    Exelent work!

  5. Felipe says:

    Nice work!, one question, which is the technology that you are using to publish the model outputs? Thredds, geoserver, Mapserver? Thanks!

    • ShaneHudson says:

      We are using Thredds for the PML data but it uses WMS and WCS so the other data providers can use any server they wish. I plan to write another article soon explaining all of the data server and middleware, since this mostly focussed on the front end interface.

  6. Angel Cruz says:

    Sounds complex but really practical, good job. Is it possible to work and operate with grid layers in Open Layers 2?