Friday 6 November 2009

Support for Labels and GDAL Rasters in SharpMap.Wpf

Having finally got around to doing some more work on SharpMap.Wpf, I have at last managed to implement generic versions of both the LabelLayer and the GdalRasterLayer. As with VectorLayer, a large body of the code is the same as in the GDI+ specific versions; however, they both derive from the new SharpMap.Wpf.Layers.Layer base class, which uses an IRenderContext in its overridable Render method instead of a GDI+ Graphics object.

There are still a few minor issues to iron out. I haven't gotten around to supporting label rotation yet (although this should not be very difficult) and GdalRasterLayer does not yet do anything with the TransparentColor property. Still, nearly there! Just need a better demo project now.

Some screenshots below:

Tuesday 22 September 2009

SharpMap.Wpf

I've been wanting to write a WPF renderer for SharpMap ever since I started using version 0.9 about a year ago. Sure, SharpMap v2 has a few early WPF implementations, but it's been in beta for quite some time now.

Our existing WPF/SharpMap solution involves using the plain old GDI+ stuff and blatting the entire map into a WPF image using an InteropBitmap. It does the job quite well, so why bother doing anything more?

Well, mainly because I wanted to! But also because I like the idea of having true WPF styles and themes. The improved brushes in WPF mean that there is scope for some really pretty maps. Also, the idea of using several WPF images to host transparent layers and animated overlays is quite appealing.


So far, I've got a WPF map renderer up and running for vector layers only. I'm currently working on WPFing the LabelLayer and the GdalRasterLayer. Maybe, when it's getting to a semi-finished state, I'll try and host it on CodePlex. I guess it all depends on whether anyone is still interested in 0.9 by then!?
A few interesting points on the current state of play:
  • At first I started to write a dedicated WpfVectorLayer and a corresponding WpfVectorRenderer, but I just couldn't live with the code smell. There was too much duplication. In the end, I refactored the VectorLayer into a new generic class with a templatable style and introduced a new Render method that takes an IRenderContext instead of a Graphics object. I now have two concrete implementations of IRenderContext, one for GDI+ and one for WPF. As a result, you can declare concrete instances of the generic VectorLayer that use either the standard VectorStyle or the new WpfVectorStyle, depending upon the type of renderer you wish to use.

  • Due to the departure from the original design, all the new stuff is in a separate namespace. The 'legacy' VectorLayer has been left unchanged (yeah, so I've still ended up duplicating code, but at least moving forward you need only implement an IRenderContext).

  • The WPF renderer uses the stream geometry classes to draw straight to a WPF DrawingContext. In addition to this, it provides methods to render the entire map, or just a single layer, to a Visual or a BitmapSource.
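The shape of that refactoring can be sketched roughly as follows. This is illustrative Python rather than the actual C# (all names here stand in for the SharpMap types described above; none of it is the real API):

```python
from abc import ABC, abstractmethod

class RenderContext(ABC):
    """Abstracts the drawing surface so a layer need not know about GDI+ or WPF."""
    @abstractmethod
    def draw_polygon(self, points, style):
        ...

class GdiRenderContext(RenderContext):
    """Stand-in for the GDI+ implementation; just records its draw calls."""
    def __init__(self):
        self.calls = []
    def draw_polygon(self, points, style):
        self.calls.append(("gdi", points, style))

class WpfRenderContext(RenderContext):
    """Stand-in for the WPF implementation."""
    def __init__(self):
        self.calls = []
    def draw_polygon(self, points, style):
        self.calls.append(("wpf", points, style))

class VectorLayer:
    """One generic layer: the style and the context passed to render()
    decide which backend does the actual drawing."""
    def __init__(self, features, style):
        self.features = features
        self.style = style
    def render(self, context):
        for feature in self.features:
            context.draw_polygon(feature, self.style)
```

The point of the design is that the same layer instance renders through either context; only the context object changes.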
Here is my SharpMap.Wpf 'hello world':

Wednesday 9 September 2009

GdalRasterLayer and Ordnance Survey TIFF Files

..or any other raster files that use a colour palette.

So, having found out that you can use SharpMap's GdalRasterLayer to load and display virtual datasets, I quickly set about trying to get it working with the TIFF files provided by Ordnance Survey.

However, when I actually tried to view the data I was presented with a big black nothing.


Turns out there is...not a bug exactly, but more of an 'omission' in the GdalRasterLayer code. At the time of writing, both the 0.9 release and the current trunk build of SharpMap contain a version of GdalRasterLayer that cannot handle images that use colour palettes. And yes, all the raster data from Ordnance Survey (or at least, all the TIFF files) makes use of them.


I eventually came across a fix in this post. However, the patch was based on an old version of GdalRasterLayer and the code had moved on significantly. After a bit of head scratching I gave in and tried my hand at porting across the changes.

Now, the code is not beautiful. It doesn't seem to be a perfect fit. Having said that, I did not want to change the original code too drastically and so it is what it is...and it even seems to work!


Anyone who is interested can pick it up from here or download it from the original post here.

Monday 7 September 2009

Raster index datasets in SharpMap

In my previous post I alluded to the fact that it is possible to use SharpMap 'out of the box' to render raster index datasets (tiles). About a year ago I remember trying to get SharpMap to do just that, and came to the conclusion that unless I wanted to use some kind of web-based tile server, I would have to invest a lot of time creating a new type of layer, perhaps based loosely on the TiledWmsLayer, but specifically for loading large raster tile datasets from disk.

Turns out I didn't need to do any of that.

GdalRasterLayer supports many different raster formats. One that I was unfamiliar with until recently was GDAL's own virtual dataset format (VRT). This is basically an XML file that, among other things, can pull together several raster files and treat them as a single dataset. You can build one of these very easily by doing something like:
gdalbuildvrt rasterIndex.vrt *.tif
Now, because vrt files are just another format supported by gdal, you can load them straight in using the GdalRasterLayer like so:
GdalRasterLayer layer = new GdalRasterLayer("MyRasterIndex", "rasterIndex.vrt");
And because the GdalRasterLayer is pretty smart at what it does, it will only read the pixels that it needs, meaning it's pretty efficient in terms of its speed and memory usage.

Wednesday 2 September 2009

Processing Ordnance Survey Rasters with GDAL

Ordnance Survey produce raster maps of the UK at several resolutions. They are provided as a set of tiles, geographically chunking the whole of the UK according to the British National Grid. As with most things Ordnance Survey, they seem to go out of their way to make them as hard to use in a standard GIS application as possible.

The set of raster tiles for the UK at 1:10k comes in at about 26GB. A whopping 10,000+ tiles. They do not come with any geo-referencing and are not made available in any format that includes this information (i.e. GeoTIFF). Instead, they are provided as TIFF, BMP or JPEG files, with the option to download the geo-reference files (TFW etc.) separately. To add insult to injury, it would seem that, in trying to save disk space, instead of a standard 3-band RGB TIFF file, what you actually get is a single-band TIFF file with a colour palette. It's not so bad really, just another minor annoyance.

As I have mentioned before, we use a SharpMap based WPF control to render our data. This makes rendering OS tiles non-trivial for the following reasons:

  1. There is a bug in GdalRasterLayer which prevents it from rendering tif files that make use of colour palettes
  2. There is no dedicated layer type for rendering a raster tile index/set unless you want to serve them up over a WMS

Now, I was actually able to fix the problem in (1) and because of this (and with a better understanding of GDAL) I was able to render a raster tile set using nothing but a single GdalRasterLayer. However, maybe that's something I'll come back to in another post. For now, I want to talk about our original solution, which was to convert the tiles to ECW's using GDAL.

For 1:250k, 1:50k and 1:25k we found it was possible to create a single ECW for the whole of the UK. For 1:10k, the huge number of tiles involved really put a strain on the memory. I don't know whether it is achievable or not, but one thing is for sure... it would take a very, very long time (somewhere in the region of 1-2 weeks of solid processing). In the end, for the 1:10k stuff, we decided to create a single ECW for each top-level square of the British National Grid, giving us something like 50 ECWs in all. The time taken to process each of these regions varied between a few minutes and a few hours.

So, here are the steps involved in converting a set of Ordnance Survey rasters into a single ECW:

  1. Copy all the tiles at a particular resolution (say 1:250k) into their own directory, e.g. C:\250k\
  2. Download the geo-reference files from Ordnance Survey and copy them into the same directory as (1)
  3. Using the FWTools shell, cd to the directory containing the raster tiles and the geo-reference files.
  4. Build a virtual raster file (an XML file) of the entire set using:
    gdalbuildvrt 250k.vrt *.tif
  5. Use gdal_translate on the virtual raster file to convert the entire set of tiles into a single ECW:
    gdal_translate -of ECW -co LARGE_OK=YES -expand rgb -a_srs EPSG:27700 250k.vrt 250k.ecw
Issues, workarounds and a handy bit of C#

Generally speaking, this worked well. The only real problem I faced was when dealing with the 1:10k national set, when trying to create a virtual raster of a top-level grid square with a command like:
gdalbuildvrt SU.vrt SU*.tif
the command would occasionally fail. This would happen if the tiles in question had differing colour palettes or a differing number of bands.


The solution was to introduce an intermediate stage whereby the source rasters were 'normalised' to 3 band rgb tif files. I achieved this by using the following command on each file:
gdal_translate -co COMPRESS=DEFLATE -co TFW=YES -expand rgb HP60NE.tif Expanded\HP60NE.tif
Then, when it came to making the final ECW the -expand option had to be left out, due to the palette expansion having already happened in the previous stage.

I wrapped all of this up into a neat little C# program (mainly because I couldn't be bothered to work out the JavaScript or batch file commands) that checked the integrity of the files, 'normalised' their palettes if necessary, built the VRT and, finally, converted the raster tiles into a single ECW file, logging all the results to a separate file to record timings and errors.
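The overall workflow that program automates can be sketched like this (in Python rather than C#, and only building the command lines rather than shelling out; directory names are illustrative, while the gdal options are the ones shown in the commands above):

```python
import glob
import os

def build_commands(tile_dir, output_ecw):
    """Build the gdal command lines for the normalise -> vrt -> ECW pipeline."""
    commands = []
    expanded_dir = os.path.join(tile_dir, "Expanded")
    # 1. 'Normalise' each palette-based tile to a 3-band RGB TIFF
    for tif in sorted(glob.glob(os.path.join(tile_dir, "*.tif"))):
        out = os.path.join(expanded_dir, os.path.basename(tif))
        commands.append(
            f"gdal_translate -co COMPRESS=DEFLATE -co TFW=YES -expand rgb {tif} {out}")
    # 2. Build a virtual raster over the normalised tiles
    vrt = os.path.join(expanded_dir, "tiles.vrt")
    commands.append(f"gdalbuildvrt {vrt} {os.path.join(expanded_dir, '*.tif')}")
    # 3. Convert the VRT to a single ECW. Note: no -expand here, because the
    #    palette expansion already happened in step 1.
    commands.append(
        f"gdal_translate -of ECW -co LARGE_OK=YES -a_srs EPSG:27700 {vrt} {output_ecw}")
    return commands
```

The real utility also verified each file and logged timings and errors, which I've left out here for brevity.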

The resulting tiles2ecw utility can be downloaded from here. You can either compile it into a console application or run it directly from the command line using the rather brilliant CS-Script.

Friday 28 August 2009

Adventures with SpatiaLite

A while ago I came across SpatiaLite. It's particularly interesting to me as we (the company I work for) need to deploy vast amounts of mapping data on machines where it is not always possible to connect to a dedicated mapping server. For us, this means installing PostGIS onto every client and restoring 20+GB of spatial data onto each one.

I wanted to use SpatiaLite instead of PostGIS. That way, deploying the mapping data onto the client would become as simple as copying across a flat file. It sounded like the perfect tool for the job. However, as with all things, nothing is ever easy. Here are a couple of the problems I faced:

  1. We deal mainly with MasterMap data and use a proprietary database loader to upload it to PostGIS. This loader does not support SpatiaLite, so I had to figure out how I was going to get the data into the SpatiaLite format.


  2. We use the SharpMap library to render our mapping data, which has extensions to support PostGIS. I would have to find or write a custom data provider if I wanted to use SpatiaLite instead.

Converting Ordnance Survey Mastermap data to SpatiaLite

First the easy (but less than satisfactory) way...

As we already had our data in PostGIS, I was able to use the latest version of FWTools to convert the existing database to SpatiaLite. This was surprisingly easy once I'd figured out all the arguments. It preserved our existing schema and was reasonably efficient. The command I used can be seen below:

ogr2ogr -f SQLite -dsco "SPATIALITE=yes" london.sqlite PG:"host=localhost user=postgres dbname=london password=postgres" mastermap

I had three main issues with this approach. Firstly, the latest version of FWTools (2.4.2) is not linked against libspatialite, meaning I was unable to create the spatial index from the command line and had to use the SpatiaLite tools to do this as another manual step. Secondly, and more importantly, I had no way to apply change-only updates (COUs) to the SpatiaLite database. I would have to apply them to PostGIS and then re-convert the whole lot. Thirdly, it's just plain painful: encoding/uploading the data to PostGIS can take a long time, and converting it to SpatiaLite as a secondary stage doubles the time involved. Wouldn't it be nice if you could go straight from the original MasterMap GML to the SpatiaLite database and handle COUs as well...
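For reference, the manual index-creation step boils down to one SQL call. CreateSpatialIndex() is SpatiaLite's built-in index-building function; the table and geometry column names in this hypothetical helper are assumptions about the schema, not something from the commands above:

```python
def spatial_index_command(db_file, table, geom_column):
    """Build the spatialite CLI invocation that creates the spatial index.

    CreateSpatialIndex() is a real SpatiaLite function; the table and
    geometry column names passed in are assumed, not taken from our schema.
    """
    sql = f"SELECT CreateSpatialIndex('{table}', '{geom_column}');"
    # Command line for the spatialite shell: spatialite <db> "<sql>"
    return ["spatialite", db_file, sql]
```

For example, `spatial_index_command("london.sqlite", "mastermap", "geom")` yields the command the SpatiaLite tools run in that extra manual step.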

A Better Approach...

I'd had my eye on an open source converter for a while. After several experiments I had managed to get it translating and uploading the MasterMap GML to a PostGIS database according to our schema (the tool uses a schema recommended by Ordnance Survey but, sadly, very different to the one our proprietary tool uses). Due to the foresight of the author, all this involved in the end was tweaking the XSD and XSLT files. (Actually, that's not quite true; I had to change the source a bit as well so that it created the PostGIS tables in a slightly different way...but that's by the by.)

After playing around with the 'Ogr converter' option, I finally bit the bullet and decided to have a crack at extending the tool to output directly to SpatiaLite. A few hours later (and after much reading up) I had a working converter. It could handle bulk loads, COUs and spatial index creation. It was fast and, thanks to its use of XSD/XSLT, very flexible. I must take my hat off to the author of this tool for making it so extensible. Great job.

Now, just how was I going to plug this into our rendering engine? Read on...

Rendering SpatiaLite data using SharpMap

Bill Dollins posted a SpatiaLite data provider on his blog a while back. Just the job. However, when I came to put it all together I was left twiddling my thumbs. It was taking an absolute age to render. The problem was that SpatiaLite does not automatically use the spatial index. Instead, you have to build a special query to make use of it...and the data provider I'd nabbed from Bill's site didn't make any use of it.

I ended up making a few changes to allow the data provider to be constructed with information about the spatial index table (yes, it's implemented as a table, not a true index) and then modified the spatial query to make use of it. The main change was to the GetBoxClause method, which can be seen below:

private string _SpatialIndex;

/// <summary>
/// Name of the spatial index table
/// </summary>
public string SpatialIndex
{
    get { return _SpatialIndex; }
    set { _SpatialIndex = value; }
}

private string GetBoxClause(SharpMap.Geometries.BoundingBox bbox)
{
    if (!string.IsNullOrEmpty(SpatialIndex))
    {
        StringBuilder sql = new StringBuilder("ROWID IN ( ");
        sql.Append("SELECT pkid FROM ");
        sql.Append(SpatialIndex);
        sql.Append(" WHERE ");
        sql.AppendFormat(SharpMap.Map.numberFormat_EnUS,
            "xmin < {0} AND xmax > {1} AND ymin < {2} AND ymax > {3} )",
            bbox.Max.X, bbox.Min.X, bbox.Max.Y, bbox.Min.Y);

        return sql.ToString();
    }

    string wkt = SharpMap.Converters.WellKnownText.GeometryToWKT.Write(LineFromBbox(bbox));
    return "MBRIntersects(GeomFromText('" + wkt + "')," + _GeometryColumn + ")=1";
}


After making the changes, everything performed blisteringly fast. Perhaps even slightly faster than it did with PostGIS. So there you have it: a zero-configuration GIS database for storing MasterMap that's as easy to deploy as copying a file. All thanks to SpatiaLite (and a bit of work here and there).

Thursday 27 August 2009

To blog or not to blog...

I've never really understood blogging. I mean, blogs are great for finding things out, but actually writing one...now why would you want to do that?

Well, I guess maybe it's time to give something back. Or maybe it's just time to start recording some of the nerdy little gems that make life as a software engineer just that bit sweeter.

So here it is. A blog. Written by me. Who would have thought?

My intention is to start writing a series of posts dealing with Ordnance Survey mapping data and open source GIS (undoubtedly paying much attention to SharpMap along the way). Anyone who has to work with Ordnance Survey data may well understand the headache that can ensue from the vast amount of data processing and storage involved. It's often difficult to know where to start if you're not buying into an enterprise GIS solution. So, if you don't want to splash out on proprietary software for dealing with all that GML...well, you might just have come to the right place.