Posts Tagged ‘QGIS’

A satisfied customer

Posted: November 9, 2015 in GIS, This and That

Recently I did a QGIS scripting job and here is the feedback from an extremely satisfied customer:

It was fantastic to work with Yury. He provided an excellent product, that went far beyond what I was expecting. I will certainly be contacting Yury in the future for any jobs that relate to GIS and python scripting. Communication was excellent and his ability to understand the job requirement was very impressive. A+++

Guys, if you are in need of a geoprocessing tool for your project – don’t hesitate to contact me 😉


My QGIS Processing Scripts at GitHub

Posted: October 23, 2015 in GIS

This is probably my shortest post ever.

All my QGIS Processing scripts (R and Python) and models that I have already blogged about, plus some extras, are now available on GitHub.

A Quick Map With QGIS and OSM

Posted: June 18, 2015 in GIS

What I love about QGIS is that one is able to create a nice map quickly. The other day I was asked to make a situation map for the project we are working on, to include in a presentation. All I had was a laptop with no relevant spatial data at all, but with QGIS installed (I didn’t even have a mouse to draw with). Still, it was more than enough: I loaded OSM as a base layer and used the annotation tool to add more sense to it. Voilà:

Posted: June 11, 2015 in GIS
This is strange, but I was unable to find instructions on importing QGIS layers into a PostGIS database with the PyQGIS API. The PyQGIS cookbook has an example of exporting layers as .shp files via the QgsVectorFileWriter.writeAsVectorFormat() function and says that the other OGR-supported formats can be used with this function as well. PostGIS is supported by OGR, so people get confused, try to use this function to import data into PostGIS without success, and end up writing generic import functions.


After a couple of hours of searching the internet for a solution I gave up and decided to find the answer the hard way: I started searching through the source code of the DB Manager plugin, which has this nice “import layer” feature. It took about 20 minutes to trace down the function in charge of the import: QgsVectorLayerImport.importLayer(). But the quest wasn’t over yet! The documentation says nothing about the provider names that are accepted by this function. “PostgreSQL” would be the obvious name for the provider, as it is the name of the PostgreSQL provider in OGR, but that is not the case. I had to go through the source code of DB Manager again, and luckily in the comments (and there are quite a few of them in DB Manager: I didn’t find a single docstring there) the author wrote that it is “postgres” for PostgreSQL.

Here is a very basic example of importing a QGIS layer into PostGIS:

from qgis.core import QgsVectorLayerImport
from qgis.gui import QgsMessageBar
from qgis.utils import iface

# connection parameters and target table definition
uri = "dbname='test' host=localhost port=5432 user='user' password='password' key=gid type=POINT table=\"public\".\"test\" (geom) sql="
crs = None  # keep the CRS of the source layer
# layer - the QGIS vector layer to import
error = QgsVectorLayerImport.importLayer(layer, uri, "postgres", crs, False, False)
if error[0] != 0:
    iface.messageBar().pushMessage(u'Error', error[1], QgsMessageBar.CRITICAL, 5)
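By the way, the connection string does not have to be assembled by hand: it can also be built with the QgsDataSourceURI helper. A minimal sketch with the same placeholder credentials (parameters like type=POINT still have to be appended manually):

from qgis.core import QgsDataSourceURI

uri = QgsDataSourceURI()
# host, port, database, user, password
uri.setConnection('localhost', '5432', 'test', 'user', 'password')
# schema, table, geometry column, sql filter, key column
uri.setDataSource('public', 'test', 'geom', '', 'gid')
connection_string = uri.uri()  # base of the string to pass to importLayer()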

One of the optional inputs for the creation of classification models in OrfeoToolbox is an XML image statistics file produced by the Compute Image Second Order Statistics tool. If you opt to calculate these statistics, be sure to check the created XML file. If you see an “#INF” record in the band statistics (e.g. “1.#INF”), replace “#INF” with something like “0e+32” (“1.#INF” -> “1.0e+32”), or (a better solution) calculate the statistics for the problematic band independently (probably with another tool) and replace the value completely.

If you leave an “#INF” record in the XML file, the classification models created from it will have “#IND” records, and when you try to run a classification based on these models, the process will terminate complaining about unrecognised “#IND” values, because the values must be numeric, not character.
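By the way, such a patch is trivial to script. A minimal Python sketch (the file names are placeholders):

# read the statistics file produced by OTB (path is a placeholder)
with open('image_statistics.xml') as f:
    content = f.read()

# "1.#INF" -> "1.0e+32": turn the infinite records into numeric values
content = content.replace('#INF', '0e+32')

# write the patched statistics to a new file
with open('image_statistics_fixed.xml', 'w') as f:
    f.write(content)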

I don’t know what causes the “#INF” records to appear in the first place – maybe improperly defined no-data values? In some cases it is impossible to provide the correct no-data value due to limitations of the input field (I use OTB from within the Processing module for QGIS).

UPD: you may also want to check out my post on Pansharpening Using R.
For a long time I have wanted to play with the OrfeoToolBox instruments (and its GUI named Monteverdi) – a set of powerful tools to process remote-sensing imagery. Finally I got the opportunity: I needed to perform pansharpening of a WorldView-2 scene, and I decided to do it using the OTB modules available via the QGIS Processing toolbox.
To make the OTB modules available in QGIS you need to install OTB on your system. The official site provides good instructions on how to do it. When you have OTB installed you need to enable it in QGIS: go to Processing -> Options and configuration -> Providers, activate OTB, and provide the folder information if needed:

A dialogue to enable OTB modules in QGIS

Pansharpening is done in two steps: 1) resampling of the low-resolution raster via the Superimpose sensor tool; 2) pansharpening using the Pansharpening tool. Alternatively you can do both at once with a single command (no QGIS or other GUI needed) in your OS’s console:
otbcli_BundleToPerfectSensor -inp pan_image -inxs xs_image -out output_image
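The same two steps can also be chained from the QGIS Python console. A sketch, assuming the OTB provider is enabled; the algorithm ids and parameter names here are assumptions that differ between OTB versions, so look them up first:

import processing

# find the exact algorithm ids available in your install
processing.alglist('superimpose')
processing.alglist('pansharpening')

# step 1: resample the multispectral raster onto the panchromatic grid
# (the id and parameter names are assumptions - check processing.alghelp())
step1 = processing.runalg('otb:superimposesensor',
                          {'-inr': 'pan.tif',   # reference (high resolution)
                           '-inm': 'xs.tif',    # image to reproject
                           '-out': None})       # None = temporary output

# step 2: pansharpen using the resampled multispectral image
step2 = processing.runalg('otb:pansharpeningrcs',
                          {'-inp': 'pan.tif',
                           '-inxs': step1['-out'],
                           '-out': 'pansharpened.tif'})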
Performing two steps instead of just one is super boring! Let’s create a model that will allow us to perform pansharpening in QGIS in one step (virtually). Go to Processing Toolbox -> Models -> Tools and start creating a new model:

Processing toolbox location of 'Create new model tool'

In the model creation window fill in the model name and group name fields. Then add two rasters as inputs and name them High_resolution_panchrom_raster and Low_resolution_raster. Now add the module named Superimpose sensor (Orfeo Toolbox -> Geometry -> Superimpose sensor) and configure it to resample the Low_resolution_raster: use High_resolution_panchrom_raster as the input in the Reference input field and Low_resolution_raster as the input in The image to reproject field.


Superimpose sensor dialogue window

Add the Pansharpening (rcs) module to the model. Use High_resolution_panchrom_raster as the input in the Input PAN Image field and the output of Superimpose sensor as the input in the Input XS Image field. Give a name (pansharpened_OTB) to the Output image to let the model know that this is the final stage of the processing.

Pansharpening (rcs) dialogue window

Here is how our model looks in the Model builder:

Overview of the Pan-sharpening model in model builder

And this is how it looks when you launch it:

Pan-sharpening model window

Also, you can skip the model creation process and just download the model that I created. You need to paste the files from the archive’s models folder into your ~/.qgis2/processing/models directory.
If you have issues running OTB pansharpening, they are likely covered in the following gis.stackexchange topics: OTB Pansharpening Error: Adapter for adaptPansharpening-bayes not found and What causes OTB pansharpening ERROR: Inputs do not occupy the same physical space?

This post is mainly my answer to the corresponding question.

The main rule for accurate measurements is to define the correct CRS for your layers and project. This is a rather broad topic, so I won’t go into it here 😉

Let’s just discuss what you should do when the project CRS can’t easily be used for measurement (like Web Mercator) and you want to use the Measure tool to calculate distance or area. Go to Settings -> Project Properties; in the CRS tab enable on-the-fly transformation; in the General tab, in the Measure tool section, choose the ellipsoid for distance calculations. See the picture:

When you start measuring with the Measure tool, move your cursor over the Measure window – information about the measurement settings will pop up, just like this:

Now you can make your measurements and be sure that they are accurate. NOTE that you should always check the Measure tool settings as shown in the picture above, because the Ellipsoid settings for the Measure tool are not always stored or displayed correctly in the Project settings (QGIS 2.6.1).
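The same ellipsoidal measurement can be reproduced from the Python console with QgsDistanceArea. A minimal sketch, assuming lon/lat coordinates on WGS84 (the points are arbitrary examples):

from qgis.core import QgsDistanceArea, QgsPoint

d = QgsDistanceArea()
d.setEllipsoid('WGS84')     # the ellipsoid used for the calculations
d.setEllipsoidalMode(True)  # measure on the ellipsoid, not on a plane
# distance in metres between two lon/lat points
distance = d.measureLine(QgsPoint(30.3, 59.9), QgsPoint(37.6, 55.8))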

Can you guess the projection I used to display the world map in the picture below (the EPSG code is left there because I doubt anyone would recall it anyway)?

It is the World Bonne projection in QGIS (the original shp’s CRS is EPSG:4326). I already had some fun with Bonne in this post. The world got torn apart at the 1:75 000 000 scale. At larger scales the map renders normally:


Fortunately this issue can be easily resolved by disabling “feature simplification by default for newly added layers” under Settings -> Options -> Rendering.
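The same setting can be switched off for a single layer from the Python console. A hedged sketch against the QGIS 2.x API, where layer stands for an already loaded vector layer:

from qgis.core import QgsVectorSimplifyMethod

# turn rendering-time geometry simplification off for this layer
method = QgsVectorSimplifyMethod()
method.setSimplifyHints(QgsVectorSimplifyMethod.NoSimplification)
layer.setSimplifyMethod(method)
layer.triggerRepaint()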

Schema of the Conservation Areas in Leningradskaya Region

About a year ago I was asked to create a small (B5 size) and simple schema of the conservation areas in Leningradskaya region. I did it using QGIS. Here is the author’s version of the schema and several notes that might be helpful for a beginner map-maker.

There was a huge disproportion between the areas of the different objects, and both polygon and point markers were needed to show them. I decided to use the Centroid Fill option in the polygon style so that I could use only one layer (polygon) instead of two (polygon and point). With Centroid Fill, the points drawn at the centres of the small polygons overlap and mask these tiny polygons.

All the administrative borders were stored in one layer (and there are far more borders there than one sees here). They are drawn using rule-based symbology, so I didn’t even need to subset this layer to get rid of the rest of its polygons.

All the names of the surrounding countries, regions, the city and the water bodies are not labels derived from layers, but labels created inside the map composer. It was quicker, and their positioning was more precise, which is crucial for such a small schema.

There was a lack of space for the legend, so I had to utilise every bit of canvas to place it. I had to use three legend items here. One of them actually overlaps another, and setting a transparent background for the legends helped with that.

Finally, the labels for the conservation areas (numbers) were outlined in white to be perfectly readable. Some of them were moved manually (storing the coordinates in dedicated columns of the layer) to prevent overlapping with other labels and data.

P.S. Don’t be afraid to argue with the client about the workflow. Initially I was asked to manually digitise a 20 000 x 15 000 pixel raster map of the Leningrad region to extract the most precise borders of the conservation areas (and districts of the region). Of course I could have done it and charged more money for the job, but what’s the point if some of those borders can’t even be seen on such a small schema? So I convinced the client to use data from OSM and saved myself a lot of time.

This is my third post in this series; here are the links to Part 1 and Part 2. Sorry for the delay, but at the time I got completely consumed by my PhD thesis and this story simply slipped my mind.
To the point. There were two types of shadows I had to deal with: cloud shadows and cast shadows (from buildings and trees). In the scientific literature you can find several relatively complicated algorithms to deal with them (including the use of a DEM for cast shadow detection), and I definitely should have used some of them, but… I was too lazy to implement them when a couple of quick’n’dirty workarounds were available.

Cloud shadow removal

To remove cloud shadows I decided to calculate, for each band, the ratio between the mean pixel values of the lit and shadowed parts of the raster and multiply the shadowed pixels by this ratio. I manually created a mask for the shadowed areas. There were three areas, and their borders were quite distinct in the NIR channels:
Cloud shadows in one of the NIR bands
I drew several polygons over the shadows in QGIS and converted them to a raster (Raster -> Conversion -> Rasterise). Now everything was ready for patching:
library(raster)
library(rgdal)

# load rasters
image <- stack('2013_IEEE_GRSS_DF_Contest_CASI.tif') # hyperspectral imagery
s_mask <- raster('shadow_mask.tif') # mask for cloud shadow (1 = shadow)

# extract the shadowed area
m_cloud <- s_mask
m_cloud[m_cloud < 1] <- NA
shadowed <- mask(image, m_cloud)

# calculate zonal statistics: per-band means for the lit and shadowed zones
zd <- zonal(image, s_mask, fun = 'mean', na.rm = TRUE)
zd <- as.data.frame(zd)

# calculate the ratio between the lit and the shadowed zones
zd[3,] <- zd[1,] / zd[2,]

# recalculate the data in the shadows
multiplier <- zd[3,]
multiplier <- multiplier[, 2:ncol(zd)] # drop the zone column
multiplier <- as.numeric(multiplier)
enlight <- shadowed * multiplier # patch layer
restored <- cover(enlight, image, progress = 'text') # fill the rest from the original

# save the result
wr <- writeRaster(restored,
                  filename = 'restored.tif',
                  format = 'GTiff', overwrite = TRUE)

Now it’s time to check the results:
The same NIR band after cloud shadow removal
Yes, the patched areas are easy to spot, but the overall result is good for such a simple approach.

Cast shadow removal

Shadows from buildings can be processed in the same way as cloud shadows, except that the mask creation approach is different. Cast shadows have to be detected automatically (there are a lot more of them than the three cloud shadows). Using one of the NIR bands, I created a training sample of points and passed it to a Random Forest algorithm to classify the imagery into shadowed and non-shadowed areas. I will not describe the classification process here, because Random Forest was used for the overall classification of the imagery, and that is the subject of the next post in this series.