News aggregator

GeoTools Team: GeoTools 18.0 Released

OSGeo Planet - Tue, 2017-10-17 08:58
The GeoTools team is pleased to announce the release of GeoTools 18.0. This release is also available from our Maven repository.

Thanks to everyone who took part in the code freeze, the monthly bug stomp, or in making the release itself. This release is made in conjunction with GeoServer 2.12.0.

This is the new stable release; users and downstream projects should consider moving from older releases to this one.
Highlights from our issue tracker release-notes:
  • GeoPackage store now supports spatial indexes.
  • A WMTS store has been added, allowing programs to process tiles in a similar way to the existing WMS store.
For more information see past release notes (18-RC1 | 18-beta).

Thanks to Astun Technology for allowing Ian Turton to make this release.
Categories: OSGeo Planet

GeoServer Team: GeoServer 2.12.0 Released

OSGeo Planet - Tue, 2017-10-17 08:55

We are happy to announce the release of GeoServer 2.12.0. Downloads are available (zip, war, dmg and exe) along with docs and extensions.

This is a stable release recommended for production use. This release is made in conjunction with GeoTools 18.0.

Rest API now using Spring MVC

In March, we upgraded the framework used by the GeoServer REST API from Restlet to Spring MVC. All the endpoints remain unchanged, and we would like to thank everyone who took part.

We should also thank David Vick who migrated the embedded GeoWebCache REST API, and the entire team who helped him reintegrate the results for this 2.12.0 release.

Thanks again to the code sprint sponsors and in-kind contributors:

Gaia3d, Atol, Boundless, How2map, FOSSGIS, IAG

As part of this upgrade, we also have new REST documentation, providing detailed information about each endpoint. The documentation is written in Swagger, allowing different presentations to be generated.


WMTS Cascading

Adds the ability to create WMS layers backed by remote WMTS layers, similar to the pre-existing WMS cascading functionality.

See GSIP-162 for more details.

Style Based Layer Groups

Adds the ability to define a listing of layers and styles using a single SLD file, in accordance with the original vision of the SLD specification. This includes a new entry type in the Layer Group layers list and a new preview mode for the style editor.

GeoServer has long supported this functionality for clients, via an external SLD file. This change allows more people to use the idea of a single file defining their map layers and styling as a configuration option.

See GSIP-161 for more details.

Options for KML Placemark placement

New options for KML encoding have been added, to control the placement of placemark icons, mostly for polygons. The syntax of the new options introduces three new top-level format options keys:


See GSIP-160 for more details.

GeoWebCache data security API

Adds an extension point to GeoWebCache allowing for a security check based on the layer and extent of the tile, along with an implementation of this extension point in GeoServer's GWC integration.

This change mostly affects developers, but will lead to improved security for users in the future.

See GSIP 159 for more details.

NetCDF output support for variable attributes and extra variables

Adds the following to the NetCDF output extension:

  1. An option to allow all attributes to be copied from the source NetCDF/GRIB variable to the target variable.
  2. Support for manual configuration of variable attributes, much like the current support for setting global attributes.
  3. Support for configuration of extra variables which are copied from the NetCDF/GRIB source to the output; initially only scalar variables will be supported. Extra variables can be expanded to “higher” dimensions, that is, values copied from one scalar per ImageMosaic granule are assembled into a multidimensional variable over, for example, time and elevation.

See GSIP 158 for more details.

New labelling features and QGIS compatibility

A number of small new features have been added to labelling to match some of QGIS's features, in particular:

  • Kerning is on by default
  • New vendor option to strikethrough text
  • New vendor options to control char and word spacing


  • Perpendicular offset now works also for curved labels (previously only supported for straight labels):
  • Labeling the border of polygons as opposed to their centroid when using a LinePlacement (here with repetition and offset):

Along with this work, some SLD 1.1 text symbolizer fixes were added in order to better support the new QGIS 3.0 label export. Here is an example of map labeling with a background image, as shown in QGIS, and then again in GeoServer using the same data and the exported SLD 1.1 style (click to enlarge):


CSS improvements

The CSS styling language and editing UI have seen various improvements. The editor now supports some primitive code completion:

At the language level:

  • Scale dependencies can now also be expressed using the “@sd” variable (scale denominator), and the values can use common suffixes such as k and M for readability; compare for example “[@scale < 1000000]” with “[@sd < 1M]”
  • Color functions have been introduced to match LessCSS functionality, like “Darken”, “Lighten”, “Saturate” and so on. The same functions have been made available in all other styling languages.
  • Referencing an “env” variable has been made easier, from “env(‘varName’)” to “@varName” (or “@varName(defaultValue)” if you want to provide a default value).
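The k/M suffix shorthand is easy to illustrate with a small parser sketch (a hypothetical helper written for this post, not GeoServer's actual implementation):

```python
def parse_scale(value):
    """Parse a scale denominator with an optional k/M suffix, e.g. "1M" -> 1000000."""
    multipliers = {"k": 1_000, "M": 1_000_000}
    value = value.strip()
    if value and value[-1] in multipliers:
        return int(float(value[:-1]) * multipliers[value[-1]])
    return int(float(value))

# "[@sd < 1M]" is equivalent to "[@scale < 1000000]"
print(parse_scale("1M"))   # 1000000
print(parse_scale("500k")) # 500000
```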

As you probably already know, internally CSS is translated to an equivalent SLD for map rendering purposes. This translation process became 50 times faster over large stylesheets (such as OSM roads, a particularly long and complicated style).

Image mosaic improvements and protocol control

Image mosaic saw several improvements in 2.12.

First, the support for mosaicking images in different coordinate reference systems improved greatly, with several tweaks and correctness fixes. As a noteworthy change, the code can now handle source data crossing the dateline. The following images show the footprints of images before and after the dateline (expressed in two different UTM zones, 60 and 1 respectively) and the result of mosaicking them as rasters (click to get a larger picture of each):

There is more good news for those who handle mosaics with many overlapping images taken at different times. If you added interesting information to the mosaic index, such as cloud cover, off-nadir angle, snow cover and the like, you can now filter and sort granules, in both WMS (viewing) and WCS (downloading), by adding the cql_filter and sortBy KVP parameters.
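As an illustration, such a request could be assembled as below. The layer name and index attributes are invented for the example; only the cql_filter and sortBy parameter names come from the text above, and the exact sortBy value syntax is an assumption:

```python
from urllib.parse import urlencode

# Hypothetical mosaic layer with hypothetical index attributes
# "cloudCover" and "ingestion"; cql_filter/sortBy are the KVP
# parameters described above.
params = {
    "service": "WMS",
    "version": "1.1.1",
    "request": "GetMap",
    "layers": "mosaic",                # assumed layer name
    "bbox": "-180,-90,180,90",
    "width": "800",
    "height": "400",
    "srs": "EPSG:4326",
    "format": "image/png",
    "cql_filter": "cloudCover < 10",   # keep only mostly cloud-free granules
    "sortBy": "ingestion D",           # most recent granule on top (syntax assumed)
}
url = "http://localhost:8080/geoserver/wms?" + urlencode(params)
```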

Here is an example of the same mosaic, the first composite favouring smallest cloud cover, the second one favouring recency instead (click to enlarge):


GeoPackage graduation

The GeoPackage store jumped straight from community module to core, in light of its increasing importance.

The WMS/WFS/WPS output formats are still community modules. Currently, GeoPackage vector output does not support spatial indexes, but stay tuned, it's cooking!

New community modules

The 2.12 series comes with a few new community modules, in particular:

  • Looking into styling vector tiles and server-side maps using a single language? Look no further than the MBStyle module
  • For those into Earth Observation, there is a new OpenSearch for EO module in the community section
  • Need to store full GeoTiff in Amazon S3? The “S3 support for GeoTiff” module might just be what you’re looking for
  • A new “status-monitoring” community module has been added, providing basic statistics on system resource usage. Check out this pull request to follow its progress and merge.

Mind you, community modules are not part of the release; you can find them in the nightly builds instead.

Other assorted improvements

Highlights of this release are featured below; for more information please see the release notes (2.12.0 | 2.12-RC1 | 2.12-beta):

  • Users REST uses default role service name as a user/group service name
  • imageio-ext-gdal-bindings-xxx.jar no longer available since 2.10
  • REST GET resource metadata – file extension can override format parameter
  • GeoServer macOS picks up system extensions
  • SLD files not deleted when SLD is deleted in web admin
  • Reproject geometries in WMS GetFeatureInfo responses when info_format is GML
  • Include Marlin by default in bin/win/osx downloads, add to war instructions
  • Handle placemark placement when the centroid of the geometry is not contained within it
  • Enable usage of viewParams in WPS embedded WFS requests
  • Add GeoJson encoder for complex features
  • Allow image mosaic to refer a GeoServer configured store
  • Duplicate GeoPackage formats in layer preview page
  • ExternalGraphicFactory does not have a general way to reset caches
  • Generating a raster SLD style from the template produced a functionally invalid style; now fixed
  • Style Editor Can Create Incorrect External Legend URLs
  • Namespace filtering on capabilities returns all layer groups (including the ones in other workspaces)


About GeoServer 2.12 Series

Additional information on the 2.12 series:

Categories: OSGeo Planet

gvSIG Team: GIS applied to Municipal Management: Module 5.2 'Web services (Loading web services from gvSIG Desktop)'

OSGeo Planet - Mon, 2017-10-16 08:30

The second video of the fifth module is now available, in which we will see how to load web services from gvSIG Desktop. In the first video of this module we gave an introduction to Spatial Data Infrastructures (SDI), which will help us better understand this new video.

Many administrations make a large amount of cartography available to users, often as web services accessible from desktop applications or web viewers, which allow us to access that cartography without downloading anything to our disk.

The cartography used in this module can be downloaded from the following link.

The second video tutorial of this fifth module is the following:

Related posts:

Filed under: gvSIG Desktop, IDE, spanish, training Tagged: IDE, Infraestructuras de Datos Espaciales, Servicios web, WFS, WMS
Categories: OSGeo Planet

Cameron Shorter: The Yin & Yang of OSGeo Leadership

OSGeo Planet - Sun, 2017-10-15 22:42

The 2017 OSGeo Board elections are about to start. Some of us who have been involved with OSGeo over the years have collated thoughts about the effectiveness of different strategies. Hopefully these thoughts will be useful for future boards, and for charter members who are about to select board members.

The Yin and Yang of OSGeo

As with life, there are a number of Yin vs Yang questions we are continually trying to balance: acting as a high or low capital organisation; organising top down vs bottom up; populating a board with old wisdom or fresh blood; personal vs altruistic motivation; protecting privacy vs public transparency. Let's discuss some of them here.

Time vs Money

OSGeo is an Open Source organisation whose primary currency is volunteer time. We mostly self-manage our time via principles of Do-ocracy and Meritocracy. This is bottom up.

However, OSGeo also manages some money. Our board divvies up a budget which is allocated down to committees and projects. This is top-down, command-and-control management. The cross-over between volunteer and market economics is a constant point of tension. (For more on the cross-over of economies, see Paul Ramsey's FOSS4G 2017 Keynote.)

High or low capital organisation?

Our 2013 OSGeo Board tackled this question: should OSGeo act as a high capital or low capital organisation? That is, should OSGeo dedicate energy to collecting sponsorship and then passing out these funds to worthy OSGeo causes?

While it initially seems attractive to have OSGeo woo sponsors, because we would all love to have more money to throw at worthy OSGeo goals, the reality is that chasing money is hard work. And someone who can chase OSGeo sponsorship is likely conflicted with chasing sponsorship for their own workplace. So in practice, to be effective in chasing sponsorship, OSGeo would probably need to hire someone specifically for the role.
OSGeo would then need to raise at least enough to cover wages, and then quite a bit more if the sponsorship path is to create extra value.

This high capital path is how the Apache foundation is set up, and how LocationTech propose to organise themselves. It is the path that OSGeo started following when founded under the umbrella of Autodesk.

However, as OSGeo has grown, it has slowly evolved toward a low capital, volunteer-focused organisation. Our overheads are very low, which means we waste very little of our volunteer labour and capital on the time-consuming task of chasing and managing money. Consequently, any money we do receive (from conference windfalls or sponsorship) goes a long way, as it doesn't get eaten up by high overheads.

Size and Titles

Within small communities, influence is based around meritocracy and do-ocracy. Good ideas bubble to the top, and those who do the work decide what work gets done. Leaders who try to pull rank in order to gain influence quickly lose volunteers. Within these small communities, a person's title holds little tradable value.

However, our OSGeo community has grown very large, upward of tens of thousands of people. At this size, we often can't use our personal relationships to assess reputation and trust. Instead we need to rely on other cues, such as titles and allocated positions of power.

Consider also that OSGeo projects have become widely adopted. As such, knowledge and influence within an OSGeo community has become a valuable commodity. It helps land a job, secure a speaking slot at a conference, or get an academic paper published.

This introduces a commercial dynamic into our volunteer power structures:
  • A title is sometimes awarded to a dedicated volunteer, hoping that it can be traded for value within the commercial economy. (In practice, deriving value from a title is much harder than it sounds).
  • There are both altruistic and personal reasons for someone to obtain a title. A title can be used to improve the effectiveness of the volunteer, or to improve the volunteer's financial opportunities.
  • This can prompt questions of a volunteer’s motivations.
In response to this, over the years we have seen a gradual change to the positioning of roles within the OSGeo community.

Top-down vs bottom-up

OSGeo board candidates have been asked for their “vision”, and “what they would like to change or introduce”. These are valid questions if OSGeo were run as a command-and-control, top-down hierarchy, where board decisions were delegated to OSGeo committees to implement. But OSGeo is bottom-up. Boards which attempt to centralise control and delegate tasks cause resentment and disengagement amongst volunteers. Likewise, communities who try to delegate tasks to their leaders merely burn out their leaders. Both are ignoring the principles of Do-ocracy and Meritocracy. So, ironically, boards which do less are often helping more.

Darwinian evolution means that only awesome ideas and inspiring leaders attract volunteer attention, and that is a good thing.

Recognising ineffective control attempts

How do you recognise ineffective command-and-control techniques within a volunteer community? Look for statements such as:
  • “The XXX committee needs to do YYY…”
  • “Why isn’t anyone helping us do …?”
  • “The XXX community hasn’t completed YYY requirements - we need to tell them to implement ZZZ”
If all the ideas from an organisation come from management, then management isn't listening to their team.

Power to the people

In most cases the board should keep out of the way of OSGeo communities. Only in exceptional circumstances should a board override volunteer initiatives.

Decisions and power within OSGeo should be moved back into OSGeo committees, chapters and projects. This empowers our community, and motivates volunteers wishing to scratch an itch.

We do want our board members to be enlightened, motivated and engaged within OSGeo. This active engagement should happen within OSGeo communities: partaking, facilitating or mentoring as required. A recent example of this was Jody Garnett's active involvement with OSGeo rebranding, where he worked with others within the OSGeo marketing committee.

Democratising key decisions

We have a charter membership of nearly 400, tasked with ‘protecting’ the principles of the foundation and voting for new charter members and the board. Beyond this, charter members have had little way of engaging with the board to influence the direction of OSGeo.

How can we balance the signal-to-noise ratio such that we achieve effective membership engagement with the board without overwhelming ourselves with chatter? Currently we have no formal or prescribed processes for such consultation.

Reimbursement

OSGeo Board members are not paid for their services. However, they are regularly invited to partake in activities such as presenting at conferences or participating in meetings with other organisations. These are typically beneficial to both OSGeo and the leader's reputation or personal interests. To avoid OSGeo Board membership being seen as a “Honey Pot”, and for the Board to maintain trust and integrity, OSGeo board members should refuse payment from OSGeo for partaking in such activities.
(There is nothing wrong with accepting payment from another organisation, such as the conference organisers.) In response to the question of conferences, OSGeo has previously created OSGeo Advocates: an extensive list of local volunteers from around the world willing to talk about OSGeo.

Old vs new

Should we populate our board with old wisdom or encourage fresh blood and new ideas? We ideally want a bit of both: bringing wisdom from the past, but also spreading the opportunity of leadership across our membership. We should avoid leadership becoming an exclusive “boys club” without active community involvement, and possibly should consider maximum terms for board members.

If our leadership follow a “hands off oversight role”, then past leaders can still play influential roles within OSGeo's subcommittees.

Vision for OSGeo 2.0

Prior OSGeo thought leaders have suggested it's time to grow from OSGeo 1.0 to OSGeo 2.0; time to update our vision and mission. A few of those ideas have fed into OSGeo's website revamp currently underway. This has been a good start, but there is still room to acknowledge that much has changed since OSGeo was born a decade ago, and there are plenty of opportunities to positively redefine ourselves. A test of OSGeo's effectiveness is to see how well community ideas are embraced and taken through to implementation. This is a challenge that I hope will attract new energy and new ideas from a new OSGeo generation. Here are a few well considered ideas that have been presented to date that we can start from:

Recommendations

So where does this leave us?
  • Let’s recognise that OSGeo is an Open Source community, and we organise ourselves best with bottom-up Meritocracy and Do-ocracy.
  • Wherever possible, decisions should be made at the committee, chapter or project level, with the board merely providing hands-off oversight. This empowers and enables our sub-communities.
  • Let’s identify strategic topics where the OSGeo board would benefit from consultation with charter membership and work out how this could be accomplished efficiently and effectively.
  • Let’s embrace and encourage new blood into our leadership ranks, while retaining access to our wise old white beards.  
  • The one top-down task for the board is based around allocation of OSGeo’s (minimal) budget.
Categories: OSGeo Planet

Free and Open Source GIS Ramblings: Movement data in GIS #9: trajectory data models

OSGeo Planet - Sun, 2017-10-15 16:23

There are multiple ways to model trajectory data. This post takes a closer look at the OGC® Moving Features Encoding Extension: Simple Comma Separated Values (CSV). This standard was published in 2015, but I haven't been able to find any reviews of it (in a GIS context or anywhere else).

The following analysis is based on the official OGC trajectory example. The header consists of two lines: the first line provides some meta information, while the second defines the CSV columns. The data model is segment based: each line describes a trajectory segment with at least two coordinate pairs (or triplets for 3D trajectories). For each segment, there is a start and an end time, which can be specified as absolute or relative (offset) values:

@stboundedby,urn:x-ogc:def:crs:EPSG:6.6:4326,2D,50.23 9.23,50.31 9.27,2012-01-17T12:33:41Z,2012-01-17T12:37:00Z,sec
@columns,mfidref,trajectory,state,xsd:token,"type code",xsd:integer
a, 10,150,11.0 2.0 12.0 3.0,walking,1
b, 10,190,10.0 2.0 11.0 3.0,walking,2
a,150,190,12.0 3.0 10.0 3.0,walking,2
c, 10,190,12.0 1.0 10.0 2.0 11.0 3.0,vehicle,1

Let’s look at the first data row in detail:

  • a … trajectory id
  • 10 … start time offset from 2012-01-17T12:33:41Z in seconds
  • 150 … end time offset from 2012-01-17T12:33:41Z in seconds
  • 11.0 2.0 12.0 3.0 … trajectory coordinates: x1, y1, x2, y2
  • walking …  state
  • 1… type code
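To make the row structure concrete, here is a minimal Python sketch that parses one data row, with T0 taken from the @stboundedby header above (the dict layout is my own, not part of the standard):

```python
from datetime import datetime, timedelta, timezone

# Base time from the @stboundedby header (offsets are in seconds)
T0 = datetime(2012, 1, 17, 12, 33, 41, tzinfo=timezone.utc)

def parse_segment(line):
    """Parse one OGC Moving Features CSV data row into a dict (sketch)."""
    mfid, start, end, coords, state, type_code = [f.strip() for f in line.split(",")]
    xy = [float(v) for v in coords.split()]
    return {
        "id": mfid,
        "start": T0 + timedelta(seconds=float(start)),
        "end": T0 + timedelta(seconds=float(end)),
        "points": list(zip(xy[::2], xy[1::2])),  # (x, y) pairs
        "state": state,
        "type": int(type_code),
    }

seg = parse_segment("a, 10,150,11.0 2.0 12.0 3.0,walking,1")
# seg["points"] == [(11.0, 2.0), (12.0, 3.0)]
# seg["start"]  == 2012-01-17T12:33:51Z
```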

My main issues with this approach are:

  1. They missed the chance to use WKT notation to make the CSV easily readable by existing GIS tools.
  2. As far as I can see, the data model requires a regular sampling interval because there is no way to store time stamps for intermediate positions along trajectory segments. (Irregular intervals can be stored using segments for each pair of consecutive locations.)

In the common GIS simple feature data model (which is point-based), the same data would look something like this:


The main issue here is that there has to be some application logic that knows how to translate from points to trajectories. For example, trajectory a changes from (walking, 1) to (walking, 2) at 2012-01-17T12:36:11Z, but we have to decide whether to store the previous or the following state code for this individual point.

An alternative to the common simple feature model is the PostGIS trajectory data model (which is LineStringM-based). For this data model, we need to convert time stamps to numeric values, e.g. 2012-01-17T12:33:41Z is 1326803621 in Unix time. In this data model, the data looks like this:

a,LINESTRINGM(11.0 2.0 1326803631, 12.0 3.0 1326803771),walking,1
a,LINESTRINGM(12.0 3.0 1326803771, 10.0 3.0 1326803811),walking,2
b,LINESTRINGM(10.0 2.0 1326803631, 11.0 3.0 1326803811),walking,2
c,LINESTRINGM(12.0 1.0 1326803631, 10.0 2.0 1326803771, 11.0 3.0 1326803811),vehicle,1

This is very similar to the OGC data model, with the notable difference that every position is time-stamped (instead of just having segment start and end times). If one has movement data which is recorded at regular intervals, the OGC data model can be a bit more compact, but if the trajectories are sampled at irregular intervals, each point pair will have to be modeled as a separate segment.

Since the PostGIS data model is flexible, explicit, and comes with existing GIS tool support, it’s my clear favorite.
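The conversion between the two models can be sketched in a few lines of Python. Note the built-in assumption that vertices are spaced regularly in time between segment start and end, which is exactly the limitation of the OGC model discussed above:

```python
BASE = 1326803621  # Unix time for 2012-01-17T12:33:41Z

def segment_to_linestring_m(line):
    """Turn one OGC Moving Features CSV segment into a LineStringM WKT row (sketch).
    Intermediate vertex times are linearly interpolated, i.e. we assume a
    regular sampling interval, since the OGC model only stores segment
    start and end times."""
    mfid, start, end, coords, state, type_code = [f.strip() for f in line.split(",")]
    xy = [float(v) for v in coords.split()]
    points = list(zip(xy[::2], xy[1::2]))
    t0, t1 = BASE + float(start), BASE + float(end)
    n = len(points) - 1  # segments always have at least two vertices
    stamped = [
        "%.1f %.1f %d" % (x, y, round(t0 + (t1 - t0) * i / n))
        for i, (x, y) in enumerate(points)
    ]
    return "%s,LINESTRINGM(%s),%s,%s" % (mfid, ", ".join(stamped), state, type_code)

print(segment_to_linestring_m("a, 10,150,11.0 2.0 12.0 3.0,walking,1"))
# a,LINESTRINGM(11.0 2.0 1326803631, 12.0 3.0 1326803771),walking,1
```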

Read more:

Categories: OSGeo Planet

BostonGIS: Using pg_upgrade to upgrade PostGIS without installing an older version of PostGIS

OSGeo Planet - Sun, 2017-10-15 05:11

PostGIS releases a new minor version every one or two years. Each minor version of PostGIS has a different lib name suffix. In PostGIS 2.1 you'll find files in your PostgreSQL lib folder called postgis-2.1.*, rtpostgis-2.1.*, postgis-topology-2.1.*, address-standardizer-2.1.* etc., and in PostGIS 2.2 you'll find similar files but with 2.2 in the name. I believe PostGIS and pgRouting are the only extensions that stamp the lib with a version number. Most other extensions are simply named after themselves, e.g. hstore is always called hstore.dll, even if the PostgreSQL version changed from 9.6 to 10. On the bright side, this allows people to have two versions of PostGIS installed in a PostgreSQL cluster, though a database can use at most one version. So you can have an experimental database running a very new or unreleased version of PostGIS and a production database running a more battle-tested version.
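Given this naming convention, a quick way to see which PostGIS minor versions a cluster has installed is to scan the lib folder for the versioned filenames. A small diagnostic sketch (my own helper, not part of PostGIS or pg_upgrade):

```python
import re

# Libraries that carry a "-<major>.<minor>." version stamp, per the naming above
VERSIONED = re.compile(
    r"^(postgis|rtpostgis|postgis-topology|address-standardizer)-(\d+\.\d+)\."
)

def postgis_versions(lib_dir_listing):
    """Return the sorted set of PostGIS minor versions found among library
    filenames (pass in e.g. os.listdir of the PostgreSQL lib directory)."""
    return sorted({m.group(2) for name in lib_dir_listing
                   if (m := VERSIONED.match(name))})

print(postgis_versions(["postgis-2.1.so", "rtpostgis-2.1.so",
                        "postgis-2.4.so", "hstore.so"]))
# ['2.1', '2.4']
```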

On the sad side, this causes many PostGIS users frustration when trying to use pg_upgrade to move from an older version of PostGIS/PostgreSQL to a newer one, as pg_upgrade often bails with a message in the loaded_libraries.txt log file to the effect of:

could not load library "$libdir/postgis-2.2": ERROR: could not access file "$libdir/postgis-2.2": No such file or directory
could not load library "$libdir/postgis-2.3": ERROR: could not access file "$libdir/postgis-2.3": No such file or directory

This is also a hassle because we generally don't support newer versions of PostgreSQL on older PostGIS installs, as PostgreSQL major version changes tend to break our code, and backporting those changes is both time-consuming and dangerous. For example, the DatumGetJsonb change and this PostgreSQL 11 crasher whose cause we haven't isolated yet. Several changes like this have already made the recently released PostGIS 2.4.0 incompatible with PostgreSQL 11 head development.

Continue reading "Using pg_upgrade to upgrade PostGIS without installing an older version of PostGIS"
Categories: OSGeo Planet

Free and Open Source GIS Ramblings: Movement data in GIS extra: trajectory generalization code and sample data

OSGeo Planet - Fri, 2017-10-13 18:41

Today’s post is a follow-up to Movement data in GIS #3: visualizing massive trajectory datasets. In that post, I summarized a concept for trajectory generalization. Now, I have published the scripts and sample data in my QGIS-Processing-tools repository on GitHub.

To add the trajectory generalization scripts to your Processing toolbox, you can use the Add scripts from files tool:

It is worth noting that Add scripts from files fails to correctly import potential help files for the scripts, but that's not an issue this time around, since I haven't gotten around to actually writing help files yet.

The scripts are used in the following order:

  1. Extract characteristic trajectory points
  2. Group points in space
  3. Compute flows between cells from trajectories
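Conceptually, step 1 boils down to keeping only the "interesting" vertices of each trajectory. A much simplified sketch of that idea (the published scripts also detect stops and use distance and duration parameters, which are omitted here):

```python
import math

def characteristic_points(traj, min_angle=45.0):
    """Keep the start, end, and significant turns of a trajectory.
    Simplified sketch of step 1; traj is a list of (x, y) vertices."""
    def heading(p, q):
        return math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))

    keep = [traj[0]]  # start location
    for prev, cur, nxt in zip(traj, traj[1:], traj[2:]):
        turn = abs(heading(prev, cur) - heading(cur, nxt)) % 360
        if min(turn, 360 - turn) >= min_angle:
            keep.append(cur)  # significant direction change
    keep.append(traj[-1])  # end location
    return keep

print(characteristic_points([(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]))
# [(0, 0), (2, 0), (2, 2)] : the 90 degree turn at (2, 0) is kept
```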

The sample project contains input data, as well as output layers of the individual tools. The only required input is a layer of trajectories, where trajectories have to be LINESTRINGM (note the M!) features:

Trajectory sample based on data provided by the GeoLife project

In Extract characteristic trajectory points, distance parameters are specified in meters, stop duration in seconds, and angles in degrees. The characteristic points contain start and end locations, as well as turns and stop locations:

The characteristic points are then clustered. In this tool, the distance has to be specified in layer units, which are degrees in case of the sample data.

Finally, we can compute flows between cells defined by these clusters:

Flow lines scaled by flow strength and cell centers scaled by counts

If you use these tools on your own data, I’d be happy to see what you come up with!

Read more:

Categories: OSGeo Planet

Equipo Geotux: Publishing a map tile service with QMetaTiles and GitHub

OSGeo Planet - Fri, 2017-10-13 15:07

To publish a WMTS service using nothing but static storage on a web server, we need a set of tools: first, to generate the image tiles, and second, to generate the XML file. For the first task, we will use the following QGIS plugins.



Note: This document is part of the workshop guide material for the Geographic Web Services course of the Master's in Geomatics at the Universidad Nacional de Colombia, Bogotá; more information at the following link.

1. IDENTIFYING TILE SCALES WITH THE QGIS TILELAYER PLUGIN

Once the QGIS plugins are installed, load the Laguna de Tota project. Remember that the project's default CRS must be EPSG:3857 (Web Mercator), since the tools we will use are only compatible with this coordinate reference system. Once the project is displayed, load the tile matrix scheme from the menu Web → TileLayer Plugin → Add TileLayer…; in this case, for the WMTS service, it is the XYZFrame scheme.

TileLayer Plugin

The scheme is represented by a new layer named XYZFrame, which lets us identify the minimum and maximum zoom range, as well as the number and index of the tiles. In this case, the minimum zoom at which our study area still fits in a single tile is 13, and the origin index in the tile matrix is 2434,3967.



Once the zoom levels (matrix scales) have been identified, generate the tiles with the QMetaTiles plugin, available from the menu Plugins → QMetaTiles → QMetaTiles.

The parameters requested by this tool are shown in the following image.


  • Output: the directory in which the tiles will be created. If you are using the GeoTux Server, it is recommended to use the path corresponding to the /gisdata/tiles mount point.
  • Tileset name: the name of the project or tile matrix set, in this case “z11to17”.
  • Extent: the geographic extent for tile generation; in this case, use a layer to restrict the extent.
  • Zoom: the zoom levels for which to generate the tile matrix set; in this case we previously identified a set of tile matrices from 11 to 17.
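The tile indices and zoom levels reported by the TileLayer plugin follow the standard XYZ/Web Mercator tiling scheme, which can also be computed directly. A small sketch (the Laguna de Tota coordinates below are approximate):

```python
import math

def tile_index(lon, lat, zoom):
    """XYZ tile indices containing a WGS84 point at a given zoom level
    (standard Web Mercator tiling formula)."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y

# A point near Laguna de Tota (coordinates approximate), at the minimum zoom 13:
tile_index(-72.9, 5.55, 13)
```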

Categories: OSGeo Planet

Even Rouault: Optimizing JPEG2000 decoding

OSGeo Planet - Thu, 2017-10-12 16:41
Over this summer I have spent 40 days (*) in the guts of the OpenJPEG open-source library (BSD 2-clause licensed), optimizing decoding speed and memory consumption. The result of this work is now available in the OpenJPEG 2.3.0 release.
For those who are not familiar with JPEG-2000 (and they have a lot of excuse, given its complexity), this is a standard for image compression that supports lossless and lossy methods. It uses the discrete wavelet transform for multi-resolution analysis, and a context-driven binary arithmetic coder for encoding bit-plane coefficients. When you go into the depths of the format, what is striking is the number of independent variables that can be tuned:
- use of tiling or not, and tile dimensions
- number of resolutions
- number of quality layers
- code-block dimensions
- 6 independent options regarding how code-blocks are encoded (code-block styles): use of selective arithmetic coding bypass, reset of context probabilities on coding pass boundaries, termination on each coding pass, vertically stripe-causal contexts, predictable termination, and segmentation symbols. Some can bring decoding speed advantages (notably selective arithmetic coding bypass), at the price of less compression efficiency. Others might help hardware-based implementations. Others can help detect corruption in the codestream (predictable termination)
- spatial partition of code-blocks into so-called precincts, whose dimensions may vary per resolution
- progression order, i.e. the criterion that decides how packets are ordered, which is a permutation of the 4 variables Precinct, Component, Resolution and Layer. The standard allows for 5 different permutations. To add extra fun, the progression order may be configured to change several times among the 5 possible (something I haven't yet had the opportunity to really understand)
- division of packets into tile-parts
- use of multi-component transform or not
- choice of lossless or lossy wavelet transforms
- use of start of packet / end of packet markers
- use of Region Of Interest, to have higher quality in some areas
- choice of image origin and tiling origins with respect to a reference grid (the image and tile origin are not necessarily pixel (0,0))
And if that was not enough, some or most of those parameters may vary per-tile! If you already found that TIFF/GeoTIFF had too many parameters to tune (tiling or not, pixel or band interleaving, compression method), JPEG-2000 is probably one or two orders of magnitude more complex. JPEG-2000 is truly a technological and mathematical jewel. But needless to say, having a compliant JPEG-2000 encoder/decoder, which OpenJPEG is (it is an official reference implementation of the standard), is already something complex. Having it perform optimally is yet another target.
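To make the multi-resolution aspect concrete: each additional resolution level halves the image grid (rounding up), following the dyadic wavelet decomposition. A minimal sketch of that relationship (resolution_dims is a hypothetical helper for illustration, not part of the OpenJPEG API):

```python
def resolution_dims(width, height, num_resolutions):
    """Dimensions of each resolution level, full size first; each
    coarser level halves the grid, rounding up (dyadic decomposition)."""
    dims = []
    w, h = width, height
    for _ in range(num_resolutions):
        dims.append((w, h))
        w = (w + 1) // 2
        h = (h + 1) // 2
    return dims

# A 13498x9944 image with 6 resolutions:
print(resolution_dims(13498, 9944, 6)[:3])
# [(13498, 9944), (6749, 4972), (3375, 2486)]
```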
Prior to this latest optimization round, I had already worked on enabling multi-threaded decoding at the code-block level, since code-blocks can be decoded independently (once you have re-assembled from the codestream the bytes that encode a code-block), and in the inverse wavelet transform as well (during the horizontal pass rows can be transformed independently, and during the vertical pass columns). But single-threaded use had yet to be improved. Roughly 80 to 90% of the time during JPEG-2000 decoding is spent in the context-driven binary arithmetic decoder, around 10% in the inverse wavelet transform, and the rest in other operations such as the multi-component transform. I managed to get around 10% improvement in the global decompression time by porting to the decoder an optimization that had been proposed by Carl Hetherington for the encoding side, in the code that determines which bit of a wavelet-transformed coefficient must be encoded during which coding pass. The trick here was to reduce the memory needed for the context flags, so as to decrease the pressure on the CPU cache. Other optimizations in that area have consisted in making sure that some critical variables are kept preferably in CPU registers rather than in memory. I've spent a good deal of time looking at the disassembly of the compiled code.
I've also optimized the reversible (lossless) inverse transform to use the Intel SSE2 (or AVX2) instruction sets to process several rows at once, which can result in up to a 3x speed-up for that stage (so a global 3% improvement).
I've also worked on reducing the memory consumption needed to decode images, by removing the use of intermediate buffers where possible. The result is that the amount of memory needed for full-image decoding was reduced by a factor of 2.4.
Another major work direction was to optimize speed and memory consumption for sub-window decoding. Up to now, the minimal unit of decompression was a tile. That is OK for tiles of reasonable dimensions (say 1024x1024 pixels), but definitely not for images that don't use tiling and hardly fit into memory. In particular, OpenJPEG couldn't open images of more than 4 billion pixels. The work consisted of 3 steps:
- identifying which precincts and code-blocks are needed for the reconstruction of a spatial region
- optimizing the inverse wavelet transform to operate only on the rows and columns needed
- reducing buffer allocation to the amount strictly needed for the sub-window of interest
The overall result is that decoding time and memory consumption are now roughly proportional to the size of the sub-window to decode, whereas they were previously constant. For example, decoding 256x256 pixels in a 13498x9944x3-band image now takes only 190 ms, versus about 40 seconds before.
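The first step, identifying which code-blocks a spatial region needs, boils down to intersecting the requested pixel window with the code-block grid. A simplified sketch of that arithmetic (illustrative only, not OpenJPEG's actual code; it ignores precincts and resolution levels):

```python
def blocks_for_window(x0, y0, x1, y1, block_w=64, block_h=64):
    """Return the (column, row) indices of all code-blocks that intersect
    the pixel window [x0, x1) x [y0, y1), on a regular block grid."""
    cols = range(x0 // block_w, (x1 - 1) // block_w + 1)
    rows = range(y0 // block_h, (y1 - 1) // block_h + 1)
    return [(c, r) for r in rows for c in cols]

# A 256x256 window at offset (128, 128) touches a 4x4 patch of
# 64x64 code-blocks: 16 blocks instead of the whole image.
needed = blocks_for_window(128, 128, 384, 384)
print(len(needed))  # 16
```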
As a side activity, I've also fixed 2 different annoying bugs that could cause lossless encoding to not be lossless for some combinations of tile sizes and number of resolutions, or when some code-block style options were used.
I've just updated the GDAL OpenJPEG driver (in GDAL trunk) to be more efficient when dealing with untiled JPEG-2000 images.
There are many more things that could be done in the OpenJPEG library:
- port a number of optimizations to the encoding side: multi-threading, discrete wavelet transform optimizations, etc.
- on the decoding side, reduce memory consumption further, particularly in the untiled case. Currently we need to ingest into memory the whole codestream of a tile (so the whole compressed file, for an untiled image)
- linked to the above, make use of TLM and PLT marker segments (kinds of indexes to tiles and packets)
- on the decoding side, investigate further improvements to the code specific to irreversible / lossy compression
- make the opj_decompress utility use the API better and consume less memory. Currently it decodes a full image into memory instead of proceeding by chunks (you won't have this issue when using gdal_translate)
- investigate how GPGPU capabilities (CUDA or OpenCL) could help reduce the time spent in the context-driven binary arithmetic decoder.
Contact me if you are interested in some of those items (or others!)

(*) funding provided by academic institutions and archival organizations, namely
… And logistic support from the International Image Interoperability Framework (IIIF), the Council on Library and Information Resources (CLIR), intoPIX, and of course the Image and Signal Processing Group (ISPGroup) from University of Louvain (UCL, Belgium) hosting the OpenJPEG project.
Categories: OSGeo Planet

Paul Ramsey: Adding PgSQL to PHP on OSX

OSGeo Planet - Thu, 2017-10-12 14:00

I’m yak shaving this morning, and one of the yaks I need to ensmooth is running a PHP script that connects to a PgSQL database.

No problem, OSX ships with PHP! Oh wait, that PHP does not include PgSQL database support.

Adding PgSQL to PHP on OSX

At this point, you can either completely replace the built-in PHP with another PHP (probably a good idea if you're doing modern PHP development and want something newer than 5.5), or you can add PgSQL support to your existing PHP installation. I chose the latter.

The key is to build the extension you want without building the whole thing. This is a nice trick available in PHP, similar to the Apache module system for independent module development.

First, figure out what version of PHP you will be extending:

> php --info | grep "PHP Version"
PHP Version => 5.5.38

For my version of OSX, Apple shipped 5.5.38, so I’ll pull down the code package for that version.

Then, unbundle it and go to the php extension directory:

tar xvfz php-5.5.38.tar.bz2
cd php-5.5.38/ext/pgsql

Now the magic part. In order to build the extension, without building the whole of PHP, we need to tell the extension how the PHP that Apple ships was built and configured. How do we do that? We run phpize in the extension directory.

> /usr/bin/phpize
Configuring for:
PHP Api Version:         20121113
Zend Module Api No:      20121212
Zend Extension Api No:   220121212

The phpize process reads the configuration of the installed PHP and sets up a local build environment just for the extension. All of a sudden we have a ./configure script, and we’re ready to build (assuming you have installed the MacOSX command-line developers tools with XCode).

> ./configure \
    --with-php-config=/usr/bin/php-config \
    --with-pgsql=/opt/pgsql/10
> make

Note that I have my own build of PostgreSQL in /opt/pgsql. You’ll need to supply the path to your own install of PgSQL so that the PHP extension can find the PgSQL libraries and headers to build against.

When the build is complete, you’ll have a new modules/ directory in the extension directory. Now figure out where your system wants extensions copied, and copy the module there.

> php --info | grep extension_dir
extension_dir => /usr/lib/php/extensions/no-debug-non-zts-20121212 => /usr/lib/php/extensions/no-debug-non-zts-20121212
> sudo cp modules/ /usr/lib/php/extensions/no-debug-non-zts-20121212

Finally, you need to edit the /etc/php.ini file to enable the new module. If the file doesn’t already exist, you’ll have to copy in the template version and then edit that.

sudo cp /etc/php.ini.default /etc/php.ini
sudo vi /etc/php.ini

Find the line for the PgSQL module and uncomment and edit it appropriately.

;extension=php_pdo_sqlite.dll
;extension=php_pspell.dll

Now you can check and see if it has picked up the PgSQL module.

> php --info | grep PostgreSQL
PostgreSQL Support => enabled
PostgreSQL(libpq) Version => 10.0
PostgreSQL(libpq) => PostgreSQL 10.0 on x86_64-apple-darwin15.6.0, compiled by Apple LLVM version 8.0.0 (clang-800.0.42.1)

That’s it!

Categories: OSGeo Planet

Gis-Lab: Processing aerial survey data with the open-source OpenDroneMap package

OSGeo Planet - Thu, 2017-10-12 07:46

With the rapid development of both photogrammetric technologies and the industry of easy-to-use UAVs equipped with photo/video hardware, specialists from very different fields have taken a growing interest in organizing aerial surveys and processing the acquired data into geographic products such as orthophoto mosaics, digital terrain models and 3D models. The market offers a large number of solutions, both hardware (mainly UAVs) and software. Photogrammetric processing software is now developed by practically all major vendors (Autodesk, Trimble, …), and many new companies promoting their own packages have appeared (Agisoft, Pix4D, DroneDeploy, …). Open-source projects have been developing in parallel. The article presents a detailed description of installing one of the most successful open packages, OpenDroneMap, on a virtual machine, and covers the basics of using it.

Read | Discuss


Categories: OSGeo Planet

gvSIG Team: GIS applied to Municipal Management: Module 5.1 'Web services (Introduction to SDI)'

OSGeo Planet - Wed, 2017-10-11 22:29

With the first video of the fifth module, which covers access to web services from gvSIG, we introduce a fundamental concept for the efficient management of geographic information: Spatial Data Infrastructures (SDI). Their importance is such that a growing number of countries and regions around the world are legislating to make their implementation effective in every administration that produces digital geographic information.

SDIs are considered the ideal system for managing, in its entirety, the geographic information of any organization and, of course, of a municipality. In future modules of this course we will look at gvSIG Online, the open-source solution for putting them into operation. In today's module we will see how to work from the desktop GIS with the web map services that an SDI can provide.

Currently, a large number of administrations publish their cartography openly so that it can be loaded through web services. Thanks to the use of certain standards, these services can be accessed from gvSIG Desktop, which lets us load cartography into our project without downloading anything to disk.

To better understand this part of gvSIG, we will start with a first theoretical video introducing Spatial Data Infrastructures, where we explain what web services are, what types exist, and give some links that compile available services.

No cartography needs to be downloaded for this module, as the video is entirely theoretical.

The first video tutorial of this fifth module is the following:

Related posts:

Filed under: gvSIG Desktop, IDE, spanish, training Tagged: IDE, OGC, Web services
Categories: OSGeo Planet

Even Rouault: GDAL and cloud storage

OSGeo Planet - Wed, 2017-10-11 17:48
In the past weeks, a number of improvements related to access to cloud storage have been committed to GDAL trunk (future GDAL 2.3.0).

Cloud based virtual file systems
There was already support for accessing private data in Amazon S3 buckets through the /vsis3/ virtual file system (VFS). Besides a few robustness fixes, a few new capabilities have been added, like creation and deletion of directories inside a bucket with VSIMkdir() / VSIRmdir(). The authentication methods have also been extended to support, beyond the AWS_SECRET_ACCESS_KEY and AWS_ACCESS_KEY_ID environment variables, the other methods accepted by the "aws" command line utilities, that is to say storing credentials in the ~/.aws/credentials or ~/.aws/config files. If GDAL is executed from an Amazon EC2 instance that has been assigned rights to buckets, GDAL will automatically fetch the instance profile credentials.
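For example, instead of setting environment variables, the credentials can live in the same file the aws CLI uses (a typical ~/.aws/credentials; the key values below are placeholders):

```
[default]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
```

With that file in place, opening a /vsis3/ path needs no extra configuration; explicit environment variables, when set, are used instead.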
The existing read-only /vsigs/ VFS for Google Cloud Storage has been extended with write capabilities (creation of new files), to be on feature parity with /vsis3/. The authentication methods have also been extended to support OAuth2 authentication with a refresh token, or with service account authentication. The credentials can be stored in a ~/.boto configuration file. And when run from a Google Compute Engine virtual machine, GDAL will automatically fetch the instance profile credentials.
Two new VFS have also been added, /vsiaz/ for Microsoft Azure Blobs and /vsioss/ for Alibaba Cloud Object Storage Service. They support read and write operations similarly to the two previously mentioned VFS.

To make file and directory management easy, a number of Python sample scripts have been created or improved:

  my.tif /vsis3/mybucket/raster/
  -r /vsis3/mybucket/raster /vsigs/somebucket
  -lr /vsis3/mybucket
  /vsis3/mybucket/raster/my.tif /vsis3/mybucket/newdir
  -r /vsis3/mybucket/newdir

Cloud Optimized GeoTIFF
Over the past few months, there has been adoption by various actors of the cloud optimized formulation of GeoTIFF files, which enables clients to efficiently open and access portions of a GeoTIFF file through HTTP GET range requests.

Source code for an online service that offers validation of cloud optimized GeoTIFFs (using GDAL and the script underneath) and can run as an AWS Lambda function is available. Note: as the current definition of what is or is not a cloud optimized formulation has been decided unilaterally up to now, it cannot be excluded that it might change on some points (for example, relaxing constraints on the ordering of the data of each overview level, or enforcing that tiles are ordered in a top-to-bottom, left-to-right way)
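One layout check such a validator can perform is that tile data is stored top-to-bottom, left-to-right, i.e. that the byte offsets of non-empty tiles increase monotonically. A simplified illustration of the idea (not the actual validator code):

```python
def tiles_in_order(tile_offsets):
    """Check that non-empty tiles (offset > 0) appear in the file in
    top-to-bottom, left-to-right order, i.e. offsets are increasing."""
    previous = 0
    for offset in tile_offsets:
        if offset == 0:        # empty/sparse tile, nothing stored
            continue
        if offset < previous:
            return False
        previous = offset
    return True

print(tiles_in_order([8000, 16384, 0, 32768]))  # True
print(tiles_in_order([16384, 8000]))            # False
```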

GDAL trunk has received improvements to speed up access to sub-windows of a GeoTIFF file by making sure that the tiles that participate in a sub-window of interest are requested in parallel (this is true for public files accessed through /vsicurl/ or with the four above-mentioned specialized cloud VFS), by reducing the amount of data fetched to the strict minimum, and by merging requests for consecutive ranges. In some environments, particularly when accessing the storage service of a virtual machine from the same provider, HTTP/2 can be used by setting the GDAL_HTTP_VERSION=2 configuration option (provided you have a recent enough libcurl built against nghttp2). In that case, HTTP/2 multiplexing will be used to request and retrieve data on the same HTTP connection (saving the time to establish TLS, for example). Otherwise, GDAL will default to several parallel HTTP/1.1 connections. For long-lived processes, efforts have been made to re-use existing HTTP connections as much as possible.
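The range-merging idea can be pictured with a small sketch: sort the needed byte ranges and fuse the ones that touch, so they can be fetched in fewer GET requests (an illustration of the principle, not GDAL's actual implementation):

```python
def coalesce_ranges(ranges, max_gap=0):
    """Merge byte ranges [(start, size), ...] that are consecutive
    (or closer than max_gap) so each group needs only one GET request."""
    merged = []
    for start, size in sorted(ranges):
        if merged and start <= merged[-1][0] + merged[-1][1] + max_gap:
            last_start, last_size = merged[-1]
            merged[-1] = (last_start, max(last_size, start + size - last_start))
        else:
            merged.append((start, size))
    return merged

# Three tile reads, two of them back-to-back: two requests instead of three.
print(coalesce_ranges([(0, 1024), (1024, 1024), (8192, 512)]))
# [(0, 2048), (8192, 512)]
```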

Categories: OSGeo Planet

gvSIG Team: Impressions from the 4th gvSIG Conference Mexico

OSGeo Planet - Tue, 2017-10-10 01:24

Last week the 4th gvSIG Conference Mexico took place. For the fourth consecutive year, the Mexican gvSIG Community held an event full of activities, with interesting talks and workshops filled to capacity.

I think the best way to share my impressions of the conference is to publish a few pictures of it. I will only repeat some of the words I used at the closing of the event: the gvSIG conferences in Mexico have been seeds feeding a process that goes beyond the migration of this or that agency to free software; from the field of geomatics they have set in motion something much more important, a country's path towards technological independence.

The conference was held at the same time as the 9th Latin America and Caribbean Conference; in two weeks we have the 13th International Conference… indicators, among many others, showing that gvSIG is a consolidated and constantly growing project, used in more than 160 countries. And just wait for 2018, which will bring many surprises…

Filed under: events, spanish Tagged: Conferencia, Culiacán, jornadas, México, Sinaloa
Categories: OSGeo Planet

gvSIG Team: GIS applied to Municipal Management: Module 4.2 'Attribute tables (table joins)'

OSGeo Planet - Mon, 2017-10-09 17:46

The second video of the fourth module is now available, in which we continue working with attribute tables. This time we will see how to perform table joins, very useful when we have a layer with cartography of our municipality (for example a layer of parcels or districts) and we want to join to it information from an external table, such as the population of each parcel or district. For this, both tables need a field with common values, which is the field the join will be based on.

The third module has all the information on how to install gvSIG, and the second module, in its FAQ section, explains how to ask any questions that come up during the course.

This module uses a wizard to open a CSV file, which is usually not installed by default in gvSIG 2.3.1. To install it, see the information in the following post.

The cartography used in this module was already available in the previous post, but if you have not downloaded it yet you can do so from the following link.

The second video tutorial of this fourth module is the following:

Related posts:

Module 1: Differences between GIS and CAD

Module 2: Introduction to Reference Systems

Module 3: Views, layers, symbology, labeling

Module 4.1: Attribute tables (alphanumeric information)

Filed under: gvSIG Desktop, spanish, training Tagged: ayuntamientos, gestión municipal, información alfanumérica, Tablas, tablas de atributos, unión de tablas
Categories: OSGeo Planet

From GIS to Remote Sensing: Developing the SCP 6: Main interface and Multiple band sets

OSGeo Planet - Mon, 2017-10-09 08:00
I am updating the Semi-Automatic Classification Plugin (SCP) to version 6 (codename Greenbelt) which will be compatible with the upcoming QGIS 3.

In this previous post, I described the main changes to the SCP dock, and in particular the new tabs designed to optimize the space on the display.
In this post I present the redesigned Main interface that contains the SCP tools.
In addition, I have implemented a profound change to the SCP core: the ability to define multiple band sets. This opens many possibilities for new tools that can exploit the combination of bands (I am thinking about raster mosaics and more) which I'll try to include in SCP 6.

Main interface
Categories: OSGeo Planet

GRASS GIS crash course

OSGeo Planet - Sun, 2017-10-08 18:44

On 3 November 2017 there will be a GRASS GIS crash course, organized by and the Faculty of Geo-Information Science and Earth Observation of ITC. The course will provide participants with an overview of the software's capabilities and hands-on experience in raster, vector and time series processing with the open source software GRASS GIS 7. For more information, go to You can register on our Meetup page [update: the maximum number of participants has been reached, registration is closed].



Course structure/contents

The course consists of mainly practical sessions with a short intro to basic concepts at the beginning. All the code and material will be publicly available.

We will cover the following introductory topics, among others:
  • GRASS database, locations and mapsets
  • Working with different data types (vector, raster, 3D raster formats, time series)
  • Different interfaces (GUI, CLI, Python)
  • Region and mask
  • Scripting examples
  • Visualization of spatial data; scale bar, symbols, grids, color tables, histograms
  • Where to find help

After the introduction with simple examples, we will go together through a guided exercise demonstrating a full workflow in GRASS GIS involving raster, vector and temporal data. At the end, participants will have the chance to follow three different tutorials: remote sensing analysis using satellite data, time series processing, and spatial point interpolation. Teachers will be available for questions and explanations.

When and where?
  • Friday, November 3rd, 2017 from 10.30 to 16.30 (with a 1-hour lunch break)
  • ITC – Faculty of Geo-Information Science and Earth Observation. University of Twente. Hengelosestraat 99, 7514 AE, Enschede. The Netherlands.
Categories: OSGeo Planet

Free and Open Source GIS Ramblings: Movement data in GIS #8: edge bundling for flow maps

OSGeo Planet - Sun, 2017-10-08 17:50

If you follow this blog, you’ll probably remember that I published a QGIS style for flow maps a while ago. The example showed domestic migration between the nine Austrian states, a rather small dataset. Even so, it required some manual tweaking to make the flow map readable. Even with only 72 edges, the map quickly gets messy:

Raw migration flows between Austrian states, line width scaled by flow strength

One popular approach in the data viz community to deal with this problem is edge bundling. The idea is to reduce visual clutter by generating bundles of similar edges.

Surprisingly, edge bundling is not available in desktop GIS. Existing implementations in the visual analytics field often run on GPUs because edge bundling is computationally expensive. Nonetheless, we have set out to implement force-directed edge bundling for the QGIS Processing toolbox [0]. The resulting scripts are available at

The main procedure consists of two tools: bundle edges and summarize. Bundle edges takes the raw straight lines and incrementally adds intermediate nodes (called control points), shifting them according to computed spring and electrostatic forces. If the input is 72 lines, the output is again 72 lines, but each line geometry has been bent so that similar lines overlap and form a bundle.
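The control-point subdivision can be sketched as follows (a simplified illustration; the actual Processing scripts then iteratively displace these points with the spring and electrostatic forces mentioned above):

```python
def add_control_points(line, n_subdivisions):
    """Insert evenly spaced control points on a straight edge given as
    ((x0, y0), (x1, y1)); bundling then moves these points, not the endpoints."""
    (x0, y0), (x1, y1) = line
    pts = []
    for i in range(n_subdivisions + 1):
        t = i / n_subdivisions
        pts.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return pts

print(add_control_points(((0.0, 0.0), (4.0, 0.0)), 4))
# [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0), (4.0, 0.0)]
```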

After this edge bundling step, most common implementations compute a line heatmap, that is, for each map pixel they determine the number of lines passing through the pixel. But QGIS does not support line heatmaps, and this approach also has issues distinguishing lines that run in opposite directions. We have therefore implemented a summarize tool that computes the local strength of the generated bundles.

Continuing our previous example: if the input is 72 lines, summarize breaks each line into its individual segments and determines the number of segments from other lines that are part of the same bundle. If a weight field is specified, each line is not just counted once but according to its weight value. The resulting bundle strength can be used to create a line layer style with data-defined line width:

Bundled migration flows

To avoid overlaps of flows in opposing directions, we define a line offset. Finally, summarize also adds a sequence number to the line segments. This sequence number is used to assign a line color on the gradient that indicates flow direction.

I already mentioned that edge bundling is computationally expensive. One reason is that we need to perform a pairwise comparison of edges to determine whether they are similar and should be bundled. This comparison results in a compatibility matrix, and depending on the defined compatibility threshold, different bundles can be generated.
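As an illustration of what goes into such a matrix, here is a stripped-down sketch using only angle compatibility (the force-directed edge bundling formulation also combines scale, position and visibility terms; this is not the actual toolbox code):

```python
import math

def angle_compatibility(e1, e2):
    """|cos| of the angle between two edges given as ((x0,y0),(x1,y1));
    1.0 for parallel edges, 0.0 for perpendicular ones."""
    def vec(e):
        (x0, y0), (x1, y1) = e
        return (x1 - x0, y1 - y0)
    (ax, ay), (bx, by) = vec(e1), vec(e2)
    dot = ax * bx + ay * by
    norms = math.hypot(ax, ay) * math.hypot(bx, by)
    return abs(dot) / norms

def compatibility_matrix(edges, threshold=0.6):
    """Pairwise boolean matrix: True where two edges are similar enough
    to be bundled (here using angle compatibility only, for brevity)."""
    n = len(edges)
    return [[angle_compatibility(edges[i], edges[j]) >= threshold
             for j in range(n)] for i in range(n)]

horizontal = ((0, 0), (10, 0))
diagonal = ((0, 0), (10, 1))
vertical = ((0, 0), (0, 10))
m = compatibility_matrix([horizontal, diagonal, vertical])
print(m[0][1], m[0][2])  # True False
```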

The following U.S. dataset contains around 4000 lines and bundling it takes a considerable amount of time.

One approach to speed up computations is to first use a quick clustering algorithm and then perform edge bundling on each cluster individually. If done correctly, clustering significantly reduces the size of each compatibility matrix.

In this example, we divided the edges into six clusters before bundling them. If you compare this result to the visualization at the top of this post (which did not use clustering), you’ll see some differences here and there but, overall, the results are quite similar:

Looking at these examples, you’ll probably spot a couple of issues. There are many additional ideas for potential improvements from existing literature which we have not implemented yet. If you are interested in improving these tools, please go ahead! The code and more examples are available on Github.

For more details, leave your email in a comment below and I’ll gladly send you the pre-print of our paper.

[0] Graser, A., Schmidt, J., Roth, F., & Brändle, N. (accepted) Untangling Origin-Destination Flows in Geographic Information Systems. Information Visualization – Special Issue on Visual Movement Analytics.

Read more:

Categories: OSGeo Planet

OSGeo-fr: 5th QGIS user meeting

OSGeo Planet - Fri, 2017-10-06 11:16

OSGeo-fr and Montpellier SupAgro are organizing the fifth meeting of the French-speaking QGIS community.

The meeting will take place on 14 and 15 December at Montpellier SupAgro.

The first day is organized as a barcamp: informal workshops where you (the users) can propose topics, present work, exchange and discuss. Its purpose is to foster exchange between users, contributors and sponsors. Come and discover an original way of contributing to QGIS!

Here are some topics covered in 2015-2016:

  • How to translate QGIS?
  • How to use QGIS on a tablet (QField)?
  • How to create a map with QGIS and publish it on the web?

During the second day, several talks will be given. This year, the theme of the day is "QGIS 3.0: what will this new version change for users?".

The call for sponsorship went out this week, and the call for presentations will follow very soon. Stay tuned!

For any information or request, use the contact form provided for this purpose.

About Montpellier SupAgro and the AgroTIC programme

Montpellier SupAgro is a teaching and research institution that trains agronomy engineers. The AgroTIC programme is one of the degree programmes offered by Montpellier SupAgro. Every year we train about fifteen agronomy engineers with a dual competence in geomatics. QGIS is a central tool in our curriculum, on the one hand for the spatial information management features it offers, and on the other because it is free and open source and can be reused by our graduates whatever their professional context.
For the past 3 years, we have organized an exchange event around QGIS where our students can meet professionals who use the software. The event also lets them build on their coursework, as they present the new QGIS features to the professionals.
Montpellier SupAgro:
AgroTIC:

About OSGeo-fr

The association is the French-speaking representation of the Open Source Geospatial Foundation, whose mission is to support and promote the collaborative development of open geospatial data and technologies. The association serves as a legal entity to which community members can contribute code, funding and other resources, with the assurance that their contributions will be maintained for the benefit of the public.

The association also serves as a reference and support organization for the free geospatial community, and provides a common forum and shared infrastructure to improve collaboration between projects.

Participation is open to the whole open source community. All of the association's work is published on public forums in which a free community of participants can get involved. OSGeo Foundation projects are all freely available and usable under an OSI-certified open source license.

Categories: OSGeo Planet

Andrea Antonello: JGrasstools back in black: The Horton Machine - Part 1

OSGeo Planet - Thu, 2017-10-05 15:21
I have to admit that when, back in 2002, working at the University of Trento, Faculty of Environmental Engineering, we published the first release of The Horton Machine, I never thought we would ever change that name. The Horton Machine was a collection of around 40 GRASS modules written in C and dedicated to advanced hydrology and geomorphology. They represented the effort of the past 10 years (now around 20) of professor Riccardo Rigon and his team.
At that time Riccardo and I were dreaming about a nice-to-use GUI for GRASS that would allow GRASS to be used more outside of the academic domain. In 2003 we started the JGrass project with that objective: create a user-friendly GUI for GRASS. The reaction of the GRASS community was bad, mostly (so they said) because Java was not open source. QGIS was also coming along and becoming the natural choice for an interface to GRASS.
Not at all in the mood for religious wars, in 2006 we decided to join the Java tribe and moved our resources to support the uDig project, where we happily lived and developed for many, many years. We kind of stayed between two worlds, still using GRASS and its mapsets, but living in the user-friendly Java world. :-)
At that time the processing libraries for hydrology and geomorphology (as well as LiDAR and forestry later on) were extracted into a library that could be used in standalone mode or inside uDig. That library, as a logical follow-up, got the name JGrasstools.
Had I only known better! On some (rare, but still!) occasions I and other JGrasstools developers have been asked why we still use the name JGrasstools, since we are not directly "bound" to GRASS. Well, I have fought over this a few times and had no hard feelings about it, apart from the huge amount of work it would have taken to change everything.
The last time it happened was at the FOSS4G conference in Paris. At the end of a great presentation given by Silvia about the tools we developed for forestry management using LiDAR data, (again) a member of the GRASS community asked the same old question during the time dedicated to questions about the presentation: why do you still call it JGrasstools...?
This was the final straw for me. I still have to understand why people do certain things, but one thing was sure for me: we had to change that name, to let some of the GRASS community members sleep sweet dreams, and to be finally free!
So this is it. After 15 years of continuous development on the JGrasstools core, we go back to our origins: The Horton Machine.
The Horton Machine is now something more than just hydrology and geomorphology: there are projects that support interaction with the mobile digital field mapping app Geopaparazzi, a module that supports interaction with spatial databases (also on Android), the LESTO modules developed with the team of professor Giustino Tonon at the Free University of Bolzano... and, among other things, also the plugins for the desktop GIS gvSIG.
It has been quite an exercise to make this namespace migration, and it has taken me days between code refactoring, domain registration, Maven publishing updates, documentation updating (still much work to be done there) and and and... but it is done now. Many will sleep sweet dreams, and I will be the first. And maybe at the next conference someone will ask a question related to the content of the presentation. Don't know which one? A hint: "How did you get such a high single-tree extraction rate from LiDAR data with your tools?" ;-)
What will happen now?
Before the 13th International gvSIG Conference we will do the first HortonMachine-branded release and, together with it, the connected release of gvSIG plugins.
Maven releases of the modules will also be done. JGrasstools was at version 0.8.1; The HortonMachine will most probably start at 0.9.0. It surely should have been a major number, but well, we still need to reach the first major. :-)
In the next post we will show you what you will find in the release. Stay tuned!

Categories: OSGeo Planet