OSGeo Planet

Free and Open Source GIS Ramblings: Movement data in GIS #10: open tools for AIS tracks from MarineCadastre.gov

OSGeo Planet - Sat, 2017-10-28 13:22

MarineCadastre.gov is a great source for AIS data along the US coast. Their data formats and tools, though, are less open. Luckily, GDAL – and therefore QGIS – can read ESRI File Geodatabases (.gdb).

They also offer a Track Builder script that creates lines out of the broadcast points. (It can also join additional information from the vessel and voyage layers.) We could reproduce the line creation step using tools such as Processing’s Point to path. But this post will show how to create PostGIS trajectories instead.
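
In essence, the line-creation step groups broadcast points per vessel and orders them by time. A rough out-of-database sketch in plain Python (the field order mmsi, timestamp, x, y is an assumption here; the actual broadcast schema may differ):

```python
from operator import itemgetter

def build_tracks(points):
    """Group AIS broadcast points into per-vessel tracks.

    points: iterable of (mmsi, timestamp, x, y) tuples.
    Returns {mmsi: [(x, y, timestamp), ...]} with vertices sorted by time.
    """
    tracks = {}
    # sort by vessel id, then timestamp, mirroring ORDER BY mmsi, basedatetime
    for mmsi, ts, x, y in sorted(points, key=itemgetter(0, 1)):
        tracks.setdefault(mmsi, []).append((x, y, ts))
    return tracks
```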

First, we have to import the points into PostGIS using either DB Manager or Processing’s Import into PostGIS tool:

Then we can create the trajectories. I’ve opted to create a materialized view:

The first part of the query creates a temporary table called ptm (short for PointM). This step adds time stamp information to each point. The second part of the query then aggregates these PointMs into trajectories of type LineStringM.

CREATE MATERIALIZED VIEW ais.trajectory AS
WITH ptm AS (
    SELECT b.mmsi,
           st_makepointm(
               st_x(b.geom),
               st_y(b.geom),
               date_part('epoch', b.basedatetime)
           ) AS pt,
           b.basedatetime t
    FROM ais.broadcast b
    ORDER BY mmsi, basedatetime
)
SELECT row_number() OVER () AS id,
       st_makeline(ptm.pt) AS st_makeline,
       ptm.mmsi,
       min(ptm.t) AS min_t,
       max(ptm.t) AS max_t
FROM ptm
GROUP BY ptm.mmsi
WITH DATA;

The trajectory start and end times (min_t and max_t) are optional but they can help speed up future queries.

One of the advantages of creating trajectory lines is that they render many times faster than the original points.

Of course, we end up with some artifacts at the border of the dataset extent. (Files are split by UTM zone.) Trajectories connect the last known position before the vessel left the observed area with the position of reentry. This results, for example, in vertical lines which you can see in the bottom left corner of the above screenshot.
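
A crude way to suppress such reentry artifacts – not part of the original workflow, just a sketch – is to split a trajectory wherever the implied speed between consecutive vertices is implausible. The threshold value is an arbitrary placeholder:

```python
import math

def split_on_gaps(vertices, max_speed=30.0):
    """Split a trajectory wherever the implied speed between consecutive
    vertices exceeds max_speed (map units per second).

    vertices: non-empty [(x, y, t), ...] sorted by t, as in a LineStringM.
    The 30.0 default is an arbitrary placeholder, not a value from the post.
    """
    parts, current = [], [vertices[0]]
    for (x0, y0, t0), (x1, y1, t1) in zip(vertices, vertices[1:]):
        dist = math.hypot(x1 - x0, y1 - y0)
        dt = max(t1 - t0, 1e-9)  # guard against duplicate timestamps
        if dist / dt > max_speed:
            # implausible jump: close the current part and start a new one
            parts.append(current)
            current = []
        current.append((x1, y1, t1))
    parts.append(current)
    return parts
```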

With the trajectories ready, we can go ahead and start exploring the dataset. For example, we can visualize trajectory speed and/or create animations:

Purple trajectory segments are slow while green segments are faster

We can also perform trajectory analysis, such as trajectory generalization:
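
The post does not say which generalization method was used; Douglas-Peucker is a common choice for trajectory generalization, sketched here in plain Python:

```python
import math

def douglas_peucker(pts, eps):
    """Classic Douglas-Peucker line generalization.

    pts: [(x, y), ...]; eps: distance tolerance in map units.
    """
    if len(pts) < 3:
        return list(pts)
    (x0, y0), (x1, y1) = pts[0], pts[-1]
    norm = math.hypot(x1 - x0, y1 - y0) or 1e-12
    # find the interior point farthest from the chord joining the endpoints
    dmax, idx = -1.0, 0
    for i, (x, y) in enumerate(pts[1:-1], 1):
        d = abs((x1 - x0) * (y0 - y) - (x0 - x) * (y1 - y0)) / norm
        if d > dmax:
            dmax, idx = d, i
    if dmax <= eps:
        return [pts[0], pts[-1]]
    # recurse on both halves and stitch, dropping the duplicated split point
    left = douglas_peucker(pts[:idx + 1], eps)
    right = douglas_peucker(pts[idx:], eps)
    return left[:-1] + right
```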

This is a first proof of concept. It would be great to have a script that automatically fetches the datasets for a specified time frame and list of UTM zones and loads them into PostGIS for further processing. It would also be worthwhile to make use of the information in the vessel and voyage tables, splitting trajectories into individual voyages.

Read more:


Categories: OSGeo Planet

From GIS to Remote Sensing: SCP Questions of This Month: October

OSGeo Planet - Fri, 2017-10-27 08:00
This post is a collection of questions and answers about the Semi-Automatic Classification Plugin (SCP) and remote sensing which were discussed in the Facebook group and the Google+ Community this month.
These questions range from supervised classification techniques to software issues, and may be useful to readers of this blog for solving problems when using SCP.

Categories: OSGeo Planet

GIS for Thought: #Ireland 2023

OSGeo Planet - Thu, 2017-10-26 09:53

Ireland is bidding for the 2023 Rugby World Cup.

They have submitted 12 stadiums in their bid. They cover all four provinces and the breadth of the island.

Ranging from Europe’s third biggest stadium, Croke Park in Dublin, welcoming 1.5 million fans every year and packing in 82,300 dedicated fans for the GAA (Gaelic Athletic Association) Hurling and Gaelic Football finals. An integral part of Ireland’s history through sweat, blood, and identity.

The Aviva Stadium in Dublin, the world’s oldest international rugby stadium and venue for the 2011 Europa League Final between Portuguese sides Porto and Braga.

Ravenhill Stadium in Belfast. Home of Ulster Rugby and in 1991 venue for Japan’s first match victory in a Rugby World Cup.

Thomond Park in Limerick. Heart of the community and host to a 12 year unbeaten run for Munster rugby. Winner of the ‘Best Rugby Stadium in the World’ vote in 2013.

Ireland 2023 Stadiums

Categories: OSGeo Planet

gvSIG Team: Materials from the Introduction to Scripting with gvSIG workshops given in Valencia and Culiacán

OSGeo Planet - Thu, 2017-10-26 09:45

During the 13th International gvSIG Conference in Valencia and the 4th gvSIG Conference in Mexico, several Introduction to Scripting with gvSIG workshops were given.

You can download the complete workshop from the following link: Introduction to Scripting workshop.

The workshop was prepared with gvSIG 2.3.1 and everything is explained step by step in slides, included both in the package and online. To install the package, just go to the Add-ons Manager -> Installation from file.

We hope you find it useful. Any questions about the workshop or the Scripting module can be asked on the Mailing Lists.


Filed under: development, gvSIG Desktop, gvSIG development, scripting, spanish
Categories: OSGeo Planet

gvSIG Team: GIS applied to Municipal Management: Module 7.1 'Editing (creating new layers, graphical editing, alphanumeric editing)'

OSGeo Planet - Thu, 2017-10-26 07:22

The first video of the seventh module is now available, in which we start with the editing part.

Editing is a very important part of Geographic Information Systems, since it allows us, for example, to create new vector layers, digitize elements, and add alphanumeric information to those geometries. This is what we will see in this first part of the module.

Many tools are available in the gvSIG editing module, notably those to create new elements (points, lines, polygons…), rotate, scale and move them, create parallels, extend or trim lines, join or split geometries, create autopolygons or island polygons, etc.

We can digitize either over reference cartography, for example an orthophoto, or use the editing console to type the insertion coordinates.

The cartography used in this video can be downloaded from the following link.

The first video tutorial of this seventh module is the following:

Related posts:


Filed under: gvSIG Desktop, spanish, training Tagged: ayuntamientos, edición, Edición alfanumérica, edición gráfica, gestión municipal
Categories: OSGeo Planet

gvSIG Team: Workshops about Geostatistics with R and gvSIG given in Valencia and Culiacán

OSGeo Planet - Tue, 2017-10-24 12:26

During the 13th International gvSIG Conference in Valencia and 4th gvSIG Conference in Mexico, several workshops were given about R integration in gvSIG and some development examples with R.

From the gvSIG Association we want to share all this material and make it available for testing and consultation. You can download it from this link. Apart from a layer of Brazil, all the data in the workshop has been downloaded from the UK Police Open Data portal.

We are going to explain some of the examples:

  • Test 1 – Kriging example, based on an example found here. The goal was to learn how to adapt an existing script to gvSIG.

  • Test 11 – Decomposition. Reading CSV files of UK crimes, we try to analyze whether there is a seasonal component based on time series. We also create a forecast of future crimes for that region.

  • Test 12 – Mean Center. An analysis of the mean center of the crime point cloud in the North Wales region. The data to run the same calculation for the city of London is also included. In the North Wales area, the mean center shows a seasonal component: it moves to the east in winter and to the west in summer. We also take the total number of crimes into account (ncrimes column).

  • Test 2 – R example transforming multiple CSV files (UK crimes) into a single shapefile. The input parameter is the folder containing the CSV files.

  • Test 3 – Mapa. A simple example showing how to pass parameters from gvSIG to R.

  • Test 4 – A simple example of an R plot.

The workshops were made using gvSIG 2.3.1. If you are using a gvSIG portable version, the R plugin is already included. If you are using an installable version, you should install the "R" plugin from the Add-ons Manager.

An error that can occur is that some R libraries are missing. In that case you have to install them manually from the R console (this step could be automated inside the script). For example, if an error similar to "Error library tcltk2 not found" appears, install the library by typing install.packages("tcltk2") in the R console.

The libraries you may need are tcltk2, forecast, xts and lattice. To open the R console, go to the "System" tab in the Scripting Composer, search for the r.extension module, and execute the RShell script as explained in this documentation.

To install the workshop data, go to the Add-ons Manager -> Install package from file and select the previously downloaded file. Once it is installed, go to the Scripting Composer under Tools -> Scripting. You will find the workshop inside the addons folder.

If you want to run one of the R examples, execute it from the Python file that handles the execution: each ejemploN.py runs testN.r.

Only two scripts need special preconditions:

  • test2: you need to open a View with projection 4326
  • test3: you need an open View with the UFEBRASIL layer, with that layer selected in the Table of Contents (it will appear in bold).

With a little more development these conditions could be handled automatically, but since the purpose of the workshop was to keep things simple, we did not implement that.

Any question can be asked in the mailing list.

Hope you like it!

Some of the references we have used for the workshop:


Filed under: development, english, gvSIG Desktop, gvSIG development, scripting Tagged: r
Categories: OSGeo Planet

gvSIG Team: Final notes: 13th International gvSIG Conference

OSGeo Planet - Tue, 2017-10-24 10:19

Fantastic, great, excellent presentations, packed workshops… With these words, and with many thanks, most of the more than 300 attendees of the 13th International gvSIG Conference conveyed to us the feeling of satisfaction we shared throughout the event.

That’s a reflection of the situation of the gvSIG project. It started developing a desktop GIS and has continued with a whole catalog of open source software solutions, the gvSIG Suite, with ‘horizontal’ solutions: gvSIG Desktop, gvSIG Mobile, gvSIG Online, and a wide range of sector products: gvSIG Roads, gvSIG Educa, gvSIG Crime…

The gvSIG Association, responsible for the sustainability and evolution of this suite, has become a model of professional services in open source geomatics. During the conference, some of these projects related to critical sectors were presented, for example by the Provincial Firefighters Consortium of Valencia, the National Civil Protection Directorate of Spain, and the Observatory for Security and Citizen Coexistence of the Government of Córdoba in Argentina.

The vitality of the project is also shown by the innumerable novelties presented during the first day of the event: gvSIG Desktop 2.4 includes a lot of improvements, so does gvSIG Online, the new gvSIG Mobile based on Geopaparazzi is being released now, gvSIG Crime…

And, complementing the professional sector, a set of outstanding papers from the fields of research and education. The synergies between administration, university and business that we spoke about at the beginning of the project are now materializing around gvSIG.

The gvSIG conferences are always a good place to expand training. Workshops at full capacity until the last minute of the event, whether they were for users or for developers. Another excellent sign.

Finally, on Friday, there was a working day for the different teams working on the development of the various gvSIG products. The feeling: next year will be even better.

You will understand that we are happy. And that we share it.


Filed under: english, events Tagged: 13th gvSIG Conference
Categories: OSGeo Planet

gvSIG Team: Materials from the Geostatistics workshops with R and gvSIG given in Valencia and Culiacán

OSGeo Planet - Tue, 2017-10-24 08:51

During the 13th International gvSIG Conference in Valencia and the 4th gvSIG Conference in Mexico, several very similar workshops were given explaining the integration of R in gvSIG, with development examples in R.

From the gvSIG Association we want to make all this material available for anyone to consult and try out. You can download it from this link. Apart from a layer of Brazil, the main content of the workshop was downloaded from the UK crime Open Data portal.

Let us explain the different examples:

  • Test 1 – Kriging example, based on an example found on the Internet at the following link. The goal was to learn how to adapt existing examples to gvSIG.

  • Test 11 – Decomposition. The UK crime CSV files are read, analyzing whether they have a seasonal component based on time series. With the result, we try to forecast the future behaviour of crimes in the region.

  • Test 12 – Mean Center. An analysis of the mean center of the crime point cloud in the North Wales region. The option to run the calculation for the city of London is also included. In the North Wales area, the mean center shows a seasonal shift, with peaks in winter (the mean center moving east) and dips in summer (moving west). The calculation is also done taking the total number of crimes into account (ncrimes column).

  • Test 2 – R example transforming a whole series of CSV files into a single shapefile with all UK crimes for a period of time. The only input parameter is the folder containing the CSV files.

  • Test 3 – Mapa. A simple example of passing parameters and displaying a map through R.

  • Test 4 – A simple example of a chart in R.

The workshops were carried out with gvSIG 2.3.1. If you use the portable version, the R plugin should come installed by default. If you use the installable version, you will have to go to the Add-ons Manager and download the corresponding "R" plugin.

An error that may appear is a warning that certain libraries are not installed. In that case you have to go to the R console and install them manually (this could also be automated inside the script). For example, if an "Error library ggplot2 not found" or similar appears, you should run install.packages("tcltk2") in the console. In principle, the only R libraries needed are: tcltk2, forecast, xts, lattice

This is explained in the documentation of the workshop given the previous year:

http://downloads.gvsig.org/download/web/es/build/html/workshops/workshop_12gvsig_r/index.html#librerias-de-r

To use the workshop examples, go to the Add-ons Manager -> Installation from file and select the previously downloaded file. Once this is done, if we go to Tools -> Scripting -> Scripting Composer, a folder with the workshop name should appear inside the /addons/ folder.

To run these scripts, we have to open the corresponding Python script that executes them. In this case, each ejemploN.py runs the R script testN.r

The two scripts that need some precondition to run are:

  • test2: needs a View with projection 4326 assigned
  • test3: needs an open View with the UFEBRASIL layer shipped with the script itself, with that layer selected in the View's Table of Contents (TOC).

With some more development these conditions could be handled better, but to keep the workshop at a beginner level this has not been implemented in the scripts.

Any doubt, question or recommendation about the workshop can be sent to the Mailing Lists.

We hope the workshops were useful and interesting!

Some of the base bibliography used for the workshops:


Filed under: development, events, gvSIG Desktop, gvSIG development, scripting, spanish
Categories: OSGeo Planet

GeoTools Team: GeoTools 17.3 released

OSGeo Planet - Tue, 2017-10-24 06:04
The GeoTools team is pleased to announce the release of GeoTools 17.3. This release is also available from our Maven repository.

This release is made in conjunction with GeoServer 2.11.3.

GeoTools 17.3 marks the switch of the 17.x series to maintenance mode (as 18.x takes the role of stable release) and is a recommended upgrade for projects already using the 17.x series. This release comes with 15 assorted fixes and a couple of minor improvements:
  • Several improvements in image mosaic and raster rendering (in particular related to mosaics with mixed CRS, filtering and sorting, and direct modification of the mosaic index)
  • Avoid rendering empty outputs on raster data when the oversampling factor reaches very high values
  • Better mapping of dates in the Oracle datastore; in particular, DATE is now mapped to java.sql.Timestamp
  • Better toString for temporal filters (handy if you're debugging some code involving them)
  • Fixed an issue that prevented parsing GeoJSON documents with a "crs" key among their properties
And more! For more information please see the release notes (17.3 | 17.2 | 17.1 | 17.0 | 17-RC1 | 17-beta).

About GeoTools 17
  • The wfs-ng module is now a drop in replacement and will be replacing gt-wfs
  • The NetCDF module now uses NetCDF-Java 4.6.6
Upgrading
  • The AbstractDataStore has finally been removed, please transition any custom DataStore implementations to ContentDataStore (tutorial available).
Categories: OSGeo Planet

GeoServer Team: GeoServer 2.11.3 released

OSGeo Planet - Mon, 2017-10-23 17:35

We are happy to announce the release of GeoServer 2.11.3. Downloads are available (zip, war, dmg and exe) along with documentation and extensions.

GeoServer 2.11.3 is the first maintenance release of the GeoServer 2.11.x series, recommended for production systems. This release is made in conjunction with GeoTools 17.3.

Highlights of this release are featured below; for more information please see the release notes (2.11.3 | 2.11.2 | 2.11.1 | 2.11.0 | 2.11-RC1 | 2.11-beta).

New Features and Improvements
  • KML has two new options to control polygon placemark placement, to force it within the current view, and to make sure it’s inside polygons regardless of the polygon shape
  • The layer preview allows filtering on raster layers such as mosaics and image pyramids (which can apply the filter on the underlying mosaic index)
Bug Fixes
  • Improvements to pre-built legend treatment in GetLegendGraphic, which now works against a layer's default style too, and for workspace-specific styles
  • GDAL bindings jars restored in the build, making it easier to use the pre-built native support (as opposed to custom-built ones; for those, remember to remove the jars)
  • Raster rendering fixes: no more empty output when asking for highly oversampled rasters in WMS, and new image mosaic tiles are rendered even if outside the original mosaic bounds
  • GWC related improvements: seed links now work when running behind a proxy, and stricter checks for non-cacheable requests over the WMS service endpoint (especially useful when using direct integration)
  • GeoServer now changes the SLD version number when uploading a SLD 1.1 overwriting a SLD 1.0 file using the REST API (especially useful for QGIS and GeoNode integrations). The UI was not affected by this issue.
  • And several more, check the release notes for full details
About GeoServer 2.11

Articles, docs, blog posts and presentations:

  • OAuth2 for GeoServer (GeoSolutions)
  • YSLD has graduated and is now available for download as a supported extension
  • Vector tiles have graduated and are now available for download as an extension
  • The rendering engine continues to improve with underlying labels now available as a vendor option
  • A new “opaque container” layer group mode can be used to publish a basemap while completely restricting access to the individual layers.
  • Layer group security restrictions are now available
  • Latest in performance optimizations in GeoServer (GeoSolutions)
  • Improved lookup of EPSG codes allows GeoServer to automatically match EPSG codes making shapefiles easier to import into a database (or publish individually).
Categories: OSGeo Planet

gvSIG Team: Final notes: 13th International gvSIG Conference

OSGeo Planet - Mon, 2017-10-23 08:21

Fantastic, wonderful, excellent presentations, packed workshops… With these words, and with many thanks, a large part of the more than 300 attendees of the 13th International gvSIG Conference conveyed to us the feeling of satisfaction we shared throughout the event.

It is, in turn, a reflection of the current situation of the gvSIG project. It has gone from developing a desktop GIS to offering a whole catalog of open source solutions, the gvSIG Suite, with 'horizontal' solutions: gvSIG Desktop, gvSIG Mobile, gvSIG Online, and an ever wider range of sector products: gvSIG Roads, gvSIG Educa, gvSIG Crime…

The gvSIG Association, responsible for the sustainability and evolution of this suite, has become the reference for professional services in open source geomatics. Projects from sectors as critical as those presented by the Provincial Firefighters Consortium of Valencia, the National Civil Protection Directorate of Spain, and the Observatory for Security and Citizen Coexistence of the Government of Córdoba in Argentina were there.

The vitality of the project is also defined by the countless new features presented during much of the first day of the event. gvSIG Desktop 2.4 comes packed with improvements, gvSIG Online keeps pace, the new gvSIG Mobile based on Geopaparazzi arrives, gvSIG Crime is presented…

And, complementing the professional sector, a set of outstanding presentations from the fields of research and education. Those synergies between administration, university and business that we talked about in our beginnings are materializing around gvSIG.

The gvSIG conferences are always a good place to expand one's training. Workshops at full capacity until the last minute of the event, whether for users or for developers. Another excellent sign.

Finally, on Friday, a working day for the different teams developing the various gvSIG products. The feeling: next year will be even better.

You will understand that we are happy. And that we share it.


Filed under: events, spanish Tagged: 13as Jornadas gvSIG, 13gvsig
Categories: OSGeo Planet

gvSIG Team: GIS applied to Municipal Management: Module 6 'Add-ons Manager'

OSGeo Planet - Mon, 2017-10-23 08:06

The sixth module of the GIS applied to Municipal Management course is now available, covering the gvSIG Add-ons Manager.

Each gvSIG version ships with a large number of extensions and symbol libraries that are not installed by default, so as not to bloat the application; the user can choose when to install them, using the Add-ons Manager to do so.

In addition, if a new gvSIG extension or symbol library is published after a final version is released, there is no need to wait for the next version to get it: with the Add-ons Manager we can connect to the server and install it.

Likewise, if a bug has been fixed after a final release, the corresponding plugin can be updated without publishing a complete new version.

The Add-ons Manager offers three options:

  • Standard installation: plugins included in the downloaded installation package but not installed by default.
  • Installation from file: when we have downloaded the extension, script or symbol library package to our disk. Very useful, for example, when we create a symbol library and want to share it with the rest of the users in our organization.
  • Installation from URL: plugins available on the server. Normally used for packages published after a final version, either new features or fixes for bugs detected after the final release.

In this module it will not be necessary to download any cartography beforehand.

The video tutorial of this sixth module is the following:

Related posts:

 


Filed under: gvSIG Desktop, spanish, training Tagged: administrador de complementos, extensiones, librerías de símbolos, plugins
Categories: OSGeo Planet

Paulo van Breugel: GRASS GIS Jupyter notebooks

OSGeo Planet - Sun, 2017-10-22 09:43
A great source of information about GRASS GIS is the GRASS Wiki. One example is this list with GRASS GIS Jupyter notebooks which was just added by Markus Neteler (no introduction needed I guess). There are some really nice tutorials there, which alone is reason enough to check out this list. I have been using …

Continue reading GRASS GIS Jupyter notebooks

Categories: OSGeo Planet

Ian Turton's Blog: Adding a .prj file to existing data files

OSGeo Planet - Fri, 2017-10-20 00:00

While teaching a GeoServer course recently, we were trying to add a collection of tif and world files to GeoServer as an image mosaic. But the operation kept failing as GeoServer was unable to work out the projection of the files.

This problem can be avoided by adding a .prj file next to each tif file to help GeoServer out. However, we had hundreds of files, and a certain national mapping agency had just assumed that everyone knew its files were in EPSG:27700.

Later, I worked up a quick solution to this problem. GeoTools is capable of writing out a WKT representation of a projection and Java has no problem walking a directory tree matching a regular expression.

Getting the WKT of a projection is trivial:

CoordinateReferenceSystem crs = CRS.decode("epsg:27700"); String wkt = crs.toWKT();

Walking the directory tree was a little trickier, but can be done with the Files.walkFileTree method and an anonymous SimpleFileVisitor:

public static ArrayList<File> match(String glob, String location) throws IOException {
    ArrayList<File> ret = new ArrayList<>();
    final PathMatcher pathMatcher = FileSystems.getDefault().getPathMatcher("glob:**/" + glob);
    Files.walkFileTree(Paths.get(location), new SimpleFileVisitor<Path>() {
        @Override
        public FileVisitResult visitFile(Path path, BasicFileAttributes attrs) throws IOException {
            if (pathMatcher.matches(path)) {
                ret.add(path.toFile());
            }
            return FileVisitResult.CONTINUE;
        }

        @Override
        public FileVisitResult visitFileFailed(Path file, IOException exc) throws IOException {
            return FileVisitResult.CONTINUE;
        }
    });
    return ret;
}

The full code can be found in this snippet. Usage is pretty simple; to add a .prj file to a single file (say a shapefile):

java AddProj epsg:27700 file.shp

Or to deal with a whole directory

java AddProj epsg:27700 /data/os-data/rasters/streetview/*.tif

Which adds a .prj file to all the .tif files in that directory and all subdirectories.

Obviously you can use other EPSG codes if your data supplier assumes that everyone knows their projection is the only one in the world.
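
For comparison, the same idea can be sketched in Python with pathlib (a hypothetical add_prj helper, not part of the original post; the WKT string would come from GeoTools or any other CRS source):

```python
from pathlib import Path

def add_prj(root, pattern, wkt):
    """Write a .prj file next to every file under root matching pattern
    (a glob such as '*.tif'), mirroring the Java snippet above.

    wkt: the projection's well-known text, treated here as an opaque string.
    Returns the list of .prj paths written.
    """
    written = []
    for f in Path(root).rglob(pattern):  # recursive walk, like walkFileTree
        prj = f.with_suffix(".prj")
        prj.write_text(wkt)
        written.append(prj)
    return written
```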

Categories: OSGeo Planet

Oslandia: Auxiliary Storage support in QGIS 3

OSGeo Planet - Thu, 2017-10-19 12:50

For those who know how powerful QGIS can be using data defined widgets and expressions almost anywhere in styling and labeling settings, it remains today quite complex to store custom data.

For instance, moving a simple label using the label toolbar is not straightforward: that wonderful toolbar remains desperately greyed out for manual labeling tweaks

…unless you do the following:

  • Set your vector layer editable (yes, it’s not possible with readonly data)
  • Add two columns in your data
  • Link the X property position to a column and the Y position to another

 

the Move Label map tool becomes available and ready to be used (while your layer is editable). Then, if you move a label, the underlying data is modified to store the position. But what happens if you want to fully use the Change Label map tool (color, size, style, and so on)?

 

Well… You just have to add a new column for each property you want to manage. No need to tell you that this is not very convenient, or even impossible when your data administrator has set your data to read-only mode…

A plugin made some years ago, named EasyCustomLabeling, addressed that issue. But it was full of caveats, like a dependency on another plugin (Memory layer saver) for persistence, or a full copy of the labeled layer into a memory layer, which led to losing synchronization with the source layer.

Two years ago, the French Agence de l’eau Adour Garonne (a water basin agency) and the Ministry in charge of Ecology asked Oslandia to think out QGIS Enhancement proposals to port that plugin into QGIS core, among a few other things like labeling connectors or curved labels enhancements.

Those QEPs were accepted and we could work on the real implementation, so here we are, Auxiliary storage has now landed in master!

How

The aim of auxiliary storage is to propose a more integrated solution to manage these data defined properties :

  • Easy to use (one click)
  • Transparent for the user (map tools always available by default when labeling is activated)
  • Do not update the underlying data (it should work even when the layer is not editable)
  • Keep in sync with the datasource (as much as possible)
  • Store this data along or inside the project file

As said above, thanks to the Auxiliary Storage mechanism, map tools like Move Label, Rotate Label or Change Label are available by default. Then, when the user selects the map tool to move a label and clicks on the map for the first time, a simple dialog asks them to select a primary key:

Primary key choice dialog – (YES, you NEED a primary key for any data management)

From that moment on, a hidden table is transparently created to store all data defined values (positions, rotations, …) and is joined to the original layer through the previously selected primary key. When you move a label, the corresponding property is automatically created in the auxiliary layer. This way, the original data is not modified; only the joined auxiliary layer is!

A new tab has been added in vector layer properties to manage the Auxiliary Storage mechanism. You can retrieve, clean up, export or create new properties from there :

Where is the auxiliary data really saved between projects?

We ended up using a light SQLite database which, by default, weighs just 8 KB! When you save your project with the usual .qgs extension, the SQLite database is saved at the same location but with a different extension: .qgd.

Two thoughts about that choice:

  • “Hey, I would like to store geometries, why not SpatiaLite instead?”

Good point. We actually tried that at first. But the SpatiaLite database initialization process using the QGIS SpatiaLite provider turned out to be really long. And a raw SpatiaLite database weighs about 4 MB, because of the huge spatial reference system table and the numerous spatial functions and metadata tables. We chose to fall back to plain SQLite through the OGR provider, and it proved fast and stable enough. If some day we manage to merge the SpatiaLite provider and the GDAL/OGR SpatiaLite provider, with options to create only the necessary SRS entries and functions, that would open new possibilities, like storing spatial auxiliary data.

  • “Does that mean that when you want to move/share a QGIS project, you have to manually manage these 2 files to keep them in the same location?!”

True, and dangerous, isn’t it? Users often forgot the auxiliary files with the EasyCustomLabeling plugin. Hence, we created a new format that zips several files together: .qgz. Using that format, the SQLite database project.qgd and the regular project.qgs file are embedded in a single project.qgz file. WIN!!
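Conceptually, the .qgz container is nothing more than a standard zip archive bundling the two files, as this small sketch shows (QGIS performs this packaging internally; the file contents here are stand-ins):

```python
import os
import tempfile
import zipfile

# Work in a temporary directory with stand-in files for the project and its auxiliary DB.
workdir = tempfile.mkdtemp()
qgs_path = os.path.join(workdir, "project.qgs")
qgd_path = os.path.join(workdir, "project.qgd")
with open(qgs_path, "w") as f:
    f.write("<qgis></qgis>")  # stand-in for the real XML project file
with open(qgd_path, "wb") as f:
    f.write(b"")              # stand-in for the auxiliary SQLite database

# A .qgz is just both files zipped together under one name.
qgz_path = os.path.join(workdir, "project.qgz")
with zipfile.ZipFile(qgz_path, "w", zipfile.ZIP_DEFLATED) as z:
    z.write(qgs_path, arcname="project.qgs")
    z.write(qgd_path, arcname="project.qgd")

print(sorted(zipfile.ZipFile(qgz_path).namelist()))  # ['project.qgd', 'project.qgs']
```

Because the container is plain zip, sharing a project is now a single-file operation.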

Changing the project file format so that it can embed data, fonts and SVG files was a long-standing feature request. So now we have a format available for self-hosted QGIS projects. Plugins like Offline Editing, QConsolidate and other similar ones that aim at making it easy to export a portable GIS database could take advantage of this new storage container.

Now, some work remains to add label connector capabilities and to allow users to draw labeling paths by hand. If you’re interested in making this happen, please contact us!

 

 

More information

A full video showing auxiliary storage capabilities:

 

QEP: https://github.com/qgis/QGIS-Enhancement-Proposals/issues/27

PR New Zip format: https://github.com/qgis/QGIS/pull/4845

PR Editable Joined layers: https://github.com/qgis/QGIS/pull/4913

PR Auxiliary Storage: https://github.com/qgis/QGIS/pull/5086

Categories: OSGeo Planet

gvSIG Team: GIS applied to Municipal Management: Module 5.3 ‘Web services (non-standard services)’

OSGeo Planet - Thu, 2017-10-19 10:49

The third video of the fifth module is now available. In it we cover how to work with web services that do not follow the OGC standards in gvSIG Desktop, but which can help us complement our maps with different layers.

Among the available services is OpenStreetMap, which gives us access to several layers: street maps, nautical charts, railway maps, and cartography in different color schemes that can serve as reference base layers for our map.

Other available services are Google Maps and Bing Maps, from which we can load different layers.

To load these layers (up to and including version 2.4), the view must be set to the EPSG:3857 reference system, the projection these services use.
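EPSG:3857 is the “Pseudo-Mercator” projection used by these tile services. As a rough illustration of what the reprojection does (a spherical approximation sketched here for clarity, not a substitute for gvSIG’s reprojection tools):

```python
import math

R = 6378137.0  # WGS84 sphere radius used by Web Mercator (EPSG:3857)

def to_web_mercator(lon_deg, lat_deg):
    """Spherical Pseudo-Mercator forward projection (approximate sketch)."""
    x = R * math.radians(lon_deg)
    y = R * math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
    return x, y

print(to_web_mercator(0.0, 0.0))    # ≈ (0, 0)
print(to_web_mercator(180.0, 0.0))  # x ≈ 20037508.34 m, the edge of the world extent
```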

In addition, to load Bing Maps layers we first need a key, which can be obtained as explained in the video.

Once they are loaded, we can reproject our own layers to that system. Moreover, many OGC web services, such as WMS, WFS and so on, offer their layers in that reference system, so we can overlay them on top.

The third video tutorial of this fifth module is the following:

Related posts:


Filed under: gvSIG Desktop, IDE, spanish, training Tagged: ayuntamientos, Bing Maps, gestión municipal, Google Maps, OpenStreetMap, OSM, Servicios web

gvSIG Team: gvSIG Batoví contest: award ceremony

OSGeo Planet - Wed, 2017-10-18 15:20

gvSIG Batovi

The contest Work projects with students and gvSIG Batoví has come to an end. This very gratifying and enriching first experience for Uruguay was quite a challenge from an organizational, planning and coordination point of view. But we can say, with modesty and simplicity but also with conviction, that it has been a complete success.

This contest sought to encourage the use of gvSIG Batoví in concrete projects. It was an initiative of the Ministry of Transport and Public Works (in particular the National Directorate of Topography), in coordination with the Secondary Education Council of the National Public Education Administration, ANEP-CES (in particular the National Inspection of Geography), and Centro Ceibal (in particular the Contents Area and the LabTeD Digital Technology Laboratories).

The participating groups (made up of students and teachers of Geography and other Secondary Education disciplines from the public education system across the country) were given follow-up support…

View original post 402 more words


Filed under: gvSIG Desktop

Jackie Ng: The journey of porting the MapGuide Maestro API to .net standard

OSGeo Planet - Wed, 2017-10-18 14:37
What prompted the push to port the MapGuide Maestro API to .net standard was Microsoft recently releasing a whole slate of developer goodies:
Of particular relevance to the subject of this post is .net standard 2.0.
For those who don't know, .net standard is (you guessed it) a versioned standard against which one can write portable, cross-platform class libraries that will work in any .net runtime environment supporting the version of .net standard you are targeting. If you do Android development, this is similar to API levels.
.net standard is of interest to me because the MapGuide Maestro API is currently a set of class libraries that target the full .net Framework. Having it target .net standard instead would give us guaranteed cross-platform portability across .net runtime environments that support .net standard (Mono), including platforms that were never possible before (.net Core/Xamarin/UWP).
I made an earlier attempt at porting the Maestro API to previous versions of .net standard, with mixed success:
  • The ObjectModels library was able to be ported to .net standard 1.6, but required installing many piecemeal System.* nuget packages to fill in the missing APIs.
  • Maestro API itself could not be ported due to its reliance on XML schema functionality and HttpWebRequest, which no version of .net standard before 2.0 supported.
  • Maestro API had upstream dependencies (eg. NetTopologySuite) that were not ported to .net standard.
  • More importantly, for the bits I was able to port across (ObjectModels), I couldn't run their respective (full-framework) unit test libraries from the VS test explorer due to cryptic assembly loading errors: the assembly manifests of the various piecemeal System.* assemblies did not match their assembly references. With no way to run these tests, the porting effort wasn't worth continuing.
Around this time, I heard of what the upcoming (at the time) .net standard 2.0 would bring to the table:
  • Over 2x the API surface of netstandard1.6, including key missing APIs needed by the Maestro API like the XML schema APIs and HttpWebRequest
  • A compatibility mode for the full .net Framework. If this worked as hoped, it meant we could skip waiting for upstream dependencies like NetTopologySuite and friends to ship netstandard-compatible ports and use the existing versions as-is.
Given the compelling points of .net standard 2.0 and the mixed results of porting to the then-current iteration of .net standard, I decided to put these porting efforts on ice until .net standard 2.0 and its supporting tooling came out.

Now that .net standard 2.0 and its supporting tooling are out, it was time to give this porting effort another try ... and I could not believe how much less painful the whole process was! This was basically all I had to do to port the following libraries to .net standard 2.0:

Preparation Work

To be able to use our MaestroAPI (ported to .net standard 2.0) in the (full-framework) Maestro windows application, we first needed to re-target all affected project files to .net Framework 4.6.1, as this is the minimum version of the full .net Framework that supports .net standard 2.0.

OSGeo.FDO.Expressions

This is a class library that uses the Irony grammar parser to parse FDO expression strings into an object-oriented form. Maestro uses this library to analyze FDO expressions for validation purposes (e.g. making sure an FDO expression does not reference a property that doesn't exist).

My process for converting the existing full-framework csproj file to .net standard was basically to replace the entire contents of the original csproj file with the minimum required content for a .net standard 2.0 class library:


<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
</Project>

That's right, a minimal .net standard 2.0 class library project file is just 5 lines of XML! All .cs files are now implicitly included when building the project, which greatly contributes to the simplicity of the new csproj format.

Now, obviously this project file won't compile as-is: we need to reference Irony, and use VS2017 to regenerate the resx string bundles and source-link the shared assembly info files. After those changes were made, the project built, the only notable warning being NU1701, which the new tooling emits when a full-framework library/package is referenced from a netstandard2.0 class library (something the tooling allows for compatibility purposes).

It was around this time that I discovered that someone had made a netstandard-compatible port of Irony, so we replaced the existing Irony reference with that port. This library was now fully ported across.

ObjectModels

This is the class library that describes all of our XML resources in MapGuide as strongly-typed classes with full XML (de)serialization support to and from both forms at various schema versions.

The original porting attempt targeted netstandard1.6. While it was mostly painless, I had to reference tons of piecemeal System.* nuget packages, which then flowed down to anything referencing the library.

For this attempt, we targeted .net standard 2.0 using the same technique of pasting a minimal netstandard2.0 class library template into the existing csproj file. Like the previous attempt, building this project failed due to dependencies on System.Drawing, a result of our usage of System.Drawing.Font. Further analysis showed that we were using Font as a glorified DTO, so it was just a case of adding a new type carrying the same properties we had been capturing with the System.Drawing.Font objects being passed around.
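The "glorified DTO" refactor is the classic move of replacing a heavyweight framework type with a plain property carrier. A minimal sketch of the idea (in Python for brevity; the actual Maestro code is C#, and these property names are assumptions, not the real ones):

```python
from dataclasses import dataclass

# A plain data carrier standing in for a heavyweight framework type
# (like System.Drawing.Font) when all the code ever did was read these
# properties and pass them along.
@dataclass(frozen=True)
class FontInfo:
    family: str
    size_pt: float
    bold: bool = False
    italic: bool = False

label_font = FontInfo(family="Arial", size_pt=10.0, bold=True)
print(label_font.family, label_font.size_pt)  # Arial 10.0
```

The payoff is that the library no longer drags in a platform-specific drawing dependency just to sling a few values around.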

Because the NETStandard.Library metapackage is referenced by default, this attempt did not require the piecemeal System.* nuget packages of the previous attempt. So that's another library ported across.

MaestroAPI

Now for the main event. Maestro API needed to be netstandard-compatible, otherwise this whole porting effort would be a waste. The previous attempt (targeting netstandard1.6) was cut short because APIs such as XML schema support were missing. In .net standard 2.0 these APIs are back, so porting MaestroAPI across should be a much simpler affair.

And indeed it was.

Just like the ObjectModels porting effort, we hit some snags around references to System.Drawing. Unlike ObjectModels, we were using full-blown Images and Bitmaps from System.Drawing, not just Fonts for slinging font information around.

To address this problem, a new full-framework library (OSGeo.MapGuide.MaestroAPI.FxBridge) was introduced, and the classes using these incompatible types were relocated there. There were also service interfaces that returned System.Drawing.Image objects (IMappingService). These APIs have been modified to return raw System.IO.Stream objects instead, with the FxBridge library providing extension methods to "polyfill" the old image-returning APIs. Thus, code that used the affected APIs can simply reference the FxBridge library in addition to MaestroAPI and work as before.

After sectioning off these incompatible types into the FxBridge library, the next potential roadblock in our porting effort was our upstream dependencies. In particular, we were using NetTopologySuite, GeoAPI and Proj.NET to give the Maestro API a strongly-typed geometry model and some basic coordinate system transformation capabilities. These were all full-framework packages, which is what stopped the previous porting attempt (targeting netstandard1.6) in its tracks.

Because netstandard2.0 has a full-framework compatibility shim, we were able to reference these existing packages, with NuGet spitting out the standard NU1701 compatibility warnings. However, since the previous porting attempt, the authors of NetTopologySuite, GeoAPI and Proj.NET have released netstandard-compatible (albeit prerelease) versions of their respective libraries, so we were able to fully netstandard-ify all our dependencies as well.

However, we had to turn off strong naming of our assembly in the process because our upstream dependencies did not offer strong-named netstandard assemblies.

And with that, the Maestro API was ported to .net standard 2.0.

MaestroAPI HTTP Provider

However, the Maestro API would not be useful without a functional HTTP provider to communicate with the mapagent, so this library also needed to be netstandard-compatible.

The previous porting attempt (to netstandard1.6) was roadblocked because the HTTP provider uses HttpWebRequest to communicate with the mapagent. While we could have just replaced HttpWebRequest with the newer HttpClient, that would have required a full async/await-ification of the whole code base, and then dealing properly with the leaky abstractions known as SynchronizationContext and ConfigureAwait to ensure our async/await-ified HTTP provider is usable in both ASP.net and desktop windows application contexts without deadlocking in one or the other.

While having a fully async HTTP provider is a worthy goal, I wanted a functional one first before undertaking the task of async/await-ifying it. The development effort involved was such that it was better to wait for .net standard 2.0 (where HttpWebRequest is supported) than to modify the HTTP provider to use HttpClient.

And just like the porting of the ObjectModels/MaestroAPI projects, this was a case of taking the existing csproj file, replacing its contents with the minimal netstandard class library template, and manually adding the required references and settings until the project built again.

Caught in a snag

All the key parts of the Maestro API had now been ported across to .net standard 2.0 and the code all built, so it was time to run our unit tests to make sure everything was still green.

All green they were indeed. All good! Now to run the thing.

Most things seemed to work until I validated a Map Definition and got this message.



Assembly manifest what? I had no idea! This error was also thrown whenever I used any part of the MaestroAPI that touches NetTopologySuite -> GeoAPI.

My first port of call was to look at this known issue and try all the workarounds listed:
  • Force all our projects to use PackageReference mode for installing/restoring nuget packages
  • Enable automatic binding redirect generation on all executable projects
After trying these workarounds, the assembly manifest errors still persisted. At this point I was stuck, and on the verge of giving up on this porting effort, until some part of my brain told me to take a look at the assemblies in the output directory.
Since the error in question referred to GeoAPI.dll, I thought I'd crack that assembly open in ILSpy and see what interesting information I could find about it.


Well, this was certainly most interesting! Why was a full-framework GeoAPI.dll being copied to the output? The only direct consumer of GeoAPI (OSGeo.MapGuide.MaestroAPI.dll) is netstandard2.0, and it references the netstandard target of GeoAPI.

Here's a diagram of what I was expecting to see:



After digging around some more, it appears there is a bug (or is it a feature?) in MSBuild: given a nuget package that offers both netstandard and full-framework targets, it will prefer the full-framework target over the netstandard one. In the case of GeoAPI, because our root application is a full-framework one, MSBuild chose the full-framework target offered by GeoAPI instead of the netstandard one.
So what's the assembly manifest error all about? The FusionLog property of the exception reveals the answer.


GeoAPI is strong-named for the full framework; GeoAPI is not strong-named for netstandard. The assembly manifest error arises because our netstandard-targeting MaestroAPI references the netstandard target of GeoAPI (not strong-named), but because our root application is a full-framework one, MSBuild gave us the full-framework GeoAPI assembly instead. At runtime, .net could not reconcile the fact that a strong-named GeoAPI was being loaded while our netstandard-targeting MaestroAPI referenced the netstandard GeoAPI, which is not strong-named. Hence the assembly manifest error.
Multi-targeting for the ... win?

Okay, so now that we know why it's happening, what can we do about it? Well, the other major thing that the new MSBuild and csproj file format give us is the ability to easily multi-target a project for different frameworks and runtimes.

By changing the TargetFramework element in our project to TargetFrameworks (plural) and specifying a semicolon-delimited list of TFMs, we get a class library that builds for each of the TFMs specified.

For example, a netstandard 2.0 class library like this:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
</Project>

Can be made to multi-target like this:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFrameworks>netstandard2.0;net461</TargetFrameworks>
  </PropertyGroup>
</Project>

If MSBuild insists on giving us full-framework dependencies when given the choice between full-framework and netstandard (and both are compatible), then the solution is to multi-target the MaestroAPI class library so that we offer two flavors of the assembly:
  • A full-framework one (net461) that will be selected by MSBuild if the consuming application is a full-framework one.
  • The netstandard one (netstandard2.0) that will be selected by MSBuild if the consuming application is .net Core, Xamarin, etc.
Under this setup, MSBuild will choose the full-framework Maestro API over the netstandard one when building the Maestro windows application. Since we're now building for multiple frameworks/runtimes and explicitly targeting the full framework again, we can re-activate strong naming on the full-framework (net461) target, ensuring the full-framework dependency chain of MaestroAPI is fully strong-named (as it was before we started this porting effort). The assembly manifest error goes away when running the unit tests, and in the Maestro application itself whenever we hit functionality that uses GeoAPI/NetTopologySuite.

So the problem is effectively solved, but the whole process feels somewhat anti-climactic.

I mean ... the whole premise of .net standard and why I wanted to port MaestroAPI to target it was the promise of one unified target (an interface if you will) with many supporting runtimes (ie. various implementations of this interface). Target the standard and your class library will work across the supporting runtimes, in theory.

Unfortunately, in practice, strong naming (and MSBuild choosing full-framework targets over netstandard ones, even when both are compatible) was the leaky abstraction that threw a monkey wrench into this whole concept, especially when some targets are strong-named and some are not. Having to multi-target the Maestro API as a workaround feels unnecessary.

But at the end of the day, we still achieved our goal of a netstandard-compatible Maestro API that can be used in .net Core, Xamarin, etc. We just had to take a very long detour to get from A to B, and all I could think was: was this (multi-targeting) absolutely necessary?

Some Changes and Compromises

Although we now have .net standard and full-framework compatible versions of the Maestro API, we had to make some changes and compromises around the developer and acquisition experience for this to work in a cross-platform .net world.

1. For the reasons previously stated, we had to disable strong naming of the Maestro API for the .net standard target. This is forced upon us by our upstream dependencies (the netstandard flavors of GeoAPI and NetTopologySuite), which we can't do anything about. The full-framework target, however, is still strong-named as before.

2. The SDK package in its current form will most likely go away. Turning the Maestro API into a .net standard library forces us to use nuget packages as the main delivery mechanism, which is a good thing: nobody should be manually referencing assemblies for consuming libraries in this day and age, and the tooling is now so brain-dead simple that we have no excuse not to make nuget packages. No SDK package also means we can look at alternative means of generating API documentation (docfx looks like a winner) instead of Sandcastle, as making CHM files is kind of pointless and the only reason I made them was to bundle them with the SDK package.

The sample code and supporting tools that were previously part of the SDK package will be offloaded to a separate GitHub repository that I'll announce in due course. I'll also need to re-think the main ASP.net code sample, because the old example required:

  • Manually setting up a web application in local IIS (not IIS Express)
  • Manually referencing a whole bunch of assemblies
  • Needing to run Visual Studio as administrator to debug the code sample due to the local IIS constraint.

These are things that should not be done in 2017!

3. Because nuget packages are now the strongly preferred way of consuming libraries, having the HTTP provider as a separate library just complicates things (it has to be registered in ConnectionProviders.xml, which would need automating when installing its theoretical nuget package). The Maestro API on its own is pretty useless without the HTTP provider anyway, so in the interest of killing two birds with one stone, the HTTP provider has been integrated into the Maestro API assembly itself. This means you don't even need ConnectionProviders.xml unless you use the local connection provider (an mg-desktop wrapper) or a local-native connection provider (a roll-your-own wrapper around the official MapGuide API).

4. The CI machinery needed some adjustments. I couldn't get OpenCover to work against our newly ported netstandard libraries using dotnet test as the runner, so I had to temporarily disable the OpenCover instrumentation while the unit tests ran in AppVeyor. As a result of needing to multi-target MaestroAPI (for the reasons already stated), I decided on this CI matrix:

  • Use AppVeyor to run the Maestro API unit tests for the full-framework target on Windows. Because we're running the tests under a full-framework runner, the OpenCover instrumentation can be restored, allowing us to upload code coverage reports again to coveralls.io
  • Use TravisCI to run the Maestro API unit tests for the netstandard target under .net Core 2.0 on Linux. The whole motivation for netstandard-izing MaestroAPI was to get it running on these non-windows .net platforms, so let TravisCI handle and verify/validate that aspect for us. We have no code coverage stats here, but surely they can't be radically different from the coverage stats we'd get running the same test suite on Windows with OpenCover instrumentation.
Where to from here?
Now that the porting efforts have been completed, the next milestone release should follow shortly. 
This milestone will probably only concern the application itself as the SDK story needs revising and I don't want that to hold up on a new release of Maestro (the application).