OSGeo Planet

gvSIG Team: Prémio Just Side. Conditions

OSGeo Planet - Tue, 2018-04-17 05:26

Unpublished academic works (not previously published) that are the original work of one or more authors will be accepted, even if they have already been submitted for formal assessment in an academic or other context (e.g. to funding bodies). The works may use concepts or methodologies from a single scientific area or from several, or address several of the scientific approaches underlying the JustSide Network, and take an eminently interdisciplinary stance that reflects the theme of territorial justice.

Conditions in Portuguese: PDF, DOC

Conditions in Spanish: PDF, DOC

More information: https://www.rtp.pt/play/p2063/e340706/ponto-de-partida

Categories: OSGeo Planet

Free and Open Source GIS Ramblings: Movement data in GIS #12: Why you should be using PostGIS trajectories

OSGeo Planet - Mon, 2018-04-16 20:30

In short: both writing and executing trajectory queries is considerably faster with PostGIS trajectories (as LinestringM) than with the commonly used point-based approach.

Here are a couple of examples to give you an impression of the differences.

Spoiler alert! Trajectory queries are up to 500 times faster than comparable point-based queries.

Length

First, let’s see how to determine trajectory length for all observed moving objects (identified by a tracker id).

Using the point-based approach, we first need to ensure that the points are in the correct temporal order, create the lines, and finally sum up their length:

WITH ordered AS (
  SELECT trajectory_id, tracker, t, pt
  FROM geolife.trajectory_pt
  ORDER BY t
), tmp AS (
  SELECT trajectory_id, tracker, st_makeline(pt) traj
  FROM ordered
  GROUP BY trajectory_id, tracker
)
SELECT tracker, round(sum(ST_Length(traj::geography)))
FROM tmp
GROUP BY tracker
ORDER BY tracker

With trajectories, we can go right to computing lengths:

SELECT tracker, round(sum(ST_Length(track::geography)))
FROM geolife.trajectory_ext
GROUP BY tracker
ORDER BY tracker

On my test system, the trajectory query run time is 22.7 sec instead of 43.0 sec for the point-based approach.
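The post does not show how the trajectory table itself was built. As a rough sketch, a LinestringM table like geolife.trajectory_ext could be derived from the point table along these lines (the column names are assumed from the queries above, and the id, track and time_range columns are inferred from how they are queried later):

CREATE TABLE geolife.trajectory_ext AS
SELECT trajectory_id AS id,
       tracker,
       -- LinestringM: the M coordinate stores each point's timestamp as epoch seconds
       ST_SetSRID(
         ST_MakeLine(ST_MakePointM(ST_X(pt), ST_Y(pt), extract(epoch FROM t)) ORDER BY t),
         4326) AS track,
       -- observation period of the trajectory, used by the temporal filter queries
       tstzrange(min(t), max(t), '[]') AS time_range
FROM geolife.trajectory_pt
GROUP BY trajectory_id, tracker;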

Duration

Compared to trajectory length, duration is less complicated in the point-based approach:

WITH tmp AS (
  SELECT trajectory_id, tracker, min(t) start_time, max(t) end_time
  FROM geolife.trajectory_pt
  GROUP BY trajectory_id, tracker
)
SELECT tracker, sum(end_time - start_time)
FROM tmp
GROUP BY tracker
ORDER BY tracker

Still, the trajectory query is less complex and much faster at 31 ms instead of 6.0 sec:

SELECT tracker, sum(upper(time_range) - lower(time_range))
FROM geolife.trajectory_ext
GROUP BY tracker
ORDER BY tracker

A quick look at indexing

In both cases, we have indexed the tracker id, geometry, and time columns to speed up query processing.

The trajectory table has 3 indexes

  • gist (time_range)
  • gist (track gist_geometry_ops_nd)
  • btree (tracker)

The point-based table has 4 indexes

  • gist (pt)
  • btree (trajectory_id)
  • btree (tracker)
  • btree (t)
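The post does not include the index DDL; assuming the table and column names used in the queries above, the listed indexes could be created roughly like this:

-- trajectory table
CREATE INDEX ON geolife.trajectory_ext USING gist (time_range);
CREATE INDEX ON geolife.trajectory_ext USING gist (track gist_geometry_ops_nd);
CREATE INDEX ON geolife.trajectory_ext USING btree (tracker);

-- point-based table
CREATE INDEX ON geolife.trajectory_pt USING gist (pt);
CREATE INDEX ON geolife.trajectory_pt USING btree (trajectory_id);
CREATE INDEX ON geolife.trajectory_pt USING btree (tracker);
CREATE INDEX ON geolife.trajectory_pt USING btree (t);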

Temporal filter

Extracting trajectories that occurred during a certain time frame is another common use case:

WITH tmp AS (
  SELECT trajectory_id, tracker, min(t) start_time, max(t) end_time
  FROM geolife.trajectory_pt
  GROUP BY trajectory_id, tracker
)
SELECT trajectory_id, tracker, start_time, end_time
FROM tmp
WHERE end_time > '2008-11-26 11:00'
  AND start_time < '2008-11-26 15:00'
ORDER BY tracker

This point-based query takes 6.0 sec while the shorter trajectory query finishes in 12 ms:

SELECT id, tracker, time_range
FROM geolife.trajectory_ext
WHERE time_range && '[2008-11-26 11:00+01,2008-11-26 15:00+01]'::tstzrange

or equally fast (12 ms) by making use of the n-dimensional index:

WHERE track &&& ST_Collect(
ST_MakePointM(-180, -90, extract(epoch from '2008-11-26 11:00'::timestamptz)),
ST_MakePointM(180, 90, extract(epoch from '2008-11-26 15:00'::timestamptz))
)
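Spelled out as a complete statement (my own assembly of the snippet above against the same table, with ST_SetSRID added so the filter geometry carries the same SRID as the track column), this might look like:

SELECT id, tracker, time_range
FROM geolife.trajectory_ext
WHERE track &&& ST_SetSRID(ST_Collect(
        -- 3D box covering the whole world in x/y and the wanted period in M (epoch seconds)
        ST_MakePointM(-180, -90, extract(epoch from '2008-11-26 11:00'::timestamptz)),
        ST_MakePointM( 180,  90, extract(epoch from '2008-11-26 15:00'::timestamptz))
      ), 4326)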

Spatial filter

Finally, of course, let’s have a look at spatial filters, for example, trajectories that start in a certain area:

WITH my AS (
  SELECT ST_Buffer(ST_SetSRID(ST_MakePoint(116.31894, 39.97472), 4326), 0.0005) areaA
), tmp AS (
  SELECT trajectory_id, tracker, min(t) t
  FROM geolife.trajectory_pt
  GROUP BY trajectory_id, tracker
)
SELECT DISTINCT traj.tracker, traj.trajectory_id
FROM tmp
JOIN geolife.trajectory_pt traj
  ON tmp.trajectory_id = traj.trajectory_id AND traj.t = tmp.t
JOIN my ON ST_Within(traj.pt, my.areaA)

This point-based query takes 6.0 sec while the shorter trajectory query finishes in 488 ms:

WITH my AS (
  SELECT ST_Buffer(ST_SetSRID(ST_MakePoint(116.31894, 39.97472), 4326), 0.0005) areaA
)
SELECT id, tracker, ST_AsText(track)
FROM geolife.trajectory_ext
JOIN my ON areaA && track
  AND ST_Within(ST_StartPoint(track), areaA)

For more generic “does this trajectory intersect another geometry” questions, the points can also be aggregated to a linestring on the fly, but that takes 21.9 sec.
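The on-the-fly query behind that 21.9 sec figure is not shown in the post; a sketch of what it could look like, assuming the same point table and the buffer from the previous example, is:

WITH my AS (
  SELECT ST_Buffer(ST_SetSRID(ST_MakePoint(116.31894, 39.97472), 4326), 0.0005) areaA
), lines AS (
  -- aggregate the ordered points of each trajectory into a linestring on the fly
  SELECT trajectory_id, tracker, ST_MakeLine(pt ORDER BY t) AS traj
  FROM geolife.trajectory_pt
  GROUP BY trajectory_id, tracker
)
SELECT lines.tracker, lines.trajectory_id
FROM lines
JOIN my ON ST_Intersects(lines.traj, my.areaA)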

I’ll be presenting more work on PostGIS trajectories at GI_Forum in Salzburg in July. In the talk, I’ll also have a look at the custom PG-Trajectory datatype.


Categories: OSGeo Planet

gvSIG Team: Copying geometries between layers in gvSIG Desktop

OSGeo Planet - Mon, 2018-04-16 15:29

A feature that has been available for several versions of gvSIG Desktop, and which is very useful for certain projects, is copying geometries from one layer to another.

For example, a common case is having a layer with a certain type of street furniture in a municipality (such as lampposts), where every so often we receive from another department or an external company a layer containing only the newly created elements. For instance, we may receive a layer containing only the new lampposts added to the city in a recently developed area.

To join these geometries to our original layer, which contains all the elements, we have several options:

  • Use the Merge geoprocess, which produces a third layer containing only the fields of one of the two original layers. This can be inconvenient because each layer received with new elements would generate yet another layer, so we can end up with a large number of layers on our disk.
  • Use the Union geoprocess, which produces a third layer with the fields of the first layer plus the fields of the second one (if they are different). This can also be inconvenient because we can end up with a large number of layers on our disk.
  • Use the Copy geometries tool to copy the geometries from the received layer into our original layer, which grows with each layer received. This way we keep only one layer with all the information of our municipality, which is easier to manage.

To use the Copy geometries tool we have to activate the layer from which we want to copy the geometries, and select them with any of the selection tools. Then we press the “Copy selected features to clipboard” button (also available in the Layer -> Modify menu) and indicate whether we want to include the alphanumeric fields.

The next step is to start editing the destination layer and select the “Paste features” tool (also available in the Layer -> Modify menu).

The application will tell us how many features have been copied and how many have failed. After that we finish editing and we have the original layer with the new geometries.

In case you have any doubts, here is also a video about the tool:

Categories: OSGeo Planet

gvSIG Team: Copying features between layers in gvSIG Desktop

OSGeo Planet - Mon, 2018-04-16 15:26

For some time now gvSIG Desktop has included a tool, very useful for certain projects, that allows us to copy features from one layer to another.

For example, a common case is having a layer with a certain type of street furniture in a municipality (such as lampposts), where from time to time we receive from another department or an external company a layer containing only the newly created elements. For instance, we may receive a layer containing only the new lampposts added to the city in a recently developed area.

If we want to join these geometries to our original layer, which contains all the elements, we have several options:

  • We can use the Merge geoprocess, which produces a third layer containing only the fields of one of the two original layers. This can be inconvenient because each layer received with new elements would generate yet another layer, so we can end up with a large number of layers on our disk.
  • We can use the Union geoprocess, which produces a third layer with the fields of both layers (if they are different). This can also be inconvenient because we can end up with a large number of layers on our disk.
  • We can use the “Copy features” tool to copy the features of the layer we have received into our original layer, which grows with each new layer received. This way we keep only one layer with all the information of our municipality, which is easier to manage.

To use the Copy features tool we have to activate the layer from which we want to copy the features, and select them with any of the selection tools. Then we press the “Copy selected features to clipboard” button (also available in the Layer -> Modify menu) and indicate whether we want to include the alphanumeric fields.

The next step is to start editing mode on the destination layer and select the “Paste features” tool (also available in the Layer -> Modify menu).

It will indicate how many features have been copied and how many have failed. After that we finish editing and we have the original layer with the new features.

If you have any doubts, here is a video about the tool:

 

Categories: OSGeo Planet

gvSIG Team: Scripting in gvSIG: Git integration

OSGeo Planet - Mon, 2018-04-16 10:01

Since gvSIG 2.4, the development tools in the Scripting Module have been extended. One of these tools is an integration with Git. This tool helps us keep our plugins under version control.

Within the gvSIG Association, we use this integration to keep our plugins updated and published; you can check them on GitHub under gvSIGAssociation.

For those who don’t know GitHub, it is a platform based on Git for keeping software projects under version control. It allows us to have an online repository with our code and keep it up to date. If you are interested in development and in sharing your project, GitHub is a good place to start; we encourage you to look for more information online.

This Git integration is very easy to use from gvSIG. Here is a step-by-step mini-guide.

  • The functionalities of Git/GitHub are many and very extensive; this is just an example of cloning a repository that is already online and updating it using gvSIG. You have to use a repository that you own to follow this guide.

We are going to start with an already initialized project, like this one on GitHub: gvsig-desktop-scripting-ConvertFieldToDate

Other developers may have their own repositories, public or private. If you are new to GitHub or Git, there is a lot of information online.

First step: get the link to the repository. On the project web page, press the green “Clone or download” button

and click “Copy to clipboard”.

The next step takes place inside gvSIG. Open the Scripting Composer and create a new empty folder. In this case we are going to create it inside “addons” with the same name as the project: “ConvertFieldToDate”.

In the menu, we will look for the Git module.

To start, we are going to clone the files that are in the cloud so we can work with them. Select “Clone” and paste the repository link from the previous step. Make sure the project folder is selected in the project tree.

After running it:

If we check the project folder, we will see that it is still empty, but we have already created a “link” between this folder and Git.

  • Check again that your project folder is selected

Most of the options appear under Git -> Show Changes.

A new tab will be opened at the bottom. It shows the differences between the local repository and the online repository, as well as a set of icons with other options.

We want to make the local repository match the online repository, so we will click “Update all” and then press “Yes”. Be careful: if this folder already contains a file that differs from the online repository, it will be deleted.

Once this operation has finished, refresh the project tree:

We will see that there are now files inside the folder.

In this example, we are going to fix some code by deleting some imports that we no longer need. Change the lines in “convertFieldToDate” and save the script.

Once this is done, press the refresh button at the bottom.

The changes will be shown in the list.

If we select one of these changes and press the “Diff selected with head” button, we can see the differences between the local file and the file in the repository. In this case, we have deleted two lines (in red).

To update the online repository, we first have to commit the changes. We can select which files to commit, or commit all of them.

Write a description of the bug fixed:

Once the commit is done, look for the Push button to upload the code to GitHub.

A username and password will be requested.

Once it’s finished, it will show “OK”.

On the website we can check that the commit has been made.

 

And we can also check the changes:

This is a brief guide to the Git integration in gvSIG.

If you have any questions, ask on the mailing lists.

 

Categories: OSGeo Planet

gvSIG Team: gvSIG applied to the Environment. Topic 5: Raster.

OSGeo Planet - Mon, 2018-04-16 10:00

Topic 5 of the free course “gvSIG applied to the Environment” is now available. We start working with raster information and managing this type of data in order to incorporate it into our projects.


Learning to work with raster information will allow us to take a qualitative leap in our use of gvSIG, since from here on we will be able to work with satellite images, LiDAR data and orthophotos.

http://www.geoalternativa.com/gvsig-training/course/view.php?id=3&section=7

As always, if you have any doubts, we encourage you to write to the gvSIG users mailing list: http://osgeo-org.1560.x6.nabble.com/gvSIG-usuarios-f4118557.html

Categories: OSGeo Planet

Marco Bernasocchi: Marco becomes QGIS.org Co-chair

OSGeo Planet - Mon, 2018-04-16 06:09
We are very proud to announce that one of our founders and directors Marco Bernasocchi was elected as QGIS.org project steering committee (PSC) co-chair. With over 10 years of involvement with QGIS (he started working with QGIS 0.6) Marco will… See more ›
Categories: OSGeo Planet

Tom Kralidis: 20 years later – first website

OSGeo Planet - Mon, 2018-04-16 00:03
20 years ago I was living in Ottawa, in GIS school, and started working with Natural Resources Canada.  Fast forward to a few weeks back: scanning through old CD-ROMs and, lo and behold, there was my first ever website.  I sat back for a few minutes remembering the details: made with Microsoft FrontPage followed by HotDog […]
Categories: OSGeo Planet

GRASS GIS: Recap from the OSGeo Code Sprint in Bonn 2018

OSGeo Planet - Sun, 2018-04-15 21:01
The GRASS GIS team members met at the BaseCamp in Bonn for the OSGeo code sprint, edition 2018
Categories: OSGeo Planet

QGIS Blog: QGIS Grants #3: Call for Grant Proposals 2018

OSGeo Planet - Sun, 2018-04-15 12:16

Dear QGIS Community

Our first two rounds of Grant Proposals were a great success. If you are an early adopter using QGIS 3.0, you can already try out some of the new capabilities that have arrived in QGIS thanks to these grants.

We are very pleased to announce the third round of grants is now available to QGIS contributors. The deadline for this round is Sunday, 13 May 2018. All the details for the grants are described in the application form, and for more context we encourage you to also read these articles:

We look forward to seeing all your great ideas about how to improve QGIS!

Anita Graser

QGIS PSC

Categories: OSGeo Planet

QGIS Blog: QGIS Annual General Meeting – 2018

OSGeo Planet - Fri, 2018-04-13 13:26

Dear QGIS Community

 

We recently held our 2018 QGIS Annual General Meeting. The minutes of this meeting are available for all to view. As I have previously announced, I have decided to step down as chair of the PSC this year, so this post is my last official act as QGIS Chair. Thank you all for the kind words and deeds of support you gave me during my time as project chair.

I would like to welcome our new QGIS Board Chair: Paolo Cavallini, and our new QGIS Board Vice-Chair and QGIS PSC Member, Marco Bernasocchi. In case you are not familiar with Paolo and Marco, you can find short introductions to them below. I am pleased also to say that the project governance is in good hands with Richard Duivenvoorde, Jürgen Fischer, Andreas Neumann and Anita Graser kindly making themselves available to serve on the PSC for another two years. It is also great to know that our project founder, Gary Sherman, continues to serve on the PSC as honorary PSC member. Gary set the standard for our great project culture and it is great to have his continued presence.

QGIS has been growing from strength to strength, backed by a really amazing community of kind and collaborative users, developers, contributors and funders. I am looking forward to seeing how it continues to grow and flourish and I am excited and confident it will do so with Paolo acting as the project chair and representative. Rock on QGIS!

 

Paolo Cavallini


I got involved in QGIS long ago, first as a user, then more and more deeply in various activities, initiating and supporting various plugins and core functions (e.g. GDAL Tools, DB Manager), opening and managing bugs, taking care of GRASS modules, handling the trademark registration, etc. I acted as Finance and Marketing Advisor for several years. Currently, I manage the plugin approval process. Motivation: It’s such a pleasure building up, in a truly cooperative and democratic way, together with truly intelligent people, a tool that enables people to freely do their job or pursue their interests, that I cannot resist helping as much as I can.

Marco Bernasocchi (http://berna.io @mbernasocchi)


I am an open source advocate, consultant, teacher and developer. My background is in geography with a specialization in geographic information science. I live in Switzerland in a small Romansh-speaking mountain village, where I love scrambling around the mountains to enjoy the feeling of freedom it gives me. I am a very communicative person, I fluently speak Italian, German, French, English and Spanish, and I love travelling. I work as director of OPENGIS.ch, which I founded in 2011; since 2015 I share the company ownership with Matthias Kuhn. At OPENGIS.ch LLC we (4 superstar devs and myself) develop, train and consult our clients on any aspect related to QGIS.

My first QGIS ever (to be correct, at that time QuantumGIS) was “Simon (0.6)” during my BSc, when the University of Zurich was teaching us proprietary products and I started looking around for open source alternatives. In 2008, when starting my MSc, I made the definitive switch to Ubuntu, started working more and more with QGIS Metis (0.11) and ended up developing some plugins and part of Globe as my Master’s thesis. For the past three years the University of Zurich has invited me to hold two seminars on Entrepreneurship and Open Source. In November 2011 I attended my first Hackfest in Zürich, where I started porting all QGIS dependencies and developing QGIS for Android under a Google Summer of Code. A couple of years and a lot of work later, QField was born. Since then I have always tried to attend at least one Hackfest per year to be able to feel first hand the strong bonds within our very welcoming community. In 2013 I was lucky enough to have a release named after a suggestion of mine: I saved you all from having QGIS 2.0 – Hönggerberg and gave you instead QGIS 2.0 – Dufour. Besides my long story with QGIS as a user and passionate advocate, I have a long story as a QGIS service provider, where we are fully committed to its stability, feature richness and sustainable development. Furthermore, as a World Bank consultant, I am lucky enough to be sent now and then to spread the QGIS goodness in less fortunate countries.

Motivation: one of my main motivations to be part of the PSC is to help QGIS keep this incredible growth rate by being even more attractive to new community members, sponsors and large/corporate users. To achieve this, the key is maintaining the right balance between sustainable processes (which guarantee the great quality QGIS has been known for) and an interesting and motivating grassroots project where community members can bloom and enjoy contributing in their most creative ways.

 

Regards


Tim Sutton (outgoing Chair)

Categories: OSGeo Planet

Marco Bernasocchi: Porting QGIS plugins to API v3 – Strategy and tools

OSGeo Planet - Fri, 2018-04-13 09:20
The Release of QGIS 3.0 was a great success and with the first LTR (3.4) scheduled for release this fall, it is now the perfect time to port your plugins to the new API. QGIS 3.0 is the first major… See more ›
Categories: OSGeo Planet

From GIS to Remote Sensing: Announcing the release of the new Semi-Automatic Classification Plugin

OSGeo Planet - Fri, 2018-04-13 08:46
I am very pleased to announce the release date of the new Semi-Automatic Classification Plugin (SCP) version 6 (codename Greenbelt).
This new SCP version, which is compatible with QGIS 3 only, will be released on 22 January 2018.
The Semi-Automatic Classification Plugin (SCP) version 6 has the codename Greenbelt (Maryland, USA), which is the location of NASA’s Goddard Space Flight Center, the centre that had a key role in Landsat satellite development and will lead the development of the Landsat 9 space and flight segments (Landsat 9 is to be launched in 2020).

Main interface

Read more »
Categories: OSGeo Planet

From GIS to Remote Sensing: Basic tutorial 1: Land Cover Classification of Landsat Images

OSGeo Planet - Fri, 2018-04-13 08:43
This is a basic tutorial about the use of the new Semi-Automatic Classification Plugin version 6 for QGIS for the classification of a multispectral image. It is recommended to read the Brief Introduction to Remote Sensing before this tutorial. The purpose of the classification is to identify the following land cover classes:
  1. Water;
  2. Built-up;
  3. Vegetation;
  4. Bare soil.
The study area of this tutorial is Greenbelt (Maryland, USA) which is the site of NASA’s Goddard Space Flight Center (the institution that will lead the development of the future Landsat 9 flight segment).


Read more »
Categories: OSGeo Planet

gvSIG Team: Scripting in gvSIG: Git integration

OSGeo Planet - Fri, 2018-04-13 08:41

Since gvSIG 2.4, we have increased the number of tools that make it easier to develop extensions from the Scripting Editor. One of these new tools is the integration of Git with the Editor. This integration makes it easier to keep version control of the extensions we are developing.

At the gvSIG Association we mainly use this integration for the development of the extensions that we have published under gvSIGAssociation on GitHub.

For those who do not know it, GitHub is a platform based on Git that is used to keep track of software development. It allows us to have repositories free of charge (as long as they are public) and to store our code, as well as keeping a history of the different versions we develop. So if you are interested in sharing a project and storing it safely in the cloud, GitHub can be a good place to start.

Using it from gvSIG is simple. Let’s go through an example step by step.

  • Git/GitHub offers many extensive features; in this example we are only going to see how to modify, from gvSIG, a repository that belongs to us.

To begin with, we have a repository already initialized on GitHub, in this example the gvsig-desktop-scripting-ConvertFieldToDate project. Other developers may have these repositories in their own profiles, associated with their companies, private, etc. If you are new to GitHub or Git, there are many guides on the Internet that can help you get started.

The first step is to go to the project web page on GitHub and copy the link to the repository where it says “Clone or Download”:

And click “Copy to clipboard”.

The next step is to go to the Scripting Editor in gvSIG and create a folder in our directory. In this case we are going to create it inside “Addons” with the same name as the project, “ConvertFieldToDate”.

Next, we select the folder in our project tree and look for the Git module.

To start, we are going to clone the files that we already have on GitHub in order to work on our computer. We select the “Clone” option and enter the link copied from the website earlier. Always take care that the correct project folder is selected.

And after it runs:

If we go to the project folder we can see that it is still empty, but we have already created a link between this folder and its counterpart on GitHub.

Now let’s open the panel that allows us to manage most of the Git options. With the folder still selected, we go to Git -> Show Changes.

A new tab appears at the bottom showing the differences between our local repository and our remote repository on GitHub, together with a menu with many other options.

To begin with, we want to bring our files into the folder, which is currently empty. To do this we click the “Update all” button and answer “Yes” to the question “Are you sure to overwrite files in the workspace?”. This downloads the files from the cloud to the local folder. Be careful if you already have any file in this folder, since this operation could delete it:

Once the operation has finished, we refresh the project tree:

We will see that a “+” now appears in front of “ConvertFieldToDate”, and we can check that it contains the files:

In this case, we want to fix a few imports in the “convertFieldToDate” file, so we open the script, make the changes and save it.

Once this is done we press the refresh button in the previous window.

We will see that the file is shown as modified and that it differs from the one in the online repository.

If we select one of the changes in the list and press the “Show differences” button, we can check the changes made to the file. In this case, the removal of two lines of code:

To upload the changes to the online repository we first have to commit them. We can choose whether to commit all of them or only those of the selected files. We press the Commit button:

We write a message describing the problem that has been fixed:

Once the commit is done, we have to upload the changes to the online repository. For this step we use the Push button.

We will be asked for the repository username and password.

Once it has finished, an “OK” message will appear.

If we go to the website we can check that the commit has been made correctly.

And we can view the changes of this commit:

This is a brief introduction to the Git functionality in gvSIG.

We hope to publish more documentation soon, but as always, if you have any doubts you can ask on the development mailing lists.

 

Categories: OSGeo Planet

From GIS to Remote Sensing: Cloud Masking, Image Mosaic, and Land Cover Change Location

OSGeo Planet - Fri, 2018-04-13 08:20
This tutorial is about the use of SCP for the assessment of land cover change. It is recommended to complete Basic tutorial 1: Land Cover Classification of Landsat Images before this tutorial.

The purpose of this tutorial is to locate land cover change over one year (between 2017 and 2018), using free Sentinel-2 images.

Read more »
Categories: OSGeo Planet

Martin Davis: The world needs a new flavour of SOSS!

OSGeo Planet - Thu, 2018-04-12 18:29
Yes, you don't know what the acronym SOSS means, because I just made it up. SOSS stands for Standard Open Simple Spatial format.

It's crazy that in the 21st century the most common de facto standard spatial format is based on 30-year-old technology, is proprietary, and has silly limitations such as 11-character uppercase attribute names.

I'm talking, of course, about shapefiles.

Surely we can do better than this?!

Now there are actually a few things that shapefiles get right. For instance, the shapefile's simplistic tabular data model gets two full marks for being - simple and tabular! Hierarchical data models are very cool and highly expressive, but overkill and too complex for 80% of the use cases out there.

Another useful feature of shapefiles is that they store floating point data with full precision - i.e. in binary. Representing binary floating point numbers as textual decimal values is inherently lossy, and causes all kinds of subtle and annoying problems. (I'm always surprised that this doesn't crop up more often as a serious limitation of GML.)

So what are the current leading contenders for a SOSS format? Here's an opinionated list, with pros and cons


  • Shapefile (Pro: tabular, lossless numerics; Con: proprietary, antiquated, limited)
  • GML (Pro: open, flexible; Con: complex to model and parse, lossy numerics, poor schema handling)
  • KML (Pro: relatively simple, well documented; Con: proprietary, lossy, limited attribute handling, designed for presentation)
  • GeoRSS (Con: not appropriate as a full-featured SOSS, lossy)
  • GeoJSON (Con: too tied to Javascript, lossy, no schema standard)
  • YAML (Con: needs a spatial profile)
Conspicuous by its absence on this list is XML. In fact XML is a meta-format, not a format. To utilize XML would require defining an appropriate profile (which would need to be highly restricted to meet the criteria of simple). The major drawback of XML is that specifying the profile almost inevitably drags one into the mind-bending hell of XML Schema. (There are other schema languages, such as RelaxNG, but they involve similar complexity and have even less traction).
There are also more esoteric formats such as NetCDF, but NetCDF fails the simplicity test, and it's unclear how well it supports geometry types.



Categories: OSGeo Planet

gvSIG Team: Correction of shapefiles with errors in gvSIG Desktop

OSGeo Planet - Thu, 2018-04-12 12:45

Since gvSIG 2.3, when a shapefile has corrupt geometries it is loaded in the Table of Contents (ToC) but not in the View, and we have the option to correct that layer. One of the reasons is that if a layer with corrupt geometries were loaded and the user edited it, saving the edits with those errors could cause problems and data loss. To avoid this, the decision was to load the layer in the ToC without loading the geometries, and to inform the user.

If, after adding the layer to the View, we see that it appears in the ToC but is deactivated and marked with a red exclamation mark, and an error message appears in the lower left part of the screen, it is a sign that the layer contains errors.

It may also happen that the layer is initially loaded into the View without problems, but if not all the records have been read, an error may still be present. Then, when performing any subsequent operation (it can simply be a zoom), the corrupt geometry is detected and the layer is deactivated, showing the red exclamation mark.

In order to see the details of the error we activate the layer and, with the secondary mouse button, select “Show errors”.

In the new window that opens we get information about which geometries are corrupt, with a message similar to this: “There were errors loading the feature ‘x’ from ‘[layer_name]'”, where “x” is the geometry number (taking into account that the first geometry is “0”) and “layer_name” is the name of our layer.

This window will also indicate what type of error it is and what we should do. The errors can be related, for example, to the number of vertices of a line or a polygon: a line with a single vertex or a polygon with two vertices is not correct. In that case we will see a message similar to this: “Invalid number of points in LinearRing (found 3 – must be 0 or >= 4).”

It can also tell us whether it is possible to fix the problem. For example, the message “Check ‘Fix LinearRings and LineStrings’ in the shape’s properties of the add layer dialog to try fix it” may appear.

If we get this message, we can try to correct the layer. From the previous window we access its properties by clicking the lower “_View_properties_of_data_source” button (or its corresponding translation).

Note: for the moment, in gvSIG 2.3 and 2.4 this option is only available for shapefiles. For the rest of the vector formats it is necessary to delete the layer from the View and add it again, accessing its properties from the “Add layer” window.

In the layer properties window we go to the “Advanced” tab and follow the steps that the error message indicated.

There will be several options to fix the error.

Correction of geometries that have fewer points than needed

If the error message tells us which geometry is corrupt and indicates that we should check the “Fix LinearRings and LineStrings” option, we check it in the “Advanced” tab and accept; then in the View we click with the secondary button on the active layer and press “Reload”. This way we will see the elements of the layer in the View, and duplicated vertices will be added at the end of the geometries that had fewer points than needed.

Correction of geometries with another type of error

When we correct a geometry that had fewer points than needed, gvSIG, as mentioned, duplicates the points necessary for the geometry to load correctly; but after this automatic fix we may still end up with, for example, a polygon with three points where two of them coincide. In that case we may want to know why the geometry was wrong and correct it manually if possible.

The initial error information indicated which geometry was corrupt. We then open the attribute table and look for the record indicated in the error. As the records start from “0” in the code but from “1” in the attribute table, we have to add “1” to the record number indicated in the error. For example, if it indicated geometry number “5” we will look for element “6” in the attribute table.

After selecting the element we zoom to the selected feature and see the corrupt geometry in the View. The next step is to put the layer in editing mode and correct the geometry.

Depending on the initial error message, we will try to correct it in one way or another. Some of the possible ways to correct the geometries are the following:

  • If it is a polygon layer and we have a geometry formed by two points, we can delete it and create a new polygon, or we can add more vertices. The same applies if it is formed by a single point.
  • If it is a polygon layer with a multipolygon geometry, and one of the polygons of that multipolygon is really a point or a line, we would first use the “Explode geometry” tool and then delete or correct the offending parts.

Once the geometries are corrected, we won’t be able to save changes directly to the layer we are editing; the application will force us to export to a new layer when finishing editing. The new layer will then load correctly in the View.

Elimination of corrupt geometries

If we want to remove the corrupt geometries directly, we check the “Load corrupt geometries as null” option in the “Advanced” tab. Then we accept, and in the View we click with the secondary button on the active layer and press “Reload”. This way we will see the elements of the layer in the View.

The next step is to put the layer in editing mode. Then we open its attribute table and go to the Table->Properties menu. In the new window we will see the different fields of the layer; we mark the GEOMETRY field as visible and accept. If we now go to the last column of the table we will see that it is the field that contains the geometry.

The next thing is to make the GEOMETRY field active and sort it in descending order, so that at the top we see the records that do not have a geometry (the value of the GEOMETRY field is empty).

The corrupt geometries will be among the records where the GEOMETRY field is empty, but note that if we have more than one record with an empty GEOMETRY field, some of those geometries may actually be correct. For example, if we previously worked with a database it can easily contain null geometries, so when that database is exported to SHP those records are kept and are not really an error. If there is only one null geometry in the table, it will be the one with the error.

With the layer in editing mode, we can select directly in the table the records that we want to delete and click the “Remove row” button (also available in the Table menu), which removes the desired geometries.

Finally we finish editing, and the application will force us to save the result as a new layer on disk. This way we will have the corrected layer, without the geometries that were corrupt.

 

Categories: OSGeo Planet

GIScussions: An address is …

OSGeo Planet - Wed, 2018-04-11 18:34


This week a headline in the Telegraph caught my eye:

Postcodes ‘no longer fit for purpose’ as study shows most people have one which does not go directly to their door

The article goes on:

Postcodes are “no longer fit for purpose” as three in four people in the UK have an address that does not lead directly to the door of their home or business, according to new research.

British technology company What3words has said that the current address system in the UK needs to change after it found that people are struggling more than ever to get their parcels delivered to the right location.

A survey carried out by the company showed that out of 1,000 people in the UK, 74 per cent said deliveries, services and visitors struggle to find their homes or businesses; equating to almost 46 million people nationally.

And now the revolutionary firm is hoping to stop the delivery nightmare with their global grid system.

First up I should congratulate the marketing and PR team at what3words for getting this piece published in a so called quality daily paper. Those of you who know me and have read some of my previous posts on location codes will know that this would be a red rag for me, and despite my best efforts I couldn’t resist the bait.

Bailar pegados

Let’s start by deconstructing the premise of this puff piece.

“three in four people in the UK have an address that does not lead directly to the door of their home or business”

Well that’s an enormous jump from a postcode to an address! This is either a deliberate attempt to mislead by conflating addresses with postcodes or it is just sloppy writing/editing or most probably a bit of both.

Assuming the wording should have been “three in four people have a postcode that does not lead directly to the door...” then it is stating the obvious, as most postcodes refer to an average of 15 properties, with a small number applying to individual commercial properties (or even a floor in a large multi-tenanted block). Coordinates are assigned to the “middle house” within a postcode group; this is better explained by EDINA (slightly out of date with its reference to Address-Point):

Code Point is derived from Gridlink and provides a National Grid reference for each unit postcode in the United Kingdom to a resolution of 1 metre. Each Grid Reference point is known as the Code-Point Location Coordinate (CPLC).

To generate each CPLC, the mean position of each delivery point in a unit postcode is calculated and the CPLC is allocated the ADDRESS-POINT (an Ordnance Survey Gridlink product) coordinates which fall nearest to this mean.

Although each postcode is allocated a grid reference, the grid reference may be shared by more than one postcode. For example, if a block of flats ( a Vertical Street) contains more than one postcode, each postcode will be allocated with the same grid reference as its CPLC.

So on that basis only 7% of houses have a postcode that leads to their front door. But miracle of miracles (and it pains me to praise the Royal Mail) according to the BBC News 99.87% of mail is delivered correctly. So clearly the fact that postcodes do not identify individual front doors does not prevent the mail being successfully delivered.

“the current address system in the UK needs to change after it found that people are struggling more than ever to get their parcels delivered to the right location”

Really? That’s not my experience: why would the mail services be able to successfully deliver mail but not the parcel services? In the last 5 years we have had one parcel delivery which went astray, and that turned up a week or so later. I would have thought that most large-scale home delivery firms will be using a full address product such as Ordnance Survey’s AddressBase Premium, which provides accurate coordinates for every address in GB (yes, I know, now I’ve praised RM and OS!).

“A survey carried out by the company showed that out of 1,000 people in the UK, 74 per cent said deliveries, services and visitors struggle to find their homes or businesses; equating to almost 46 million people nationally”

Now I really wonder what the author of the piece, or her editor, were thinking of (or smoking)? 46 million people have a struggle to be found! This is difficult to believe, to say the least (I had to rephrase the first and second drafts of this paragraph to remove the expletives). Surely one or two (or several more) of the people I know or you know would have mentioned their struggle to be found? Is there a conspiracy going on to keep this national scandal a secret? Why do we not know that 46m people are struggling to be found? Should we call in the emergency services to help them get found or is it time to mobilise the army?

Do they “struggle” every day, every week, once a month, once a year, once in the last 10 years? I have a ‘thing’ about headlines based on surveys where the questions asked and raw data are not available. These pseudo scientific surveys are often carried out on behalf of companies promoting the need for their products or services or by campaigners seeking to make a point by asking ‘loaded’ questions. Of course what3words could easily shut me up by providing the raw data behind their survey and if I have mischaracterised their survey I will happily write a grovelling retraction.

“And now the revolutionary firm is hoping to stop the delivery nightmare with their global grid system”

You’ve got to hand it to the what3words PR team, they have played a blinder. We have now escalated to a “delivery nightmare”: 46 million people are waking up with cold sweats worrying whether their Amazon or ASOS delivery is going to go missing – I recommend cocoa before bed along with a subscription to Prime.

A thought – 3m squares may not be the answer to the delivery nightmare. About 17% of homes are flats, which means that the front doors are both inside the main entry point and stacked vertically above each other, which will still leave the poor delivery person struggling to find the front door from a 3 word code. That’s 1 in 6, or 10 million people, still having sleepless nights because drivers are using 3 word codes instead of addresses.

Enough! A code is not an address

 


A location code (whether it is a postcode or based on 3 words or some other algorithm) is a proxy for coordinates; it is not an address or even a proxy for an address (you might argue that a postcode is an aggregate of multiple addresses).

Addresses are an important component of our social, democratic, economic and digital infrastructure. I am grateful to my friends at GeoPlace for some useful quotes on why addresses matter

“Urban development, economic growth and the provision of basic services are inextricably linked to the existence of sound address infrastructures in urban and rural areas alike.”

Anna Tibaijuka, Tanzanian minister of lands, housing and human settlements development

Addresses are a component in establishing legal identity (which would suggest that they need to be allocated by a legal authority)

“Four billion people are excluded from the rule of law, as the lack of a legal identity often prevents them from enjoying their rights as citizens. Setting up an addressing system is the first step towards tackling that issue.”

Commission on Legal Empowerment of the Poor, United Nations Development Programme

The UPU may have an axe to grind on this topic but once again they are suggesting that addresses need official authentication.

Addresses appear to be a key element in aiding the delivery of policies at national and international levels in support of the Millennium Development Goals particularly with regard to governance, rule of law, poverty reduction, disease prevention and the provision of basic services such as electricity, sanitation and water.

Addressing the World – an address for everyone. Universal Postal Union 2012

Let me be clear here, I am not anti what3words. If delivery firms think that licensing a 3 word code system as a proxy for coordinates will help them to deliver parcels then go for it. If their customers want to remember their 3 word code rather than their address because that will prevent their “nightmare struggle” to get deliveries then that’s wonderful, I’m all in favour of getting a better night’s sleep. And if Mercedes think that 3 words will help their owners’ in-car navigation systems, then good luck to them (personally I would focus on beefing up the voice recognition to enable it to detect street addresses, because there aren’t many Mercedes in places without an address system).

What infuriates me (understatement) is the knowing endeavour to conflate a location code, a proxy for coordinates with no other context or hierarchy, with an address when it clearly is considerably different. Oh and add in the use of a faux survey to assert that there is a “delivery nightmare” in the UK which they (and only they?) can solve.

An address is …

Let’s try to define what constitutes an address and why a 3 word code is not an address. Rob Walker, a past chair of the Association for Geographic Information and something of a super nerd on addressing and address standards, submitted a paper to the “ISO Workshop on address standards: Considering the issues related to an international address standard” (2008) in which he suggested that (my italics and bold below)

An address is a label used to reference a geographical object such as a property, for the purpose of identification and location, through the use of identifiable real-world objects. Addresses are widely used in government, commerce and everyday life as descriptions of where places are, and people are often referenced by their home address. The most common form of address is the postal address, used for the delivery of mail, where the address is essentially a routing instruction leading to the property.

It’s those real world identifiable objects that give an address context and make it human readable.

Addresses are used for a range of purposes including

  • Delivery organisations, to identify delivery points;
  • Other service organisations, to identify the location of service delivery points;
  • People, to uniquely identify themselves via their place of residence;
  • Governments, to identify where people live and work, for planning public services;
  • Taxation authorities, to levy taxes on people and organisations;
  • Emergency services, for deployment and contingency planning;
  • Land authorities, for property registration and transactions;

Rob outlines a general structure for addresses

Addresses generally follow a simple structure incorporating the names or numbers of a nested set of spatial units:

  • The name or number of a sub-unit within a building or property;
  • The name or number of a building or property within a street;
  • The name of a street;
  • The name of (one or more) geographic areas (locality, town, county etc);
  • The name of a country.

Part of such an address is often abbreviated by a code (e.g. a postcode or area code).

The exact definition of each of these levels in an address usually varies from country to country, and are often defined in national standards. (e.g. BS 7666 in the UK).

17 West Street, Epsom, Surrey gives me an immediate idea of where Astun’s office is; even if I don’t know Epsom well, I probably have an idea of where Epsom is in relation to where I live. Adding the postcode doesn’t help me with that context, but it usually works with the maps app on my phone. On the other hand, doing.random.manliness doesn’t even tell me that the address is in the UK, but it did give me a laugh:

w3w has our office address down as doing.random.manliness .. more or less where the gents is. In fairness, you just don't get the same comedy value with KT18 7RL

— Daniel Ormsby (@ormsbydaniel1) April 9, 2018

Apparently, if the Queen wanted a package delivered to her front door, she’d just have to reference ‘rocks.skin.grand’.

I am sure there are lots of slightly risqué 3 word combinations that we can all search for, but please let’s not kid ourselves that HM the Queen is going online and ordering her pizza to be delivered to her sleaze-word address.

It’s probably time to get the tin hat out as the flak comes flying in.

Categories: OSGeo Planet

Jackie Ng: Some patches and fixes of MapGuide Open Source 3.1.1

OSGeo Planet - Tue, 2018-04-10 13:10
Barely a week after releasing MapGuide Open Source 3.1.1, I've added some patches that may be of use for some of you out there.

Firstly, I've uploaded a newer release of Fusion that includes a fix for an issue with Google Maps support discovered after the 3.1.1 release. If you downloaded Fusion with the updated Bing Maps support in the past, this release is packaged in the same fashion: Download, extract and overwrite the existing Fusion installation.

Secondly, for those that use Oracle and have been clamoring for 12c support, I've uploaded experimental King Oracle FDO provider DLLs that have been built against the Oracle 12c Instant Client SDK. These providers have not yet been fully tested against Oracle 12c and have been made available for you to provide feedback or report any issues regarding 12c support.

If either one of these patches is of interest to you, grab them from the updated 3.1.1 release notes page.
Categories: OSGeo Planet