OSGeo Planet

Geomatic Blog: Aggregating points: JSON on SQL and loops on infowindows

OSGeo Planet - Mon, 2017-02-20 07:00

NOTE: I’ll use CARTO but you can apply all this to any webmapping technology backed by a modern database.

Get all the data

So we start with the typical use case where we have a one to many relationship like this:

select
  e.cartodb_id,
  e.displayname,
  e.division,
  e.photourl,
  l.cartodb_id as location_id,
  l.location,
  l.the_geom_webmercator
from locations l
inner join employees e on e.location = l.location
order by location

Easy peasy: we have a map with many stacked points. From here you could jump to this excellent post by James Milner about dense point maps. My example is not about thousands of scattered points that overlap at certain zoom levels; mine is a small set of locations with many points “stacking” on each of them. In this case you can do two things: aggregate or not. When you aggregate, you pay a price in readability: you reduce all your data to those locations, maybe using visual variables to show counts, averages or other aggregated values, and finally rely on the interactivity of your map to complete the picture.

So at this point we have something like this map, no aggregation yet, but using transparency we can see where CARTO has many employees. We could also use a composite operation instead of transparency to modify the color of the stacked points.
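As a sketch of the idea, this is roughly how the transparency or a composite operation could be expressed in CartoCSS (the #layer name and the values are illustrative, not taken from the actual map):

```css
/* Stacked points: let overlaps accumulate visually */
#layer {
  marker-width: 10;
  marker-fill: #226688;
  /* Option 1: transparency, so stacks of points look darker */
  marker-fill-opacity: 0.3;
  /* Option 2 (instead of opacity): a composite operation such as */
  /* marker-comp-op: multiply; */
  marker-line-width: 0;
  marker-allow-overlap: true;
}
```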

Stacking points using transparency


Aggregate and count

OK, let’s GROUP BY the geometry and apply an aggregation like counting. At least now we know how many people are in each location, but that’s all; we lose the rest of the details.

select
  l.the_geom_webmercator,
  min(e.cartodb_id) as cartodb_id,
  count(1) as counts
from locations l
inner join employees e on e.location = l.location
group by l.the_geom_webmercator

Grouping by location and counting

Aggregate one field

But in my case, with CARTO we have PostgreSQL at hand, so we can do way more than that. PostgreSQL has many cool features, and handling JSON types is one of them. Mix that with the fact that almost all template systems for front-end applications can iterate over JavaScript objects, and you have a winner.

So we can combine the json_agg function with MustacheJS’s iteration over objects to render the names of our employees.

select
  l.the_geom_webmercator,
  min(e.cartodb_id) as cartodb_id,
  l.location,
  json_agg(e.firstname) as names, -- JSON aggregation
  count(1) as counts
from locations l
inner join employees e on e.location = l.location
group by l.the_geom_webmercator, l.location

And this bit of HTML and Mustache template to create a list of employees we can add to the infowindow template:

<ul style="margin:1em;list-style-type: disc;max-height:10em;">
  {{#names}}<li class="CDB-infowindow-title">{{.}}</li>{{/names}}
</ul>
{{^names}}loading...{{/names}}

List of employees on the infowindow

We could do this without JSON types, composing all the markup in the SQL statement, but that would generate quite a lot of content to move to the frontend and, of course, make the whole thing much harder to maintain.

Aggregate several fields

At this point we could repeat the same function for the rest of the fields, but then we would have to iterate over each of them separately. It would be much better to build JSON objects with all the content we want, in a single output field we can iterate over in our infowindow. With PostgreSQL we can do this with the row_to_json function and a nested SELECT to give the properties names. We could use row_to_json(row(field1,field2,..)) directly, but then our output fields would have generic names.

select
  l.the_geom_webmercator,
  min(e.cartodb_id) as cartodb_id,
  l.location,
  count(1) as counts,
  json_agg(row_to_json((
    SELECT r FROM (
      SELECT photourl as photo,
             coalesce(preferredname, firstname, '') as name
    ) r
  ), true)) as data
from solutions.bamboo_locations l
inner join solutions.bamboo_employees e on e.location = l.location
group by l.the_geom_webmercator, l.location
order by counts asc

With this query we now have a data field containing an array of objects with the display name and the web address of each employee’s picture. It is now easy to compose a simple infowindow where you can see the faces and names of my colleagues.

<div style="column-count:3;">
  {{#data}}
  <span style="display:inline-block;margin-bottom:5px;">
    <img style="height:35px;" src="{{photo}}"/>
    <br/>
    <span style="font-size:0.55em;">{{name}}</span>
  </span>
  {{/data}}
</div>
{{^data}}
loading...
{{/data}}

Adding pictures and names

That’s it. You can do even more if you retrieve all the data directly from your database and render it on the frontend; if you use D3, for example, you can probably build fancier symbolizations and interactions.

One final note: if you use UTF grids (as these CARTO maps do) you need to be conservative with the amount of content you put in your interactivity, because with medium and big datasets it can make your maps slow and too heavy for the front-end. In those cases you may want to switch to an interactivity that works like the WMS GetFeatureInfo workflow, retrieving the information directly from the backend when the user clicks on the map instead of retrieving everything when your tiles load.
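As a sketch of that GetFeatureInfo-style workflow, a click handler could query the backend through the CARTO SQL API instead of relying on tile interactivity. The helper below only builds the request URL; the account, table and radius names are hypothetical:

```javascript
// Build a CARTO SQL API URL that fetches features near a clicked point.
// All names (account, table, radius) are illustrative assumptions.
function featureInfoUrl(account, table, lng, lat, radiusMeters) {
  const sql =
    `SELECT * FROM ${table} ` +
    `WHERE ST_DWithin(the_geom::geography, ` +
    `ST_SetSRID(ST_MakePoint(${lng}, ${lat}), 4326)::geography, ${radiusMeters})`;
  return `https://${account}.carto.com/api/v2/sql?q=${encodeURIComponent(sql)}`;
}

// On a map click you would fetch() this URL and fill the infowindow
// with the response, instead of shipping all the data inside the tiles.
```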

Check the map below and how the interactions show the aggregated contents. What do you think of this technique? Any other procedure to display aggregated data that you think is more effective?

Filed under: CARTO, cartography, GIS, SQL, webmapping
Categories: OSGeo Planet

Ivan Minčík: Unique jobs at Land Information New Zealand

OSGeo Planet - Sun, 2017-02-19 23:58
I have been working at Land Information New Zealand (LINZ) for almost a year now, and my finding is that New Zealand is a unique country, unlike any other. Over time I have slowly realized that LINZ, and especially the part where I work - Location Information - is also very positively unique in its culture and people.
There is a giant area of land and sea we care about, stretching from New Zealand across the South-West Pacific to Antarctica. We run a unique free data publishing service. We use and contribute to a lot of open source software like QGIS, PostGIS, GDAL, Python and Linux. We have unique managers doing Debian packaging. We sing Maori songs every Friday. We have numerous running clubs. We face unique natural challenges, and Kiwis are still the most optimistic people around the globe.

The great news is that if you want to know what I am talking about, there is a unique opportunity. We are hiring for two very interesting positions - DevOps Database Developer and Spatial IT Solutions Developer.

Have a look and send us your CV.
Categories: OSGeo Planet

QGIS Blog: QGIS Grants #2: Call for Grant Proposals 2017

OSGeo Planet - Sun, 2017-02-19 19:03

Dear QGIS Community

Last year we held our first ever call for Grant Proposals and it was a great success. If you are an early adopter using QGIS 3.0 preview builds, you can already try out some of the new capabilities that have arrived in QGIS thanks to these grants.

We are very pleased to announce the second round of Grants is now available to QGIS Contributors. The deadline for this round is Sunday, 19 March 2017. All the details for the Grant are described in the application form, and for more context we encourage you to also read these articles:

We look forward to seeing all your great ideas about how to improve QGIS!

Tim Sutton

QGIS Project Chair

Categories: OSGeo Planet

Paul Ramsey: Super Expensive Cerner Crack-up at Island Health

OSGeo Planet - Sun, 2017-02-19 16:00

Kansas City, we have a problem.

Super Expensive Cerner Crack-up at Island Health

A year after roll-out, the Island Health electronic health record (EHR) project being piloted at Nanaimo Regional General Hospital (NRGH) is abandoning electronic processes and returning to pen and paper. An alert reader forwarded me this note from the Island Health CEO, sent out Friday afternoon:

The Nanaimo Medical Staff Association Executive has requested that the CPOE tools be suspended while improvements are made. An Island Health Board meeting was held yesterday to discuss the path forward. The Board and Executive take the concerns raised by the Medical Staff Association seriously, and recognize the need to have the commitment and confidence of the physician community in using advanced EHR tools such as CPE. We will engage the NRGH physicians and staff on a plan to start taking steps to cease use of the CPOE tools and associated processes. Any plan will be implemented in a safe and thoughtful way with patient care and safety as a focus.
Dr. Brendan Carr to all Staff/Physicians at Nanaimo Regional General Hospital

This extremely expensive back-tracking comes after a year of struggles between the Health Authority and the staff and physicians at NRGH.

After two years of development and testing, the system was rolled out on March 19, 2016. Within a couple months, staff had moved beyond internal griping to griping to the media and attempting to force changes through bad publicity.

Doctors at Nanaimo Regional Hospital say a new paperless health record system isn’t getting any easier to use.

They say the system is cumbersome, prone to inputting errors, and has led to problems with medication orders.

“There continue to be reports daily of problems that are identified,” said Dr. David Forrest, president of the Medical Staff Association at the hospital.
– CBC News, July 7, 2016

Some of the early problems were undoubtedly of the “critical fault between chair and keyboard” variety – any new information interface quickly exposes how much we use our mental muscle memory to navigate both computer interfaces and paper forms.

IHealth Terminal & Trainer

So naturally, the Health Authority stuck to their guns, hoping to wait out the learning process. Unfortunately for them, the system appears to have been so poorly put together that no amount of user acclimatization can save it in the current form.

An independent review of the system in November 2016 has turned up not just user learning issues, but critical functional deficiencies:

  • High doses of medication can be ordered and could be administered. Using processes available to any user, a prescriber can inadvertently write an order for an unsafe dose of a medication.
  • Multiple orders for high-risk medications remain active on the medication administration record resulting in the possibility of unintended overdosing.
  • The IHealth system makes extensive use of small font sizes, long lists of items in drop-down menus and lacks filtering for some lists. The information display is dense making it hard to read and navigate.
  • End users report that challenges commonly occur with: system responsiveness, log-in when changing computers, unexplained screen freezes and bar code reader connectivity
  • PharmaNet integration is not effective and adds to the burden of medication reconciliation.

The Health Authority committed to address the concerns of the report, but evidently the hospital staff felt they could no longer risk patient health while waiting for the improvements to land. Hence a very expensive back-track to paper processes, and then another expensive roll-out process in the future.

This set-back will undoubtedly cost millions. The EHR roll-out was supposed to proceed smoothly from NRGH to the rest of the facilities in Island Health before the end of 2016.

This new functionality will first be implemented at the NRGH core campus, Dufferin Place and Oceanside Urgent Care on March 19, 2016. The remaining community sites and programs in Geography 2 and all of Geography 1 will follow approximately 6 months later. The rest of Island Health (Geographies 3 and 4) will go-live roughly 3 to 6 months after that.

Clearly that schedule is no longer operative.

The failure of this particular system is deeply worrying because it is a failure on the part of a vendor, Cerner, that is now the primary provider of EHR technology to the BC health system.


When the IBM-led EHR project at PHSA and Coastal Health was “reset” (after spending $72M) by Minister Terry Lake in 2015, the government fired IBM and turned to a vendor they hoped would be more reliable: EHR software maker Cerner.

Cerner was already leading the Island Health project, which at that point (mid-2015) was apparently heading to a successful on-time roll-out in Nanaimo. They seemed like a safe bet. They had more direct experience with the EHR software, since they wrote it. They were a health specialist firm, not a consulting generalist firm.

For all my concerns about failures in enterprise IT, I would have bet on Cerner turning out a successful if very, very, very costly system. There’s a lot of strength in having relevant domain experience: it provides focus and a deep store of best practices to fall back on. And as a specialist in EHR, a failed EHR project will injure Cerner’s reputation in ways a single failed project will barely dent IBM’s clout.

There will be a lot of finger-pointing and blame shifting going on at Island Health and the Ministry over the next few months. The government should not be afraid to point fingers at Cerner and force them to cough up some dollars for this failure. If Cerner doesn’t want to wear this failure, if they want to be seen as a true “partner” in this project, they need to buck up.

Cerner will want to blame the end users. But when data entry takes twice as long as paper processes, that’s not end users’ fault. When screens are built with piles of non-relevant fields, and poor layouts, that’s not end users’ fault. When systems are slow or unreliable, that’s not end users’ fault.

Congratulations British Columbia, on your latest non-working enterprise IT project. The only solace I can provide is that eventually it will probably work, albeit some years later and at several times the price you might have considered “reasonable”.

Categories: OSGeo Planet

longwayaround.org.uk: Postgres Information Functions

OSGeo Planet - Sat, 2017-02-18 07:55

Postgres contains a wealth of functions that provide information about a database and the objects within it. The System Information Functions chapter of the official documentation provides a full list. There is a huge number of functions covering everything from the current database session to privileges and function properties.

Examples

Find an object's oid

A lot of the info functions accept the Object Identifier Type of objects in the database. This can be obtained by casting to regclass (also described in the oid docs) and then to oid:

select 'schema_name.relation_name'::regclass::oid;

Where relation_name is a table, view, index etc.

View definition

select pg_get_viewdef('schema_name.view_name'::regclass::oid);

Or in psql you can use one of the built in commands:

\d+ schema_name.view_name

Function definition

Returns the function definition for a given function. Many built-in functions don't reveal much because they are not written in SQL, but for those that are you'll get the complete create function statement. For example, to view the definition of the PostGIS st_colormap function:

select pg_get_functiondef('st_colormap(raster, integer, text, text)'::regprocedure);

Privileges

A whole host of functions exist to determine privileges for schemas, tables, functions etc. Some examples:

Determine if the current users can select from a table:

select has_table_privilege('schema_name.relation_name', 'select');

Note: the docs state that "multiple privilege types can be listed separated by commas, in which case the result will be true if any of the listed privileges is held". This means that to test a number of privileges it is normally better to test each privilege individually: select has_table_privilege('schema_name.relation_name', 'select,update'); would return t even if only select is granted.
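A minimal illustration of that pitfall, assuming a role that has been granted select but not update on the table:

```sql
-- Combined check: true if ANY of the listed privileges is held,
-- so a missing privilege can go unnoticed
select has_table_privilege('schema_name.relation_name', 'select,update'); -- t

-- Safer: check each privilege on its own
select has_table_privilege('schema_name.relation_name', 'select'); -- t
select has_table_privilege('schema_name.relation_name', 'update'); -- f
```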

Determine if a user can use a schema:

select has_schema_privilege('schema_name', 'usage');
Categories: OSGeo Planet

Jackie Ng: React-ing to the need for a modern MapGuide viewer (Part 11): I don't say this in jest

OSGeo Planet - Fri, 2017-02-17 13:01
... but seriously, Jest completes my holy trinity of web front-end development nirvana.

  • React, because of its high performance and revolutionary component-based way of building frontend UIs. I could never go back to jQuery, data-binding, string templating and those other primitive ways of building frontends.
  • TypeScript, because it is, in my opinion, the only sane programming language for frontend development. Just like jQuery was the glue that held together inconsistent browser APIs for many years, TypeScript is the glue that lets us play with future JS technologies in the present. TypeScript is JavaScript with C#-quality static typing. I love that a whole class of errors is eliminated through a simple compile step. I can't fathom having to maintain large code bases in a dynamically-typed language. TypeScript brings order and sanity in that regard. And with TypeScript 2.0, I don't have to deal with billion dollar mistakes.
  • And finally, Jest which I believe to be the total package for JavaScript unit testing that is sooooooo easy to set up! Code coverage is also included.
Before I tried Jest, the unit test suite for mapguide-react-layout was a convoluted stack of karma, mocha and chai. I also tried to get code coverage working, but that required a tool called istanbul, and because my code was TypeScript it needed a TypeScript source map plugin for istanbul to recognise it, which resulted in a rube-goldberg-esque contraption at the foundation of my unit test suite that didn't even get the coverage right. So I scrapped the code coverage part and just coasted along with the karma/mocha/chai combo until now.
With the introduction of Jest, it does the job of karma/mocha/chai/istanbul in a single, easy to install package. Only some small changes to my unit test suite were required (porting chai assertions to their Jest equivalents) and the whole suite was passing. With the simple addition of a --coverage flag to jest, it automagically generates code coverage results and reports.
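For reference, a minimal package.json sketch of such a setup, using the ts-jest preset to handle TypeScript; this is an assumption about one common configuration, not necessarily the exact one used by mapguide-react-layout:

```json
{
  "scripts": {
    "test": "jest",
    "test:coverage": "jest --coverage"
  },
  "jest": {
    "preset": "ts-jest",
    "collectCoverageFrom": ["src/**/*.{ts,tsx}"]
  }
}
```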

Since I am now getting code coverage reports, the obvious next step was to upload them to the coveralls.io service. It turns out TravisCI already supports automatic upload of code coverage reports to coveralls; it just needed node-coveralls installed and the jest coverage output piped to it.
And with that, I get another shiny badge to brandish on the project home page.

This is almost too easy. Results like this easily incentivize you to write good quality code.
One last thing before I close out this post: the 63% coverage figure is a bit misleading. It turns out it is actually the percentage of code covered in the modules your unit tests currently touch, which makes sense. The moment I start bringing other components and classes under test, I expect this percentage to plummet, which is merely an incentive to write more tests to bump it back up.
Categories: OSGeo Planet

gvSIG Team: Third edition of the Cátedra gvSIG international contest for university work using free geomatics

OSGeo Planet - Fri, 2017-02-17 10:25


As several media outlets have already reported, the Universidad Miguel Hernández (UMH) has launched the call for the third edition of the Cátedra gvSIG contest for work carried out with free Geographic Information Systems.

This third edition of the contest aims to promote the use of free geomatics in the university and pre-university world, encouraging users of gvSIG and of free Geographic Information Systems in general to share their work and give it visibility.

The prizes are open to students and graduates of secondary and vocational education, university students and graduates, and university lecturers and researchers from any country. Entrants may participate collectively or individually, submitting their work in English, Spanish or Valencian.

Among the selected works, a prize of 500 euros will be awarded in each of the following categories:

  • Work produced by secondary school (Bachillerato) or vocational training students.
  • Final degree project (Bachelor's, Degree, Master's).
  • Doctoral thesis or research work.

More and more university work uses gvSIG as a fundamental part of its research. If you are part of this community too, go ahead and submit your proposal to the Cátedra gvSIG contest.

More information here.

And for those who are curious, here is the video from the Radio UMH news programme covering the contest:

Filed under: Geopaparazzi, gvSIG Desktop, gvSIG Mobile, gvSIG Online, premios, press office, Projects, software libre, spanish Tagged: Cátedra, Concurso, geomática, Tesis, trabajo fin de grado, Universidad
Categories: OSGeo Planet

Jackie Ng: Announcing: mapguide-react-layout 0.8

OSGeo Planet - Wed, 2017-02-15 13:32
Here's a new release of mapguide-react-layout.

Here's what's new in this release.

Multiple Map Support

If you load an Application Definition with multiple map groups, the viewer now properly supports them.

Thanks to the use of redux (as my previous blog post explained), map state is nicely isolated per map, which makes it easy for components and commands to be aware of multiple maps, as is the case with the measure component (notice how recorded measurements switch along with the active map).

Also, for Task Pane content, we added some smarts so that you know whether the current task pane content is applicable to the current active map.

Other Changes

  • Update Blueprint to 1.9.0
  • Update React to 15.4.2
  • Improved performance of redux aware components to avoid unnecessary re-rendering
  • Sidebar Template: Fix a small sliver of the Task Pane content visible when collapsed
  • Legend: Fix infinite loop on maps with multiple (>2) levels of group nesting
  • Hover styles no longer render for disabled toolbar items
  • Clicking an expanded panel in an accordion no longer collapses it (an expanded panel should be collapsed by clicking another collapsed panel). This affects viewer templates that use accordions (eg. Slate)
  • Added support for InvokeURL command parameters
  • Fix default positioning of modal dialogs

Categories: OSGeo Planet

Tom Kralidis: OSGeo Daytona Beach Code Sprint 2017 redux

OSGeo Planet - Wed, 2017-02-15 12:32
I attended the 2017 OSGeo Code Sprint last week in Daytona Beach.  Having put forth a personal sprint workplan for the week, I thought it would be useful to report back on progress. pycsw There was lots of discussion on refactoring pycsw’s filter support to enable NoSQL backends.  While we are still in discussion, this […]
Categories: OSGeo Planet

gvSIG Team: New CartoCiudad geocoder developed by the gvSIG Association

OSGeo Planet - Wed, 2017-02-15 11:09


The new CartoCiudad geocoder was announced today, a considerably improved service that delivers better results than its predecessor. At the gvSIG Association we are pleased to echo this news and to have taken part in its development together with Scolab, one of the Association's partner companies.

CartoCiudad is a collaborative project for producing and publishing, through web services, spatial data with national coverage. It contains information on the continuous road network (streets with house numbers and roads with kilometre markers), urban cartography and toponymy, postal codes, and census districts and sections.

The CartoCiudad project is led and coordinated by the Instituto Geográfico Nacional (IGN). It is built from official data from the IGN, the Dirección General del Catastro, the Correos Group and the Instituto Nacional de Estadística. The autonomous communities of the Basque Country, Navarre, the Valencian Community, La Rioja, the Balearic Islands and Andalusia also collaborate in its production.

With this new application developed by the gvSIG Association, both direct and reverse geocoding can be performed. To obtain coordinates from an address, the new service can geolocate both an urban address and a kilometre point on a road. The service also makes it possible to search for an address using the names of entities smaller than the municipality, thanks to the IGN's reference data on populated places.

As a novelty, the service can geolocate cadastral references, obtaining parcel coordinates through the SOAP street-map service and the unprotected cadastral data of the Dirección General del Catastro.

The CartoCiudad project viewer already uses this new application in its search and routing window.

Details on how to use the new service will be published shortly in the technical guide to web services.

More information on the IDEE blog.

Filed under: geoportal, gvSIG Association, IDE, press office, Projects, software libre, spanish Tagged: cartociudad, cálculo de rutas, directa, geocodificación, geolocalización, inversa, referencias catastrales
Categories: OSGeo Planet

gvSIG Team: Learning GIS with Game of Thrones (VIII): Field calculator

OSGeo Planet - Tue, 2017-02-14 23:00

The “field calculator” is one of the tools GIS users turn to most when editing a layer's attributes, thanks to its versatility and the time it saves when editing many records at once.

It can perform different kinds of calculations on the fields of a table, either on every record or only on the selected ones.

Let's see how it works with a few simple exercises on our Game of Thrones data. But first, a look at its interface.

  1. Information. Provides information about the selected “Field” or “Commands”.
  2. Field. List of the table's fields. Double-clicking a field adds it to the expression to be applied.
  3. Type. The list of available “Commands” updates according to the selected type.
  4. Commands. List of commands available for the selected “Type”. Double-clicking a command adds it to the expression to be applied.
  5. Expression. The operation that will be applied to the selected field. The expression can also be typed in directly.

With the theory covered, let's move on to the practical exercise.

First we open the attribute table of the “Locations” layer which, if you have followed all the exercises so far, will now have 7 columns. One of the existing fields is “type”, which contains the location types (city, castle, ruin, town, other).

Imagine we want to add a new column holding the location type in Spanish. We could do it manually, as we saw in the “Editing tables” post, but thanks to the “Field calculator” we can do it much faster.

Following the steps we learned in the “Editing tables” post, we put the table into editing mode and add a string column, leaving the default number of characters (50). We will call this new column “Tipo”. We could leave the “Default value” empty, but to save time we will enter “Otro” (without the quotes), which automatically fills every record with that value. Now we only have to update the remaining values.

At this point the table looks like this:

Now we will use the “Select by attributes” tool to select each of the values of the “type” field in turn, and the field calculator to automatically fill the selected rows with the corresponding value.

If you don't know how to use the “Select by attributes” tool, check the post where we explained how it works.

Let's start by selecting all the rows whose “type” is “Castle”:

Once they are selected, we click the header of the “Tipo” column (it is shown in dark grey).

We run the “Field calculator” tool, available from the “Table/Field calculator” menu and its corresponding button.

A new window opens where we can write the expression “Castillo” with which we want the cells filled. It is important to note that text values must be written between double quotes.

When we click “OK”, the cells of the “Tipo” column are filled in for the selected rows:

We repeat the same operation with the rest of the values of the “type” field: first select the rows, then fill in the data with the field calculator:

  • Type “City” = Tipo “Ciudad”
  • Type “Ruin” = Tipo “Ruina”
  • Type “Town” = Tipo “Pueblo”

Once the task is finished, we end the editing session and save the changes. Our table will look like this:

The “Field calculator” is very powerful and supports complex expressions. We recommend you experiment with it and learn everything it can do. Until the next post…

Filed under: gvSIG Desktop, spanish, training Tagged: Calculadora de campos, Editar tablas, Juego de tronos, selección por atributos
Categories: OSGeo Planet

Geomatic Blog: How a daily digest of geospatial links is distributed

OSGeo Planet - Tue, 2017-02-14 21:40

TL;DR: if you are interested in getting a daily digest of geospatial links, subscribe to this mailing list or this Atom feed. Take «daily» with a grain of salt.

Over the last six years Raf Roset, one of my favourite geonerds out there, has been sending all the cool stuff he finds about our geospatial world to the Barcelona mailing list on the OSGeo mailman server. He started circa 2011 sending one link per mail, but on 2013-04-03 he started making a daily digest. A gun burst is called a Ráfaga in Spanish, so the joke was right at hand when someone proposed calling the digests that.

Time passes; in September 2014 I ask Raf to send them also to the Valencia mailing list, since most people there understand Catalan and the content was too good to be enjoyed only by our beloved neighbours. Finally, in January 2015, I decide to start translating them into Spanish and sending them to the Spanish and Seville mailing lists as well.

Then in May I join CARTO and @jatorre thinks it is a good idea if I send them to the whole company mailing list, so after some weeks I stop translating them into Spanish. Since that day I only do it in English, trying to follow Raf's lead every day, translating his mails and forwarding them to the CARTO internal mailing list and the rest of the OSGeo ones.

Also, in June I decided to put those mails on a simple website, so the Ráfagas are also accessible on GitHub as a static Jekyll site, and anyone can use the Atom feed to reach them.

Final chapter: in July I also decided to create a dedicated mailing list just for people who are only interested in receiving the digest mails, obviously thinking of a broader audience, not just my fellow friends from Spain. I think at some point I will stop sending them to the Spanish lists, because the Ráfagas don't normally spark any discussion and I'm sending the same message to three lists. To be fair, they sometimes provoke discussions on the CARTO mailing list. By the way, I'm almost certain the full team has a filter to move them to their archives and they think I'm just an annoying spammer (a couple of times I've changed the subject just to troll them xDDD).

To conclude, I want to share my daily Ráfagas routine:

  • Raf is an early bird and sends the digest in the morning; I copy the contents into a shared Google Doc where a group of collaborators help me translate the content. It may not seem like a lot of effort, but doing this every single day needs a team. Really.
  • I go to my favorite text editor, put the translated content into a new file and start a local server to check that the website renders properly.
  • If everything is OK, I copy the rendered content and send it to the CARTO and OSGeo mailing lists.
  • I commit and push to the GitHub repo so the website is updated along with the feed.
  • I archive Raf's mail from my inbox.

Creating a Ráfaga

That’s it. Raf you are a formidable example of perseverance and I hope you’ll have the energy to keep giving us all those contents for many years. Thanks mate!

Filed under: CARTO, cartography, GIS
Categories: OSGeo Planet

gvSIG Team: gvSIG Online in the Mapping special issue on the Jornadas Ibéricas de Infraestructuras de Datos Espaciales

OSGeo Planet - Tue, 2017-02-14 17:14


Mapping magazine, one of the most renowned technical-scientific publications in the field of Geomatics and Earth Sciences, devotes its issue 180 to the recent Jornadas Ibéricas de Infraestructuras de Datos Espaciales (JIIDE), including among its selection of presented papers an article on gvSIG Online, the free software platform for SDIs and a fundamental part of the gvSIG suite of solutions.

You can read it at the following link:


More and more organizations are adopting gvSIG Online. If you are interested too, you can contact us at info@gvsig.com. Freedom and professionalism to set up your Spatial Data Infrastructure and corporate GIS.

Filed under: events, geoportal, gvSIG Online, IDE, software libre, spanish Tagged: gvSIG Suite, INSPIRE, LISIGE
Categories: OSGeo Planet

gvSIG Team: Geopaparazzi Code Sprint and…first image of gvSIG Mobile 2.0

OSGeo Planet - Tue, 2017-02-14 16:43

If you are interested in mobile GIS, the future of Geopaparazzi and the first version of the all-new, all-different gvSIG Mobile, you must read this post…

Filed under: development, english, Geopaparazzi, gvSIG Mobile Tagged: gvSIG Suite
Categories: OSGeo Planet

Volker Mische: An R-tree implementation for RocksDB

OSGeo Planet - Tue, 2017-02-14 14:54

It's long been my plan to implement an R-tree on top of RocksDB. Now there is a first version of it.

Getting started

Check out the source code from my RocksDB rtree-table fork on GitHub, then build RocksDB and the R-tree example:

git clone https://github.com/vmx/rocksdb.git
cd rocksdb
make static_lib
cd examples
make rtree_example

If you run the example it should output augsburg:

$ ./rtree_example
augsburg

For more information about how to use the R-tree, see the Readme file of the project.


The nice thing about LSM-trees is that the index data structures can be bulk loaded. For now my R-tree is built with a simple bottom-up approach with a fixed node size (the default is 4 KiB). The data is pre-sorted by the low value of the first dimension. This means the data has a total order, hence results are also sorted by the first dimension. The idea is based on the paper On Support of Ordering in Multidimensional Data Structures by Filip Křižka, Michal Krátký and Radim Bača.
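To make the bottom-up build concrete, here is a minimal sketch in Java (the actual implementation is C++ inside the RocksDB fork; all names here are illustrative, not the real API). Entries are sorted once by the low value of the first dimension and then packed level by level into fixed-capacity nodes until a single root remains:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Illustrative sketch of bottom-up R-tree bulk loading (not the actual
// RocksDB code): sort once by the low value of the first dimension,
// then pack fixed-capacity nodes level by level until one root remains.
public class BulkLoadSketch {

    // An entry is either a data item or a subtree, with its bounding
    // interval in the first dimension (real code keeps all dimensions).
    public static class Entry {
        public final double low, high;
        public final List<Entry> children; // null for leaf data items
        public Entry(double low, double high) { this(low, high, null); }
        public Entry(double low, double high, List<Entry> children) {
            this.low = low; this.high = high; this.children = children;
        }
    }

    // Pack a sorted run of entries into parents of at most `capacity`
    // children, computing each parent's bounding interval.
    static List<Entry> packLevel(List<Entry> entries, int capacity) {
        List<Entry> parents = new ArrayList<>();
        for (int i = 0; i < entries.size(); i += capacity) {
            List<Entry> group = new ArrayList<>(
                    entries.subList(i, Math.min(i + capacity, entries.size())));
            double low = Double.POSITIVE_INFINITY;
            double high = Double.NEGATIVE_INFINITY;
            for (Entry e : group) {
                low = Math.min(low, e.low);
                high = Math.max(high, e.high);
            }
            parents.add(new Entry(low, high, group));
        }
        return parents;
    }

    // Bulk load a non-empty list of items; `capacity` stands in for the
    // number of entries that fit into a fixed-size (e.g. 4 KiB) node.
    public static Entry bulkLoad(List<Entry> items, int capacity) {
        List<Entry> level = new ArrayList<>(items);
        level.sort(Comparator.comparingDouble((Entry e) -> e.low)); // total order
        while (level.size() > 1) {
            level = packLevel(level, capacity);
        }
        return level.get(0); // the root
    }
}
```

Because the input is totally ordered by the first dimension, a scan over the leaves yields results sorted by that dimension, which is exactly the property the paper above exploits.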

The tree is far from optimal, but it is a good starting point. Currently only doubles are supported. In the future I'd like to support integers, fixed-size decimals and also strings.

If you have a look at the source code and cringe because of the coding style, feel free to submit pull requests (my current C++ skills are surely limited).

Next steps

Currently it's a fork of RocksDB, which surely isn't ideal. I already mentioned in last year's FOSS4G talk about the R-tree in RocksDB (warning: autoplay) that there are several possibilities:

  • Best (unlikely): Upstream merge
  • Good: Add-on without additional patches
  • Still OK: Be an easy to maintain fork
  • Worst case: Stay a fork

I hope to work together with the RocksDB folks to find a way to make such extensions easily possible with no (or minimal) code changes. Perhaps by having stable interfaces or classes that can easily be overridden.

Categories: OSGeo Planet

gvSIG Team: Learning GIS with Game of Thrones (V): Editing tables

OSGeo Planet - Tue, 2017-02-14 12:22

We are going to continue with the introductory GIS course based on Game of Thrones. In this post we are going to review the alphanumeric editing tools. Using the “Political” layer, which contains the kingdoms of the continent called “Westeros”, we are going to complete the original alphanumeric information with the words (motto) of each reigning house, plus two fields that will allow us to see (in a later post) how the “Hyperlink” tool works.

Are you ready?

Once we have opened our project, we make the “Political” layer active and open its attribute table, as we saw in the “Tables” post. This table contains 3 fields: id, name (name of the kingdom) and ClaimedBy. We are now going to start editing and add three additional fields.

To start editing we go to the “Table/Start editing” menu or press the corresponding button:

If you have the View visible, you will see that the name of the layer (“Political”) is now shown in red, which indicates that the layer is in editing mode.


We are going to add the three columns one by one. There are several ways to do it; we are going to use the easiest one, the tool in the “Table/Add column” menu or its corresponding button:

When we press the button a new window appears where we can define: field name, type, length (maximum number of characters), precision (only for numeric fields) and default value (optional; cells will be left empty if we don't write anything here).


The values of the new three fields to create will be:

  • Name: Words, Type: String, Length: 50
  • Name: Shield, Type: String, Length: 100
  • Name: Web, Type: String, Length: 100

Once the three fields are added our table will be like this one:


Now we can start to fill in the cells with the data for each kingdom. To do that, we just double-click on the corresponding cell and start typing.

For the “Words” field we will add the following mottos for each of the reigning houses:

  • Tully: “Family, Duty, Honor”

  • Stark: “Winter is Coming”

  • Greyjoy: “What Is Dead May Never Die”

  • Martell: “Unbowed, Unbent, Unbroken”

  • Baratheon: “Ours is the Fury”

  • Arryn: “As High as Honor”

  • Lannister: “A Lannister Always Pays His Debts”

  • Targaryen: “Fire and Blood”

  • Tyrell: “Growing Strong”

Results will be similar to these ones:


As mentioned, we will work with the other two fields in a later post about hyperlinks. So we leave the table's editing mode from the “Table/Stop editing” menu or its corresponding button:

Before finishing, it's important to mention that there's a tool that allows us to edit the alphanumeric values of a layer directly from the View. Sometimes it can save us time when updating data.

To check it, from our View and with the “Political” layer active, we press the “Attribute editor” button:

To use it we click on the element to edit, and a new window opens with its alphanumeric attributes, which we can modify.

Test it and check how it works. To finish, press the “Finish editing” button in that window.

See you in the next posts to continue learning…

Filed under: gvSIG Desktop
Categories: OSGeo Planet

Andrea Antonello: Wrap up of the geopaparazzi code sprint in Valencia

OSGeo Planet - Tue, 2017-02-14 08:42
This will be a bit long and a bit for developers. But it might be a good read for anyone interested in the future of geopaparazzi.

Last week we met up with the guys from Scolab to investigate, develop and plan future geopaparazzi activities.

We had a way too big agenda, but we were positive we could get through at least part of it:
  • investigate a map renderer upgrade (mapsforge 0.7.0 or Nasa World Wind Android)
  • make geopaparazzi pluggable to allow easier branding and customization
  • investigate the possibility to use forms also for spatialite layers/features

Investigation of a map renderer upgrade

This is not exactly strategic at this time, but it would be good to have. Geopaparazzi now has support for some basic feature editing, and the current workarounds to make this work are not all that nice.

So we gave NASA World Wind a good test. NWW is easy to understand, nicely coded and allows for a clean integration of mapping tools. It has built-in support for WMS and it really looks as if it has all we need. There are a few problems, though. It seems to support only the WGS84 geographic projection. We tried to implement a mapsforge offline maps provider, which worked out well, but was then impossible to finalize due to the missing Mercator projection.

All the created code is available on my NWW clone. You can find the simple create lines tool here and the mapsforge integration here.

Another problem of the NWW project is the low response rate on its forum. It is an open source project, so we can't blame anyone, but it sure has an impact on the choice. I posted two questions on the Android support forum, but there has been no reaction at all. It really is a pity, because we would love to use that project as the next geopaparazzi renderer.

We also gave mapsforge 0.7.0 (the current version) a go. There are a ton of examples available in the demo app. One problem is that the demo app crashes constantly while switching between examples. The other is that the API has changed a lot from the version we are using in geopaparazzi. That means we would have to start from scratch.

We had to stop on this due to time constraints. We now have more insight, but no clear ideas at all. This is a major work that needs to be done at some point but can't be done without the proper resources. Should we have them at some point, this investigation will sure help to get started.

Geopaparazzi plugins system

We then started to investigate possibilities to make plugins installable from the Play Store. This proved to be quite difficult and resource-demanding while trying to keep plugins thin and generic.

So we decided to make a first intermediate step. We created a plugin system that would be based on intent services and libraries. This means that it is possible to package a version of geopaparazzi that is branded and presents functionalities that the official version does not.

Branding isn't actually a plugin, but it falls into this pot of customization anyway, so I will quickly explain it.

Branding

To brand geopaparazzi with your own name and style, it is now possible to create a simple, minimalistic Android application.

Previously the geopaparazzi application was completely contained in the Android module named geopaparazzi.app, while only the reusable pieces of code were in their own modules:
  • geopaparazzilibrary
  • geopaparazzimapsforge
  • geopaparazzimarkerslib
  • geopaparazzispatialitelibrary
Now the logic of the app has been moved to the module geopaparazzi_core, while a minimalistic app wrapper is contained in the geopaparazzi.app module.

Looking into the wrapper module shows that there is only one class, containing:

public class GeopaparazziActivity extends GeopaparazziCoreActivity {
}
This means we just extend the main class and that is it.

In the same module we can define the app name and a custom style, as well as a custom icon for the app.

This minimalistic module makes it possible to maintain your own branded app in a very simple way.

Sure, plugins are necessary to make it really yours. :-)

Since the refactoring was in progress, we also decided to make the core module less dependent on the company that gave birth to the project, HydroloGIS. This module was the last one containing the eu.hydrologis.geopaparazzi namespace, which was changed to eu.geopaparazzi to better stress the importance of openness. So if you were depending on this code, you will need to change the imports, removing the reference to hydrologis.
Plugins

As written before, the plugin system is based on intent services. During this code sprint we created the first two extension points, which allow customizing the import and export menus and actions.

There is now a folder named plugins that contains the available import and export plugins. If you do not include these in your app, the import and export views will be empty.

It is quite easy to create a plugin: just have a look at the simple ones created in the plugins folder. If you need help, please write to the geopaparazzi mailing list.
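To make the extension-point idea concrete, here is a deliberately simplified, non-Android model in plain Java. All names are illustrative and this is not the actual geopaparazzi API (which is based on Android intent services); it only shows how the core can build its import view from whatever plugins happen to be packaged:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified model of a menu extension point: each plugin contributes a
// menu entry and the action behind it; the core only knows the interface.
public class PluginModel {

    public interface ImportPlugin {
        String menuLabel();   // entry shown in the import view
        String runImport();   // action executed when the entry is chosen
    }

    // An example plugin, as it could be packaged in the plugins folder.
    public static class GpxImportPlugin implements ImportPlugin {
        public String menuLabel() { return "Import GPX"; }
        public String runImport() { return "gpx imported"; }
    }

    // The core builds the import view from the registered plugins; if no
    // plugins are packaged with the app, the view is simply empty.
    public static List<String> buildImportMenu(List<ImportPlugin> plugins) {
        List<String> labels = new ArrayList<>();
        for (ImportPlugin p : plugins) {
            labels.add(p.menuLabel());
        }
        return labels;
    }
}
```

The design choice is the usual one for extension points: the core depends only on the interface, so a branded app decides which concrete plugins to package at build time.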

gvSIG Mobile

The result of this first implementation of plugins and branding is the first version of gvSIG Mobile.

While geopaparazzi will always exist and be developed, gvSIG Mobile is the app maintained by the gvSIG Association as the mobile solution of their stack:


As you can see from the screenshots, it is quite simple to brand the app with a custom style and name.

Right now geopaparazzi and gvSIG Mobile are very similar, but this will change with the use of the plugin system. Actually, there is already one big difference between them: gvSIG Mobile can synchronize spatialite databases with gvSIG Online, which makes it possible to centralize data surveys.

Soon Scolab will also add the possibility to synchronize geopaparazzi projects with gvSIG Online to create online projects. This will make surveying even more fun and simple.

With time and resources (based on the jobs we do around this) we will slowly add extension points to provide dashboard actions, context menu entries and even map tools.

Forms for spatialite layers

This should have been an investigation of the effort necessary to allow the use of geopaparazzi's complex forms also for spatialite layers. Also, it should be possible to create a tool for simple form creation in gvSIG Online.

Sadly we didn't even have time to talk about this during the code sprint; time was too short.

Wrap up

It has been good to sit down with other developers and work together on common goals for a tiny project like geopaparazzi. I see the project growing slowly but constantly, which fills my heart with joy. The creation of the twin gvSIG Mobile is an important step and makes geopaparazzi the first choice in a GIS stack that is used all over the world.

Well, we'll see what the future brings. :-)

A quick image of the first visualization of gvSIG Mobile on an Android phone (one day this image will be important :-D ). With Jose^2 and Alvaro. Cesar is hidden somewhere :-)

Categories: OSGeo Planet