OSGeo Planet

Marco Bernasocchi: Interlis translation

OSGeo Planet - Fri, 2017-12-01 07:18
Lately, I have been confronted with the need to translate Interlis files (from French to German) in order to use queries originally developed for German data. I decided to create an automated converter for Interlis (version 1) Transfer Format files (.ITF) based…
Categories: OSGeo Planet

gvSIG Team: GIS applied to Municipal Management: Module 15 ‘gvSIG 3D’

OSGeo Planet - Thu, 2017-11-30 08:28

Module 15 of the GIS for Municipal Management course is now available. In it we look at the main features of the 3D component of gvSIG, which is based on NASA's WorldWind application.

In this module we will learn how to create a 3D View from a 2D View. The views we can create are flat views, for when we work on more local areas, and spherical views, for when we want to see the data on the shape of the globe.

In 3D Views we can display digital terrain models, on top of which we can overlay any other layer, such as an orthophoto.

On the other hand, if we have a vector layer of buildings with a field containing the number of storeys or the building height, we can apply extrusion, so that the polygons are raised vertically as if they were the buildings themselves; this makes it easy to visualize the structure of the town.


Finally, another feature available in the 3D component is the creation of animations. This is done by capturing the screen at certain frames and then creating a video that automatically interpolates between the different frames. This can be useful for a presentation, when we want to show the different areas of our town in more detail.

The cartography used in this video can be downloaded from the following link.

The video for this module is the following:

Related posts:


Filed under: gvSIG Desktop
Categories: OSGeo Planet

Jo Cook: Portable GIS accepted as OSGeo Community Project

OSGeo Planet - Wed, 2017-11-29 09:22
I’m delighted to announce that Portable GIS has been accepted as an official OSGeo Community Project! From a technical perspective, this is the culmination of several months’ work behind the scenes getting the proper code repository set up here, creating the website, improving the documentation, and formalising the open source license. As a colleague said recently, Portable GIS has moved from being (effectively) freeware to proper open source. So, there are now official guidelines on how to contribute to Portable GIS development, and on the license terms under which you can use and contribute.
Categories: OSGeo Planet

gvSIG Team: On sustainable cities, a reflection from #ConamaLocalVLC

OSGeo Planet - Tue, 2017-11-28 14:48

These days Valencia is hosting an event organized by CONAMA whose aim is to be a forum for debate and joint work between professionals and local administrations. The sessions are proving very enriching, perhaps largely due to the commitment that all the attendees already share to making our cities more sustainable.

When sustainability is discussed, there is constant reference to the central themes of the event: climate change, sustainable development, transformation of values, good practices and innovative models, collaboration between sectors, institutions and society, the new economy and, also, synergies with technology. There is talk of guaranteeing a fair transition, so that the changes set in motion become an opportunity for everyone, not just for a few, and no sector is left worse off.

Let us move from the general to the particular. In the workshop called ‘City Makers’, 8 main topics were defined. Among them was ‘Technological Sovereignty’.

The role of technology at an environmental event is interesting. 21st century. We are starting to become aware of its importance.

It may seem obvious, but let us repeat it: there will be no sustainable cities if sustainability is not applied to technology; there will be no collaboration without a commitment to the only kind of software that allows it; and that new economy or new productive model… will never reach the technology sector if cities do not make a clear commitment to free software. 21st century. Let us not downplay its crucial importance.

These days there is hardly a presentation without a Geographic Information System behind it: map after map representing and analysing municipal information on the road towards sustainability. The management of geographic information has become commonplace. We have free software for it, and more and more municipalities are working with the gvSIG Suite, with desktop, mobile and web solutions (geoportals, Spatial Data Infrastructures).

We must invest in sustainability, including technological sustainability.

Sustainable cities will be those that are also technologically sovereign cities or, in other words, technologically sustainable.


Filed under: gvSIG Suite, opinion, spanish Tagged: cambio climático, ciudades sostenibles, CONAMA
Categories: OSGeo Planet

Paul Ramsey: Nested Loop Join with FDW

OSGeo Planet - Mon, 2017-11-27 16:00

Update: See below, but I didn’t test the full pushdown case, and the result is pretty awesome.

I have been wondering for a while if Postgres would correctly plan a spatial join over FDW, in which one table was local and one was remote. The specific use case would be “keeping a large pile of data on one side of the link, and joining to it”.

Because spatial joins always plan out to a “nested loop” execution, where one table is chosen to drive the loop, and the other to be filtered on the rows from the driver, there’s nothing to prevent the kind of remote execution I was looking for.

I set up my favourite spatial join test: BC voting areas against BC electoral districts, with local and remote versions of both tables.

CREATE EXTENSION postgres_fdw;

-- Loopback foreign server connects back to
-- this same database
CREATE SERVER test
FOREIGN DATA WRAPPER postgres_fdw
OPTIONS (
host '127.0.0.1',
dbname 'test',
extensions 'postgis'
);

CREATE USER MAPPING FOR pramsey
SERVER test
OPTIONS (user 'pramsey', password '');

-- Foreign versions of the local tables
CREATE FOREIGN TABLE ed_2013_fdw
(
gid integer,
edname text,
edabbr text,
geom geometry(MultiPolygon,4326)
) SERVER test
OPTIONS (
table_name 'ed_2013',
use_remote_estimate 'true');

CREATE FOREIGN TABLE va_2013_fdw
(
gid integer OPTIONS (column_name 'gid'),
id text OPTIONS (column_name 'id'),
vaabbr text OPTIONS (column_name 'vaabbr'),
edabbr text OPTIONS (column_name 'edabbr'),
geom geometry(MultiPolygon,4326) OPTIONS (column_name 'geom')
) SERVER test
OPTIONS (
table_name 'va_2013',
use_remote_estimate 'true');

The key option here is use_remote_estimate set to true. This tells postgres_fdw to query the remote server for an estimate of the remote table selectivity, which is then fed into the planner. Without use_remote_estimate, PostgreSQL will generate a terrible plan that pulls the contents of the va_2013_fdw table locally before joining.
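
As an aside, use_remote_estimate does not have to be baked into the table definition; postgres_fdw also accepts it as a server-level option, and it can be added to an existing foreign table with the standard OPTIONS syntax. A minimal sketch, reusing the server and table names from above:

-- Server-wide default for all foreign tables on this server
ALTER SERVER test
OPTIONS (ADD use_remote_estimate 'true');

-- Or per table, for a foreign table created without the option
ALTER FOREIGN TABLE va_2013_fdw
OPTIONS (ADD use_remote_estimate 'true');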

With use_remote_estimate in place, the plan is just right:

SELECT count(*), e.edabbr
FROM ed_2013 e
JOIN va_2013_fdw v
ON ST_Intersects(e.geom, v.geom)
WHERE e.edabbr in ('VTB', 'VTS')
GROUP BY e.edabbr;

GroupAggregate (cost=241.14..241.21 rows=2 width=12)
  Output: count(*), e.edabbr
  Group Key: e.edabbr
  ->  Sort (cost=241.14..241.16 rows=6 width=4)
        Output: e.edabbr
        Sort Key: e.edabbr
        ->  Nested Loop (cost=100.17..241.06 rows=6 width=4)
              Output: e.edabbr
              ->  Seq Scan on public.ed_2013 e (cost=0.00..22.06 rows=2 width=158496)
                    Output: e.gid, e.edname, e.edabbr, e.geom
                    Filter: ((e.edabbr)::text = ANY ('{VTB,VTS}'::text[]))
              ->  Foreign Scan on public.va_2013_fdw v (cost=100.17..109.49 rows=1 width=4236)
                    Output: v.gid, v.id, v.vaabbr, v.edabbr, v.geom
                    Remote SQL: SELECT geom FROM public.va_2013 WHERE (($1::public.geometry(MultiPolygon,4326) OPERATOR(public.&&) geom)) AND (public._st_intersects($1::public.geometry(MultiPolygon,4326), geom))

For FDW drivers other than postgres_fdw this means there’s a benefit to going to the trouble to support the FDW estimation callbacks, though the lack of exposed estimation functions in a lot of back-ends may mean the support will be ugly hacks and hard-coded nonsense. PostgreSQL is pretty unique in exposing fine-grained information about table statistics.

Update

One “bad” thing about the join pushdown plan above is that it still pulls all the resultant records back to the source before aggregating them, so there’s a missed opportunity there. However, if both the tables in the join condition are remote, the system will correctly plan the query as a remote join and aggregation.

SELECT count(*), e.edabbr
FROM ed_2013_fdw e
JOIN va_2013_fdw v
ON ST_Intersects(e.geom, v.geom)
WHERE e.edabbr in ('VTB', 'VTS')
GROUP BY e.edabbr;

Foreign Scan (cost=157.20..157.26 rows=1 width=40) (actual time=32.750..32.752 rows=2 loops=1)
  Output: (count(*)), e.edabbr
  Relations: Aggregate on ((public.ed_2013_fdw e) INNER JOIN (public.va_2013_fdw v))
  Remote SQL: SELECT count(*), r1.edabbr FROM (public.ed_2013 r1 INNER JOIN public.va_2013 r2 ON (((r1.geom OPERATOR(public.&&) r2.geom)) AND (public._st_intersects(r1.geom, r2.geom)) AND ((r1.edabbr = ANY ('{VTB,VTS}'::text[]))))) GROUP BY r1.edabbr
Planning time: 12.752 ms
Execution time: 33.145 ms
Categories: OSGeo Planet

gvSIG Team: GIS applied to Municipal Management: Module 14 ‘Georeferencing images’

OSGeo Planet - Mon, 2017-11-27 08:34

Module 14 of the GIS for Municipal Management course is now available. In it we look at how to georeference an image.

A town council may sometimes have an image that is not georeferenced. It may also have an old paper map whose data we need in order, for example, to carry out some analysis in our desktop application, such as delimiting the municipal boundary in detail. That paper map could be scanned, leaving us with an image on our hard drive.

Such images have no coordinates, so if we inserted them into a View they would be placed at coordinates ‘0,0’ and would not overlap with our georeferenced cartography.


To georeference such an image we need georeferenced reference cartography, so that we can mark control points on that cartography and their counterparts on the image to be georeferenced. It can also be done if we have a table of coordinates of the georeferenced control points.

The cartography used in this video can be downloaded from the following link.

The video for this module is the following:

Related posts:


Filed under: gvSIG Desktop, spanish, training Tagged: ayuntamientos, georreferenciación, gestión municipal, imágenes
Categories: OSGeo Planet

Cameron Shorter: Tackling the Open Source dilemma

OSGeo Planet - Sat, 2017-11-25 22:55



Here is the dilemma that you and your boss are faced with when considering Open Source:
Looked at through the lens of traditional management, Open Source collaboration is time-consuming, imprecise, unreliable, hard to manage, rarely addresses short-term objectives, and hard to quantify in a business case. And yet, in a digital economy, collaborative communities regularly out-innovate and out-compete closed or centrally controlled initiatives. So how do we justify following a more effective, sustainable, open and equitable strategy?



This is what we will be covering today:
  • The digital economy,
  • Complexity,
  • Trust,
  • Innovation and Obsolescence,
  • and what leads to Success or Failure.


The first thing to recognise is that the Digital Economy has fundamentally changed the rules of business. Ignore this at your own peril.
Zero Duplication Costs and the Connectivity of the Internet have led to Wicked Complexity, Rapid Innovation, and on the flip side, Rapid Obsolescence.

Let’s start by talking about Complexity.
Software systems have become huge, interdependent and complex.
It is no longer possible for one person to understand all of a system’s intricacies.
So decision makers need to assume, deduce and trust information provided by others.
It means that sourcing trustworthy advice has become a key criterion for success in the digital economy.
So how do we assess trustworthiness?

It turns out we all make use of a variant of the same trustworthiness equation; a common formulation is sketched after the list below.

  • We trust people who are credible and who have a track record of providing reliable advice in the past.
  • We trust people who are open and transparent.
  • We trust ourselves, our family, our friends, because they look out for us, and we look out for them.
  • We are suspicious of people who stand to gain from advice they give us.
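
A common formulation consistent with these points (assuming this is the variant shown on the slide) is:

Trustworthiness = (Credibility + Reliability + Intimacy) / Self-orientation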

We also trust processes.
  • We trust that the democratic process leads to fair governance and management of resources.
  • We trust that the scientific method leads to reliable research that we should act upon. I believe that climate change is happening and that we need to do something about it, despite the weather seeming pretty similar to me over the last 40 years.
  • We trust that the “survival of the fittest” competition of market economies leads to better products.

But we also know that all processes can be gamed.
And the more complex a system, the easier it is to bamboozle people and game the system.

Part of the reason Open Source has been so successful is that its characteristics lead to trustworthiness.
These characteristics include:
  • Freedom,
  • Altruism,
  • Openness,
  • Meritocracy,
  • and Do-ocracy.
Let’s break these down one by one.

Open source, by definition, provides the receivers of the software with the four freedoms:
  1. Freedom to use the software unencumbered; 
  2. Freedom to study the source code and find out how it works; 
  3. Freedom to modify, retask, and improve the code;
  4. Freedom to copy and share with others.
Providing such a valuable gift, which provides significantly more value to the receiver than to the giver, increases the trustworthiness of the giver.

Additionally, openness and transparency are almost universally applied to all Open Source development practices and communication.
  • Conversations are public; Everyone has the opportunity to join and contribute; 
  • Decisions are made openly; 
  • Issues and limitations are published and shared.
Being transparent and open to public critique reduces the potential for hidden agendas and creates trustworthiness.

In a meritocracy, the best ideas win, no matter who suggests them. It is the sign of an egalitarian community rather than a hierarchical or dysfunctional one.


With a do-ocracy the person motivated to do the work decides what gets done. In complex systems, the person closest to the problem will usually be best qualified to make the technical decisions.



A key strategy for managing complexity is to divide large systems into modular subsystems.
Using modular architectures, connected by open standards:
  • Reduces system complexity,
  • Enables interoperability,
  • Which reduces technical risk,
  • And facilitates sustained innovation.
It means you can improve one module without impacting the rest of your system. This helps with maintenance, innovation, and keeping up with the latest technologies.

Collaboration is a key focus of both Open Source and Open Standards narratives. Hence, successful Open Source applications usually provide exemplary support for standards.

By comparison, from the perspective of dominant proprietary companies, it makes business sense to apply vendor lock-in tactics, making cross-vendor integration difficult. Adoption of Open Standards threatens vendor lock-in tactics, and consequently dominant vendors are often reluctant and half-hearted in their support of Open Standards.

In the digital economy there are two dominant business models which work well.
Either:
  • You solve a generic problem by supplying an awesome "category killer" application which you distribute to the world; 
  • Or you provide personalised, specialised or localised services, typically using category killer applications.
There is a natural symbiotic relationship between the two.
If you are solving a generic problem, by yourself, you will be out-innovated!
There are simply more developers in the rest of the world than you can ever muster within your team.

Because software is so time-consuming to create and so easy to copy, it is excessively prone to monopolies.
This holds true for both proprietary and open source products. A product that becomes a little better than its competitors will attract users, developers and sponsors, which in turn allows that product to grow and improve quickly, allowing it to attract more users.
This highly sensitive, positive feedback leads to successful software projects becoming “category killers”.

This means that most of the software you own will be out-innovated within a year or two.
Your software is not an asset, it is a liability needing to be updated, maintained, and integrated with other systems. It is technical debt, and you should try to own as little of it as possible. The question is: should you select Proprietary or Open Source as the alternative?


Openness democratises wealth and power, which is a good thing for all of us, even those with wealth and power. Open Source and Proprietary business models differ in how their realised value is shared. Open source licenses are structured such that multiple companies can use and support the same open source product, so the market self-corrects any tendency toward price-fixing. Effectively, Open Licenses democratise information. They enable everyone to share in the value created by technology. As software markets mature, and components become generic commodity items, the collaborative practices of Open Source become the most effective means for creating and managing functionality. Collaboration trumps Competition for commodity items!
By comparison, the ruthless competition between proprietary companies results in "winner takes all" scenarios. Many of the richest people in the world are self-made software entrepreneurs.
Jeff Bezos, who started Amazon, has recently been ranked as the richest man in the world, stealing the spot from Bill Gates, who started Microsoft. Mark Zuckerberg, who started Facebook, comes in at number 5. Jack Dangermond from ESRI is down at #603, with a mere $3.2 billion to his name.

Let's explain this another way, following the money trail. The proprietary business model favours multinationals who establish themselves in big markets such as the US or Europe.
From our Australian software spend, a small commission is provided to the local sales guy and systems integrator, and the rest is funnelled to the multinational, which often farms development out to low-cost development centres.


Open Source, on the other hand, favours local business. The software is free, so the majority of the money is spent on support and integration services, which are typically delivered locally, keeping money and expertise local.



Let’s look into the characteristics which make projects successful or not.
Open Source projects are highly susceptible to being Loved to Death. This happens when a project attracts an engaged user base without attracting matching contributions. Volunteers become overwhelmed, leaving insufficient capacity to cover essential business-as-usual tasks. Don't overload the community you depend upon. It is both bad karma and bad business. Successful projects have worked out how to either:
  • Politely say NO to “gifts” of unsupported extra code and excessive requests for help;
  • Or help users become contributors, either in kind or financially.
If your organisation isn't ready to act as a good community citizen, actively caring about the community's long-term sustainability, then you will probably have a disappointing Open Source experience. You will make self-centred, short-term decisions, and you won't get the support you need when you most need it. You will likely be better off with proprietary software. (And the community would be better off without you.) The success criteria for Open Source projects were researched by Professor Charlie Schweik, who studied thousands of projects. As you can see from this graph, most projects are abandoned. Of the remainder, most don't attract more than one or two staff, and very few attract a large community. Viewed another way, you can see that:
  • 4/5 projects are abandoned.
  • 1 in 7 remain with just 1 or 2 developers.
  • Only 1 in 100 manage to attract 10 core contributors.
(Source data.) On this graph we've drawn in the success rate for projects, and you can see that as you attract developers, your chance of long-term success increases dramatically. This is ruthless Darwinian evolution at work. Only projects of exceptional quality attract sustained growth and large communities. They fit in the "magic unicorn" category.



So how do you find these magic unicorn projects? Charlie's team distilled further insights from their research. They found that successful projects typically possess:
  • A clearly defined vision;
  • Clear utility;
  • And leaders who lead by doing.
Then projects which manage to attract a medium to large team tend to:
  • Provide fine scaled task granularity, making it easier for people to contribute;
  • And often have attracted financial backing.


To get insights into project health, you can look at Open Hub metrics. This slide is from the OSGeo-Live project and shows the status of leading Desktop GIS applications. For QGIS you can see that it has a very healthy community with over 100 active contributors. Another strong indicator of a project's success is whether it has completed an open source foundation's incubation process. The Open Source Geospatial Foundation's incubation process covers:
  • Quality
  • Openness
  • Community Health
  • Maturity
  • Sustainability


Bringing this all together into a concise elevator pitch for your boss:
  • The Digital Economy leads to High Complexity, Rapid Innovation and Rapid Obsolescence. Get with the program, or become obsolete.
  • Increased complexity requires us to trust more. So increase the value you place on trustworthiness, openness and transparency. 
  • Software is technical debt. It needs significant maintenance to remain current. Own as little of it as possible.
  • Collaboration and openness fast tracks innovation.
  • For the long term play, Collaboration trumps Competition. If you are solving a generic problem, by yourself, you will be out-innovated! Value, recognise, select and apply collaborative practices.
  • Don’t be naive, most Open Source projects fail. Learn how to pick winners.
  • Openness and Collaboration lead to the democratisation of wealth and power. Learn how to be part of the community - it makes good business sense.



  • Questions and comments are welcomed.
  • Slide deck is available online.
  • An earlier version of these slides was presented at QGIS Conference in Sydney, Australia, November 2017.
  • The text behind these slides, by Cameron Shorter, is licensed under a Creative Commons Attribution 4.0 International License
  • For those of you who already know me, I should point out that I’ve changed jobs. I now have a new enigmatic title of “Technology Demystifier” in the Information Experience team at Learnosity. And while it’s a shift away from my Open Source Geospatial roots, I plan to continue to be actively involved in Open Source.
  • If this presentation was of interest to you, then please let me know (use comments below or email address on the slide above). I enjoy hearing from people who share similar interests or are facing similar challenges, and hearing the ideas people have on related topics.
Related presentations:

Categories: OSGeo Planet

Free and Open Source GIS Ramblings: Intro to QGIS3 3D view with Viennese building data

OSGeo Planet - Sat, 2017-11-25 12:56

In this post, I want to show how to visualize building block data published by the city of Vienna in 3D using QGIS. This data is interesting due to its level of detail. For example, here you can see the Albertina landmark in the center of Vienna:

and this is the corresponding 3D visualization, including the flying roof:

To enable 3D view in QGIS 2.99 (soon to be released as QGIS 3), go to View | New 3D Map View.

Viennese building data (https://www.data.gv.at/katalog/dataset/76c2e577-268f-4a93-bccd-7d5b43b14efd) is provided as Shapefiles. (Saber Razmjooei recently published a similar post using data from New York City in ESRI Multipatch format.) You can download a copy of the Shapefile and a DEM for the same area from my dropbox. The Shapefile contains the following relevant attributes for 3D visualization:

  • O_KOTE: absolute building height measured to the roof gutter(?) (“absolute Gebäudehöhe der Dachtraufe”)
  • U_KOTE: absolute height of the lower edge of the building block if floating above ground (“absolute Überbauungshöhe unten”)
  • HOEHE_DGM: absolute height of the terrain (“absolute Geländehöhe”)
  • T_KOTE: lowest point of the terrain for the given building block (“tiefster Punkt des Geländes auf den Kanten der Gebäudeteilfläche”)

To style the 3D view in QGIS 3, I set height to “U_KOTE” and extrusion to

O_KOTE-coalesce(U_KOTE,0)

both with a default value of 0 which is used if the field or expression is NULL:

The altitude clamping setting defines how height values are interpreted. Absolute clamping is perfect for the Viennese data since all height values are provided as absolute measures from 0. Other options are “relative” and “terrain” which add given elevation values to the underlying terrain elevation. According to the source of qgs3dutils:

AltClampAbsolute, //!< Z_final = z_geometry
AltClampRelative, //!< Z_final = z_terrain + z_geometry
AltClampTerrain,  //!< Z_final = z_terrain

The gray colored polygon style shown in the map view on the top creates the illusion of shadows in the 3D view:

 

Beyond that, this example also features elevation model data which can be configured in the 3D View panel. I found it helpful to increase the terrain tile resolution (for example to 256 px) in order to get more detailed terrain renderings:

Overall, the results look pretty good. There are just a few small glitches in the rendering, as well as in the data. For example, the kiosk in front of the Albertina, which you can also see in the StreetView image, is lacking height information and therefore we can only see its "shadow" in the 3D rendering.

So far, I found 3D rendering performance very good. It works great on my PC with Nvidia graphics card. On my notebook with Intel Iris graphics, I’m unfortunately still experiencing crashes which I hope will be resolved in the future.


Categories: OSGeo Planet

gvSIG Team: gvSIG university days in 2018

OSGeo Planet - Fri, 2017-11-24 10:39

This year, 2017, we started holding one-day events at various universities. In 2018 we want to extend this practice and make gvSIG increasingly well known in the academic world.

These events consist of a series of activities held over one day (or half a day, as appropriate): introductory talks on the gvSIG suite and presentations of use cases that may interest the audience, complemented by workshops for users and developers, both general-purpose and applied to different fields (geostatistics, urban planning, criminology, …). In some cases the audience has been purely academic, and in others attendance has been open to the general public. There are many options, and they are adapted to each of these ‘gvSIG university days’.

Some universities have already confirmed that they want their gvSIG day in 2018 (and of course we will publicise these events as widely as possible).

If you would like your university to have its own gvSIG day… get in touch with us: info@gvsig.com


Filed under: spanish Tagged: universidades
Categories: OSGeo Planet

GIScussions: Dr Jekyll and Mr Hyde consider a new Geospatial Commission

OSGeo Planet - Thu, 2017-11-23 22:34

Dr Jekyll & Mr Hyde 1931 via Wikimedia

It’s November, it’s budget time, it’s that moment when geo-geeks and OpenData enthusiasts scour the hundreds of pages of budget pronouncements searching for phrases like "Ordnance Survey", "Land Registry" and "Open Data". It’s amazing how often we have got a mention in the budget or spending review publications over the past decade; you’d think that with all of the challenges the country has faced since the crash of 2008 the Chancellor would have more important things on his mind than geospatial data (don’t get me started on my list of omissions from the budget).

Yesterday, the Chancellor gave his "make or break" budget speech, and within minutes of the budget report being published twitter was humming with discussion about geospatial open data. To be honest, humming might be a bit of an exaggeration, but quite a few geo-geeks were onto this announcement:

“4.14 Geospatial data – The UK has some of the best geospatial data in the world, and much of it is held by public bodies. The potential economic value of this data is huge. To maximise the growth of the digital economy and consolidate the UK’s position as the best place to start and grow a digital business, the government will establish a new Geospatial Commission to provide strategic oversight to the various public bodies who hold this data. To further boost the digital economy, the government will work with the Ordnance Survey (OS) and the new Commission, by May 2018, to establish how to open up freely the OS MasterMap data to UK-based small businesses in particular, under an Open Government Licence or through an alternative mechanism, while maintaining the OS’s strategic strengths. The Budget provides £40 million a year over the next two years to support this work.”

Wow, the buzz words are rushing at you at 100mph! I’ve highlighted a few in red above. Well this is fantastic news, or is it? I find myself with very mixed views on this announcement (as usual with budget announcements there is very little detail so we are left to guess what is actually intended) hence the Jekyll and Hyde analogy – this could be great but on the other hand …

Some Mr Hyde-ish pedantry

Nine years after the Free Our Data campaign successfully made the case for OpenData, leading to Gordon Brown’s Damascene moment when he met Sir Tim Berners-Lee, we are about to be able to access MasterMap as OpenData (or are we? more on that in a minute). So what’s not to like about opening up MasterMap? If that is the outcome, and MasterMap becomes some form of OpenData (inevitably there will be conditions that will preclude certain usage and activities), then that has to be a good step forward. However, the devil is in the detail, and when I read that budget statement I cannot avoid reading between the lines and wondering what they really mean, or whether anyone in government really understands what they are buying into. Call me a pedant if you wish.

“the best geospatial data in the world”

A little bit of exaggeration, or evidence of the persuasive powers of OS marketing and the influence of their leadership? I think it is fair to claim that we have some of the most detailed and up-to-date large-scale mapping of any country, but does that make it the "best in the world"? For many, data that provides global coverage may be more useful/desirable than highly detailed data for one country (of course the two are not mutually exclusive).

“maximise the growth of the digital economy and consolidate the UK’s position as the best place to start and grow a digital business”

I’d suggest that for many, if not most, digital businesses success is dependent on being able to scale globally; a dependence on a highly detailed geospatial dataset that cannot be replicated across multiple geographies may be a hindrance rather than an advantage. I know of at least one ‘hot’ startup that started out thinking it would use MasterMap as a key data resource in its service and has now reconsidered as it prioritised geographic expansion over greater detail.

Call me an obsessive Remainer if you wish, but I reckon that access to talented staff from across Europe and further afield will be a bigger factor in making the UK the best place to start and grow a digital business than access to our national map data.

“The potential economic value of this data is huge”

Gosh, we have heard this one so many times in the past, but somehow it always remains "potential". The evidence base remains limited (see this from 5 years ago and this from earlier this year); the same few companies in transport and health are repeatedly cited as case studies, although they don’t seem to be generating much growth in revenues or employment, let alone profits that can be shared with wider society through taxes. A year or so after the first release of OS OpenData in 2010, the OS commissioned a study of the benefits; that study was never published. Since then there has been an NAO report and a study by the ODI.

Maybe the evidence does exist and just needs publishing; maybe there is reason to believe that it will exist in the future and we just need to understand the assumptions and modelling; but surely, after nearly 8 years, it is time to move on from an act of faith in the economic benefits of Open Data to stimulate innovation.

Of course there are other immensely important benefits arising from Open Data, e.g. transparency, accountability and societal well-being, which may be more important than the financial benefits. But we are being told that government is about to invest £80m over 2 years "to support this work"; that amount of money represents a 10% increase to the funding of 150 primary schools or … I think it is reasonable to expect some transparency from government on the basis that it is choosing to invest taxpayers’ money in geospatial data in preference to education, health, care or welfare budgets.

“establish how to open up freely the OS MasterMap data to UK-based small businesses in particular”

Well, this will be fun. I guess a group of consultants and OS management could spend a fair chunk of that £40m working out how to define a small business, how to restrict the benefit to UK-based companies, what to do when a small business grows, what to do when a small UK business gets acquired by a larger non-UK business, and how to prevent a non-UK business setting up a UK subsidiary to gain access to the "hugely valuable" "best geospatial data in the world".

The phrase "under an Open Government Licence or through an alternative mechanism" suggests that this may not be fully Open Data and that there will be constraints on what users can do with the data, which will prompt a not unjustified howl from Open Data purists, who hold that data should be free to use, re-use and combine with other data. Maybe, but I doubt that would be sustainable while larger users are expected to pay millions for annual licenses for MasterMap.

Over to Dr Jekyll

Enough of my Mr Hyde-ish doubts; let’s pause to look at some of the early comment on the announcement. Perhaps others have a better-informed and more positive outlook.

A good place to start would be the former CTO of the OS, my friend Ed Parsons (now Geospatial Technologist at Google)

This creation of the geospatial commission in the budget can only be the beginning of a major realignment of Geo in UK government… Not the easiest timing for implementing however but Good News without question. pic.twitter.com/Udv1dTUhf0

— Ed Parsons (@edparsons) November 22, 2017

Charles Arthur, the founder of the Guardian’s Free Our Data campaign and a long-term advocate of Open Data, celebrated with

Back in 2006, we at Guardian Technology launched the #FreeOurData campaign. Getting OS data available for free was a lynchpin. And this takes it further.
This is what it looks like when you have an idea that’s unstoppable. https://t.co/ChFloeA5lf

— Charles Arthur (@charlesarthur) November 23, 2017

The open data agenda gets new impetus in this Budget with this excellent announcement on opening up Geospatial Data #fitforthefuture pic.twitter.com/4qr7Ww7dp2

— Matt Hancock (@MattHancock) November 22, 2017

Michael Cross, the other founder of Free Our Data, wrote

So, a Geospatial Commission. Is this the cadastral agency hinted at in the Tory manifesto, massively watered down? Thoughts, please. #Budget2017 #OpenData

— Michael Cross (@michaelcross) November 22, 2017

While Ed Dowding said

Amazing to think that if we'd all listened to @edparsons a dozen years ago @OrdnanceSurvey could be what people use instead of google maps, and civic and public sector innovation in the UK would have had an extra decade to build amazingly detailed services.

— Ed Dowding (@eddowding) November 22, 2017

And finally this from Civil Service World

Government claims better use of location data produced by public bodies could grow the economy by £11bn a year – and has created the new Geospatial Commission to do this #Budget2017 https://t.co/ZCDcwR8Onu pic.twitter.com/umCtAKmG9e

— Civil Service World (@CSWnews) November 23, 2017

Inevitably, addresses had to spoil the party a little bit. Bob Barr, who has advocated an open address register for more than a decade, poured some cold water on the jubilation

Royal Mail’s PAF ownership MUST NOT be allowed to undermine the intention of the Geospatial Commission. One of disgraced Michael Fallon’s worst decisions.

— Robert Barr (@DrBobBarr) November 23, 2017

The Big 40

By this stage you may think that I should pay more attention to Dr Jekyll: if all these people think that the announcement is good news, why am I still hesitant? I’m baffled by the talk of £80m over 2 years; that’s an enormous amount of money and, based on any reading of OS accounts, significantly more than the value of MasterMap sales to small businesses. So what is the rest of the money for?

My skepticism is fuelled by the knowledge that when OS negotiated £20m p.a. for the initial OpenData release they bundled in a number of products whose sales were relatively minor and in several cases were in decline. It is highly debatable that government got a good deal and the lack of evidence of usage only adds to that doubt. Rumour has it that the £20m was substantially reduced during a subsequent review.

Perhaps there is a big demand for MasterMap that will be satisfied by some form of OGL availability; I hope there is more evidence for the demand than there has been for the initial data releases from the OS. In the recent OS Annual Report and Accounts they report an average of 290 downloads per day (some products are chunked, so you need to download several chunks to get national cover). The risk for government is that smaller commercial users of MasterMap will be delighted with the price reduction or elimination they receive, but that the latest colossal estimate of £11bn of benefit will not be realised. But hey, when you can throw out numbers like £11bn, a cost of £40m sounds like a tiny price to pay.

The Geospatial Commission

“Get thee gone Mr Hyde!” I say.

Maybe there is more to this new Geospatial Commission than just funding OS to make MasterMap open/free to small businesses for 2 years. The clue might be buried in the announcement ("a new Geospatial Commission to provide strategic oversight to the various public bodies who hold this data"), and a bit more info came out with this press release from the Cabinet Office and the Treasury:

“The new Geospatial Commission, supported by £40 million of new funding in each of the next two years, will drive the move to use this data more productively – unlocking up to £11 billion of extra value for the economy every year.

The new Commission will draw together HM Land Registry, the Ordnance Survey, the British Geological Survey, the Valuation Office Agency, the UK Hydrographic Office and the Coal Authority with a view to:

  • improving the access to, links between, and quality of their data
  • looking at making more geospatial data available for free and without restriction
  • setting regulation and policy in relation to geospatial data created by the public sector
  • holding individual bodies to account for delivery against the geospatial strategy
  • providing strategic oversight and direction across Whitehall and public bodies who operate in this area”

Perhaps the Geospatial Commission heralds a merger of the Land Registry and Ordnance Survey into a body similar to the cadastral bodies in several other European countries; this opens up the potential for a complete release of geospatial Open Data funded by a tiny levy on property transactions, which is what Bob Barr has been suggesting for ages.

OS data sales revenue has, in effect, been an expensive to collect, geospatial data use tax. The new Commission has an opportunity to re-think and charge those who cause the data to change, not those who want to access it.

— Robert Barr (@DrBobBarr) November 23, 2017

“That would be a fantastic outcome for Open Data advocates, transparency campaigners, innovative businesses and the public sector purse” says Dr Jekyll to a sulking Mr Hyde.

Groundhog Day

This really does feel like Groundhog Day: we keep having discussions about Ordnance Survey business models, open data, sustainability and innovation. Two years ago I wrote to the then Chancellor of the Exchequer, George Osborne, suggesting that plans to "develop options to bring private capital into the Ordnance Survey before 2020" might not be a great idea. If you can bear it you might want to re-read what I said at the time, because it feels equally applicable today as advice to Mr Hammond and his new Geospatial Commission.

“Why not just fund this team to carry on surveying etc? You currently pay the OS £80m for them to do this through a series of agreements with government, that should be more than enough to cover the costs of surveying, data management etc. You could probably save at least £10m or £20m or even more once they stopped doing things that weren’t really essential. You could make the raw data available to government and the private sector for free and without restriction potentially unlocking a chunk of the billions that people think will come from OpenData”

Removing sales, marketing, licensing, legal, other commercial staff, consultants, cartographic functions, printed map production and sales would probably result in operational costs lower than government’s current £88m expenditure with OS. We could have free and open geospatial data and potentially even reduce the taxpayers’ spend on that data! Perfect for people (or governments) who like having cake and who like eating cake. If there really is £11bn of extra value to be released for the economy from open geospatial data, then most of it should still be there without the commercial paraphernalia.

Two years on from George Osborne’s spending review, we have heard little more about introducing private capital into OS, and the proposed privatisation of the Land Registry has been canned.

Putting on my Mr Hyde hat, I wonder what, if anything, the Geospatial Commission will achieve over the next 2 years. But switching to my Dr Jekyll hat: if the Geospatial Commission used its £80m of funding to merge OS with LR, we could end up with free open geospatial data and reduced costs to taxpayers, and we might find out whether there really is this huge potential to be unleashed.

Since the budget, I have been chatting with someone who has had close operational links with OS, who remarked:

“Basically the OS operates as if it was a competing private company but it’s like Network Rail, the Met Office & the Hydrographic Office, it’s a tax payer funded monopoly … 80% or more of its income is from the taxpayer STILL. So after 20 years, the growth model has failed. Time to Reboot for the 21st Century.”

I’m not sure about the 80% but otherwise I think this sums up where we are and why we need a rethink.

A touch of irony

Regular readers of this blog will know that there is usually an image at the beginning of each post. Sometimes the connection between the content and the image is quite tenuous to say the least but on this occasion I thought I would try to find a nice MasterMap image to start the post. It isn’t easy to find Creative Commons images of MasterMap so I thought I would tweet the OS team to ask for one. Ed Parsons pointed out the irony of that!

Ironic!

— Ed Parsons (@edparsons) November 22, 2017

Unfortunately, a day later I hadn’t received a reply from anyone on the OS social media team, so I went looking for an image idea to start the post and along came Dr Jekyll and Mr Hyde, who represent my split personality on this topic: Dr Jekyll loves the idea of more open data, while Mr Hyde wonders whether we are wasting a lot of taxpayers’ money.

 

Categories: OSGeo Planet

gvSIG Team: GIS applied to Municipal Management: Module 13 ‘Maps’

OSGeo Planet - Thu, 2017-11-23 16:29

Module 13 of the GIS for Municipal Management course is now available. In it we show how to create maps with the cartography we have in our views.

The map is the document that we can print, or export to PDF or PostScript, and into which we insert the Views we have created in our project.

Into it we can insert all kinds of elements, such as text, a north arrow, a scale bar, a legend, images or logos, title blocks, charts, rectangles, lines…

The cartography used in this video can be downloaded from the following link.

The video for this module is the following:

Related posts:


Filed under: gvSIG Desktop, spanish, training Tagged: ayuntamientos, gestión municipal, layout, mapa, pdf, Salida gráfica
Categories: OSGeo Planet

Jackie Ng: FDO road test: SQL Server 2017 on Linux

OSGeo Planet - Wed, 2017-11-22 15:16
You can consider this post as the 2017 edition of this post.

So, for some background: there have been several annoyances I've personally been experiencing with the SQL Server FDO provider that have given me sufficient motivation to fix the problem right at the source (code). However, before I could go down that road, I needed to set up a local dev installation of SQL Server, as my dev environment is more geared towards MapGuide than individual FDO providers.

But just like my previous adventure with the King Oracle FDO provider, I didn't want to have to actually find/download a SQL Server installer and proceed to pollute my dev environment with a whole assortment of junk and bloat. We now live in the era of docker containers! Spinning up a SQL Server environment should be a docker pull away and when I no longer need the environment, I can cleanly blow it away without leaving lots of junk behind.

And it just so happens that with the latest release of SQL Server 2017, not only is running it inside a docker container a first-class user story, it is also the first release of SQL Server that natively runs on Linux.

So through the exercise of spinning up a SQL Server 2017 linux container we can kill multiple birds with one stone:

  • We'll know if MapGuide/FDO in its current form can work with SQL Server 2017
  • We'll also know how well it works with the Linux version of SQL Server (given its feature set is not at parity with the equivalent Windows version)
  • If MapGuide/FDO works, we'd then have a SQL Server environment ready to go which can be spun up and torn down on demand to then start fixing various problems with the FDO provider.

Spinning up the SQL Server 2017 linux docker container
This was easy because Microsoft provides an official docker image. So it was a case of just pulling down the image, setting some environment parameters to use a custom SQL Server sa login when we docker run the container, and defining port mappings so we can connect to the container from the docker host OS.
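For reference, the whole thing boils down to a couple of commands roughly like the following (the sa password is a placeholder of my choosing, and the image name/tag is the one Microsoft published for SQL Server 2017 at the time, which may have moved since):
# pull the official SQL Server 2017 Linux image
docker pull microsoft/mssql-server-linux:2017-latest
# run it detached with a custom sa password, exposing port 1433 to the docker host
docker run -d --name sql2017 \
  -e "ACCEPT_EULA=Y" \
  -e "SA_PASSWORD=Str0ng!Passw0rd" \
  -p 1433:1433 \
  microsoft/mssql-server-linux:2017-latest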
The FDO Toolbox bootstrapping test
This was an easy way to determine if the SQL Server FDO provider works with SQL Server 2017. FDO Toolbox has the ability to:
  1. Create a SQL Server data store
  2. Bulk Copy spatial data into it
  3. Query/Preview data from it
If we can do all 3 things above in FDO Toolbox against the freshly spun up SQL Server 2017 linux container, that's a very good sign that everything works.
Creating the FDO data store
FDO Toolbox has a specialized UI for creating SQL Server data stores that is accessible by right-clicking the FDO Data Sources node and choosing Create Data Store - Create SQL Server

This gives us the UI to set up a new SQL Server data store


The first real test is to see if the FDO provider can connect to our SQL Server container, which is a case of filling in all the required connection properties and clicking the Test button, which gives us:


So far so good. Now that we know the FDO provider can connect to the container, we can fill out the data store parameters and click OK to create the data store, which gave us another good sign:


Now just to be sure that the FDO provider did actually create the database, I connected to this SQL Server instance through alternative tools (such as the new SQL Operations Studio) and we can see that the database is indeed there.


So now we can bulk copy some spatial data into it, which will be a nice solid verification that the feature and schema manipulation functionality of the FDO provider work in SQL Server 2017.

So I set up a bulk copy using a whole bunch of test SHP files. A few moments later, we got another positive sign:


Again, for verification we can look at this database in a different tool and can see that the FDO provider correctly created the database tables.


And that data was actually being copied in


Just as an aside: SQL Operations Studio doesn't do spatial data previews like its big brother SQL Server Management Studio.

A shame really. Oh well, at least we can do that in FDO Toolbox :)


Which is also confirmation that FDO is getting the geometry data out of our SQL Server 2017 linux container without any problems.

So based on all these findings, I feel comfortable in saying that FDO (and applications using it like MapGuide) works just fine with SQL Server 2017, especially its Linux version.

Now to deal with these actual annoyances in the FDO provider itself ...
Categories: OSGeo Planet

Jackie Ng: An introduction to MgTileSeeder

OSGeo Planet - Wed, 2017-11-22 15:13
I previously said I'd cover this tool in a future post, and that future is now.

MgTileSeeder (introduced as a standalone companion release to MapGuide Maestro 6.0m8) is a new command-line tile seeding application that is the successor to the current MgCooker tile seeder.

This tool is the offspring of an original thought experiment about how one could possibly build a multi-threaded tile seeder using 2017-era .net libraries and tools. It turns out the actual implementation didn't differ that much from my hypothetical code sample from the original post!

But besides being a ground-up rewrite, MgTileSeeder has the following unique features over MgCooker:

  • If your MapGuide Server is 2.6 or newer, we will use CREATERUNTIMEMAP to automatically infer the required meters-per-unit value that is critical in determining how many tiles we need to actually seed.
  • MgTileSeeder is a cross-platform and self-contained .net core application taking advantage of the newly netstandard-ized Maestro API.
  • More importantly, MgTileSeeder finally supports seeding of XYZ tilesets. In fact, the way this support has been designed, you can use MgTileSeeder as a generic tile cache seeder for any XYZ tileset, not just ones served by MapGuide itself.
Seeding standard tiled maps
The minimal command to start seeding a tiled map is simply:
MgTileSeeder mapguide -m <mapagent url> --map <map definition or tile set definition id>

Here's an example MgTileSeeder invocation to seed a tile set
MgTileSeeder mapguide -m http://localhost/mapguide/mapagent/mapagent.fcgi --map Library://Samples/Sheboygan/TileSets/Sheboygan.TileSetDefinition
This will use CREATERUNTIMEMAP to auto-infer the required meters-per-unit (for tile sets, we make a temporary Map Definition that links to the tile set and run CREATERUNTIMEMAP against that) and then proceed to display a running progress readout that updates every second:

There are other options available, such as:
  • Restricting tile seeding to a specific extent
  • Restricting tile seeding to specific base layer groups
  • Manually passing in the meters-per-unit value

Seeding XYZ tile sets
Seeding XYZ tile sets uses a completely different set of parameters. The minimal command to seed an XYZ tile set is:
MgTileSeeder xyz --url <url template> --minx <min longitude> --miny <min latitude> --maxx <max longitude> --maxy <max latitude>
An example of tiling an XYZ tile set (eg. Library://Samples/Sheboygan/TileSets/SheboyganXYZ.TileSetDefinition) in MapGuide would look like this:
MgTileSeeder xyz --url "http://localhost/mapguide/mapagent/mapagent.fcgi?OPERATION=GETTILEIMAGE&VERSION=1.2.0&CLIENTAGENT=OpenLayers&USERNAME=Anonymous&MAPDEFINITION=Library://Samples/Sheboygan/TileSets/SheboyganXYZ.TileSetDefinition&BASEMAPLAYERGROUPNAME=Base+Layer+Group&TILECOL={y}&TILEROW={x}&SCALEINDEX={z}" --minx -87.7978 --miny 43.6868 --maxx -87.6645 --maxy 43.8037
Unlike the standard tiling mode, you are required to define the bounds (in lat/long) of the area you wish to seed. You can also see here that the XYZ tiling mode accepts any arbitrary URL that has {x}, {y} and {z} placeholders. This means you can use MgTileSeeder for tiling any XYZ tile set (eg. your own custom OpenStreetMap tile set), not just ones served by MapGuide. You just need to make sure your URL provides the required XYZ placeholders.
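For example, a hypothetical third-party XYZ source (the host below is made up, and the bounds are simply reused from the Sheboygan example above) could be seeded with:
MgTileSeeder xyz --url "https://tiles.example.com/{z}/{x}/{y}.png" --minx -87.7978 --miny 43.6868 --maxx -87.6645 --maxy 43.8037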


And that concludes our introduction to the MgTileSeeder tool.
Happy tiling!

Categories: OSGeo Planet

Free and Open Source GIS Ramblings: Movement data in GIS #11: FOSS4G2017 talk recordings

OSGeo Planet - Wed, 2017-11-22 12:08

Many of the topics I’ve covered in recent "Movement data in GIS" posts have also been discussed at this year’s FOSS4G. Here’s a list of videos for you to learn more about the OGC Moving Features standard, modelling AIS data with FOSS, and more:

1. Introduction to the OGC Moving Features standard presented by Kyoung-Sook Kim from the Artificial Intelligence Research Center, Japan:

Another Perspective View of Cesium for OGC Moving Features from FOSS4G Boston 2017 on Vimeo.

2. Modeling AIS data using GDAL & PostGIS presented by Morten Aronsen from the Norwegian Defence Research Establishment:

Density mapping of ship traffic using FOSS4G in C# .NET from FOSS4G Boston 2017 on Vimeo.

3. 3D visualization of movement data from videos presented by Anna Petrasova from the Center for Geospatial Analysis, North Carolina State University:

Visualization and analysis of active transportation patterns derived from public webcams from FOSS4G Boston 2017 on Vimeo.

There are also a ton of Docker presentations on the FOSS4G2017 Vimeo channel, if you liked “Docker basics with Geodocker GeoServer”.

Read more:


Categories: OSGeo Planet

gvSIG Team: The new gvSIG Mobile is already available. The open source mobile GIS of the gvSIG Suite

OSGeo Planet - Wed, 2017-11-22 09:20

Excellent news for all those who need GIS applications for field data gathering: the new gvSIG Mobile is now available for Android devices and can be installed from ‘Google Play’.

gvSIG Mobile is open source software, like all gvSIG Suite solutions. It’s licensed under the GNU/GPLv3 license.

The new gvSIG Mobile is based on Geopaparazzi, with more than obvious similarities, but with a different approach that will be reflected in its evolution. gvSIG Mobile was born with the aim of providing a mobile GIS application for professionals, and it has tools that facilitate its integration with the rest of the gvSIG Suite. For example, it can import and export data from/to gvSIG Online, functionality that is already used by many of the organizations that are betting on implementing their Spatial Data Infrastructures (SDI) with this platform. They are able to carry out censuses or inventories, and to update or audit information, all of it integrated with the SDI. Likewise, the next version of gvSIG Desktop brings, among its (numerous) improvements, a plugin that will allow data to be transferred between both applications. And this is just the beginning…

Of course, gvSIG Mobile can be used independently of the rest of the gvSIG Suite components. On its own, it is a fantastic application for field data gathering. It includes a lot of functionality, yet it is very easy to use. You can gather field data, edit existing data, attach images, notes or bookmarks to geolocated elements, etc., not to mention the ability to use forms that make data gathering easier.

In the coming months we will complement the information currently available about the application with user manuals in several languages, video tutorials, etc. In addition, for this first version, until the specific gvSIG Mobile documentation is available, you can also consult all the material available about Geopaparazzi, which is fully applicable to gvSIG Mobile. And, of course, you can use the user mailing lists to ask about any doubt or problem you have with the mobile GIS that is going to become your favorite one.

For those interested in the development part, the project can be found here: https://github.com/gvSIGAssociation/gvsig-mobile

Finally, the gvSIG Association wants to thank two of our companies, HydroloGIS and Scolab, for the work they have done so that today all of us have the possibility to use gvSIG Mobile freely.

What are you waiting for to download it?

 


Filed under: development, english, gvSIG Mobile, testing Tagged: Android
Categories: OSGeo Planet

gvSIG Team: The new gvSIG Mobile is now available. The open source mobile GIS of the gvSIG Suite.

OSGeo Planet - Wed, 2017-11-22 08:02

Excellent news for everyone who needs GIS applications for field data gathering: the new gvSIG Mobile is now available for Android devices and can be installed from ‘Google Play’.

gvSIG Mobile is free software, like all the solutions of the gvSIG Suite. It is licensed under the GNU/GPLv3.

The new gvSIG Mobile is based on Geopaparazzi, with more than obvious similarities but a different approach that will be reflected in its evolution. gvSIG Mobile was created with the aim of providing a mobile GIS application for professionals and, as such, it has tools that facilitate its integration with the rest of the gvSIG Suite. For example, gvSIG Mobile can import and export data to and from gvSIG Online, functionality already used by the growing number of organizations that are choosing to implement their Spatial Data Infrastructures (SDI) with this platform: carrying out censuses or inventories, updating or auditing information, all integrated with the SDI. Likewise, the next version of gvSIG Desktop includes, among its (very numerous) improvements, a plugin that will allow data to be transferred between the two applications. And this is just the beginning…

Of course, gvSIG Mobile can be used independently of the rest of the gvSIG Suite components. On its own, it is a fantastic application for field data gathering. It offers a great deal of functionality without sacrificing ease of use. You can gather data in the field, edit existing data, attach images, notes or sketches to geopositioned elements, and so on, not to mention the ability to use forms that make data gathering easier.

In the coming months we will complement the information currently available about the application with user manuals in several languages, video tutorials, etc. In addition, for this first version, until the specific gvSIG Mobile documentation is available, you can also consult all the material available about Geopaparazzi, which is fully applicable to gvSIG Mobile. And, of course, you can use the user mailing lists to ask about any doubt or problem you have with what is going to become your favorite mobile GIS.

For those interested in the development side, the project can be found here: https://github.com/gvSIGAssociation/gvsig-mobile

Finally, the gvSIG Association wants to thank two of our companies, HydroloGIS and Scolab, for the work they have done so that today everyone can use gvSIG Mobile freely.

What are you waiting for to download it?


Filed under: gvSIG Mobile, spanish Tagged: Android
Categories: OSGeo Planet