OSGeo Planet

GIScussions: Sh*thole Geography Standup Quiz

OSGeo Planet - Fri, 2018-11-30 17:05

Shithole President - DSC07529_ep

Here is a little bit of fun for you to enjoy at a staff social, after your Xmas lunch, or in the pub one evening.

This is how the Sh*thole Geography Standup Quiz works.

“At the beginning everyone stands up. 

Quizmaster asks a question with 3 suggested answers.

If you think the answer is the first one, put your left hand up; if you think it is the second one, put both hands up; if you think it is the third one, put your right hand up. 

The QM tells you the correct answer; everyone who got it wrong sits down. If you didn’t make a choice, you also sit down. There are some other interesting stats in the speaker notes for each slide (you need to download the presentation).

Then we go again until one person is left standing – if everyone gets knocked out you can all stand up again and keep answering questions.

There is a tie breaker question at the end”

Or you could ignore all my instructions and just have a fun time after Xmas lunch, or even do the quiz on your own and see how many questions you can answer. You could tweet your score with the hashtag #ShitholeQuiz

What’s the point of the maps at the end of each slide/question? I’m not sure! The quiz was first presented at FOSS4G with Ken and we had a cartographic spin to the whole talk, I changed it to become a bit more philosophical – maybe the maps disprove some of our preconceptions. 



Don’t press the autoplay button on the quiz, or it will go into auto-advance mode, which is not the ideal way to run the quiz; use the > button to advance and the < button to go backwards if needed. If you want to download the quiz or access it directly, you can find the quiz here

Categories: OSGeo Planet

Even Rouault: SRS barn raising: 6th report

OSGeo Planet - Fri, 2018-11-30 14:05
This is the sixth progress report of the GDAL SRS barn effort. The pace of changes has not yet slowed down, with a significant part of the work still being in PROJ, plus an initial integration in GDAL.
The major news item is that RFC2, implementing the new capabilities (WKT-2 support, late-binding approach, SQLite database), has now been merged into PROJ master.
An initial integration of PROJ master into GDAL has been started in a custom GDAL branch. This includes:
  • PROJ master, which will be released as 6.0, is now a required dependency of GDAL. It actually becomes the only required external third-party dependency (if we except the copies of a few libraries, such as libtiff, libgeotiff, etc., that have traditionally been included in the GDAL source tree).
  • The dozen continuous-integration configurations have been modified to build PROJ master as a preliminary step.
  • Related to the above, we have included in PROJ a way to "version" its symbols. If PROJ is built with -DPROJ_RENAME_SYMBOLS in CFLAGS and CXXFLAGS, all its exported symbols are prefixed with "internal_". This enables GDAL to link against PROJ master while still using pre-compiled dependencies (such as libspatialite) that link against the system PROJ version, without a risk of symbol clashes. This is particularly useful for running the GDAL autotests on continuous-integration environments that use pre-packaged dependencies (or if you want to test the new GDAL without rebuilding all reverse dependencies of GDAL). This remains a hack, however, and ultimately, when PROJ 6 has been released, all reverse dependencies should be built against it. (A similar solution was successfully used in the past, when GDAL had an internal copy of libtiff 4.0 while the external libtiff used by some GDAL dependencies relied on the system libtiff 3.X.)
  • Compatibility mechanisms that were required to support older PROJ versions have been removed. In particular, runtime loading (via the dlopen() / LoadLibrary() mechanism) has been removed. It complicated the code, and users frequently ran into headaches with different PROJ versions being loaded and clashing/crashing at runtime.
  • The OGRSpatialReference class, which implements CRS manipulation in GDAL, has been modified to use the new PROJ functions to import and export between WKT and PROJ strings. Previously GDAL had such code of its own, which is now redundant with what PROJ offers. This preliminary integration led to a number of fixes on the PROJ side to ensure compatibility with GDAL's input and output for WKT 1 and PROJ strings. Besides "moving" code from GDAL to PROJ, a practical consequence is that adding a new projection method to PROJ will no longer require changes in GDAL for it to be usable for reprojection purposes.

There have been reflections on how to make the existing PROJ code use the new code developed in PROJ. A pull request is currently under review and implements:
  • the changes needed to remove the now-obsolete EPSG, IGNF, esri and esri.extra files from the data/ directory, relying instead on the proj.db database
  • making the proj_create_crs_to_crs() API use the new late-binding approach to create transformation pipelines
  • updating cs2cs to use that new API
  • listing and addressing backward-compatibility issues related to honouring official axis order
Integration in GDAL continues, with the aim of using more of the new PROJ code. Typically, the OGRSpatialReference class that models the CRS/SRS in GDAL was until now mostly a hierarchy of WKT nodes: setter methods of OGRSpatialReference would directly create/modify/delete nodes, and getter methods would query them. This approach was fine when only one WKT version had to be managed (with the caveat that it also made it easy to produce invalid WKT representations lacking mandatory nodes). However, it is no longer appropriate now that we want to support multiple WKT versions. Our goal is to make OGRSpatialReference act instead on osgeo::proj::CRS objects (and their derived classes). Switching between the two abstractions is a non-trivial task, and doing it in a big-bang approach seemed risky, so we are doing it progressively by using a dual internal modelling. An OGRSpatialReference instance maintains an osgeo::proj::CRS object as its primary source and, for operations not yet converted to the new approach, falls back to translating it internally to WKT 1 to allow direct manipulation of the nodes, and then re-translates the updated WKT 1 representation back to an osgeo::proj::CRS object. Ultimately the proportion of methods using the fallback path should decrease (it is not completely clear that we can remove all of them, since direct node manipulation is spread across a significant number of GDAL drivers). 
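The dual-modelling fallback described above can be illustrated with a toy sketch in plain Python. This is only an illustration of the pattern, not the actual GDAL C++ code, and every name in it is invented:

```python
class SpatialReferenceSketch:
    """Toy model of the dual representation: a primary CRS object,
    with a WKT1 round-trip fallback for not-yet-converted methods."""

    def __init__(self, crs_object):
        self._crs = crs_object  # primary source of truth

    def modern_operation(self):
        # Converted methods act directly on the primary CRS object.
        return self._crs["name"]

    def legacy_operation(self, new_name):
        # Not-yet-converted methods round-trip through WKT1:
        # export, manipulate the textual form, re-import.
        wkt1 = to_wkt1(self._crs)
        wkt1 = wkt1.replace(self._crs["name"], new_name)
        self._crs = from_wkt1(wkt1)


# Toy stand-ins for the WKT1 export/import (invented for the sketch).
def to_wkt1(crs):
    return 'GEOGCS["%s"]' % crs["name"]


def from_wkt1(wkt1):
    return {"name": wkt1[len('GEOGCS["'):-len('"]')]}


srs = SpatialReferenceSketch({"name": "WGS 84"})
srs.legacy_operation("NAD83")
print(srs.modern_operation())  # NAD83
```

The point of the pattern is that both kinds of methods stay consistent while the conversion proceeds method by method, at the cost of extra export/import work on the fallback path.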
The task is progressing slowly, because each change can subtly modify the final WKT 1 representation (nodes being added, the number of significant digits changing) and cause a number of unit tests to break (the GDAL autotest suite is made of 280,000 lines of Python code); each breakage must be analyzed to determine whether it reveals a bug or just an expected result that is slightly altered. Because of all the above impacts, we have decided to do an early release of GDAL master as GDAL 2.4.0 in December, with all the new features since GDAL 2.3, in order to be able to land this PROJ integration afterwards. A GDAL 2.5.0 release will hopefully follow around May 2019 with the result of the gdalbarn work.

Other side activities regarding collecting transformation grids:
  • Following a clarification from IGN France on the open licensing of their geodesy-related resources, their CRS and transformation XML registry is now processed to populate the IGNF objects in the proj.db database (the previous import used the already-processed IGNF file containing PROJ strings, which caused information loss). The associated vertical shift grids have also been converted from their text-based format to the PROJ-digestible .gtx format, integrated in the proj-datumgrid-europe package, and referenced in the database for the transformations that use them.
  • The NGS GEOID 2012B vertical grids, used to convert between NAD83 ellipsoidal heights and NAVD88 heights, have also been integrated in the proj-datumgrid-north-america package.
Categories: OSGeo Planet

Free and Open Source GIS Ramblings: TimeManager 3.0.2 released!

OSGeo Planet - Thu, 2018-11-29 17:34

Bugfix release 3.0.2 fixes an issue where “accumulate features” was broken for timestamps with milliseconds.

If you like TimeManager, know your way around setting up Travis for testing QGIS plugins, and want to help improve TimeManager stability, please get in touch!

Categories: OSGeo Planet

Fernando Quadro: QGIS Plugin Development: First Steps

OSGeo Planet - Thu, 2018-11-29 11:33

QGIS is a brilliant tool for Python-based automation, whether in the form of custom scripts or even plugins. The first steps towards writing custom code can be a bit difficult, as you need to understand the Python API, which is somewhat complex. The QGIS Plugin Development series starting today aims to build a fully functional custom plugin capable of writing attribute values from a source layer to a target layer based on their spatial proximity.

In this part, I will cover the basics that are good to know before you start.

1. Documentation

Different QGIS versions come with different Python APIs. The documentation can be found at https://qgis.org, the most recent being version 3.2. Note that if you go directly to http://qgis.org/api/, you will see the current docs.

Alternatively, you can run “apt install qgis-api-doc” on your Ubuntu system and run “python -m SimpleHTTPServer [port]” inside “/usr/share/qgis/doc/api”. You will find the documentation at “http://localhost:8000” (if you do not provide a port number), and it will be available even when you are offline.

2. Basic API Structure

Below is a brief overview of what is available within the API:

  • The qgis.core package brings all the basic objects, such as QgsMapLayer, QgsDataSourceURI, QgsFeature, etc.
  • The qgis.gui package brings GUI elements that can be used within QGIS, such as QgsMessageBar or QgsInterface
  • The qgis.analysis, qgis.networkanalysis, qgis.server and qgis.testing packages, which will not be covered in this series
  • The qgis.utils module, which comes with iface (very useful in the QGIS Python console)

3. The QGIS Python Console

Using the Python console is the easiest way to automate your QGIS workflow. It can be accessed by pressing Ctrl+Alt+P or through the menu under Plugins -> Python Console. As mentioned above, the iface object from qgis.utils is exposed by default within the console, allowing you to interact with the QGIS GUI. Try the following examples as an initial test:

iface.mapCanvas().scale()          # returns the current map scale
iface.mapCanvas().zoomScale(100)   # zoom to 1:100 scale
iface.activeLayer().name()         # get the name of the active layer
iface.activeLayer().startEditing() # start editing

That was a brief introduction to the QGIS API; in the next post we will dig deeper into the QGIS console.

Categories: OSGeo Planet

GRASS GIS: GRASS GIS 7.4.3 released

OSGeo Planet - Thu, 2018-11-29 06:27
We are pleased to announce the GRASS GIS 7.4.3 release.
Categories: OSGeo Planet

Fernando Quadro: JTS change in GeoServer

OSGeo Planet - Wed, 2018-11-28 14:05

Starting with GeoServer 2.14, the output produced by the REST resource for layer requests uses a different package name (org.locationtech instead of com.vividsolutions) for geometry types, due to the upgrade to JTS (Java Topology Suite) 1.16.0. For example:

Before:

...
<attribute>
  <name>geom</name>
  <minOccurs>0</minOccurs>
  <maxOccurs>1</maxOccurs>
  <nillable>true</nillable>
  <binding>com.vividsolutions.jts.geom.Point</binding>
</attribute>
...

After:

...
<attribute>
  <name>geom</name>
  <minOccurs>0</minOccurs>
  <maxOccurs>1</maxOccurs>
  <nillable>true</nillable>
  <binding>org.locationtech.jts.geom.Point</binding>
</attribute>
...

Any REST client that depends on this binding information must be updated to support the new names.
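One way for a client to cope with both package names is to normalize the binding before comparing it. A minimal Python sketch (the helper name is my own; the XML fragment mirrors the example above):

```python
import xml.etree.ElementTree as ET

OLD_PREFIX = "com.vividsolutions."
NEW_PREFIX = "org.locationtech."


def normalized_binding(attribute_xml):
    """Return the geometry binding from an <attribute> fragment, with
    the legacy JTS package name rewritten to the post-2.14 name."""
    binding = ET.fromstring(attribute_xml).findtext("binding")
    if binding.startswith(OLD_PREFIX):
        binding = NEW_PREFIX + binding[len(OLD_PREFIX):]
    return binding


xml_fragment = (
    "<attribute><name>geom</name>"
    "<binding>com.vividsolutions.jts.geom.Point</binding></attribute>"
)
print(normalized_binding(xml_fragment))  # org.locationtech.jts.geom.Point
```

This way the same client works against GeoServer versions before and after the JTS upgrade.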

Source: GeoServer Documentation

Categories: OSGeo Planet

Paul Ramsey: Esri and Winning

OSGeo Planet - Mon, 2018-11-26 13:00

How much winning is enough? Have you been winning so much that you’re tired of winning now?

I ask because last month I gave a talk (PDF Download) to the regional GIS association in Manitoba, about open source and open data. My talk included a few points that warned about the downsides of being beholden to a single software vendor, and it included some information about the software available in the open source geospatial ecosystem.

MGUG Keynote 2018

In a testament to the full-spectrum dominance of Esri, the audience was almost entirely made up of Esri customers. The event was sponsored by Esri. Esri had a table at the back of the room. Before giving my talk, I made a little joke that my talk would in fact include some digs at Esri (though I left the most pointed ones on the cutting room floor).

People seemed to like the talk. They laughed at my jokes. They nodded in the right places.

Among the points I made:

  • Single-vendor dominance in our field is narrowing the understanding of “what is possible” amongst practitioners in that ecosystem.
  • Maintaining a single-vendor policy dramatically reduces negotiating power with that vendor.
  • Maintaining a single-vendor policy progressively de-skills your staff, as they become dependent on a single set of tooling.
  • Practitioners have higher market value when they learn more than just the tools of one vendor, so self-interest dictates learning tools outside the single-vendor ecosystem.
  • Point’n’click GIS tools from Esri have widened access to GIS, which is a good thing, but driven down the market value of practitioners who limit themselves to those tools.

None of these points is unique to Esri – they are true of any situation where a single tool has driven competitors off the field, whether it be Adobe graphics tools or Autodesk CAD tools or Microsoft office automation tools.

Nor are any of these points indicative of any sort of ill will or malign intent on the part of Esri – they are just the systemic effects of market dominance. It is not contingent on Esri to change their behaviour or limit their success; it’s contingent on practitioners and managers to recognize the negative aspects of the situation and react accordingly.

And yet.

Esri and Winning

Despite the fact that almost all the people in the room were already their customers, that no new business would be endangered by my message, that all the students would still be taught their tools, that all the employers would still include them in job requirements, that people would continue to use the very words they choose to describe basic functions of our profession …

Despite all that, the Esri representative still went to the president of the association, complained to her about the content of my talk, and asked her to ensure that nothing I would say in my afternoon technical talk would be objectionable to him. (In the event, I made some nasty jokes about Oracle; nobody complained.)

For some of these people, no amount of winning is enough, no position of dominance is safe, no amount of market leverage is sufficient.

It’s sad and it’s dangerous.

I was reminded of this last week, meeting an old friend in Australia and learning that he’d been blackballed out of a job for recommending software that wasn’t Esri software. Esri took away his livelihood for insufficient fealty.

This is the danger of dominance.

When the local Esri rep has a better relationship with your boss than you do, do you advocate for using alternative tools? You could be limiting or even jeopardizing your career.

When Esri has locked up the local geospatial software market, do you bid an RFP with an alternative open tool set? You could lose your Esri partnership agreement and with it your ability to bid any other local contracts. Esri will make your situation clear to you.

This is the danger of dominance.

A market with only one vendor is not a market. There’s a name for it, and there’s laws against it. And yet, our profession glories in it. We celebrate “GIS day”, a marketing creation of our dominant vendor. Our publicly funded colleges and universities teach whole curricula using only Esri tools.

And we, as a profession, do not protest. We smile and nod. We accept our “free” or “discounted” trainings from Esri (comes with our site license!) and our “free” or “discounted” tickets to the Esri user conference. If we are particularly oblivious, we wonder why those open source folks never come around to market their tools to us.

We have met the enemy, and he is us.

Categories: OSGeo Planet

Nyall Dawson: Thoughts on “FOSS4G/SOTM Oceania 2018”, and the PyQGIS API improvements which it caused

OSGeo Planet - Sun, 2018-11-25 00:05

Last week the first official “FOSS4G/SOTM Oceania” conference was held at Melbourne University. This was a fantastic event, and there’s simply no way I can extend sufficient thanks to all the organisers and volunteers who put this event together. They did a brilliant job, and their efforts are even more impressive considering it was the inaugural event!

Upfront — this is not a recap of the conference (I’m sure someone else is working on a much more detailed write up of the event!), just some musings I’ve had following my experiences assisting Nathan Woodrow deliver an introductory Python for QGIS workshop he put together for the conference. In short, we both found that delivering this workshop to a group of PyQGIS newcomers was a great way for us to identify “pain points” in the PyQGIS API and areas where we need to improve. The good news is that as a direct result of the experiences during this workshop the API has been improved and streamlined! Let’s explore how:

Part of Nathan’s workshop (notes are available here) focused on a hands-on example of creating a custom QGIS “Processing” script. I’ve found that preparing workshops is guaranteed to expose a bunch of rare and tricky software bugs, and this was no exception! Unfortunately the workshop was scheduled just before the QGIS 3.4.2 patch release which fixed these bugs, but at least they’re fixed now and we can move on…

The bulk of Nathan’s example algorithm is contained within the following block (where “distance” is the length of line segments we want to chop our features up into):

for input_feature in features:
    geom = input_feature.geometry().constGet()
    if isinstance(geom, QgsLineString):
        continue
    first_part = geom.geometryN(0)
    start = 0
    end = distance
    length = first_part.length()
    while start < length:
        new_geom = first_part.curveSubstring(start, end)
        output_feature = input_feature
        output_feature.setGeometry(QgsGeometry(new_geom))
        sink.addFeature(output_feature)
        start += distance
        end += distance

There’s a lot here, but really the guts of this algorithm breaks down to one line:

new_geom = first_part.curveSubstring(start,end)

Basically, a new geometry is created for each trimmed section in the output layer by calling the “curveSubstring” method on the input geometry and passing it a start and end distance along the input line. This returns the portion of that input LineString (or CircularString, or CompoundCurve) between those distances. The PyQGIS API nicely hides the details here – you can safely call this one method and be confident that regardless of the input geometry type the result will be correct.
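To see what a substring operation like this does under the hood, here is a plain-Python equivalent for a simple polyline. This is an illustration of the geometry operation, not the QGIS API; it handles only straight segments (no curves), and the function name is my own:

```python
def substring(points, start, end):
    """Return the portion of a polyline (a list of (x, y) tuples)
    lying between distances `start` and `end` along the line."""
    result = []
    travelled = 0.0
    for (x1, y1), (x2, y2) in zip(points, points[1:]):
        seg = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        if seg == 0:
            continue  # skip zero-length segments
        # Clip [travelled, travelled + seg] to [start, end].
        lo, hi = max(start, travelled), min(end, travelled + seg)
        if lo < hi:
            for d in (lo, hi):
                t = (d - travelled) / seg
                pt = (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
                if not result or result[-1] != pt:  # avoid duplicates
                    result.append(pt)
        travelled += seg
    return result


# The first 3 units of a 10-unit horizontal line:
print(substring([(0, 0), (10, 0)], 0, 3))  # [(0.0, 0.0), (3.0, 0.0)]
```

The QGIS method does the same clipping-by-distance, but also handles CircularString and CompoundCurve inputs correctly, which is exactly why calling the one API method beats hand-rolling this.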

Unfortunately, while calling the “curveSubstring” method is elegant, all the code surrounding this call is not so elegant. As a (mostly) full-time QGIS developer myself, I tend to overlook oddities in the API. It’s easy to justify ugly API as just “how it’s always been”, and over time it’s natural to develop a type of blind spot to these issues.

Let’s start with the first ugly part of this code:

geom = input_feature.geometry().constGet()
if isinstance(geom, QgsLineString):
    continue
first_part = geom.geometryN(0)
# chop first_part into sections of desired length
...

This is rather… confusing… logic to follow. Here the script is fetching the geometry of the input feature, checking if it’s a LineString, and if it IS, then it skips that feature and continues to the next. Wait… what? It’s skipping features with LineString geometries?

Well, yes. The algorithm was written specifically for one workshop, which was using a MultiLineString layer as the demo layer. The script takes a huge shortcut here and says “if the input feature isn’t a MultiLineString, ignore it — we only know how to deal with multi-part geometries”. Immediately following this logic there’s a call to geometryN( 0 ), which returns just the first part of the MultiLineString geometry.

There are two issues here — one is that the script just plain won’t work for LineString inputs, and the second is that it ignores everything BUT the first part in the geometry. While it would be possible to fix the script and add a check for the input geometry type, put in logic to loop over all the parts of a multi-part input, etc., that’s instantly going to add a LOT of complexity or duplicate code here.

Fortunately, this was the perfect excuse to improve the PyQGIS API itself so that this kind of operation is simpler in future! Nathan and I had a debrief/brainstorm after the workshop, and as a result a new “parts iterator” has been implemented and merged to QGIS master. It’ll be available from version 3.6 on. Using the new iterator, we can simplify the script:

geom = input_feature.geometry()
for part in geom.parts():
    # chop part into sections of desired length
    ...

Win! This is simultaneously more readable, more Pythonic, and automatically works for both LineString and MultiLineString inputs (and in the case of MultiLineStrings, we now correctly handle all parts).

Here’s another pain-point. Looking at the block:

new_geom = part.curveSubstring(start, end)
output_feature = input_feature
output_feature.setGeometry(QgsGeometry(new_geom))

At first glance this looks reasonable – we use curveSubstring to get the portion of the curve, then make a copy of the input_feature as output_feature (this ensures that the features output by the algorithm maintain all the attributes from the input features), and finally set the geometry of the output_feature to be the newly calculated curve portion. The ugliness here comes in this line:

output_feature.setGeometry(QgsGeometry(new_geom))

What’s that extra QgsGeometry(…) call doing here? Without getting too sidetracked into the QGIS geometry API internals, QgsFeature.setGeometry requires a QgsGeometry argument, not the QgsAbstractGeometry subclass which is returned by curveSubstring.

This is a prime example of a “paper-cut” style issue in the PyQGIS API. Experienced developers know and understand the reasons behind this, but for newcomers to PyQGIS, it’s an obscure complexity. Fortunately the solution here was simple — and after the workshop Nathan and I added a new overload to QgsFeature.setGeometry which accepts a QgsAbstractGeometry argument. So in QGIS 3.6 this line can be simplified to:

output_feature.setGeometry(new_geom)

Or, if you wanted to make things more concise, you could put the curveSubstring call directly in here:

output_feature = input_feature
output_feature.setGeometry(part.curveSubstring(start, end))

Let’s have a look at the simplified script for QGIS 3.6:

for input_feature in features:
    geom = input_feature.geometry()
    for part in geom.parts():
        start = 0
        end = distance
        length = part.length()
        while start < length:
            output_feature = input_feature
            output_feature.setGeometry(part.curveSubstring(start, end))
            sink.addFeature(output_feature)
            start += distance
            end += distance

This is MUCH nicer, and will be much easier to explain in the next workshop! The good news is that Nathan has more niceness on the way which will further improve the process of writing QGIS Processing script algorithms. You can see some early prototypes of this work here:

“I couldn’t be at the community day but managed to knock out some of the new API on the plane on the way home. API subject to change.”

Categories: OSGeo Planet

PostGIS Development: PostGIS 2.3.8, 2.4.6

OSGeo Planet - Sat, 2018-11-24 00:00

The PostGIS development team is pleased to provide bug fix 2.3.8 and 2.4.6 for the 2.3 and 2.4 stable branches.

Continue Reading by clicking title hyperlink ..
Categories: OSGeo Planet

Fernando Quadro: GeoServer style editor now with “Full screen” mode

OSGeo Planet - Fri, 2018-11-23 10:30

GeoServer 2.14.1 comes with a new feature: the style editor page now has a “Full screen” button in the top-right corner of the window:

When pressed, the editor and the preview are displayed side by side, using the full space of the browser window:

The GeoServer styling section has improved with each release. The development team has been giving it special attention for some time now, and editing and previewing styles in GeoServer is already much easier.

Categories: OSGeo Planet

XYCarto: Wellington Elevations: Interpolating the Bathymetry

OSGeo Planet - Fri, 2018-11-23 04:39

It is important to note something from the very beginning. The interpolated bathymetry developed in this project does not reflect the actual bathymetry of the Wellington Harbour. It is my best guess based on the tools I had and the data I worked with. Furthermore, this interpolation is NOT the official product of any institution. It is an interpolation created by me only for the purposes of visualization.

welly_harbour-colour-and-aerial_FULLVIEW

Part of the goal when visualizing the Wellington landscape was to incorporate a better idea of what may be happening below the surface of the harbour. Various bathymetric scans in the past have gathered much of the information, and institutions like NIWA have done the work of visualizing that data. As for myself, I did not have access to those bathymetries; however, I did have a sounding-point data set to work with, so I set about interpolating those points.

The data set, in CSV format, was over a million points; too dense for a single interpolation. I worked out a basic plan for the interpolation based on splitting the points into a grid, interpolating the smaller bits, then reassembling the grid tiles into a uniform bathymetry.

Conversion from CSV to shp
Using the open option (-oo) switch, OGR will convert CSV to shp seamlessly:

ogr2ogr -s_srs EPSG:4167 -t_srs EPSG:4167 -oo X_POSSIBLE_NAMES=$xname* -oo Y_POSSIBLE_NAMES=$yname* -f "ESRI Shapefile" $outputshapepath/$basenme.shp $i

Gridding the Shapefile
With the shapefile in place, I next needed to break it into smaller pieces for interpolation. For now, I create the grid by hand in QGIS using the ‘Create Grid’ function. This is found under Vector > Research Tools > Create Grid. Determining a grid size that works best for the interpolation is a bit of trial and error. You want the largest size your interpolation can manage without crashing. Using the grid tool from QGIS is very convenient, in that it creates an attribute table of the xmin, xmax, ymin, ymax coordinates for each tile in the grid. These attributes become very helpful during the interpolation process.
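The same tiling could also be scripted rather than done by hand. A minimal sketch of the idea in Python (the function name and the clip-to-extent behaviour are my own choices for illustration, not what QGIS does internally):

```python
def make_grid(xmin, ymin, xmax, ymax, tile_size):
    """Split a bounding box into tile extents (xmin, ymin, xmax, ymax),
    mirroring the attributes the QGIS 'Create Grid' tool produces."""
    tiles = []
    y = ymin
    while y < ymax:
        x = xmin
        while x < xmax:
            tiles.append((x, y,
                          min(x + tile_size, xmax),   # clip edge tiles
                          min(y + tile_size, ymax)))
            x += tile_size
        y += tile_size
    return tiles


# A 2x2 grid over a 10x10 extent:
for extent in make_grid(0, 0, 10, 10, 5):
    print(extent)
```

Each tuple can then be fed straight into the per-tile interpolation command as its target extent.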

Interpolating the Points
I switched things up in the interpolation methods this time and tried out SAGA GIS. I have been looking for a while now for a fast and efficient method of interpolation that I could easily build into a scripted process, and SAGA seemed like a good tool for this. The only drawback: I had a very hard time finding examples online of how to use this tool. My workaround was to test the tool in QGIS first. I noticed that when the command ran, QGIS saved the last command in a log file. I found that log, copied out the command-line function, and began to build my SAGA command for my script from there.

Here is a look at the command I used:

saga_cmd grid_spline "Multilevel B-Spline Interpolation" -TARGET_DEFINITION 0 -SHAPES "$inputpoints" -FIELD "depth" -METHOD 0 -EPSILON 0.0001 -TARGET_USER_XMIN $xmin -TARGET_USER_XMAX $xmax -TARGET_USER_YMIN $ymin -TARGET_USER_YMAX $ymax -TARGET_USER_SIZE $reso -TARGET_USER_FITS 0 -TARGET_OUT_GRID "$rasteroutput/sdat/spline_${i}"

I tested a number of methods and landed on ‘grid_spline’ as producing the best results for the project. It was useful because it did a smooth interpolation across the large ‘nodata’ spaces.

Once the initial interpolation was complete, I needed to convert the output to GeoTIFF since SAGA exports in an .sdat format. Easy enough since GDAL_TRANSLATE recognizes the .sdat format. I then did my standard prepping and formatting for visualization:

gdal_translate "$input_sdat/IDW_${i}.sdat" "$output_tif/IDW_${i}.tif"
gdaldem hillshade -multidirectional -compute_edges "$output_tif/IDW_${i}.tif" "$output_hs/IDW_${i}.tif"
gdaladdo -ro "$output_tif/IDW_${i}.tif" 2 4 8 16 32 64 128
gdaladdo -ro "$output_hs/IDW_${i}.tif" 2 4 8 16 32 64 128

Here is a look at the interpolated harbour bathymetry, hillshaded, with the Wellington 1m DEM hillshade added over top:
welly_harbour_bw_all

And here is a look at the same bathy hillshade with coloring
welly_harbour_bw-and-aerial

Visualizing the Bathymetry
With the bathymetry complete, it was simply a matter of building it into the existing visualization I built for the Wellington Region. Learn more about the project here. The visualization was four steps:

Hillshade
addedbathy_bathyonlypng
Color
addedbathy_bathyonly_withcolor
Aerial Imagery
addedbathy_bathyonly_withcoloraerial
Then merge the models together
addedbathy_final

Easy as, eh? Let me know what you think!

Note: All imagery was produced during my time at Land Information New Zealand. Imagery licensing can be found here:
“Source: Land Information New Zealand (LINZ) and licensed by LINZ for re-use under the Creative Commons Attribution 4.0 International licence.”

Categories: OSGeo Planet

XYCarto: Building the Wellington Model with 1m DEM and DSM

OSGeo Planet - Thu, 2018-11-22 22:21

As interest in LiDAR-derived elevation data increases, so grows the interest in its capabilities. LiDAR-derived elevation data has been great for my visualization game and for helping me communicate what LiDAR can do. It all starts with a picture to get the imagination going.

wellyvation

The Wellington model derived for this project is part of an ongoing project to help increase the exposure of the Wellington 1m DEM/DSM elevation data derived from LiDAR. Step one for me is getting a working model built in QGIS, capturing still images, and increasing interest in the data.

I’ve talked about the processing of the elevation data for Wellington visualizations in the past, so for this post I’m only focusing on the blending of the data sets in building the model. This project is a good example since it encompasses a number of subtle techniques to get the model to stand out. This post is the first of a two-part series; the second post discusses the techniques used to derive and visualize the bathymetry for the surrounding harbour.

Let’s start with the base, Aerial Imagery.
wellyhabour_aerialonly

Blended with a hillshade
wellyhabour_aerial_withHS

DSM added for texture and context
wellyhabour_aerial_withDSMHS

Slope added to define some edges
wellyhabour_aerial_withDSMDEMSLOPEHS

Some darker shading added to the bathymetry to frame the elevation data
wellyhabour_aerial_withDSMDEMSLOPEHS_darkenframe

And finally some added bathymetry to lighten the edges at the shoreline enhancing the frame a bit more.
wellyhabour_aerial_withDSMDEMSLOPEHS_edgeframe

In the end there is some post-processing in Photoshop to lighten up the image. Honestly, this could have been done in QGIS, but I was being lazy. For the images produced, there was no need to retain the georeferencing, and when that is the case, I rely on Photoshop for color and light balancing.

The greatest difficulty in this project so far has been trying to create a universal model for the data set. I’m finding that as I visualize different regions using this model, I need to adjust the hillshading quite significantly to draw out different features. Take a look at the images here. It is the same model, but with noticeably different gradients used in the hillshades. The techniques used for the images in this post worked well for the urban region shown, but fall apart as you move further out into the more mountainous regions. Much of the blending is too harsh and turns the mountains into a black muddled mess. I am almost there, but like any project, it takes a good bit of subtle tweaking of the blending to get a universal image to work.

The entire base mapping work was completed in QGIS. The elevation data was processed using GDAL and the bathymetric interpolations were produced using SAGA GIS. There are no color palettes for this project; the aerial imagery does all the work in that department.
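The post doesn’t include the GDAL commands used, but hillshading a DEM is typically a `gdaldem hillshade`-style computation. A sketch of the equivalent in numpy, using the conventional 315°/45° sun position (an assumption, not the project’s actual settings):

```python
import numpy as np

def hillshade(dem, cellsize=1.0, azimuth=315.0, altitude=45.0):
    """Compute a simple 0-255 hillshade, similar in spirit to
    `gdaldem hillshade`. dem is a 2-D array of elevations."""
    az = np.radians(360.0 - azimuth + 90.0)   # convert to math-angle convention
    alt = np.radians(altitude)
    # Surface gradients via central differences (rows ~ y, columns ~ x)
    dy, dx = np.gradient(dem, cellsize)
    slope = np.arctan(np.hypot(dx, dy))
    aspect = np.arctan2(dy, -dx)
    shaded = (np.sin(alt) * np.cos(slope)
              + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return (255 * np.clip(shaded, 0, 1)).astype(np.uint8)

# A small synthetic hill to exercise the function
x = np.linspace(-1, 1, 50)
dem = 10 * np.exp(-(x[None, :] ** 2 + x[:, None] ** 2) * 4)
hs = hillshade(dem)
```

For real data you would read the DEM band into an array (e.g. via GDAL’s Python bindings) and set `cellsize` to the 1 m raster resolution.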

Base data can be found here:
DEM: https://data.linz.govt.nz/layer/53621-wellington-lidar-1m-dem-2013/
DSM: https://data.linz.govt.nz/layer/53592-wellington-lidar-1m-dsm-2013/
Aerial Imagery: https://data.linz.govt.nz/layer/51870-wellington-03m-rural-aerial-photos-2012-2013/

The next post covers the development of the bathymetry for the surrounding harbor. Thanks for having a look and let me know what you think.

Note: All imagery was produced during my time at Land Information New Zealand. Imagery licensing can be found here:
“Source: Land Information New Zealand (LINZ) and licensed by LINZ for re-use under the Creative Commons Attribution 4.0 International licence.”

Categories: OSGeo Planet

Fernando Quadro: WMS layers now support Dimensions

OSGeo Planet - Thu, 2018-11-22 14:07

GeoServer has added support for specifying dimensions on WMS layers, as described in the WMS 1.1.1 and WMS 1.3.0 standards. The WMS standards pre-define two dimensions: TIME and ELEVATION. Enabling dimensions on a layer allows users to specify them as extra parameters in GetMap requests, which is useful for building maps or animations from multidimensional data.

These settings can be enabled and configured on the “Dimensions” tab:

For each enabled dimension, the following configuration options are available:

  • Attribute – attribute name used to pick the value for this dimension (vector only). This is treated as the start of the range if an end attribute is also provided.
  • End attribute – attribute name used to pick the end of the value range for this dimension (optional, vector only).
  • Presentation – the presentation type for the available values in the capabilities document.
  • Default value – the default value to use for this dimension if none is provided with the request. Select one of four strategies:
    • Smallest domain value – uses the smallest value available in the data
    • Biggest domain value – uses the biggest value available in the data
    • Nearest to the reference value – selects the data value closest to the given reference value
    • Reference value – tries to use the given reference value as-is, whether or not it is actually available in the data.
  • Reference value – the default value specifier. Shown only for the default value strategies that use it.
  • Nearest match – whether to enable WMS nearest match support on this dimension. Currently supported only on the time dimension.
  • Acceptable interval – a maximum search distance from the specified value (available only when nearest match is enabled). It can be empty (no limit), a single value (symmetric search), or a before/after syntax specifying an asymmetric search range. Time distances must be specified using the ISO period syntax. For example, PT1H/PT0H allows searching up to one hour before the user-specified value, but not after it.

For the time dimension, the value must be in the ISO 8601 DateTime format (yyyy-MM-ddThh:mm:ss.SSSZ). For the elevation dimension, the value must be an integer or floating point number.

For the “Reference value” strategy only, it is also possible to use time ranges or elevation ranges, in the fromValue/toValue form. The same strategy also accepts relative times such as P1M/PRESENT, but be careful: the reference value is copied verbatim into the capabilities document and, as a result, not every client will recognize that syntax.
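Once a dimension is enabled, a client simply adds it as an extra GetMap parameter. A sketch of building such a request (the server URL and layer name below are hypothetical, for illustration only):

```python
from urllib.parse import urlencode

# Hypothetical GeoServer endpoint and layer name
base = "http://localhost:8080/geoserver/wms"
params = {
    "service": "WMS",
    "version": "1.3.0",
    "request": "GetMap",
    "layers": "topp:temperature",        # hypothetical layer with dimensions enabled
    "bbox": "-180,-90,180,90",
    "crs": "EPSG:4326",
    "width": 512,
    "height": 256,
    "format": "image/png",
    # Dimension parameters: a TIME instant (ISO 8601) and an ELEVATION value
    "time": "2018-11-22T00:00:00.000Z",
    "elevation": "10",
}
url = base + "?" + urlencode(params)
```

Requesting the same layer with a sequence of TIME values is the usual way to assemble an animation from multidimensional data.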

Source: GeoServer Blog

Categories: OSGeo Planet

XYCarto: The Rejects

OSGeo Planet - Thu, 2018-11-22 03:46

Sometimes there is simply not enough room for all the ideas. Sometimes you want all the images to make it to the final round.

wairarapa

In a recent project to promote some of our elevation data, I was asked to present a number of ideas for a 2000mm x 900mm wall hanging. The piece was to act as a conversation starter and demonstrate some of the finer details that elevation from LiDAR possesses.

In the end, the image above was the chosen candidate. Below are the drafts I initially presented for review. You can see the difference in treatment from the original ideas to the final product. Personally, I really enjoyed the images developed for the draft series, I liked the silvery undertones, and I thought it was a shame to merely let these images sit on my hard drive.
Below, you’ll find a brief description of a few challenges faced during image development.

near_lake_ferry
nice_farm
masterton_region
random
draft_wairarapa

Artifacts and Finer Details
The hardest part of this job was drawing out the finer details of the chosen location. There was a strong interest in showing the ancient river bed; however, without a good bit of tweaking in the hillshades, the image is quite flat. After some trial and error, I found I could get good contrast by limiting the hillshade value range to 170-190. That’s it, but the readability of the project really hinged on this simple tweak. It really made the details stand out.
That said, the gain in detail also revealed a significant artifact in the data. If you go back up and have a closer look, you will find diagonal depressions running across the images at equal intervals. These are lines from where the LiDAR scans overlap. I haven’t quite had the time to figure out how to remove these from the original data source, so for now I leave them in as a conversation piece about improving LiDAR capture practices.
As usual, all map layout work was completed in QGIS, with the bulk of the data processing done using GDAL. The ‘Reject’ images for this post are direct exports from QGIS, with no manipulation apart from some down-sampling and cropping in Photoshop.
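Limiting the hillshade to the 170-190 band amounts to a linear contrast stretch: that narrow range is mapped to the full 0-255 display range and everything outside it is clipped. A sketch of the idea in numpy (the exact QGIS min/max rendering settings are an assumption here):

```python
import numpy as np

def stretch_band(hillshade, lo=170, hi=190):
    """Stretch the narrow band [lo, hi] of an 8-bit hillshade to the full
    0-255 display range; values outside the band clip to black or white."""
    hs = hillshade.astype(np.float64)
    stretched = (hs - lo) / (hi - lo)          # map the band to [0, 1]
    return (255 * np.clip(stretched, 0, 1)).astype(np.uint8)

# Nearly flat terrain clusters around mid-grey, so a narrow stretch
# reveals subtle relief that a full-range display hides.
flat = np.array([[168, 172, 178], [181, 186, 193]], dtype=np.uint8)
out = stretch_band(flat)
```

The narrower the band, the more the subtle relief pops, and the more artifacts like the scan-overlap lines pop with it.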

Base data can be found here:
DEM: https://data.linz.govt.nz/layer/53621-wellington-lidar-1m-dem-2013/
DSM: https://data.linz.govt.nz/layer/53592-wellington-lidar-1m-dsm-2013/
Aerial Imagery: https://data.linz.govt.nz/layer/51870-wellington-03m-rural-aerial-photos-2012-2013/

I produced a public repository for some of the scripting work. This repository is not specific to the above project but does contain some of the base processing I did on the Wellington elevation data: https://github.com/IReese/wellyvation/tree/master/utils
Hope you like them, and thanks for checking in!

Note: All imagery was produced during my time at Land Information New Zealand. Imagery licensing can be found here:
“Source: Land Information New Zealand (LINZ) and licensed by LINZ for re-use under the Creative Commons Attribution 4.0 International licence.”

Categories: OSGeo Planet

PostGIS Development: PostGIS 2.2.8 EOL

OSGeo Planet - Thu, 2018-11-22 00:00

The PostGIS development team is pleased to provide bug fix 2.2.8 for the 2.2 stable branch.

This is the End-Of-Life and final release for the PostGIS 2.2 series.

We encourage you to upgrade to a newer minor PostGIS version. Refer to our Version compatibility and EOL Policy for details on versions you can upgrade to.

This release supports PostgreSQL 9.1-9.6.


Categories: OSGeo Planet

GeoServer Team: GeoServer 2.14.1 released

OSGeo Planet - Tue, 2018-11-20 23:32

We are happy to announce the release of GeoServer 2.14.1. Downloads are provided (zip|war|exe) along with docs (html|pdf) and extensions.

This is a stable release of the GeoServer 2.14 series and is recommended for all production systems. Users of prior releases of GeoServer are encouraged to upgrade.

This release is made in conjunction with GeoTools 20.1 and GeoWebCache 1.14.1. Thanks to all who contributed to this release.

For more information please see our release notes (2.14.1|2.14.0|2.14-RC).

Improvements and Fixes

This release includes a number of new features and improvements:

  • New coordinate formatting options for WFS layers
  • REST API granule management: removal of metadata only, or of everything
  • Coverage view support for indexed color model
  • Check handling of WCS 2.0 time/elevation/range subsetting
  • WFS 1.1.0 handling of gml:id
  • WFS Shapefile and GeoJSON output now support geometry with measures
  • Fixes to the start.bat and shutdown.bat scripts on Windows
About GeoServer 2.14 Series

Additional information on the GeoServer 2.14 series:

Categories: OSGeo Planet

GeoTools Team: GeoTools 20.1 Released

OSGeo Planet - Tue, 2018-11-20 23:30
The GeoTools team is happy to announce the release of GeoTools 20.1:

  • geotools-20.1-bin.zip
  • geotools-20.1-doc.zip
  • geotools-20.1-project.zip
  • geotools-20.1-userguide.zip
  • maven repository

This release is a stable release and is recommended for new development and production systems. This release is made in conjunction with GeoServer 2.14.1. Improvements: GML respects formatting options for
Categories: OSGeo Planet

GIScussions: Shithole Geography re-interpreted at KortDage

OSGeo Planet - Tue, 2018-11-20 20:18

I was invited to deliver a keynote last week at KortDage to an audience of nearly 800 Danish (and Scandinavian) GI professionals. There were just a few snags:

  • The person who invited me said “Last year Jack Dangermond gave a keynote, this year we wanted something different”. No challenge there!
  • The slot was at 9.00am on the final morning after the delegates had been up till 1.30am boogying
  • I don’t speak Danish and I wasn’t sure how my dry (I’d like to think) Brit humour would go down

So they wanted something different and they liked my #FAKEMAPS talk; I didn’t have a lot of time to come up with something completely new, so I decided to re-invent Sh*thole Geography.

Ken Field and I had done a double act at FOSS4G in Dar es Salaam around Donald Trump’s opinion that some countries were “shithole countries” whose citizens were not welcome in the US. We had riffed on the “Orange Clown” as Ken likes to call 45, we had played with UN Sustainable Development Goal stats, Ken gave a brief cartography master class on making global maps and I had created a pub style geography quiz. I could work with that and switch the focus to make a slightly different message for this audience (with apologies to Ken) – a map may not always be the best way to present global statistics.

They didn’t record the whole talk so I can’t share that with you but you can get the drift of the points I was making from this short interview that the organisers did with me after my talk.

Kortdage 2018 – Kort fortæller ikke altid sandheden from Geoforum on Vimeo.

At the end of the talk I offered Sh*thole Geography badges and stickers to the attendees in return for donations to the FOSS4G Travel Grant Programme – the audience were amazingly generous and we raised almost $500.

Apart from my talk I also got to meet Anna Webber, an artist who had made an “earth map” of Denmark

You can read more about her map and why she made it on Ken’s and my Mappery site.

Thanks to the organisers for inviting me to speak, thanks to all the people who laughed at my jokes and made me feel welcome. KortDage was a great event.

 

Categories: OSGeo Planet

gvSIG Team: First Congress of the JUST-SIDE Network: Spatial Data Infrastructures and Territorial Justice

OSGeo Planet - Tue, 2018-11-20 16:28

We are passing along the invitation to the first event organized by the JUST-SIDE network, of which the gvSIG Association is a member. We hope it is of interest to you (and if you can attend… see you there!):

The Department of Geography of the Faculty of Sciences (UdelaR) is pleased to invite you to the first congress of the JUST-SIDE network (Justice and Sustainability in the Territory through Spatial Data Infrastructures), to be held on 22 November 2018 from 14:00 to 18:00; it will address the topic of free technologies and will feature national and international speakers.

Our institution is part of this network of universities and companies from Argentina, Brazil, Chile, Costa Rica, Spain, Mexico, Portugal and Uruguay which, coordinated by the University of Coimbra (Portugal), is carrying out a project linking environmental and territorial law with access to geographic information and the development of Spatial Data Infrastructures.

The JUST-SIDE network brings together researchers with experience in law, the social sciences and geographic information technologies, and seeks to create a methodology based on a Spatial Data Infrastructure to support decision making, promoting and strengthening public policies that address social, environmental, economic, legal and democratic challenges. The initiative is funded by CYTED (the Ibero-American Programme of Science and Technology for Development) and will study 16 concrete territorial cases to improve the effectiveness and justice of public policies with territorial impact.

Those interested in attending are asked to register in advance through the following link, as places are limited; the event will take place at the Hotel Dazzler (Salón Laureles), 21 de setiembre 2752, on the corner of Luis de la Torre.

Please complete your registration.

The programme for the day is attached.

– DEPARTMENT OF GEOGRAPHY

Categories: OSGeo Planet

gvSIG Team: Presentations and workshops from the 14th International gvSIG Conference now available

OSGeo Planet - Tue, 2018-11-20 12:16

The presentations given at the 14th International gvSIG Conference, held from 24 to 26 October in Valencia (Spain), have now been published.

Recordings of the sessions and of some of the workshops are also available. The videos are in their original language, with presentations and workshops in both Spanish and English.

The workshops for which recordings are available are “gvSIG aplicado a Geología” (gvSIG applied to Geology) and “Generación de informes con gvSIG Desktop” (Report generation with gvSIG Desktop), both in Spanish, and “Environmental modelling using gvSIG and the HortonMachine” in English.

It is worth noting that the Reports workshop is the first documentation available for the plugin, published a few days ago for the upcoming gvSIG 2.5, that allows reports to be created automatically in gvSIG Desktop. Don’t miss it!

If you could not attend the conference, you now have access to the recordings of the sessions, as well as some good free learning material in the workshops.

Categories: OSGeo Planet
Syndicate content