
OpenStreetMap Blogs

Sunday, 22. February 2026

OpenStreetMap User's Diaries



weeklyOSM

weeklyOSM 813


12/02/2026-18/02/2026

lead picture

[1] | DER SPIEGEL has built its own open-source mapping stack based on MapLibre and Protomaps | © MapLibre – Protomaps – map data © OpenStreetMap Contributors.

Mapping

  • Comments are requested on the proposal bicycle_parking=absent. This tag aims to document that no bicycle parking is available around a feature, for example a shop or station, making such gaps in infrastructure discoverable in data analyses. Related discussion is also taking place on the forum.

Community

  • Clifford Snow introduced the ‘Safe Routes to School’ initiative, which supports families who want their children to walk or cycle to school in a safe way. The project focuses on identifying the most suitable routes and collaborating with local authorities to make those routes safer. The team is currently looking for additional volunteers to improve the data. Those who would like to contribute can join the #safe-routes-to-school channel on the OSM US Slack.
  • OpenCage has published an interview with Nicolas Collignon, CEO of Kale AI, a company developing urban delivery routing solutions powered by OpenStreetMap.
  • Anne-Karoline Distel showed how to add information about what is or was quarried at a quarry in OpenStreetMap, noting that such data can be valuable for historical research, whether in industrial, social, or even church history.
  • Marcelm005 has proposed a project to map the Lincolnshire ER Routes, emergency routes that enable people to quickly evacuate from flood-prone areas.
  • rtnf is trying to learn how OSM tile servers work.
  • The OpenStreetMap US community is currently deliberating on the most effective method for querying OpenStreetMap objects based on their geometric shapes.

OpenStreetMap Foundation

  • The next OSM Foundation Board meeting will take place on Thursday 26 February 2026 at 13:00 UTC. The meeting will be accessible through the video room. The topics to be covered are:
    • Chairperson’s report
    • Secretary’s report
    • Treasurer’s report
    • 2026 board face-to-face meeting update
    • French cadastre changes and release of code updates for ‘OSM components’ on GitHub
    • Creation of an OSMF Coinbase account for BTC donations
    • Potential statement/policy for OSMF’s participation with other parties in proposals for EU-funded projects, and related topics:
      • Draft blog post: OSMF approach to EU-funded project collaborations
      • Potential OSMF participation in Horizon Europe CSA (GeosTeX)
      • European Institute of Technology (EIT) communications
      • 2026 Sovereign Tech Agency call for tenders
    • Request for comments for GERS as an OGC Community Standard
    • Guest comments or questions.
  • The Marche Region (Italy) reported about its recent entry among the supporters of OpenStreetMap as a ‘Silver Member’. This is the first time that an Italian public body has officially recognised the usefulness of OpenStreetMap and decided to actively contribute to its financial support.

Local chapter news

  • OpenHistoricalMap has kicked off their first donation drive to help fund technical development and operations.

OSM research

  • A Danish survey conducted in January 2026 found that in Denmark, OpenStreetMap contained approximately 20,878 km more paths, footways, and tracks than the official Danish road network dataset, GeoDanmark Vejmidter. The survey was carried out by the Danish organisation GeoDanmark.
  • A new study published in Nature Communications uses OpenStreetMap land use and AOI data to help classify 110 million buildings across 109 Chinese cities, demonstrating how open, community-mapped data supports national-scale urban inequality analysis and evidence-based planning.

Maps

  • kafked has presented his side project rename.world on Hacker News. rename.world is a MapLibre-based map where users can click any place and propose new names. Around 40,000 renames have already been submitted; the non-commercial project runs on SvelteKit, with self-hosted vector tiles, and is explicitly not intended for navigation.
  • The Welikia project shows the native ecology of New York City (i.e. what it was before settlement), using OSM as base map. The project is maintained by the Urban Conservation team at the New York Botanical Garden.

OSM in action

  • [1] At the News-Infographics-Analytics-Maps 2026 conference in Berlin, data visualisation journalist Ferdinand Holsten presented how the German news magazine DER SPIEGEL has built its own open-source mapping stack based on MapLibre and Protomaps. This allows DER SPIEGEL to host tiles for interactive maps independently of commercial services such as Mapbox. The presentation, which is now available as a video on media.ccc.de, outlines the workflow from data preparation to tile generation and integration into interactive storytelling. Ferdinand Holsten has kindly provided us with an automatic translation of the presentation into English.
  • Jake Coppinger (UrbanSpectra) has published a map tracking community and government projects across a river catchment, utilising OpenStreetMap data (Sydney, Australia).
  • rbb24, part of public service broadcasting in Germany, naturally uses OpenStreetMap, with correct attribution, in its reporting of locations.

Open Data

  • HeiGIT reported that it has published new open and ready-to-use global risk assessment datasets, with the objective to simplify risk analysis by removing major technical barriers to data preparation. These datasets are designed for easy use with the risk assessment QGIS plugin and enable humanitarian stakeholders to conduct multi-hazard, evidence-based risk assessments to support anticipatory action.
  • Xiong et al. have published a dataset, which contains a topologically connected representation of the European high-voltage grid (220 kV to 750 kV) compiled from OpenStreetMap data extracted with overpass turbo.

Software

  • sylvester_aswin has introduced their project Map Frame, which allows users to generate minimalist map posters based on OpenStreetMap data. Any location worldwide can be selected and downloaded as a 4K PNG (3600×4800); the first poster is free, additional downloads cost one US dollar.
  • Carlos Froh introduced OnRouteMap, a web tool that helps find petrol stations, supermarkets, snack bars, drinking fountains, and similar places along your uploaded GPX track.
  • The solo developer thattechiedude, from Hudson Valley, has presented ROLLIN, a map platform rating locations from 0–100 based on six features such as wheelchair access, accessible toilets, and lifts. The project uses OpenStreetMap data, cross-references Google Places, and adds community verification, and offers a free API tier for developers.
  • Terence Eden, the developer of OpenBenches, has recently implemented a login with OpenStreetMap function in OpenBenches.
  • Ulf Rompe has developed ‘What Did You Do’, a simple tool that shows the number of OpenStreetMap edits made by each software application within a certain period of time.

Programming

  • Thomas de Wolff has introduced his Go library geo/osm on Reddit, offering fast parsing of OSM PBF files through handwritten protobuf decoding, optimised readVarint and readSint routines, and custom zlib decompression. The library can skip specific object types, generate file statistics, and extract geometries by region filter, making it suitable for building custom renderers.
  • In response to recent Overpass API service instability, Matt Whilden has developed microcosm, a GitHub Actions script that retrieves a narrow slice of OSM data and updates it nightly.
  • Andy Townsend discussed the difficulties encountered in setting up your own Overpass API server.

Did you know that …

  • … you can zoom one-handed in Organic Maps?
  • … you can easily submit a brand to the Name Suggestion Index project by using NSI Submit a Brand?
  • … there is a Mastodon instance run by and for OpenStreetMap contributors that is funded by the OSMF?

Other “geo” things

  • Emmanuel Mathot and Jonas Sølvsteen wrote on the Development Seed blog about the release of the ‘EOPF Sentinel Zarr Explorer’, a framework for spatial analysis based on Sentinel images. The cloud-based geospatial project is funded by the European Space Agency through the Copernicus Space Component programme, and was developed by a consortium led by Development Seed and EOX, with community outreach led by thriveGEO.

Upcoming Events

Country Where Venue What When
Karlsruhe Geofabrik, Amalienstraße 44, 76133 Karlsruhe Karlsruhe Hack Weekend February 2026 2026-02-21 – 2026-02-22
Belfast School of Geosciences, Queen’s University Belfast Belfast Mapathon 2026-02-21
TAK Kadıköy Tasarım Atölyesi OpenStreetMap Outdoor Editing 2026-02-21
Toulouse Artilect – 10, Rue Tripière – Toulouse Rencontre OSM Toulouse 2026-02-21
Kalyani Nagar TomTom Pune Office, India OSM Mapping Party at TomTom Pune, India 2026-02-21
Atelier Vélo Utile Rencontre OSM Saint-Brieuc 2026-02-21
Mumbai High Point restaurant, Lokhandwala Market, Andheri OSM Mumbai Mapping Party No.7 (Western Line – South) 2026-02-22
Missing Maps : Mapathon en ligne – CartONG [fr] 2026-02-23
Saint-Étienne Zoomacom Rencontre Saint-Étienne et sud Loire 2026-02-23
Olomouc Přírodovědecká fakulta Univerzity Palackého Únorový olomoucký mapathon 2026-02-24
Online Mappy Hour OSM España 2026-02-24
Derby The Brunswick, Railway Terrace, Derby East Midlands pub meet-up 2026-02-24
Berlin Online OSM-Verkehrswende #72 2026-02-24
City of Edinburgh Guildford Arms, Edinburgh OSM Edinburgh pub meetup 2026-02-24
Praha Fakulta Elektrotechnická ČVUT v Praze Missing Maps Mapathon na ČVUT v Praze 2026-02-25
Hannover Kuriosum OSM-Stammtisch Hannover 2026-02-25
Luxembourg neimënster, Luxembourg & online MSF Luxembourg hybrid Mapathon 2026-02-25
Düsseldorf Online bei https://meet.jit.si/OSM-DUS-2026 Düsseldorfer OpenStreetMap-Treffen (online) 2026-02-25
Seattle Seattle, WA, US OpenThePaths 2026: Connecting People and Places Through Sustainable Access 2026-02-26 – 2026-02-27
Essen Fahrrad-Messe Essen, Halle 5, Show-Truck Vortrag: Mitmachen bei OpenStreetMap, der Basis vieler Outdoor-Apps 2026-02-26
Milano Building 3A Ground Floor – Politecnico di Milano PoliMappers Maptedì 2026-02-26
Zürich Meta Zurich Office Mapillary: Celebrating 3 Billion Images 2026-02-26
Online Asamblea General Ordinaria – Asociación OpenStreetMap España 2026-02-26
Santa Clara Santa Clara University Friends of MSF Mapathon 2026-02-26
UN Maps Validation Friday Chat & Map 2026-02-27
Essen Fahrrad-Messe Essen, Halle 5, Show-Truck Vortrag: Mitmachen bei OpenStreetMap, der Basis vieler Outdoor-Apps 2026-02-27
Potsdam Hafthorn Potsdamer Mappertreffen 2026-02-27
Ferrara Cimitero monumentale della Certosa di Ferrara Ferrara mapping party 2026-02-28
Messina Messina Mapping Day @ Messina 2026-02-28
नई दिल्ली Jitsi Meet (online) OSM India – Monthly Online Mapathon 2026-03-01
Milano Building 4A, Room Fassò – Politecnico di Milano PoliMappers Maptedì 2026-03-03
Salzburg Bewohnerservice Elisabeth-Vorstadt OSM-Treffpunkt 2026-03-03
Lille Salle Yser, MRES, 5 rue Jules de Vicq, Lille Rencontre OpenStreetMap à Lille 2026-03-03
Missing Maps London: (Online) Mapathon [eng] 2026-03-03
iD Community Chat 2026-03-04
OSM Indoor Meetup 2026-03-04
Brno Kvartální OSM pivo 2026-03-04
Stuttgart Stuttgart Stuttgarter OpenStreetMap-Treffen 2026-03-04
OSM US Mappy Hour: OpenHistoricalMap in North America 2026-03-04
Online OpenHistoricalMap in North America 2026-03-04
Flensburg Offener Kanal Flensburg 3. Open Data Day Flensburg 2026-03-05
OSMF Engineering Working Group meeting 2026-03-06
Gent Wijgaard OpenStreetMap meetup in Gent – Pre-VLA-congres editie 2026-03-06
Hogeschool Odissee Hospitaalstraat 23 Sint-Niklaas Vereniging Leraars Aardrijkskunde (VLA) conference 2026 2026-03-07
Perth Espresso Perk U Later Social Mapping Sunday: Moort-ak Waadiny / Wellington Square Perth 2026-03-07
Perth Espresso Perk U Later Social Mapping Sunday: Moort-ak Waadiny / Wellington Square Perth 2026-03-08
Delhi OSM Delhi Mapping Party No.27 (East Zone) 2026-03-08
København Cafe Bevar’s OSMmapperCPH 2026-03-08
London Social Sciences Centre – Western University Friends of MSF UWO Mapathon 2026-03-09
Missing Maps : Mapathon en ligne – CartONG [fr] 2026-03-09
Brno Geografický ústav, PřF MUNI, Brno Březnový brněnský Missing Maps Mapathon na Geografickém ústavu 2026-03-09
臺北市 MozSpace Taipei OpenStreetMap x Wikidata Taipei #86 2026-03-09

Note:
If you would like to see your event here, please add it to the OSM calendar. Only events entered there will appear in weeklyOSM.

This weeklyOSM was produced by MarcoR, MatthiasMatthias, PierZen, Raquel IVIDES DATA, Strubbl, Andrew Davidson, barefootstache, derFred, mcliquid.
We welcome link suggestions for the next issue via this form and look forward to your contributions.

Saturday, 21. February 2026

OpenStreetMap User's Diaries

A large amount of mislabelled and offset data in the Hudong area of Suzhou Industrial Park

This includes buildings and forest. I have fixed some of it bit by bit before, but only in guerrilla fashion, without keeping systematic records. Now that I have time to pick this up again, I am leaving a placeholder here first.

Second edit: adding a layer tag to buildings to dodge the conflict checker’s checks?? Is that really something a human would do?


1 Week off

Taking a break for one week because of Ramadan and because I am installing Gentoo as my main system.


Improving OSRM Foot Routing with Greenery Waypoints

I have a large set of photographs I made while running. They are geotagged, as I took them with my phone camera. The compass direction is completely unreliable, but lat/lon is more trustworthy. I thought it would be an interesting experiment to extract greenery like grass and trees from these photographs. It can be a useful addition for creating routes that are more pleasant to walk, since the eye-level point of view is not available in OSM. As this is based on my personal photographs, it has the additional benefit of recommending routes that I tend to use.

The first challenge I encountered is that out of a few thousand photographs, only a handful were taken during the daytime. After deduplicating and dropping all photos that contain no greenery, this becomes a relatively small set of waypoints. I decided not to extrapolate additional points along OSM ways to keep the dataset small and avoid adding misleading info. The greenery detection works well enough with the SegFormer model, although it is somewhat slow locally.

My plan is to select waypoints from this dataset before calling OSRM. This way I get routes that are more enjoyable to walk and run, but are generally longer than the default shortest route. You can find my dataset on Kaggle.
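As an illustration of the "select waypoints, then call OSRM" plan, a route request with intermediate greenery waypoints might be built like this. This is only a sketch: the helper name, the coordinates, and the local server address are my assumptions; the endpoint shape follows the public OSRM HTTP API (`/route/v1/{profile}/{coordinates}`).

```python
# Sketch: build an OSRM /route URL that routes through greenery waypoints.
# Assumes a local OSRM server with a "foot" profile at localhost:5000.
from urllib.parse import urlencode

def osrm_route_url(start, end, greenery_waypoints, base="https://localhost:5000"):
    """All coordinates are (lon, lat) pairs; OSRM expects lon,lat order."""
    pts = [start, *greenery_waypoints, end]
    coords = ";".join(f"{lon:.6f},{lat:.6f}" for lon, lat in pts)
    query = urlencode({"overview": "full", "geometries": "geojson"})
    return f"{base}/route/v1/foot/{coords}?{query}"

# Hypothetical example: start and end with one greenery waypoint in between.
url = osrm_route_url((4.8952, 52.3702), (4.9041, 52.3676),
                     [(4.8990, 52.3690)])
```

Fetching that URL (e.g. with `urllib.request`) would return a GeoJSON route that detours through the selected waypoints.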

Friday, 20. February 2026

OpenStreetMap User's Diaries

Some local changes to OSM of my area

A few quick notes on some changes I made to OSM based on local knowledge.

  1. Changed the point for the Riverside Centre building to reflect that it is now a Builder’s Corner hardware store.

  2. Added a point for the nearby Hole in the Wall Centre

  3. Defined an area for the Somerset Lofts apartment complex and added some details for it.


Converting dash cam videos into Panoramax images

I’ve recently begun contributing street-level imagery on Mapillary and Panoramax in my local area. I figured that my dash cam was already recording anyway, so if it could be of use to anyone, why not share it?

Contributing to Mapillary was very easy; since my dash cam has an integrated GPS that encoded its data into the video file, I could just upload the video to Mapillary and their website would turn it into an image sequence. Panoramax requires you to preprocess the video into geotagged images yourself, which made it hard to contribute to. Some cameras can be configured to save periodic images instead of videos, but that didn’t work for me because I still needed the dash cam to work normally as a dash cam first and Panoramax instrument second. It took me a while to figure it out, so I’m writing this blog post to hopefully help out the next guy in the same situation.

The task involves four basic steps. I scripted a solution that works specifically for my dash cam model (Garmin 47) and operating system (Linux). If Panoramax continues to grow, I imagine that separate scripts could be written for each step to mix and match for different camera types and computing environments. The steps are:

  1. Extract the raw GPS data from the dash cam video clip(s)

  2. Along the GPS trace, create a set of evenly-spaced points

  3. Extract images from the video occurring at the evenly-spaced points, and

  4. Add the GPS and time data to the image files

One could go even further and automatically upload the images to Panoramax straight from the terminal, but that’s beyond my coding abilities.

Let’s take a look at each step in detail:

Step 1 - Getting GPS data from the video

Thankfully, Garmin makes this relatively easy to do with exiftool. If you open the terminal in the directory with the video clips and run the command

exiftool GRMN<number>.MP4

The output will contain a warning:

Warning : [minor] The ExtractEmbedded option may find more tags in the media data

So we can modify the command into

exiftool -ee3 GRMN<number>.MP4

Now exiftool will output all the same information as before, as well as a series of blocks like the following:

Sample Time                     : 0:00:58
Sample Duration                 : 1.00 s
GPS Latitude                    : XX deg YY' ZZ.ZZ" N
GPS Longitude                   : UU deg VV' WW.WW" W
GPS Speed                       : 11.2654
GPS Date/Time                   : 2026:02:13 22:24:45.000Z

Jackpot! Now we can redirect the output to a file and get our GPS coordinates. We need to have a file saved in the working directory to tell exiftool how to format the data. So I saved the following as gps_format.fmt:

#[IF]  $gpslatitude $gpslongitude
#[BODY]$gpslatitude#,$gpslongitude#,${gpsdatetime#;DateFmt("%Y-%m-%dT%H:%M:%S%f")}

Now we pass that to exiftool to only print the metadata we’re interested in. We’ll also put > gps.tmp to save the output to a file:

exiftool -p gps_format.fmt -ee3 GRMN<number>.MP4 > gps.tmp

And we’re done! Now we have the raw GPS information out of the video and into plain text.
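As a small sketch, the resulting gps.tmp lines can be read back in Python. This assumes the format file above emits signed decimal degrees (the `#` suffix disables exiftool's print conversion) and an ISO-style timestamp; the function name is mine.

```python
# Sketch: parse one gps.tmp line of the form "lat,lon,timestamp".
from datetime import datetime

def parse_gps_line(line):
    """Return (lat, lon, timestamp) from one line of gps.tmp."""
    lat_s, lon_s, ts_s = line.strip().split(",")
    # fromisoformat also accepts fractional seconds like "...:45.000"
    return float(lat_s), float(lon_s), datetime.fromisoformat(ts_s)

# Hypothetical example line:
lat, lon, ts = parse_gps_line("52.370216,4.895168,2026-02-13T22:24:45")
```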

Step 2 - Turn the GPS data into evenly spaced points

To do this, I use Python to linearly interpolate between GPS points approximately 3 meters apart. And I do mean very approximately: instead of doing a proper distance calculation, I just eyeball how many meters are in a degree. One meter is very roughly 0.000009° of latitude. Since one meter of longitude is a larger portion of a degree near the poles, the longitude scale needs to be adjusted based on the latitude. I blindly use the latitude of the first point of the sequence and assume it doesn’t change enough over time to matter.

from math import cos, radians
cosd = lambda x: cos(radians(x))

# lat0 is the latitude of the first GPS point in the sequence
scale_lat =  1 / 9e-6
scale_lon = (1 / 9e-6) * cosd(lat0)

Now it is easy to use the Pythagorean Theorem to estimate the distance between two points:

dx = scale_lon * (lon1 - lon0)
dy = scale_lat * (lat1 - lat0)
dist_between_points = (dx**2 + dy**2)**0.5

Compute this distance for each consecutive pair of points along the GPS trace, and keep a running tally of the total distance traveled. For example, consider the following data after you stop at a red light, sit for a while, and then keep going:

Pt | Dist | Tot
A  | --   | 0
B  | 10   | 10
C  | 6    | 16
D  | 2    | 18
E  | 0    | 18
(sit at the red light...)
Q  | 0    | 18
R  | 1    | 19
S  | 3    | 22
T  | 7    | 29
U  | 11   | 40
V  | 14   | 54
(and so on)

Suppose you want image spacing of about 3 meters (about 10 feet or half a car length). So you want images at 0, 3, 6, 9, 12, 15, …, and so on. We can take point A as our first point, but we need to interpolate between GPS points to find evenly-spaced points. I’ll use the notation X -> Y N% to mean “interpolate N% from X to Y.” Then to find our desired points, we need:

Pt | Formula
0  | A
3  | A -> B 30%
6  | A -> B 60%
9  | A -> B 90%
12 | B -> C 33%
15 | B -> C 83%
18 | D
21 | R -> S 67%
24 | S -> T 29%
27 | S -> T 71%
30 | T -> U  9%
etc...

Since Garmin takes GPS measurements once per second, this is a convenient way to determine at exactly what time each new point occurred. For the point 60% from A to B, it’s just the GPS timestamp of A plus 0.60 seconds. For the latitude and longitude of the interpolated point, we can just interpolate the latitude and longitude coordinates separately. 3 meters is not even close to far enough for great-circle paths to matter. So e.g.

lerp = lambda a, b, x: (1 - x) * a + x * b

lat_interp = lerp(latA, latB, 0.6)
lon_interp = lerp(lonA, lonB, 0.6)

# And so on for each interpolated point

Save this output to a file (I call mine processed_points.csv), and you’re done with step 2!
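Putting the scale factors, the running tally, and the interpolation together, the whole of step 2 can be sketched as one small function. This is a rough illustration rather than the author's exact script; it assumes the parsed (lat, lon, time-in-seconds) points from step 1, one per second.

```python
# Sketch: resample a 1 Hz GPS trace into points ~spacing_m metres apart.
from math import cos, radians

def resample(points, spacing_m=3.0):
    """points: list of (lat, lon, t_seconds). Returns evenly spaced points."""
    lerp = lambda a, b, x: (1 - x) * a + x * b
    scale_lat = 1 / 9e-6                       # rough metres per degree latitude
    scale_lon = scale_lat * cos(radians(points[0][0]))
    out, total, target = [points[0]], 0.0, spacing_m
    for (lat0, lon0, t0), (lat1, lon1, t1) in zip(points, points[1:]):
        dx = scale_lon * (lon1 - lon0)
        dy = scale_lat * (lat1 - lat0)
        d = (dx * dx + dy * dy) ** 0.5
        # Emit every target distance that falls inside this segment;
        # zero-length segments (sitting at a red light) are skipped.
        while d > 0 and target <= total + d:
            x = (target - total) / d           # fraction along the segment
            out.append((lerp(lat0, lat1, x),
                        lerp(lon0, lon1, x),
                        lerp(t0, t1, x)))      # time interpolates the same way
            target += spacing_m
        total += d
    return out
```

With the red-light example above, this yields exactly the interpolation table shown: points at 0, 3, 6, 9, … metres, each with a fractional-second timestamp.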

Step 3 - Extract images from the video

It is possible to extract a single frame of a video using ffmpeg. The time should be a decimal number of seconds after the start of the video to exactly three decimal places.

ffmpeg -ss <time> -i <video>.MP4 -frames:v 1 output.jpg

By default, ffmpeg compresses the images quite a bit. It was enough that I could notice a quality difference when I put a paused frame of the video side-by-side with an extracted image. We can force ffmpeg to improve the quality with the -q:v <number> option. A smaller number produces a higher-quality image at the expense of file size and processing time. I’ve settled on a value of 3, but feel free to play around with this to get the quality or file sizes you want.

ffmpeg -ss <time> -i <video>.MP4 -q:v 3 -frames:v 1 output.jpg

ffmpeg will print a bunch of text to the console that we don’t care about. To avoid flooding the screen, use the -hide_banner and -loglevel options to reduce (but not completely shut up) the amount it outputs to the console:

ffmpeg -ss <time> -i <video>.MP4 -q:v 3 -frames:v 1 -hide_banner -loglevel fatal output.jpg

Since you are going to extract many images, you’ll have to use this command in a loop with a bunch of variables that change from iteration to iteration, e.g.

ffmpeg -ss $(printf "%.3f" "$time") -i "$input_dir""/DCIM/105UNSVD/GRMN""$num"".MP4" -q:v "$jpeg_quality" -frames:v 1 -hide_banner -loglevel fatal "$output_dir"/"$num""-""$(printf "%04d" $img_num)"".jpg"

My naming convention produces file names of the format video number-image number.jpg. So for example, the 25th image extracted from GRMN4567.MP4 would be named 4567-0025.jpg.
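Sketched in Python rather than shell, the extraction loop for one video might look like the following. The helper name and output directory are mine; it assumes the interpolated points from step 2 carry time offsets relative to the start of the clip.

```python
# Sketch: build the ffmpeg command for one interpolated point.
import subprocess

def frame_cmd(num, img_num, time_s, quality=3, outdir="frames"):
    """ffmpeg argv extracting one frame of GRMN<num>.MP4 at time_s seconds."""
    name = f"{outdir}/{num}-{img_num:04d}.jpg"   # e.g. frames/4567-0025.jpg
    return ["ffmpeg", "-ss", f"{time_s:.3f}", "-i", f"GRMN{num}.MP4",
            "-q:v", str(quality), "-frames:v", "1",
            "-hide_banner", "-loglevel", "fatal", name]

# Hypothetical loop over the resampled points:
# for i, (lat, lon, t) in enumerate(points, start=1):
#     subprocess.run(frame_cmd(4567, i, t), check=True)
```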

And we’re almost there! Now we just need to put the metadata from step 2 into the images we just generated.

Step 4 - Add the GPS and time metadata to the images

You can write tags to files using exiftool using the format:

exiftool -<key>=<value> <file name>.jpg

You can add multiple tags in a single line.

exiftool -<key1>=<value1> -<key2>=<value2> <file name>.jpg

Note that exiftool only supports specific keys, so it won’t write the metadata if it doesn’t know what the key is. It also keeps a backup copy of each original file by default, so to avoid duplicating every image, add:

exiftool -overwrite_original -<key1>=<value1> -<key2>=<value2> <file name>.jpg

This will write a line to the terminal to confirm after every single image. To avoid that, redirect the output to /dev/null. This tells the terminal to throw the output into a black hole, or the wardrobe to Narnia, or anywhere else besides the terminal.

exiftool -overwrite_original -<key1>=<value1> -<key2>=<value2> <file name>.jpg > /dev/null

For Panoramax to accept your images, you need all of the following tags:

-gpslatitude=45.6789
-gpslongitude=-123.456789
-gpslatituderef=N
-gpslongituderef=W
-datetimeoriginal=2000-01-02T03:04:05

If you are missing these, Panoramax will reject your image. Note that the latitude and longitude ref tags are necessary because exiftool doesn’t understand negative coordinates as being in the southern or western hemispheres. You have to provide them separately for the GPS data to be read correctly. If you forget to add them, Panoramax may accept the image but put it in the wrong place. The date and time should be given in ISO 8601 format. If you don’t specify a time zone, Panoramax will assume local time and automatically convert it to UTC on their site.

You can theoretically add any tag in the exif specification. Some ones I like for Panoramax are:

-subsectimeoriginal=067
-author=FeetAndInches
-make=Garmin
-model="Garmin 47 Dash Cam"

The SubSecTimeOriginal field is important for getting Panoramax to put your sequence in the right order. Since the images come from a dash cam, speeds of 10-20 m/s are common, so multiple images are taken per second of video. The DateTimeOriginal tag does not preserve fractional seconds (even if you provide them when writing the tag), so several pictures would be recorded as the same time and Panoramax would have to guess their order. Note that this needs to be provided as an integer string after the decimal point. So for a time of 51.328 seconds, you would write -subsectimeoriginal=328. For a time of 51.1 seconds, you would just write -subsectimeoriginal=1. For a time of 51.001 seconds, you would need to include leading zeroes as -subsectimeoriginal=001.
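The sub-second formatting rules above can be captured in a small helper. A sketch, with a function name of my own choosing; the "whole second" case is my assumption, as the post does not cover it.

```python
# Sketch: format the SubSecTimeOriginal value as described above.
def subsec_tag(seconds):
    """51.328 -> '328', 51.1 -> '1', 51.001 -> '001'.
    Trailing zeros are dropped, leading zeros kept; a whole second
    (assumption) becomes '0'."""
    frac = f"{seconds:.3f}".split(".")[1]   # always three digits
    return frac.rstrip("0") or "0"
```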

If you don’t use the SubSecTimeOriginal tag, you can still get Panoramax to show your images in order if you use a suitable file naming convention. You can open the sequence on the website and select the option to sort by file name.

The author tag is a nice way to attribute the image to you even if it gets shared outside Panoramax. The make and model tags help fill in some of the camera information on Panoramax and help determine your GPS accuracy, which is used to determine the image’s quality score.

You can do step 4 in the same loop as step 3. Since the coordinates and time will change for each image, the command will look messy like:

exiftool -overwrite_original -gpslongitude=$lon -gpslatitude=$lat -gpslatituderef=$ns -gpslongituderef=$ew -datetimeoriginal=$timestamp -author="$exif_author" -subsectimeoriginal="$subsec" -make="$exif_make" -model="$exif_model" -usercomment="$exif_comment" "$output_dir"/"$num""-""$(printf "%04d" $img_num)"".jpg" > /dev/null
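If you prefer driving exiftool from Python instead of a shell loop, the same tagging step might look like this. A sketch only: the helper name and defaults are mine, and passing absolute coordinates plus separate ref tags is one way (assumption) to satisfy the hemisphere rule described above.

```python
# Sketch: assemble the exiftool invocation for one image.
import subprocess

def tag_cmd(path, lat, lon, timestamp, subsec, author="FeetAndInches"):
    """Build exiftool argv writing GPS and time tags to one image."""
    return ["exiftool", "-overwrite_original",
            f"-gpslatitude={abs(lat)}",
            f"-gpslatituderef={'N' if lat >= 0 else 'S'}",
            f"-gpslongitude={abs(lon)}",
            f"-gpslongituderef={'E' if lon >= 0 else 'W'}",
            f"-datetimeoriginal={timestamp}",
            f"-subsectimeoriginal={subsec}",
            f"-author={author}", path]

# Hypothetical usage inside the per-image loop:
# subprocess.run(tag_cmd("frames/4567-0025.jpg", 45.6789, -123.456789,
#                        "2026-02-13T22:24:45", "328"), check=True)
```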

Closing Notes

This post explains the basic principles of how to turn a video into usable images on Panoramax. I plan to write a second post going into the 201 level - things like how to deal with missing a single GPS measurement, duplicated measurements, getting sent to Null Island, how to detect erroneous data, using the videos immediately before and after to interpolate better at the edges, recursively doing this for multiple video clips, etc. But for now, I hope this has been useful to you.

If anyone is interested, I can share the entire scripts that I use right now. They’re a little buggy, only partially commented, and occasionally require some babysitting to make sure they work properly. But if something is better than nothing and you are willing to try and deal with someone else’s amateur code, please let me know.

Thanks for reading,

FeetAndInches


Neighborhood Update: [Wadsa, Desaiganj]

  • I spent some time today improving the map data in my local area using the iD editor. As a local, I noticed that several roads were untraced.

  • I added roads but got confused while selecting presets; then I realised that the more mapping I do, the better I will get at using presets. Each preset serves a unique purpose.

  • A few weeks ago I spent time mapping my school in my city; it was so much fun. I just wish they could use more up-to-date satellite imagery.

Thursday, 19. February 2026

Jochen Topf

OSM Spyglass

Two years ago or so I started the OSM XRAY project; later I wrote about it in this blog post. Since then I have renamed the project to “OSM Spyglass” and have kept working on it on and off.

At the State of the Map Europe 2025 in Dundee I gave a talk titled “Everything Everywhere All At Once” about this project. You can see the video on YouTube. This got some people excited about the project; there is even some talk about putting the tool on OSMF infrastructure. Until that comes about, the tool is hosted at spyglass.jochentopf.com.

I am finally getting around to writing some more about what’s been happening since my first announcement and since the talk.

User Interface

I keep fiddling with the user interface. There is an optional globe view (not much work for me now that MapLibre supports it out of the box), the map is now resizable (horizontally), city names are displayed at some zoom levels, pop-up menus for keys and tags have been improved, and much more. Generally the UI has been getting faster and more reliable.

There are still some bugs to fix and plenty of possible improvements. And I’d be happy about feedback and ideas. It’s quite a lot of information we are trying to show here in limited space, so good ideas on how to do that are needed.

Caching

In the first blog post I wrote about some caching that I implemented in the database. That did work, but it turns out it is pretty useless. The user wants to access the newest data anyway, and we can keep up with minutely updates (at least at larger zoom levels), so I removed the caching completely for vector tiles and for high-zoom rasters. Only raster images at zoom levels up to 10 are cached; currently we cannot deliver them fast enough otherwise.

Map updates

The database is updated from OSM using minutely diffs. We are usually about 3 to 5 minutes behind the OSM data, that’s just how long it takes the OSM servers to create the minutely diffs, push them out to their server and for our update job to download the data and to apply it to the database. It is unlikely we can improve on that much further. Spyglass shows the timestamp of the latest data it has in the bottom right corner. This timestamp is updated whenever new data is loaded, i.e. when you move the map or so.

Vector tiles are always generated on the fly from the current database. At higher zoom levels they contain all data; at medium zoom levels only “larger” objects are shown, i.e. long ways and larger areas. At low and medium zoom levels raster tiles are shown, which always contain all data. So at medium zoom levels raster data in gray is overlaid with vector data in black (nodes and ways) or blue (relations): you can see everything, but only click on the larger items.

Raster tiles at low zoom levels are only updated once per day; for zoom 0 to 7 this happens by taking the zoom level 8 tiles and merging and rescaling them. I have spent quite some time optimizing this. The first version ran in the database but only generated black-and-white tiles; the current version uses code written in Go which creates grayscale images, which are much better than the black-and-white ones. And it is much faster than the GDAL tools I tried for this task. GDAL is a great tool, but, as an “all-purpose tool”, it has to cope with all sorts of different data sources, projections, etc., which makes it much slower than a specialized tool for a specific use case. It now takes only a few minutes to create the low-zoom tiles from the zoom level 8 tiles. And they are no longer stored in the database but on disk, which is easier, and they are faster to use that way, too.
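The merge-and-rescale step can be illustrated with a minimal pure-Python sketch. The actual implementation is in Go and is not shown in the post; the grid layout here is an assumption made purely for illustration:

```python
def merge_children(tl, tr, bl, br):
    """Merge four child tiles (square grayscale grids, given as lists of
    rows) into one parent tile of the same size by averaging each 2x2
    pixel block of the assembled mosaic."""
    n = len(tl)
    # Assemble the 2n x 2n mosaic from the four children.
    mosaic = [tl[r] + tr[r] for r in range(n)] + \
             [bl[r] + br[r] for r in range(n)]
    # Downscale back to n x n by averaging each 2x2 block.
    return [
        [(mosaic[2 * r][2 * c] + mosaic[2 * r][2 * c + 1] +
          mosaic[2 * r + 1][2 * c] + mosaic[2 * r + 1][2 * c + 1]) // 4
         for c in range(n)]
        for r in range(n)
    ]
```

Applying this repeatedly from zoom 8 down to zoom 0 yields a low-zoom pyramid of the kind described above.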

High-zoom rasters are still generated in the database from the data. That is, unfortunately, not as efficient as one might think. We don't need to copy the data from the database into another process, and the cost of actually getting the data does not seem to be that high, but the rasterizing itself costs time. This is probably something that could be improved inside PostGIS, or maybe we have to drop this idea altogether and move rendering outside the database. There is plenty of room to experiment and improve performance here.

Server

Originally I used pg_tileserv as the server to create the vector tiles from the database on the fly. It could also be tricked into creating the raster tiles. But I also needed GeoJSON output and some other API endpoints. I experimented with pg_featureserv, which did work, but having two servers with lots of specialized PL/pgSQL functions in the database, plus an ever-growing configuration for nginx (used as a reverse proxy), became too complicated and error-prone. So I decided to rewrite the server from scratch in Go. It turns out it is really easy to write robust and featureful HTTP servers in Go; it comes with everything you need, and the only external library I am using is for accessing the database. Deployment is really easy, too: just copy over one Go binary and restart the server; there are no extra configuration files to manage or functions to update in the database.

Filters

Everything is done three times, for nodes, ways, and relations: there are three sets of raster tiles and three sets of vector tiles, and it is easy to switch those layers on and off in the UI. And then there is the key or tag filter. The vector tiles at higher zoom levels contain all the data, so the filter is applied on the client, which is very fast. For raster tiles the filtering has to be done on the server, which takes somewhat more time. Filtering is (silently) disabled at low zoom levels, so you always see all data there. This isn't great as a user experience; I still have to figure out a way to make this transition more user-friendly, or, ideally, allow filtering on all zoom levels.
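Conceptually, the client-side filter is just a predicate applied to every feature in a tile. A minimal sketch, assuming a hypothetical feature structure with a `tags` dictionary (this is not Spyglass's actual code):

```python
def filter_features(features, key, value=None):
    """Keep only features whose tags contain the given key, and, if a
    value is given, that exact key=value combination."""
    def matches(feature):
        tags = feature.get("tags", {})
        return key in tags and (value is None or tags[key] == value)
    return [f for f in features if matches(f)]
```

Because the high-zoom vector tiles already contain all tags, a filter like this can run entirely on the client with no extra server round trip.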

It is a lot of fun to zip around the map and look at faraway places and how they are mapped. Try it out! And if you have any problems or ideas, open an issue on Codeberg.


OpenStreetMap User's Diaries

Querying OSM objects by their shapes

There has been a very interesting question on the OSM US Slack lately.

“Does anyone have a method to search through the OSM database for a building of a particular shape? I need assistance finding OSM buildings with this specific shape. They should be located in NJ, DE, northeastern MD, eastern PA, or southern NY.”

The question quickly exploded into a huge discussion. At the time of writing, there are already 71 replies.

Someone suggested:

“You could load OSM buildings into PostGIS and then use ST_HausdorffDistance to compare the geometries.”

From there, the discussion veered into how to solve that specific puzzle and find the exact OSM building in question.

One person added, “So the strategy is: create the shape of the building you want to search for, scale it to, say, fill a 100x100 m bounding box or something. Ask Postgres to, within a search-area bounding box, take each building and scale it to a 100x100 m bounding box, compute the Hausdorff distance with the scaled input shape, and return all OSM element IDs and their Hausdorff distances, sorted in ascending order.”

Another said, “What I’m currently doing is combining several shape exports into a single file with around 20,000 objects that have concavity. Concavity plus more than 10 nodes eliminates most buildings.”
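The scale-then-compare strategy described above can be sketched in plain Python. Note that this uses the discrete Hausdorff distance between vertex sets, a rougher measure than PostGIS's ST_HausdorffDistance over full geometries, and that it handles scale and translation but not rotation; it is only meant to illustrate the idea:

```python
def normalize(points, size=100.0):
    """Translate a point set to the origin and scale it uniformly so its
    bounding box fits a size x size square (the '100x100 m' trick above)."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    scale = size / max(max(xs) - min(xs), max(ys) - min(ys), 1e-9)
    return [((x - min(xs)) * scale, (y - min(ys)) * scale) for x, y in points]

def hausdorff(a, b):
    """Symmetric discrete Hausdorff distance between two point sets."""
    def directed(ps, qs):
        return max(min(((px - qx) ** 2 + (py - qy) ** 2) ** 0.5
                       for qx, qy in qs)
                   for px, py in ps)
    return max(directed(a, b), directed(b, a))
```

Ranking candidate buildings by `hausdorff(normalize(shape), normalize(candidate))` in ascending order would put the closest matches first; rotation invariance would need extra work, e.g. trying several orientations of the query shape.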


At that point, instead of hunting that elusive specific OSM building, I became more interested in the generalized version of the problem.

So I added my two cents to the discussion:

“The generalized version of this problem would be: Can we represent a shape in some kind of data type that allows us to computationally check whether two objects have the same shape, regardless of rotation and scaling?

I haven’t studied the Hausdorff distance yet, but I’m wondering whether it can solve this problem, or if there’s a better alternative—Hu moments, Procrustes analysis, Fourier descriptors for contours…”

Someone replied:

“Hu moments are a good option. Elliptic Fourier Descriptors, Shape Context Histograms, Turning functions, etc. I’ve experimented with those four while trying to classify sports pitches more accurately. You can actually get pretty far with just compactness, convexity, and aspect ratio, thankfully.”
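As an example of how far such simple descriptors go, Polsby–Popper compactness (4πA/P²) can be computed directly from polygon vertices using the shoelace formula; a circle scores 1.0 and a square π/4 ≈ 0.785. A minimal sketch:

```python
from math import pi, hypot

def compactness(vertices):
    """Polsby-Popper compactness 4*pi*A/P**2 of a simple polygon given as
    (x, y) vertices in order, without repeating the first vertex."""
    n = len(vertices)
    edges = [(vertices[i], vertices[(i + 1) % n]) for i in range(n)]
    # Shoelace formula for the area, then the plain edge-length sum.
    area = abs(sum(x1 * y2 - x2 * y1 for (x1, y1), (x2, y2) in edges)) / 2.0
    perimeter = sum(hypot(x2 - x1, y2 - y1) for (x1, y1), (x2, y2) in edges)
    return 4 * pi * area / perimeter ** 2
```

Because the measure is a dimensionless ratio, it is automatically invariant to scale, translation, and rotation, which is exactly what a shape search needs as a cheap first filter.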

Do you have any other ideas on how to solve this problem?

Wednesday, 18. February 2026

OpenStreetMap User's Diaries

New CNEFE Tool Revolutionizes Street Name Correction in OpenStreetMap Brazil

The community of Brazilian mappers has just gained a powerful ally to improve one of the most crucial and, at the same time, challenging data points in any map: street names. The CNEFE Verification System platform has been launched, accessible at https://cnefe.mapaslivre.com.br, a tool created by and for the OpenStreetMap (OSM) community in Brazil, aimed at validating and correcting address data using the latest information from the 2022 IBGE Census.

The project is an initiative of UMBRAOSM (Union of Brazilian OpenStreetMap Mappers) and was developed by experienced mappers Raphael de Assis, president of UMBRAOSM and member of the OpenStreetMap Foundation, and Anderson Toniazo, both active members of the OSM Brazil community. The tool addresses a long-standing bottleneck in national mapping: updating and verifying street names based on official sources.

The Challenge of Street Names in Brazil

For those mapping in Brazil, one of the biggest challenges has always been the lack of a complete, accurate, and freely accessible street database. Through the Demographic Census, IBGE compiles the National Registry of Addresses for Statistical Purposes (CNEFE). This registry is a vast list of addresses from across the country, containing street names, address types, neighborhoods, and, in many cases, geographic coordinates, especially in rural and non-residential areas.

Historically, the OSM community has used CNEFE data from previous censuses (such as 2010) to enrich the map. However, the process was complex, involving downloading text files (fixed format), cross-referencing them with census tract shapefiles, and extensive manual work to match the information with the streets already drawn on the map, in addition to correcting spelling differences.

With the recent publication of the CNEFE 2022 microdata by IBGE, the need for an efficient tool to integrate this new data into OSM became even more evident.

CNEFE System: A Bridge Between Official Data and the Collaborative Map

It is in this context that the CNEFE Verification System emerges. The platform created by Raphael de Assis and Anderson Toniazo is not just a data viewer; it is a complete work tool, designed to optimize the collaborative verification and correction workflow.

The system’s intuitive interface allows mappers of all experience levels to:

Visualize CNEFE 2022 Data: The tool presents official address data from the most recent census clearly, overlaid on the map.

Compare with OpenStreetMap: The mapper can easily identify discrepancies between a street name recorded in CNEFE and the name currently present in OSM.

Correct and Include Names: When a street in OSM is unnamed (very common in less mapped areas) or has a different name than the IBGE registry, the tool facilitates the correction and inclusion of the correct name directly on the map.

Fill Gaps: In places where IBGE registered addresses, but the corresponding streets have not yet been drawn in OSM, the application highlights these areas, encouraging the complete mapping of road geometries and, subsequently, the addition of names.

The platform is already at version 1.0, updated on January 22, 2026, and features rich support material for the community. Mappers can access a step-by-step tutorial with images, watch demonstration videos, and even download complete PDF tutorials for offline consultation, ensuring everyone can make the most of the tool.

The Strength of the Community Behind the Tool

The development of the CNEFE System is a testament to the power and organization of the OSM Brazil community. UMBRAOSM, under the leadership of Raphael de Assis, has stood out for promoting initiatives that facilitate and professionalize collaborative mapping in the country. Projects like “Mapeia Crato” have already demonstrated the capacity of unity in training new mappers and carrying out large-scale tasks.

The partnership between Raphael and Anderson in developing this tool reinforces the community's commitment to not only use open data but also to give back, creating ecosystems that improve the quality of geospatial information available to everyone. Their work directly aligns with broader discussions within the community, such as the matching of CNEFE 2022 variables with OSM tags, a fundamental step for any data import or validation process.

A Future with More Accurate Maps

The availability of the CNEFE System marks a significant advance for Brazilian mapping. By facilitating access and comparison with official Census 2022 data, the tool not only speeds up the map update process but also increases the reliability of the OpenStreetMap database as a whole.

For the end-user, whether a driver using a navigation app, a delivery person, or a researcher, the result is more accurate maps, with correctly identified streets and addresses that are easier to locate. The CNEFE tool is, therefore, a key piece in Brazil’s open data infrastructure, built collaboratively by those who understand the subject best: the mapping community itself.

Visit https://cnefe.mapaslivre.com.br and start contributing to a more complete and correct map of Brazil.




Structured POI Enrichment in Bengaluru, Karnataka

Changeset: 178729012

Today I contributed to OpenStreetMap by improving map completeness in my local area in Bengaluru, Karnataka.

🔹 What I Worked On

  • Added a missing café using local knowledge
  • Verified placement to ensure it was mapped at the correct entrance location
  • Added appropriate tags, including amenity=cafe and name=Bean Stop Café
  • Checked for duplicate entries before uploading

🔹 Mapping Approach

I focused only on verified, ground-truth information and avoided copying from copyrighted sources. All additions were based on direct familiarity with the area.

🔹 Quality Checks

  • Ensured the point was not placed on the roadway
  • Confirmed correct spelling and capitalization
  • Reviewed surrounding features for consistency

🔹 Objective

The goal was to improve local POI completeness and contribute accurate, structured data to OpenStreetMap. This is part of my effort to make consistent, quality-focused contributions rather than large, unverified edits.


Pascal Neis

Adding the Missing Dimension: Position Tracking for Vehicle Data Logging

In one of my previous blog posts, I explored how to read live vehicle data through the OBD II port that is present in most (modern) cars. As mentioned in the outlook, the next step in my project is to combine vehicle telemetry with (accurate) positional information in order to enable more advanced analysis. To achieve this, I created a small GNSS test setup. The platform for all experiments is again a Raspberry Pi. For a first comparison, I selected two GNSS boards from Waveshare: the L76X GPS HAT and the ZED F9X GPS RTK HAT.

Why these two modules?
The L76X is an inexpensive entry-level device that is suitable for navigation, mapping, or general position tracking. It supports GPS and BDS and normally delivers a position accuracy of a few meters. The ZED F9X belongs to a completely different class. It is a multi-band GNSS receiver that supports real-time kinematic (RTK) processing. When correction data is available, it can reach accuracy in the range of centimeters, which makes it suitable for robotics, surveying, precision agriculture, or any application that requires very accurate geolocation data. The antenna systems also show clear differences. The L76X includes a simple single-band GPS antenna, while the ZED F9X works together with a multi-band active GNSS antenna that allows reception of several frequency ranges at once. This antenna design is essential for achieving the high accuracy that the ZED F9X is capable of.

From the provided software to writing my own scripts
Both modules come with example software and Python scripts on the manufacturers' web pages. I tried using these examples first, but outdated Python versions and older code libraries quickly created compatibility problems. Because of this, I moved directly to writing my own scripts, which turned out to be the better choice later on. The L76X operates at one update per second in its default configuration, but it can be configured to send up to ten updates per second. The ZED F9X can operate at even higher update rates, in some cases up to twenty-five updates per second depending on the selected messages. However, not every communication protocol supports these higher update rates. I started with NMEA, which worked well up to ten updates per second. Above that limit the protocol becomes inefficient because the messages are relatively large. For the ZED F9X, switching to UBX made much more sense because UBX uses compact binary messages. Unfortunately the L76X does not support UBX, which means NMEA remains the only option for that board.
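To illustrate why NMEA is an easy starting point, here is a minimal sketch of a GGA-sentence parser with checksum verification. This is not the project's actual script, just a hedged example following the standard NMEA 0183 field layout:

```python
def parse_gga(sentence):
    """Parse an NMEA GGA sentence into (latitude, longitude) in decimal
    degrees, verifying the XOR checksum of everything between '$' and '*'."""
    body, _, checksum = sentence.strip().lstrip("$").partition("*")
    calculated = 0
    for char in body:
        calculated ^= ord(char)
    if checksum and int(checksum, 16) != calculated:
        raise ValueError("bad NMEA checksum")
    fields = body.split(",")
    if not fields[0].endswith("GGA"):
        raise ValueError("not a GGA sentence")

    def to_degrees(value, hemisphere):
        # NMEA encodes positions as (d)ddmm.mmmm: degrees, then minutes.
        degrees, minutes = divmod(float(value), 100.0)
        decimal = degrees + minutes / 60.0
        return -decimal if hemisphere in ("S", "W") else decimal

    return to_degrees(fields[2], fields[3]), to_degrees(fields[4], fields[5])
```

For example, the classic test sentence `$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47` decodes to roughly 48.1173° N, 11.5167° E.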

What comes next?
With the hardware and software configured and with automated startup and first measurement routines working reliably, the next step will be real world testing inside a car. In particular, I want to find out how the speed of the vehicle affects the quality of the GNSS measurements, how different surroundings such as hills, forests and tall buildings influence the accuracy, and how big the practical performance gap is between the simple L76X with its basic antenna and the ZED F9X combined with a multi band active antenna.


OpenStreetMap User's Diaries

Automatic Pedestrian Detection at Signalised Crossings

Hi everyone,

I recently noticed that many modern pedestrian crossings are equipped with automatic detection sensors that trigger the traffic signal without requiring a push button.

Currently, in OpenStreetMap, we can tag:

  • highway=crossing and crossing=traffic_signals for signalised crossings
  • button_operated=yes/no to indicate if a manual button is present
  • traffic_signals:sound=yes/no for auditory signals

However, there is no standard way to indicate automatic activation by a detector for pedestrians or vehicles.

To address this, I have proposed a new tag on the OSM forum: detector_operated=yes/no, which would clearly indicate that a traffic signal is automatically triggered by a detector.

You can view and comment on the proposal here: https://community.openstreetmap.org/t/proposal-tag-traffic-signals-detector-operated-pedestrian-presence-sensor/141624

Here is an example illustration showing automatic pedestrian detection:
Automatic pedestrian detection

This tag would help improve mapping of intersections, pedestrian routing, traffic simulation, and accessibility information.

I’d love to hear your thoughts and experiences with automatic pedestrian detection at crossings in your area!


OpenCage

Interview: Nicolas Collignon - Kale AI

Interview with Nicolas Collignon of Kale AI about how they are using OpenStreetMap to build the future of urban logistics delivery

In the second 2026 edition of our OpenStreetMap interview series it was my pleasure to chat with Nicolas Collignon, co-founder and CEO of Kale AI, who are building urban routing solutions for delivery using OpenStreetMap.

Screenshot of the Kale AI

1. Who are you and what do you do? What got you into OpenStreetMap?

I’m Nico; my background is in computational cognitive science. I’m now the CEO of Kale AI, a start-up building technology for urban logistics planning. I initially got into OpenStreetMap during a side quest where I got really curious about how to better understand urban tissue, and how to represent it computationally.

2. What is Kale AI? What prompted you to create it?

Kale AI is a company focused on solving the inefficiency problem in urban logistics. We build tools to make complex logistics planning easy. It’s a very hard and interesting problem, and planning is one of the biggest weaknesses of LLMs. We’ve been focused on supporting the transition to Light EVs and cargo-bikes in modern urban logistics fleets. Light EVs are up to 2x more efficient in dense urban areas and use 95% less energy than diesel vans. They’re a multi-solution to improve urban life.

3. Why do we need special routing for urban logistics?

Different vehicles need tailored routing because urban space is becoming increasingly complex. With improving cycling infrastructure, Low Traffic Neighbourhoods and so on, all of this can lead to improved efficiency if we better route vehicles through street networks. For example, a 2-wheeled cargo bike might be able to take a shortcut that a 3-wheeler is blocked from by a bollard. For the 2-wheeler that can save 5-10 minutes off their route, while having to backtrack could add the same amount of time for the slightly larger vehicle.

Most of our work doesn’t focus specifically on “navigation” but on planning, assigning deliveries to vehicles and designing the sequence of stops on those routes. Dantzig, who first proposed the Vehicle Routing Problem, explains quite well why it’s hard in his 1958 paper: “Even for small values of n the total number of routes is exceedingly large, e.g. for n = 15, there are 653,837,184,000 different routes.”
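As a quick sanity check, the figure Dantzig quotes is exactly 15!/2, i.e. every ordering of the 15 stops with each route identified with its reverse:

```python
from math import factorial

# 15! orderings of 15 stops, halved because a route and its reverse
# describe the same tour.
routes = factorial(15) // 2
print(routes)  # 653837184000
```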

In our research, we found that deliverers spend 60-80% of their day not driving, but looking for parking and walking to the door. Different vehicles have different performance advantages in different parts of a city. Light EVs have a big advantage in the centre. Our work focuses on leveraging the different strengths of each vehicle type, and taking into account that diversity makes the VRP even harder to solve.

4. What are the unique challenges involved in routing with OpenStreetMap, particularly for urban logistics?

The data quality is surprisingly good in well-mapped areas. The OSM community is incredibly detail-oriented. But two challenges stand out for us.

The first is completeness and heterogeneity. Coverage varies enormously, not just between cities but within them, and sometimes between streets that are literally 300 metres apart. In our research we found a striking example in Boston where two neighbouring hexagonal cells with almost identical satellite imagery had wildly different tagging. One had 167 highway=service tags, the other just 3. In Chicago suburbs we found a municipality with the highest population density in Illinois where OSM had recorded only 8% of its buildings. That kind of patchiness is a real problem when you’re trying to build models that generalise across cities.

The second is semantic consistency. OSM relies on contributors to categorise things freely, which means the same real-world object can be tagged in multiple ways depending on who mapped it and where. We saw this clearly across our study cities. Contributors in Los Angeles tagged single-family homes as building=house, while the same homes in other cities were tagged with the catch-all building=yes. Locally that’s fine, but the moment you try to build a model that works across cities, those inconsistencies become noise you have to work around.

And beyond the map itself, OSM captures the physical world but not the operational reality of deliveries. How long it takes to park, unload, walk to a door varies enormously by urban context and is invisible to any map. In our research, service time turned out to be one of the biggest drivers of delivery efficiency, yet almost no publicly available data exists on it. That’s a gap OSM can’t fill alone, but it points to how much logistics-specific ground truth is still missing.

5. What steps could the OpenStreetMap community take to improve mapping for urban logistics?

Keep tagging surfaces, seriously. It might feel niche, but it’s one of the most operationally significant pieces of data we use. The granularity OSM brings to surface data is something you simply can’t get from commercial providers, and it makes a real difference in planning accuracy.

Beyond that, access restrictions need more attention: bollards, width restrictions, turning restrictions, loading zone locations. These are the invisible barriers that can completely change how a fleet operates in a city, and they’re often missing or under-tagged. A restriction that a small vehicle sails through might stop a larger one entirely, and right now OSM rarely has enough detail to distinguish those cases.

More broadly, mapping Low Traffic Neighbourhoods and filtered permeability in a consistent, machine-readable way would be hugely valuable. These are increasingly shaping how urban freight actually moves, and having reliable structured data on them would let us plan far more accurately.

6. Recently OpenStreetMap celebrated 20 years. Where do you think the project will be in another 20 years?

I think OSM is going to become even more foundational than it already is, but probably in ways that are less visible. A lot of the most interesting work being done today in autonomous mobility, urban planning, and logistics quietly depends on OSM as a base layer. That’s only going to grow.

What excites me is the intersection with AI. Models are getting better at extracting structured data from imagery, which could dramatically accelerate how quickly OSM reflects the real world: new infrastructure, surface changes, new access restrictions. The community’s role might shift from purely manual contribution toward curation and validation at scale.

And as cities get more complex, with more vehicle types, more restricted zones, more differentiated infrastructure, the value of a community that actually cares about tagging a bollard correctly becomes hard to overstate. That local, granular knowledge is something no corporate mapping effort has ever quite replicated.


Thank you, Nico! Wonderful to see OpenStreetMap becoming a core part of the infrastructure of modern cities. As people, companies, communities use and rely on OSM, they will in turn start editing and maintaining the data for all of us to benefit.

Forward!

Ed

Please let us know if your community would like to be part of our interview series here on our blog. If you are, or know of, someone we should interview, please get in touch; we’re always looking to promote people doing interesting things with open geo data.

Tuesday, 17. February 2026

OpenStreetMap User's Diaries

Initial Building Mapping and Data Refinement Session

Changeset: 176210161

In this changeset (176210161), I focused on improving building-level mapping by adding missing building outlines and refining structural details using Bing Maps aerial imagery.

The objective of this session was to enhance spatial accuracy and improve map completeness in the area. I ensured that:

  • Building footprints were aligned correctly with satellite imagery
  • Proper geometry was maintained
  • No duplicate structures were created
  • Tagging remained consistent with OSM standards

This edit was completed using the iD editor (v2.37.3), and I requested a review to ensure quality validation and community feedback.

Working on building details helped strengthen my understanding of:

  • Accurate polygon tracing
  • Satellite imagery interpretation
  • Clean data structuring
  • Version control within OSM changesets

I will continue improving structured building data and map quality in Karnataka.


Campus Mapping & Data Refinement in Yelahanka, Karnataka

Changeset number is 178690672

Today, I worked on improving map data around Yelahanka Taluku, Karnataka. I updated the official name of Sai Vidya Institute of Technology to reflect accurate real-world information and ensured proper tagging consistency.

In addition to correcting the name, I reviewed campus boundary structure, building tagging, and surrounding infrastructure to avoid duplication and maintain data integrity. I verified that the edits align with real-world sources and OSM tagging standards.

My focus during this session was on:

  • Accurate name correction
  • Structured campus boundary validation
  • Road connectivity refinement
  • POI accuracy improvement
  • Avoiding duplicate objects

This session helped reinforce the importance of precise tagging, version tracking, and reviewing live map data versus cached tiles. I will continue contributing to improving structured geospatial data across Karnataka.


[Resolved] Post-fix verification: cycle route in Toulouse (Note #5169818)

✅ Info: Issue resolved

The obsolete concrete block in Toulouse (coordinates: 43.5615376; 1.4920996) has been removed from OpenStreetMap.
The cycle route on Geovelo is now correct, with no unnecessary detour.

Context: On 17 February 2026, I resolved note #5169818, which reported a cycle-route problem in Toulouse (coordinates: 43.5615376; 1.4920996). An obsolete concrete block (left in place after roadworks had finished) was forcing an unnecessary detour in route calculations.

Actions taken: - Fixed in OSM: removed the obstacle (changeset #178691426). - Waiting for Geovelo to update its data.


Problematic test route

Geovelo route with detour. Direct link for testing: Geovelo - test route


To do: - Check around 19 March 2026 whether the route is fixed on Geovelo/OSRM. - If the problem persists, reopen the note or contact Geovelo.

Location: See on OSM


#OpenStreetMap #Toulouse #Vélo #Contribution #Geovelo


School/Work Experience Activities (formerly PCTO) in Pesaro

🗺️ Pesaro needs you!

Below are the people involved in the project “Pesaro ha bisogno di te!” (“Pesaro needs you!”).

A group of students was asked to improve the precision and accuracy of the map in their city (and its surroundings, since some of them live in neighbouring areas).

All edits will be considered valid if, and only if, they carry the hashtag #PCTOMarconi2026 and are made by the users involved in the project, listed below (only the username is shown).

📭 Contacting the project coordinator

⚠️ If needed, please contact me by email <[email protected]> or on Telegram (@galessandroni). I am more responsive on these channels than via OSM’s internal messaging.

Of course, feel free to fix any vandalism or typos you might notice.

👩‍🎓👨‍🎓 Users involved in the project

N User (role) Edits
0 Galessandroni Tutor
1 _basii 0
2    
3 CoolCastle561 0
4    
5    
6 Lorenzo-Cecchini 0
7 FedericoCrine 0
8    
9 ANTOHH 0
10    
11 Pit-_- 2
12    
13 Roberto Fazzini 0
14    
15 ga gasparri 0
16    
17 dadograss 0
18    
19    
20 Ariannapagnoni 36
21    
22 santa222 23
23 ele stefanini 4
24    
25    
26 davide zagaria 0

Last updated: 20 February 2026

🛠️ Useful tools


Setting up an Overpass API server - how hard can it be?

All the hospitals in the UK and Ireland, in about 10 seconds

Many people have noticed that publicly available Overpass servers have been suffering from overuse (a typical “tragedy of the commons”). OSM usage policies generally contain the line “OpenStreetMap (OSM) data is free for everyone to use. Our tile servers are not”. Unfortunately, there have been problems with overuse of the public Overpass servers, despite the usage policy. “Just blocking cloud providers” isn’t an option, because (see here - use the translate button below) lots of different sorts of IP addresses, including residential proxy addresses, are the problem.

People who want to use e.g. Overpass Turbo do have the option to point it at a different Overpass API instance. If you’re using Overpass Turbo and you get an error due to unavailability, that is likely because the Overpass API instance that it is using is overwhelmed. There are other public Overpass API instances, but they may not be complete (in terms of geography or history) or up to date.

At this point, if you’re one of the people who created the problem you’ll likely just spin up more instances to retry after timeouts and make the problem worse. Most people reading this are I hope not in that category. There are commercial Overpass API providers - more details for the example in that table can be found here.

Other people (including me) might wonder whether it’s possible (without too much work) to set up an Overpass API server that just covers one or two countries. To keep it simple, let me restrict myself to Britain and Ireland (2.3GB in OSM), and let’s not worry about Attic Data (used for “queries about what was in OSM in the past”) or metadata.

Let’s just try and do regular Overpass queries such as you might start from this taginfo page, like this. I’ll also only target the Overpass API, and will use “settings” in an Overpass Turbo instance to point to my Overpass API server. I do want to apply updates as OSM data is changed.

I’m interested in creating a server covering the UK and Ireland. In terms of size, have a look at how much bigger or smaller your area of interest is than the 2.3GB of Britain and Ireland below and use that to judge what size server you might need.
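If you want a number rather than a judgement call, you can scale the figures from this post (a 2.3GB .pbf extract producing a database of roughly 80GB). The helper below is just that heuristic in script form; the 80/2.3 ratio is an observation from this one setup, not a guarantee, and `estimate_db_gb` is my own name for it:

```shell
# Back-of-envelope Overpass database sizing from extract size, using the
# ratio observed in this post (2.3GB Britain-and-Ireland .pbf -> ~80GB db).
# Treat the result as a starting point only; data density varies by region,
# and even tiny extracts need ~23GB for the nodes file alone.
estimate_db_gb() {
  awk -v pbf_gb="$1" 'BEGIN { printf "%.0f\n", pbf_gb * 80 / 2.3 }'
}

# Example:
#   estimate_db_gb 2.3   # prints 80
```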

Documentation

At this point it’s perhaps worth mentioning that the documentation around Overpass is … (and I’m channelling my inner Sir Humphrey here) “challenging”.

There’s the OSM wiki which talks about “Debian 6.0 (squeeze) or Debian 7.0 (wheezy)”, the latter of which went EOL in May 2018. There is also an HTML file on overpass-api.de. That is … (engages Sir Humphrey mode again) not entirely accurate, in that it says to run something that doesn’t exist if you’ve cloned the github repository.

One of the best documents by far is external: it’s by ZeLonewolf, and starts off by saying “I found the existing guides to be lacking”. It then says “This is a combination of various other guides, the official docs, Kai Johnson’s diary entry…”

(which is the other “best document”)

“… and suggestions from @mmd on Slack. This guide is intended to demonstrate how to configure a server dedicated to only running overpass”. Kai’s diary entry from 2023 is definitely worth reading (sample quote there “Running an Overpass server is not for the faint of heart. The software is really finicky and not easy to maintain. You need to have some good experience with Linux system administration and the will and patience to deal with things that don’t work the way they’re supposed to”).

Also, this github issue (and things linked from it) summarises some of the issues that I had on the way to getting my test server set up.

Where what I’m doing below differs from what the other guides say, I’ll try to say why I am doing it differently. Usually it’s because my requirements are different (e.g. an overpass server for a small area rather than everywhere, on a VPS rather than a piece of tin, or because I need limited functionality).

Server

For my use case, we’ll need a server that is publicly accessible on the internet. I’m already a customer of Hetzner, so I’ll create a test server there. Other providers are available, and may make more sense depending where you are in the world and how much you want to pay. For testing, spinning up something at one of the hyperscalers might make financial sense, but I suspect not long-term. I went with a CX43 with 160GB of SSD disk space, 16GB RAM and a rather large amount of bandwidth. This turned out to be about the right size for Britain and Ireland. I went with Debian 13 and public ipv4 and ipv6 addresses. I don’t know if Overpass releases need a particular architecture, but went with “x86” rather than “ARM” just in case.

If your needs are different you don’t have to use a cloud server for this, and Kai’s diary entry has a lot of information about physical server sourcing and setup.

Sizing was alas largely guesswork and trial and error - while I’m sure that the commercial providers know chapter and verse on this, there isn’t a lot written down about “sizing based on extract size” that isn’t “how long is a piece of string”. I found that loading even North Yorkshire (just 56MB in OSM) created a nodes file in the database area of 23GB, so that sets the minimum server size, even for very small test extracts.

The speed of the disk needs to be such that updates can be applied in less time than they cover. If it takes 2 hours to apply 1 hour of updates, your server will never catch up. In practice I didn’t find this to be an issue with the servers at Hetzner and the relatively small extracts that I was working with.

Initial server setup

In what follows I’ll use youruseraccount, yourserver and yourdomain in place of the actual values I used.

I already have some ssh keys stored at Hetzner, so when buying the server, I chose a new name in the format “yourserver.yourdomain” and added my ssh keys. I have yourdomain registered at a DNS provider, and I added the IPV4 and IPV6 addresses there. I can now ssh in as root to “yourserver.yourdomain”, and run the usual:

ssh -l root yourserver.yourdomain
apt update
apt upgrade

and bounce the server and log back in again.

The next job is to create a non-root account for regular use and add it to the “sudo” group:

useradd -m  youruseraccount
usermod -aG sudo youruseraccount
chsh -s /bin/bash youruseraccount

I’ll create a new password in my password manager for youruseraccount on this server (obviously I used my actual account name rather than literally youruseraccount, but you get the idea…). Next, set the account password to the newly chosen one:

passwd youruseraccount

and check I can login to the new server as youruseraccount with that password, and become root:

ssh -l youruseraccount yourserver.yourdomain
sudo -i 
exit

Install some initial software:

sudo apt install emacs-nox screen git tar unzip wget bzip2 net-tools curl apache2 wget g++ make expat libexpat1-dev zlib1g-dev libtool autoconf automake locate 

That list includes both software prerequisites (apache2) and things that will be really useful (screen). It also includes emacs as a text editor; you can use your preferred editor instead wherever emacs is mentioned below.

To use screen you just type screen and then press return. You can manually detach from it by using ^a^d and later reattach by using “screen -r”. If there are multiple screens you can attach to you’ll see something like this:

There are several suitable screens on:
    95207.pts-2.h23 (02/15/2026 09:20:20 AM)        (Detached)
    95200.pts-2.h23 (02/15/2026 09:19:57 AM)        (Detached)
    1633.pts-2.h23  (02/14/2026 12:37:50 PM)        (Attached)
Type "screen [-d] -r [pid.]tty.host" to resume one of them.

and you can choose which one to reconnect to by typing in (say) “95207” and pressing “tab”. To force a reconnection to a screen that something else is attached to, use “screen -d -r”.

In many cases below I’ll say “(in screen)” - this just means it’s a good idea to run these commands from somewhere that you can detach from and reattach to. It doesn’t mean you need to create a new screen every time.

The ssh keys that I had stored have been added for root by Hetzner, but I also want to add them to my new account too:

sudo -i
sudo -u youruseraccount -i
ssh-keygen -t rsa
(either use existing password for ssh passphrase, or create and store a new one)
exit

cp /root/.ssh/authorized_keys /home/youruseraccount/.ssh/
emacs /home/youruseraccount/.ssh

… and in there change the ownership of the files to youruseraccount (or simply, as root, run chown -R youruseraccount: /home/youruseraccount/.ssh).

Next, check that you can ssh in to yourserver.yourdomain without a password, then disable regular password access. We don’t want people to be able to brute-force password access to a server on the internet, so we can just turn this off.

sudo emacs /etc/ssh/sshd_config

Find the line that says

# To disable tunneled clear text passwords, change to "no" here!

and uncomment and change the next two lines to say

PasswordAuthentication no
PermitEmptyPasswords no

save the file and then

sudo /etc/init.d/ssh restart

and then try to log in (doing so from the shell on that machine will work as a test)

ssh 127.0.0.1

It should say Permission denied (publickey).

Setting up a certificate is the next priority. Everything on the internet these days pretty much assumes https access, so let’s do that before even thinking about overpass. I’ll use acme.sh for that. Other providers and tooling are available and you can use them if you prefer. Login as your non-root account and then:

sudo -i
cd
wget -O -  https://get.acme.sh | sh -s email=youremailaddress
exit
sudo -i
/etc/init.d/apache2 stop
acme.sh --standalone --issue -d yourserver.yourdomain -w /home/www/html  --server letsencrypt

the last lines of the output you get should be like

-----END CERTIFICATE-----
[Sat Feb 14 12:51:45 AM UTC 2026] Your cert is in: /root/.acme.sh/yourserver.yourdomain_ecc/yourserver.yourdomain.cer
[Sat Feb 14 12:51:45 AM UTC 2026] Your cert key is in: /root/.acme.sh/yourserver.yourdomain_ecc/yourserver.yourdomain.key
[Sat Feb 14 12:51:45 AM UTC 2026] The intermediate CA cert is in: /root/.acme.sh/yourserver.yourdomain_ecc/ca.cer
[Sat Feb 14 12:51:45 AM UTC 2026] And the full-chain cert is in: /root/.acme.sh/yourserver.yourdomain_ecc/fullchain.cer

Next do

sudo a2ensite default-ssl
sudo a2enmod ssl
sudo systemctl reload apache2

and then edit the default site config

sudo emacs /etc/apache2/sites-enabled/default-ssl.conf

Replace the SSL references with the correct ones.

SSLCertificateFile      /root/.acme.sh/yourserver.yourdomain_ecc/fullchain.cer
SSLCertificateKeyFile   /root/.acme.sh/yourserver.yourdomain_ecc/yourserver.yourdomain.key

Restart apache

sudo systemctl restart apache2

and browse to https://yourserver.yourdomain to make sure that the certificate is working. You’ll need to arrange for that certificate to be renewed every couple of months, but let’s concentrate on overpass for now.
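acme.sh installs its own cron job for renewals, but the renewed certificate only lands under /root/.acme.sh by default. One pattern for handling renewals, which acme.sh’s own documentation recommends over pointing Apache into /root/.acme.sh directly, is --install-cert: it copies each renewed certificate to stable paths of your choosing and reloads Apache afterwards. A sketch (the target paths here are examples, not anything this post used):

```shell
# Sketch: have acme.sh copy renewed certs to stable paths and reload
# Apache after every renewal. The key/fullchain paths are examples -
# pick your own and point Apache's SSL config at them instead.
acme.sh --install-cert -d yourserver.yourdomain --ecc \
  --key-file       /etc/ssl/private/yourserver.yourdomain.key \
  --fullchain-file /etc/ssl/certs/yourserver.yourdomain.fullchain.cer \
  --reloadcmd      "systemctl reload apache2"
```

This is a one-time registration; acme.sh remembers the paths and the reload command for future renewals.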

That is it for the initial server setup, so now would be a good time for a server snapshot or other sort of backup.

Setting up the user for Overpass

For this part, we’re going to follow parts of ZeLonewolf’s guide. I’ve reproduced that mostly as written below, although some of the software was already installed earlier.

sudo su

mkdir -p /opt/op
groupadd op
usermod -a -G op youruseraccount
useradd -d /opt/op -g op -G sudo -m -s /bin/bash op
chown -R op:op /opt/op
apt-get update
apt-get install g++ make expat libexpat1-dev zlib1g-dev apache2 liblz4-dev curl git
a2enmod cgid
a2enmod ext_filter
a2enmod headers

exit

The username that we created above is “op”. We won’t use a password for that but will just use

sudo -u op -i

when we need to change to it from our normal user account.

Configuring Apache

We already have Apache set up with a default HTTPS website that says “It works!”. We’ll use some of what’s in ZeLonewolf’s guide but we DON’T want to completely replace our config with that one. Instead we’ll selectively copy in some sections. Edit the existing file in place:

sudo emacs /etc/apache2/sites-available/default-ssl.conf

Note that we are using https with the defaults and the filename is different to the example.

Find this line:

DocumentRoot /var/www/html

and after it insert this section:

# Overpass API (CGI backend)                                                                          
ScriptAlias /api/ /opt/op/cgi-bin/

<Directory "/opt/op/cgi-bin/">
        AllowOverride None
        Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
        Require all granted

        # CORS for Overpass Turbo                                                                     
        Header always set Access-Control-Allow-Origin "*"
        Header always set Access-Control-Allow-Methods "GET, POST, OPTIONS"
        Header always set Access-Control-Allow-Headers "Content-Type"
</Directory>

# Compression (for API responses)                                                                     
ExtFilterDefine gzip mode=output cmd=/bin/gzip

# Logging                                                                                             
ErrorLog /var/log/apache2/error.log
LogLevel warn
CustomLog /var/log/apache2/access.log combined

# Long-running Overpass queries                                                                       
TimeOut 300

I then deleted a bunch of lines, all comments or functional duplicates of what we had just added, down to but not including:

#   SSL Engine Switch:                                                                                

Save and restart apache:

sudo /etc/init.d/apache2 restart

and check that you can still browse to “https://yourserver.yourdomain”. It won’t look any different as the default website has not been changed; we’ll test the “cgi-bin” parts later.

Compile and Install Overpass

This is drawn directly from ZeLonewolf’s guide. Note that this builds from the release tarball rather than cloning the github repository. At the time of writing the latest version is “v0.7.62.10” so you’ll see that number below.

sudo su op

cd
wget https://dev.overpass-api.de/releases/osm-3s_latest.tar.gz
tar xvzf osm-3s_latest.tar.gz

cd osm-3s_v0.7.62.10/

time ./configure CXXFLAGS="-O2" --prefix=/opt/op --enable-lz4

That took 5s when I ran it. Next:

time make install

That took 9 minutes. Next:

cp -pr cgi-bin ..
cd
chmod -R 755 cgi-bin
mkdir db
mkdir diff
mkdir log
cp -pr osm-3s_v0.7.62.10/rules db

Those three directories created are for the database, minutely diff files and logfiles. In operation, the biggest by far will be “db” - we’ll expect 2.3GB of .pbf extract to create a database of initially 80GB or so. We’ll talk more about this later.

Loading OSM Data

The equivalent section of ZeLoneWolf’s guide is called “Download the Planet”. We don’t actually want to do that - we just want a data extract for our area of interest.

I’ll download a Geofabrik extract in my normal user account and make sure that it is accessible to the “op” user. First, browse to (in my case) https://download.geofabrik.de/europe/britain-and-ireland.html . There is a link there to https://download.geofabrik.de/europe/britain-and-ireland-latest.osm.pbf and a comment that says something like “This file was last modified 22 hours ago and contains all OSM data up to 2026-02-12T21:23:29Z”.

When logged in as youruseraccount:

mkdir ~/data
cd ~/data
time wget https://download.geofabrik.de/europe/britain-and-ireland-latest.osm.pbf

I then moved the file so that the filename contained the timestamp

mv britain-and-ireland-latest.osm.pbf britain-and-ireland_2026-02-12T21:23:29Z.osm.pbf

That is a .pbf format download - that format was introduced to OSM around 2010 and is basically pretty standard now. Unfortunately, Overpass still needs the previously used .bz2 format, but we can convert it:

(in screen)
sudo apt install osmium-tool
time osmium cat britain-and-ireland_2026-02-12T21\:23\:29Z.osm.pbf -o britain-and-ireland_2026-02-12T21\:23\:29Z.osm.bz2

That took around 1 hour 20 minutes (and frustratingly the progress bar looks like it was written by someone from Windows 2000) - don’t cancel it if it appears to be stuck, instead have a look to see if it is actually writing out a file. If you want to verify the resulting file:

(in screen)
time bzip2 --test britain-and-ireland_2026-02-12T21\:23\:29Z.osm.bz2 

That took around 11 minutes for me.

Still as youruseraccount, make the download area browsable by the “op” user:

chmod o+rx ~
chmod o+rx ~/data

If you’re not comfortable with this then you can of course copy or move the file as root later.

Configure launch scripts.

This is based on ZeLonewolf’s guide again, which in turn uses scripts that Kai Johnson wrote.

As the overpass user:

mv bin bin.bak && mkdir bin
git clone --depth=1 https://github.com/ZeLonewolf/better-overpass-scripts.git bin
rm -rf bin/.git

and we’ll need to copy some things from the build into that directory. This will include at least:

cp /opt/op/osm-3s_v0.7.62.10/bin/update_database bin/
cp /opt/op/osm-3s_v0.7.62.10/bin/update_from_dir bin/
cp /opt/op/osm-3s_v0.7.62.10/bin/osm3s_query bin/
cp /opt/op/osm-3s_v0.7.62.10/bin/dispatcher bin/

but I actually copied everything that was missing into the new “bin” directory. We installed “locate” above; if anything has been inadvertently missed you can use e.g. “locate nameofmissingthing” and it will find it. This is a bit messy, and it’d be great to have something that’s a bit more solid and has less of the “porcine face paint applicator” feel to it; but I did not want to go too far down that road as I was trying to set something up “without too much work”.
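The “copy anything that’s missing” step can also be scripted. This is just a convenience sketch of the same manual process (the copy_missing name is mine, not from any of the guides):

```shell
# copy_missing SRC DST: copy any files present in SRC but absent from DST,
# preserving permissions. Never overwrites anything already in DST.
copy_missing() {
  for f in "$1"/*; do
    [ -e "$2/$(basename "$f")" ] || cp -p "$f" "$2"/
  done
}

# Usage (as the op user, from /opt/op):
#   copy_missing osm-3s_v0.7.62.10/bin bin
```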

Change the scripts to work with data extracts and no attic or meta data

We’re going to load a data extract from Geofabrik, and we’d also like to be able to update it with changes as other people update OSM. Normally the workflow that I’d suggest for this sort of thing is to download minutely updates from https://planet.osm.org, use trim_osm.py to snip them down to the area that we’re interested in and then apply those as updates.

By default, Overpass runs with planet.osm.org minutely diffs, but alas I’ve struggled to get those to work with a data extract; the updater falls over when it finds certain sorts of data that it is not expecting (i.e. data that was never originally loaded) in diff files. However, Geofabrik does provide daily diff files that match their extracts, so we can use those instead.

Also, we’re only interested in “now” data - we’re not creating an Overpass server with “attic” data that allows us to query data from back in 2012.

We therefore have to make a bunch of changes to scripts.

startup.sh

In there, we will change “https://planet.openstreetmap.org/replication/minute” to “https://download.geofabrik.de/europe/britain-and-ireland-updates”.

We’ll change --meta=attic to --meta=no because we’re not doing anything with “attic” data.

We’ll remove the --attic from the “dispatcher” call.

apply_osc_to_db.sh

We’ll change EXPECTED_UPDATE_INTERVAL from 57 to 3557 or even longer. We’re expecting files once a day not once a minute, but checking every hour is not too bad.

log file management

There’s a section in ZeLonewolf’s guide that covers this.

Log files will eventually grow large and will eventually need a log rotation mechanism to be set up, but let’s gloss over that for now as I’m eager to see Overpass actually running!

Server automation

See ZeLonewolf’s guide.

I have deliberately not done this yet as I don’t want to automatically do anything; rather I’d like to control it manually so that I can watch that it does what it is supposed to.

Load the data.

(in screen)
time bin/init_osm3s.sh /home/youruseraccount/data/britain-and-ireland_2026-02-12T21\:23\:29Z.osm.bz2 "db/" "./" --meta=no

That took about 77 minutes for me. Lots of files will have been created in “db”. A quick check on disk usage is in order:

df .
Filesystem     1K-blocks     Used Available Use% Mounted on
/dev/sda1      157207480 76407544  74363468  51% /

du -BG db/* | sort -n -r | head
53G     db/nodes.map
9G      db/ways.map
3G      db/ways.bin
3G      db/nodes_meta.bin
3G      db/nodes.bin
2G      db/way_tags_global.bin
2G      db/ways_attic.map
2G      db/nodes_attic.map
1G      db/way_tags_local.bin.idx
1G      db/way_tags_local.bin

It’s worth noting that those are large numbers for an extract. The 2.3GB data extract has created a 53GB nodes.map file. Compression is supported, but I haven’t tested it.

Set up replicate_id

There’s a file in the db directory (which will be created if it does not already exist) that determines the place to start consuming diffs from. These numbers vary by server; the number corresponding to planet.osm.org replication from a certain date will differ from the one for Geofabrik replication for the same date.

In our example we’re using Geofabrik data from 12th Feb 2026. We can browse through https://download.geofabrik.de/europe/britain-and-ireland-updates/ and https://download.geofabrik.de/europe/britain-and-ireland-updates/000/004/ until we find the immediately prior state file https://download.geofabrik.de/europe/britain-and-ireland-updates/000/004/693.state.txt , which contains sequenceNumber=4693. This means that 4693 is our magic number.

We’ll therefore edit the replicate_id file (creating it if it does not exist) and write 4693 (with a linefeed after) to it.
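Parsing that state file can be done from the shell too. A small sketch (the state_sequence name is mine) that just pulls the sequenceNumber value out of a downloaded state file:

```shell
# Print the sequenceNumber from a downloaded replication state file.
state_sequence() {
  sed -n 's/^sequenceNumber=\([0-9][0-9]*\)$/\1/p' "$1"
}

# Usage, for the example in the text:
#   curl -sO https://download.geofabrik.de/europe/britain-and-ireland-updates/000/004/693.state.txt
#   state_sequence 693.state.txt > db/replicate_id
```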

Before we do anything else, now is a good opportunity for another snapshot.

Start overpass

If this isn’t the first time you’ve started overpass you may want to take backup copies of previous “diff” directories or “log” files. Then:

bin/startup.sh

You should see something like this:

[2026-02-15 12:43:14] INFO: Starting Overpass API components...
[2026-02-15 12:43:14] INFO: Starting base_dispatcher...
[2026-02-15 12:43:14] INFO: Cleaning up stale files...
[2026-02-15 12:43:14] INFO: base_dispatcher is running (PID: 107771)
[2026-02-15 12:43:14] INFO: Starting area_dispatcher...
[2026-02-15 12:43:14] INFO: area_dispatcher is running (PID: 107783)
[2026-02-15 12:43:14] INFO: Starting apply_osc...
[2026-02-15 12:43:14] INFO: apply_osc is running (PID: 107795)
[2026-02-15 12:43:14] INFO: Starting fetch_osc...
[2026-02-15 12:43:14] INFO: fetch_osc is running (PID: 107835)
[2026-02-15 12:43:14] INFO: Performing final verification...
[2026-02-15 12:43:16] INFO: base_dispatcher verified (PID: 107771)
[2026-02-15 12:43:17] INFO: area_dispatcher verified (PID: 107783)
[2026-02-15 12:43:17] INFO: apply_osc verified (PID: 107795)
[2026-02-15 12:43:17] INFO: fetch_osc verified (PID: 107835)
[2026-02-15 12:43:17] INFO: All Overpass components started successfully

[2026-02-15 12:43:17] INFO: === Process Status ===
  base_dispatcher      PID: 107771
  area_dispatcher      PID: 107783
  apply_osc            PID: 107795
  fetch_osc            PID: 107835

In the directories below “diff”, you should see that it has downloaded daily diffs for any days since your extract, for example:

  /opt/op/diff/000/004: (56 GiB available)
  drwxrwxr-x 2 op op    4096 Feb 16 01:07 .
  -rw-rw-r-- 1 op op 3874289 Feb 16 01:07 697.osc.gz
  -rw-rw-r-- 1 op op     113 Feb 16 01:07 697.state.txt
  -rw-rw-r-- 1 op op 3033325 Feb 15 12:43 696.osc.gz
  -rw-rw-r-- 1 op op 3405594 Feb 15 12:43 695.osc.gz
  -rw-rw-r-- 1 op op 3057997 Feb 15 12:43 694.osc.gz
  -rw-rw-r-- 1 op op     113 Feb 15 12:43 695.state.txt
  -rw-rw-r-- 1 op op     113 Feb 15 12:43 696.state.txt
  -rw-rw-r-- 1 op op     113 Feb 15 12:43 694.state.txt
  drwxrwxr-x 3 op op    4096 Feb 15 12:43 ..

In “log” you should see something like:

  /opt/op/log: (56 GiB available)
  -rw-rw-r--  1 op op  12111701 Feb 16 23:39 apply_osc_to_db.out
  drwxr-xr-x 13 op op      4096 Feb 16 20:14 ..
  drwxrwxr-x  2 op op      4096 Feb 15 12:43 .
  -rw-rw-r--  1 op op         0 Feb 15 12:43 osm_base.out
  -rw-rw-r--  1 op op         0 Feb 14 14:26 fetch_osc.out
  -rw-rw-r--  1 op op         0 Feb 14 14:26 areas.out

Testing standalone

At the command line type:

bin/osm3s_query

Paste in this:

<query type="nwr"><bbox-query n="51.96" s="51.86" w="-3.31" e="-3.22"/><has-kv k="amenity" v="pub"/></query><print/>

Press return. Press ^d. A selection of data will be returned.

Testing from Overpass Turbo

In a web browser, browse to https://overpass-turbo.eu/s/2kEW .

Click “settings”. Change “server” from “https://overpass-api.de/api/” to “https://yourserver.yourdomain/api/”. Click “run”. You should not get an error, and should get a couple of nodes and 4 ways returned.

For the avoidance of doubt - if you browse to “https://yourserver.yourdomain/” you’ll get some sort of “It works!” page. If you browse to “https://yourserver.yourdomain/api/” you’ll actually get an error - it’s designed to be accessed (see the CORS settings above) by Overpass Turbo, not a regular browser.
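Scripted access works the same way Overpass Turbo's does: POST the query to the interpreter endpoint as a “data” form field. A minimal standard-library sketch, assuming the usual “/api/interpreter” path and reusing the “yourserver.yourdomain” placeholder from above; the request is built but only sent if you uncomment the last lines:

```python
import urllib.parse
import urllib.request


def build_overpass_request(api_base: str, query: str) -> urllib.request.Request:
    """Build a POST request for an Overpass interpreter endpoint.

    Overpass expects the query in a form field called 'data'.
    """
    body = urllib.parse.urlencode({"data": query}).encode()
    return urllib.request.Request(api_base.rstrip("/") + "/interpreter", data=body)


req = build_overpass_request(
    "https://yourserver.yourdomain/api/",
    '[out:json];nwr["amenity"="pub"](51.86,-3.31,51.96,-3.22);out;',
)
print(req.get_method(), req.full_url)
# with urllib.request.urlopen(req) as resp:   # uncomment to actually query
#     print(resp.read()[:200])
```

This is also an easy way to confirm that the CORS configuration only affects browsers: a plain POST from a script needs no special headers.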

Now what?

Shutting everything down and taking a snapshot of the server is a good idea at this point. The long-term cost of snapshots is small (€0.20 per month or so). The cost of leaving a server of this specification running 24x7 isn’t that large either - around €10 per month, perhaps a couple of beers or a couple of fancy coffees.

You might also want to think about setting up an Overpass server that does include metadata and attic data - but you’re probably better off with a dedicated server for that, and better off following one of the other guides linked above.

Edit: Minor clarification re use of Overpass API URL following a question on IRC.

Monday, 16. February 2026

OpenStreetMap User's Diaries

Lincolnshire Flood Emergency Routes Out

Lincolnshire ER OUT Routes

Hello! This is my first diary entry, and I wanted to dedicate it to the forum post that I made about the UK's only (I believe) ER OUT routes, for use in emergencies: mainly flooding, in this case.

Overview

After major flooding in 2013 the council created the Lincolnshire ER Routes to enable people to quickly evacuate from the flood areas. Many of you may have driven past these and never even noticed! They are red rectangular signs with “ER out” in white text and a direction to follow. They are placed at every turn, so evacuees follow the road ahead until a sign says otherwise.

Example Sign

https://www.geograph.org.uk/photo/6485754

Route End

The end of the route signifies that the evacuees are clear of the major flood risk and (presumably) there would be further guidance at the end of the route. The route end sign is the same as the direction signs, but it features five black diagonal lines.

https://www.geograph.org.uk/photo/6033649

OSM Mapping

In the forum post I have included some proposed tags, along with Insert User, who has suggested some changes to the signage.

I will be unable to fully map these routes out, as I rarely venture to the south of Lincolnshire. If you live near one of these routes, please do help to map them! I presume it will take a while to map all of the routes, but I think it will be worth it in the event of any flooding within the region!

Please do not hesitate to contribute to the forum post!


E65 CENTRAL GREECE

In applications built on OpenStreetMap (certainly in Mapy, OSMand and Organic Maps, perhaps elsewhere), the E65 is WRONGLY shown as the old Lamia-Domokos-Farsala-Larisa road, and not CORRECTLY as the Thermopylae-Kalambaka motorway (unfinished further north, up to the junction with the Egnatia). Especially for foreign travellers this is hugely confusing.


FOSSGIS e.V. / OSM Germany

Only a few weeks until FOSSGIS 2026 in Göttingen - anticipation is building

The FOSSGIS Conference 2026 will take place from 25-28 March 2026 in Göttingen and online. Only a few weeks remain until the conference; anticipation is growing steadily and preparations are in full swing!

The conference is organised by the non-profit FOSSGIS e.V. and the OpenStreetMap community in cooperation with the Institute of Geography of the Georg-August-Universität Göttingen, and takes place on the campus of the University of Göttingen.

Once again, great interest in the conference is emerging this year, and registrations are increasing week by week. Fortunately, the Zentrales Hörsaalgebäude of the University of Göttingen offers plenty of space, so this could become the largest FOSSGIS conference to date.

FOSSGIS Conference 2026 Göttingen

FOSSGIS 2026 programme and schedule

The FOSSGIS team is again looking forward to an exciting programme with numerous talks, expert Q&A sessions, demo sessions, BoFs and user meetings, as well as 28 workshops. The conference programme runs from Wednesday to Friday in the Zentrales Hörsaalgebäude (ZHG) of the University of Göttingen. On Saturday, the OSM Saturday and the community sprint take place at the Faculty of Geosciences and Geography on the Nordcampus.

https://www.fossgis-konferenz.de/2026/programm/

This year the conference already starts on Tuesday, 24 March 2026, from 10:00, with longer workshops (180 minutes). Choose from 7 workshops (see the programme) and arrive on Tuesday. The workshops address beginners as well as advanced users, and places are still available. Feel free to book a workshop and use the chance to build up knowledge on a topic in a short time.

FOSSGIS connects - user meetings and community sprint

Around and during the conference there are numerous opportunities to network. Catering during the breaks, combined with the company and poster exhibitions, takes place in the foyer of the ZHG, as does the evening event on the first conference day. Opportunities for professional networking arise at the user meetings, expert Q&A sessions and other community sessions; online participation is possible. https://www.fossgis-konferenz.de/2026/socialevents/

A rich supporting programme

This year we look forward to a varied supporting programme with exciting excursions and meetings at interesting locations around Göttingen. FOSSGIS also stands for networking, which is already possible on Tuesday evening: the Geochicas invite you to a meetup, and the unofficial start, a joint dinner (at your own expense), welcomes all conference participants who have already arrived.

All information can be found at https://www.fossgis-konferenz.de/2026/socialevents/

FOSSGIS Conference 2026 sponsors

Many thanks to the sponsors of the conference, whose support contributes significantly to financing the event. Become a FOSSGIS sponsor yourself; we are happy about further support. Information can be found at https://fossgis-konferenz.de/2026/#Sponsoring

FOSSGIS - a team event

FOSSGIS thrives on volunteer commitment: numerous helpers get involved and take on a wide variety of tasks before and during the conference. Many thanks for that!

Helpers are still being sought, in particular for chairing sessions, supporting the speakers in the lecture hall, and catering; see https://www.fossgis-konferenz.de/2026/helfen/.

OSM Saturday and community sprint

On Saturday, 28 March 2026, the OSM Saturday and the community sprint will take place in the rooms of the Institute of Geography at Goldschmidtstr. 3-5, 37073 Göttingen - an opportunity to get talking, to pitch in at the community sprint, or to build up know-how. Everyone is welcome to take part: https://pretalx.com/fossgis2026/talk/VVYN7A/.

Stay informed about the conference

Information about FOSSGIS can be found under the hashtag #FOSSGIS2026. We use the hashtag #FOSSGIS2026 for updates on social media; please use it too, to connect the social media activities.

Archive of FOSSGIS conferences

In the FOSSGIS archive you can find the homepages of past conferences, including programmes and videos: https://fossgis-konferenz.de/liste.html.

The FOSSGIS team 2026 wishes everyone a safe journey and looks forward to an exciting conference in Göttingen.