User:Barnes38/Tools


(2014) Tools/Outils

(2024) Annoncer

(2023) Adresses

(2018) Datatrucs

(2023) OSRM

(2023) VROOM

OpenRouteService

Valhalla


MapBox


OpenData

bicycle_repair_station


Pistes Cyclables


Photos dans JOSM

Signalisation routière

Panoramax

Podcasts

Articles

Vidéos

JOSM

JOSM plugins to install:

  • Photoadjust
  • Photo_geotagging

Then add the heading to the photos, realign them if necessary, and finally save them, which updates the EXIF data.
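To check the result outside JOSM: the heading ends up in the GPSImgDirection EXIF tag. A minimal sketch (not part of the original workflow) using the piexif library, assuming it is installed and that the photo already carries a GPS IFD:

   # Sketch only: inspect / set the EXIF heading tag (GPSImgDirection) that JOSM writes.
   # Assumption: the piexif library is installed and the photo already has a GPS IFD.
   import piexif

   def get_heading(path):
       exif = piexif.load(path)
       value = exif["GPS"].get(piexif.GPSIFD.GPSImgDirection)
       if value is None:
           return None
       num, den = value
       return num / den  # heading in degrees from true north

   def set_heading(path, degrees):
       exif = piexif.load(path)
       # Stored as a rational number (here: hundredths of a degree), relative to true north ("T")
       exif["GPS"][piexif.GPSIFD.GPSImgDirection] = (int(degrees * 100), 100)
       exif["GPS"][piexif.GPSIFD.GPSImgDirectionRef] = b"T"
       piexif.insert(piexif.dump(exif), path)

   print(get_heading("IMG_20231007_175210.jpg"))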

update geovisio_cli

pip install --upgrade geovisio_cli

First attempt at uploading a sequence of 50 photos

$ geovisio upload --api-url https://panoramax.openstreetmap.fr/ /home/paul/Panoramax/2023-10-07/BergesIsère/
🔭 Your computer is not yet authorized against GeoVisio API https://panoramax.openstreetmap.fr/. To authenticate, please either go to the URL below, or scan the QR code below https://panoramax.openstreetmap.fr/api/auth/tokens/5a33454b-5250-4e67-ba73-e63dff725550/claim

$ geovisio upload --api-url https://panoramax.openstreetmap.fr/ /home/paul/Panoramax/2023-10-07/BergesIsère/
👤 Using stored credentials, logged in as barnes38

🔍 Listing pictures... ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 100% 0:00:00

🗂 All pictures belong to a single sequence

📡 Uploading sequence "BergesIsère" (part 1/1)
- Folder: /home/paul/Panoramax/2023-10-07/BergesIsère
✅ Created collection https://panoramax.openstreetmap.fr/api/collections/1a4240dc-1cb9-44a5-999e-ef1d321553e9
🚀 Uploading pictures... ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 0:01:19 [50/50]
📷 Processing IMG_20231007_175210.jpg ..
╭─ Errors ──╮
│ No errors │
╰───────────╯
🎉 50 pictures uploaded
Note: You can follow the picture processing with the command:
geovisio collection-status --wait --location https://panoramax.openstreetmap.fr/api/collections/1a4240dc-1cb9-44a5-999e-ef1d321553e9



$ geovisio collection-status --wait --location https://panoramax.openstreetmap.fr/api/collections/1a4240dc-1cb9-44a5-999e-ef1d321553e9
Sequence BergesIsère produced by barnes38 taken with Fairphone FP4
┏━━━━━━━┳━━━━━━━┳━━━━━━━━━┳━━━━━━━━━━━┳━━━━━━━━┓
┃ Total ┃ Ready ┃ Waiting ┃ Preparing ┃ Broken ┃
┡━━━━━━━╇━━━━━━━╇━━━━━━━━━╇━━━━━━━━━━━╇━━━━━━━━┩
│ 50 │ 0 │ 50 │ 0 │ 0 │
└───────┴───────┴─────────┴───────────┴────────┘
🔭 Waiting for pictures to be processed by geovisio
⏳ Processing ... ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 0:01:26 0/50 • 0 picture currently processed
⏳ Processing ... ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 0:02:33 0/50 • 0 picture currently processed
⏳ Processing ... ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 0:02:34 0/50 • 0 picture currently processed
⏳ Processing ... ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 0:02:41 0/50 • 0 picture currently processed


$ geovisio collection-status --wait --location https://panoramax.openstreetmap.fr/api/collections/1a4240dc-1cb9-44a5-999e-ef1d321553e9
Sequence BergesIsère produced by barnes38 taken with Fairphone FP4
┏━━━━━━━┳━━━━━━━┳━━━━━━━━━┳━━━━━━━━━━━┳━━━━━━━━┓
┃ Total ┃ Ready ┃ Waiting ┃ Preparing ┃ Broken ┃
┡━━━━━━━╇━━━━━━━╇━━━━━━━━━╇━━━━━━━━━━━╇━━━━━━━━┩
│ 50 │ 50 │ 0 │ 0 │ 0 │
└───────┴───────┴─────────┴───────────┴────────┘
🎉 50 pictures processed


Result


Photo backup directory

cd /home/paul/ASSOS/OSM/2023/Panoramax

IGN Panoramax instance

Forum GeoCommuns

https://forum.geocommuns.fr/

scrcpy

geocode

Canvas


BRouter

1

2

Northwest France: N45W005, N45W000
  http://brouter.de/brouter/segments4/W5_N45.rd5
  http://brouter.de/brouter/segments4/E0_N45.rd5
Northeast France: N45E005, N45E010
  http://brouter.de/brouter/segments4/E5_N45.rd5
  http://brouter.de/brouter/segments4/E10_N45.rd5
Southwest France: N40W005, N40W000
  http://brouter.de/brouter/segments4/W5_N40.rd5
  http://brouter.de/brouter/segments4/E0_N40.rd5
Southeast France: N40E005, N40E010
  http://brouter.de/brouter/segments4/E5_N40.rd5
  http://brouter.de/brouter/segments4/E10_N40.rd5

for a in W5_N45.rd5 E0_N45.rd5 E5_N45.rd5 E10_N45.rd5 W5_N40.rd5 E0_N40.rd5 E5_N40.rd5 E10_N40.rd5
do
  echo $a
  wget http://brouter.de/brouter/segments4/${a}
done
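
The rd5 segment files are tiled on a 5° grid and named after the south-west corner of each tile. A small helper sketch (illustration only, inferred from the list above, not part of BRouter) to find the tile covering a given point:

   # Sketch: which BRouter rd5 segment file covers a given point.
   # Tiles are 5 degrees wide and named after their south-west corner
   # (e.g. E5_N45.rd5 covers longitudes 5..10 E and latitudes 45..50 N).
   import math

   def rd5_name(lon, lat):
       lon0 = int(math.floor(lon / 5.0)) * 5
       lat0 = int(math.floor(lat / 5.0)) * 5
       ew = "W" if lon0 < 0 else "E"
       ns = "S" if lat0 < 0 else "N"
       return f"{ew}{abs(lon0)}_{ns}{abs(lat0)}.rd5"

   print(rd5_name(5.72, 45.19))   # Grenoble -> E5_N45.rd5
   print(rd5_name(-1.55, 47.22))  # Nantes   -> W5_N45.rd5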


3

cd /home/paul/brouter/abrensch/brouter
docker run --rm -v ./misc/segments4:/segments4 -v ./misc/profiles2:/profiles2 -p 17777:17777 --name brouter brouter
# wget "http://localhost:17777/brouter?lonlats=4.8357,45.7640%7C4.8357,45.7640&profile=trekking&alternativeidx=0&format=gpx"
wget -O output "http://localhost:17777/brouter?lonlats=4.8357,45.7640%7C4.8357,45.7640&profile=trekking&alternativeidx=0&format=gpx"

4 v0.1 quick&dirty

Explanations

 - continue the PoC
 -- try bike profiles: safety/shortest
 -- compute statistics and derive (how?) a global index (one possible sketch follows this list)
 -- run this on several metropolitan areas
 - improve things overall
 -- improve the code
 -- put everything in the container
 -- put it in git
 -- set up integration tooling
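
A possible shape for that global index (just a sketch of one idea; the "how" is still open above): route each line's termini with two bike profiles and average the detour ratio. Only the "track-length" field name comes from the BRouter responses used further down; the rest of the structure is hypothetical.

   # Sketch of one possible global index: compare, for each line, the length of a
   # "safety" route with the "shortest" route between its termini, then average the
   # detour ratio over all lines of a city. The 'lines' structure is hypothetical.
   def detour_ratio(safety_stats, shortest_stats):
       return float(safety_stats["track-length"]) / float(shortest_stats["track-length"])

   def global_index(lines):
       ratios = [detour_ratio(line["safety"], line["shortest"]) for line in lines]
       return sum(ratios) / len(ratios) if ratios else None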

Useful Overpass queries

Geography

Grenoble-Alpes Métropole https://overpass-turbo.eu/s/1QvX

   boundary = local_authority
   local_authority:FR = metropole
   name = Grenoble-Alpes Métropole
   ref:FR:SIREN = 200040715
   short_name = La Métro
   type = boundary
   website = https://www.lametro.fr/
   wikidata = Q999238
   wikipedia = fr:Grenoble-Alpes Métropole

Saint-Étienne Métropole https://overpass-turbo.eu/s/1Qwu

   boundary = local_authority
   local_authority:FR = metropole
   name = Saint-Étienne Métropole
   ref:FR:SIREN = 244200770
   short_name = SEM
   type = boundary

Auvergne-Rhone-Alpes https://overpass-turbo.eu/s/1Qws

   ISO3166-2 = FR-ARA
   admin_level = 4
   boundary = administrative
   name = Auvergne-Rhône-Alpes
   type = boundary

Saint-Brieuc Armor Agglomération https://overpass-turbo.eu/s/1QNq

   boundary = local_authority
   local_authority:FR = CA
   name = Saint-Brieuc Armor Agglomération
   type = boundary

Saint-Brieuc Agglomération - Baie d'Armor https://overpass-turbo.eu/s/1QNr

   disused:boundary = local_authority
   disused:local_authority:FR = CA
   end_date = 2016-12-31
   population = 113801
   type = boundary

Wiki local_authority:FR https://wiki.openstreetmap.org/wiki/FR:Key:local_authority:FR
All the communautés de communes in Isère https://overpass-turbo.eu/s/1Qw2
All the communautés d'agglomération in France https://overpass-turbo.eu/s/1Qw0
All the communautés urbaines in France https://overpass-turbo.eu/s/1Qw1
All the métropoles in France https://overpass-turbo.eu/s/1QvZ
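
The shortlinks above only store the query; the same kind of query can also be sent straight to the Overpass API. A minimal sketch (assuming the public endpoint https://overpass-api.de/api/interpreter) that fetches the Grenoble-Alpes Métropole relation by its SIREN number, using the tags listed above:

   # Sketch: run the same kind of query as the saved shortlinks directly against the Overpass API.
   # Assumption: the public endpoint https://overpass-api.de/api/interpreter is acceptable here.
   import requests

   query = """
   [out:json][timeout:60];
   relation["boundary"="local_authority"]["local_authority:FR"="metropole"]["ref:FR:SIREN"="200040715"];
   out tags;
   """
   r = requests.post("https://overpass-api.de/api/interpreter", data={"data": query})
   r.raise_for_status()
   for element in r.json()["elements"]:
       print(element["tags"].get("name"))   # -> Grenoble-Alpes Métropole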

Transport networks

Grenoble
Saint-Étienne, Clermont-Ferrand, Nancy, Saint-Brieuc

What would need to be done to do the same in another city

The current dependencies are:

tests

tests1

$ cd /home/paul/brouter/osmium
$ python3 transports7offserialized.py >output-fastbike-verylowtraffic2 2>&1
$ python3 transports8Grenoble.py > output-Grenoble 2>&1 OK
$ python3 transports8SaintEtienne.py > output-SaintEtienne 2>&1 KO
$ python3 transports8ClermontFerrand.py > output-ClermontFerrand 2>&1 KO
$ python3 transports8Nancy.py > output-Nancy 2>&1 OK

tests2

$ cd /home/paul/brouter/osmium/py
$ python3 transports8Grenoble.py > ../outputs/output-Grenoble 2>&1 OK
$ python3 transports8Nancy.py > ../outputs/output-Nancy 2>&1 OK
$ python3 transports8SaintEtienne.py > ../outputs/output-SaintEtienne 2>&1 KO
$ python3 transports8ClermontFerrand.py > ../outputs/output-ClermontFerrand 2>&1 KO


tests3

$ cd /home/paul/brouter/osmium/py
$ python3 transports8.py

  1. Takes the input-data.yml file and generates ../outputs/output-XXXX (see the reading sketch below)
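
The layout of input-data.yml is not reproduced in these notes; a minimal reading sketch with PyYAML, where the keys name, osm_file and network are purely hypothetical:

   # Sketch only: the real input-data.yml layout is not shown in these notes;
   # the keys below (name, osm_file, network) are hypothetical.
   import yaml

   with open("input-data.yml") as f:
       cities = yaml.safe_load(f)

   for city in cities:
       output_path = f"../outputs/output-{city['name']}"
       print(city["name"], city["osm_file"], city["network"], "->", output_path)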

tests 4

$ cd /home/paul/brouter/osmium/py
$ python3 transports9.py

  1. Takes the input-data.yml file and generates ../outputs/output-XXXX
  2. Tries to optimise how the node lon/lat are read, by only fetching the nodes that are necessary


tests 11 ok

$ cd /home/paul/brouter/osmium/py
$ python3 transports11.py

  1. Takes the input-data.yml file and generates ../outputs/output-XXXX
  2. Refactoring
  3. Tries to optimise how the node lon/lat are read, by only fetching the nodes that are necessary (see the sketch after this list)
  4. OK
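
One way to implement that "only fetch the nodes which are necessary" idea (a sketch of the approach, not necessarily the code of transports11.py, using the same pyosmium SimpleHandler API as the script below): first collect the stop node ids from the route relations, then make a second pass that stores coordinates only for those ids.

   # Sketch of the two-pass idea: pass 1 collects the ids of the 'stop' members of the
   # route relations, pass 2 stores lon/lat only for those ids instead of keeping every node.
   import osmium as osm

   class StopCollector(osm.SimpleHandler):
       def __init__(self, network):
           super().__init__()
           self.network = network
           self.wanted = set()
       def relation(self, r):
           if r.tags.get('network') == self.network and r.tags.get('type') == 'route':
               self.wanted.update(m.ref for m in r.members if m.role == 'stop')

   class NodeLocator(osm.SimpleHandler):
       def __init__(self, wanted):
           super().__init__()
           self.wanted = wanted
           self.lonlat = {}
       def node(self, n):
           if n.id in self.wanted:
               self.lonlat[n.id] = (n.location.lon, n.location.lat)

   osm_file = "planet_5.487,44.962_6.106,45.315.osm"
   stops = StopCollector('TAG')
   stops.apply_file(osm_file)
   nodes = NodeLocator(stops.wanted)
   nodes.apply_file(osm_file)
   print(f"{len(nodes.lonlat)} stop nodes located")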

24 -rw-rw-r-- 1 paul paul 21911 sept. 15 10:50 Grenoble_output.json
12 -rw-rw-r-- 1 paul paul 8234 sept. 15 10:51 Nancy_output.json
16 -rw-rw-r-- 1 paul paul 13278 sept. 15 10:52 Saint-Brieuc_output.json

code

   import osmium as osm
   from shapely.geometry import Polygon, Point
   import xml.etree.ElementTree as ET
   import requests
   import json
   import pickle
   import os
   import re
   osm_file_name = 'planet_5.487,44.962_6.106,45.315.osm'
   serialized_file_name = 'planet_5.487,44.962_6.106,45.315.pkl'
   # Parse the whole .osm file with ElementTree and return every <node> element
   def compute_allnodes(osm_file):
       print(f"starting parsing all nodes")
       tree = ET.parse(osm_file)
       root = tree.getroot()
       allnodes = root.findall('node')
       print(f"done parsing all nodes")
       return allnodes
   def save_allnodes(allnodes, filename):
       with open(filename, 'wb') as f:
           pickle.dump(allnodes, f)
   def load_allnodes(filename):
       with open(filename, 'rb') as f:
           return pickle.load(f)
       
   if os.path.exists(serialized_file_name):
       # Load 'allnodes' from the file if it exists
       print(f"starting loading all nodes from file")
       allnodes = load_allnodes(serialized_file_name)
       print("done loading all nodes from file")
   else:
       # Compute 'allnodes' and save it to the file
       allnodes = compute_allnodes(osm_file_name)
       save_allnodes(allnodes, serialized_file_name)


   # Ask the local BRouter instance for a route between two points and return
   # its length and ascent figures taken from the GeoJSON properties
   def fetch_and_parse_geojson(lon1, lat1, lon2, lat2, profile):
       url = f"http://localhost:17777/brouter?lonlats={lon1},{lat1}%7C{lon2},{lat2}&profile={profile}&alternativeidx=0&format=geojson"
       response = requests.get(url)
       if response.status_code == 200:
           json_data = response.text
           json_data = re.sub(r"'", '"', json_data)
           # print(f"json_data: {json_data}")
           data = json.loads(str(json_data))
           # Extract the required properties
           properties = data['features'][0]['properties']
           mydata = {
               "track-length": properties["track-length"],
               "filtered ascend": properties["filtered ascend"],
               "plain-ascend": properties["plain-ascend"]
           }
           return mydata
       else:
           print(f"error with this url: {url}")
           return {
               "track-length": "unknown",
               "filtered ascend": "unknown",
               "plain-ascend": "unknown"
           }


   # Linear scan over all parsed <node> elements (slow; see the two-pass sketch in "tests 11" above)
   def get_node_lon_lat(node_id):
       for node in allnodes:
           if node.attrib['id'] == str(node_id):
               lon = node.attrib['lon']
               lat = node.attrib['lat']
               return lon, lat
       return None


   # Return the relation id and the lon/lat of its first and last 'stop' members
   def nodesStartEnd(relation):
       # print(f"relation {relation}")
       # print(f"relation.id {relation['id']}")
       # print(f"relation.tags {relation['tags']}")
       # print(f"relation.members {relation['members'][0][0]} {relation['members'][-1][0]}")
       nodesStartEnd = {
           'id': relation['id'],
           'lonlatstart': get_node_lon_lat(relation['members'][0][0]), 
           'lonlatend': get_node_lon_lat(relation['members'][-1][0])
       }
       return nodesStartEnd
   # Handler keeping the route relations of the 'TAG' network (Grenoble transit), with only their 'stop' members
   class Handler(osm.SimpleHandler):
       def __init__(self):
           super(Handler, self).__init__()
           self.relations = []
       def relation(self, r):
           if 'network' in r.tags and r.tags['network'] == 'TAG' \
               and 'type' in r.tags and r.tags['type'] == 'route':
               # print(f"Line: {r.tags.get('name')} {r.tags.get('from')} {r.tags.get('to')} ")
               # Make a deep copy of the relation object
               only_stop_members = [member for member in r.members if member.role == "stop"]
               rcopy = {
                   'id': r.id,
                   'tags': {tag.k: tag.v for tag in r.tags},
                   'members': [(m.ref, m.role, m.type) for m in only_stop_members]
               }
               # print(f"rcopy {rcopy} ")
               nodes = nodesStartEnd(rcopy) 
               # print(f"nodesStartEnd {nodesStartEnd}")
               relationKept = { 
                   'Name': r.tags.get('name'), 
                   'From': r.tags.get('from'), 
                   'To': r.tags.get('to'), 
                   'NodesStartEnd': nodes
               }
               # print(f"relationKept {relationKept}")
               self.relations.append(relationKept)
   # Initialize the handler
   handler = Handler()
   # Apply the handler to the local .osm extract
   handler.apply_file(osm_file_name)
   print(f"handler.relations size= {len(handler.relations)}")
   profile = 'trekking'
   for relation2 in handler.relations:
       print(f"relation2: {relation2.get('Name')}\n\t(From: {relation2.get('From')}\n\tTo: {relation2.get('To')}\n\tNodesStartEnd: {relation2.get('NodesStartEnd')}")
       lon1 = relation2.get('NodesStartEnd')['lonlatstart'][0]
       lat1 = relation2.get('NodesStartEnd')['lonlatstart'][1]
       lon2 = relation2.get('NodesStartEnd')['lonlatend'][0]
       lat2 = relation2.get('NodesStartEnd')['lonlatend'][1]
       geojson_data = fetch_and_parse_geojson(lon1, lat1, lon2, lat2, profile)
       print(f"longueur et denivelés: {geojson_data}")
   print(f"handler.relations size= {len(handler.relations)}")

LidarHD

Umap

Wikipedia

OSM Toulouse

Organic Maps = vector rendering, with the rendering computed from the user's preferences
Cartes.map = vector rendering as well
StreetComplete: translate new quests