Compare commits

..

723 commits

Author SHA1 Message Date
10268a02b2 Update README.md
2025-03-26 20:47:46 +08:00
6259e00005 Update README.md
2025-03-26 20:43:22 +08:00
153f0a3777 Update README.md
2025-03-26 20:42:39 +08:00
4d04047def Update README.md
2025-03-26 19:19:42 +08:00
73b75da8ee Update README.md
2025-03-26 19:18:23 +08:00
2b4c9d8fad Update README.md
2025-03-26 19:14:28 +08:00
24b4f6e43a Update README.md
2025-03-26 18:20:54 +08:00
vcoppe
fade673ac7
Update server.sh 2025-03-09 11:10:59 +01:00
vcoppe
23c19d5014
Update mtb.brf 2024-10-04 17:28:21 +02:00
vcoppe
a3090970ac
Update MTB.brf 2024-10-04 17:27:29 +02:00
vcoppe
c0a48f8f05 update routing profiles 2024-10-04 17:27:20 +02:00
vcoppe
71889ba9fd add gravel private profile and update jar version 2024-07-23 18:28:20 +02:00
vcoppe
c616bccabe
Merge branch 'abrensch:master' into master 2024-07-23 17:11:26 +02:00
afischerdev
e63cc9888f
Merge pull request #717 from afischerdev/update-android
Update Android Version
2024-07-23 10:18:31 +02:00
afischerdev
15bf08aaef prepared the version change 2024-07-21 11:31:14 +02:00
afischerdev
e46444cf57 updated revision doc 2024-07-13 09:46:58 +02:00
afischerdev
b234d48c00 changed to android 34 2024-07-13 09:43:49 +02:00
afischerdev
e379b7abb0 changed android tests for "deprecation" 2024-07-12 18:37:03 +02:00
afischerdev
f9c6ad1ae8
Merge pull request #711 from jmizv/master
Added port mapping for docker run command
2024-07-12 10:56:58 +02:00
afischerdev
1f2f655863 changed printStackTrace to log 2024-07-12 10:34:47 +02:00
afischerdev
f289b0cd83 suppressed "deprecation" 2024-07-12 10:25:52 +02:00
afischerdev
8d22a2d0eb added new gradle app name 2024-07-12 09:56:14 +02:00
afischerdev
b1e9208be6 added compiler params 2024-07-12 09:54:57 +02:00
afischerdev
dec6cc8ba0 changed gradle lib versions 2024-07-12 09:53:54 +02:00
afischerdev
c631714c1f changed Android version 2024-07-12 09:51:35 +02:00
afischerdev
9dcb7ca92e
Merge pull request #713 from afischerdev/car-712
Preparing for version 1.7.6
2024-06-21 09:35:50 +02:00
afischerdev
928bd0e28f preparing for version 1.7.6 2024-06-20 11:09:53 +02:00
afischerdev
77e9bd316b keep btools classes in proguard 2024-06-20 11:08:08 +02:00
Alex
2b3bbca448
Added port mapping for docker run command
It probably makes it easier for others following the instructions to have the port mapping ready. Also, use an explicit name for the container instead of one randomly generated by Docker.
2024-06-17 22:14:14 +02:00
afischerdev
6c69e9cea4
Merge pull request #707 from afischerdev/new-apk
Preparing for version 1.7.5
2024-06-05 14:26:36 +02:00
afischerdev
8f50671b98 Preparing for version 1.7.5 2024-06-03 20:04:34 +02:00
afischerdev
e011343ab0
Merge pull request #705 from afischerdev/new-apk
Prevent exceptions when importing profiles
2024-06-03 17:31:22 +02:00
afischerdev
2a77f71c85 updated doc 2024-05-29 18:38:31 +02:00
afischerdev
646f805b99 protect exception 2024-05-29 18:36:51 +02:00
afischerdev
be0aa77ee8
Merge pull request #703 from afischerdev/new-apk
Add a check for the nogo list
2024-05-29 17:48:59 +02:00
afischerdev
cbf172656b updated doc for publishing 2024-05-28 10:13:35 +02:00
afischerdev
7001c4cbc7 check for nogolist 2024-05-22 17:54:44 +02:00
afischerdev
73e7873583
Merge pull request #697 from quaelnix/remove-traffic-simulation
Remove unused traffic simulation code
2024-05-15 09:58:06 +02:00
afischerdev
0b6500cdad
Merge pull request #700 from mjaschen/task/update-geojson-mime-type
Update MIME type for GeoJSON responses
2024-05-15 09:43:21 +02:00
abrensch
a6611ed303
Merge pull request #701 from mjaschen/task/iso8601-compatible-log-timestamps
ISO8601 compatible timestamps in log output
2024-05-15 08:04:17 +02:00
Marcus Jaschen
4e858f5e49 ISO8601 compatible timestamps in log output
This fixes #699.

**Warning:** this change breaks backward compatibility,
e.g. for log-parsing tool chains.
2024-05-15 07:59:18 +02:00
Marcus Jaschen
6d7b8f0d77 Update MIME type for GeoJSON responses
The MIME type for GeoJSON registered with IANA is application/geo+json,
replacing the old value application/vnd.geo+json. The change was published
with RFC 7946 in 2016.

Example request:

`GET /brouter?lonlats=13.377485,52.516247%7C13.351221,52.515004&profile=trekking&alternativeidx=0&format=geojson HTTP/1.1`

Example response headers:

```
HTTP/1.1 200 OK
Content-Encoding: gzip
Content-Disposition: attachment; filename="brouter.geojson"
Access-Control-Allow-Origin: *
Connection: close
Content-Type: application/geo+json; charset=utf-8
```

References:

- https://www.iana.org/assignments/media-types/application/vnd.geo+json
- https://www.iana.org/assignments/media-types/application/geo+json
- https://datatracker.ietf.org/doc/html/rfc7946#section-12
2024-05-14 21:03:47 +02:00
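The MIME-type change described above boils down to swapping a single constant; a minimal sketch follows (the class and method names are hypothetical, not BRouter's actual code):

```java
// Sketch of the change described in the commit: serve GeoJSON responses with
// the media type registered in RFC 7946 instead of the obsolete vendor type.
public class GeoJsonMime {
    // obsolete media type, registered before RFC 7946 (2016)
    static final String LEGACY_TYPE = "application/vnd.geo+json";
    // current IANA-registered media type for GeoJSON
    static final String GEOJSON_TYPE = "application/geo+json";

    /** Value for the Content-Type response header, as in the example above. */
    static String contentTypeHeader() {
        return GEOJSON_TYPE + "; charset=utf-8";
    }
}
```

Keeping the legacy constant around (even if unused) documents what older clients may still send in their Accept headers.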
ulteq
2a94b7f300 Remove unused traffic simulation code 2024-05-11 19:53:11 +02:00
quaelnix
8270ae6638
Merge pull request #693 from quaelnix/gravel-profile
Update gravel.brf
2024-05-11 12:21:44 +02:00
quaelnix
584a2a82d6
Update gravel.brf
- Fix flaws in 'vehicle=' and 'bicycle=use_sidepath' logic
- Use more realistic drag coefficient
- Fix typo in the downhillcost logic
- Improve maxspeed penalty
- Improve noise penalty
2024-05-05 14:47:41 +02:00
afischerdev
f2c01b80d3
Merge pull request #690 from zod/docker-publish
Publish docker images
2024-04-17 20:25:48 +02:00
afischerdev
36e169ab48
Merge pull request #689 from zod/pmd-7
Update PMD to 7.0.0
2024-04-17 20:19:52 +02:00
Manuel Fuhr
47f58126e7 Add dependency on brouter-routing-app for distZip 2024-04-15 20:49:25 +02:00
Manuel Fuhr
83f001e3a1 Merge branch 'master' into pmd-7 2024-04-15 20:46:52 +02:00
afischerdev
23c8123931
Merge pull request #688 from zod/bundle-update
Update jekyll dependencies
2024-04-10 16:38:21 +02:00
Manuel Fuhr
51ebfd346b Disable image signing 2024-04-09 23:00:27 +02:00
zod
a148ba70eb Add docker publish workflow based on workflow template 2024-04-09 22:45:40 +02:00
afischerdev
f3af9d6f4b
Merge pull request #687 from afischerdev/docs
Preparing for version 1.7.4
2024-04-09 16:54:34 +02:00
Manuel Fuhr
d969ac11cb Downgrade AGP version for IntelliJ IDEA compatibility 2024-04-06 00:35:19 +02:00
Manuel Fuhr
13781fb1fc fixup! Upgrade to AGP 8.0 and Gradle 8.4 2024-04-06 00:28:57 +02:00
Manuel Fuhr
8e3c9a9512 Target Java 11 2024-04-06 00:28:13 +02:00
Manuel Fuhr
258a0c107d Remove leftover maven file 2024-04-06 00:23:09 +02:00
Manuel Fuhr
5d4065d141 Use conventions instead of cross-project configuration
The Gradle user guide suggests avoiding allprojects/subprojects and using conventions instead

https://docs.gradle.org/current/userguide/sharing_build_logic_between_subprojects.html#sec:convention_plugins_vs_cross_configuration
2024-04-06 00:23:09 +02:00
Manuel Fuhr
b8929ab414 Global build ignore 2024-04-05 23:09:42 +02:00
afischerdev
616266084c changed version numbers 2024-04-05 13:29:52 +02:00
Manuel Fuhr
6c22d7d012 Update workflows to java 17 2024-04-03 22:42:30 +02:00
Manuel Fuhr
c73a8cebb8 Enable PMD rule UnnecessaryBoxing and fix violations 2024-04-03 22:42:30 +02:00
Manuel Fuhr
dd896347a2 Fix newly detected violations from PMD 7 2024-04-03 22:42:30 +02:00
Manuel Fuhr
2f7ce42480 Upgrade to PMD 7.0.0 and disable violated rules 2024-04-03 22:36:30 +02:00
Manuel Fuhr
1573aa52e0 Upgrade to AGP 8.0 and Gradle 8.4 2024-04-03 22:36:30 +02:00
Manuel Fuhr
e2752c78bb Remove AndroidManifests for pure java libraries
The AGP 8.0 upgrade assistant transforms those into build.gradle instructions, which aren't supported for pure Java libraries
2024-04-03 22:36:30 +02:00
Manuel Fuhr
a4388ce5c9 bundle update ruby dependencies 2024-04-03 21:14:17 +02:00
afischerdev
eebf22de84 updated for next version 2024-04-03 14:38:57 +02:00
afischerdev
6330325d04
Merge pull request #644 from afischerdev/srtm-gen
Elevation raster generation (*.bef files)
2024-04-03 14:29:34 +02:00
afischerdev
7b176b4b6f
Merge pull request #678 from afischerdev/rework-voicehint
Last message in track
2024-04-03 14:24:04 +02:00
afischerdev
107a6725e6
Merge pull request #677 from afischerdev/new-apk
App: certificate fallback
2024-04-03 14:20:12 +02:00
vcoppe
7229bc4c54 experimental private profiles 2024-04-02 17:38:40 +02:00
vcoppe
385009b9e6 add private profiles 2024-04-02 17:23:20 +02:00
afischerdev
7f880a5826
Merge branch 'master' into rework-voicehint 2024-03-08 12:44:22 +01:00
afischerdev
aa393ab7dd
Merge pull request #679 from abrensch/revert-674-rerouting
Revert "App: Rerouting"
2024-03-08 12:24:03 +01:00
afischerdev
b9b629185a
Revert "App: Rerouting" 2024-03-08 12:16:13 +01:00
afischerdev
28e6523ab9 wrong last message in track #657 2024-03-06 19:57:12 +01:00
afischerdev
e75adfb555 enabled raw track for test only 2024-03-06 19:54:48 +01:00
afischerdev
1a0b38d375
Merge pull request #674 from afischerdev/rerouting
App: Rerouting
2024-03-03 18:23:13 +01:00
afischerdev
09248679db
Merge branch 'master' into rerouting 2024-03-03 18:20:44 +01:00
afischerdev
531f913c5b
Merge pull request #665 from afischerdev/rework-voicehint
Rework on voicehints
2024-03-03 18:17:27 +01:00
afischerdev
dcc9719ba6 added create elev image with diff colors 2024-03-03 17:52:23 +01:00
afischerdev
6e858b6c91 added app certificate fallback 2024-03-02 15:41:32 +01:00
afischerdev
526bb53b70 added create elev image 2024-03-02 10:44:38 +01:00
afischerdev
86e62e1163
Update gradle-publish.yml action version
Update actions from v3 to v4
2024-03-01 12:01:27 +01:00
afischerdev
260e960baf
Update gradle.yml switch actions to v4
git actions do not generate complete zip file with v3
2024-02-26 18:51:31 +01:00
afischerdev
e94b80e579
Merge pull request #669 from afischerdev/server
Enabled custom profiles on server
2024-02-20 15:31:34 +01:00
afischerdev
b2009cf7e8 more on roundabout #664 2024-02-20 12:00:59 +01:00
afischerdev
47ee77bc35
Merge pull request #662 from afischerdev/docs
Updated service, osmand doc, added docker doc
2024-02-19 16:51:34 +01:00
afischerdev
97e9a824be
Merge pull request #672 from devemux86/rawtrack
Error reading rawTrack: do not throw exception, fix #671
2024-02-19 16:36:39 +01:00
afischerdev
f1e5732dc2 removed throw exception #671 2024-02-17 10:45:42 +01:00
Emux
6427dab483
Error reading rawTrack: do not throw exception, fix #671 2024-02-12 15:18:55 +02:00
afischerdev
503ceb625b added docker doc with command samples 2024-02-09 12:34:56 +01:00
afischerdev
3baf7a08fa added doc new osmand function 2024-02-09 12:34:20 +01:00
afischerdev
4ebd378800 Revert "added docker doc with command samples"
This reverts commit e289571471.
2024-02-09 12:31:49 +01:00
afischerdev
e289571471 added docker doc with command samples 2024-02-09 12:25:25 +01:00
afischerdev
9c5b380105 enabled custom profiles on server 2024-02-06 17:02:31 +01:00
quaelnix
6a0f69d546
Fix typo in profile_developers_guide.md 2024-02-05 11:00:57 +01:00
afischerdev
91ccb858dd undo tmp gpx name 2024-01-30 18:31:34 +01:00
afischerdev
b3002a78e3 rework for vh on roundabouts #664 2024-01-30 18:12:51 +01:00
afischerdev
26879159da updated service doc 2024-01-21 16:48:14 +01:00
afischerdev
1bf367b43e allow multiple segments for rerouting 2024-01-19 16:40:56 +01:00
afischerdev
e5ecd14ce1 prepare rerouting 2024-01-19 16:37:29 +01:00
afischerdev
ae7411d4a0 added nogo for heading calc 2024-01-19 10:16:42 +01:00
afischerdev
bf07e2e6d2 prepared CLI for raw testing 2024-01-18 18:39:13 +01:00
simdens
2f1422352e
Add "DIVIDE" command and new "maxslope" and "maxslopecost" parameters (#642)
* Added 'DIV' expression for profiles

* Added 'uphillmaxbuffercost' and 'downhillmaxbuffercost' parameter. This makes it possible to penalize very steep path sections

* Added 'div by zero' check in BExpression.java DIV command

* Simplify maxbuffercostdiv logic

* Added documentation about new features

* Fix typo

* Rename new DIV command

* Redesign the new commands
- Allow setting both the maxslope and the maxslopecost in the way context, separately for uphill and downhill
- New names for the new commands that better reflect what they actually do

* Adapt the profile developers guide to the latest changes

* Improve wording

---------

Co-authored-by: quaelnix <122357328+quaelnix@users.noreply.github.com>
2024-01-17 16:34:52 +01:00
afischerdev
d2e183c625
Merge pull request #647 from joe-akeem/master
A Dockerfile
2024-01-17 16:27:18 +01:00
Joachim Lengacher
226f677b26 Provide default profiles including variants. 2024-01-16 17:36:35 +01:00
Joachim Lengacher
58dc4afa1e
Merge branch 'abrensch:master' into master 2024-01-16 09:44:32 +01:00
afischerdev
56ba451888
Merge pull request #651 from devemux86/lint
Fix compile and Lint issues
2024-01-15 18:21:23 +01:00
afischerdev
a3ae1ce5e7
Merge pull request #649 from devemux86/targetSdk34
Android targetSdk 34
2024-01-15 18:18:08 +01:00
afischerdev
0a3aa3a84c
Merge pull request #660 from waldyrious/typo-sees
Fix 'sees' typo in consider_river switch
2024-01-15 18:15:41 +01:00
afischerdev
5979df131b
Merge pull request #650 from afischerdev/app-trans
App: Added more strings in resources
2024-01-15 18:12:11 +01:00
afischerdev
6b659def02
Merge pull request #653 from afischerdev/engine-mode
App: some error handling
2024-01-15 18:08:26 +01:00
Waldir Pimenta
ae951d9aa5 Fix 'sees' typo in consider_river switch
Use wording proposed by @gnbl on https://github.com/nrenner/brouter-web/issues/747
2024-01-12 23:05:48 +00:00
afischerdev
152833386b new lang Korean translations 2023-12-23 14:37:37 +01:00
afischerdev
e73d0e8001 change for voicehint list second step 2023-12-22 18:25:09 +01:00
afischerdev
7ffee3a911
Merge pull request #655 from devemux86/drawrect
Fix Canvas.drawRect on older Android

Great, thank you. I have seen that before but was focused on other issues.
2023-12-22 18:23:35 +01:00
afischerdev
01ac57a929 change for voicehint list first step 2023-12-22 18:09:44 +01:00
afischerdev
c5f158ec43 calls with transport mode 2023-12-22 18:05:07 +01:00
afischerdev
c31c38a5d6 added transport mode param, switch morestraight logic 2023-12-22 17:56:48 +01:00
afischerdev
0f8bdfee6c switch gpx name to point name - temp only 2023-12-22 17:52:15 +01:00
afischerdev
f405b0e16e switch to numeric transport mode 2023-12-22 17:51:04 +01:00
Emux
29673062b5
Fix Canvas.drawRect on older Android 2023-12-21 20:32:44 +02:00
afischerdev
12309f298c update for some translations 2023-12-19 17:25:02 +01:00
afischerdev
ec3461d8a2 changed handle back pressed logic 2023-12-15 14:25:03 +01:00
afischerdev
9ef31e6d2c remove profile param when handled in app 2023-12-15 14:10:20 +01:00
afischerdev
158dc5e54d do only compress gpx/json 2023-12-15 14:01:13 +01:00
afischerdev
77014ffddb remove unused text 2023-12-14 19:59:12 +01:00
afischerdev
a70c95c576 added translation for new messages and fixes 2023-12-14 17:50:14 +01:00
Emux
de70dec44a
Fix some Lint issues 2023-12-14 18:46:25 +02:00
afischerdev
e918700bca added new message and rename msg 2023-12-14 12:01:29 +01:00
afischerdev
f37c77267a added translations and new messages 2023-12-14 11:47:35 +01:00
afischerdev
a21dee5923 string name misspelling 2023-12-11 14:02:45 +01:00
afischerdev
1e9783819b added str format for resource strings 2023-12-11 10:52:22 +01:00
afischerdev
4c310bf4b4
Merge branch 'abrensch:master' into rework-voicehint 2023-12-10 14:19:49 +01:00
afischerdev
bcf6a7f630
Merge pull request #646 from afischerdev/engine-mode
Update for new output logic
2023-12-10 14:13:07 +01:00
afischerdev
30d4fbb0b0 xml error correction 2023-12-10 13:58:46 +01:00
afischerdev
28ca2b7217 move new resources to all xml 2023-12-10 13:24:31 +01:00
afischerdev
3acbd96575 more new resources from hardcoded messages 2023-12-10 13:19:54 +01:00
afischerdev
b9cd75b7a0 more new resources from hardcoded string 2023-12-10 12:26:20 +01:00
afischerdev
2669ae9558 added new resources from hardcoded string 2023-12-10 12:16:03 +01:00
afischerdev
7657985c1a
Merge pull request #645 from devemux86/translations
BRouter translations
2023-12-09 15:41:02 +01:00
afischerdev
03bbbcfc0c enabled compressed json output for app 2023-12-09 15:28:17 +01:00
Emux
a3d561e144
Android targetSdk 34 2023-12-09 14:00:50 +02:00
afischerdev
068a5ff714 use elevation type for filter value 2023-11-28 15:17:51 +01:00
afischerdev
16d019c1d0 read elevation type from rd5 2023-11-28 15:14:13 +01:00
afischerdev
477c675d46 prepare elevation type in rd5 2023-11-28 15:04:14 +01:00
Joachim Lengacher
0e6cd542cf Added description for Docker usage. 2023-11-23 16:24:27 +01:00
Joachim Lengacher
15b5e42533
Merge branch 'abrensch:master' into master 2023-11-22 17:32:15 +01:00
Emux
c52bd86dd5
Translations: Arabic, Catalan, Dutch, French, German, Greek, Italian, Polish, Spanish 2023-11-21 19:43:22 +02:00
afischerdev
cb0b1d8855 removed unused formatting 2023-11-21 14:05:19 +01:00
afischerdev
8a7fa9fa81 changes for engineMode 2 2023-11-21 13:26:05 +01:00
afischerdev
fcbaf598aa moved some format routines 2023-11-21 13:00:39 +01:00
Joachim Lengacher
ac0d3ae518 new Dockerfile 2023-11-21 12:26:15 +01:00
afischerdev
24e15b5466 updated output for app 2023-11-20 18:12:21 +01:00
afischerdev
149b83056e updated output for server 2023-11-20 17:39:29 +01:00
afischerdev
ad3db9c004 updated output for command line 2023-11-20 17:33:09 +01:00
afischerdev
c47444e0f8 enabled functions for public 2023-11-20 17:09:14 +01:00
afischerdev
56dbd52065 added new format classes 2023-11-20 17:08:20 +01:00
afischerdev
c6473055f4
Merge pull request #634 from afischerdev/engine-mode
Update new parameter collector for BRouter app
2023-11-17 15:41:00 +01:00
afischerdev
cae367025f added an elevation raster generation part 2023-11-06 17:48:10 +01:00
afischerdev
fbad694746 removed unused files 2023-11-06 12:55:50 +01:00
afischerdev
36d692da84 set new raster calls 2023-11-06 12:50:50 +01:00
afischerdev
5198559c77 moved converter to one file 2023-11-06 12:48:42 +01:00
afischerdev
50eb62361d rename RasterCoder 2023-11-06 12:06:43 +01:00
afischerdev
2be7e0c19c rename SrmtRaster 2023-11-06 12:04:47 +01:00
afischerdev
69489c2b7e updated test 2023-10-27 16:26:21 +02:00
afischerdev
859b401c8b enable raster weighting 2023-10-27 16:25:44 +02:00
afischerdev
c7786f03ec added 1sec bef generation 2023-10-27 16:22:52 +02:00
afischerdev
254aff19b8 added 1sec srtm use 2023-10-27 16:21:58 +02:00
afischerdev
109782d362
Merge pull request #635 from afischerdev/update-cmdline2
Update cmd line BRouter
2023-10-20 12:57:58 +02:00
afischerdev
c22d64945f
Merge pull request #637 from zod/docs
Rework pseudo tags docs
2023-10-20 12:04:43 +02:00
Manuel Fuhr
94089d2b60 Explain website generation using jekyll 2023-10-18 22:45:40 +02:00
Manuel Fuhr
94a823f805 Add webrick to fix jekyll serve on ruby 3 2023-10-18 22:45:40 +02:00
Manuel Fuhr
eb43a6cf2a Clarify some wordings & format table 2023-10-18 22:45:40 +02:00
Manuel Fuhr
94b3727840 Format pseudo tags docs 2023-10-18 22:45:40 +02:00
afischerdev
890e7f9824 removed older param handling 2023-10-18 12:50:53 +02:00
afischerdev
3fae9246d6 add param collector and calls 2023-10-18 12:37:05 +02:00
afischerdev
5825047847 remove seed, added file param 2023-10-07 18:10:54 +02:00
afischerdev
c07454a8ba enable new post process on voicehints 2023-10-06 14:57:46 +02:00
afischerdev
90cc045404 moved string control to app worker 2023-10-05 12:48:06 +02:00
afischerdev
8d4012211e changed debug logging 2023-09-30 17:20:33 +02:00
afischerdev
48c8c3edd1 removed old param handling 2023-09-30 17:17:01 +02:00
afischerdev
298893352c added param collector 2023-09-30 17:15:15 +02:00
afischerdev
3acb0b1fdb updated doc entries 2023-09-30 11:13:57 +02:00
afischerdev
fe08674632 added test if nogo array exists 2023-09-30 11:04:34 +02:00
afischerdev
42f0ac6627 added app specific vars 2023-09-30 11:01:21 +02:00
Stapawe
90fbe8345a
environmental_considerations_and_pseudo_tags.md (#612)
Document the new pseudo tag logic
2023-09-24 10:08:07 +02:00
afischerdev
50306100e9
Merge pull request #630 from zod/bundle-update
`bundle update` docs dependencies
2023-09-20 12:07:24 +02:00
Manuel Fuhr
0177b8fea4 bundle update docs dependencies 2023-09-19 14:55:59 +02:00
afischerdev
5c050f7ed1
Merge pull request #619 from afischerdev/engine-mode
New BRouter command line with new parameter collector
2023-09-19 12:28:43 +02:00
afischerdev
1e061d7157
Merge pull request #621 from afischerdev/update-expressions
Replace multiple occurrences of toLowerCase() - continued
2023-09-19 12:09:27 +02:00
afischerdev
3d3617e79f
Merge pull request #626 from quaelnix/add-gravel-profile
Add gravel profile
2023-09-19 12:03:29 +02:00
quaelnix
93b13be1d4
Add gravel profile 2023-09-09 10:17:06 +02:00
afischerdev
88ec15f1d6
Merge pull request #601 from quaelnix/fix-st-area
Fix bug in exclusion of small water bodies
2023-08-30 17:08:13 +02:00
afischerdev
8f793150b0 changed wording for foot/feet 2023-08-30 17:03:45 +02:00
afischerdev
9d22709017 added to if-else-tree 2023-08-30 16:59:47 +02:00
afischerdev
093d400c5b added from/to check in test 2023-08-29 16:58:52 +02:00
afischerdev
2b17ae9255 added new character check 2023-08-29 16:55:41 +02:00
afischerdev
d26f77380d replaced windows characters 2023-08-29 16:51:59 +02:00
afischerdev
e954d2174b added check for from-to units 2023-08-29 16:49:29 +02:00
afischerdev
e697734d64 remove a last lower case 2023-08-27 17:24:54 +02:00
afischerdev
1ffd42904b update rework #616 and remove trim 2023-08-26 17:28:21 +02:00
afischerdev
6b3cfb4c91 added a parameter test 2023-08-24 10:06:00 +02:00
afischerdev
ed7f473556 added one place for parameter 2023-08-24 10:05:38 +02:00
afischerdev
5ed5259912 reworked command line start 2023-08-24 10:04:08 +02:00
afischerdev
3b650a51c2
Merge pull request #616 from moving-bits/toLowerCase
Replace multiple occurrences of toLowerCase()
2023-08-20 17:58:30 +02:00
moving-bits
5fea70d588 Replace multiple occurrences of toLowerCase() 2023-08-20 13:04:41 +02:00
afischerdev
fba51cc7b9
Merge pull request #615 from afischerdev/new-apk
Prepare version number 1.7.3
2023-08-19 16:31:36 +02:00
afischerdev
dad4ea583c change version number 2023-08-17 18:03:03 +02:00
afischerdev
790152770f
Merge pull request #605 from afischerdev/new-apk
Update the APK for multiple issues
2023-08-17 13:02:32 +02:00
quaelnix
8d123e3375
Merge pull request #606 from afischerdev/update_db
Latest scripts for database generation
2023-08-15 19:31:33 +02:00
afischerdev
b15bdf3192 update locus to new output 2023-08-15 16:35:52 +02:00
afischerdev
1600c4356e code cleanup, rework 2023-08-15 16:31:06 +02:00
afischerdev
f6c5953241 added missed idle parameter 2023-08-02 15:32:59 +02:00
afischerdev
76265e7713 latest scripts for db generation 2023-08-02 13:27:21 +02:00
afischerdev
8ae614cfa4 monochrome icon added 2023-07-31 10:34:22 +02:00
afischerdev
071ff5863f break on config change prevented 2023-07-31 10:30:40 +02:00
afischerdev
2dbb57dd4e check for NPE on config 2023-07-31 10:25:58 +02:00
afischerdev
71dfbac13c check for NPE on badWays 2023-07-31 10:24:14 +02:00
quaelnix
a610255529 Fix bug in exclusion of small water bodies 2023-07-27 18:18:42 +02:00
quaelnix
03aab82e1e
Merge pull request #599 from quaelnix/improve-mapcreation-readme
Improve mapcreation readme
2023-07-27 17:32:30 +02:00
quaelnix
3bb0693fa3
Merge pull request #598 from quaelnix/fix-fastbike-options
Fix fastbike options
2023-07-27 17:31:17 +02:00
quaelnix
838141f597 Improve mapcreation readme 2023-07-27 16:05:19 +02:00
quaelnix
5ccc6ef766
Exploit constant expression optimization in fastbike.brf 2023-07-24 17:11:14 +02:00
quaelnix
c38a9186fd
Remove unused profile options 2023-07-24 17:01:04 +02:00
quaelnix
67f923b96e
Implement 'allow_steps' and 'allow_ferries' in fastbike.brf 2023-07-24 17:00:49 +02:00
afischerdev
0b6608eddb
Update gradle-publish.yml
Removed bundle from gradle build
2023-07-20 12:58:00 +02:00
afischerdev
cabdea1e94
Merge pull request #596 from afischerdev/new-apk
Prepare new release 1.7.2
2023-07-20 12:31:59 +02:00
afischerdev
ff73608f0a
Merge pull request #593 from quaelnix/fix-fastbike-regression
Fix profile regressions
2023-07-18 14:39:16 +02:00
afischerdev
f14b8795ea change new version number and date 2023-07-18 12:53:27 +02:00
afischerdev
ab60c442de add doc for new version 2023-07-18 12:45:52 +02:00
afischerdev
3eed1e18c8 add manual start, add Android bundle 2023-07-18 12:30:58 +02:00
afischerdev
ead951d149 change java version in git actions 2023-07-18 11:01:04 +02:00
afischerdev
1109021018
Merge pull request #590 from moving-bits/minor_spelling_fixes
Fix minor spelling issues

Thank you for your contribution.
2023-07-17 16:31:32 +02:00
quaelnix
38fc780055
Remove unused profile options 2023-07-17 14:52:28 +02:00
quaelnix
188280b448
Fix regression in trekking profile
The 'avoid_path' logic, which was added in 89b71c2bfb, ignores the cycleroute logic and makes no sense.
2023-07-17 14:48:16 +02:00
afischerdev
22cf0bba68
Merge pull request #594 from rkflx/PR/only-revert-voicehint-reindexing
Revert voice hint reindexing
2023-07-17 12:49:07 +02:00
afischerdev
bd6d22101c
Merge pull request #592 from afischerdev/new-apk
Update to new beta version
2023-07-17 12:41:39 +02:00
quaelnix
9125481aed
Fix regression in fastbike profile
e66468b091 broke the logic that handled highway=path. This patch reverts the problematic change in behavior. See: https://github.com/nrenner/brouter-web/issues/756
2023-07-16 19:33:34 +02:00
vcoppe
c454b5a71d update after merging from upstream 2023-07-16 16:44:12 +02:00
vcoppe
1c4fcc8969
Merge branch 'abrensch:master' into master 2023-07-16 16:28:37 +02:00
afischerdev
3706c0cb57 update to new beta version 2023-07-16 12:52:47 +02:00
moving-bits
0d89754ecf Fix minor spelling issues 2023-07-15 16:31:34 +02:00
afischerdev
2762744a84
Update build.gradle
Changed classifier  to archiveClassifier
2023-07-12 13:01:59 +02:00
afischerdev
c825f60eb1
Update gradle-publish.yml
Update checkout and setup-java to v3
2023-07-12 12:05:34 +02:00
afischerdev
67bc763188
Merge pull request #588 from afischerdev/new-apk
Version 1.7.1
2023-07-12 11:47:57 +02:00
afischerdev
f96b83750e
Merge pull request #587 from afischerdev/app-ui-update
Add a silent mode for app start
2023-07-12 11:38:44 +02:00
Henrik Fehlauer
d98b1060d4
Explicitly map internal voice hint ids to external JSON API ids
As c9ae7c8 showed, changing internal ids without being aware of the
possible impact can easily break the external API.

While the ids could be pinned down by adding respective tests, an even
more elegant solution is to make the mapping from internal ids to
external ids explicit, similar to how it is already done for other
voice hint formats.

To underline the purpose of the mapping even more, the
respective method is renamed appropriately.

Test Plan:
  - `./gradlew test`
  - Export a complex route in BRouter-Web and check voice hints have not
  been changed.
2023-07-12 08:59:12 +00:00
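The explicit-mapping idea from the commit above can be sketched as follows. All constants and id values here are hypothetical, not BRouter's actual ones; the point is that the externally visible JSON id goes through an explicit table, so renumbering the internal constants can no longer silently change the public API:

```java
import java.util.Map;

// Illustrative sketch only: internal voice-hint constants are mapped to
// external JSON API ids through an explicit table instead of being emitted
// directly, so internal renumbering cannot break API consumers.
public class VoiceHintJsonIds {
    // internal command constants (hypothetical values)
    static final int C  = 1;  // continue
    static final int TL = 2;  // turn left
    static final int TR = 3;  // turn right

    // explicit internal -> external JSON id mapping
    static final Map<Integer, Integer> JSON_IDS = Map.of(C, 1, TL, 2, TR, 3);

    static int toJsonId(int internalId) {
        Integer external = JSON_IDS.get(internalId);
        if (external == null) {
            throw new IllegalArgumentException("unknown voice hint: " + internalId);
        }
        return external;
    }
}
```

With such a table, adding a new internal command forces the author to pick an unused external id consciously, which is exactly the failure mode the revert below addresses.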
Henrik Fehlauer
82fecf95f6
Revert voice hint indexing change in JSON API to restore compatibility
a9e8731 made voice hints available from `formatAsGeoJson()`, which is
used both in the GeoJSON HTTP API and in the JSON Java API. To indicate
a specific type of voice hint, it was chosen to include its numeric id
in the output JSON array among other data. The full list of available
ids was defined in `class VoiceHint`, e.g. `static final int C = 1;`.

Consumers of the API now depended on the mapping from id to intended
voice hint not changing, since otherwise incorrect voice hints could be
displayed. Unfortunately that API contract was broken in c9ae7c8, where
instead of assigning unused ids to new commands, the meaning of existing
ids was changed. This broke compatibility: Clients adapted to the change
did not work with the old indexing anymore, and clients not yet adapted
would break with newer BRouter releases, e.g. they would suddenly
display "Off route" for a "right u-turn".

To restore compatibility, the indexing is reverted to its old state.

This will unbreak GeoJSON/JSON API users not yet adapted to BRouter 1.7.0
or 1.7.1, e.g. BRouter-Web as well as unmaintained clients. While API
users which already patched ids would need to undo or special-case their
changes, the impact is believed to be low, as no such users are
currently known and the breakage was released only recently.

The changed meaning of `TU` in output formats (before: `u-turn-left`,
now: `u-turn-180`) has not been reverted for now, since either that
command is mapped to fallback solutions anyway (e.g. Orux, old Locus,
Gpsies), the change has already been implemented in clients (new Locus,
Cruiser) or was only planned to be implemented in the future (OsmAnd).

Fixes #584

Test Plan:
  - `./gradlew test`
  - Run BRouter with an unpatched BRouter-Web and confirm voice hint
  ids have been restored to the same ones as emitted by BRouter 1.6.3.
2023-07-11 17:57:14 +00:00
afischerdev
bcc028606a update Android doc 2023-07-10 18:19:01 +02:00
afischerdev
6084db93d3 new version 1.7.1 2023-07-10 18:18:10 +02:00
afischerdev
993a3aa859
Merge pull request #579 from afischerdev/profiles-update
Update profiles for new db tags
2023-07-10 18:05:15 +02:00
afischerdev
c02ebecf65 add doc for silent mode 2023-07-10 12:12:03 +02:00
afischerdev
0fbf6ea096 add a silent mode for app start 2023-07-10 12:00:01 +02:00
afischerdev
086503e529
Merge pull request #586 from afischerdev/app-download
Fixed some smaller app problems
2023-07-10 11:07:37 +02:00
afischerdev
565cdde223
Merge pull request #583 from afischerdev/find-points
Find matching points in areas with longer distance between way points
2023-07-10 11:01:14 +02:00
afischerdev
94e29852d0
Merge pull request #582 from afischerdev/app-ui-update
Add portrait mode to app
2023-07-10 10:56:38 +02:00
afischerdev
9a8fd48418
Merge pull request #581 from afischerdev/app-problem
Fixed error in single download
2023-07-10 10:53:38 +02:00
afischerdev
62595b2553 move migration away from ui thread 2023-07-09 18:35:10 +02:00
afischerdev
9a029af8dd protect against limited data size 2023-07-09 18:22:25 +02:00
Arndt Brenschede
58e9ec301b changed db-tag-processing to csv-file 2023-07-09 15:39:55 +02:00
afischerdev
20ee509d39 basedir sometimes not filled 2023-07-09 14:56:42 +02:00
afischerdev
6cc5ae7717 change from StatFs to File 2023-07-09 14:54:50 +02:00
Arndt Brenschede
60c99500fa changed db-tag-processing to csv-file 2023-07-09 14:52:35 +02:00
Arndt Brenschede
30f548096b changed db-tag-processing to csv-file 2023-07-09 14:48:09 +02:00
afischerdev
cc265269e6 change finish route parameters 2023-07-08 18:17:06 +02:00
afischerdev
36dcc88a85 update test lookup with estimated_*_class 2023-07-08 11:33:38 +02:00
afischerdev
5628b885e2 update consider description 2023-07-08 11:20:03 +02:00
afischerdev
b2abdd720f update order to costfactor 2023-07-08 10:59:36 +02:00
afischerdev
89b71c2bfb update trekking for db tags 2023-07-08 10:57:43 +02:00
afischerdev
488d37b070 used greater diff value 2023-07-08 09:57:50 +02:00
afischerdev
daa33e3d34 give a second chance when wpt not found 2023-07-08 09:56:40 +02:00
afischerdev
9b2a2b2b0a add portrait mode to app 2023-07-07 18:32:11 +02:00
afischerdev
fc1e2ebc35 fixed error in single download 2023-07-07 16:15:09 +02:00
afischerdev
e66468b091 update profiles for new db tags 2023-07-06 13:45:44 +02:00
abrensch
79aa07ae84
Merge pull request #573 from abrensch/constant_exp_opti
optimizing constant expressions in profile parsing
2023-07-06 13:05:13 +02:00
Arndt Brenschede
65953faec0 constant expressions: reworks keyvalue-injection, unit-test 2023-07-05 19:07:14 +02:00
Arndt Brenschede
18f2cb548f changed db-tag-processing to csv-file 2023-07-03 08:24:42 +02:00
Arndt Brenschede
c109caac2a changed db-tag-processing to csv-file 2023-07-02 10:19:23 +02:00
Arndt Brenschede
21b0431a1a changed db-tag-processing to csv-file 2023-07-02 09:46:38 +02:00
Arndt Brenschede
1a2bb197d1 issue#572: filter out on_red turn restrictions 2023-07-02 08:32:43 +02:00
afischerdev
cada37b4de
Merge pull request #570 from Totorrr/profiles-oneway-bicycle-leftright
Also consider cycleway:left:oneway & cycleway:right:oneway in onewaypenalty
2023-06-29 11:39:59 +02:00
Arndt Brenschede
de0acb77c5 optimizing constant expressions in profile parsing 2023-06-25 12:51:10 +02:00
Totorrr
df19fcf891 Also consider cycleway:left:oneway & cycleway:right:oneway in onewaypenalty 2023-06-13 00:54:17 +02:00
afischerdev
ef73d468c0
Merge pull request #562 from afischerdev/app-problem
App error in parsing int
2023-06-06 11:41:27 +02:00
afischerdev
99eba591fa prevent parse int error 2023-06-01 18:03:01 +02:00
Arndt Brenschede
8d711bf73a preliminary suppress hgt reading in pos-unifier (performance problem) 2023-05-28 21:08:20 +02:00
Arndt Brenschede
f7bce89b7c pseudo-tags from DB, here: preload and use in-memory matching 2023-05-28 19:53:35 +02:00
Arndt Brenschede
bfe1f4a6a4 pseudo-tags from DB, here: preload and use in-memory matching 2023-05-28 19:25:42 +02:00
abrensch
624edc63ee
Merge pull request #556 from afischerdev/jdbc-import
Works for me, but the performance test for planet processing is still running. Maybe I'll change to preload the database info and match against an in-memory map, but I'll merge first and make any further changes in a new PR
2023-05-28 14:05:25 +02:00
afischerdev
6158c44d82 reformat sql by hand 2023-05-24 14:12:51 +02:00
afischerdev
f702100e8a rework database routine 2023-05-24 11:46:25 +02:00
afischerdev
72195d3b4c reformat lua 2023-05-24 11:33:32 +02:00
afischerdev
7e57824d9f mark optional parameter 2023-05-23 12:01:01 +02:00
afischerdev
4fbe368f2a rename function 2023-05-23 11:57:13 +02:00
afischerdev
7ce31e3c16 remove comment, set standard out 2023-05-23 11:54:39 +02:00
afischerdev
fb5f293dc9
Merge pull request #557 from zod/elevation-tests
Use elevation data in tests
2023-05-22 14:13:28 +02:00
Manuel Fuhr
d508337d7e Fix tests with elevation data 2023-05-22 13:54:14 +02:00
Manuel Fuhr
1b45d203f0 Use elevation data in mapcreator & route tests 2023-05-22 13:39:12 +02:00
Manuel Fuhr
0831f94750 Add sparse 90m CGIAR SRTM data for dreieich.pbf 2023-05-22 13:39:12 +02:00
afischerdev
c2400a96e7 Avoid SRTM cache for HGT files 2023-05-22 13:38:40 +02:00
afischerdev
3602e4202c set exit(1) on jdbc error 2023-05-22 11:27:17 +02:00
afischerdev
781661ea12
Merge pull request #549 from afischerdev/engine-mode
Introducing engineMode for future use
2023-05-21 11:27:15 +02:00
afischerdev
64cabbe42f
Update RoutingEngine.java 2023-05-21 11:23:38 +02:00
afischerdev
fcab1a31fd
Merge branch 'master' into engine-mode 2023-05-21 11:14:00 +02:00
afischerdev
511ac2752a set new lookups version 2023-05-20 18:26:40 +02:00
afischerdev
1142c28343 add scripts and readme for db use 2023-05-20 18:17:09 +02:00
afischerdev
3ea409078e modify hgt call in bef generation 2023-05-20 18:16:01 +02:00
afischerdev
7750ba98c5 update test with jdbc call null 2023-05-20 18:13:08 +02:00
afischerdev
e362e5c6c0 add jdbc call 2023-05-20 18:12:11 +02:00
afischerdev
14b1ece960
Merge pull request #555 from zod/pbfparser
Replace XML parser with pbfparser
2023-05-20 15:37:12 +02:00
Manuel Fuhr
c0245df07b Adapt scripts & documentation 2023-05-17 20:04:56 +02:00
Manuel Fuhr
c058cc57ef Configurable map polling 2023-05-17 19:42:57 +02:00
Manuel Fuhr
93c2d676d0 Convert testfile to pbf 2023-05-17 19:42:57 +02:00
Manuel Fuhr
78f33ee479 Use pbfparser instead of XML parser in map-creator 2023-05-17 19:42:57 +02:00
afischerdev
11a9843f41
Merge pull request #553 from zod/elev-source-cleanup
Cleanup hgt reader
2023-05-17 10:48:35 +02:00
afischerdev
0388d5534f
Merge pull request #554 from zod/bundle-update
Update bundle & bundler
2023-05-16 11:58:47 +02:00
afischerdev
4e9d3d90eb reformatting entry point 2023-05-16 10:35:41 +02:00
Manuel Fuhr
2b8270f075 Update bundle & bundler 2023-05-16 08:01:31 +02:00
Manuel Fuhr
53a5f44645 Rename SRTM variables because they're not SRTM-specific 2023-05-15 21:46:19 +02:00
Manuel Fuhr
7c184a03f9 Deduplicate code 2023-05-15 21:21:57 +02:00
Manuel Fuhr
8c6be04228 Delete HgtReader 2023-05-15 21:20:20 +02:00
afischerdev
8bf0125fa3
Merge pull request #550 from zod/pmd-fixes
@zod 
Ok, I didn't check that.

More PMD rules & fixes
2023-05-15 17:19:12 +02:00
afischerdev
b21ca106dd
Merge pull request #548 from afischerdev/elev-source
Add a hgt Reader
2023-05-15 10:15:22 +02:00
afischerdev
f071df5dd5 formatting error 2023-05-14 17:07:35 +02:00
afischerdev
ec942243be docs get elevation 2023-05-14 16:51:52 +02:00
afischerdev
40b4794573 use of get elevation in app 2023-05-14 16:38:02 +02:00
afischerdev
3c5ac660bf get elevation single point 2023-05-14 16:36:52 +02:00
afischerdev
c20b2ba686 update to one routine 2023-05-10 12:56:21 +02:00
afischerdev
fa1a6b3c27 better wording 2023-05-10 11:45:56 +02:00
Manuel Fuhr
28f205c1ad Enable PMD rule PrimitiveWrapperInstantiation and fix violations 2023-05-09 23:11:14 +02:00
Manuel Fuhr
7a6d3bd9d9 Enable PMD rule UseDiamondOperator and fix violations 2023-05-09 23:11:14 +02:00
Manuel Fuhr
2e1722150c Collect rules similar to Android Studio inspections 2023-05-09 23:11:14 +02:00
afischerdev
3dffea1753 introducing engineMode for future use 2023-05-09 12:26:54 +02:00
afischerdev
0c32770cfd
Merge pull request #546 from afischerdev/app-params-update
Add an app params dialog
2023-05-08 16:30:25 +02:00
afischerdev
7b04e0bde2
Merge pull request #547 from afischerdev/roundabout
Rework on roundabout
2023-05-08 16:26:47 +02:00
afischerdev
a2c5ed68fc change hgt to ConvertLidarTile 2023-05-08 16:18:24 +02:00
afischerdev
580b7b2421 add hgt reader 2023-05-06 17:48:54 +02:00
afischerdev
43ea1ef054 rework start inside roundabout, turnInstructionRoundabouts 2023-05-04 12:28:05 +02:00
afischerdev
3976750f75 add param dialog to app 2023-05-04 11:19:07 +02:00
afischerdev
cdda6ee32c add check for apk warning 2023-05-04 10:39:53 +02:00
afischerdev
2c707c977b
Merge pull request #542 from moving-bits/blockformatting
Unify brackets for opening blocks with comments
2023-04-30 10:33:54 +02:00
afischerdev
355e893644
Merge pull request #541 from moving-bits/spellings
fix some spellings
2023-04-30 10:22:25 +02:00
moving-bits
79b1eda1ed Unify brackets for opening blocks with comments 2023-04-29 19:04:52 +02:00
moving-bits
52185f9860 fix some spellings 2023-04-29 18:38:16 +02:00
afischerdev
340227016a
Merge pull request #540 from afischerdev/app-version-update
version 1.7.0
2023-04-29 12:24:21 +02:00
afischerdev
80d9a47927 version 1.7.0 2023-04-29 12:17:54 +02:00
afischerdev
242a1d7b93
Merge pull request #538 from afischerdev/lib-update-export
Lib update export
2023-04-26 13:26:08 +02:00
afischerdev
b7422c0ca7 smaller rework on elev and energy values 2023-04-24 13:41:29 +02:00
afischerdev
8a6579ef4d error correction 2023-04-22 15:09:34 +02:00
afischerdev
1779b1d3b5 update misplace check routine for roundabouts 2023-04-22 13:17:52 +02:00
afischerdev
1df5a468b0 update docs for incoming params 2023-04-22 12:47:29 +02:00
afischerdev
d43edb311d update import of way point names 2023-04-22 12:46:35 +02:00
afischerdev
8190aaa92d update new exports 2023-04-22 12:45:36 +02:00
afischerdev
e1766792ac
Merge pull request #532 from afischerdev/app-online-check
App changes, online check
2023-04-22 10:03:16 +02:00
afischerdev
60de94f9e4
Merge pull request #537 from zod/docs-bundle-update
Update gh-pages dependencies
2023-04-22 09:59:54 +02:00
Manuel Fuhr
7657496af8 bundle update gh-pages dependencies 2023-04-20 23:03:58 +02:00
afischerdev
4529a40640
Merge pull request #535 from quaelnix/recalc-track-cleanup
Cleanup recalcTrack
2023-04-20 11:13:08 +02:00
quaelnix
4c75de08c6
Cleanup recalcTrack 2023-04-18 15:11:24 +02:00
afischerdev
4559f17d85 change url for elevation data #534 2023-04-17 17:09:59 +02:00
afischerdev
13aad459b7 change foot default brf 2023-04-17 10:04:15 +02:00
afischerdev
b4ad0c4b38 more checks for online problems 2023-04-17 10:03:32 +02:00
afischerdev
3675a2c9dd more heap to avoid OOM 2023-04-17 10:02:07 +02:00
afischerdev
9a61ddac93 clean up log 2023-04-17 10:00:40 +02:00
afischerdev
575c24c93d
Merge pull request #526 from afischerdev/app-version-check
App version check

@polyscias 
Please keep testing. We can make changes later if problems come up.
2023-04-14 09:41:02 +02:00
afischerdev
c9b3cc8457
Merge pull request #528 from afischerdev/app-version-update
Prepare update
2023-04-10 13:18:39 +02:00
afischerdev
7e2973222f
Merge pull request #502 from quaelnix/fastmath-fix
Fix inappropriate approximation of Math.exp
2023-04-10 13:15:34 +02:00
quaelnix
f5f3a7a6d6 Remove FastMath.exp
FastMath.exp was neither continuous nor strictly monotonically increasing for x < -1 and therefore inappropriate for the intended purpose.
2023-04-10 11:48:35 +02:00
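The commit above notes that FastMath.exp was neither continuous nor strictly monotonically increasing for x < -1. A toy illustration of that failure mode (this is not the removed FastMath code, just a sketch): a truncated limit-based approximation of exp with an even exponent turns around for strongly negative arguments.

```python
import math

def toy_fast_exp(x, n=2):
    """Hypothetical fast approximation exp(x) ~ (1 + x/n)**n.
    NOT the removed FastMath.exp; it only illustrates how such
    approximations can lose monotonicity for x < -n."""
    return (1 + x / n) ** n

# exp is strictly increasing, but for n=2 the approximation has a
# minimum at x = -2 and *increases* again as x goes further negative:
assert math.exp(-4) < math.exp(-3)
assert toy_fast_exp(-4) > toy_fast_exp(-3)  # monotonicity violated
```

In a routing cost model, such a turnaround means a more strongly penalized option could end up ranked as cheaper, which is why falling back to the exact Math.exp is the safe choice.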
afischerdev
fa00520f1e prepare update 2023-04-07 17:19:43 +02:00
afischerdev
4fd6788bbb rework if profile is defined in params 2023-04-04 11:47:21 +02:00
afischerdev
fa64ff9192 add app version check to downloader 2023-04-04 11:46:50 +02:00
afischerdev
2eb47300cf
Merge pull request #525 from afischerdev/app-update
Profile updates
2023-04-04 11:09:34 +02:00
afischerdev
0bc24c121c modify bike brf on roundabout #485 one more 2023-04-03 09:51:53 +02:00
afischerdev
5e217ff67b modify bike brf on roundabout #485 2023-04-02 18:40:21 +02:00
afischerdev
9dd91bf004 add mtb.brf to update queue 2023-04-02 18:38:48 +02:00
afischerdev
560358d9cb
Merge pull request #179 from bagage/master
Add MTB profile
2023-04-02 18:06:25 +02:00
afischerdev
b7842e1722
Merge pull request #471 from Totorrr/profiles-no-total-access-cycleroutes
Remove cycleroute always having access granted
2023-04-02 18:01:06 +02:00
afischerdev
5d56bb9abe
Merge pull request #523 from afischerdev/app-update
Repair #268
2023-04-02 17:57:31 +02:00
afischerdev
443b01e9fd repair #268 2023-04-02 17:52:56 +02:00
afischerdev
609f62d6d5
Merge pull request #268 from Totorrr/profiles-oneway-bicycle-leftright
Consider cycleway:left&right in onewaypenalty
2023-04-02 17:32:40 +02:00
afischerdev
7b89db71a0
Merge pull request #521 from afischerdev/app-update
App update
2023-03-31 15:17:38 +02:00
afischerdev
4d3edd0571 update interface list for server and app 2023-03-31 15:10:26 +02:00
afischerdev
9d2e4171c4 update action node.js warning 2023-03-31 15:09:57 +02:00
afischerdev
ab2780424f
Merge pull request #520 from afischerdev/doc-update
Update docs
2023-03-30 10:56:21 +02:00
afischerdev
49c146e0eb enable dist zip without apk 2023-03-30 10:17:46 +02:00
afischerdev
3653cfec59 add privacy policy 2023-03-29 18:44:06 +02:00
afischerdev
0b8a7fda39
Merge pull request #516 from afischerdev/app-update
App update: Show only updatable tiles
2023-03-28 09:40:53 +02:00
afischerdev
88977cca3a
Merge pull request #518 from afischerdev/fix-nogos
Fix wpt and nogo handling
2023-03-28 09:38:18 +02:00
afischerdev
4147405362
Merge branch 'abrensch:master' into fix-nogos 2023-03-28 09:33:04 +02:00
abrensch
dc5602b816
Merge pull request #497 from quaelnix/cost-cutoff-way-context
Allow hill cost and hill cutoff in way context
2023-03-27 09:20:35 +02:00
afischerdev
b225dd22d9 do not remove wpt in weighted nogos 2023-03-25 12:12:29 +01:00
afischerdev
22c92635b2 count only updatable tiles 2023-03-23 11:38:28 +01:00
afischerdev
2f0b5f18e1 enable user data save on delete app 2023-03-23 11:36:25 +01:00
afischerdev
86ae6d2b2b
Merge pull request #514 from afischerdev/app-update
App update: control update for serverconfig
2023-03-23 11:27:58 +01:00
afischerdev
a75ee2b5e6
Merge pull request #515 from afischerdev/misc-pbfparser
Use an interface call in pbfparser lib
2023-03-22 18:45:16 +01:00
afischerdev
b304789e43 repair call to interface, pmd conform 2023-03-22 18:25:37 +01:00
afischerdev
4c2bf8f8bf control update for serverconfig 2023-03-22 14:14:41 +01:00
afischerdev
fc22892a66
Merge pull request #511 from afischerdev/app-update
App update
2023-03-20 16:59:13 +01:00
afischerdev
ba18a93cfe
Merge pull request #513 from afischerdev/coordreader-problem
Fix Nogo waypoint use
2023-03-20 16:55:13 +01:00
afischerdev
63912941f0 rework nogo and vetos 2023-03-19 13:59:38 +01:00
afischerdev
0b3e6b19b0 add check wpt vs nogos 2023-03-19 13:58:32 +01:00
afischerdev
db180ef76c add wpt use for nogos 2023-03-16 17:06:09 +01:00
afischerdev
8a7e973bda add dependsOn to avoid warnings 2023-03-15 19:23:23 +01:00
afischerdev
1925cbecab rework deprecated showDialog 2023-03-15 19:12:33 +01:00
afischerdev
585724dbf8 suppress deprecation warnings for StatFs 2023-03-15 19:08:24 +01:00
afischerdev
1649b07faa protect new version download 2023-03-15 19:04:48 +01:00
afischerdev
594e6e3193 replace deprecated drawBitmap 2023-03-15 18:26:29 +01:00
afischerdev
8ab74b87bd replace deprecated get 2023-03-15 18:22:20 +01:00
afischerdev
c903ae7417 gradle add namespace 2023-03-15 18:12:58 +01:00
afischerdev
fa977b6cc3 new logic in installer app 2023-03-14 17:36:17 +01:00
afischerdev
9e542ab541 rework downloader new array 2023-03-14 17:35:16 +01:00
afischerdev
1d9b85c4b7 new resources for installer activity 2023-03-14 14:44:47 +01:00
afischerdev
9e772cb12e rework downloader 2023-03-14 14:43:20 +01:00
afischerdev
1f246297e2 add permission check for notification 2023-03-14 14:37:21 +01:00
afischerdev
b16b7d2362 add permission for status info on download 2023-03-14 14:29:56 +01:00
afischerdev
25be21fec9 reduce exception info 2023-03-14 14:27:57 +01:00
afischerdev
3d31b8284c reduce exception info to message only 2023-03-14 14:22:12 +01:00
afischerdev
fbc01c0ba9 add version check, smaller size version info 2023-03-14 14:17:27 +01:00
afischerdev
d315f4e33e gradle updates 2023-03-14 14:14:59 +01:00
afischerdev
0cf83456f7
Merge pull request #510 from afischerdev/lib-update-six
Update Lib Part Six - Change Export
2023-03-09 11:05:13 +01:00
afischerdev
e49e039d73 clean up gpx outputs 2023-03-06 18:39:35 +01:00
afischerdev
59199d7339 add new vh tags for output 2023-03-06 18:33:20 +01:00
afischerdev
7e581ccb9e fix elev at last pt 2023-03-06 18:29:40 +01:00
afischerdev
d85905035c change affected files for output rules 2023-02-26 13:04:43 +01:00
afischerdev
7434c12b31 change output rules 2023-02-26 13:02:39 +01:00
afischerdev
7edc35009f add direction tests 2023-02-26 12:37:24 +01:00
afischerdev
16a5ebe737 add direction on matching wpts 2023-02-26 12:36:56 +01:00
afischerdev
b735cd3e4e
Merge pull request #508 from afischerdev/update-lib-five
Update lib part five - change voicehints
2023-02-25 12:18:41 +01:00
afischerdev
b0eb0840d9 add messagedata for beeline 2023-02-20 18:32:57 +01:00
afischerdev
0e74fd5240 add small voice hint description 2023-02-19 13:49:02 +01:00
afischerdev
e905eefc6c change distance call 2023-02-19 13:48:20 +01:00
afischerdev
c9ae7c8681 change vh rules 2023-02-19 13:46:37 +01:00
afischerdev
15dd1f30f1 add rework vh process and add post process 2023-02-19 13:45:47 +01:00
afischerdev
c586245db5 add small elev rework 2023-02-19 13:43:57 +01:00
afischerdev
5571eee82c
Merge pull request #472 from Totorrr/profile-indent-typo-correc
Correct indent typo
2023-02-15 17:29:41 +01:00
afischerdev
8903939176
Merge pull request #499 from quaelnix/fix-calc-distance
Fix rounding error in calcDistance
2023-02-15 17:24:03 +01:00
afischerdev
c588daa68f
Merge pull request #500 from quaelnix/elevation-filter-regression-fix
Fix regression in elevation filter logic
2023-02-15 17:21:19 +01:00
afischerdev
ca5279d7c7
Merge pull request #506 from quaelnix/eta-regression-fix
Fix regression in travel time computation
2023-02-13 18:47:14 +01:00
afischerdev
101c72b6dc
Merge pull request #503 from quaelnix/priorityclassifier-initialization-fix
Fix priorityclassifier initialization
2023-02-13 18:42:52 +01:00
quaelnix
480977ec46
Fix regression in travel time computation
abrensch@17be365 introduced a bug that causes a negative bias in the calculated incline each time the elevation buffer is reset, which results in an additional misestimation of the travel time when via points are added.
2023-02-08 22:01:09 +01:00
quaelnix
829baba037 Allow hill cost and hill cutoff in way context
This removes the limitation that `downhillcutoff` and `uphillcutoff` as well as `downhillcost` and `uphillcost` cannot be used in the way context.
2023-01-29 17:00:54 +01:00
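With that limitation removed, the hill parameters can be assigned inside the way context of a routing profile instead of only globally. A hedged sketch of what this enables (illustrative tag and values, not taken from the PR):

```
---context:way   # following code refers to way tags

# with #497, hill cost/cutoff may be set per way, e.g. to penalize
# climbs more heavily on unpaved surfaces (values are illustrative)
assign uphillcost     = if surface=gravel then 100 else 50
assign uphillcutoff   = 1.5
assign downhillcost   = 0
assign downhillcutoff = 1.5
```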
quaelnix
f46c94083e
Fix priorityclassifier initialization 2023-01-28 10:34:01 +01:00
quaelnix
c3508c2adc
Fix regression in elevation filter logic
25e506d changed the order in which the elevation deltas are passed through the elevation filter, which can lead to the undesirable behavior that appending segments to the end of a route can decrease the calculated total ascent. This fixes the bug by adjusting the elevation filter accordingly.
2023-01-24 23:38:11 +01:00
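The order-dependence described above can be reproduced with a toy hysteresis filter (a sketch, not BRouter's actual filter): small elevation deltas are buffered, and only the overflow beyond a threshold counts as ascent, so the same deltas fed in a different order yield a different total.

```python
def filtered_ascent(deltas, threshold=10):
    """Toy stateful elevation filter: buffer the deltas and count only
    the overflow above +threshold as ascent (sketch, not BRouter code)."""
    buf = 0.0
    ascent = 0.0
    for d in deltas:
        buf += d
        if buf > threshold:
            ascent += buf - threshold  # only the overflow counts
            buf = threshold
        elif buf < -threshold:
            buf = -threshold
    return ascent

# same multiset of deltas, different order, different total ascent:
print(filtered_ascent([15, -15, 15]))  # 5.0
print(filtered_ascent([15, 15, -15]))  # 20.0
```

This is why changing the order in which deltas pass through such a filter can make appending a segment *decrease* the computed total ascent, and why the fix adjusts the filter rather than the delta order.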
quaelnix
4495952625 Fix rounding error in calcDistance 2023-01-20 10:22:01 +01:00
afischerdev
2387513a1f
Merge pull request #498 from afischerdev/update-version
Update lib part four - recalculation at end of track
2023-01-20 10:04:04 +01:00
afischerdev
23d55aa40c customize tests 2023-01-16 11:24:21 +01:00
afischerdev
3d34340e14 recalc elevation at end 2023-01-16 11:18:12 +01:00
afischerdev
32b258c188 recalc track rules, reorg detours 2023-01-16 10:37:19 +01:00
afischerdev
25e506dcbe recalc track at end 2023-01-15 18:05:18 +01:00
afischerdev
4867368296 move process voice hint, speed profile to the end 2023-01-15 17:43:51 +01:00
afischerdev
2b9a9d5bdd
Merge pull request #494 from afischerdev/update-version
Update lib part three - ignore misplaced via points
2023-01-15 16:44:56 +01:00
afischerdev
d081e5eb18
Merge pull request #495 from quaelnix/compute-kinematic-regression-fix
Fix regression of travel time calculation
2023-01-11 17:05:56 +01:00
quaelnix
a49c43d1ef
Refactor computeKinematic 2023-01-10 15:34:37 +01:00
quaelnix
1e819cf5bd
Fix regression of travel time calculation
This fixes a regression in the travel time calculation in the kinematic model caused by combining the following two commits:
* bd025875d4 (diff-59799a4a78f59692f35951f94cd8733f7e34718c2d60a6248685159f637517a4)
* 57da34d205 (diff-59799a4a78f59692f35951f94cd8733f7e34718c2d60a6248685159f637517a4)
2023-01-10 15:32:17 +01:00
afischerdev
b75a6cdab1 test add for misplaced pts 2023-01-09 19:00:02 +01:00
afischerdev
c03f21b72f test adjust 2023-01-09 18:35:35 +01:00
afischerdev
b98a576fe3 add routines for misplaced pts 2023-01-09 18:25:16 +01:00
afischerdev
d2297c0c52 add vars for misplaced pts 2023-01-09 18:13:03 +01:00
afischerdev
06328dec3d
Merge pull request #488 from afischerdev/update-version
I haven't heard from @zod for a while, so I'll do it myself.
Update lib part two - direct routing
2023-01-08 15:53:48 +01:00
afischerdev
d39501bb04 add 3 direct routing samples 2022-12-05 11:53:23 +01:00
afischerdev
d67b3c0ec9 change use of direct routing 2022-12-05 11:50:52 +01:00
afischerdev
3e53659f18 add voicehint test 2022-11-24 20:13:36 +01:00
afischerdev
8fa27bcf6e remove compiler warnings 2022-11-23 16:28:43 +01:00
afischerdev
a764c788ba
Merge pull request #483 from zod/code-analysis
Static code analysis using PMD

I haven't tested the results, but to me the diffs look good and useful.
Time will tell whether we need all these rules, should drop some, or add more.
2022-11-23 11:20:54 +01:00
Manuel Fuhr
49295eb850 Enable PMD rule LogicInversion and fix violations 2022-11-14 22:06:18 +01:00
Manuel Fuhr
41d25cd523 Enable PMD rule SimplifyBooleanReturns and fix violations 2022-11-14 22:06:18 +01:00
Manuel Fuhr
5f942cc458 Enable PMD rule UnnecessaryModifier and fix violations 2022-11-14 22:06:18 +01:00
Manuel Fuhr
b68f1587b2 Enable PMD rule UnnecessaryFullyQualifiedName and fix violations 2022-11-14 22:06:18 +01:00
Manuel Fuhr
b1a88b01ab Enable PMD rule SimplifiableTestAssertion and fix violations 2022-11-14 22:06:18 +01:00
Manuel Fuhr
09a9c1a104 Enable PMD rule SimplifiedTernary and fix violations 2022-11-14 22:06:18 +01:00
Manuel Fuhr
a07fc132d2 Enable PMD rule SingularField and fix violations 2022-11-14 22:06:18 +01:00
Manuel Fuhr
3b77f93c00 Enable PMD rule UnnecessaryReturn and fix violations 2022-11-14 22:06:18 +01:00
Manuel Fuhr
1bff48b649 Enable PMD rule AvoidInstanceofChecksInCatchClause and fix violations 2022-11-14 22:06:18 +01:00
Manuel Fuhr
30be64cbbe Enable PMD rule LooseCoupling and fix violations 2022-11-14 22:06:18 +01:00
Manuel Fuhr
c75a8cb703 Enable PMD rule UnnecessaryImport and fix violations 2022-11-14 22:06:18 +01:00
Manuel Fuhr
9d0703f898 Add PMD and enable quickstart ruleset
PMD checks code for rule violations, and its quickstart ruleset provides the
"rules that are most likely to apply everywhere". Rules that are currently
violated are disabled to get started.
2022-11-14 22:03:47 +01:00
Manuel Fuhr
91459dbeb3 Fix codestyle violations from #481 2022-11-14 22:00:41 +01:00
zod
fce160e89d
Merge pull request #481 from afischerdev/update-test
Add test profiles with lookups.dat
2022-11-14 21:48:23 +01:00
zod
0566f8154c
Merge pull request #478 from afischerdev/update-version
Extend android service interface and http API
2022-11-14 21:47:58 +01:00
afischerdev
cebcd566c6 reformat RouteServer again 2022-11-13 16:27:37 +01:00
afischerdev
5c970ed71f reformat RouteServer 2022-11-13 16:14:56 +01:00
afischerdev
f6afafb46c reformat BRouterWorker 2022-11-13 16:08:16 +01:00
afischerdev
f6d4eee763 update description for direction 2022-11-13 15:55:38 +01:00
afischerdev
78cd395ca1
Merge pull request #474 from zod/test-routeserver
Tests for RouteServer
2022-11-10 13:07:10 +01:00
afischerdev
31e10c5475
Merge pull request #482 from zod/checkstyle-android
Enable checkstyle for brouter-routing-app
2022-11-10 10:56:35 +01:00
afischerdev
70be61a81e made it pretty 2022-11-10 10:53:53 +01:00
Manuel Fuhr
556555b8ae Test profile upload 2022-11-09 18:35:49 +01:00
Manuel Fuhr
06f0315cf4 Remove broken spam/referrer handling 2022-11-09 18:35:49 +01:00
Manuel Fuhr
3a4d743eb5 Increase startup timeout 2022-11-09 18:35:49 +01:00
Manuel Fuhr
fed171fd06 Tests for RouteServer 2022-11-09 18:35:44 +01:00
Manuel Fuhr
15df3d392d Add testcase for param override 2022-11-09 18:35:44 +01:00
afischerdev
da878abb90 add cmd line test profiles vs lookup 2022-11-09 10:06:37 +01:00
afischerdev
717f8c03d4 add cmd line test profiles vs lookup 2022-11-09 10:00:58 +01:00
Manuel Fuhr
6f6d99b3e3 Stricter whitespace handling 2022-11-09 07:29:53 +01:00
Manuel Fuhr
227596eb90 Fix checkstyle errors in android app 2022-11-09 07:29:32 +01:00
Manuel Fuhr
78ce7b659b Enable checkstyle for android
The checkstyle gradle plugin works only for java libraries. Android
requires additional configuration.
2022-11-09 07:29:32 +01:00
afischerdev
fb7c73bb0f add test profiles with lookups.dat 2022-11-08 18:09:03 +01:00
vcoppe
81b27b3199 fallback to reset_segments if failed from diff 2022-11-02 15:04:50 +01:00
afischerdev
54a7ad6b9d update lib part one 2022-11-02 09:37:12 +01:00
zod
eba0b18689
Merge pull request #477 from zod/475-docs-nogo-naming
Link from vianogo to android docs
2022-10-29 20:48:53 +02:00
Manuel Fuhr
5c12140c55 Link from vianogo to android docs 2022-10-29 20:47:19 +02:00
afischerdev
00a2183153
Merge pull request #473 from zod/move-testcases
Move testcases from brouter-server
2022-10-23 12:40:38 +02:00
zod
fd10303d0f
Merge pull request #469 from zod/bundle-update
Update jekyll/gh-pages dependencies
2022-10-23 11:40:09 +02:00
Manuel Fuhr
fd158dc0ce Cleanup testcases (split, dependencies, assertions) 2022-10-23 10:51:02 +02:00
Manuel Fuhr
3787076839 Move testcases from brouter-server to their modules 2022-10-23 10:22:47 +02:00
afischerdev
93e8a81005
Merge pull request #305 from multimodaal/patch-1
Update river profile
2022-10-22 10:58:10 +02:00
Totorrr
8d2a2db770
Correct indent typo
Minor one-character change to improve profile code readability.
2022-10-22 00:58:37 +02:00
Totorrr
e2c6811761
Remove cycleroute always having access granted
Some bicycle routes can have portions which are not accessible.

Potential reasons for this:
 - temporary construction work on the ways (sometimes for months),
 - local access change without simultaneous update of bicycle route relation
   (lack of mapper's knowledge or lack of time to redefine an updated route),  
 - ... 

This commit changes the fastbike profile so that being on a cycle route does 
not automatically mean "access granted" for bicycles. The normal access 
checks are used instead.

However, the accesspenalty is set to 15 if the no-access way belongs to such a 
route, so that the way is not completely forbidden when no other 
alternative is available.
2022-10-22 00:51:23 +02:00
Totorrr
35f4e3312e
Remove cycleroute always having access granted
Some bicycle routes can have portions which are not accessible.

Potential reasons for this:
 - temporary construction work on the ways (sometimes for months),
 - local access change without simultaneous update of bicycle route relation
   (lack of mapper's knowledge or lack of time to redefine an updated route),  
 - ... 

This commit changes the trekking profile so that being on a cycle route does 
not automatically mean "access granted" for bicycles. The normal access 
checks are used instead.

However, the accesspenalty is set to 15 if the no-access way belongs to such a 
route, so that the way is not completely forbidden when no other 
alternative is available.
2022-10-22 00:19:22 +02:00
Manuel Fuhr
94a91c3baf Update jekyll/gh-pages dependencies
Dependabot noticed another dependency with security issues (#461), but
let's just update all dependencies.
2022-10-21 07:04:40 +02:00
afischerdev
13a41bd9f8
Enable use of direction to app (#464)
* make app equal to server #314
2022-10-19 21:12:38 +02:00
zod
51dc7fb1fb
Merge pull request #467 from afischerdev/update-test
Update osm test data
2022-10-18 07:17:02 +02:00
afischerdev
27b609f050 update osm test data 2022-10-16 17:45:40 +02:00
afischerdev
9662e50a43
Merge pull request #449 from zod/reformat-codebase
Reformat codebase
2022-10-03 17:49:04 +02:00
vcoppe
14bcd6e4b4 remove empty diff files 2022-09-17 11:24:39 +02:00
vcoppe
872c88b268 add script to reset all segment files 2022-08-15 16:07:41 +02:00
afischerdev
70c0e7ca3e
Merge pull request #455 from zod/gh-pages-update
Update GitHub pages gem
2022-08-03 18:39:08 +02:00
afischerdev
993c194621
Merge pull request #454 from zod/remove-coordinate-readers
Update docs after coordinate reader removal
2022-08-03 18:38:42 +02:00
Manuel Fuhr
33e3183259 Update GitHub pages gem
Dependabot proposed updating tzinfo (#451) but it's better to just update
our GitHub pages gem which depends on a more recent tzinfo version.
2022-08-03 17:13:21 +02:00
afischerdev
04f8f925dd update docs 2022-08-03 17:02:23 +02:00
Manuel Fuhr
f8d6aa7906 Fix checkstyle errors 2022-07-25 06:15:47 +02:00
Manuel Fuhr
4e0dcbd0bf Adapt checkstyle rules and disable suppressions 2022-07-25 06:15:47 +02:00
Manuel Fuhr
adc14df33a Ignore reformatting in git blame 2022-07-25 06:15:41 +02:00
Manuel Fuhr
c15913c1ab Reformat whole codebase using Android Studio 2022-07-25 06:14:46 +02:00
Arndt Brenschede
d5322667d5 Suspect-Manager: allow 999 days for resubmission, track neighbour resubmissions 2022-07-17 11:38:28 +02:00
afischerdev
08161b47fd
Merge pull request #446 from zod/checkstyle
Automatically check codestyle using checkstyle

Great idea to do it this way.
2022-07-10 18:31:31 +02:00
afischerdev
2dd8682f13
Merge pull request #445 from zod/remove-coordinate-readers
Update APK path
2022-07-10 18:24:17 +02:00
Manuel Fuhr
f18cded3a7 Enable checkstyle checks
Basic indentation / formatting checks, but suppressed for most modules
because they currently fail the checks.
2022-07-09 07:52:40 +02:00
Manuel Fuhr
c80e34fdc5 Add checkstyle 2022-07-09 07:38:04 +02:00
afischerdev
74d6683ac8 update server gradle for copy apk 2022-07-07 16:17:49 +02:00
afischerdev
9d37e2c648
Merge pull request #439 from zod/remove-coordinate-readers
Remove coordinate readers

Sorry for the delay, I didn't notice that it was marked ready.
2022-06-26 15:25:20 +02:00
Manuel Fuhr
08df1d9909 Finish after sharing track 2022-06-08 06:52:50 +02:00
Manuel Fuhr
f42f10ba4b Remove help dialog for api10 2022-06-08 06:49:34 +02:00
Manuel Fuhr
669ea28d1b Share track via intent 2022-05-30 22:39:11 +02:00
Manuel Fuhr
bdecc2e1b9 Provide only single targetSdkVersion build
Remove api19 build and default to api30 because we no longer access
files of other apps.
2022-05-30 22:39:11 +02:00
Manuel Fuhr
a767ed6dbb Sort profiles 2022-05-30 22:39:11 +02:00
Manuel Fuhr
3cfa1d954b Test CoordinateReader 2022-05-30 22:39:11 +02:00
Manuel Fuhr
56b7c108e4 Remove redirect handling 2022-05-30 22:39:11 +02:00
Manuel Fuhr
9d93d25a84 Remove abstract CoordinateReader 2022-05-30 22:39:11 +02:00
Manuel Fuhr
01ad4dc09a Remove app specific coordinate readers 2022-05-30 22:39:11 +02:00
Manuel Fuhr
924a33ccb5 Remove wakelock 2022-05-30 22:39:11 +02:00
Manuel Fuhr
fc524fb4db Remove WpDatabaseScanner 2022-05-30 22:39:11 +02:00
Manuel Fuhr
b5895e1fd8 Split functions 2022-05-30 22:39:11 +02:00
Manuel Fuhr
fc9deccad7 Fix some AndroidStudio warnings 2022-05-30 22:39:11 +02:00
Manuel Fuhr
1d2809de70 Silent code cleanup 2022-05-30 22:39:11 +02:00
Manuel Fuhr
11f104afe9 Rearrange BRouterActivity/BRouterView 2022-05-30 22:39:11 +02:00
afischerdev
c67374a268
Merge pull request #437 from zod/fixes
Minor fixes to DownloadWorker and setup
2022-05-30 13:52:37 +02:00
Manuel Fuhr
eae79d3425 Report errors from DownloadWorker 2022-05-28 22:41:13 +02:00
Manuel Fuhr
9445361f28 Fix unpacking readmes.zip which contains directories 2022-05-24 22:53:28 +02:00
zod
0adc618c13
Merge pull request #433 from afischerdev/update-version
Repair BRouterWorker
2022-05-23 17:54:46 +02:00
zod
3a81a45b97
Merge pull request #435 from zod/docs
Update github-pages gems and deployment
2022-05-22 22:26:27 +02:00
Manuel Fuhr
a5e9d29ba1 Update lockfile using "bundle update"
This should also fix PR #434
2022-05-22 22:23:24 +02:00
Manuel Fuhr
903308e870 Update jekyll config to abrensch repository 2022-05-22 22:03:32 +02:00
afischerdev
13b8b1d3a1 changed aidl description 2022-05-22 14:01:11 +02:00
afischerdev
1765ec4523
Merge pull request #431 from zod/docs
Update documentation and transform to markdown
2022-05-22 13:28:23 +02:00
afischerdev
244596bdab removed test code 2022-05-22 13:11:31 +02:00
Manuel Fuhr
89bf71376f Add latest release 2022-05-17 06:45:13 +02:00
Manuel Fuhr
a75570a027 Merge branch 'master' into docs 2022-05-17 06:40:31 +02:00
Manuel Fuhr
8c15dc54b2 Autogenerate readmes.zip from docs 2022-05-17 06:25:16 +02:00
Manuel Fuhr
6c5b1ddec6 Reference docs and remove readmes 2022-05-17 06:24:04 +02:00
Manuel Fuhr
76feb56cdf Fix broken links 2022-05-17 05:52:01 +02:00
zod
ced79f61ef
Merge pull request #426 from afischerdev/update-version
Change config for hiking-mountain
2022-04-25 07:56:45 +02:00
afischerdev
aaa8862e12 change config for hiking-mountain 2022-04-24 17:34:06 +02:00
afischerdev
7a6f54e24d
Merge pull request #424 from zod/update-hiking-profile
Update hiking profile
2022-04-24 16:59:52 +02:00
Manuel Fuhr
4b6e34ca55 Update profile to Hiking-Mountain-SAC3 v1.8.7 2022-04-24 14:09:04 +02:00
Manuel Fuhr
56c9030406 Rename hiking profile 2022-04-24 14:09:04 +02:00
afischerdev
4794b28774
Merge pull request #413 from zod/download-worker
Use WorkManager for Downloads
2022-04-23 11:00:49 +02:00
Manuel Fuhr
4edc1b3c11 Move some information from notification to log 2022-04-19 17:38:34 +02:00
afischerdev
f858064238
Merge pull request #422 from polyscias/master
Update locus xml generation to have only one extension section
2022-04-18 14:54:24 +02:00
polyscias
62444f8387 Update locus html generation to have only one extension section 2022-04-18 12:23:21 +02:00
vcoppe
e4a0cac8b6 update server options 2022-04-14 16:32:33 +02:00
vcoppe
c562d58ca9 update server options 2022-04-14 16:29:07 +02:00
vcoppe
6794ef320e use bash for scripts 2022-04-14 13:50:52 +02:00
vcoppe
72671bd448 add routing profiles 2022-04-13 12:48:55 +02:00
Manuel Fuhr
cc9732ea91 Remove System.exit(0) which causes app-restart 2022-04-05 21:55:28 +02:00
Manuel Fuhr
02eddeff81 Throw Exception in checkFileIntegrity on failure
DownloadWorker didn't check the string return value which should detect
failed downloads. Throwing (checked) exceptions simplifies error
handling in DownloadWorker.
2022-04-04 18:02:42 +02:00
Manuel Fuhr
f0df9f94d4 Cleanup 2022-04-03 17:53:23 +02:00
Manuel Fuhr
952ea803b2 Use LinarProgessIndicator instead of sub-view 2022-04-03 17:53:21 +02:00
Manuel Fuhr
3a2c109ded Remove DownloadService 2022-04-03 17:53:05 +02:00
Manuel Fuhr
ecc4def40c Use WorkManager for downloads 2022-04-03 17:53:03 +02:00
Manuel Fuhr
21abce0139 Hacky way to disable reporting for small files 2022-04-03 17:51:01 +02:00
Manuel Fuhr
d74d0af687 Rewrite DownloadService
Split code into smaller pieces and remove duplication which already
caused confusion (492d79d42e changed wrong
download speed limit)
2022-04-03 17:51:01 +02:00
Manuel Fuhr
a091b07cb6 Reformat DownloadService 2022-04-03 17:51:01 +02:00
Manuel Fuhr
0a8d4dd1f2 Rd5Diff: Specify IOException instead of generic Exception 2022-04-03 17:51:01 +02:00
Manuel Fuhr
13f5ad0bcf Small cleanup of DownloadService 2022-04-03 17:51:01 +02:00
Manuel Fuhr
db42ae9f33 Always show main dialog (with Download Manager)
It can be confusing when the dialog is shown only sometimes and there is
no indication why the dialog isn't shown. The connection status can also
change after the start of the download manager so it has to handle those
errors anyway.

Closes #389
2022-04-03 17:51:01 +02:00
Manuel Fuhr
c80ad5f03b Update sdk and dependencies 2022-04-03 17:50:13 +02:00
Manuel Fuhr
d92c3beb3e Switch activities to AppCompat and adapt themes
No longer uses fullscreen, statusbar shall be visible to check connection status

Closes #57
2022-04-03 15:56:49 +02:00
Manuel Fuhr
cde4606760 Reformat and fix warnings in AndroidManifest 2022-04-02 18:35:42 +02:00
afischerdev
7fef58a7b6
Merge pull request #410 from zod/409-railway-metro
Use more railway lines for routing
2022-03-30 17:34:50 +02:00
Manuel Fuhr
eb89f9c999 Use more railway lines for routing
Closes #409
2022-03-30 17:24:56 +02:00
Arndt Brenschede
96ab3bf5c2 suspect-manager: display trigger-list 2022-02-12 21:48:26 +01:00
vcoppe
ffba802c9c scripts to handle diff updates 2022-02-11 21:41:01 +01:00
afischerdev
182426a4a2
Merge pull request #400 from afischerdev/update-osmand
trackdir and brouter.redirect changes
2022-02-11 15:59:14 +01:00
afischerdev
f577756433 trackdir and brouter.redirect changes 2022-02-11 15:00:55 +01:00
abrensch
fc5cf1f88a
Merge pull request #396 from rauner/master
correct spelling error in README.md
2022-02-02 10:13:01 +01:00
Sebastian R
518f045d16
spelling error in README.md 2022-02-02 09:48:46 +01:00
Arndt Brenschede
1640bafa80 fixed unknown value bug in pre-processor 2022-01-16 13:31:46 +01:00
Arndt Brenschede
771770af22 added bad-TRs analysis to pre-processor 2022-01-15 10:05:06 +01:00
Arndt Brenschede
8fd38da5c9 Merge branch 'master' of https://github.com/abrensch/brouter 2022-01-15 09:58:37 +01:00
Arndt Brenschede
7173e78214 TR bike exceptions also for foot-mode 2022-01-15 09:57:22 +01:00
abrensch
8624511fec
Merge pull request #392 from vodie/patch-1
no turnrestrictions for footmode
2022-01-15 09:53:19 +01:00
vodie
5f62723cbb
no turnrestrictions for footmode 2022-01-15 01:49:02 +01:00
afischerdev
80d0a30729
Merge pull request #390 from zod/cleanup-binstaller
Use layouts in DownloadManager
2022-01-12 14:54:41 +01:00
Manuel Fuhr
1a3a77de72 Use initial view as minimal zoom level 2022-01-11 21:46:34 +01:00
afischerdev
f54d0f2f97
Merge pull request #385 from zod/workflow-fixes
Fix GitHub workflows
2022-01-11 16:57:26 +01:00
afischerdev
cb684efc31
Merge pull request #386 from zod/379-migration-fixes
Restore access to legacy storage when targeting older SDK
2022-01-11 16:54:36 +01:00
Manuel Fuhr
395586cdda Delete test workflow 2022-01-07 13:17:41 +01:00
Manuel Fuhr
31e7c4ebbd Move info and button to own views 2022-01-07 13:03:01 +01:00
Manuel Fuhr
712bff8459 Use GestureDetector to handle touch events 2022-01-07 13:03:01 +01:00
Manuel Fuhr
b8496ffe5e Use View dimensions instead of display dimensions 2022-01-07 13:03:01 +01:00
Manuel Fuhr
50a7c2244f Clear all flags when scanning files 2022-01-07 13:03:01 +01:00
Manuel Fuhr
0eba6cb345 Move all download handling to BInstallerActivity 2022-01-07 13:03:01 +01:00
Manuel Fuhr
7b6fce1481 Move deleting tiles to BInstallerActivity 2022-01-07 13:03:00 +01:00
Manuel Fuhr
da7569b0a0 Use onClick handler to start download 2022-01-07 09:54:11 +01:00
Manuel Fuhr
51ef5c6aad Show download progress in different view 2022-01-07 09:53:39 +01:00
Manuel Fuhr
6045a18a61 Inflate BInstallerView from layout 2022-01-07 09:51:23 +01:00
Manuel Fuhr
89f075fa61 Draw only available segments 2022-01-07 09:51:23 +01:00
Manuel Fuhr
806ae6250e Draw rect using canvas 2022-01-07 09:51:23 +01:00
Manuel Fuhr
e045a732fb Rename DownloadReceiver 2022-01-07 09:51:23 +01:00
Manuel Fuhr
32747a1f6f Remove wakelock from BInstallerActivity 2022-01-07 09:51:23 +01:00
Manuel Fuhr
64a80e763b Merge startInstaller into constructor 2022-01-07 09:51:23 +01:00
Manuel Fuhr
dd7a2fcd98 More Fixes 2022-01-07 09:51:23 +01:00
Manuel Fuhr
de7dd71a94 Apply Quick Fixes suggested by Android Studio 2022-01-07 09:51:23 +01:00
Manuel Fuhr
89ef74f95b Android Studio automatic cleanup 2022-01-07 09:51:23 +01:00
Manuel Fuhr
553f064ce0 Optimize Imports 2022-01-07 09:51:23 +01:00
Manuel Fuhr
d9b8f69f59 Add test to ensure legacy storage access 2021-12-31 11:26:26 +01:00
Manuel Fuhr
b5f6acf63a Fix gradle-publish 2021-12-31 08:33:43 +01:00
Manuel Fuhr
82d28ed08a Fix build with empty signing environment variables
GitHub action secrets default to empty values if they aren't defined in
a repository. Any fork of the repo doesn't have access to the secrets
and the jobs therefore fail.
2021-12-31 08:30:37 +01:00
Manuel Fuhr
236c65d8ed Fix external storage access
- Allow writing on all versions
- Skip migration if BRouter version has already setup basedir
2021-12-30 14:24:03 +01:00
Manuel Fuhr
f2d4c755f6 Always use h1 for heading 2021-12-29 08:50:02 +01:00
Manuel Fuhr
80681b78c6 Restrucuture documentation to smaller parts 2021-12-29 08:49:47 +01:00
Manuel Fuhr
76f20ca864 Move algorithm to developer information 2021-12-29 08:45:40 +01:00
Manuel Fuhr
d45a811720 Move mapcreation to developer content 2021-12-29 08:45:40 +01:00
Manuel Fuhr
90b08e3fe5 Use baseurl like GitHub pages 2021-12-29 07:24:44 +01:00
Manuel Fuhr
0fba5307fe Add content from README 2021-12-29 07:24:44 +01:00
Manuel Fuhr
ca30bc86b9 Add profile developers guide 2021-12-29 07:17:31 +01:00
Manuel Fuhr
76736072bf Add feature navigation 2021-12-29 07:17:12 +01:00
Manuel Fuhr
0b9ea9b4c0 Import brouter website as markdown 2021-12-29 07:17:09 +01:00
afischerdev
15e84c81ea
Merge pull request #382 from afischerdev/update-version
Add file check for A10 #379
2021-12-28 11:02:38 +01:00
afischerdev
7b460d25d3 add file check for A10 #379 2021-12-28 10:49:56 +01:00
afischerdev
010141af47
Merge pull request #381 from afischerdev/update-version
Update workflow, add check equal folder #379
2021-12-27 20:04:25 +01:00
afischerdev
f5c3103dcf update workflow, add check #379 2021-12-27 20:02:13 +01:00
abrensch
f1b21fc270
Update gradle-publish-test.yml 2021-12-24 14:45:18 +01:00
abrensch
a7276c9be8
Update gradle-publish-test.yml 2021-12-24 14:40:47 +01:00
abrensch
d74b82944e
Update gradle-publish-test.yml 2021-12-24 14:24:37 +01:00
abrensch
7394ff151a
Update gradle-publish-test.yml 2021-12-24 14:12:55 +01:00
abrensch
06550dcbc8
Update gradle-publish-test.yml 2021-12-24 10:26:49 +01:00
abrensch
c128a45cc2
Copy gradle-publish-test from afisherdev repo 2021-12-23 17:45:54 +01:00
abrensch
50c5581e03
Update build.gradle
android version code 43->45 (due to problems uploading to google-play)
2021-12-23 16:38:41 +01:00
afischerdev
a85c742d7a
Merge pull request #377 from afischerdev/update-version
Update version
2021-12-21 17:14:24 +01:00
afischerdev
623f3c0279 set new version 1.6.3 2021-12-21 17:02:35 +01:00
afischerdev
905fe6df19 add osmand folder to distribution 2021-12-21 17:00:20 +01:00
Arndt Brenschede
a5b8ba459b removed brouter-suspects/traffic-analyisis profiles 2021-12-20 10:10:12 +01:00
Arndt Brenschede
492d79d42e inreased download speed limit from 4 to 16 mbit/s 2021-12-20 09:52:53 +01:00
Manuel Fuhr
c6bdf1709c Add jekyll configuration for GitHub pages 2021-12-01 07:41:14 +01:00
Manuel Fuhr
87a5cd21e8 Rewrap 2021-12-01 07:35:30 +01:00
Manuel Fuhr
dfc4bff7fd Use markdown syntax 2021-12-01 07:33:18 +01:00
Manuel Fuhr
d3482098f0 Rename to markdown 2021-11-30 17:32:16 +01:00
afischerdev
f0853adaca
Merge pull request #257 from Totorrr/profiles-paved-surfaces
Consider sett as paved in trekking.brf & fastbike.brf
2021-11-28 16:59:09 +01:00
afischerdev
19a021b36e
Merge pull request #372 from zod/102-unclassified-unpaved
fastbike: Check isunpaved for highway=unclassified
2021-11-28 16:30:20 +01:00
Manuel Fuhr
d1a5911b34 fastbike: Check isunpaved for highway=unclassified
Fixes #102
2021-11-26 17:44:41 +01:00
afischerdev
a37ee5b3d9
Merge pull request #368 from zod/reformat
Reformat using Android Studio
2021-11-21 11:37:34 +01:00
Manuel Fuhr
f349a4c2c5 Add reformat commit to blame-ignore-revs 2021-11-20 16:51:29 +01:00
Manuel Fuhr
54d5c5e943 Reformat files using Android Studio
android-studio/bin/format.sh -m '*.java' -r brouter-routing-app

To rebase active branches on top of the new master just rebase your
branch onto the commit prior to the reformatting and format every commit
of your branch using (<commit> should be replaced by this commit)

git rebase \
        --strategy-option=theirs \
        --onto <commit> \
        --exec 'format.sh -m "*.java" -r brouter-routing-app' \
        --exec 'git commit --all --no-edit --amend' \
        <commit>~

To ignore this mass edit during git blame see `.git-blame-ignore-revs`
2021-11-20 16:50:23 +01:00
Manuel Fuhr
13e0e382c1 Add blame-ignore-revs to allow ignoring mass edits 2021-11-20 16:48:38 +01:00
afischerdev
43028e0722
Merge pull request #367 from afischerdev/test-and11
Smaller changes for #312
2021-11-18 11:31:46 +01:00
afischerdev
227e7394b8 add a warning for BRouter folder in Android Q 2021-11-18 10:53:36 +01:00
afischerdev
168970ea34 removed option for main folder in Android Q 2021-11-18 10:47:09 +01:00
afischerdev
2794f5376e Cosmetics 2021-11-18 10:44:48 +01:00
afischerdev
7fa59d65db Cosmetics 2021-11-18 10:35:50 +01:00
afischerdev
3fddf3bd32 removed old icon 2021-11-18 10:27:20 +01:00
afischerdev
8088b13d63
Merge pull request #364 from zod/essbee-profile-intent
Add intent receiver for profiles

@zod and @EssBee59 
Many thanks for your work.
2021-11-18 10:05:25 +01:00
Manuel Fuhr
c92c289a3f Change profile check to also pass on river.brf 2021-11-18 06:27:21 +01:00
Manuel Fuhr
f29616eefc Don't overwrite built-in profiles 2021-11-18 06:27:21 +01:00
Manuel Fuhr
67bbc3d2ac Move serverconfig.txt handling to own class 2021-11-18 06:27:21 +01:00
Manuel Fuhr
1e594574b5 Improve UI 2021-11-16 16:25:44 +01:00
Manuel Fuhr
6a8f5036b2 Cleanup 2021-11-16 16:13:34 +01:00
Manuel Fuhr
18e015a3b5 Fix warning 2021-11-16 16:13:34 +01:00
Manuel Fuhr
a528630af9 Move profile import to own activity 2021-11-16 16:07:14 +01:00
Manuel Fuhr
a2c5b76105 Import code from @EssBee59
Fixes #362
2021-11-13 06:41:54 +01:00
afischerdev
31594880ef
Merge pull request #361 from afischerdev/test-and11
Smaller changes for #312
2021-11-07 15:22:45 +01:00
afischerdev
30b2c5d6aa check rd5 available 2021-11-07 14:24:16 +01:00
afischerdev
78baefcfeb change migration check 2021-11-07 13:56:21 +01:00
afischerdev
83693903ee change import path internal coord reader 2021-11-07 13:40:14 +01:00
afischerdev
0f0d7db18f Comparison profiles for download 2021-11-07 13:32:44 +01:00
afischerdev
c9baec210a
Merge pull request #356 from zod/permission-handling
Improve permission handling
2021-11-07 12:56:32 +01:00
Manuel Fuhr
db77728d4c Always fallback to CoordinateReaderInternal 2021-11-07 11:19:53 +01:00
afischerdev
80a043568c
Merge pull request #352 from zod/scoped-storage
Set preserveLegacyExternalStorage for easier upgrades
2021-11-02 19:15:13 +01:00
afischerdev
e4a29a163e
Merge pull request #349 from zod/update-rd5-sizes
Update rd5 sizes
2021-11-02 18:39:26 +01:00
Manuel Fuhr
dc95984199 Improve detection of sdcard write access
This allows reading waypoints from apps on till devices running API 28
if they store their files in a accessible path (not Android/data). It
even works for devices running API 30 if they installed it as an update.
2021-11-02 17:41:25 +01:00
afischerdev
ac15951eeb
Merge pull request #355 from afischerdev/test-and11
change manifest, add dependson
2021-10-31 11:14:28 +01:00
afischerdev
d383d271ac change manifest, add dependson 2021-10-31 11:05:27 +01:00
afischerdev
61e648df0d
Merge pull request #348 from zod/gradle-assets
Generate assets with gradle
2021-10-30 16:20:26 +02:00
Manuel Fuhr
5f00e94a4c gradle: Disable 'generate_profile_variants.sh' on Windows 2021-10-29 17:35:53 +02:00
Manuel Fuhr
bfe50a349b gradle: Autogenerate profiles2.zip and add as asset 2021-10-29 17:35:53 +02:00
Manuel Fuhr
12148f6a5d Request runtime permission for WRITE_EXTERNAL_STORAGE
Originally implemented by Erlkoenig90 (#244) but lost during the api
split.
2021-10-23 07:16:20 +02:00
Manuel Fuhr
0e04c1a849 Increase minSdkVersion to 14 and merge implementations
AndroidX needs at least API level 14 (Ice Cream Sandwich) which was
released 10 years ago and should not exclude many devices. Having a
merged tree simplifies the development.
2021-10-23 06:55:17 +02:00
Manuel Fuhr
cf4a188e40 Rename flavours to specify targetSdkVersion
The targetSdkVersion specifies which behaviour the app expects from the
android platform. For past releases of BRouter the targetSdkVersion was
specified in the filename, therefore this restores the old bevahior.
2021-10-23 06:55:17 +02:00
Manuel Fuhr
4d9046d0f5 Update rd5 sizes 2021-10-21 07:53:44 +02:00
Manuel Fuhr
555fa98914 Extend ReadSizes to get sizes from server index
Call "java ReadSizes.java https://brouter.de/brouter/segments4/" to
get an updated list of segment sizes which can be inserted into
BInstallerSizes.java
2021-10-21 07:53:44 +02:00
Manuel Fuhr
4e8b8643e6 Set preserveLegacyExternalStorage for easier upgrades
When updating from a previous app which has it's basedir on e.g.
/sdcard/emulated/0/brouter it's still possible to access the files on
android 11 with this flag.
2021-10-21 07:52:07 +02:00
Manuel Fuhr
f5a415bd68 Format ReadSizes before changes 2021-10-17 09:20:56 +02:00
Manuel Fuhr
e8d8dcda4a gradle: Fix indention before changes 2021-10-15 16:18:47 +02:00
afischerdev
ac13b1fe34
Merge pull request #347 from afischerdev/test-and11
Update for Android 11 part 5
2021-10-15 13:09:47 +02:00
afischerdev
51291237c5 add some brf - removed to much 2021-10-15 12:57:04 +02:00
afischerdev
7fc66c153f Merge branch 'test-and11' of https://github.com/afischerdev/brouter into test-and11 2021-10-15 11:29:21 +02:00
afischerdev
e835e9d5f2
Merge pull request #344 from zod/gradle-android
GitHub Workflow: Provide Artifacts (APK & ZIP)
2021-10-15 11:22:24 +02:00
Manuel Fuhr
3173658690 Make distZip depend on fatJar 2021-10-13 19:30:40 +02:00
Manuel Fuhr
abb338afc9 Upload brouter-*.zip as artifact 2021-10-13 19:30:40 +02:00
Manuel Fuhr
4b39e6ea66 Update JDK and build brouter-routing-app (android) 2021-10-13 19:30:17 +02:00
afischerdev
2849271e24
Merge pull request #345 from zod/editorconfig
Add .editorconfig to harmonize indention
2021-10-11 15:45:07 +02:00
Manuel Fuhr
23bca4ccb3 Add .editorconfig to harmonize indention
EditorConfig (https://editorconfig.org) helps maintain consistent coding
styles for multiple developers working on the same project across
various editors and IDEs.
2021-10-09 08:21:38 +02:00
afischerdev
a2c5007552 Merge branch 'test-and11' of https://github.com/afischerdev/brouter into test-and11 2021-10-07 13:38:40 +02:00
afischerdev
014079898a
Merge pull request #343 from abrensch/revert-342-test-and11
Revert "Update from abrensch/brouter"
2021-10-07 13:28:45 +02:00
afischerdev
151cb6b60e gradle for publish #339 2021-10-07 12:21:02 +02:00
multimodaal
4ba29cac95
Update river profile
Add support for routing over fairways which are used for navigable routes in a body of water
(where waterway=river or waterway=canal are not appropriate).
2021-05-22 14:13:02 +02:00
Fabien
7c56886eec Same onewaypenalty for fastbike-verylowtraffic:
Consider cycleway:left&right in onewaypenalty
Avoids discarding some fine ways during routing
2020-10-03 22:56:38 +02:00
Fabien
33c984d558 Consider cycleway:left&right in onewaypenalty
Avoids discarding some fine ways during routing
2020-09-25 15:28:07 +02:00
Totorrr
41b4008330 Update trekking.brf & fastbike.brf
add "sett" as a paved surface (often nicer for bikes than cobblestone)
otherwise, sett surfaces tend to be avoided in routing, while it is usually smooth
even for thin wheels
2020-06-29 19:22:49 +02:00
Gautier Pelloux-Prayer
f39d8834f1 Add MTB profile 2019-09-02 19:05:09 +02:00
453 changed files with 71677 additions and 27856 deletions

14
.editorconfig Normal file
View file

@ -0,0 +1,14 @@
root = true
[*]
end_of_line = lf
insert_final_newline = true
trim_trailing_whitespace = true
[*.java]
indent_style = space
indent_size = 2
[*.gradle]
indent_style = space
indent_size = 4

12
.git-blame-ignore-revs Normal file
View file

@ -0,0 +1,12 @@
# Commits which should be ignored by git blame
# You have to instruct git to use this file using either
# `git blame --ignore-revs-file .git-blame-ignore-revs`
# or update you git config to always include those commits
# `git config blame.ignoreRevsFile .git-blame-ignore-revs`
# Reformat brouter-routing-app using Android Studio
54d5c5e9439be2c3df4c95b6fc12d33fdcc9b389
# Reformat whole codebase using Android Studio
c15913c1ab9befd8d583d4a7716d5043d2966f64

98
.github/workflows/docker-publish.yml vendored Normal file
View file

@ -0,0 +1,98 @@
name: Docker
# This workflow uses actions that are not certified by GitHub.
# They are provided by a third-party and are governed by
# separate terms of service, privacy policy, and support
# documentation.
on:
schedule:
- cron: '21 9 * * *'
push:
branches: [ "master" ]
# Publish semver tags as releases.
tags: [ 'v*.*.*' ]
pull_request:
branches: [ "master" ]
env:
# Use docker.io for Docker Hub if empty
REGISTRY: ghcr.io
# github.repository as <account>/<repo>
IMAGE_NAME: ${{ github.repository }}
jobs:
build:
runs-on: ubuntu-latest
permissions:
contents: read
packages: write
# This is used to complete the identity challenge
# with sigstore/fulcio when running outside of PRs.
id-token: write
steps:
- name: Checkout repository
uses: actions/checkout@v3
# Install the cosign tool except on PR
# https://github.com/sigstore/cosign-installer
- name: Install cosign
if: github.event_name != 'pull_request'
uses: sigstore/cosign-installer@6e04d228eb30da1757ee4e1dd75a0ec73a653e06 #v3.1.1
with:
cosign-release: 'v2.1.1'
# Set up BuildKit Docker container builder to be able to build
# multi-platform images and export cache
# https://github.com/docker/setup-buildx-action
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@f95db51fddba0c2d1ec667646a06c2ce06100226 # v3.0.0
# Login against a Docker registry except on PR
# https://github.com/docker/login-action
- name: Log into registry ${{ env.REGISTRY }}
if: github.event_name != 'pull_request'
uses: docker/login-action@343f7c4344506bcbf9b4de18042ae17996df046d # v3.0.0
with:
registry: ${{ env.REGISTRY }}
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
# Extract metadata (tags, labels) for Docker
# https://github.com/docker/metadata-action
- name: Extract Docker metadata
id: meta
uses: docker/metadata-action@96383f45573cb7f253c731d3b3ab81c87ef81934 # v5.0.0
with:
images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
# Build and push Docker image with Buildx (don't push on PR)
# https://github.com/docker/build-push-action
- name: Build and push Docker image
id: build-and-push
uses: docker/build-push-action@0565240e2d4ab88bba5387d719585280857ece09 # v5.0.0
with:
context: .
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=gha
cache-to: type=gha,mode=max
# Sign the resulting Docker image digest except on PRs.
# This will only write to the public Rekor transparency log when the Docker
# repository is public to avoid leaking data. If you would like to publish
# transparency data even for private images, pass --force to cosign below.
# https://github.com/sigstore/cosign
# - name: Sign the published Docker image
# if: ${{ github.event_name != 'pull_request' }}
# env:
# # https://docs.github.com/en/actions/security-guides/security-hardening-for-github-actions#using-an-intermediate-environment-variable
# TAGS: ${{ steps.meta.outputs.tags }}
# DIGEST: ${{ steps.build-and-push.outputs.digest }}
# # This step uses the identity token to provision an ephemeral certificate
# # against the sigstore community Fulcio instance.
# run: echo "${TAGS}" | xargs -I {} cosign sign --yes {}@${DIGEST}

View file

@ -4,7 +4,7 @@
name: Gradle Package
on:
workflow_dispatch:
release:
types: [created]
@ -12,25 +12,35 @@ jobs:
build:
runs-on: ubuntu-latest
environment: BRouter
permissions:
contents: read
packages: write
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v4
- name: Set up JDK 11
uses: actions/setup-java@v2
- name: Set up JDK 17
uses: actions/setup-java@v4
with:
java-version: '11'
distribution: 'adopt'
java-version: '17'
distribution: 'temurin'
server-id: github # Value of the distributionManagement/repository/id field of the pom.xml
settings-path: ${{ github.workspace }} # location for the settings.xml file
- name: Setup keystore
env:
BROUTER_KEYSTORE_BASE64: ${{ secrets.BROUTER_KEYSTORE_BASE64 }}
run: |
echo $BROUTER_KEYSTORE_BASE64 | base64 -di > ${{ github.workspace }}/brouter.jks
- name: Build with Gradle
env:
ORG_GRADLE_PROJECT_RELEASE_STORE_FILE: ${{ secrets.BROUTER_KEYSTORE_FILE }}
ORG_GRADLE_PROJECT_RELEASE_KEY_ALIAS: ${{ secrets.BROUTER_KEY_ALIAS }}
ORG_GRADLE_PROJECT_RELEASE_KEY_PASSWORD: ${{ secrets.BROUTER_KEY_PASSWORD }}
ORG_GRADLE_PROJECT_RELEASE_STORE_PASSWORD: ${{ secrets.BROUTER_STORE_PASSWORD }}
run: gradle build
# The USERNAME and TOKEN need to correspond to the credentials environment variables used in
# the publishing section of your build.gradle
- name: Publish to GitHub Packages

View file

@ -13,14 +13,31 @@ jobs:
build:
runs-on: ubuntu-latest
environment: BRouter
steps:
- uses: actions/checkout@v2
- name: Set up JDK 8
uses: actions/setup-java@v2
- uses: actions/checkout@v4
- name: Set up JDK 17
uses: actions/setup-java@v4
with:
java-version: '8'
distribution: 'zulu'
java-version: '17'
distribution: 'temurin'
cache: gradle
- name: Create local.properties
run: touch local.properties
- name: Setup keystore
env:
BROUTER_KEYSTORE_BASE64: ${{ secrets.BROUTER_KEYSTORE_BASE64 }}
run: |
echo $BROUTER_KEYSTORE_BASE64 | base64 -di > ${{ github.workspace }}/brouter.jks
- name: Build with Gradle
env:
ORG_GRADLE_PROJECT_RELEASE_STORE_FILE: ${{ secrets.BROUTER_KEYSTORE_FILE }}
ORG_GRADLE_PROJECT_RELEASE_KEY_ALIAS: ${{ secrets.BROUTER_KEY_ALIAS }}
ORG_GRADLE_PROJECT_RELEASE_KEY_PASSWORD: ${{ secrets.BROUTER_KEY_PASSWORD }}
ORG_GRADLE_PROJECT_RELEASE_STORE_PASSWORD: ${{ secrets.BROUTER_STORE_PASSWORD }}
run: ./gradlew build
- name: Upload ZIP
uses: actions/upload-artifact@v4
with:
name: ZIP
path: brouter-server/build/distributions/brouter-*.zip

2
.gitignore vendored
View file

@ -1,6 +1,7 @@
*.iml
.gradle
.idea/
build
/local.properties
/.idea/caches
/.idea/gradle.xml
@ -10,7 +11,6 @@
/.idea/navEditor.xml
/.idea/assetWizardSettings.xml
.DS_Store
/build
/captures
.externalNativeBuild
.cxx

14
Dockerfile Normal file
View file

@ -0,0 +1,14 @@
FROM gradle:jdk17-jammy as build
RUN mkdir /tmp/brouter
WORKDIR /tmp/brouter
COPY . .
RUN ./gradlew clean build
FROM openjdk:17.0.1-jdk-slim
COPY --from=build /tmp/brouter/brouter-server/build/libs/brouter-*-all.jar /brouter.jar
COPY --from=build /tmp/brouter/misc/scripts/standalone/server.sh /bin/
COPY --from=build /tmp/brouter/misc/* /profiles2
CMD /bin/server.sh

121
README.md
View file

@ -1,6 +1,63 @@
BRouter
=======
# How to
To build the Docker image run (in the project's top level directory):
```
docker build -t brouter .
```
Download the segment files
```
wget -rkpN -np -e robots=off -l1 https://brouter.de/brouter/segments4/
```
Download the profile files from GitHub (https://github.com/gpxstudio/brouter/tree/master/misc/profiles2) using
```
https://downgit.github.io/
```
or copy misc/profiles2 directly from a git clone of the repository.
You can also generate the profiles with their variants using
```
generate_profile_variants.sh
```
You can also download the original BRouter profiles using
```
wget -rkpN -np -e robots=off -l1 https://brouter.de/brouter/profiles2/
```
and place them in the folders that will later be mounted as shared volumes in Docker
```
mv ./brouter.de/brouter/segments4 /home/nvme/dockerdata/brouter/segments
mv ./brouter.de/brouter/profiles2 /home/nvme/dockerdata/brouter/profiles
```
or by using the script
```
download_segments.sh
```
Start the container with the following Docker Compose file
```
services:
brouterserver:
container_name: brouter
image: brouter
ports:
- 17777:17777
volumes:
- /home/nvme/dockerdata/brouter/segments:/segments4
- /home/nvme/dockerdata/brouter/profiles:/profiles2
restart: unless-stopped
```
## How to query it
```
https://brouter.patachina.it/?lonlats=12.08349699,44.25067665|12.09165011,44.24834552&profile=Trekking-dry&format=geojson&alternativeidx=0
```
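A request URL like the one above can be assembled from pipe-separated `lon,lat` pairs. A minimal sketch in Python — the host `localhost:17777` matches the port mapping from the compose file above, and `trekking` stands in for whatever profile name your server actually has installed:

```python
from urllib.parse import urlencode


def brouter_url(base, points, profile, fmt="geojson", alternativeidx=0):
    """Build a BRouter request URL from a list of (lon, lat) waypoints."""
    # BRouter expects lon,lat pairs joined by '|' (sent URL-encoded as %7C)
    lonlats = "|".join(f"{lon},{lat}" for lon, lat in points)
    query = urlencode({
        "lonlats": lonlats,
        "profile": profile,
        "format": fmt,
        "alternativeidx": alternativeidx,
    })
    return f"{base}/brouter?{query}"


url = brouter_url(
    "http://localhost:17777",
    [(12.08349699, 44.25067665), (12.09165011, 44.24834552)],
    "trekking",
)
print(url)
```

Fetching that URL (e.g. with `curl`) returns the route as GeoJSON when the server is running.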
# Original
BRouter is a configurable OSM offline router with elevation awareness, Java +
Android. Designed to be multi-modal with a particular emphasis on bicycle
and energy-based car routing.
@ -15,7 +72,7 @@ You can install the BRouter app on your Android device from
Store](https://play.google.com/store/apps/details?id=btools.routingapp). You
can also [build BRouter](#build-and-install) yourself. You can find detailed
documentation of the BRouter Android app in
[`misc/readmes/readme.txt`](misc/readmes/readme.txt).
[`docs/users/android_quickstart.md`](docs/users/android_quickstart.md).
<a href="https://f-droid.org/packages/btools.routingapp" target="_blank">
<img src="https://f-droid.org/badge/get-it-on.png" alt="Get it on F-Droid" height="90"/></a>
@ -39,7 +96,7 @@ Alternatively, you can also use BRouter as the offline routing engine for
[OSMAnd](https://osmand.net/) on your Android device.
A full documentation on how to set this up is available at
[`misc/readmes/osmand/README.md`](misc/readmes/osmand/README.md).
[`docs/users/osmand.md`](docs/users/osmand.md).
## BRouter on Windows/Linux/Mac OS
@ -98,7 +155,7 @@ Segments files from the whole planet are generated weekly at
[https://brouter.de/brouter/segments4/](http://brouter.de/brouter/segments4/).
You can download one or more segments files, covering the area of the planet
your want to route, into the `misc/segments4` directory.
you want to route, into the `misc/segments4` directory.
#### Generate your own segments files
@ -106,7 +163,7 @@ You can also generate the segments files you need directly from a planet dump
of OpenStreetMap data (or a [GeoFabrik extract](https://download.geofabrik.de/)).
More documentation of this is available in the
[`misc/readmes/mapcreation.md`](misc/readmes/mapcreation.md) file.
[`docs/developers/build_segments.md`](docs/developers/build_segments.md) file.
### (Optional) Generate profile variants
@ -120,7 +177,7 @@ to help you quickly generate variants based on the default profiles, to create
a default set of profiles covering most of the basic use cases.
Have a look at the
[`misc/readmes/profile_developers_guide.txt`](misc/readmes/profile_developers_guide.txt)
[`docs/developers/profile_developers_guide.md`](docs/developers/profile_developers_guide.md)
for an in-depth guide on profiles edition and customization.
@ -137,10 +194,62 @@ The API endpoints exposed by this HTTP server are documented in the
[`brouter-server/src/main/java/btools/server/request/ServerHandler.java`](brouter-server/src/main/java/btools/server/request/ServerHandler.java)
file.
The server emits log data for each routing request on stdout. For each routing
request a line with the following eight fields is printed. The fields are
separated by whitespace.
- timestamp, in ISO8601 format, e.g. `2024-05-14T21:11:26.499+02:00`
- current server session count (integer number 1-999) or "new" when a new
IP address is detected
- IP address (IPv4 or IPv6), prefixed by `ip=`
- duration of routing request in ms, prefixed by `ms=`
- divider `->`
- HTTP request method
- HTTP request URL
- HTTP request version
Example log output:
```
2024-05-14T21:11:26.499+02:00 new ip=127.0.0.1 ms=189 -> GET /brouter?lonlats=13.377485,52.516247%7C13.351221,52.515004&profile=trekking&alternativeidx=0&format=geojson HTTP/1.1
2024-05-14T21:11:33.229+02:00 1 ip=127.0.0.1 ms=65 -> GET /brouter?lonlats=13.377485,52.516247%7C13.351221,52.515004&profile=trekking&alternativeidx=0&format=geojson HTTP/1.1
```
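Since the eight fields are whitespace-separated and a valid request URL contains no spaces, a log line splits cleanly. A minimal parsing sketch (the dictionary keys are descriptive names chosen here, not anything BRouter defines):

```python
def parse_log_line(line: str) -> dict:
    """Split one BRouter access-log line into its eight documented fields."""
    ts, session, ip, ms, _arrow, method, url, version = line.split()
    return {
        "timestamp": ts,                      # ISO8601 timestamp
        "session": session,                   # session count or "new"
        "ip": ip.removeprefix("ip="),
        "ms": int(ms.removeprefix("ms=")),    # request duration in ms
        "method": method,
        "url": url,
        "version": version,
    }


# First example line from the log output above
line = ("2024-05-14T21:11:26.499+02:00 new ip=127.0.0.1 ms=189 -> "
        "GET /brouter?lonlats=13.377485,52.516247%7C13.351221,52.515004"
        "&profile=trekking&alternativeidx=0&format=geojson HTTP/1.1")
print(parse_log_line(line))
```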
## BRouter with Docker
To build the Docker image run (in the project's top level directory):
```
docker build -t brouter .
```
Download the segment files as described in the previous chapter. The folder containing the
segment files can be mounted into the container. Run BRouter as follows:
```
docker run --rm \
-v ./misc/scripts/segments4:/segments4 \
-p 17777:17777 \
--name brouter \
brouter
```
This will start brouter with a set of default routing profiles. It will be accessible on port 17777.
If you want to provide your own routing profiles, you can also mount the folder containing the custom profiles:
```
docker run --rm \
-v ./misc/scripts/segments4:/segments4 \
-v /path/to/custom/profiles:/profiles2 \
-p 17777:17777 \
--name brouter \
brouter
```
## Documentation
More documentation is available in the [`misc/readmes`](misc/readmes) folder.
More documentation is available in the [`docs`](docs) folder.
## Related Projects

View file

@ -1 +0,0 @@
/build/

View file

@ -1,8 +1,7 @@
plugins {
id 'java-library'
id 'brouter.library-conventions'
}
dependencies {
implementation project(':brouter-util')
testImplementation 'junit:junit:4.13.1'
}

View file

@ -1,3 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<manifest package="btools.codec" />

View file

@ -5,11 +5,10 @@ import btools.util.BitCoderContext;
/**
* Container for some re-usable databuffers for the decoder
*/
public final class DataBuffers
{
public final class DataBuffers {
public byte[] iobuffer;
public byte[] tagbuf1 = new byte[256];
public BitCoderContext bctx1 = new BitCoderContext( tagbuf1 );
public BitCoderContext bctx1 = new BitCoderContext(tagbuf1);
public byte[] bbuf1 = new byte[65636];
public int[] ibuf1 = new int[4096];
public int[] ibuf2 = new int[2048];
@ -17,17 +16,15 @@ public final class DataBuffers
public int[] alon = new int[2048];
public int[] alat = new int[2048];
public DataBuffers()
{
this( new byte[65636] );
public DataBuffers() {
this(new byte[65636]);
}
/**
* construct a set of databuffers except
* for 'iobuffer', where the given array is used
*/
public DataBuffers( byte[] iobuffer )
{
public DataBuffers(byte[] iobuffer) {
this.iobuffer = iobuffer;
}

View file

@ -3,16 +3,14 @@ package btools.codec;
/**
* Special integer fifo suitable for 3-pass encoding
*/
public class IntegerFifo3Pass
{
public class IntegerFifo3Pass {
private int[] a;
private int size;
private int pos;
private int pass;
public IntegerFifo3Pass( int capacity )
{
public IntegerFifo3Pass(int capacity) {
a = capacity < 4 ? new int[4] : new int[capacity];
}
@ -20,8 +18,7 @@ public class IntegerFifo3Pass
* Starts a new encoding pass and resets the reading pointer
* from the stats collected in pass2 and writes that to the given context
*/
public void init()
{
public void init() {
pass++;
pos = 0;
}
@ -29,14 +26,11 @@ public class IntegerFifo3Pass
/**
* writes to the fifo in pass2
*/
public void add( int value )
{
if ( pass == 2 )
{
if ( size == a.length )
{
public void add(int value) {
if (pass == 2) {
if (size == a.length) {
int[] aa = new int[2 * size];
System.arraycopy( a, 0, aa, 0, size );
System.arraycopy(a, 0, aa, 0, size);
a = aa;
}
a[size++] = value;
@ -46,16 +40,13 @@ public class IntegerFifo3Pass
/**
* reads from the fifo in pass3 (in pass1/2 returns just 1)
*/
public int getNext()
{
return pass == 3 ? get( pos++ ) : 1;
public int getNext() {
return pass == 3 ? get(pos++) : 1;
}
private int get( int idx )
{
if ( idx >= size )
{
throw new IndexOutOfBoundsException( "list size=" + size + " idx=" + idx );
private int get(int idx) {
if (idx >= size) {
throw new IndexOutOfBoundsException("list size=" + size + " idx=" + idx);
}
return a[idx];
}
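The fifo above drives BRouter's 3-pass encoding: pass 1 only counts, pass 2 records the real values, pass 3 replays them as look-ahead. A standalone sketch of that pattern (the class name `Fifo3PassDemo` is illustrative, not part of the source):

```java
// Minimal re-implementation of the 3-pass fifo pattern (illustrative only).
public class Fifo3PassDemo {
  private int[] a = new int[4];
  private int size, pos, pass;

  public void init() { pass++; pos = 0; }  // start the next pass

  public void add(int value) {             // values are recorded only in pass 2
    if (pass == 2) {
      if (size == a.length) {
        int[] aa = new int[2 * size];
        System.arraycopy(a, 0, aa, 0, size);
        a = aa;
      }
      a[size++] = value;
    }
  }

  public int getNext() {                   // replayed in pass 3, dummy value 1 before
    return pass == 3 ? a[pos++] : 1;
  }
}
```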


@ -3,8 +3,7 @@ package btools.codec;
/**
* Simple container for a list of lists of integers
*/
public class LinkedListContainer
{
public class LinkedListContainer {
private int[] ia; // prev, data, prev, data, ...
private int size;
private int[] startpointer; // 0=void, odd=head-data-cell
@ -12,49 +11,44 @@ public class LinkedListContainer
/**
* Construct a container for the given number of lists
*
* <p>
* If no default-buffer is given, an int[nlists*4] is constructed,
* able to hold 2 entries per list on average
*
* @param nlists the number of lists
* @param nlists the number of lists
* @param defaultbuffer an optional data array for re-use (gets replaced if too small)
*/
public LinkedListContainer( int nlists, int[] defaultbuffer )
{
ia = defaultbuffer == null ? new int[nlists*4] : defaultbuffer;
startpointer = new int[nlists];
public LinkedListContainer(int nlists, int[] defaultbuffer) {
ia = defaultbuffer == null ? new int[nlists * 4] : defaultbuffer;
startpointer = new int[nlists];
}
/**
* Add a data element to the given list
*
* @param listNr the list to add the data to
* @param data the data value
* @param data the data value
*/
public void addDataElement( int listNr, int data )
{
if ( size + 2 > ia.length )
{
public void addDataElement(int listNr, int data) {
if (size + 2 > ia.length) {
resize();
}
ia[size++] = startpointer[ listNr ];
startpointer[ listNr ] = size;
ia[size++] = startpointer[listNr];
startpointer[listNr] = size;
ia[size++] = data;
}
/**
* Initialize a list for reading
*
* @param listNr the list to initialize
* @return the number of entries in that list
*/
public int initList( int listNr )
{
public int initList(int listNr) {
int cnt = 0;
int lp = listpointer = startpointer[ listNr ];
while( lp != 0 )
{
lp = ia[ lp-1 ];
int lp = listpointer = startpointer[listNr];
while (lp != 0) {
lp = ia[lp - 1];
cnt++;
}
return cnt;
@ -67,21 +61,18 @@ public class LinkedListContainer
* @return the data element
* @throws IllegalArgumentException if no more element
*/
public int getDataElement()
{
if ( listpointer == 0 )
{
throw new IllegalArgumentException( "no more element!" );
public int getDataElement() {
if (listpointer == 0) {
throw new IllegalArgumentException("no more element!");
}
int data = ia[ listpointer ];
listpointer = ia[ listpointer-1 ];
int data = ia[listpointer];
listpointer = ia[listpointer - 1];
return data;
}
private void resize()
{
int[] ia2 = new int[2*ia.length];
System.arraycopy( ia, 0, ia2, 0, ia.length );
private void resize() {
int[] ia2 = new int[2 * ia.length];
System.arraycopy(ia, 0, ia2, 0, ia.length);
ia = ia2;
}
}
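The container packs many small integer lists into one flat, growable `int[]` of (prev-pointer, data) cell pairs, so reading a list pops the most recently added element first. A standalone sketch of the same technique (class and method names are illustrative):

```java
// Many integer lists packed into one growable int[]; cells are (prev, data) pairs.
// A startpointer of 0 means "empty list"; otherwise it indexes the head data cell.
public class ListsInArrayDemo {
  private int[] ia = new int[8];     // prev, data, prev, data, ...
  private int size;
  private final int[] startpointer;
  private int listpointer;

  public ListsInArrayDemo(int nlists) { startpointer = new int[nlists]; }

  public void add(int listNr, int data) {
    if (size + 2 > ia.length) {                 // grow by doubling
      int[] ia2 = new int[2 * ia.length];
      System.arraycopy(ia, 0, ia2, 0, ia.length);
      ia = ia2;
    }
    ia[size++] = startpointer[listNr];          // link back to the previous head
    startpointer[listNr] = size;                // head now points at the data cell
    ia[size++] = data;
  }

  public int initList(int listNr) {             // prepare reading; returns element count
    int cnt = 0;
    int lp = listpointer = startpointer[listNr];
    while (lp != 0) { lp = ia[lp - 1]; cnt++; }
    return cnt;
  }

  public int next() {                           // pops in LIFO order
    int data = ia[listpointer];
    listpointer = ia[listpointer - 1];
    return data;
  }
}
```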


@ -5,21 +5,20 @@ import btools.util.ByteDataWriter;
/**
* a micro-cache is a data cache for an area of some square kilometers or some
* hundreds or thousands of nodes
*
* <p>
* This is the basic io-unit: always a full microcache is loaded from the
* data-file if a node is requested at a position not yet covered by the caches
* already loaded
*
* <p>
* The nodes are represented in a compact way (typical 20-50 bytes per node),
* but in a way that they do not depend on each other, and garbage collection is
* supported to remove the nodes already consumed from the cache.
*
* <p>
* The cache-internal data representation is different from that in the
* data-files, where a cache is encoded as a whole, allowing more
* redundancy-removal for a more compact encoding
*/
public class MicroCache extends ByteDataWriter
{
public class MicroCache extends ByteDataWriter {
protected int[] faid;
protected int[] fapos;
protected int size = 0;
@ -35,25 +34,21 @@ public class MicroCache extends ByteDataWriter
public static boolean debug = false;
protected MicroCache( byte[] ab )
{
super( ab );
protected MicroCache(byte[] ab) {
super(ab);
}
public final static MicroCache emptyNonVirgin = new MicroCache( null );
public final static MicroCache emptyNonVirgin = new MicroCache(null);
static
{
static {
emptyNonVirgin.virgin = false;
}
public static MicroCache emptyCache()
{
return new MicroCache( null ); // TODO: singleton?
public static MicroCache emptyCache() {
return new MicroCache(null); // TODO: singleton?
}
protected void init( int size )
{
protected void init(int size) {
this.size = size;
delcount = 0;
delbytes = 0;
@ -62,35 +57,31 @@ public class MicroCache extends ByteDataWriter
p2size >>= 1;
}
public final void finishNode( long id )
{
public final void finishNode(long id) {
fapos[size] = aboffset;
faid[size] = shrinkId( id );
faid[size] = shrinkId(id);
size++;
}
public final void discardNode()
{
aboffset = startPos( size );
public final void discardNode() {
aboffset = startPos(size);
}
public final int getSize()
{
public final int getSize() {
return size;
}
public final int getDataSize()
{
public final int getDataSize() {
return ab == null ? 0 : ab.length;
}
/**
* Set the internal reader (aboffset, aboffsetEnd) to the body data for the given id
*
* <p>
* If a node is not found in an empty cache, this is usually an edge-effect
* (data-file does not exist or neighbouring data-files of different age),
* but it can as well be a symptom of a node-identity breaking bug.
*
* <p>
* Current implementation always returns false for not-found, however, for
* regression testing, at least for the case that is most likely a bug
* (node found but marked as deleted = ready for garbage collection
@ -98,38 +89,31 @@ public class MicroCache extends ByteDataWriter
*
* @return true if id was found
*/
public final boolean getAndClear( long id64 )
{
if ( size == 0 )
{
public final boolean getAndClear(long id64) {
if (size == 0) {
return false;
}
int id = shrinkId( id64 );
int id = shrinkId(id64);
int[] a = faid;
int offset = p2size;
int n = 0;
while (offset > 0)
{
while (offset > 0) {
int nn = n + offset;
if ( nn < size && a[nn] <= id )
{
if (nn < size && a[nn] <= id) {
n = nn;
}
offset >>= 1;
}
if ( a[n] == id )
{
if ( ( fapos[n] & 0x80000000 ) == 0 )
{
aboffset = startPos( n );
if (a[n] == id) {
if ((fapos[n] & 0x80000000) == 0) {
aboffset = startPos(n);
aboffsetEnd = fapos[n];
fapos[n] |= 0x80000000; // mark deleted
delbytes += aboffsetEnd - aboffset;
delcount++;
return true;
}
else // .. marked as deleted
} else // .. marked as deleted
{
// throw new RuntimeException( "MicroCache: node already consumed: id=" + id );
}
@ -137,43 +121,35 @@ public class MicroCache extends ByteDataWriter
return false;
}
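The id lookup in `getAndClear` above is a branch-light binary search over the sorted `faid` array using power-of-two strides. A standalone sketch (here `Integer.highestOneBit` stands in for the precomputed `p2size`; a zero-size cache is assumed to be handled by the caller, as `getAndClear` does):

```java
// Power-of-two-stride search over a sorted int[], as in getAndClear above.
// Returns the index of the largest element <= id, or 0 if all elements are
// larger (the caller must still verify a[n] == id).
public final class StrideSearchDemo {
  public static int find(int[] a, int size, int id) {
    int offset = Integer.highestOneBit(size);  // stand-in for p2size
    int n = 0;
    while (offset > 0) {
      int nn = n + offset;
      if (nn < size && a[nn] <= id) {
        n = nn;                                // move right while still <= id
      }
      offset >>= 1;                            // halve the stride
    }
    return n;
  }
}
```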
protected final int startPos( int n )
{
protected final int startPos(int n) {
return n > 0 ? fapos[n - 1] & 0x7fffffff : 0;
}
public final int collect( int threshold )
{
if ( delcount <= threshold )
{
public final int collect(int threshold) {
if (delcount <= threshold) {
return 0;
}
virgin = false;
int nsize = size - delcount;
if ( nsize == 0 )
{
if (nsize == 0) {
faid = null;
fapos = null;
}
else
{
} else {
int[] nfaid = new int[nsize];
int[] nfapos = new int[nsize];
int idx = 0;
byte[] nab = new byte[ab.length - delbytes];
int nab_off = 0;
for ( int i = 0; i < size; i++ )
{
for (int i = 0; i < size; i++) {
int pos = fapos[i];
if ( ( pos & 0x80000000 ) == 0 )
{
int start = startPos( i );
if ((pos & 0x80000000) == 0) {
int start = startPos(i);
int end = fapos[i];
int len = end - start;
System.arraycopy( ab, start, nab, nab_off, len );
System.arraycopy(ab, start, nab, nab_off, len);
nfaid[idx] = faid[i];
nab_off += len;
nfapos[idx] = nab_off;
@ -185,17 +161,15 @@ public class MicroCache extends ByteDataWriter
ab = nab;
}
int deleted = delbytes;
init( nsize );
init(nsize);
return deleted;
}
public final void unGhost()
{
public final void unGhost() {
ghost = false;
delcount = 0;
delbytes = 0;
for ( int i = 0; i < size; i++ )
{
for (int i = 0; i < size; i++) {
fapos[i] &= 0x7fffffff; // clear deleted flags
}
}
@ -203,201 +177,166 @@ public class MicroCache extends ByteDataWriter
/**
* @return the 64-bit global id for the given cache-position
*/
public final long getIdForIndex( int i )
{
public final long getIdForIndex(int i) {
int id32 = faid[i];
return expandId( id32 );
return expandId(id32);
}
/**
* expand a 32-bit micro-cache-internal id into a 64-bit (lon|lat) global-id
*
*
* @see #shrinkId
*/
public long expandId( int id32 )
{
throw new IllegalArgumentException( "expandId for empty cache" );
public long expandId(int id32) {
throw new IllegalArgumentException("expandId for empty cache");
}
/**
* shrink a 64-bit (lon|lat) global-id into a 32-bit micro-cache-internal id
*
*
* @see #expandId
*/
public int shrinkId( long id64 )
{
throw new IllegalArgumentException( "shrinkId for empty cache" );
public int shrinkId(long id64) {
throw new IllegalArgumentException("shrinkId for empty cache");
}
/**
* @return true if the given lon/lat position is internal for that micro-cache
*/
public boolean isInternal( int ilon, int ilat )
{
throw new IllegalArgumentException( "isInternal for empty cache" );
public boolean isInternal(int ilon, int ilat) {
throw new IllegalArgumentException("isInternal for empty cache");
}
/**
* (statistically) encode the micro-cache into the format used in the datafiles
*
* @param buffer
* byte array to encode into (considered big enough)
*
* @param buffer byte array to encode into (considered big enough)
* @return the size of the encoded data
*/
public int encodeMicroCache( byte[] buffer )
{
throw new IllegalArgumentException( "encodeMicroCache for empty cache" );
public int encodeMicroCache(byte[] buffer) {
throw new IllegalArgumentException("encodeMicroCache for empty cache");
}
/**
* Compare the content of this microcache to another
*
*
* @return null if equals, else a diff-report
*/
public String compareWith( MicroCache mc )
{
String msg = _compareWith( mc );
if ( msg != null )
{
StringBuilder sb = new StringBuilder( msg );
sb.append( "\nencode cache:\n" ).append( summary() );
sb.append( "\ndecode cache:\n" ).append( mc.summary() );
public String compareWith(MicroCache mc) {
String msg = _compareWith(mc);
if (msg != null) {
StringBuilder sb = new StringBuilder(msg);
sb.append("\nencode cache:\n").append(summary());
sb.append("\ndecode cache:\n").append(mc.summary());
return sb.toString();
}
return null;
}
private String summary()
{
StringBuilder sb = new StringBuilder( "size=" + size + " aboffset=" + aboffset );
for ( int i = 0; i < size; i++ )
{
sb.append( "\nidx=" + i + " faid=" + faid[i] + " fapos=" + fapos[i] );
private String summary() {
StringBuilder sb = new StringBuilder("size=" + size + " aboffset=" + aboffset);
for (int i = 0; i < size; i++) {
sb.append("\nidx=" + i + " faid=" + faid[i] + " fapos=" + fapos[i]);
}
return sb.toString();
}
private String _compareWith( MicroCache mc )
{
if ( size != mc.size )
{
return "size missmatch: " + size + "->" + mc.size;
private String _compareWith(MicroCache mc) {
if (size != mc.size) {
return "size mismatch: " + size + "->" + mc.size;
}
for ( int i = 0; i < size; i++ )
{
if ( faid[i] != mc.faid[i] )
{
return "faid missmatch at index " + i + ":" + faid[i] + "->" + mc.faid[i];
for (int i = 0; i < size; i++) {
if (faid[i] != mc.faid[i]) {
return "faid mismatch at index " + i + ":" + faid[i] + "->" + mc.faid[i];
}
int start = i > 0 ? fapos[i - 1] : 0;
int end = fapos[i] < mc.fapos[i] ? fapos[i] : mc.fapos[i];
int len = end - start;
for ( int offset = 0; offset < len; offset++ )
{
if ( mc.ab.length <= start + offset )
{
for (int offset = 0; offset < len; offset++) {
if (mc.ab.length <= start + offset) {
return "data buffer too small";
}
if ( ab[start + offset] != mc.ab[start + offset] )
{
return "data missmatch at index " + i + " offset=" + offset;
if (ab[start + offset] != mc.ab[start + offset]) {
return "data mismatch at index " + i + " offset=" + offset;
}
}
if ( fapos[i] != mc.fapos[i] )
{
return "fapos missmatch at index " + i + ":" + fapos[i] + "->" + mc.fapos[i];
if (fapos[i] != mc.fapos[i]) {
return "fapos mismatch at index " + i + ":" + fapos[i] + "->" + mc.fapos[i];
}
}
if ( aboffset != mc.aboffset )
{
return "datasize missmatch: " + aboffset + "->" + mc.aboffset;
if (aboffset != mc.aboffset) {
return "datasize mismatch: " + aboffset + "->" + mc.aboffset;
}
return null;
}
public void calcDelta( MicroCache mc1, MicroCache mc2 )
{
int idx1 = 0;
int idx2 = 0;
public void calcDelta(MicroCache mc1, MicroCache mc2) {
int idx1 = 0;
int idx2 = 0;
while( idx1 < mc1.size || idx2 < mc2.size )
{
int id1 = idx1 < mc1.size ? mc1.faid[idx1] : Integer.MAX_VALUE;
int id2 = idx2 < mc2.size ? mc2.faid[idx2] : Integer.MAX_VALUE;
int id;
if ( id1 >= id2 )
{
id = id2;
int start2 = idx2 > 0 ? mc2.fapos[idx2 - 1] : 0;
int len2 = mc2.fapos[idx2++] - start2;
while (idx1 < mc1.size || idx2 < mc2.size) {
int id1 = idx1 < mc1.size ? mc1.faid[idx1] : Integer.MAX_VALUE;
int id2 = idx2 < mc2.size ? mc2.faid[idx2] : Integer.MAX_VALUE;
int id;
if (id1 >= id2) {
id = id2;
int start2 = idx2 > 0 ? mc2.fapos[idx2 - 1] : 0;
int len2 = mc2.fapos[idx2++] - start2;
if ( id1 == id2 )
{
// id exists in both caches, compare data
int start1 = idx1 > 0 ? mc1.fapos[idx1 - 1] : 0;
int len1 = mc1.fapos[idx1++] - start1;
if ( len1 == len2 )
{
int i = 0;
while( i<len1 )
{
if ( mc1.ab[start1+i] != mc2.ab[start2+i] )
{
break;
}
i++;
}
if ( i == len1 )
{
continue; // same data -> do nothing
}
}
}
write( mc2.ab, start2, len2 );
}
else
{
idx1++;
id = id1; // deleted node
}
fapos[size] = aboffset;
faid[size] = id;
size++;
}
if (id1 == id2) {
// id exists in both caches, compare data
int start1 = idx1 > 0 ? mc1.fapos[idx1 - 1] : 0;
int len1 = mc1.fapos[idx1++] - start1;
if (len1 == len2) {
int i = 0;
while (i < len1) {
if (mc1.ab[start1 + i] != mc2.ab[start2 + i]) {
break;
}
i++;
}
if (i == len1) {
continue; // same data -> do nothing
}
}
}
write(mc2.ab, start2, len2);
} else {
idx1++;
id = id1; // deleted node
}
fapos[size] = aboffset;
faid[size] = id;
size++;
}
}
public void addDelta( MicroCache mc1, MicroCache mc2, boolean keepEmptyNodes )
{
int idx1 = 0;
int idx2 = 0;
public void addDelta(MicroCache mc1, MicroCache mc2, boolean keepEmptyNodes) {
int idx1 = 0;
int idx2 = 0;
while( idx1 < mc1.size || idx2 < mc2.size )
{
int id1 = idx1 < mc1.size ? mc1.faid[idx1] : Integer.MAX_VALUE;
int id2 = idx2 < mc2.size ? mc2.faid[idx2] : Integer.MAX_VALUE;
if ( id1 >= id2 ) // data from diff file wins
{
int start2 = idx2 > 0 ? mc2.fapos[idx2 - 1] : 0;
int len2 = mc2.fapos[idx2++] - start2;
if ( keepEmptyNodes || len2 > 0 )
{
write( mc2.ab, start2, len2 );
fapos[size] = aboffset;
faid[size++] = id2;
}
if ( id1 == id2 ) // // id exists in both caches
{
idx1++;
}
}
else // use data from base file
{
int start1 = idx1 > 0 ? mc1.fapos[idx1 - 1] : 0;
int len1 = mc1.fapos[idx1++] - start1;
write( mc1.ab, start1, len1 );
fapos[size] = aboffset;
faid[size++] = id1;
}
}
while (idx1 < mc1.size || idx2 < mc2.size) {
int id1 = idx1 < mc1.size ? mc1.faid[idx1] : Integer.MAX_VALUE;
int id2 = idx2 < mc2.size ? mc2.faid[idx2] : Integer.MAX_VALUE;
if (id1 >= id2) { // data from diff file wins
int start2 = idx2 > 0 ? mc2.fapos[idx2 - 1] : 0;
int len2 = mc2.fapos[idx2++] - start2;
if (keepEmptyNodes || len2 > 0) {
write(mc2.ab, start2, len2);
fapos[size] = aboffset;
faid[size++] = id2;
}
if (id1 == id2) { // id exists in both caches
idx1++;
}
} else // use data from base file
{
int start1 = idx1 > 0 ? mc1.fapos[idx1 - 1] : 0;
int len1 = mc1.fapos[idx1++] - start1;
write(mc1.ab, start1, len1);
fapos[size] = aboffset;
faid[size++] = id1;
}
}
}
}
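Both `calcDelta` and `addDelta` above walk two id arrays sorted in ascending order in parallel, letting the entry from the diff cache win whenever an id occurs in both. The core merge can be sketched standalone like this (class name illustrative; only the id merge is shown, not the payload copying):

```java
import java.util.Arrays;

// The parallel walk used by calcDelta/addDelta above: two ascending id arrays
// are merged; on equal ids the second (diff) array wins and the base entry is skipped.
public final class SortedMergeDemo {
  public static int[] merge(int[] base, int[] diff) {
    int[] out = new int[base.length + diff.length];
    int size = 0, idx1 = 0, idx2 = 0;
    while (idx1 < base.length || idx2 < diff.length) {
      int id1 = idx1 < base.length ? base[idx1] : Integer.MAX_VALUE;
      int id2 = idx2 < diff.length ? diff[idx2] : Integer.MAX_VALUE;
      if (id1 >= id2) {             // diff entry wins
        out[size++] = diff[idx2++];
        if (id1 == id2) idx1++;     // id exists in both: skip the base entry
      } else {                      // base entry with no diff counterpart
        out[size++] = base[idx1++];
      }
    }
    return Arrays.copyOf(out, size);
  }
}
```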


@ -1,6 +1,7 @@
package btools.codec;
import java.util.HashMap;
import java.util.Map;
import btools.util.ByteDataReader;
import btools.util.IByteArrayUnifier;
@ -9,222 +10,198 @@ import btools.util.IByteArrayUnifier;
* MicroCache2 is the new format that uses statistical encoding and
* is able to do access filtering and waypoint matching during encoding
*/
public final class MicroCache2 extends MicroCache
{
public final class MicroCache2 extends MicroCache {
private int lonBase;
private int latBase;
private int cellsize;
public MicroCache2( int size, byte[] databuffer, int lonIdx, int latIdx, int divisor ) throws Exception
{
super( databuffer ); // sets ab=databuffer, aboffset=0
public MicroCache2(int size, byte[] databuffer, int lonIdx, int latIdx, int divisor) {
super(databuffer); // sets ab=databuffer, aboffset=0
faid = new int[size];
fapos = new int[size];
this.size = 0;
cellsize = 1000000 / divisor;
lonBase = lonIdx*cellsize;
latBase = latIdx*cellsize;
}
public byte[] readUnified( int len, IByteArrayUnifier u )
{
byte[] b = u.unify( ab, aboffset, len );
aboffset += len;
return b;
lonBase = lonIdx * cellsize;
latBase = latIdx * cellsize;
}
public MicroCache2( StatCoderContext bc, DataBuffers dataBuffers, int lonIdx, int latIdx, int divisor, TagValueValidator wayValidator, WaypointMatcher waypointMatcher ) throws Exception
{
super( null );
public byte[] readUnified(int len, IByteArrayUnifier u) {
byte[] b = u.unify(ab, aboffset, len);
aboffset += len;
return b;
}
public MicroCache2(StatCoderContext bc, DataBuffers dataBuffers, int lonIdx, int latIdx, int divisor, TagValueValidator wayValidator, WaypointMatcher waypointMatcher) {
super(null);
cellsize = 1000000 / divisor;
lonBase = lonIdx*cellsize;
latBase = latIdx*cellsize;
lonBase = lonIdx * cellsize;
latBase = latIdx * cellsize;
TagValueCoder wayTagCoder = new TagValueCoder( bc, dataBuffers, wayValidator );
TagValueCoder nodeTagCoder = new TagValueCoder( bc, dataBuffers, null );
NoisyDiffCoder nodeIdxDiff = new NoisyDiffCoder( bc );
NoisyDiffCoder nodeEleDiff = new NoisyDiffCoder( bc );
TagValueCoder wayTagCoder = new TagValueCoder(bc, dataBuffers, wayValidator);
TagValueCoder nodeTagCoder = new TagValueCoder(bc, dataBuffers, null);
NoisyDiffCoder nodeIdxDiff = new NoisyDiffCoder(bc);
NoisyDiffCoder nodeEleDiff = new NoisyDiffCoder(bc);
NoisyDiffCoder extLonDiff = new NoisyDiffCoder(bc);
NoisyDiffCoder extLatDiff = new NoisyDiffCoder(bc);
NoisyDiffCoder transEleDiff = new NoisyDiffCoder( bc );
NoisyDiffCoder transEleDiff = new NoisyDiffCoder(bc);
size = bc.decodeNoisyNumber( 5 );
size = bc.decodeNoisyNumber(5);
faid = size > dataBuffers.ibuf2.length ? new int[size] : dataBuffers.ibuf2;
fapos = size > dataBuffers.ibuf3.length ? new int[size] : dataBuffers.ibuf3;
int[] alon = size > dataBuffers.alon.length ? new int[size] : dataBuffers.alon;
int[] alat = size > dataBuffers.alat.length ? new int[size] : dataBuffers.alat;
if ( debug ) System.out.println( "*** decoding cache of size=" + size + " for lonIdx=" + lonIdx + " latIdx=" + latIdx );
bc.decodeSortedArray( faid, 0, size, 29, 0 );
for( int n = 0; n<size; n++ )
{
long id64 = expandId( faid[n] );
alon[n] = (int)(id64 >> 32);
alat[n] = (int)(id64 & 0xffffffff);
int[] alon = size > dataBuffers.alon.length ? new int[size] : dataBuffers.alon;
int[] alat = size > dataBuffers.alat.length ? new int[size] : dataBuffers.alat;
if (debug)
System.out.println("*** decoding cache of size=" + size + " for lonIdx=" + lonIdx + " latIdx=" + latIdx);
bc.decodeSortedArray(faid, 0, size, 29, 0);
for (int n = 0; n < size; n++) {
long id64 = expandId(faid[n]);
alon[n] = (int) (id64 >> 32);
alat[n] = (int) (id64 & 0xffffffff);
}
int netdatasize = bc.decodeNoisyNumber( 10 );
int netdatasize = bc.decodeNoisyNumber(10);
ab = netdatasize > dataBuffers.bbuf1.length ? new byte[netdatasize] : dataBuffers.bbuf1;
aboffset = 0;
int[] validBits = new int[(size+31)>>5];
int[] validBits = new int[(size + 31) >> 5];
int finaldatasize = 0;
LinkedListContainer reverseLinks = new LinkedListContainer( size, dataBuffers.ibuf1 );
LinkedListContainer reverseLinks = new LinkedListContainer(size, dataBuffers.ibuf1);
int selev = 0;
for( int n=0; n<size; n++ ) // loop over nodes
{
for (int n = 0; n < size; n++) { // loop over nodes
int ilon = alon[n];
int ilat = alat[n];
// future escapes (turn restrictions?)
short trExceptions = 0;
int featureId = bc.decodeVarBits();
if ( featureId == 13 )
{
if (featureId == 13) {
fapos[n] = aboffset;
validBits[ n >> 5 ] |= 1 << n; // mark dummy-node valid
validBits[n >> 5] |= 1 << n; // mark dummy-node valid
continue; // empty node escape (delta files only)
}
while( featureId != 0 )
{
int bitsize = bc.decodeNoisyNumber( 5 );
while (featureId != 0) {
int bitsize = bc.decodeNoisyNumber(5);
if ( featureId == 2 ) // exceptions to turn-restriction
{
trExceptions = (short)bc.decodeBounded( 1023 );
}
else if ( featureId == 1 ) // turn-restriction
{
writeBoolean( true );
writeShort( trExceptions ); // exceptions from previous feature
if (featureId == 2) { // exceptions to turn-restriction
trExceptions = (short) bc.decodeBounded(1023);
} else if (featureId == 1) { // turn-restriction
writeBoolean(true);
writeShort(trExceptions); // exceptions from previous feature
trExceptions = 0;
writeBoolean( bc.decodeBit() ); // isPositive
writeInt( ilon + bc.decodeNoisyDiff( 10 ) ); // fromLon
writeInt( ilat + bc.decodeNoisyDiff( 10 ) ); // fromLat
writeInt( ilon + bc.decodeNoisyDiff( 10 ) ); // toLon
writeInt( ilat + bc.decodeNoisyDiff( 10 ) ); // toLat
}
else
{
for( int i=0; i< bitsize; i++ ) bc.decodeBit(); // unknown feature, just skip
writeBoolean(bc.decodeBit()); // isPositive
writeInt(ilon + bc.decodeNoisyDiff(10)); // fromLon
writeInt(ilat + bc.decodeNoisyDiff(10)); // fromLat
writeInt(ilon + bc.decodeNoisyDiff(10)); // toLon
writeInt(ilat + bc.decodeNoisyDiff(10)); // toLat
} else {
for (int i = 0; i < bitsize; i++) bc.decodeBit(); // unknown feature, just skip
}
featureId = bc.decodeVarBits();
}
writeBoolean( false );
writeBoolean(false);
selev += nodeEleDiff.decodeSignedValue();
writeShort( (short) selev );
writeShort((short) selev);
TagValueWrapper nodeTags = nodeTagCoder.decodeTagValueSet();
writeVarBytes( nodeTags == null ? null : nodeTags.data );
writeVarBytes(nodeTags == null ? null : nodeTags.data);
int links = bc.decodeNoisyNumber( 1 );
if ( debug ) System.out.println( "*** decoding node " + ilon + "/" + ilat + " with links=" + links );
for( int li=0; li<links; li++ )
{
int links = bc.decodeNoisyNumber(1);
if (debug)
System.out.println("*** decoding node " + ilon + "/" + ilat + " with links=" + links);
for (int li = 0; li < links; li++) {
int sizeoffset = 0;
int nodeIdx = n + nodeIdxDiff.decodeSignedValue();
int dlon_remaining;
int dlat_remaining;
boolean isReverse = false;
if ( nodeIdx != n ) // internal (forward-) link
{
if (nodeIdx != n) { // internal (forward-) link
dlon_remaining = alon[nodeIdx] - ilon;
dlat_remaining = alat[nodeIdx] - ilat;
}
else
{
} else {
isReverse = bc.decodeBit();
dlon_remaining = extLonDiff.decodeSignedValue();
dlat_remaining = extLatDiff.decodeSignedValue();
}
if ( debug ) System.out.println( "*** decoding link to " + (ilon+dlon_remaining) + "/" + (ilat+dlat_remaining) + " extern=" + (nodeIdx == n) );
if (debug)
System.out.println("*** decoding link to " + (ilon + dlon_remaining) + "/" + (ilat + dlat_remaining) + " extern=" + (nodeIdx == n));
TagValueWrapper wayTags = wayTagCoder.decodeTagValueSet();
boolean linkValid = wayTags != null || wayValidator == null;
if ( linkValid )
{
if (linkValid) {
int startPointer = aboffset;
sizeoffset = writeSizePlaceHolder();
writeVarLengthSigned( dlon_remaining );
writeVarLengthSigned( dlat_remaining );
writeVarLengthSigned(dlon_remaining);
writeVarLengthSigned(dlat_remaining);
validBits[ n >> 5 ] |= 1 << n; // mark source-node valid
if ( nodeIdx != n ) // valid internal (forward-) link
{
reverseLinks.addDataElement( nodeIdx, n ); // register reverse link
finaldatasize += 1 + aboffset-startPointer; // reserve place for reverse
validBits[ nodeIdx >> 5 ] |= 1 << nodeIdx; // mark target-node valid
validBits[n >> 5] |= 1 << n; // mark source-node valid
if (nodeIdx != n) { // valid internal (forward-) link
reverseLinks.addDataElement(nodeIdx, n); // register reverse link
finaldatasize += 1 + aboffset - startPointer; // reserve place for reverse
validBits[nodeIdx >> 5] |= 1 << nodeIdx; // mark target-node valid
}
writeModeAndDesc( isReverse, wayTags == null ? null : wayTags.data );
writeModeAndDesc(isReverse, wayTags == null ? null : wayTags.data);
}
if ( !isReverse ) // write geometry for forward links only
{
if (!isReverse) { // write geometry for forward links only
WaypointMatcher matcher = wayTags == null || wayTags.accessType < 2 ? null : waypointMatcher;
int ilontarget = ilon + dlon_remaining;
int ilattarget = ilat + dlat_remaining;
if ( matcher != null )
{
if ( !matcher.start( ilon, ilat, ilontarget, ilattarget ) )
{
if (matcher != null) {
if (!matcher.start(ilon, ilat, ilontarget, ilattarget)) {
matcher = null;
}
}
int transcount = bc.decodeVarBits();
if ( debug ) System.out.println( "*** decoding geometry with count=" + transcount );
int count = transcount+1;
for( int i=0; i<transcount; i++ )
{
int dlon = bc.decodePredictedValue( dlon_remaining/count );
int dlat = bc.decodePredictedValue( dlat_remaining/count );
if (debug) System.out.println("*** decoding geometry with count=" + transcount);
int count = transcount + 1;
for (int i = 0; i < transcount; i++) {
int dlon = bc.decodePredictedValue(dlon_remaining / count);
int dlat = bc.decodePredictedValue(dlat_remaining / count);
dlon_remaining -= dlon;
dlat_remaining -= dlat;
count--;
int elediff = transEleDiff.decodeSignedValue();
if ( wayTags != null )
{
writeVarLengthSigned( dlon );
writeVarLengthSigned( dlat );
writeVarLengthSigned( elediff );
if (wayTags != null) {
writeVarLengthSigned(dlon);
writeVarLengthSigned(dlat);
writeVarLengthSigned(elediff);
}
if ( matcher != null ) matcher.transferNode( ilontarget - dlon_remaining, ilattarget - dlat_remaining );
if (matcher != null)
matcher.transferNode(ilontarget - dlon_remaining, ilattarget - dlat_remaining);
}
if ( matcher != null ) matcher.end();
if (matcher != null) matcher.end();
}
if ( linkValid )
{
injectSize( sizeoffset );
if (linkValid) {
injectSize(sizeoffset);
}
}
fapos[n] = aboffset;
}
// calculate final data size
int finalsize = 0;
int startpos = 0;
for( int i=0; i<size; i++ )
{
for (int i = 0; i < size; i++) {
int endpos = fapos[i];
if ( ( validBits[ i >> 5 ] & (1 << i ) ) != 0 )
{
finaldatasize += endpos-startpos;
finalsize++;
if ((validBits[i >> 5] & (1 << i)) != 0) {
finaldatasize += endpos - startpos;
finalsize++;
}
startpos = endpos;
}
@ -240,29 +217,26 @@ public final class MicroCache2 extends MicroCache
size = 0;
startpos = 0;
for ( int n = 0; n < sizeOld; n++ )
{
for (int n = 0; n < sizeOld; n++) {
int endpos = faposOld[n];
if ( ( validBits[ n >> 5 ] & (1 << n ) ) != 0 )
{
if ((validBits[n >> 5] & (1 << n)) != 0) {
int len = endpos - startpos;
System.arraycopy( abOld, startpos, ab, aboffset, len );
if ( debug )
System.out.println( "*** copied " + len + " bytes from " + aboffset + " for node " + n );
System.arraycopy(abOld, startpos, ab, aboffset, len);
if (debug)
System.out.println("*** copied " + len + " bytes from " + aboffset + " for node " + n);
aboffset += len;
int cnt = reverseLinks.initList( n );
if ( debug )
System.out.println( "*** appending " + cnt + " reverse links for node " + n );
int cnt = reverseLinks.initList(n);
if (debug)
System.out.println("*** appending " + cnt + " reverse links for node " + n);
for ( int ri = 0; ri < cnt; ri++ )
{
for (int ri = 0; ri < cnt; ri++) {
int nodeIdx = reverseLinks.getDataElement();
int sizeoffset = writeSizePlaceHolder();
writeVarLengthSigned( alon[nodeIdx] - alon[n] );
writeVarLengthSigned( alat[nodeIdx] - alat[n] );
writeModeAndDesc( true, null );
injectSize( sizeoffset );
writeVarLengthSigned(alon[nodeIdx] - alon[n]);
writeVarLengthSigned(alat[nodeIdx] - alat[n]);
writeModeAndDesc(true, null);
injectSize(sizeoffset);
}
faid[size] = faidOld[n];
fapos[size] = aboffset;
@ -270,65 +244,58 @@ public final class MicroCache2 extends MicroCache
}
startpos = endpos;
}
init( size );
init(size);
}
@Override
public long expandId( int id32 )
{
public long expandId(int id32) {
int dlon = 0;
int dlat = 0;
for( int bm = 1; bm < 0x8000; bm <<= 1 )
{
if ( (id32 & 1) != 0 ) dlon |= bm;
if ( (id32 & 2) != 0 ) dlat |= bm;
for (int bm = 1; bm < 0x8000; bm <<= 1) {
if ((id32 & 1) != 0) dlon |= bm;
if ((id32 & 2) != 0) dlat |= bm;
id32 >>= 2;
}
int lon32 = lonBase + dlon;
int lat32 = latBase + dlat;
return ((long)lon32)<<32 | lat32;
return ((long) lon32) << 32 | lat32;
}
@Override
public int shrinkId( long id64 )
{
int lon32 = (int)(id64 >> 32);
int lat32 = (int)(id64 & 0xffffffff);
public int shrinkId(long id64) {
int lon32 = (int) (id64 >> 32);
int lat32 = (int) (id64 & 0xffffffff);
int dlon = lon32 - lonBase;
int dlat = lat32 - latBase;
int id32 = 0;
for( int bm = 0x4000; bm > 0; bm >>= 1 )
{
for (int bm = 0x4000; bm > 0; bm >>= 1) {
id32 <<= 2;
if ( ( dlon & bm ) != 0 ) id32 |= 1;
if ( ( dlat & bm ) != 0 ) id32 |= 2;
if ((dlon & bm) != 0) id32 |= 1;
if ((dlat & bm) != 0) id32 |= 2;
}
return id32;
}
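`shrinkId` and `expandId` above interleave the 15 low bits of the lon/lat offsets into one 30-bit Z-order (Morton) index, so spatially close nodes get numerically close ids. A standalone roundtrip sketch of the same bit weaving (class name illustrative):

```java
// Bit interleaving as in shrinkId/expandId above: 15 bits each of dlon/dlat
// are woven into one 30-bit Z-order id, and unwoven again.
public final class MortonDemo {
  public static int shrink(int dlon, int dlat) {
    int id32 = 0;
    for (int bm = 0x4000; bm > 0; bm >>= 1) {  // highest bit first
      id32 <<= 2;
      if ((dlon & bm) != 0) id32 |= 1;
      if ((dlat & bm) != 0) id32 |= 2;
    }
    return id32;
  }

  public static long expand(int id32) {        // returns (dlon << 32) | dlat
    int dlon = 0, dlat = 0;
    for (int bm = 1; bm < 0x8000; bm <<= 1) {  // lowest bit first
      if ((id32 & 1) != 0) dlon |= bm;
      if ((id32 & 2) != 0) dlat |= bm;
      id32 >>= 2;
    }
    return ((long) dlon) << 32 | dlat;
  }
}
```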
@Override
public boolean isInternal( int ilon, int ilat )
{
public boolean isInternal(int ilon, int ilat) {
return ilon >= lonBase && ilon < lonBase + cellsize
&& ilat >= latBase && ilat < latBase + cellsize;
&& ilat >= latBase && ilat < latBase + cellsize;
}
@Override
public int encodeMicroCache( byte[] buffer )
{
HashMap<Long,Integer> idMap = new HashMap<Long,Integer>();
for( int n=0; n<size; n++ ) // loop over nodes
{
idMap.put( Long.valueOf( expandId( faid[n] ) ), Integer.valueOf( n ) );
public int encodeMicroCache(byte[] buffer) {
Map<Long, Integer> idMap = new HashMap<>();
for (int n = 0; n < size; n++) { // loop over nodes
idMap.put(expandId(faid[n]), n);
}
IntegerFifo3Pass linkCounts = new IntegerFifo3Pass( 256 );
IntegerFifo3Pass transCounts = new IntegerFifo3Pass( 256 );
IntegerFifo3Pass restrictionBits = new IntegerFifo3Pass( 16 );
IntegerFifo3Pass linkCounts = new IntegerFifo3Pass(256);
IntegerFifo3Pass transCounts = new IntegerFifo3Pass(256);
IntegerFifo3Pass restrictionBits = new IntegerFifo3Pass(16);
TagValueCoder wayTagCoder = new TagValueCoder();
TagValueCoder nodeTagCoder = new TagValueCoder();
@ -337,182 +304,170 @@ public final class MicroCache2 extends MicroCache
NoisyDiffCoder extLonDiff = new NoisyDiffCoder();
NoisyDiffCoder extLatDiff = new NoisyDiffCoder();
NoisyDiffCoder transEleDiff = new NoisyDiffCoder();
int netdatasize = 0;
for(int pass=1;; pass++) // 3 passes: counters, stat-collection, encoding
{
for (int pass = 1; ; pass++) { // 3 passes: counters, stat-collection, encoding
boolean dostats = pass == 3;
boolean dodebug = debug && pass == 3;
if ( pass < 3 ) netdatasize = fapos[size-1];
StatCoderContext bc = new StatCoderContext( buffer );
if (pass < 3) netdatasize = fapos[size - 1];
StatCoderContext bc = new StatCoderContext(buffer);
linkCounts.init();
transCounts.init();
restrictionBits.init();
wayTagCoder.encodeDictionary( bc );
if ( dostats ) bc.assignBits( "wayTagDictionary" );
nodeTagCoder.encodeDictionary( bc );
if ( dostats ) bc.assignBits( "nodeTagDictionary" );
nodeIdxDiff.encodeDictionary( bc );
nodeEleDiff.encodeDictionary( bc );
extLonDiff.encodeDictionary( bc );
extLatDiff.encodeDictionary( bc );
transEleDiff.encodeDictionary( bc );
if ( dostats ) bc.assignBits( "noisebits" );
bc.encodeNoisyNumber( size, 5 );
if ( dostats ) bc.assignBits( "nodecount" );
bc.encodeSortedArray( faid, 0, size, 0x20000000, 0 );
if ( dostats ) bc.assignBits( "node-positions" );
bc.encodeNoisyNumber( netdatasize, 10 ); // net-size
if ( dostats ) bc.assignBits( "netdatasize" );
if ( dodebug ) System.out.println( "*** encoding cache of size=" + size );
wayTagCoder.encodeDictionary(bc);
if (dostats) bc.assignBits("wayTagDictionary");
nodeTagCoder.encodeDictionary(bc);
if (dostats) bc.assignBits("nodeTagDictionary");
nodeIdxDiff.encodeDictionary(bc);
nodeEleDiff.encodeDictionary(bc);
extLonDiff.encodeDictionary(bc);
extLatDiff.encodeDictionary(bc);
transEleDiff.encodeDictionary(bc);
if (dostats) bc.assignBits("noisebits");
bc.encodeNoisyNumber(size, 5);
if (dostats) bc.assignBits("nodecount");
bc.encodeSortedArray(faid, 0, size, 0x20000000, 0);
if (dostats) bc.assignBits("node-positions");
bc.encodeNoisyNumber(netdatasize, 10); // net-size
if (dostats) bc.assignBits("netdatasize");
if (dodebug) System.out.println("*** encoding cache of size=" + size);
int lastSelev = 0;
for( int n=0; n<size; n++ ) // loop over nodes
{
aboffset = startPos( n );
for (int n = 0; n < size; n++) { // loop over nodes
aboffset = startPos(n);
aboffsetEnd = fapos[n];
if ( dodebug ) System.out.println( "*** encoding node " + n + " from " + aboffset + " to " + aboffsetEnd );
if (dodebug)
System.out.println("*** encoding node " + n + " from " + aboffset + " to " + aboffsetEnd);
long id64 = expandId( faid[n] );
int ilon = (int)(id64 >> 32);
int ilat = (int)(id64 & 0xffffffff);
long id64 = expandId(faid[n]);
int ilon = (int) (id64 >> 32);
int ilat = (int) (id64 & 0xffffffff);
if ( aboffset == aboffsetEnd )
{
bc.encodeVarBits( 13 ); // empty node escape (delta files only)
if (aboffset == aboffsetEnd) {
bc.encodeVarBits(13); // empty node escape (delta files only)
continue;
}
// write turn restrictions
while( readBoolean() )
{
while (readBoolean()) {
short exceptions = readShort(); // except bikes, psv, ...
if ( exceptions != 0 )
{
bc.encodeVarBits( 2 ); // 2 = tr exceptions
bc.encodeNoisyNumber( 10 , 5 ); // bit-count
bc.encodeBounded( 1023 , exceptions & 1023 );
if (exceptions != 0) {
bc.encodeVarBits(2); // 2 = tr exceptions
bc.encodeNoisyNumber(10, 5); // bit-count
bc.encodeBounded(1023, exceptions & 1023);
}
bc.encodeVarBits( 1 ); // 1 = turn restriction
bc.encodeNoisyNumber( restrictionBits.getNext(), 5 ); // bit-count using look-ahead fifo
bc.encodeVarBits(1); // 1 = turn restriction
bc.encodeNoisyNumber(restrictionBits.getNext(), 5); // bit-count using look-ahead fifo
long b0 = bc.getWritingBitPosition();
bc.encodeBit( readBoolean() ); // isPositive
bc.encodeNoisyDiff( readInt() - ilon, 10 ); // fromLon
bc.encodeNoisyDiff( readInt() - ilat, 10 ); // fromLat
bc.encodeNoisyDiff( readInt() - ilon, 10 ); // toLon
bc.encodeNoisyDiff( readInt() - ilat, 10 ); // toLat
restrictionBits.add( (int)( bc.getWritingBitPosition() - b0 ) );
bc.encodeBit(readBoolean()); // isPositive
bc.encodeNoisyDiff(readInt() - ilon, 10); // fromLon
bc.encodeNoisyDiff(readInt() - ilat, 10); // fromLat
bc.encodeNoisyDiff(readInt() - ilon, 10); // toLon
bc.encodeNoisyDiff(readInt() - ilat, 10); // toLat
restrictionBits.add((int) (bc.getWritingBitPosition() - b0));
}
bc.encodeVarBits( 0 ); // end of extra data
bc.encodeVarBits(0); // end of extra data
if ( dostats ) bc.assignBits( "extradata" );
if (dostats) bc.assignBits("extradata");
int selev = readShort();
nodeEleDiff.encodeSignedValue( selev - lastSelev );
if ( dostats ) bc.assignBits( "nodeele" );
nodeEleDiff.encodeSignedValue(selev - lastSelev);
if (dostats) bc.assignBits("nodeele");
lastSelev = selev;
nodeTagCoder.encodeTagValueSet( readVarBytes() );
if ( dostats ) bc.assignBits( "nodeTagIdx" );
nodeTagCoder.encodeTagValueSet(readVarBytes());
if (dostats) bc.assignBits("nodeTagIdx");
int nlinks = linkCounts.getNext();
if ( dodebug ) System.out.println( "*** nlinks=" + nlinks );
bc.encodeNoisyNumber( nlinks, 1 );
if ( dostats ) bc.assignBits( "link-counts" );
if (dodebug) System.out.println("*** nlinks=" + nlinks);
bc.encodeNoisyNumber(nlinks, 1);
if (dostats) bc.assignBits("link-counts");
nlinks = 0;
while( hasMoreData() ) // loop over links
{
while (hasMoreData()) { // loop over links
// read link data
int startPointer = aboffset;
int endPointer = getEndPointer();
int ilonlink = ilon + readVarLengthSigned();
int ilatlink = ilat + readVarLengthSigned();
int sizecode = readVarLengthUnsigned();
boolean isReverse = ( sizecode & 1 ) != 0;
boolean isReverse = (sizecode & 1) != 0;
int descSize = sizecode >> 1;
byte[] description = null;
if ( descSize > 0 )
{
if (descSize > 0) {
description = new byte[descSize];
readFully( description );
readFully(description);
}
long link64 = ((long)ilonlink)<<32 | ilatlink;
Integer idx = idMap.get( Long.valueOf( link64 ) );
long link64 = ((long) ilonlink) << 32 | ilatlink;
Integer idx = idMap.get(link64);
boolean isInternal = idx != null;
if ( isReverse && isInternal )
{
if ( dodebug ) System.out.println( "*** NOT encoding link reverse=" + isReverse + " internal=" + isInternal );
netdatasize -= aboffset-startPointer;
if (isReverse && isInternal) {
if (dodebug)
System.out.println("*** NOT encoding link reverse=" + isReverse + " internal=" + isInternal);
netdatasize -= aboffset - startPointer;
continue; // do not encode internal reverse links
}
if ( dodebug ) System.out.println( "*** encoding link reverse=" + isReverse + " internal=" + isInternal );
if (dodebug)
System.out.println("*** encoding link reverse=" + isReverse + " internal=" + isInternal);
nlinks++;
if ( isInternal )
{
int nodeIdx = idx.intValue();
if ( dodebug ) System.out.println( "*** target nodeIdx=" + nodeIdx );
if ( nodeIdx == n ) throw new RuntimeException( "ups: self ref?" );
nodeIdxDiff.encodeSignedValue( nodeIdx - n );
if ( dostats ) bc.assignBits( "nodeIdx" );
if (isInternal) {
int nodeIdx = idx;
if (dodebug) System.out.println("*** target nodeIdx=" + nodeIdx);
if (nodeIdx == n) throw new RuntimeException("ups: self ref?");
nodeIdxDiff.encodeSignedValue(nodeIdx - n);
if (dostats) bc.assignBits("nodeIdx");
} else {
nodeIdxDiff.encodeSignedValue(0);
bc.encodeBit(isReverse);
extLonDiff.encodeSignedValue(ilonlink - ilon);
extLatDiff.encodeSignedValue(ilatlink - ilat);
if (dostats) bc.assignBits("externalNode");
}
else
{
nodeIdxDiff.encodeSignedValue( 0 );
bc.encodeBit( isReverse );
extLonDiff.encodeSignedValue( ilonlink - ilon );
extLatDiff.encodeSignedValue( ilatlink - ilat );
if ( dostats ) bc.assignBits( "externalNode" );
}
wayTagCoder.encodeTagValueSet( description );
if ( dostats ) bc.assignBits( "wayDescIdx" );
if ( !isReverse )
{
byte[] geometry = readDataUntil( endPointer );
wayTagCoder.encodeTagValueSet(description);
if (dostats) bc.assignBits("wayDescIdx");
if (!isReverse) {
byte[] geometry = readDataUntil(endPointer);
// write transition nodes
int count = transCounts.getNext();
if ( dodebug ) System.out.println( "*** encoding geometry with count=" + count );
bc.encodeVarBits( count++ );
if ( dostats ) bc.assignBits( "transcount" );
if (dodebug) System.out.println("*** encoding geometry with count=" + count);
bc.encodeVarBits(count++);
if (dostats) bc.assignBits("transcount");
int transcount = 0;
if ( geometry != null )
{
if (geometry != null) {
int dlon_remaining = ilonlink - ilon;
int dlat_remaining = ilatlink - ilat;
ByteDataReader r = new ByteDataReader( geometry );
while ( r.hasMoreData() )
{
ByteDataReader r = new ByteDataReader(geometry);
while (r.hasMoreData()) {
transcount++;
int dlon = r.readVarLengthSigned();
int dlat = r.readVarLengthSigned();
bc.encodePredictedValue( dlon, dlon_remaining/count );
bc.encodePredictedValue( dlat, dlat_remaining/count );
bc.encodePredictedValue(dlon, dlon_remaining / count);
bc.encodePredictedValue(dlat, dlat_remaining / count);
dlon_remaining -= dlon;
dlat_remaining -= dlat;
if ( count > 1 ) count--;
if ( dostats ) bc.assignBits( "transpos" );
transEleDiff.encodeSignedValue( r.readVarLengthSigned() );
if ( dostats ) bc.assignBits( "transele" );
if (count > 1) count--;
if (dostats) bc.assignBits("transpos");
transEleDiff.encodeSignedValue(r.readVarLengthSigned());
if (dostats) bc.assignBits("transele");
}
}
transCounts.add( transcount );
transCounts.add(transcount);
}
}
linkCounts.add( nlinks );
linkCounts.add(nlinks);
}
if ( pass == 3 )
{
if (pass == 3) {
return bc.closeAndGetEncodedLength();
}
}
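The node encoder above leans heavily on StatCoderContext's noisy-number scheme: the low `noisybits` bits of a value are stored verbatim and the remaining high part is var-bit encoded. A minimal standalone sketch of that split (hypothetical helper class, not the BRouter API; assumes `noisybits > 0` and a non-negative value, as the real code guards):

```java
// Sketch of the bit split behind encodeNoisyNumber / decodeNoisyNumber.
class NoisySplit {
    // Low `noisybits` bits are stored verbatim, the rest as var-bits.
    static int[] split(int value, int noisybits) {
        int mask = 0xffffffff >>> (32 - noisybits);
        return new int[] { value & mask, value >>> noisybits };
    }

    // Reassemble the value from its low and high parts.
    static int join(int low, int high, int noisybits) {
        return low | (high << noisybits);
    }
}
```

The split is lossless for any non-negative int, which is why the round trips in `noisyVarBitsEncodeDecodeTest` below succeed for every `noisybits` setting.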


@ -4,12 +4,11 @@ package btools.codec;
* Encoder/Decoder for signed integers that automatically detects the typical
* range of these numbers to determine a noisy-bit count as a very simple
* dictionary
*
* <p>
* Adapted for 3-pass encoding (counters -&gt; statistics -&gt; encoding)
* but doesn't do anything in pass 1
*/
public final class NoisyDiffCoder
{
public final class NoisyDiffCoder {
private int tot;
private int[] freqs;
private int noisybits;
@ -19,8 +18,7 @@ public final class NoisyDiffCoder
/**
* Create a decoder and read the noisy-bit count from the given context
*/
public NoisyDiffCoder( StatCoderContext bc )
{
public NoisyDiffCoder(StatCoderContext bc) {
noisybits = bc.decodeVarBits();
this.bc = bc;
}
@ -28,60 +26,49 @@ public final class NoisyDiffCoder
/**
* Create an encoder for 3-pass-encoding
*/
public NoisyDiffCoder()
{
public NoisyDiffCoder() {
}
/**
* encodes a signed int (pass3 only, stats collection in pass2)
*/
public void encodeSignedValue( int value )
{
if ( pass == 3 )
{
bc.encodeNoisyDiff( value, noisybits );
}
else if ( pass == 2 )
{
count( value < 0 ? -value : value );
public void encodeSignedValue(int value) {
if (pass == 3) {
bc.encodeNoisyDiff(value, noisybits);
} else if (pass == 2) {
count(value < 0 ? -value : value);
}
}
/**
* decodes a signed int
*/
public int decodeSignedValue()
{
return bc.decodeNoisyDiff( noisybits );
public int decodeSignedValue() {
return bc.decodeNoisyDiff(noisybits);
}
/**
* Starts a new encoding pass and (in pass3) calculates the noisy-bit count
* from the stats collected in pass2 and writes that to the given context
*/
public void encodeDictionary( StatCoderContext bc )
{
if ( ++pass == 3 )
{
public void encodeDictionary(StatCoderContext bc) {
if (++pass == 3) {
// how many noisy bits?
for ( noisybits = 0; noisybits < 14 && tot > 0; noisybits++ )
{
if ( freqs[noisybits] < ( tot >> 1 ) )
for (noisybits = 0; noisybits < 14 && tot > 0; noisybits++) {
if (freqs[noisybits] < (tot >> 1))
break;
}
bc.encodeVarBits( noisybits );
bc.encodeVarBits(noisybits);
}
this.bc = bc;
}
private void count( int value )
{
if ( freqs == null )
private void count(int value) {
if (freqs == null)
freqs = new int[14];
int bm = 1;
for ( int i = 0; i < 14; i++ )
{
if ( value < bm )
for (int i = 0; i < 14; i++) {
if (value < bm)
break;
else
freqs[i]++;


@ -1,26 +1,23 @@
package btools.codec;
import java.util.Map;
import java.util.TreeMap;
import btools.util.BitCoderContext;
public final class StatCoderContext extends BitCoderContext
{
private static TreeMap<String, long[]> statsPerName;
public final class StatCoderContext extends BitCoderContext {
private static Map<String, long[]> statsPerName;
private long lastbitpos = 0;
private static final int[] noisy_bits = new int[1024];
static
{
static {
// noisybits lookup
for( int i=0; i<1024; i++ )
{
for (int i = 0; i < 1024; i++) {
int p = i;
int noisybits = 0;
while (p > 2)
{
while (p > 2) {
noisybits++;
p >>= 1;
}
@ -29,29 +26,25 @@ public final class StatCoderContext extends BitCoderContext
}
public StatCoderContext( byte[] ab )
{
super( ab );
public StatCoderContext(byte[] ab) {
super(ab);
}
/**
* assign the de-/encoded bits since the last call to assignBits to the given
* name. Used for encoding statistics
*
*
* @see #getBitReport
*/
public void assignBits( String name )
{
public void assignBits(String name) {
long bitpos = getWritingBitPosition();
if ( statsPerName == null )
{
statsPerName = new TreeMap<String, long[]>();
if (statsPerName == null) {
statsPerName = new TreeMap<>();
}
long[] stats = statsPerName.get( name );
if ( stats == null )
{
long[] stats = statsPerName.get(name);
if (stats == null) {
stats = new long[2];
statsPerName.put( name, stats );
statsPerName.put(name, stats);
}
stats[0] += bitpos - lastbitpos;
stats[1] += 1;
@ -60,20 +53,17 @@ public final class StatCoderContext extends BitCoderContext
/**
* Get a textual report on the bit-statistics
*
*
* @see #assignBits
*/
public static String getBitReport()
{
if ( statsPerName == null )
{
public static String getBitReport() {
if (statsPerName == null) {
return "<empty bit report>";
}
StringBuilder sb = new StringBuilder();
for ( String name : statsPerName.keySet() )
{
long[] stats = statsPerName.get( name );
sb.append( name + " count=" + stats[1] + " bits=" + stats[0] + "\n" );
for (String name : statsPerName.keySet()) {
long[] stats = statsPerName.get(name);
sb.append(name + " count=" + stats[1] + " bits=" + stats[0] + "\n");
}
statsPerName = null;
return sb.toString();
@ -82,76 +72,65 @@ public final class StatCoderContext extends BitCoderContext
/**
* encode an unsigned integer with some of the least significant bits
* considered noisy
*
*
* @see #decodeNoisyNumber
*/
public void encodeNoisyNumber( int value, int noisybits )
{
if ( value < 0 )
{
throw new IllegalArgumentException( "encodeVarBits expects positive value" );
public void encodeNoisyNumber(int value, int noisybits) {
if (value < 0) {
throw new IllegalArgumentException("encodeVarBits expects positive value");
}
if ( noisybits > 0 )
{
int mask = 0xffffffff >>> ( 32 - noisybits );
encodeBounded( mask, value & mask );
if (noisybits > 0) {
int mask = 0xffffffff >>> (32 - noisybits);
encodeBounded(mask, value & mask);
value >>= noisybits;
}
encodeVarBits( value );
encodeVarBits(value);
}
/**
* decode an unsigned integer with some of the least significant bits
* considered noisy
*
*
* @see #encodeNoisyNumber
*/
public int decodeNoisyNumber( int noisybits )
{
int value = decodeBits( noisybits );
return value | ( decodeVarBits() << noisybits );
public int decodeNoisyNumber(int noisybits) {
int value = decodeBits(noisybits);
return value | (decodeVarBits() << noisybits);
}
/**
* encode a signed integer with some of the least significant bits considered
* noisy
*
*
* @see #decodeNoisyDiff
*/
public void encodeNoisyDiff( int value, int noisybits )
{
if ( noisybits > 0 )
{
value += 1 << ( noisybits - 1 );
int mask = 0xffffffff >>> ( 32 - noisybits );
encodeBounded( mask, value & mask );
public void encodeNoisyDiff(int value, int noisybits) {
if (noisybits > 0) {
value += 1 << (noisybits - 1);
int mask = 0xffffffff >>> (32 - noisybits);
encodeBounded(mask, value & mask);
value >>= noisybits;
}
encodeVarBits( value < 0 ? -value : value );
if ( value != 0 )
{
encodeBit( value < 0 );
encodeVarBits(value < 0 ? -value : value);
if (value != 0) {
encodeBit(value < 0);
}
}
/**
* decode a signed integer with some of the least significant bits considered
* noisy
*
*
* @see #encodeNoisyDiff
*/
public int decodeNoisyDiff( int noisybits )
{
public int decodeNoisyDiff(int noisybits) {
int value = 0;
if ( noisybits > 0 )
{
value = decodeBits( noisybits ) - ( 1 << ( noisybits - 1 ) );
if (noisybits > 0) {
value = decodeBits(noisybits) - (1 << (noisybits - 1));
}
int val2 = decodeVarBits() << noisybits;
if ( val2 != 0 )
{
if ( decodeBit() )
{
if (val2 != 0) {
if (decodeBit()) {
val2 = -val2;
}
}
@ -161,38 +140,34 @@ public final class StatCoderContext extends BitCoderContext
/**
* encode a signed integer with the typical range and median taken from the
* predicted value
*
*
* @see #decodePredictedValue
*/
public void encodePredictedValue( int value, int predictor )
{
public void encodePredictedValue(int value, int predictor) {
int p = predictor < 0 ? -predictor : predictor;
int noisybits = 0;
while (p > 2)
{
while (p > 2) {
noisybits++;
p >>= 1;
}
encodeNoisyDiff( value - predictor, noisybits );
encodeNoisyDiff(value - predictor, noisybits);
}
/**
* decode a signed integer with the typical range and median taken from the
* predicted value
*
*
* @see #encodePredictedValue
*/
public int decodePredictedValue( int predictor )
{
public int decodePredictedValue(int predictor) {
int p = predictor < 0 ? -predictor : predictor;
int noisybits = 0;
while (p > 1023)
{
while (p > 1023) {
noisybits++;
p >>= 1;
}
return predictor + decodeNoisyDiff( noisybits + noisy_bits[p] );
return predictor + decodeNoisyDiff(noisybits + noisy_bits[p]);
}
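encodePredictedValue derives its noisy-bit count from the predictor's magnitude alone, so encoder and decoder agree on it without any extra signaling. A small sketch of the encode-side derivation (standalone helper mirroring the loop above; the decoder reaches the same count via its precomputed `noisy_bits` lookup table):

```java
class PredictorBits {
    // The noisy-bit count grows roughly with log2(|predictor|):
    // halve the magnitude until it drops to 2 or below.
    static int noisyBits(int predictor) {
        int p = predictor < 0 ? -predictor : predictor;
        int n = 0;
        while (p > 2) {
            n++;
            p >>= 1;
        }
        return n;
    }
}
```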
/**
@ -201,30 +176,21 @@ public final class StatCoderContext extends BitCoderContext
* number of values with the current bit being 0. This yields a number of
* bits per value that only depends on the typical distance between subsequent
* values and also benefits
*
* @param values
* the array to encode
* @param offset
* position in this array where to start
* @param subsize
* number of values to encode
* @param nextbit
* bitmask with the most significant bit set to 1
* @param mask
* should be 0
*
* @param values the array to encode
* @param offset position in this array where to start
* @param subsize number of values to encode
* @param nextbit bitmask with the most significant bit set to 1
* @param mask should be 0
*/
public void encodeSortedArray( int[] values, int offset, int subsize, int nextbit, int mask )
{
if ( subsize == 1 ) // last-choice shortcut
{
while (nextbit != 0)
{
encodeBit( ( values[offset] & nextbit ) != 0 );
public void encodeSortedArray(int[] values, int offset, int subsize, int nextbit, int mask) {
if (subsize == 1) { // last-choice shortcut
while (nextbit != 0) {
encodeBit((values[offset] & nextbit) != 0);
nextbit >>= 1;
}
}
if ( nextbit == 0 )
{
if (nextbit == 0) {
return;
}
@ -234,71 +200,54 @@ public final class StatCoderContext extends BitCoderContext
// count 0-bit-fraction
int i = offset;
int end = subsize + offset;
for ( ; i < end; i++ )
{
if ( ( values[i] & mask ) != data )
{
for (; i < end; i++) {
if ((values[i] & mask) != data) {
break;
}
}
int size1 = i - offset;
int size2 = subsize - size1;
encodeBounded( subsize, size1 );
if ( size1 > 0 )
{
encodeSortedArray( values, offset, size1, nextbit >> 1, mask );
encodeBounded(subsize, size1);
if (size1 > 0) {
encodeSortedArray(values, offset, size1, nextbit >> 1, mask);
}
if ( size2 > 0 )
{
encodeSortedArray( values, i, size2, nextbit >> 1, mask );
if (size2 > 0) {
encodeSortedArray(values, i, size2, nextbit >> 1, mask);
}
}
/**
* @param values the array to encode
* @param offset position in this array where to start
* @param subsize number of values to encode
* @param nextbit bitmask with the most significant bit set to 1
* @param value should be 0
* @see #encodeSortedArray
*
* @param values
* the array to encode
* @param offset
* position in this array where to start
* @param subsize
* number of values to encode
* @param nextbit
* bitmask with the most significant bit set to 1
* @param value
* should be 0
*/
public void decodeSortedArray( int[] values, int offset, int subsize, int nextbitpos, int value )
{
if ( subsize == 1 ) // last-choice shortcut
{
if ( nextbitpos >= 0 )
{
value |= decodeBitsReverse( nextbitpos+1 );
public void decodeSortedArray(int[] values, int offset, int subsize, int nextbitpos, int value) {
if (subsize == 1) { // last-choice shortcut
if (nextbitpos >= 0) {
value |= decodeBitsReverse(nextbitpos + 1);
}
values[offset] = value;
return;
}
if ( nextbitpos < 0 )
{
while (subsize-- > 0)
{
if (nextbitpos < 0) {
while (subsize-- > 0) {
values[offset++] = value;
}
return;
}
int size1 = decodeBounded( subsize );
int size1 = decodeBounded(subsize);
int size2 = subsize - size1;
if ( size1 > 0 )
{
decodeSortedArray( values, offset, size1, nextbitpos-1, value );
if (size1 > 0) {
decodeSortedArray(values, offset, size1, nextbitpos - 1, value);
}
if ( size2 > 0 )
{
decodeSortedArray( values, offset + size1, size2, nextbitpos-1, value | (1 << nextbitpos) );
if (size2 > 0) {
decodeSortedArray(values, offset + size1, size2, nextbitpos - 1, value | (1 << nextbitpos));
}
}
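encodeSortedArray/decodeSortedArray above transmit a sorted array as a sequence of split sizes, one per bit level: at each level the values are partitioned by the current bit, and only the size of the zero-bit partition is coded. A simplified sketch that records just those sizes (no entropy coding, and without the single-element shortcut of the real code):

```java
import java.util.ArrayList;
import java.util.List;

class SortedArraySketch {
    // For each bit level, record how many values in the current range have
    // that bit 0, then recurse into both partitions (as encodeSortedArray does).
    static void encode(int[] v, int off, int n, int bit, int prefix, List<Integer> out) {
        if (bit == 0) {
            return; // all bits decided
        }
        int mask = prefix | bit; // decided high bits plus the current bit
        int i = off;
        while (i < off + n && (v[i] & mask) == prefix) {
            i++; // values whose current bit is 0
        }
        int size1 = i - off;
        out.add(size1);
        if (size1 > 0) {
            encode(v, off, size1, bit >> 1, prefix, out);
        }
        if (n - size1 > 0) {
            encode(v, i, n - size1, bit >> 1, prefix | bit, out);
        }
    }
}
```

Because the input is sorted, each partition is contiguous, so the recorded sizes fully determine the array once the bit width is known.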


@ -2,57 +2,50 @@ package btools.codec;
import java.util.Comparator;
import java.util.HashMap;
import java.util.Map;
import java.util.PriorityQueue;
import java.util.Queue;
import btools.util.BitCoderContext;
/**
* Encoder/Decoder for way-/node-descriptions
*
* <p>
* It detects identical descriptions and sorts them
* into a huffman-tree according to their frequencies
*
* <p>
* Adapted for 3-pass encoding (counters -&gt; statistics -&gt; encoding)
* but doesn't do anything in pass 1
*/
public final class TagValueCoder
{
private HashMap<TagValueSet, TagValueSet> identityMap;
public final class TagValueCoder {
private Map<TagValueSet, TagValueSet> identityMap;
private Object tree;
private BitCoderContext bc;
private int pass;
private int nextTagValueSetId;
public void encodeTagValueSet( byte[] data )
{
if ( pass == 1 )
{
public void encodeTagValueSet(byte[] data) {
if (pass == 1) {
return;
}
TagValueSet tvsProbe = new TagValueSet(nextTagValueSetId);
tvsProbe.data = data;
TagValueSet tvs = identityMap.get( tvsProbe );
if ( pass == 3 )
{
bc.encodeBounded( tvs.range - 1, tvs.code );
}
else if ( pass == 2 )
{
if ( tvs == null )
{
TagValueSet tvs = identityMap.get(tvsProbe);
if (pass == 3) {
bc.encodeBounded(tvs.range - 1, tvs.code);
} else if (pass == 2) {
if (tvs == null) {
tvs = tvsProbe;
nextTagValueSetId++;
identityMap.put( tvs, tvs );
identityMap.put(tvs, tvs);
}
tvs.frequency++;
}
}
public TagValueWrapper decodeTagValueSet()
{
public TagValueWrapper decodeTagValueSet() {
Object node = tree;
while (node instanceof TreeNode)
{
while (node instanceof TreeNode) {
TreeNode tn = (TreeNode) node;
boolean nextBit = bc.decodeBit();
node = nextBit ? tn.child2 : tn.child1;
@ -60,104 +53,87 @@ public final class TagValueCoder
return (TagValueWrapper) node;
}
public void encodeDictionary( BitCoderContext bc )
{
if ( ++pass == 3 )
{
if ( identityMap.size() == 0 )
{
public void encodeDictionary(BitCoderContext bc) {
if (++pass == 3) {
if (identityMap.size() == 0) {
TagValueSet dummy = new TagValueSet(nextTagValueSetId++);
identityMap.put( dummy, dummy );
identityMap.put(dummy, dummy);
}
PriorityQueue<TagValueSet> queue = new PriorityQueue<TagValueSet>(2*identityMap.size(), new TagValueSet.FrequencyComparator());
Queue<TagValueSet> queue = new PriorityQueue<>(2 * identityMap.size(), new TagValueSet.FrequencyComparator());
queue.addAll(identityMap.values());
while (queue.size() > 1)
{
while (queue.size() > 1) {
TagValueSet node = new TagValueSet(nextTagValueSetId++);
node.child1 = queue.poll();
node.child2 = queue.poll();
node.frequency = node.child1.frequency + node.child2.frequency;
queue.add( node );
queue.add(node);
}
TagValueSet root = queue.poll();
root.encode( bc, 1, 0 );
root.encode(bc, 1, 0);
}
this.bc = bc;
}
public TagValueCoder( BitCoderContext bc, DataBuffers buffers, TagValueValidator validator )
{
tree = decodeTree( bc, buffers, validator );
public TagValueCoder(BitCoderContext bc, DataBuffers buffers, TagValueValidator validator) {
tree = decodeTree(bc, buffers, validator);
this.bc = bc;
}
public TagValueCoder()
{
identityMap = new HashMap<TagValueSet, TagValueSet>();
public TagValueCoder() {
identityMap = new HashMap<>();
}
private Object decodeTree( BitCoderContext bc, DataBuffers buffers, TagValueValidator validator )
{
private Object decodeTree(BitCoderContext bc, DataBuffers buffers, TagValueValidator validator) {
boolean isNode = bc.decodeBit();
if ( isNode )
{
if (isNode) {
TreeNode node = new TreeNode();
node.child1 = decodeTree( bc, buffers, validator );
node.child2 = decodeTree( bc, buffers, validator );
node.child1 = decodeTree(bc, buffers, validator);
node.child2 = decodeTree(bc, buffers, validator);
return node;
}
byte[] buffer = buffers.tagbuf1;
BitCoderContext ctx = buffers.bctx1;
ctx.reset( buffer );
BitCoderContext ctx = buffers.bctx1;
ctx.reset(buffer);
int inum = 0;
int lastEncodedInum = 0;
boolean hasdata = false;
for ( ;; )
{
for (; ; ) {
int delta = bc.decodeVarBits();
if ( !hasdata )
{
if ( delta == 0 )
{
if (!hasdata) {
if (delta == 0) {
return null;
}
}
if ( delta == 0 )
{
ctx.encodeVarBits( 0 );
if (delta == 0) {
ctx.encodeVarBits(0);
break;
}
inum += delta;
int data = bc.decodeVarBits();
if ( validator == null || validator.isLookupIdxUsed( inum ) )
{
if (validator == null || validator.isLookupIdxUsed(inum)) {
hasdata = true;
ctx.encodeVarBits( inum - lastEncodedInum );
ctx.encodeVarBits( data );
ctx.encodeVarBits(inum - lastEncodedInum);
ctx.encodeVarBits(data);
lastEncodedInum = inum;
}
}
byte[] res;
int len = ctx.closeAndGetEncodedLength();
if ( validator == null )
{
if (validator == null) {
res = new byte[len];
System.arraycopy( buffer, 0, res, 0, len );
}
else
{
res = validator.unify( buffer, 0, len );
System.arraycopy(buffer, 0, res, 0, len);
} else {
res = validator.unify(buffer, 0, len);
}
int accessType = validator == null ? 2 : validator.accessType( res );
if ( accessType > 0 )
{
int accessType = validator == null ? 2 : validator.accessType(res);
if (accessType > 0) {
TagValueWrapper w = new TagValueWrapper();
w.data = res;
w.accessType = accessType;
@ -166,14 +142,12 @@ public final class TagValueCoder
return null;
}
public static final class TreeNode
{
public static final class TreeNode {
public Object child1;
public Object child2;
}
public static final class TagValueSet
{
public static final class TagValueSet {
public byte[] data;
public int frequency;
public int code;
@ -182,66 +156,51 @@ public final class TagValueCoder
public TagValueSet child2;
private int id; // serial number to make the comparator well defined in case of equal frequencies
public TagValueSet( int id )
{
public TagValueSet(int id) {
this.id = id;
}
public void encode( BitCoderContext bc, int range, int code )
{
public void encode(BitCoderContext bc, int range, int code) {
this.range = range;
this.code = code;
boolean isNode = child1 != null;
bc.encodeBit( isNode );
if ( isNode )
{
child1.encode( bc, range << 1, code );
child2.encode( bc, range << 1, code + range );
}
else
{
if ( data == null )
{
bc.encodeVarBits( 0 );
bc.encodeBit(isNode);
if (isNode) {
child1.encode(bc, range << 1, code);
child2.encode(bc, range << 1, code + range);
} else {
if (data == null) {
bc.encodeVarBits(0);
return;
}
BitCoderContext src = new BitCoderContext( data );
for ( ;; )
{
BitCoderContext src = new BitCoderContext(data);
for (; ; ) {
int delta = src.decodeVarBits();
bc.encodeVarBits( delta );
if ( delta == 0 )
{
bc.encodeVarBits(delta);
if (delta == 0) {
break;
}
int data = src.decodeVarBits();
bc.encodeVarBits( data );
bc.encodeVarBits(data);
}
}
}
@Override
public boolean equals( Object o )
{
if ( o instanceof TagValueSet )
{
public boolean equals(Object o) {
if (o instanceof TagValueSet) {
TagValueSet tvs = (TagValueSet) o;
if ( data == null )
{
if (data == null) {
return tvs.data == null;
}
if ( tvs.data == null )
{
if (tvs.data == null) {
return data == null;
}
if ( data.length != tvs.data.length )
{
if (data.length != tvs.data.length) {
return false;
}
for ( int i = 0; i < data.length; i++ )
{
if ( data[i] != tvs.data[i] )
{
for (int i = 0; i < data.length; i++) {
if (data[i] != tvs.data[i]) {
return false;
}
}
@ -251,39 +210,34 @@ public final class TagValueCoder
}
@Override
public int hashCode()
{
if ( data == null )
{
public int hashCode() {
if (data == null) {
return 0;
}
int h = 17;
for ( int i = 0; i < data.length; i++ )
{
h = ( h << 8 ) + data[i];
for (int i = 0; i < data.length; i++) {
h = (h << 8) + data[i];
}
return h;
}
public static class FrequencyComparator implements Comparator<TagValueSet>
{
public static class FrequencyComparator implements Comparator<TagValueSet> {
@Override
public int compare(TagValueSet tvs1, TagValueSet tvs2) {
if ( tvs1.frequency < tvs2.frequency )
if (tvs1.frequency < tvs2.frequency)
return -1;
if ( tvs1.frequency > tvs2.frequency )
if (tvs1.frequency > tvs2.frequency)
return 1;
// to avoid ordering instability, decide on the id if frequency is equal
if ( tvs1.id < tvs2.id )
if (tvs1.id < tvs2.id)
return -1;
if ( tvs1.id > tvs2.id )
if (tvs1.id > tvs2.id)
return 1;
if ( tvs1 != tvs2 )
{
throw new RuntimeException( "identity corruption!" );
if (tvs1 != tvs2) {
throw new RuntimeException("identity corruption!");
}
return 0;
}
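encodeDictionary builds its Huffman tree by repeatedly merging the two lowest-frequency tag-value sets from a priority queue; the id tiebreak in FrequencyComparator above keeps that ordering deterministic. A generic sketch of the same construction (hypothetical standalone types, not the BRouter classes):

```java
import java.util.PriorityQueue;

class HuffmanSketch {
    static class Node implements Comparable<Node> {
        final long freq;
        final String symbol; // null for internal nodes
        final Node left, right;

        Node(String symbol, long freq) {
            this.symbol = symbol;
            this.freq = freq;
            left = right = null;
        }

        Node(Node l, Node r) {
            symbol = null;
            freq = l.freq + r.freq;
            left = l;
            right = r;
        }

        // NOTE: unlike TagValueSet.FrequencyComparator, there is no id
        // tiebreak here, so equal frequencies may order arbitrarily
        // (the resulting code lengths are still optimal).
        public int compareTo(Node o) {
            return Long.compare(freq, o.freq);
        }
    }

    // Merge the two rarest nodes until a single root remains.
    static Node build(Node... leaves) {
        PriorityQueue<Node> q = new PriorityQueue<>();
        for (Node n : leaves) {
            q.add(n);
        }
        while (q.size() > 1) {
            q.add(new Node(q.poll(), q.poll()));
        }
        return q.poll();
    }

    // A symbol's code length equals its depth in the tree.
    static int depth(Node n, String sym, int d) {
        if (n == null) {
            return -1;
        }
        if (sym.equals(n.symbol)) {
            return d;
        }
        int l = depth(n.left, sym, d + 1);
        return l >= 0 ? l : depth(n.right, sym, d + 1);
    }
}
```

Frequent sets end up near the root and get short codes, which is exactly why encodeTagValueSet can emit `tvs.code` in `tvs.range - 1` bounded bits.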


@ -1,17 +1,16 @@
package btools.codec;
public interface TagValueValidator
{
public interface TagValueValidator {
/**
* @param tagValueSet the way description to check
* @return 0 = nothing, 1=no matching, 2=normal
*/
public int accessType( byte[] tagValueSet );
int accessType(byte[] tagValueSet);
public byte[] unify( byte[] tagValueSet, int offset, int len );
byte[] unify(byte[] tagValueSet, int offset, int len);
public boolean isLookupIdxUsed( int idx );
boolean isLookupIdxUsed(int idx);
public void setDecodeForbidden( boolean decodeForbidden );
void setDecodeForbidden(boolean decodeForbidden);
}


@ -5,8 +5,7 @@ package btools.codec;
* TagValueWrapper wraps a description bitmap
* to add the access-type
*/
public final class TagValueWrapper
{
public final class TagValueWrapper {
public byte[] data;
public int accessType;
}


@ -5,9 +5,10 @@ package btools.codec;
* from the decoder to find the closest
* matches to the waypoints
*/
public interface WaypointMatcher
{
boolean start( int ilonStart, int ilatStart, int ilonTarget, int ilatTarget );
void transferNode( int ilon, int ilat );
public interface WaypointMatcher {
boolean start(int ilonStart, int ilatStart, int ilonTarget, int ilatTarget);
void transferNode(int ilon, int ilat);
void end();
}


@ -3,50 +3,35 @@ package btools.codec;
import org.junit.Assert;
import org.junit.Test;
public class LinkedListContainerTest
{
public class LinkedListContainerTest {
@Test
public void linkedListTest1()
{
public void linkedListTest1() {
int nlists = 553;
LinkedListContainer llc = new LinkedListContainer( nlists, null );
LinkedListContainer llc = new LinkedListContainer(nlists, null);
for ( int ln = 0; ln < nlists; ln++ )
{
for ( int i = 0; i < 10; i++ )
{
llc.addDataElement( ln, ln * i );
for (int ln = 0; ln < nlists; ln++) {
for (int i = 0; i < 10; i++) {
llc.addDataElement(ln, ln * i);
}
}
for ( int i = 0; i < 10; i++ )
{
for ( int ln = 0; ln < nlists; ln++ )
{
llc.addDataElement( ln, ln * i );
for (int i = 0; i < 10; i++) {
for (int ln = 0; ln < nlists; ln++) {
llc.addDataElement(ln, ln * i);
}
}
for ( int ln = 0; ln < nlists; ln++ )
{
int cnt = llc.initList( ln );
Assert.assertTrue( "list size test", cnt == 20 );
for (int ln = 0; ln < nlists; ln++) {
int cnt = llc.initList(ln);
Assert.assertEquals("list size test", 20, cnt);
for ( int i = 19; i >= 0; i-- )
{
for (int i = 19; i >= 0; i--) {
int data = llc.getDataElement();
Assert.assertTrue( "data value test", data == ln * ( i % 10 ) );
Assert.assertEquals("data value test", ln * (i % 10), data);
}
}
try
{
llc.getDataElement();
Assert.fail( "no more elements expected" );
}
catch (IllegalArgumentException e)
{
}
Assert.assertThrows("no more elements expected", IllegalArgumentException.class, () -> llc.getDataElement());
}
}


@@ -6,100 +6,79 @@ import java.util.Random;
import org.junit.Assert;
import org.junit.Test;
public class StatCoderContextTest {
@Test
public void noisyVarBitsEncodeDecodeTest() {
byte[] ab = new byte[40000];
StatCoderContext ctx = new StatCoderContext(ab);
for (int noisybits = 1; noisybits < 12; noisybits++) {
for (int i = 0; i < 1000; i++) {
ctx.encodeNoisyNumber(i, noisybits);
}
}
ctx.closeAndGetEncodedLength();
ctx = new StatCoderContext(ab);
for (int noisybits = 1; noisybits < 12; noisybits++) {
for (int i = 0; i < 1000; i++) {
int value = ctx.decodeNoisyNumber(noisybits);
if (value != i) {
Assert.fail("value mismatch: noisybits=" + noisybits + " i=" + i + " value=" + value);
}
}
}
}
@Test
public void noisySignedVarBitsEncodeDecodeTest() {
byte[] ab = new byte[80000];
StatCoderContext ctx = new StatCoderContext(ab);
for (int noisybits = 0; noisybits < 12; noisybits++) {
for (int i = -1000; i < 1000; i++) {
ctx.encodeNoisyDiff(i, noisybits);
}
}
ctx.closeAndGetEncodedLength();
ctx = new StatCoderContext(ab);
for (int noisybits = 0; noisybits < 12; noisybits++) {
for (int i = -1000; i < 1000; i++) {
int value = ctx.decodeNoisyDiff(noisybits);
if (value != i) {
Assert.fail("value mismatch: noisybits=" + noisybits + " i=" + i + " value=" + value);
}
}
}
}
@Test
public void predictedValueEncodeDecodeTest() {
byte[] ab = new byte[80000];
StatCoderContext ctx = new StatCoderContext(ab);
for (int value = -100; value < 100; value += 5) {
for (int predictor = -200; predictor < 200; predictor += 7) {
ctx.encodePredictedValue(value, predictor);
}
}
ctx.closeAndGetEncodedLength();
ctx = new StatCoderContext(ab);
for (int value = -100; value < 100; value += 5) {
for (int predictor = -200; predictor < 200; predictor += 7) {
int decodedValue = ctx.decodePredictedValue(predictor);
if (value != decodedValue) {
Assert.fail("value mismatch: value=" + value + " predictor=" + predictor + " decodedValue=" + decodedValue);
}
}
}
}
@Test
public void sortedArrayEncodeDecodeTest() {
Random rand = new Random();
int size = 1000000;
int[] values = new int[size];
for (int i = 0; i < size; i++) {
values[i] = rand.nextInt() & 0x0fffffff;
}
values[5] = 175384; // force collision
@@ -108,23 +87,21 @@ public class StatCoderContextTest
values[15] = 275384; // force neighbours
values[18] = 275385;
Arrays.sort(values);
byte[] ab = new byte[3000000];
StatCoderContext ctx = new StatCoderContext(ab);
ctx.encodeSortedArray(values, 0, size, 0x08000000, 0);
ctx.closeAndGetEncodedLength();
ctx = new StatCoderContext(ab);
int[] decodedValues = new int[size];
ctx.decodeSortedArray(decodedValues, 0, size, 27, 0);
for (int i = 0; i < size; i++) {
if (values[i] != decodedValues[i]) {
Assert.fail("mismatch at i=" + i + " " + values[i] + "<>" + decodedValues[i]);
}
}
}
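All four tests above follow the same encode → `closeAndGetEncodedLength()` → decode round-trip pattern: write a deterministic sequence, fix the buffer, then re-read it and require an exact match. As a self-contained illustration of that pattern only, here is a hypothetical sketch using a plain LEB128-style varint codec; this is NOT BRouter's actual StatCoderContext.

```java
import java.io.ByteArrayOutputStream;

// Illustrative only: a minimal LEB128-style varint codec, not BRouter's StatCoderContext.
// It demonstrates the same encode -> flush -> decode round-trip the tests rely on.
public class VarintRoundTrip {
  static void encode(ByteArrayOutputStream out, int v) {
    while ((v & ~0x7f) != 0) {
      out.write((v & 0x7f) | 0x80); // low 7 bits with continuation flag set
      v >>>= 7;
    }
    out.write(v); // final byte, continuation flag clear
  }

  static int decode(byte[] buf, int[] pos) {
    int v = 0, shift = 0, b;
    do {
      b = buf[pos[0]++] & 0xff;
      v |= (b & 0x7f) << shift;
      shift += 7;
    } while ((b & 0x80) != 0);
    return v;
  }

  public static void main(String[] args) {
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    for (int i = 0; i < 1000; i++) {
      encode(out, i); // encode phase
    }
    byte[] ab = out.toByteArray(); // "close" phase: the encoded buffer is now fixed
    int[] pos = {0};
    for (int i = 0; i < 1000; i++) {
      if (decode(ab, pos) != i) { // decode phase must reproduce the input exactly
        throw new AssertionError("value mismatch at i=" + i);
      }
    }
    System.out.println("round trip ok");
  }
}
```

The BRouter tests additionally sweep the bit-width parameter (`noisybits`) because the codec's correctness must hold for every configured width, not just one.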


@@ -1 +0,0 @@
/build/


@@ -1,13 +1,13 @@
plugins {
id 'java-library'
id 'brouter.library-conventions'
}
dependencies {
implementation project(':brouter-mapaccess')
implementation project(':brouter-util')
implementation project(':brouter-expressions')
implementation project(':brouter-codec')
testImplementation 'junit:junit:4.13.1'
}
// MapcreatorTest generates segments which are used in tests
test.dependsOn ':brouter-map-creator:test'


@@ -1,3 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<manifest package="btools.router" />


@@ -0,0 +1,43 @@
package btools.router;
import java.io.BufferedWriter;
import java.io.StringWriter;
public class FormatCsv extends Formatter {
public FormatCsv(RoutingContext rc) {
super(rc);
}
@Override
public String format(OsmTrack t) {
try {
StringWriter sw = new StringWriter();
BufferedWriter bw = new BufferedWriter(sw);
writeMessages(bw, t);
return sw.toString();
} catch (Exception ex) {
return "Error: " + ex.getMessage();
}
}
public void writeMessages(BufferedWriter bw, OsmTrack t) throws Exception {
dumpLine(bw, MESSAGES_HEADER);
for (String m : t.aggregateMessages()) {
dumpLine(bw, m);
}
if (bw != null)
bw.close();
}
private void dumpLine(BufferedWriter bw, String s) throws Exception {
if (bw == null) {
System.out.println(s);
} else {
bw.write(s);
bw.write("\n");
}
}
}
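FormatCsv writes the tab-separated `MESSAGES_HEADER` line, then one row per aggregated message, through a `BufferedWriter` over a `StringWriter`; closing the writer flushes the buffer so `sw.toString()` contains everything. A minimal standalone sketch of that flow (hypothetical class and method names, not project code):

```java
import java.io.BufferedWriter;
import java.io.StringWriter;

// Hypothetical mini-version of FormatCsv's flow: write a header line, then one
// tab-separated row per message, through a BufferedWriter over a StringWriter.
public class CsvSketch {
  static String dump(String header, String[] rows) throws Exception {
    StringWriter sw = new StringWriter();
    BufferedWriter bw = new BufferedWriter(sw);
    bw.write(header);
    bw.write("\n");
    for (String r : rows) {
      bw.write(r);
      bw.write("\n");
    }
    bw.close(); // flushes the buffer into the StringWriter
    return sw.toString();
  }

  public static void main(String[] args) throws Exception {
    System.out.print(dump("Longitude\tLatitude", new String[]{"8.5\t47.3"}));
  }
}
```

This also shows why `FormatCsv.format` can safely return `sw.toString()` after `writeMessages` has closed the writer: close implies flush, so no buffered content is lost.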


@@ -0,0 +1,534 @@
package btools.router;
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.StringWriter;
import java.util.Map;
import btools.mapaccess.MatchedWaypoint;
import btools.util.StringUtils;
public class FormatGpx extends Formatter {
public FormatGpx(RoutingContext rc) {
super(rc);
}
@Override
public String format(OsmTrack t) {
try {
StringWriter sw = new StringWriter(8192);
BufferedWriter bw = new BufferedWriter(sw);
formatAsGpx(bw, t);
bw.close();
return sw.toString();
} catch (Exception e) {
throw new RuntimeException(e);
}
}
public String formatAsGpx(BufferedWriter sb, OsmTrack t) throws IOException {
int turnInstructionMode = t.voiceHints != null ? t.voiceHints.turnInstructionMode : 0;
sb.append("<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n");
if (turnInstructionMode != 9) {
for (int i = t.messageList.size() - 1; i >= 0; i--) {
String message = t.messageList.get(i);
if (i < t.messageList.size() - 1)
message = "(alt-index " + i + ": " + message + " )";
if (message != null)
sb.append("<!-- ").append(message).append(" -->\n");
}
}
if (turnInstructionMode == 4) { // comment style
sb.append("<!-- $transport-mode$").append(t.voiceHints.getTransportMode()).append("$ -->\n");
sb.append("<!-- cmd idx lon lat d2next geometry -->\n");
sb.append("<!-- $turn-instruction-start$\n");
for (VoiceHint hint : t.voiceHints.list) {
sb.append(String.format(" $turn$%6s;%6d;%10s;%10s;%6d;%s$\n", hint.getCommandString(), hint.indexInTrack,
formatILon(hint.ilon), formatILat(hint.ilat), (int) (hint.distanceToNext), hint.formatGeometry()));
}
sb.append(" $turn-instruction-end$ -->\n");
}
sb.append("<gpx \n");
sb.append(" xmlns=\"http://www.topografix.com/GPX/1/1\" \n");
sb.append(" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" \n");
if (turnInstructionMode == 9) { // BRouter style
sb.append(" xmlns:brouter=\"Not yet documented\" \n");
}
if (turnInstructionMode == 7) { // old locus style
sb.append(" xmlns:locus=\"http://www.locusmap.eu\" \n");
}
sb.append(" xsi:schemaLocation=\"http://www.topografix.com/GPX/1/1 http://www.topografix.com/GPX/1/1/gpx.xsd\" \n");
if (turnInstructionMode == 3) {
sb.append(" creator=\"OsmAndRouter\" version=\"1.1\">\n");
} else {
sb.append(" creator=\"BRouter-" + t.version + "\" version=\"1.1\">\n");
}
if (turnInstructionMode == 9) {
sb.append(" <metadata>\n");
sb.append(" <name>").append(t.name).append("</name>\n");
sb.append(" <extensions>\n");
sb.append(" <brouter:info>").append(t.messageList.get(0)).append("</brouter:info>\n");
if (t.params != null && t.params.size() > 0) {
sb.append(" <brouter:params><![CDATA[");
int i = 0;
for (Map.Entry<String, String> e : t.params.entrySet()) {
if (i++ != 0) sb.append("&");
sb.append(e.getKey()).append("=").append(e.getValue());
}
sb.append("]]></brouter:params>\n");
}
sb.append(" </extensions>\n");
sb.append(" </metadata>\n");
}
if (turnInstructionMode == 3 || turnInstructionMode == 8) { // osmand style, cruiser
float lastRteTime = 0;
sb.append(" <rte>\n");
float rteTime = t.getVoiceHintTime(0);
StringBuffer first = new StringBuffer();
// define start point
{
first.append(" <rtept lat=\"").append(formatILat(t.nodes.get(0).getILat())).append("\" lon=\"")
.append(formatILon(t.nodes.get(0).getILon())).append("\">\n")
.append(" <desc>start</desc>\n <extensions>\n");
if (rteTime != lastRteTime) { // add timing only if available
double ti = rteTime - lastRteTime;
first.append(" <time>").append("" + (int) (ti + 0.5)).append("</time>\n");
lastRteTime = rteTime;
}
first.append(" <offset>0</offset>\n </extensions>\n </rtept>\n");
}
if (turnInstructionMode == 8) {
if (t.matchedWaypoints.get(0).direct && t.voiceHints.list.get(0).indexInTrack == 0) {
// has a voice hint: do nothing, the voice hint will cover it
} else {
sb.append(first.toString());
}
} else {
sb.append(first.toString());
}
for (int i = 0; i < t.voiceHints.list.size(); i++) {
VoiceHint hint = t.voiceHints.list.get(i);
sb.append(" <rtept lat=\"").append(formatILat(hint.ilat)).append("\" lon=\"")
.append(formatILon(hint.ilon)).append("\">\n")
.append(" <desc>")
.append(turnInstructionMode == 3 ? hint.getMessageString() : hint.getCruiserMessageString())
.append("</desc>\n <extensions>\n");
rteTime = t.getVoiceHintTime(i + 1);
if (rteTime != lastRteTime) { // add timing only if available
double ti = rteTime - lastRteTime;
sb.append(" <time>").append("" + (int) (ti + 0.5)).append("</time>\n");
lastRteTime = rteTime;
}
sb.append(" <turn>")
.append(turnInstructionMode == 3 ? hint.getCommandString() : hint.getCruiserCommandString())
.append("</turn>\n <turn-angle>").append("" + (int) hint.angle)
.append("</turn-angle>\n <offset>").append("" + hint.indexInTrack).append("</offset>\n </extensions>\n </rtept>\n");
}
sb.append(" <rtept lat=\"").append(formatILat(t.nodes.get(t.nodes.size() - 1).getILat())).append("\" lon=\"")
.append(formatILon(t.nodes.get(t.nodes.size() - 1).getILon())).append("\">\n")
.append(" <desc>destination</desc>\n <extensions>\n");
sb.append(" <time>0</time>\n");
sb.append(" <offset>").append("" + (t.nodes.size() - 1)).append("</offset>\n </extensions>\n </rtept>\n");
sb.append("</rte>\n");
}
if (turnInstructionMode == 7) { // old locus style
float lastRteTime = t.getVoiceHintTime(0);
for (int i = 0; i < t.voiceHints.list.size(); i++) {
VoiceHint hint = t.voiceHints.list.get(i);
sb.append(" <wpt lon=\"").append(formatILon(hint.ilon)).append("\" lat=\"")
.append(formatILat(hint.ilat)).append("\">")
.append(hint.selev == Short.MIN_VALUE ? "" : "<ele>" + (hint.selev / 4.) + "</ele>")
.append("<name>")
.append(hint.getMessageString())
.append("</name>")
.append("<extensions><locus:rteDistance>").append("" + hint.distanceToNext).append("</locus:rteDistance>");
float rteTime = t.getVoiceHintTime(i + 1);
if (rteTime != lastRteTime) { // add timing only if available
double ti = rteTime - lastRteTime;
double speed = hint.distanceToNext / ti;
sb.append("<locus:rteTime>").append("" + ti).append("</locus:rteTime>")
.append("<locus:rteSpeed>").append("" + speed).append("</locus:rteSpeed>");
lastRteTime = rteTime;
}
sb.append("<locus:rtePointAction>").append("" + hint.getLocusAction()).append("</locus:rtePointAction></extensions>")
.append("</wpt>\n");
}
}
if (turnInstructionMode == 5) { // gpsies style
for (VoiceHint hint : t.voiceHints.list) {
sb.append(" <wpt lon=\"").append(formatILon(hint.ilon)).append("\" lat=\"")
.append(formatILat(hint.ilat)).append("\">")
.append("<name>").append(hint.getMessageString()).append("</name>")
.append("<sym>").append(hint.getSymbolString().toLowerCase()).append("</sym>")
.append("<type>").append(hint.getSymbolString()).append("</type>")
.append("</wpt>\n");
}
}
if (turnInstructionMode == 6) { // orux style
for (VoiceHint hint : t.voiceHints.list) {
sb.append(" <wpt lat=\"").append(formatILat(hint.ilat)).append("\" lon=\"")
.append(formatILon(hint.ilon)).append("\">")
.append(hint.selev == Short.MIN_VALUE ? "" : "<ele>" + (hint.selev / 4.) + "</ele>")
.append("<extensions>\n" +
" <om:oruxmapsextensions xmlns:om=\"http://www.oruxmaps.com/oruxmapsextensions/1/0\">\n" +
" <om:ext type=\"ICON\" subtype=\"0\">").append("" + hint.getOruxAction())
.append("</om:ext>\n" +
" </om:oruxmapsextensions>\n" +
" </extensions>\n" +
" </wpt>\n");
}
}
for (int i = 0; i <= t.pois.size() - 1; i++) {
OsmNodeNamed poi = t.pois.get(i);
sb.append(" <wpt lon=\"").append(formatILon(poi.ilon)).append("\" lat=\"")
.append(formatILat(poi.ilat)).append("\">\n")
.append(" <name>").append(StringUtils.escapeXml10(poi.name)).append("</name>\n")
.append(" </wpt>\n");
}
if (t.exportWaypoints) {
for (int i = 0; i <= t.matchedWaypoints.size() - 1; i++) {
MatchedWaypoint wt = t.matchedWaypoints.get(i);
sb.append(" <wpt lon=\"").append(formatILon(wt.waypoint.ilon)).append("\" lat=\"")
.append(formatILat(wt.waypoint.ilat)).append("\">\n")
.append(" <name>").append(StringUtils.escapeXml10(wt.name)).append("</name>\n");
if (i == 0) {
sb.append(" <type>from</type>\n");
} else if (i == t.matchedWaypoints.size() - 1) {
sb.append(" <type>to</type>\n");
} else {
sb.append(" <type>via</type>\n");
}
sb.append(" </wpt>\n");
}
}
sb.append(" <trk>\n");
if (turnInstructionMode == 9
|| turnInstructionMode == 2
|| turnInstructionMode == 8
|| turnInstructionMode == 4) { // brouter, locus, cruiser, comment style
sb.append(" <src>").append(t.name).append("</src>\n");
sb.append(" <type>").append(t.voiceHints.getTransportMode()).append("</type>\n");
} else {
sb.append(" <name>").append(t.name).append("</name>\n");
}
if (turnInstructionMode == 7) {
sb.append(" <extensions>\n");
sb.append(" <locus:rteComputeType>").append("" + t.voiceHints.getLocusRouteType()).append("</locus:rteComputeType>\n");
sb.append(" <locus:rteSimpleRoundabouts>1</locus:rteSimpleRoundabouts>\n");
sb.append(" </extensions>\n");
}
// all points
sb.append(" <trkseg>\n");
String lastway = "";
boolean bNextDirect = false;
OsmPathElement nn = null;
String aSpeed;
for (int idx = 0; idx < t.nodes.size(); idx++) {
OsmPathElement n = t.nodes.get(idx);
String sele = n.getSElev() == Short.MIN_VALUE ? "" : "<ele>" + n.getElev() + "</ele>";
VoiceHint hint = t.getVoiceHint(idx);
MatchedWaypoint mwpt = t.getMatchedWaypoint(idx);
if (t.showTime) {
sele += "<time>" + getFormattedTime3(n.getTime()) + "</time>";
}
if (turnInstructionMode == 8) {
if (mwpt != null &&
!mwpt.name.startsWith("via") && !mwpt.name.startsWith("from") && !mwpt.name.startsWith("to")) {
sele += "<name>" + mwpt.name + "</name>";
}
}
boolean bNeedHeader = false;
if (turnInstructionMode == 9) { // trkpt/sym style
if (hint != null) {
if (mwpt != null &&
!mwpt.name.startsWith("via") && !mwpt.name.startsWith("from") && !mwpt.name.startsWith("to")) {
sele += "<name>" + mwpt.name + "</name>";
}
sele += "<desc>" + hint.getCruiserMessageString() + "</desc>";
sele += "<sym>" + hint.getCommandString(hint.cmd) + "</sym>";
if (mwpt != null) {
sele += "<type>Via</type>";
}
sele += "<extensions>";
if (t.showspeed) {
double speed = 0;
if (nn != null) {
int dist = n.calcDistance(nn);
float dt = n.getTime() - nn.getTime();
if (dt != 0.f) {
speed = ((3.6f * dist) / dt + 0.5);
}
}
sele += "<brouter:speed>" + (((int) (speed * 10)) / 10.f) + "</brouter:speed>";
}
sele += "<brouter:voicehint>" + hint.getCommandString() + ";" + (int) (hint.distanceToNext) + "," + hint.formatGeometry() + "</brouter:voicehint>";
if (n.message != null && n.message.wayKeyValues != null && !n.message.wayKeyValues.equals(lastway)) {
sele += "<brouter:way>" + n.message.wayKeyValues + "</brouter:way>";
lastway = n.message.wayKeyValues;
}
if (n.message != null && n.message.nodeKeyValues != null) {
sele += "<brouter:node>" + n.message.nodeKeyValues + "</brouter:node>";
}
sele += "</extensions>";
}
if (idx == 0 && hint == null) {
if (mwpt != null && mwpt.direct) {
sele += "<desc>beeline</desc>";
} else {
sele += "<desc>start</desc>";
}
sele += "<type>Via</type>";
} else if (idx == t.nodes.size() - 1 && hint == null) {
sele += "<desc>end</desc>";
sele += "<type>Via</type>";
} else {
if (mwpt != null && hint == null) {
if (mwpt.direct) {
// bNextDirect = true;
sele += "<desc>beeline</desc>";
} else {
sele += "<desc>" + mwpt.name + "</desc>";
}
sele += "<type>Via</type>";
bNextDirect = false;
}
}
if (hint == null) {
bNeedHeader = (t.showspeed || (n.message != null && n.message.wayKeyValues != null && !n.message.wayKeyValues.equals(lastway))) ||
(n.message != null && n.message.nodeKeyValues != null);
if (bNeedHeader) {
sele += "<extensions>";
if (t.showspeed) {
double speed = 0;
if (nn != null) {
int dist = n.calcDistance(nn);
float dt = n.getTime() - nn.getTime();
if (dt != 0.f) {
speed = ((3.6f * dist) / dt + 0.5);
}
}
sele += "<brouter:speed>" + (((int) (speed * 10)) / 10.f) + "</brouter:speed>";
}
if (n.message != null && n.message.wayKeyValues != null && !n.message.wayKeyValues.equals(lastway)) {
sele += "<brouter:way>" + n.message.wayKeyValues + "</brouter:way>";
lastway = n.message.wayKeyValues;
}
if (n.message != null && n.message.nodeKeyValues != null) {
sele += "<brouter:node>" + n.message.nodeKeyValues + "</brouter:node>";
}
sele += "</extensions>";
}
}
}
if (turnInstructionMode == 2) { // locus style new
if (hint != null) {
if (mwpt != null) {
if (!mwpt.name.startsWith("via") && !mwpt.name.startsWith("from") && !mwpt.name.startsWith("to")) {
sele += "<name>" + mwpt.name + "</name>";
}
if (mwpt.direct && bNextDirect) {
sele += "<src>" + hint.getLocusSymbolString() + "</src><sym>pass_place</sym><type>Shaping</type>";
// bNextDirect = false;
} else if (mwpt.direct) {
if (idx == 0)
sele += "<sym>pass_place</sym><type>Via</type>";
else
sele += "<sym>pass_place</sym><type>Shaping</type>";
bNextDirect = true;
} else if (bNextDirect) {
sele += "<src>beeline</src><sym>" + hint.getLocusSymbolString() + "</sym><type>Shaping</type>";
bNextDirect = false;
} else {
sele += "<sym>" + hint.getLocusSymbolString() + "</sym><type>Via</type>";
}
} else {
sele += "<sym>" + hint.getLocusSymbolString() + "</sym>";
}
} else {
if (idx == 0 && hint == null) {
int pos = sele.indexOf("<sym");
if (pos != -1) {
sele = sele.substring(0, pos);
}
if (mwpt != null && !mwpt.name.startsWith("from"))
sele += "<name>" + mwpt.name + "</name>";
if (mwpt != null && mwpt.direct) {
bNextDirect = true;
}
sele += "<sym>pass_place</sym>";
sele += "<type>Via</type>";
} else if (idx == t.nodes.size() - 1 && hint == null) {
int pos = sele.indexOf("<sym");
if (pos != -1) {
sele = sele.substring(0, pos);
}
if (mwpt != null && mwpt.name != null && !mwpt.name.startsWith("to"))
sele += "<name>" + mwpt.name + "</name>";
if (bNextDirect) {
sele += "<src>beeline</src>";
}
sele += "<sym>pass_place</sym>";
sele += "<type>Via</type>";
} else {
if (mwpt != null) {
if (!mwpt.name.startsWith("via") && !mwpt.name.startsWith("from") && !mwpt.name.startsWith("to")) {
sele += "<name>" + mwpt.name + "</name>";
}
if (mwpt.direct && bNextDirect) {
sele += "<src>beeline</src><sym>pass_place</sym><type>Shaping</type>";
} else if (mwpt.direct) {
if (idx == 0)
sele += "<sym>pass_place</sym><type>Via</type>";
else
sele += "<sym>pass_place</sym><type>Shaping</type>";
bNextDirect = true;
} else if (bNextDirect) {
sele += "<src>beeline</src><sym>pass_place</sym><type>Shaping</type>";
bNextDirect = false;
} else if (mwpt.name.startsWith("via") ||
mwpt.name.startsWith("from") ||
mwpt.name.startsWith("to")) {
if (bNextDirect) {
sele += "<src>beeline</src><sym>pass_place</sym><type>Shaping</type>";
} else {
sele += "<sym>pass_place</sym><type>Via</type>";
}
bNextDirect = false;
} else {
sele += "<name>" + mwpt.name + "</name>";
sele += "<sym>pass_place</sym><type>Via</type>";
}
}
}
}
}
sb.append(" <trkpt lon=\"").append(formatILon(n.getILon())).append("\" lat=\"")
.append(formatILat(n.getILat())).append("\">").append(sele).append("</trkpt>\n");
nn = n;
}
sb.append(" </trkseg>\n");
sb.append(" </trk>\n");
sb.append("</gpx>\n");
return sb.toString();
}
public String formatAsWaypoint(OsmNodeNamed n) {
try {
StringWriter sw = new StringWriter(8192);
BufferedWriter bw = new BufferedWriter(sw);
formatGpxHeader(bw);
formatWaypointGpx(bw, n);
formatGpxFooter(bw);
bw.close();
sw.close();
return sw.toString();
} catch (Exception e) {
throw new RuntimeException(e);
}
}
public void formatGpxHeader(BufferedWriter sb) throws IOException {
sb.append("<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n");
sb.append("<gpx \n");
sb.append(" xmlns=\"http://www.topografix.com/GPX/1/1\" \n");
sb.append(" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" \n");
sb.append(" xsi:schemaLocation=\"http://www.topografix.com/GPX/1/1 http://www.topografix.com/GPX/1/1/gpx.xsd\" \n");
sb.append(" creator=\"BRouter-" + OsmTrack.version + "\" version=\"1.1\">\n");
}
public void formatGpxFooter(BufferedWriter sb) throws IOException {
sb.append("</gpx>\n");
}
public void formatWaypointGpx(BufferedWriter sb, OsmNodeNamed n) throws IOException {
sb.append(" <wpt lon=\"").append(formatILon(n.ilon)).append("\" lat=\"")
.append(formatILat(n.ilat)).append("\">");
if (n.getSElev() != Short.MIN_VALUE) {
sb.append("<ele>").append("" + n.getElev()).append("</ele>");
}
if (n.name != null) {
sb.append("<name>").append(StringUtils.escapeXml10(n.name)).append("</name>");
}
if (n.nodeDescription != null && rc != null) {
sb.append("<desc>").append(rc.expctxWay.getKeyValueDescription(false, n.nodeDescription)).append("</desc>");
}
sb.append("</wpt>\n");
}
public static String getWaypoint(int ilon, int ilat, String name, String desc) {
return "<wpt lon=\"" + formatILon(ilon) + "\" lat=\"" + formatILat(ilat) + "\"><name>" + name + "</name>" + (desc != null ? "<desc>" + desc + "</desc>" : "") + "</wpt>";
}
public OsmTrack read(String filename) throws Exception {
File f = new File(filename);
if (!f.exists()) {
return null;
}
OsmTrack track = new OsmTrack();
BufferedReader br = new BufferedReader(new InputStreamReader(new FileInputStream(f)));
for (; ; ) {
String line = br.readLine();
if (line == null)
break;
int idx0 = line.indexOf("<trkpt ");
if (idx0 >= 0) {
idx0 = line.indexOf(" lon=\"");
idx0 += 6;
int idx1 = line.indexOf('"', idx0);
int ilon = (int) ((Double.parseDouble(line.substring(idx0, idx1)) + 180.) * 1000000. + 0.5);
int idx2 = line.indexOf(" lat=\"");
if (idx2 < 0)
continue;
idx2 += 6;
int idx3 = line.indexOf('"', idx2);
int ilat = (int) ((Double.parseDouble(line.substring(idx2, idx3)) + 90.) * 1000000. + 0.5);
track.nodes.add(OsmPathElement.create(ilon, ilat, (short) 0, null));
}
}
br.close();
return track;
}
}


@@ -0,0 +1,246 @@
package btools.router;
import java.io.BufferedWriter;
import java.io.StringWriter;
import java.text.DecimalFormat;
import java.text.NumberFormat;
import java.util.List;
import java.util.Locale;
import btools.mapaccess.MatchedWaypoint;
import btools.util.StringUtils;
public class FormatJson extends Formatter {
public FormatJson(RoutingContext rc) {
super(rc);
}
@Override
public String format(OsmTrack t) {
int turnInstructionMode = t.voiceHints != null ? t.voiceHints.turnInstructionMode : 0;
StringBuilder sb = new StringBuilder(8192);
sb.append("{\n");
sb.append(" \"type\": \"FeatureCollection\",\n");
sb.append(" \"features\": [\n");
sb.append(" {\n");
sb.append(" \"type\": \"Feature\",\n");
sb.append(" \"properties\": {\n");
sb.append(" \"creator\": \"BRouter-" + t.version + "\",\n");
sb.append(" \"name\": \"").append(t.name).append("\",\n");
sb.append(" \"track-length\": \"").append(t.distance).append("\",\n");
sb.append(" \"filtered ascend\": \"").append(t.ascend).append("\",\n");
sb.append(" \"plain-ascend\": \"").append(t.plainAscend).append("\",\n");
sb.append(" \"total-time\": \"").append(t.getTotalSeconds()).append("\",\n");
sb.append(" \"total-energy\": \"").append(t.energy).append("\",\n");
sb.append(" \"cost\": \"").append(t.cost).append("\",\n");
if (t.voiceHints != null && !t.voiceHints.list.isEmpty()) {
sb.append(" \"voicehints\": [\n");
for (VoiceHint hint : t.voiceHints.list) {
sb.append(" [");
sb.append(hint.indexInTrack);
sb.append(',').append(hint.getJsonCommandIndex());
sb.append(',').append(hint.getExitNumber());
sb.append(',').append(hint.distanceToNext);
sb.append(',').append((int) hint.angle);
// geometry is not always included: it is longer and only needed for the comment style
if (turnInstructionMode == 4) { // comment style
sb.append(",\"").append(hint.formatGeometry()).append("\"");
}
sb.append("],\n");
}
sb.deleteCharAt(sb.lastIndexOf(","));
sb.append(" ],\n");
}
if (t.showSpeedProfile) { // set in profile
List<String> sp = t.aggregateSpeedProfile();
if (sp.size() > 0) {
sb.append(" \"speedprofile\": [\n");
for (int i = sp.size() - 1; i >= 0; i--) {
sb.append(" [").append(sp.get(i)).append(i > 0 ? "],\n" : "]\n");
}
sb.append(" ],\n");
}
}
// ... traditional message list
{
sb.append(" \"messages\": [\n");
sb.append(" [\"").append(MESSAGES_HEADER.replaceAll("\t", "\", \"")).append("\"],\n");
for (String m : t.aggregateMessages()) {
sb.append(" [\"").append(m.replaceAll("\t", "\", \"")).append("\"],\n");
}
sb.deleteCharAt(sb.lastIndexOf(","));
sb.append(" ],\n");
}
if (t.getTotalSeconds() > 0) {
sb.append(" \"times\": [");
DecimalFormat decimalFormat = (DecimalFormat) NumberFormat.getInstance(Locale.ENGLISH);
decimalFormat.applyPattern("0.###");
for (OsmPathElement n : t.nodes) {
sb.append(decimalFormat.format(n.getTime())).append(",");
}
sb.deleteCharAt(sb.lastIndexOf(","));
sb.append("]\n");
} else {
sb.deleteCharAt(sb.lastIndexOf(","));
}
sb.append(" },\n");
if (t.iternity != null) {
sb.append(" \"iternity\": [\n");
for (String s : t.iternity) {
sb.append(" \"").append(s).append("\",\n");
}
sb.deleteCharAt(sb.lastIndexOf(","));
sb.append(" ],\n");
}
sb.append(" \"geometry\": {\n");
sb.append(" \"type\": \"LineString\",\n");
sb.append(" \"coordinates\": [\n");
OsmPathElement nn = null;
for (OsmPathElement n : t.nodes) {
String sele = n.getSElev() == Short.MIN_VALUE ? "" : ", " + n.getElev();
if (t.showspeed) { // hack: show speed instead of elevation
double speed = 0;
if (nn != null) {
int dist = n.calcDistance(nn);
float dt = n.getTime() - nn.getTime();
if (dt != 0.f) {
speed = ((3.6f * dist) / dt + 0.5);
}
}
sele = ", " + (((int) (speed * 10)) / 10.f);
}
sb.append(" [").append(formatILon(n.getILon())).append(", ").append(formatILat(n.getILat()))
.append(sele).append("],\n");
nn = n;
}
sb.deleteCharAt(sb.lastIndexOf(","));
sb.append(" ]\n");
sb.append(" }\n");
if (t.exportWaypoints || !t.pois.isEmpty()) {
sb.append(" },\n");
for (int i = 0; i <= t.pois.size() - 1; i++) {
OsmNodeNamed poi = t.pois.get(i);
addFeature(sb, "poi", poi.name, poi.ilat, poi.ilon);
if (i < t.matchedWaypoints.size() - 1) {
sb.append(",");
}
sb.append(" \n");
}
if (t.exportWaypoints) {
for (int i = 0; i <= t.matchedWaypoints.size() - 1; i++) {
String type;
if (i == 0) {
type = "from";
} else if (i == t.matchedWaypoints.size() - 1) {
type = "to";
} else {
type = "via";
}
MatchedWaypoint wp = t.matchedWaypoints.get(i);
addFeature(sb, type, wp.name, wp.waypoint.ilat, wp.waypoint.ilon);
if (i < t.matchedWaypoints.size() - 1) {
sb.append(",");
}
sb.append(" \n");
}
}
} else {
sb.append(" }\n");
}
sb.append(" ]\n");
sb.append("}\n");
return sb.toString();
}
private void addFeature(StringBuilder sb, String type, String name, int ilat, int ilon) {
sb.append(" {\n");
sb.append(" \"type\": \"Feature\",\n");
sb.append(" \"properties\": {\n");
sb.append(" \"name\": \"" + StringUtils.escapeJson(name) + "\",\n");
sb.append(" \"type\": \"" + type + "\"\n");
sb.append(" },\n");
sb.append(" \"geometry\": {\n");
sb.append(" \"type\": \"Point\",\n");
sb.append(" \"coordinates\": [\n");
sb.append(" " + formatILon(ilon) + ",\n");
sb.append(" " + formatILat(ilat) + "\n");
sb.append(" ]\n");
sb.append(" }\n");
sb.append(" }");
}
public String formatAsWaypoint(OsmNodeNamed n) {
try {
StringWriter sw = new StringWriter(8192);
BufferedWriter bw = new BufferedWriter(sw);
addJsonHeader(bw);
addJsonFeature(bw, "info", "wpinfo", n.ilon, n.ilat, n.getElev(), (n.nodeDescription != null ? rc.expctxWay.getKeyValueDescription(false, n.nodeDescription) : null));
addJsonFooter(bw);
bw.close();
sw.close();
return sw.toString();
} catch (Exception e) {
throw new RuntimeException(e);
}
}
private void addJsonFeature(BufferedWriter sb, String type, String name, int ilon, int ilat, double elev, String desc) {
try {
sb.append(" {\n");
sb.append(" \"type\": \"Feature\",\n");
sb.append(" \"properties\": {\n");
sb.append(" \"creator\": \"BRouter-" + OsmTrack.version + "\",\n");
sb.append(" \"name\": \"" + StringUtils.escapeJson(name) + "\",\n");
sb.append(" \"type\": \"" + type + "\"");
if (desc != null) {
sb.append(",\n \"message\": \"" + desc + "\"\n");
} else {
sb.append("\n");
}
sb.append(" },\n");
sb.append(" \"geometry\": {\n");
sb.append(" \"type\": \"Point\",\n");
sb.append(" \"coordinates\": [\n");
sb.append(" " + formatILon(ilon) + ",\n");
sb.append(" " + formatILat(ilat) + ",\n");
sb.append(" " + elev + "\n");
sb.append(" ]\n");
sb.append(" }\n");
sb.append(" }\n");
} catch (Exception e) {
throw new RuntimeException(e);
}
}
private static void addJsonHeader(BufferedWriter sb) {
try {
sb.append("{\n");
sb.append(" \"type\": \"FeatureCollection\",\n");
sb.append(" \"features\": [\n");
} catch (Exception e) {
throw new RuntimeException(e);
}
}
private static void addJsonFooter(BufferedWriter sb) {
try {
sb.append(" ]\n");
sb.append("}\n");
} catch (Exception e) {
throw new RuntimeException(e);
}
}
}
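FormatJson builds its JSON arrays by appending a trailing comma per element and then stripping the last one with `sb.deleteCharAt(sb.lastIndexOf(","))`. A minimal sketch of that idiom (hypothetical helper class, not project code):

```java
// Hypothetical helper illustrating FormatJson's trailing-comma cleanup idiom:
// append "element," per item, then delete the final comma before closing the array.
public class CommaTrick {
  static String jsonArray(String[] values) {
    StringBuilder sb = new StringBuilder("[");
    for (String v : values) {
      sb.append("\"").append(v).append("\","); // one trailing comma per element
    }
    if (values.length > 0) {
      sb.deleteCharAt(sb.lastIndexOf(",")); // strip the comma after the last element
    }
    return sb.append("]").toString();
  }

  public static void main(String[] args) {
    System.out.println(jsonArray(new String[]{"a", "b", "c"})); // ["a","b","c"]
  }
}
```

Note the idiom relies on the last appended character being the separator comma, which is why FormatJson applies the `deleteCharAt` immediately after each append loop.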


@@ -0,0 +1,91 @@
package btools.router;
import java.util.List;
import btools.mapaccess.MatchedWaypoint;
import btools.util.StringUtils;
public class FormatKml extends Formatter {
public FormatKml(RoutingContext rc) {
super(rc);
}
@Override
public String format(OsmTrack t) {
StringBuilder sb = new StringBuilder(8192);
sb.append("<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n");
sb.append("<kml xmlns=\"http://earth.google.com/kml/2.0\">\n");
sb.append(" <Document>\n");
sb.append(" <name>KML Samples</name>\n");
sb.append(" <open>1</open>\n");
sb.append(" <distance>3.497064</distance>\n");
sb.append(" <traveltime>872</traveltime>\n");
sb.append(" <description>To enable simple instructions add: 'instructions=1' as parameter to the URL</description>\n");
sb.append(" <Folder>\n");
sb.append(" <name>Paths</name>\n");
sb.append(" <visibility>0</visibility>\n");
sb.append(" <description>Examples of paths.</description>\n");
sb.append(" <Placemark>\n");
sb.append(" <name>Tessellated</name>\n");
sb.append(" <visibility>0</visibility>\n");
sb.append(" <description><![CDATA[If the <tessellate> tag has a value of 1, the line will contour to the underlying terrain]]></description>\n");
sb.append(" <LineString>\n");
sb.append(" <tessellate>1</tessellate>\n");
sb.append(" <coordinates>");
for (OsmPathElement n : t.nodes) {
sb.append(formatILon(n.getILon())).append(",").append(formatILat(n.getILat())).append("\n");
}
sb.append(" </coordinates>\n");
sb.append(" </LineString>\n");
sb.append(" </Placemark>\n");
sb.append(" </Folder>\n");
if (t.exportWaypoints || !t.pois.isEmpty()) {
if (!t.pois.isEmpty()) {
sb.append(" <Folder>\n");
sb.append(" <name>poi</name>\n");
for (int i = 0; i < t.pois.size(); i++) {
OsmNodeNamed poi = t.pois.get(i);
createPlaceMark(sb, poi.name, poi.ilat, poi.ilon);
}
sb.append(" </Folder>\n");
}
if (t.exportWaypoints) {
int size = t.matchedWaypoints.size();
createFolder(sb, "start", t.matchedWaypoints.subList(0, 1));
if (t.matchedWaypoints.size() > 2) {
createFolder(sb, "via", t.matchedWaypoints.subList(1, size - 1));
}
createFolder(sb, "end", t.matchedWaypoints.subList(size - 1, size));
}
}
sb.append(" </Document>\n");
sb.append("</kml>\n");
return sb.toString();
}
private void createFolder(StringBuilder sb, String type, List<MatchedWaypoint> waypoints) {
sb.append(" <Folder>\n");
sb.append(" <name>" + type + "</name>\n");
for (int i = 0; i < waypoints.size(); i++) {
MatchedWaypoint wp = waypoints.get(i);
createPlaceMark(sb, wp.name, wp.waypoint.ilat, wp.waypoint.ilon);
}
sb.append(" </Folder>\n");
}
private void createPlaceMark(StringBuilder sb, String name, int ilat, int ilon) {
sb.append(" <Placemark>\n");
sb.append(" <name>" + StringUtils.escapeXml10(name) + "</name>\n");
sb.append(" <Point>\n");
sb.append(" <coordinates>" + formatILon(ilon) + "," + formatILat(ilat) + "</coordinates>\n");
sb.append(" </Point>\n");
sb.append(" </Placemark>\n");
}
}


@@ -0,0 +1,110 @@
package btools.router;
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;
import java.util.TimeZone;
public abstract class Formatter {
static final String MESSAGES_HEADER = "Longitude\tLatitude\tElevation\tDistance\tCostPerKm\tElevCost\tTurnCost\tNodeCost\tInitialCost\tWayTags\tNodeTags\tTime\tEnergy";
RoutingContext rc;
Formatter() {
}
Formatter(RoutingContext rc) {
this.rc = rc;
}
/**
* writes the track in gpx-format to a file
*
* @param filename the filename to write to
* @param t the track to write
*/
public void write(String filename, OsmTrack t) throws Exception {
BufferedWriter bw = new BufferedWriter(new FileWriter(filename));
bw.write(format(t));
bw.close();
}
public OsmTrack read(String filename) throws Exception {
return null;
}
/**
* writes the track in a selected output format to a string
*
* @param t the track to format
* @return the formatted string
*/
public abstract String format(OsmTrack t);
static String formatILon(int ilon) {
return formatPos(ilon - 180000000);
}
static String formatILat(int ilat) {
return formatPos(ilat - 90000000);
}
private static String formatPos(int p) {
boolean negative = p < 0;
if (negative)
p = -p;
char[] ac = new char[12];
int i = 11;
while (p != 0 || i > 3) {
ac[i--] = (char) ('0' + (p % 10));
p /= 10;
if (i == 5)
ac[i--] = '.';
}
if (negative)
ac[i--] = '-';
return new String(ac, i + 1, 11 - i);
}
public static String getFormattedTime2(int s) {
int seconds = (int) (s + 0.5);
int hours = seconds / 3600;
int minutes = (seconds - hours * 3600) / 60;
seconds = seconds - hours * 3600 - minutes * 60;
String time = "";
if (hours != 0)
time = "" + hours + "h ";
if (minutes != 0)
time = time + minutes + "m ";
if (seconds != 0)
time = time + seconds + "s";
return time;
}
static public String getFormattedEnergy(int energy) {
return format1(energy / 3600000.) + "kwh";
}
static private String format1(double n) {
String s = "" + (long) (n * 10 + 0.5);
int len = s.length();
return s.substring(0, len - 1) + "." + s.charAt(len - 1);
}
static final String dateformat = "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'";
static public String getFormattedTime3(float time) {
SimpleDateFormat TIMESTAMP_FORMAT = new SimpleDateFormat(dateformat, Locale.US);
TIMESTAMP_FORMAT.setTimeZone(TimeZone.getTimeZone("UTC"));
// yyyy-mm-ddThh:mm:ss.SSSZ
Date d = new Date((long) (time * 1000f));
return TIMESTAMP_FORMAT.format(d);
}
}
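`formatPos` above prints the integer micro-degree coordinates (offset by 180°/90° so they are always non-negative) with exactly six decimal places, using only integer arithmetic. A standalone sketch of that conversion (the class name is hypothetical, the method bodies mirror the ones above):

```java
// Standalone sketch of the fixed-point coordinate formatting in Formatter.
// Coordinates are integer 1e-6 degrees, stored offset (lon + 180°, lat + 90°).
public class CoordFormatDemo {
  static String formatILon(int ilon) {
    return formatPos(ilon - 180000000);
  }

  static String formatILat(int ilat) {
    return formatPos(ilat - 90000000);
  }

  // Emit the value with exactly six decimal places, without floating point.
  private static String formatPos(int p) {
    boolean negative = p < 0;
    if (negative) p = -p;
    char[] ac = new char[12];
    int i = 11;
    while (p != 0 || i > 3) {
      ac[i--] = (char) ('0' + (p % 10));
      p /= 10;
      if (i == 5) ac[i--] = '.'; // decimal point after six fraction digits
    }
    if (negative) ac[i--] = '-';
    return new String(ac, i + 1, 11 - i);
  }

  public static void main(String[] args) {
    // 180000000 is the internal representation of longitude 0°
    System.out.println(formatILon(180000000)); // 0.000000
    System.out.println(formatILon(192345678)); // 12.345678
    System.out.println(formatILat(38500000));  // -51.500000
  }
}
```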


@@ -11,15 +11,12 @@ import btools.expressions.BExpressionContextNode;
import btools.expressions.BExpressionContextWay;
final class KinematicModel extends OsmPathModel
{
public OsmPrePath createPrePath()
{
final class KinematicModel extends OsmPathModel {
public OsmPrePath createPrePath() {
return new KinematicPrePath();
}
public OsmPath createPath()
{
public OsmPath createPath() {
return new KinematicPath();
}
@@ -38,7 +35,7 @@ final class KinematicModel extends OsmPathModel
// derived values
public double pw; // balance power
public double cost0; // minimum possible cost per meter
private int wayIdxMaxspeed;
private int wayIdxMaxspeedExplicit;
private int wayIdxMinspeed;
@@ -47,7 +44,7 @@ final class KinematicModel extends OsmPathModel
protected BExpressionContextWay ctxWay;
protected BExpressionContextNode ctxNode;
protected Map<String,String> params;
protected Map<String, String> params;
private boolean initDone = false;
@@ -55,77 +52,67 @@ final class KinematicModel extends OsmPathModel
private double lastBreakingSpeed;
@Override
public void init( BExpressionContextWay expctxWay, BExpressionContextNode expctxNode, Map<String,String> extraParams )
{
if ( !initDone )
{
public void init(BExpressionContextWay expctxWay, BExpressionContextNode expctxNode, Map<String, String> extraParams) {
if (!initDone) {
ctxWay = expctxWay;
ctxNode = expctxNode;
wayIdxMaxspeed = ctxWay.getOutputVariableIndex( "maxspeed", false );
wayIdxMaxspeedExplicit = ctxWay.getOutputVariableIndex( "maxspeed_explicit", false );
wayIdxMinspeed = ctxWay.getOutputVariableIndex( "minspeed", false );
nodeIdxMaxspeed = ctxNode.getOutputVariableIndex( "maxspeed", false );
wayIdxMaxspeed = ctxWay.getOutputVariableIndex("maxspeed", false);
wayIdxMaxspeedExplicit = ctxWay.getOutputVariableIndex("maxspeed_explicit", false);
wayIdxMinspeed = ctxWay.getOutputVariableIndex("minspeed", false);
nodeIdxMaxspeed = ctxNode.getOutputVariableIndex("maxspeed", false);
initDone = true;
}
params = extraParams;
turnAngleDecayTime = getParam( "turnAngleDecayTime", 5.f );
f_roll = getParam( "f_roll", 232.f );
f_air = getParam( "f_air", 0.4f );
f_recup = getParam( "f_recup", 400.f );
p_standby = getParam( "p_standby", 250.f );
outside_temp = getParam( "outside_temp", 20.f );
recup_efficiency = getParam( "recup_efficiency", 0.7f );
totalweight = getParam( "totalweight", 1640.f );
vmax = getParam( "vmax", 80.f ) / 3.6;
leftWaySpeed = getParam( "leftWaySpeed", 12.f ) / 3.6;
rightWaySpeed = getParam( "rightWaySpeed", 12.f ) / 3.6;
turnAngleDecayTime = getParam("turnAngleDecayTime", 5.f);
f_roll = getParam("f_roll", 232.f);
f_air = getParam("f_air", 0.4f);
f_recup = getParam("f_recup", 400.f);
p_standby = getParam("p_standby", 250.f);
outside_temp = getParam("outside_temp", 20.f);
recup_efficiency = getParam("recup_efficiency", 0.7f);
totalweight = getParam("totalweight", 1640.f);
vmax = getParam("vmax", 80.f) / 3.6;
leftWaySpeed = getParam("leftWaySpeed", 12.f) / 3.6;
rightWaySpeed = getParam("rightWaySpeed", 12.f) / 3.6;
pw = 2. * f_air * vmax * vmax * vmax - p_standby;
cost0 = (pw+p_standby)/vmax + f_roll + f_air*vmax*vmax;
cost0 = (pw + p_standby) / vmax + f_roll + f_air * vmax * vmax;
}
protected float getParam( String name, float defaultValue )
{
String sval = params == null ? null : params.get( name );
if ( sval != null )
{
return Float.parseFloat( sval );
protected float getParam(String name, float defaultValue) {
String sval = params == null ? null : params.get(name);
if (sval != null) {
return Float.parseFloat(sval);
}
float v = ctxWay.getVariableValue( name, defaultValue );
if ( params != null )
{
params.put( name, "" + v );
float v = ctxWay.getVariableValue(name, defaultValue);
if (params != null) {
params.put(name, "" + v);
}
return v;
}
public float getWayMaxspeed()
{
return ctxWay.getBuildInVariable( wayIdxMaxspeed ) / 3.6f;
public float getWayMaxspeed() {
return ctxWay.getBuildInVariable(wayIdxMaxspeed) / 3.6f;
}
public float getWayMaxspeedExplicit()
{
return ctxWay.getBuildInVariable( wayIdxMaxspeedExplicit ) / 3.6f;
public float getWayMaxspeedExplicit() {
return ctxWay.getBuildInVariable(wayIdxMaxspeedExplicit) / 3.6f;
}
public float getWayMinspeed()
{
return ctxWay.getBuildInVariable( wayIdxMinspeed ) / 3.6f;
public float getWayMinspeed() {
return ctxWay.getBuildInVariable(wayIdxMinspeed) / 3.6f;
}
public float getNodeMaxspeed()
{
return ctxNode.getBuildInVariable( nodeIdxMaxspeed ) / 3.6f;
public float getNodeMaxspeed() {
return ctxNode.getBuildInVariable(nodeIdxMaxspeed) / 3.6f;
}
/**
* get the effective speed limit from the way-limit and vmax/vmin
*/
public double getEffectiveSpeedLimit( )
{
/**
* get the effective speed limit from the way-limit and vmax/vmin
*/
public double getEffectiveSpeedLimit() {
// performance related inline coding
double minspeed = getWayMinspeed();
double espeed = minspeed > vmax ? minspeed : vmax;
@@ -133,30 +120,27 @@ final class KinematicModel extends OsmPathModel
return maxspeed < espeed ? maxspeed : espeed;
}
/**
* get the breaking speed for current balance-power (pw) and effective speed limit (vl)
*/
public double getBreakingSpeed( double vl )
{
if ( vl == lastEffectiveLimit )
{
/**
* get the breaking speed for current balance-power (pw) and effective speed limit (vl)
*/
public double getBreakingSpeed(double vl) {
if (vl == lastEffectiveLimit) {
return lastBreakingSpeed;
}
double v = vl*0.8;
double pw2 = pw+p_standby;
double v = vl * 0.8;
double pw2 = pw + p_standby;
double e = recup_efficiency;
double x0 = pw2/vl+f_air*e*vl*vl+(1.-e)*f_roll;
for(int i=0;i<5;i++)
{
double v2 = v*v;
double x = pw2/v+f_air*e*v2 - x0;
double dx = 2.*e*f_air*v - pw2/v2;
v -= x/dx;
double x0 = pw2 / vl + f_air * e * vl * vl + (1. - e) * f_roll;
for (int i = 0; i < 5; i++) {
double v2 = v * v;
double x = pw2 / v + f_air * e * v2 - x0;
double dx = 2. * e * f_air * v - pw2 / v2;
v -= x / dx;
}
lastEffectiveLimit = vl;
lastBreakingSpeed = v;
return v;
}
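The five-step loop in `getBreakingSpeed` is a Newton iteration: it solves `pw2/v + f_air*e*v^2 = x0` for the braking speed `v`, starting from `0.8 * vl`. A minimal standalone sketch of that iteration (class name and the residual check are illustrative, parameter values mirror the profile defaults above):

```java
// Standalone sketch of the braking-speed Newton iteration:
// solve pw2/v + fAir*e*v^2 = x0 for v, starting from v = 0.8*vl.
public class BrakingSpeedDemo {
  static double breakingSpeed(double vl, double pw2, double fAir,
                              double fRoll, double e) {
    double v = vl * 0.8;
    double x0 = pw2 / vl + fAir * e * vl * vl + (1. - e) * fRoll;
    for (int i = 0; i < 5; i++) {
      double v2 = v * v;
      double x = pw2 / v + fAir * e * v2 - x0;   // residual f(v)
      double dx = 2. * e * fAir * v - pw2 / v2;  // derivative f'(v)
      v -= x / dx;                               // Newton step
    }
    return v;
  }

  public static void main(String[] args) {
    double vl = 80. / 3.6;                 // 80 km/h limit in m/s
    double fAir = 0.4, fRoll = 232., e = 0.7;
    double pw2 = 2. * fAir * vl * vl * vl; // pw + p_standby when vmax == vl
    double v = breakingSpeed(vl, pw2, fAir, fRoll, e);
    double residual = pw2 / v + fAir * e * v * v
        - (pw2 / vl + fAir * e * vl * vl + (1. - e) * fRoll);
    System.out.println("v = " + v + " m/s, residual = " + residual);
  }
}
```

With these defaults the iteration settles on a braking speed below the limit, and the residual shrinks toward zero after the five fixed steps.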


@@ -5,11 +5,7 @@
*/
package btools.router;
import btools.util.FastMath;
final class KinematicPath extends OsmPath
{
final class KinematicPath extends OsmPath {
private double ekin; // kinetic energy (Joule)
private double totalTime; // travel time (seconds)
private double totalEnergy; // total route energy (Joule)
@@ -17,20 +13,17 @@ final class KinematicPath extends OsmPath
private float floatingAngleRight; // sliding average right bend (degree)
@Override
protected void init( OsmPath orig )
{
KinematicPath origin = (KinematicPath)orig;
protected void init(OsmPath orig) {
KinematicPath origin = (KinematicPath) orig;
ekin = origin.ekin;
totalTime = origin.totalTime;
totalEnergy = origin.totalEnergy;
floatingAngleLeft = origin.floatingAngleLeft;
floatingAngleRight = origin.floatingAngleRight;
priorityclassifier = origin.priorityclassifier;
}
@Override
protected void resetState()
{
protected void resetState() {
ekin = 0.;
totalTime = 0.;
totalEnergy = 0.;
@@ -39,267 +32,233 @@ final class KinematicPath extends OsmPath
}
@Override
protected double processWaySection( RoutingContext rc, double dist, double delta_h, double elevation, double angle, double cosangle, boolean isStartpoint, int nsection, int lastpriorityclassifier )
{
KinematicModel km = (KinematicModel)rc.pm;
protected double processWaySection(RoutingContext rc, double dist, double delta_h, double elevation, double angle, double cosangle, boolean isStartpoint, int nsection, int lastpriorityclassifier) {
KinematicModel km = (KinematicModel) rc.pm;
double cost = 0.;
double extraTime = 0.;
if ( isStartpoint )
{
if (isStartpoint) {
// for forward direction, we start with target speed
if ( !rc.inverseDirection )
{
extraTime = 0.5 * (1. - cosangle ) * 40.; // 40 seconds turn penalty
if (!rc.inverseDirection) {
extraTime = 0.5 * (1. - cosangle) * 40.; // 40 seconds turn penalty
}
}
else
{
} else {
double turnspeed = 999.; // just high
if ( km.turnAngleDecayTime != 0. ) // process turn-angle slowdown
{
if ( angle < 0 ) floatingAngleLeft -= (float)angle;
else floatingAngleRight += (float)angle;
float aa = Math.max( floatingAngleLeft, floatingAngleRight );
if (km.turnAngleDecayTime != 0.) { // process turn-angle slowdown
if (angle < 0) floatingAngleLeft -= (float) angle;
else floatingAngleRight += (float) angle;
float aa = Math.max(floatingAngleLeft, floatingAngleRight);
double curveSpeed = aa > 10. ? 200. / aa : 20.;
double distanceTime = dist / curveSpeed;
double decayFactor = FastMath.exp( - distanceTime / km.turnAngleDecayTime );
floatingAngleLeft = (float)( floatingAngleLeft * decayFactor );
floatingAngleRight = (float)( floatingAngleRight * decayFactor );
double decayFactor = Math.exp(-distanceTime / km.turnAngleDecayTime);
floatingAngleLeft = (float) (floatingAngleLeft * decayFactor);
floatingAngleRight = (float) (floatingAngleRight * decayFactor);
if ( curveSpeed < 20. )
{
if (curveSpeed < 20.) {
turnspeed = curveSpeed;
}
}
if ( nsection == 0 ) // process slowdown by crossing geometry
{
if (nsection == 0) { // process slowdown by crossing geometry
double junctionspeed = 999.; // just high
int classifiermask = (int)rc.expctxWay.getClassifierMask();
int classifiermask = (int) rc.expctxWay.getClassifierMask();
// penalty for equal priority crossing
boolean hasLeftWay = false;
boolean hasRightWay = false;
boolean hasResidential = false;
for( OsmPrePath prePath = rc.firstPrePath; prePath != null; prePath = prePath.next )
{
KinematicPrePath pp = (KinematicPrePath)prePath;
for (OsmPrePath prePath = rc.firstPrePath; prePath != null; prePath = prePath.next) {
KinematicPrePath pp = (KinematicPrePath) prePath;
if ( ( (pp.classifiermask ^ classifiermask) & 8 ) != 0 ) // exactly one is linktype
{
if (((pp.classifiermask ^ classifiermask) & 8) != 0) { // exactly one is linktype
continue;
}
if ( ( pp.classifiermask & 32 ) != 0 ) // touching a residential?
{
if ((pp.classifiermask & 32) != 0) { // touching a residential?
hasResidential = true;
}
if ( pp.priorityclassifier > priorityclassifier || pp.priorityclassifier == priorityclassifier && priorityclassifier < 20 )
{
if (pp.priorityclassifier > priorityclassifier || pp.priorityclassifier == priorityclassifier && priorityclassifier < 20) {
double diff = pp.angle - angle;
if ( diff < -40. && diff > -140.) hasLeftWay = true;
if ( diff > 40. && diff < 140. ) hasRightWay = true;
if (diff < -40. && diff > -140.) hasLeftWay = true;
if (diff > 40. && diff < 140.) hasRightWay = true;
}
}
double residentialSpeed = 13.;
if ( hasLeftWay && junctionspeed > km.leftWaySpeed ) junctionspeed = km.leftWaySpeed;
if ( hasRightWay && junctionspeed > km.rightWaySpeed ) junctionspeed = km.rightWaySpeed;
if ( hasResidential && junctionspeed > residentialSpeed ) junctionspeed = residentialSpeed;
if (hasLeftWay && junctionspeed > km.leftWaySpeed) junctionspeed = km.leftWaySpeed;
if (hasRightWay && junctionspeed > km.rightWaySpeed) junctionspeed = km.rightWaySpeed;
if (hasResidential && junctionspeed > residentialSpeed) junctionspeed = residentialSpeed;
if ( (lastpriorityclassifier < 20) ^ (priorityclassifier < 20) )
{
if ((lastpriorityclassifier < 20) ^ (priorityclassifier < 20)) {
extraTime += 10.;
junctionspeed = 0; // full stop for entering or leaving road network
}
if ( lastpriorityclassifier != priorityclassifier && (classifiermask & 8) != 0 )
{
if (lastpriorityclassifier != priorityclassifier && (classifiermask & 8) != 0) {
extraTime += 2.; // two seconds for entering a link-type
}
turnspeed = turnspeed > junctionspeed ? junctionspeed : turnspeed;
if ( message != null )
{
message.vnode0 = (int) ( junctionspeed * 3.6 + 0.5 );
if (message != null) {
message.vnode0 = (int) (junctionspeed * 3.6 + 0.5);
}
}
cutEkin( km.totalweight, turnspeed ); // apply turnspeed
cutEkin(km.totalweight, turnspeed); // apply turnspeed
}
// linear temperature correction
double tcorr = (20.-km.outside_temp)*0.0035;
double tcorr = (20. - km.outside_temp) * 0.0035;
// air_pressure down 1mb/8m
double ecorr = 0.0001375 * (elevation - 100.);
double f_air = km.f_air * ( 1. + tcorr - ecorr );
double f_air = km.f_air * (1. + tcorr - ecorr);
double distanceCost = evolveDistance( km, dist, delta_h, f_air );
double distanceCost = evolveDistance(km, dist, delta_h, f_air);
if ( message != null )
{
message.costfactor = (float)(distanceCost/dist);
message.vmax = (int) ( km.getWayMaxspeed() * 3.6 + 0.5 );
message.vmaxExplicit = (int) ( km.getWayMaxspeedExplicit() * 3.6 + 0.5 );
message.vmin = (int) ( km.getWayMinspeed() * 3.6 + 0.5 );
message.extraTime = (int)(extraTime*1000);
if (message != null) {
message.costfactor = (float) (distanceCost / dist);
message.vmax = (int) (km.getWayMaxspeed() * 3.6 + 0.5);
message.vmaxExplicit = (int) (km.getWayMaxspeedExplicit() * 3.6 + 0.5);
message.vmin = (int) (km.getWayMinspeed() * 3.6 + 0.5);
message.extraTime = (int) (extraTime * 1000);
}
cost += extraTime * km.pw / km.cost0;
totalTime += extraTime;
return cost + distanceCost;
}
protected double evolveDistance( KinematicModel km, double dist, double delta_h, double f_air )
{
protected double evolveDistance(KinematicModel km, double dist, double delta_h, double f_air) {
// elevation force
double fh = delta_h * km.totalweight * 9.81 / dist;
double effectiveSpeedLimit = km.getEffectiveSpeedLimit();
double emax = 0.5*km.totalweight*effectiveSpeedLimit*effectiveSpeedLimit;
if ( emax <= 0. )
{
double emax = 0.5 * km.totalweight * effectiveSpeedLimit * effectiveSpeedLimit;
if (emax <= 0.) {
return -1.;
}
double vb = km.getBreakingSpeed( effectiveSpeedLimit );
double elow = 0.5*km.totalweight*vb*vb;
double vb = km.getBreakingSpeed(effectiveSpeedLimit);
double elow = 0.5 * km.totalweight * vb * vb;
double elapsedTime = 0.;
double dissipatedEnergy = 0.;
double v = Math.sqrt( 2. * ekin / km.totalweight );
double v = Math.sqrt(2. * ekin / km.totalweight);
double d = dist;
while( d > 0. )
{
while (d > 0.) {
boolean slow = ekin < elow;
boolean fast = ekin >= emax;
double etarget = slow ? elow : emax;
double f = km.f_roll + f_air*v*v + fh;
double f_recup = Math.max( 0., fast ? -f : (slow ? km.f_recup :0 ) -fh ); // additional recup for slow part
double f = km.f_roll + f_air * v * v + fh;
double f_recup = Math.max(0., fast ? -f : (slow ? km.f_recup : 0) - fh); // additional recup for slow part
f += f_recup;
double delta_ekin;
double timeStep;
double x;
if ( fast )
{
if (fast) {
x = d;
delta_ekin = x*f;
timeStep = x/v;
delta_ekin = x * f;
timeStep = x / v;
ekin = etarget;
}
else
{
delta_ekin = etarget-ekin;
double b = 2.*f_air / km.totalweight;
double x0 = delta_ekin/f;
double x0b = x0*b;
x = x0*(1. - x0b*(0.5 + x0b*(0.333333333-x0b*0.25 ) ) ); // = ln( delta_ekin*b/f + 1.) / b;
double maxstep = Math.min( 50., d );
if ( x >= maxstep )
{
} else {
delta_ekin = etarget - ekin;
double b = 2. * f_air / km.totalweight;
double x0 = delta_ekin / f;
double x0b = x0 * b;
x = x0 * (1. - x0b * (0.5 + x0b * (0.333333333 - x0b * 0.25))); // = ln( delta_ekin*b/f + 1.) / b;
double maxstep = Math.min(50., d);
if (x >= maxstep) {
x = maxstep;
double xb = x*b;
delta_ekin = x*f*(1.+xb*(0.5+xb*(0.166666667+xb*0.0416666667 ) ) ); // = f/b* exp(xb-1)
double xb = x * b;
delta_ekin = x * f * (1. + xb * (0.5 + xb * (0.166666667 + xb * 0.0416666667))); // = f/b* exp(xb-1)
ekin += delta_ekin;
}
else
{
} else {
ekin = etarget;
}
double v2 = Math.sqrt( 2. * ekin / km.totalweight );
double v2 = Math.sqrt(2. * ekin / km.totalweight);
double a = f / km.totalweight; // TODO: average force?
timeStep = (v2-v)/a;
timeStep = (v2 - v) / a;
v = v2;
}
d -= x;
elapsedTime += timeStep;
// dissipated energy does not contain elevation and efficient recup
dissipatedEnergy += delta_ekin - x*(fh + f_recup*km.recup_efficiency);
dissipatedEnergy += delta_ekin - x * (fh + f_recup * km.recup_efficiency);
// correction: inefficient recup going into heating is half efficient
double ieRecup = x*f_recup*(1.-km.recup_efficiency);
double eaux = timeStep*km.p_standby;
dissipatedEnergy -= Math.max( ieRecup, eaux ) * 0.5;
double ieRecup = x * f_recup * (1. - km.recup_efficiency);
double eaux = timeStep * km.p_standby;
dissipatedEnergy -= Math.max(ieRecup, eaux) * 0.5;
}
dissipatedEnergy += elapsedTime * km.p_standby;
totalTime += elapsedTime;
totalEnergy += dissipatedEnergy + dist*fh;
totalEnergy += dissipatedEnergy + dist * fh;
return (km.pw * elapsedTime + dissipatedEnergy)/km.cost0; // =cost
return (km.pw * elapsedTime + dissipatedEnergy) / km.cost0; // =cost
}
@Override
protected double processTargetNode( RoutingContext rc )
{
KinematicModel km = (KinematicModel)rc.pm;
protected double processTargetNode(RoutingContext rc) {
KinematicModel km = (KinematicModel) rc.pm;
// finally add node-costs for target node
if ( targetNode.nodeDescription != null )
{
rc.expctxNode.evaluate( false , targetNode.nodeDescription );
if (targetNode.nodeDescription != null) {
rc.expctxNode.evaluate(false, targetNode.nodeDescription);
float initialcost = rc.expctxNode.getInitialcost();
if ( initialcost >= 1000000. )
{
if (initialcost >= 1000000.) {
return -1.;
}
cutEkin( km.totalweight, km.getNodeMaxspeed() ); // apply node maxspeed
cutEkin(km.totalweight, km.getNodeMaxspeed()); // apply node maxspeed
if ( message != null )
{
message.linknodecost += (int)initialcost;
message.nodeKeyValues = rc.expctxNode.getKeyValueDescription( false, targetNode.nodeDescription );
if (message != null) {
message.linknodecost += (int) initialcost;
message.nodeKeyValues = rc.expctxNode.getKeyValueDescription(false, targetNode.nodeDescription);
message.vnode1 = (int) ( km.getNodeMaxspeed() * 3.6 + 0.5 );
message.vnode1 = (int) (km.getNodeMaxspeed() * 3.6 + 0.5);
}
return initialcost;
}
return 0.;
}
private void cutEkin( double weight, double speed )
{
double e = 0.5*weight*speed*speed;
if ( ekin > e ) ekin = e;
private void cutEkin(double weight, double speed) {
double e = 0.5 * weight * speed * speed;
if (ekin > e) ekin = e;
}
@Override
public int elevationCorrection( RoutingContext rc )
{
public int elevationCorrection() {
return 0;
}
@Override
public boolean definitlyWorseThan( OsmPath path, RoutingContext rc )
{
KinematicPath p = (KinematicPath)path;
public boolean definitlyWorseThan(OsmPath path) {
KinematicPath p = (KinematicPath) path;
int c = p.cost;
return cost > c + 100;
}
@Override
public double getTotalTime()
{
public double getTotalTime() {
return totalTime;
}
@Override
public double getTotalEnergy()
{
public double getTotalEnergy() {
return totalEnergy;
}
}
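The `cutEkin` helper above turns a speed limit into an energy cap: a limit `v` corresponds to `e = 0.5 * m * v^2`, and the path's kinetic energy is clamped to it. A small sketch of the same clamp (class and method names are illustrative; the weight is the model's default `totalweight`):

```java
// Sketch of the kinetic-energy cap applied by cutEkin: a speed limit v
// becomes an energy cap e = 0.5 * m * v^2, and ekin is clamped to it.
public class CutEkinDemo {
  static double cutEkin(double ekin, double weight, double speed) {
    double e = 0.5 * weight * speed * speed;
    return ekin > e ? e : ekin;
  }

  public static void main(String[] args) {
    double weight = 1640.;   // default totalweight (kg)
    double v = 50. / 3.6;    // 50 km/h junction speed in m/s
    double cap = 0.5 * weight * v * v;
    // a path arriving faster than the limit is clamped to the cap
    System.out.println(cutEkin(2.0 * cap, weight, v) == cap);        // true
    // a slower path keeps its energy unchanged
    System.out.println(cutEkin(0.5 * cap, weight, v) == 0.5 * cap);  // true
  }
}
```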


@@ -8,16 +8,14 @@ package btools.router;
import btools.mapaccess.OsmNode;
import btools.mapaccess.OsmTransferNode;
final class KinematicPrePath extends OsmPrePath
{
final class KinematicPrePath extends OsmPrePath {
public double angle;
public int priorityclassifier;
public int classifiermask;
protected void initPrePath(OsmPath origin, RoutingContext rc )
{
protected void initPrePath(OsmPath origin, RoutingContext rc) {
byte[] description = link.descriptionBitmap;
if ( description == null ) throw new IllegalArgumentException( "null description for: " + link );
if (description == null) throw new IllegalArgumentException("null description for: " + link);
// extract the 3 positions of the first section
int lon0 = origin.originLon;
@@ -27,32 +25,29 @@ final class KinematicPrePath extends OsmPrePath
int lon1 = p1.getILon();
int lat1 = p1.getILat();
boolean isReverse = link.isReverse( sourceNode );
boolean isReverse = link.isReverse(sourceNode);
// evaluate the way tags
rc.expctxWay.evaluate( rc.inverseDirection ^ isReverse, description );
rc.expctxWay.evaluate(rc.inverseDirection ^ isReverse, description);
OsmTransferNode transferNode = link.geometry == null ? null
: rc.geometryDecoder.decodeGeometry( link.geometry, p1, targetNode, isReverse );
: rc.geometryDecoder.decodeGeometry(link.geometry, p1, targetNode, isReverse);
int lon2;
int lat2;
if ( transferNode == null )
{
if (transferNode == null) {
lon2 = targetNode.ilon;
lat2 = targetNode.ilat;
}
else
{
} else {
lon2 = transferNode.ilon;
lat2 = transferNode.ilat;
}
int dist = rc.calcDistance( lon1, lat1, lon2, lat2 );
int dist = rc.calcDistance(lon1, lat1, lon2, lat2);
angle = rc.anglemeter.calcAngle( lon0, lat0, lon1, lat1, lon2, lat2 );
priorityclassifier = (int)rc.expctxWay.getPriorityClassifier();
classifiermask = (int)rc.expctxWay.getClassifierMask();
angle = rc.anglemeter.calcAngle(lon0, lat0, lon1, lat1, lon2, lat2);
priorityclassifier = (int) rc.expctxWay.getPriorityClassifier();
classifiermask = (int) rc.expctxWay.getClassifierMask();
}
}


@@ -6,15 +6,13 @@
package btools.router;
final class MessageData implements Cloneable
{
final class MessageData implements Cloneable {
int linkdist = 0;
int linkelevationcost = 0;
int linkturncost = 0;
int linknodecost = 0;
int linkinitcost = 0;
float costfactor;
int priorityclassifier;
int classifiermask;
@@ -25,7 +23,7 @@ final class MessageData implements Cloneable
int lon;
int lat;
short ele;
float time;
float energy;
@@ -37,84 +35,70 @@ final class MessageData implements Cloneable
int vnode1 = 999;
int extraTime = 0;
String toMessage()
{
if ( wayKeyValues == null )
{
String toMessage() {
if (wayKeyValues == null) {
return null;
}
int iCost = (int)(costfactor*1000 + 0.5f);
return (lon-180000000) + "\t"
+ (lat-90000000) + "\t"
+ ele/4 + "\t"
+ linkdist + "\t"
+ iCost + "\t"
+ linkelevationcost
+ "\t" + linkturncost
+ "\t" + linknodecost
+ "\t" + linkinitcost
+ "\t" + wayKeyValues
+ "\t" + ( nodeKeyValues == null ? "" : nodeKeyValues )
+ "\t" + ((int)time)
+ "\t" + ((int)energy);
int iCost = (int) (costfactor * 1000 + 0.5f);
return (lon - 180000000) + "\t"
+ (lat - 90000000) + "\t"
+ ele / 4 + "\t"
+ linkdist + "\t"
+ iCost + "\t"
+ linkelevationcost
+ "\t" + linkturncost
+ "\t" + linknodecost
+ "\t" + linkinitcost
+ "\t" + wayKeyValues
+ "\t" + (nodeKeyValues == null ? "" : nodeKeyValues)
+ "\t" + ((int) time)
+ "\t" + ((int) energy);
}
void add( MessageData d )
{
void add(MessageData d) {
linkdist += d.linkdist;
linkelevationcost += d.linkelevationcost;
linkturncost += d.linkturncost;
linknodecost += d.linknodecost;
linkinitcost+= d.linkinitcost;
linkinitcost += d.linkinitcost;
}
MessageData copy()
{
try
{
return (MessageData)clone();
}
catch( CloneNotSupportedException e )
{
throw new RuntimeException( e );
MessageData copy() {
try {
return (MessageData) clone();
} catch (CloneNotSupportedException e) {
throw new RuntimeException(e);
}
}
@Override
public String toString()
{
public String toString() {
return "dist=" + linkdist + " prio=" + priorityclassifier + " turn=" + turnangle;
}
public int getPrio()
{
public int getPrio() {
return priorityclassifier;
}
public boolean isBadOneway()
{
return ( classifiermask & 1 ) != 0;
public boolean isBadOneway() {
return (classifiermask & 1) != 0;
}
public boolean isGoodOneway()
{
return ( classifiermask & 2 ) != 0;
public boolean isGoodOneway() {
return (classifiermask & 2) != 0;
}
public boolean isRoundabout()
{
return ( classifiermask & 4 ) != 0;
public boolean isRoundabout() {
return (classifiermask & 4) != 0;
}
public boolean isLinktType()
{
return ( classifiermask & 8 ) != 0;
public boolean isLinktType() {
return (classifiermask & 8) != 0;
}
public boolean isGoodForCars()
{
return ( classifiermask & 16 ) != 0;
public boolean isGoodForCars() {
return (classifiermask & 16) != 0;
}
}


@@ -1,103 +1,99 @@
/**
* Container for an osm node
*
* @author ab
*/
package btools.router;
import btools.mapaccess.OsmNode;
import btools.util.CheapRuler;
public class OsmNodeNamed extends OsmNode
{
public String name;
public double radius; // radius of nogopoint (in meters)
public double nogoWeight; // weight for nogopoint
public boolean isNogo = false;
public OsmNodeNamed()
{
}
public OsmNodeNamed( OsmNode n)
{
super( n.ilon, n.ilat );
}
@Override
public String toString()
{
if ( Double.isNaN(nogoWeight ) ) {
return ilon + "," + ilat + "," + name;
} else {
return ilon + "," + ilat + "," + name + "," + nogoWeight;
}
}
public double distanceWithinRadius(int lon1, int lat1, int lon2, int lat2, double totalSegmentLength) {
double[] lonlat2m = CheapRuler.getLonLatToMeterScales( (lat1 + lat2) >> 1 );
boolean isFirstPointWithinCircle = CheapRuler.distance(lon1, lat1, ilon, ilat) < radius;
boolean isLastPointWithinCircle = CheapRuler.distance(lon2, lat2, ilon, ilat) < radius;
// First point is within the circle
if (isFirstPointWithinCircle) {
// Last point is within the circle
if (isLastPointWithinCircle) {
return totalSegmentLength;
}
// Last point is not within the circle
// Just swap points and go on with the first point not within the
// circle now.
// Swap longitudes
int tmp = lon2;
lon2 = lon1;
lon1 = tmp;
// Swap latitudes
tmp = lat2;
lat2 = lat1;
lat1 = tmp;
// Fix boolean values
isLastPointWithinCircle = isFirstPointWithinCircle;
isFirstPointWithinCircle = false;
}
// Distance between the initial point and projection of center of
// the circle on the current segment.
double initialToProject = (
(lon2 - lon1) * (ilon - lon1) * lonlat2m[0] * lonlat2m[0]
+ (lat2 - lat1) * (ilat - lat1) * lonlat2m[1] * lonlat2m[1]
) / totalSegmentLength;
// Distance between the initial point and the center of the circle.
double initialToCenter = CheapRuler.distance(ilon, ilat, lon1, lat1);
// Half length of the segment within the circle
double halfDistanceWithin = Math.sqrt(
radius*radius - (
initialToCenter*initialToCenter -
initialToProject*initialToProject
)
);
// Last point is within the circle
if (isLastPointWithinCircle) {
return halfDistanceWithin + (totalSegmentLength - initialToProject);
}
return 2 * halfDistanceWithin;
}
public static OsmNodeNamed decodeNogo( String s )
{
OsmNodeNamed n = new OsmNodeNamed();
int idx1 = s.indexOf( ',' );
n.ilon = Integer.parseInt( s.substring( 0, idx1 ) );
int idx2 = s.indexOf( ',', idx1+1 );
n.ilat = Integer.parseInt( s.substring( idx1+1, idx2 ) );
int idx3 = s.indexOf( ',', idx2+1 );
if ( idx3 == -1) {
n.name = s.substring( idx2 + 1 );
n.nogoWeight = Double.NaN;
} else {
n.name = s.substring( idx2+1, idx3 );
n.nogoWeight = Double.parseDouble( s.substring( idx3 + 1 ) );
}
n.isNogo = true;
return n;
}
}
/**
* Container for an osm node
*
* @author ab
*/
package btools.router;
import btools.mapaccess.OsmNode;
import btools.util.CheapRuler;
public class OsmNodeNamed extends OsmNode {
public String name;
public double radius; // radius of nogopoint (in meters)
public double nogoWeight; // weight for nogopoint
public boolean isNogo = false;
public boolean direct = false; // mark direct routing
public OsmNodeNamed() {
}
public OsmNodeNamed(OsmNode n) {
super(n.ilon, n.ilat);
}
@Override
public String toString() {
if (Double.isNaN(nogoWeight)) {
return ilon + "," + ilat + "," + name;
} else {
return ilon + "," + ilat + "," + name + "," + nogoWeight;
}
}
public double distanceWithinRadius(int lon1, int lat1, int lon2, int lat2, double totalSegmentLength) {
double[] lonlat2m = CheapRuler.getLonLatToMeterScales((lat1 + lat2) >> 1);
boolean isFirstPointWithinCircle = CheapRuler.distance(lon1, lat1, ilon, ilat) < radius;
boolean isLastPointWithinCircle = CheapRuler.distance(lon2, lat2, ilon, ilat) < radius;
// First point is within the circle
if (isFirstPointWithinCircle) {
// Last point is within the circle
if (isLastPointWithinCircle) {
return totalSegmentLength;
}
// Last point is not within the circle
// Just swap points and go on with the first point not within the
// circle now.
// Swap longitudes
int tmp = lon2;
lon2 = lon1;
lon1 = tmp;
// Swap latitudes
tmp = lat2;
lat2 = lat1;
lat1 = tmp;
// Fix boolean values
isLastPointWithinCircle = isFirstPointWithinCircle;
isFirstPointWithinCircle = false;
}
// Distance between the initial point and projection of center of
// the circle on the current segment.
double initialToProject = (
(lon2 - lon1) * (ilon - lon1) * lonlat2m[0] * lonlat2m[0]
+ (lat2 - lat1) * (ilat - lat1) * lonlat2m[1] * lonlat2m[1]
) / totalSegmentLength;
// Distance between the initial point and the center of the circle.
double initialToCenter = CheapRuler.distance(ilon, ilat, lon1, lat1);
// Half length of the segment within the circle
double halfDistanceWithin = Math.sqrt(
radius * radius - (
initialToCenter * initialToCenter -
initialToProject * initialToProject
)
);
// Last point is within the circle
if (isLastPointWithinCircle) {
return halfDistanceWithin + (totalSegmentLength - initialToProject);
}
return 2 * halfDistanceWithin;
}
public static OsmNodeNamed decodeNogo(String s) {
OsmNodeNamed n = new OsmNodeNamed();
int idx1 = s.indexOf(',');
n.ilon = Integer.parseInt(s.substring(0, idx1));
int idx2 = s.indexOf(',', idx1 + 1);
n.ilat = Integer.parseInt(s.substring(idx1 + 1, idx2));
int idx3 = s.indexOf(',', idx2 + 1);
if (idx3 == -1) {
n.name = s.substring(idx2 + 1);
n.nogoWeight = Double.NaN;
} else {
n.name = s.substring(idx2 + 1, idx3);
n.nogoWeight = Double.parseDouble(s.substring(idx3 + 1));
}
n.isNogo = true;
return n;
}
}
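`decodeNogo` parses the comma-separated nogo-point format `"ilon,ilat,name"` or `"ilon,ilat,name,weight"`, with a missing weight stored as `NaN`. A self-contained sketch of the same parsing logic (class name and the static fields are illustrative; only the index arithmetic mirrors the method above):

```java
// Sketch of the nogo-point string format parsed by decodeNogo:
// "ilon,ilat,name" or "ilon,ilat,name,weight"; absent weight -> NaN.
public class NogoParseDemo {
  static int ilon, ilat;
  static String name;
  static double nogoWeight;

  static void decode(String s) {
    int idx1 = s.indexOf(',');
    ilon = Integer.parseInt(s.substring(0, idx1));
    int idx2 = s.indexOf(',', idx1 + 1);
    ilat = Integer.parseInt(s.substring(idx1 + 1, idx2));
    int idx3 = s.indexOf(',', idx2 + 1);
    if (idx3 == -1) {
      // no fourth field: the rest of the string is the name
      name = s.substring(idx2 + 1);
      nogoWeight = Double.NaN;
    } else {
      name = s.substring(idx2 + 1, idx3);
      nogoWeight = Double.parseDouble(s.substring(idx3 + 1));
    }
  }

  public static void main(String[] args) {
    decode("185000000,95000000,nogo500 roadblock");
    System.out.println(name + " " + Double.isNaN(nogoWeight)); // nogo500 roadblock true
    decode("185000000,95000000,nogo500,9000");
    System.out.println(name + " " + nogoWeight); // nogo500 9000.0
  }
}
```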

File diff suppressed because it is too large


@@ -1,524 +1,430 @@
/**
* Container for link between two Osm nodes
*
* @author ab
*/
package btools.router;
import java.io.IOException;
import btools.mapaccess.OsmLink;
import btools.mapaccess.OsmLinkHolder;
import btools.mapaccess.OsmNode;
import btools.mapaccess.OsmTransferNode;
import btools.mapaccess.TurnRestriction;
import btools.util.CheapRuler;
abstract class OsmPath implements OsmLinkHolder
{
/**
* The cost of that path (a modified distance)
*/
public int cost = 0;
// the elevation assumed for this path; may hold a value
// even if the corresponding node has none
public short selev;
public int airdistance = 0; // distance to endpos
protected OsmNode sourceNode;
protected OsmNode targetNode;
protected OsmLink link;
public OsmPathElement originElement;
public OsmPathElement myElement;
protected float traffic;
private OsmLinkHolder nextForLink = null;
public int treedepth = 0;
// the position of the waypoint just before
// this path position (for angle calculation)
public int originLon;
public int originLat;
// the classifier of the segment just before this path's position
protected float lastClassifier;
protected float lastInitialCost;
protected int priorityclassifier;
private static final int PATH_START_BIT = 1;
private static final int CAN_LEAVE_DESTINATION_BIT = 2;
private static final int IS_ON_DESTINATION_BIT = 4;
private static final int HAD_DESTINATION_START_BIT = 8;
protected int bitfield = PATH_START_BIT;
private boolean getBit( int mask )
{
return (bitfield & mask ) != 0;
}
private void setBit( int mask, boolean bit )
{
if ( getBit( mask ) != bit )
{
bitfield ^= mask;
}
}
public boolean didEnterDestinationArea()
{
return !getBit( HAD_DESTINATION_START_BIT ) && getBit( IS_ON_DESTINATION_BIT );
}
public MessageData message;
public void unregisterUpTree( RoutingContext rc )
{
try
{
OsmPathElement pe = originElement;
while( pe instanceof OsmPathElementWithTraffic && ((OsmPathElementWithTraffic)pe).unregister(rc) )
{
pe = pe.origin;
}
}
catch( IOException ioe )
{
throw new RuntimeException( ioe );
}
}
public void registerUpTree()
{
if ( originElement instanceof OsmPathElementWithTraffic )
{
OsmPathElementWithTraffic ot = (OsmPathElementWithTraffic)originElement;
ot.register();
ot.addTraffic( traffic );
}
}
public void init( OsmLink link )
{
this.link = link;
targetNode = link.getTarget( null );
selev = targetNode.getSElev();
originLon = -1;
originLat = -1;
}
public void init( OsmPath origin, OsmLink link, OsmTrack refTrack, boolean detailMode, RoutingContext rc )
{
if ( origin.myElement == null )
{
origin.myElement = OsmPathElement.create( origin, rc.countTraffic );
}
this.originElement = origin.myElement;
this.link = link;
this.sourceNode = origin.targetNode;
this.targetNode = link.getTarget( sourceNode );
this.cost = origin.cost;
this.lastClassifier = origin.lastClassifier;
this.lastInitialCost = origin.lastInitialCost;
this.bitfield = origin.bitfield;
init( origin );
addAddionalPenalty(refTrack, detailMode, origin, link, rc );
}
protected abstract void init( OsmPath orig );
protected abstract void resetState();
protected void addAddionalPenalty(OsmTrack refTrack, boolean detailMode, OsmPath origin, OsmLink link, RoutingContext rc )
{
byte[] description = link.descriptionBitmap;
if ( description == null )
{
return; // could be a beeline path
}
boolean recordTransferNodes = detailMode || rc.countTraffic;
rc.nogoCost = 0.;
// extract the 3 positions of the first section
int lon0 = origin.originLon;
int lat0 = origin.originLat;
int lon1 = sourceNode.getILon();
int lat1 = sourceNode.getILat();
short ele1 = origin.selev;
int linkdisttotal = 0;
message = detailMode ? new MessageData() : null;
boolean isReverse = link.isReverse( sourceNode );
// evaluate the way tags
rc.expctxWay.evaluate( rc.inverseDirection ^ isReverse, description );
// calculate the costfactor inputs
float costfactor = rc.expctxWay.getCostfactor();
boolean isTrafficBackbone = cost == 0 && rc.expctxWay.getIsTrafficBackbone() > 0.f;
int lastpriorityclassifier = priorityclassifier;
priorityclassifier = (int)rc.expctxWay.getPriorityClassifier();
// *** add initial cost if the classifier changed
float newClassifier = rc.expctxWay.getInitialClassifier();
float newInitialCost = rc.expctxWay.getInitialcost();
float classifierDiff = newClassifier - lastClassifier;
if ( newClassifier != 0. && lastClassifier != 0. && ( classifierDiff > 0.0005 || classifierDiff < -0.0005 ) )
{
float initialcost = rc.inverseDirection ? lastInitialCost : newInitialCost;
if ( initialcost >= 1000000. )
{
cost = -1;
return;
}
int iicost = (int)initialcost;
if ( message != null )
{
message.linkinitcost += iicost;
}
cost += iicost;
}
lastClassifier = newClassifier;
lastInitialCost = newInitialCost;
// *** destination logic: no destination access in between
int classifiermask = (int)rc.expctxWay.getClassifierMask();
boolean newDestination = (classifiermask & 64) != 0;
boolean oldDestination = getBit( IS_ON_DESTINATION_BIT );
if ( getBit( PATH_START_BIT ) )
{
setBit( PATH_START_BIT, false );
setBit( CAN_LEAVE_DESTINATION_BIT, newDestination );
setBit( HAD_DESTINATION_START_BIT, newDestination );
}
else
{
if ( oldDestination && !newDestination )
{
if ( getBit( CAN_LEAVE_DESTINATION_BIT ) )
{
setBit( CAN_LEAVE_DESTINATION_BIT, false );
}
else
{
cost = -1;
return;
}
}
}
setBit( IS_ON_DESTINATION_BIT, newDestination );
OsmTransferNode transferNode = link.geometry == null ? null
: rc.geometryDecoder.decodeGeometry( link.geometry, sourceNode, targetNode, isReverse );
for(int nsection=0; ;nsection++)
{
originLon = lon1;
originLat = lat1;
int lon2;
int lat2;
short ele2;
if ( transferNode == null )
{
lon2 = targetNode.ilon;
lat2 = targetNode.ilat;
ele2 = targetNode.selev;
}
else
{
lon2 = transferNode.ilon;
lat2 = transferNode.ilat;
ele2 = transferNode.selev;
}
boolean isStartpoint = lon0 == -1 && lat0 == -1;
// check turn restrictions (in detail mode (= final pass) skip TRs, to not mess up voice hints)
if ( nsection == 0 && rc.considerTurnRestrictions && !detailMode && !isStartpoint )
{
if ( rc.inverseDirection
? TurnRestriction.isTurnForbidden( sourceNode.firstRestriction, lon2, lat2, lon0, lat0, rc.bikeMode, rc.carMode )
: TurnRestriction.isTurnForbidden( sourceNode.firstRestriction, lon0, lat0, lon2, lat2, rc.bikeMode, rc.carMode ) )
{
cost = -1;
return;
}
}
// if recording, new MessageData for each section (needed for turn-instructions)
if ( message != null && message.wayKeyValues != null )
{
originElement.message = message;
message = new MessageData();
}
int dist = rc.calcDistance( lon1, lat1, lon2, lat2 );
boolean stopAtEndpoint = false;
if ( rc.shortestmatch )
{
if ( rc.isEndpoint )
{
stopAtEndpoint = true;
ele2 = interpolateEle( ele1, ele2, rc.wayfraction );
}
else
{
// we just start here, reset everything
cost = 0;
resetState();
lon0 = -1; // reset turncost-pipe
lat0 = -1;
isStartpoint = true;
if ( recordTransferNodes )
{
if ( rc.wayfraction > 0. )
{
ele1 = interpolateEle( ele1, ele2, 1. - rc.wayfraction );
originElement = OsmPathElement.create( rc.ilonshortest, rc.ilatshortest, ele1, null, rc.countTraffic );
}
else
{
originElement = null; // prevent duplicate point
}
}
if ( rc.checkPendingEndpoint() )
{
dist = rc.calcDistance( rc.ilonshortest, rc.ilatshortest, lon2, lat2 );
if ( rc.shortestmatch )
{
stopAtEndpoint = true;
ele2 = interpolateEle( ele1, ele2, rc.wayfraction );
}
}
}
}
if ( message != null )
{
message.linkdist += dist;
}
linkdisttotal += dist;
// apply a start-direction if appropriate (by faking the origin position)
if ( isStartpoint )
{
if ( rc.startDirectionValid )
{
double dir = rc.startDirection.intValue() * CheapRuler.DEG_TO_RAD;
double[] lonlat2m = CheapRuler.getLonLatToMeterScales( (lon0 + lat1) >> 1 );
lon0 = lon1 - (int) ( 1000. * Math.sin( dir ) / lonlat2m[0] );
lat0 = lat1 - (int) ( 1000. * Math.cos( dir ) / lonlat2m[1] );
}
else
{
lon0 = lon1 - (lon2-lon1);
lat0 = lat1 - (lat2-lat1);
}
}
double angle = rc.anglemeter.calcAngle( lon0, lat0, lon1, lat1, lon2, lat2 );
double cosangle = rc.anglemeter.getCosAngle();
// *** elevation stuff
double delta_h = 0.;
if ( ele2 == Short.MIN_VALUE ) ele2 = ele1;
if ( ele1 != Short.MIN_VALUE )
{
delta_h = (ele2 - ele1)/4.;
if ( rc.inverseDirection )
{
delta_h = -delta_h;
}
}
double elevation = ele2 == Short.MIN_VALUE ? 100. : ele2/4.;
double sectionCost = processWaySection( rc, dist, delta_h, elevation, angle, cosangle, isStartpoint, nsection, lastpriorityclassifier );
if ( ( sectionCost < 0. || costfactor > 9998. && !detailMode ) || sectionCost + cost >= 2000000000. )
{
cost = -1;
return;
}
if ( isTrafficBackbone )
{
sectionCost = 0.;
}
cost += (int)sectionCost;
// calculate traffic
if ( rc.countTraffic )
{
int minDist = (int)rc.trafficSourceMinDist;
int cost2 = cost < minDist ? minDist : cost;
traffic += dist*rc.expctxWay.getTrafficSourceDensity()*Math.pow(cost2/10000.f,rc.trafficSourceExponent);
}
// compute kinematic
computeKinematic( rc, dist, delta_h, detailMode );
if ( message != null )
{
message.turnangle = (float)angle;
message.time = (float)getTotalTime();
message.energy = (float)getTotalEnergy();
message.priorityclassifier = priorityclassifier;
message.classifiermask = classifiermask;
message.lon = lon2;
message.lat = lat2;
message.ele = ele2;
message.wayKeyValues = rc.expctxWay.getKeyValueDescription( isReverse, description );
}
if ( stopAtEndpoint )
{
if ( recordTransferNodes )
{
originElement = OsmPathElement.create( rc.ilonshortest, rc.ilatshortest, ele2, originElement, rc.countTraffic );
originElement.cost = cost;
if ( message != null )
{
originElement.message = message;
}
}
if ( rc.nogoCost < 0)
{
cost = -1;
}
else
{
cost += rc.nogoCost;
}
return;
}
if ( transferNode == null )
{
// *** penalty for being part of the reference track
if ( refTrack != null && refTrack.containsNode( targetNode ) && refTrack.containsNode( sourceNode ) )
{
int reftrackcost = linkdisttotal;
cost += reftrackcost;
}
selev = ele2;
break;
}
transferNode = transferNode.next;
if ( recordTransferNodes )
{
originElement = OsmPathElement.create( lon2, lat2, ele2, originElement, rc.countTraffic );
originElement.cost = cost;
originElement.addTraffic( traffic );
traffic = 0;
}
lon0 = lon1;
lat0 = lat1;
lon1 = lon2;
lat1 = lat2;
ele1 = ele2;
}
// check for nogo-matches (after the *actual* start of segment)
if ( rc.nogoCost < 0)
{
cost = -1;
return;
}
else
{
cost += rc.nogoCost;
}
// add target-node costs
double targetCost = processTargetNode( rc );
if ( targetCost < 0. || targetCost + cost >= 2000000000. )
{
cost = -1;
return;
}
cost += (int)targetCost;
}
public short interpolateEle( short e1, short e2, double fraction )
{
if ( e1 == Short.MIN_VALUE || e2 == Short.MIN_VALUE )
{
return Short.MIN_VALUE;
}
return (short)( e1*(1.-fraction) + e2*fraction );
}
protected abstract double processWaySection( RoutingContext rc, double dist, double delta_h, double elevation, double angle, double cosangle, boolean isStartpoint, int nsection, int lastpriorityclassifier );
protected abstract double processTargetNode( RoutingContext rc );
protected void computeKinematic( RoutingContext rc, double dist, double delta_h, boolean detailMode )
{
}
public abstract int elevationCorrection( RoutingContext rc );
public abstract boolean definitlyWorseThan( OsmPath p, RoutingContext rc );
public OsmNode getSourceNode()
{
return sourceNode;
}
public OsmNode getTargetNode()
{
return targetNode;
}
public OsmLink getLink()
{
return link;
}
public void setNextForLink( OsmLinkHolder holder )
{
nextForLink = holder;
}
public OsmLinkHolder getNextForLink()
{
return nextForLink;
}
public double getTotalTime()
{
return 0.;
}
public double getTotalEnergy()
{
return 0.;
}
}
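The `getBit`/`setBit` pair above keeps the four path flags in one `int`; `setBit` XOR-toggles the mask only when the stored bit differs from the requested value, leaving all other bits untouched. A minimal sketch of that idiom (class name is illustrative, not the BRouter API):

```java
// Sketch of the XOR-conditional bit update used by OsmPath.setBit.
public class BitfieldSketch {
  public static final int PATH_START_BIT = 1;
  public static final int IS_ON_DESTINATION_BIT = 4;
  private int bitfield = PATH_START_BIT;

  public boolean getBit(int mask) {
    return (bitfield & mask) != 0;
  }

  public void setBit(int mask, boolean bit) {
    if (getBit(mask) != bit) {
      bitfield ^= mask; // toggle only on change; other bits stay untouched
    }
  }
}
```

Because the toggle is guarded by the comparison, calling `setBit` with the value already stored is a no-op, so the update is idempotent.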
/**
* Container for link between two Osm nodes
*
* @author ab
*/
package btools.router;
import btools.mapaccess.OsmLink;
import btools.mapaccess.OsmLinkHolder;
import btools.mapaccess.OsmNode;
import btools.mapaccess.OsmTransferNode;
import btools.mapaccess.TurnRestriction;
import btools.util.CheapRuler;
abstract class OsmPath implements OsmLinkHolder {
/**
* The cost of that path (a modified distance)
*/
public int cost = 0;
// the elevation assumed for this path; may hold a value
// even if the corresponding node has none
public short selev;
public int airdistance = 0; // distance to endpos
protected OsmNode sourceNode;
protected OsmNode targetNode;
protected OsmLink link;
public OsmPathElement originElement;
public OsmPathElement myElement;
private OsmLinkHolder nextForLink = null;
public int treedepth = 0;
// the position of the waypoint just before
// this path position (for angle calculation)
public int originLon;
public int originLat;
// the classifier of the segment just before this path's position
protected float lastClassifier;
protected float lastInitialCost;
protected int priorityclassifier;
private static final int PATH_START_BIT = 1;
private static final int CAN_LEAVE_DESTINATION_BIT = 2;
private static final int IS_ON_DESTINATION_BIT = 4;
private static final int HAD_DESTINATION_START_BIT = 8;
protected int bitfield = PATH_START_BIT;
private boolean getBit(int mask) {
return (bitfield & mask) != 0;
}
private void setBit(int mask, boolean bit) {
if (getBit(mask) != bit) {
bitfield ^= mask;
}
}
public boolean didEnterDestinationArea() {
return !getBit(HAD_DESTINATION_START_BIT) && getBit(IS_ON_DESTINATION_BIT);
}
public MessageData message;
public void init(OsmLink link) {
this.link = link;
targetNode = link.getTarget(null);
selev = targetNode.getSElev();
originLon = -1;
originLat = -1;
}
public void init(OsmPath origin, OsmLink link, OsmTrack refTrack, boolean detailMode, RoutingContext rc) {
if (origin.myElement == null) {
origin.myElement = OsmPathElement.create(origin);
}
this.originElement = origin.myElement;
this.link = link;
this.sourceNode = origin.targetNode;
this.targetNode = link.getTarget(sourceNode);
this.cost = origin.cost;
this.lastClassifier = origin.lastClassifier;
this.lastInitialCost = origin.lastInitialCost;
this.bitfield = origin.bitfield;
this.priorityclassifier = origin.priorityclassifier;
init(origin);
addAddionalPenalty(refTrack, detailMode, origin, link, rc);
}
protected abstract void init(OsmPath orig);
protected abstract void resetState();
static int seg = 1;
protected void addAddionalPenalty(OsmTrack refTrack, boolean detailMode, OsmPath origin, OsmLink link, RoutingContext rc) {
byte[] description = link.descriptionBitmap;
if (description == null) { // could be a beeline path
message = new MessageData();
message.turnangle = 0;
message.time = 1;
message.energy = 0;
message.priorityclassifier = 0;
message.classifiermask = 0;
message.lon = targetNode.getILon();
message.lat = targetNode.getILat();
message.ele = Short.MIN_VALUE;
message.linkdist = sourceNode.calcDistance(targetNode);
message.wayKeyValues = "direct_segment=" + seg;
seg++;
return;
}
boolean recordTransferNodes = detailMode;
rc.nogoCost = 0.;
// extract the 3 positions of the first section
int lon0 = origin.originLon;
int lat0 = origin.originLat;
int lon1 = sourceNode.getILon();
int lat1 = sourceNode.getILat();
short ele1 = origin.selev;
int linkdisttotal = 0;
message = detailMode ? new MessageData() : null;
boolean isReverse = link.isReverse(sourceNode);
// evaluate the way tags
rc.expctxWay.evaluate(rc.inverseDirection ^ isReverse, description);
// calculate the costfactor inputs
float costfactor = rc.expctxWay.getCostfactor();
boolean isTrafficBackbone = cost == 0 && rc.expctxWay.getIsTrafficBackbone() > 0.f;
int lastpriorityclassifier = priorityclassifier;
priorityclassifier = (int) rc.expctxWay.getPriorityClassifier();
// *** add initial cost if the classifier changed
float newClassifier = rc.expctxWay.getInitialClassifier();
float newInitialCost = rc.expctxWay.getInitialcost();
float classifierDiff = newClassifier - lastClassifier;
if (newClassifier != 0. && lastClassifier != 0. && (classifierDiff > 0.0005 || classifierDiff < -0.0005)) {
float initialcost = rc.inverseDirection ? lastInitialCost : newInitialCost;
if (initialcost >= 1000000.) {
cost = -1;
return;
}
int iicost = (int) initialcost;
if (message != null) {
message.linkinitcost += iicost;
}
cost += iicost;
}
lastClassifier = newClassifier;
lastInitialCost = newInitialCost;
// *** destination logic: no destination access in between
int classifiermask = (int) rc.expctxWay.getClassifierMask();
boolean newDestination = (classifiermask & 64) != 0;
boolean oldDestination = getBit(IS_ON_DESTINATION_BIT);
if (getBit(PATH_START_BIT)) {
setBit(PATH_START_BIT, false);
setBit(CAN_LEAVE_DESTINATION_BIT, newDestination);
setBit(HAD_DESTINATION_START_BIT, newDestination);
} else {
if (oldDestination && !newDestination) {
if (getBit(CAN_LEAVE_DESTINATION_BIT)) {
setBit(CAN_LEAVE_DESTINATION_BIT, false);
} else {
cost = -1;
return;
}
}
}
setBit(IS_ON_DESTINATION_BIT, newDestination);
OsmTransferNode transferNode = link.geometry == null ? null
: rc.geometryDecoder.decodeGeometry(link.geometry, sourceNode, targetNode, isReverse);
for (int nsection = 0; ; nsection++) {
originLon = lon1;
originLat = lat1;
int lon2;
int lat2;
short ele2;
short originEle2;
if (transferNode == null) {
lon2 = targetNode.ilon;
lat2 = targetNode.ilat;
originEle2 = targetNode.selev;
} else {
lon2 = transferNode.ilon;
lat2 = transferNode.ilat;
originEle2 = transferNode.selev;
}
ele2 = originEle2;
boolean isStartpoint = lon0 == -1 && lat0 == -1;
// check turn restrictions (in detail mode (= final pass) skip TRs, to not mess up voice hints)
if (nsection == 0 && rc.considerTurnRestrictions && !detailMode && !isStartpoint) {
if (rc.inverseDirection
? TurnRestriction.isTurnForbidden(sourceNode.firstRestriction, lon2, lat2, lon0, lat0, rc.bikeMode || rc.footMode, rc.carMode)
: TurnRestriction.isTurnForbidden(sourceNode.firstRestriction, lon0, lat0, lon2, lat2, rc.bikeMode || rc.footMode, rc.carMode)) {
cost = -1;
return;
}
}
// if recording, new MessageData for each section (needed for turn-instructions)
if (message != null && message.wayKeyValues != null) {
originElement.message = message;
message = new MessageData();
}
int dist = rc.calcDistance(lon1, lat1, lon2, lat2);
boolean stopAtEndpoint = false;
if (rc.shortestmatch) {
if (rc.isEndpoint) {
stopAtEndpoint = true;
ele2 = interpolateEle(ele1, ele2, rc.wayfraction);
} else {
// we just start here, reset everything
cost = 0;
resetState();
lon0 = -1; // reset turncost-pipe
lat0 = -1;
isStartpoint = true;
if (recordTransferNodes) {
if (rc.wayfraction > 0.) {
ele1 = interpolateEle(ele1, ele2, 1. - rc.wayfraction);
originElement = OsmPathElement.create(rc.ilonshortest, rc.ilatshortest, ele1, null);
} else {
originElement = null; // prevent duplicate point
}
}
if (rc.checkPendingEndpoint()) {
dist = rc.calcDistance(rc.ilonshortest, rc.ilatshortest, lon2, lat2);
if (rc.shortestmatch) {
stopAtEndpoint = true;
ele2 = interpolateEle(ele1, ele2, rc.wayfraction);
}
}
}
}
if (message != null) {
message.linkdist += dist;
}
linkdisttotal += dist;
// apply a start-direction if appropriate (by faking the origin position)
if (isStartpoint) {
if (rc.startDirectionValid) {
double dir = rc.startDirection * CheapRuler.DEG_TO_RAD;
double[] lonlat2m = CheapRuler.getLonLatToMeterScales((lon0 + lat1) >> 1);
lon0 = lon1 - (int) (1000. * Math.sin(dir) / lonlat2m[0]);
lat0 = lat1 - (int) (1000. * Math.cos(dir) / lonlat2m[1]);
} else {
lon0 = lon1 - (lon2 - lon1);
lat0 = lat1 - (lat2 - lat1);
}
}
double angle = rc.anglemeter.calcAngle(lon0, lat0, lon1, lat1, lon2, lat2);
double cosangle = rc.anglemeter.getCosAngle();
// *** elevation stuff
double delta_h = 0.;
if (ele2 == Short.MIN_VALUE) ele2 = ele1;
if (ele1 != Short.MIN_VALUE) {
delta_h = (ele2 - ele1) / 4.;
if (rc.inverseDirection) {
delta_h = -delta_h;
}
}
double elevation = ele2 == Short.MIN_VALUE ? 100. : ele2 / 4.;
double sectionCost = processWaySection(rc, dist, delta_h, elevation, angle, cosangle, isStartpoint, nsection, lastpriorityclassifier);
if ((sectionCost < 0. || costfactor > 9998. && !detailMode) || sectionCost + cost >= 2000000000.) {
cost = -1;
return;
}
if (isTrafficBackbone) {
sectionCost = 0.;
}
cost += (int) sectionCost;
// compute kinematic
computeKinematic(rc, dist, delta_h, detailMode);
if (message != null) {
message.turnangle = (float) angle;
message.time = (float) getTotalTime();
message.energy = (float) getTotalEnergy();
message.priorityclassifier = priorityclassifier;
message.classifiermask = classifiermask;
message.lon = lon2;
message.lat = lat2;
message.ele = originEle2;
message.wayKeyValues = rc.expctxWay.getKeyValueDescription(isReverse, description);
}
if (stopAtEndpoint) {
if (recordTransferNodes) {
originElement = OsmPathElement.create(rc.ilonshortest, rc.ilatshortest, originEle2, originElement);
originElement.cost = cost;
if (message != null) {
originElement.message = message;
}
}
if (rc.nogoCost < 0) {
cost = -1;
} else {
cost += rc.nogoCost;
}
return;
}
if (transferNode == null) {
// *** penalty for being part of the reference track
if (refTrack != null && refTrack.containsNode(targetNode) && refTrack.containsNode(sourceNode)) {
int reftrackcost = linkdisttotal;
cost += reftrackcost;
}
selev = ele2;
break;
}
transferNode = transferNode.next;
if (recordTransferNodes) {
originElement = OsmPathElement.create(lon2, lat2, originEle2, originElement);
originElement.cost = cost;
}
lon0 = lon1;
lat0 = lat1;
lon1 = lon2;
lat1 = lat2;
ele1 = ele2;
}
// check for nogo-matches (after the *actual* start of segment)
if (rc.nogoCost < 0) {
cost = -1;
return;
} else {
cost += rc.nogoCost;
}
// add target-node costs
double targetCost = processTargetNode(rc);
if (targetCost < 0. || targetCost + cost >= 2000000000.) {
cost = -1;
return;
}
cost += (int) targetCost;
}
public short interpolateEle(short e1, short e2, double fraction) {
if (e1 == Short.MIN_VALUE || e2 == Short.MIN_VALUE) {
return Short.MIN_VALUE;
}
return (short) (e1 * (1. - fraction) + e2 * fraction);
}
protected abstract double processWaySection(RoutingContext rc, double dist, double delta_h, double elevation, double angle, double cosangle, boolean isStartpoint, int nsection, int lastpriorityclassifier);
protected abstract double processTargetNode(RoutingContext rc);
protected void computeKinematic(RoutingContext rc, double dist, double delta_h, boolean detailMode) {
}
public abstract int elevationCorrection();
public abstract boolean definitlyWorseThan(OsmPath p);
public OsmNode getSourceNode() {
return sourceNode;
}
public OsmNode getTargetNode() {
return targetNode;
}
public OsmLink getLink() {
return link;
}
public void setNextForLink(OsmLinkHolder holder) {
nextForLink = holder;
}
public OsmLinkHolder getNextForLink() {
return nextForLink;
}
public double getTotalTime() {
return 0.;
}
public double getTotalEnergy() {
return 0.;
}
}
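Elevations in `OsmPath` are stored as `short`s in quarter-meter units, with `Short.MIN_VALUE` as the missing-value sentinel; `interpolateEle` interpolates linearly and propagates the sentinel. A standalone sketch of that behavior (class name is illustrative):

```java
// Sketch of OsmPath.interpolateEle: linear interpolation in quarter-meter
// units, propagating the Short.MIN_VALUE "no elevation" sentinel.
public class EleSketch {
  static final short NO_ELEV = Short.MIN_VALUE;

  public static short interpolateEle(short e1, short e2, double fraction) {
    if (e1 == NO_ELEV || e2 == NO_ELEV) {
      return NO_ELEV; // either endpoint missing -> result missing
    }
    return (short) (e1 * (1. - fraction) + e2 * fraction);
  }
}
```

For example, interpolating halfway between 400 and 800 quarter-meters (100 m and 200 m) yields 600 quarter-meters, i.e. 150 m.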


@@ -1,136 +1,127 @@
package btools.router;
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import btools.mapaccess.OsmNode;
import btools.mapaccess.OsmPos;
import btools.util.CheapRuler;
/**
* Container for link between two Osm nodes
*
* @author ab
*/
public class OsmPathElement implements OsmPos
{
private int ilat; // latitude
private int ilon; // longitude
private short selev; // elevation (quarter meters)
public MessageData message = null; // description
public int cost;
// interface OsmPos
public final int getILat()
{
return ilat;
}
public final int getILon()
{
return ilon;
}
public final short getSElev()
{
return selev;
}
public final double getElev()
{
return selev / 4.;
}
public final float getTime()
{
return message == null ? 0.f : message.time;
}
public final void setTime( float t )
{
if ( message != null )
{
message.time = t;
}
}
public final float getEnergy()
{
return message == null ? 0.f : message.energy;
}
public final void setEnergy( float e )
{
if ( message != null )
{
message.energy = e;
}
}
public final long getIdFromPos()
{
return ((long)ilon)<<32 | ilat;
}
public final int calcDistance( OsmPos p )
{
return (int)(CheapRuler.distance(ilon, ilat, p.getILon(), p.getILat()) + 1.0 );
}
public OsmPathElement origin;
// construct a path element from a path
public static final OsmPathElement create( OsmPath path, boolean countTraffic )
{
OsmNode n = path.getTargetNode();
OsmPathElement pe = create( n.getILon(), n.getILat(), path.selev, path.originElement, countTraffic );
pe.cost = path.cost;
pe.message = path.message;
return pe;
}
public static final OsmPathElement create( int ilon, int ilat, short selev, OsmPathElement origin, boolean countTraffic )
{
OsmPathElement pe = countTraffic ? new OsmPathElementWithTraffic() : new OsmPathElement();
pe.ilon = ilon;
pe.ilat = ilat;
pe.selev = selev;
pe.origin = origin;
return pe;
}
protected OsmPathElement()
{
}
public void addTraffic( float traffic )
{
}
public String toString()
{
return ilon + "_" + ilat;
}
public void writeToStream( DataOutput dos ) throws IOException
{
dos.writeInt( ilat );
dos.writeInt( ilon );
dos.writeShort( selev );
dos.writeInt( cost );
}
public static OsmPathElement readFromStream( DataInput dis ) throws IOException
{
OsmPathElement pe = new OsmPathElement();
pe.ilat = dis.readInt();
pe.ilon = dis.readInt();
pe.selev = dis.readShort();
pe.cost = dis.readInt();
return pe;
}
}
package btools.router;
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import btools.mapaccess.OsmNode;
import btools.mapaccess.OsmPos;
import btools.util.CheapRuler;
/**
* Container for link between two Osm nodes
*
* @author ab
*/
public class OsmPathElement implements OsmPos {
private int ilat; // latitude
private int ilon; // longitude
private short selev; // elevation (quarter meters)
public MessageData message = null; // description
public int cost;
// interface OsmPos
public final int getILat() {
return ilat;
}
public final int getILon() {
return ilon;
}
public final short getSElev() {
return selev;
}
public final void setSElev(short s) {
selev = s;
}
public final double getElev() {
return selev / 4.;
}
public final float getTime() {
return message == null ? 0.f : message.time;
}
public final void setTime(float t) {
if (message != null) {
message.time = t;
}
}
public final float getEnergy() {
return message == null ? 0.f : message.energy;
}
public final void setEnergy(float e) {
if (message != null) {
message.energy = e;
}
}
public final void setAngle(float e) {
if (message != null) {
message.turnangle = e;
}
}
public final long getIdFromPos() {
return ((long) ilon) << 32 | ilat;
}
public final int calcDistance(OsmPos p) {
return (int) Math.max(1.0, Math.round(CheapRuler.distance(ilon, ilat, p.getILon(), p.getILat())));
}
public OsmPathElement origin;
// construct a path element from a path
public static final OsmPathElement create(OsmPath path) {
OsmNode n = path.getTargetNode();
OsmPathElement pe = create(n.getILon(), n.getILat(), n.getSElev(), path.originElement);
pe.cost = path.cost;
pe.message = path.message;
return pe;
}
public static final OsmPathElement create(int ilon, int ilat, short selev, OsmPathElement origin) {
OsmPathElement pe = new OsmPathElement();
pe.ilon = ilon;
pe.ilat = ilat;
pe.selev = selev;
pe.origin = origin;
return pe;
}
protected OsmPathElement() {
}
public String toString() {
return ilon + "_" + ilat;
}
public boolean positionEquals(OsmPathElement e) {
return this.ilat == e.ilat && this.ilon == e.ilon;
}
public void writeToStream(DataOutput dos) throws IOException {
dos.writeInt(ilat);
dos.writeInt(ilon);
dos.writeShort(selev);
dos.writeInt(cost);
}
public static OsmPathElement readFromStream(DataInput dis) throws IOException {
OsmPathElement pe = new OsmPathElement();
pe.ilat = dis.readInt();
pe.ilon = dis.readInt();
pe.selev = dis.readShort();
pe.cost = dis.readInt();
return pe;
}
}
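`writeToStream`/`readFromStream` above serialize an element as four big-endian fields in the fixed order ilat, ilon, selev, cost. A self-contained round-trip sketch over byte-array streams (class name is illustrative, not the BRouter API):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInput;
import java.io.DataInputStream;
import java.io.DataOutput;
import java.io.DataOutputStream;
import java.io.IOException;

// Round-trip sketch of the OsmPathElement stream layout: four big-endian
// fields in the fixed order ilat, ilon, selev, cost.
public class PathElementIoSketch {
  public int ilat;
  public int ilon;
  public short selev;
  public int cost;

  public void writeToStream(DataOutput dos) throws IOException {
    dos.writeInt(ilat);
    dos.writeInt(ilon);
    dos.writeShort(selev);
    dos.writeInt(cost);
  }

  public static PathElementIoSketch readFromStream(DataInput dis) throws IOException {
    PathElementIoSketch pe = new PathElementIoSketch();
    pe.ilat = dis.readInt();
    pe.ilon = dis.readInt();
    pe.selev = dis.readShort();
    pe.cost = dis.readInt();
    return pe;
  }

  // serialize one element, read it back, and report field-for-field equality
  public static boolean roundTripOk(int ilat, int ilon, short selev, int cost) {
    try {
      PathElementIoSketch a = new PathElementIoSketch();
      a.ilat = ilat;
      a.ilon = ilon;
      a.selev = selev;
      a.cost = cost;
      ByteArrayOutputStream buf = new ByteArrayOutputStream();
      a.writeToStream(new DataOutputStream(buf));
      PathElementIoSketch b = readFromStream(
          new DataInputStream(new ByteArrayInputStream(buf.toByteArray())));
      return b.ilat == a.ilat && b.ilon == a.ilon
          && b.selev == a.selev && b.cost == a.cost;
    } catch (IOException e) {
      return false;
    }
  }
}
```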


@@ -1,78 +0,0 @@
package btools.router;
import java.io.IOException;
/**
* Extension to OsmPathElement to count traffic load
*
* @author ab
*/
public final class OsmPathElementWithTraffic extends OsmPathElement
{
private int registerCount;
private float farTraffic;
private float nearTraffic;
public void register()
{
if ( registerCount++ == 0 )
{
if ( origin instanceof OsmPathElementWithTraffic )
{
OsmPathElementWithTraffic ot = (OsmPathElementWithTraffic)origin;
ot.register();
ot.farTraffic += farTraffic;
ot.nearTraffic += nearTraffic;
farTraffic = 0;
nearTraffic = 0;
}
}
}
@Override
public void addTraffic( float traffic )
{
this.farTraffic += traffic;
this.nearTraffic += traffic;
}
// unregister from origin if our registercount is 0, else do nothing
public static double maxtraffic = 0.;
public boolean unregister( RoutingContext rc ) throws IOException
{
if ( --registerCount == 0 )
{
if ( origin instanceof OsmPathElementWithTraffic )
{
OsmPathElementWithTraffic ot = (OsmPathElementWithTraffic)origin;
int costdelta = cost-ot.cost;
ot.farTraffic += farTraffic*Math.exp(-costdelta/rc.farTrafficDecayLength);
ot.nearTraffic += nearTraffic*Math.exp(-costdelta/rc.nearTrafficDecayLength);
if ( costdelta > 0 && farTraffic > maxtraffic ) maxtraffic = farTraffic;
int t2 = cost == ot.cost ? -1 : (int)(rc.farTrafficWeight*farTraffic + rc.nearTrafficWeight*nearTraffic);
if ( t2 > 4000 || t2 == -1 )
{
// System.out.println( "unregistered: " + this + " origin=" + ot + " farTraffic =" + farTraffic + " nearTraffic =" + nearTraffic + " cost=" + cost );
if ( rc.trafficOutputStream != null )
{
rc.trafficOutputStream.writeLong( getIdFromPos());
rc.trafficOutputStream.writeLong( ot.getIdFromPos());
rc.trafficOutputStream.writeInt( t2 );
}
}
farTraffic = 0;
nearTraffic = 0;
}
return true;
}
return false;
}
}
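`unregister` above propagates accumulated traffic to the origin element, attenuated exponentially over the cost difference between the two elements. The decay factor in isolation, as a sketch (class name is illustrative):

```java
// Sketch of the exponential traffic attenuation used when propagating
// farTraffic/nearTraffic to the origin element in unregister():
// traffic * exp(-costdelta / decayLength).
public class TrafficDecaySketch {
  public static double decayed(double traffic, int costdelta, double decayLength) {
    return traffic * Math.exp(-costdelta / decayLength);
  }
}
```

A cost difference of zero passes traffic through unchanged; a difference equal to the decay length attenuates it by a factor of 1/e.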


@@ -11,11 +11,10 @@ import btools.expressions.BExpressionContextNode;
import btools.expressions.BExpressionContextWay;
abstract class OsmPathModel
{
abstract class OsmPathModel {
public abstract OsmPrePath createPrePath();
public abstract OsmPath createPath();
public abstract void init( BExpressionContextWay expctxWay, BExpressionContextNode expctxNode, Map<String,String> keyValues );
public abstract void init(BExpressionContextWay expctxWay, BExpressionContextNode expctxNode, Map<String, String> keyValues);
}


@@ -7,23 +7,20 @@ package btools.router;
import btools.mapaccess.OsmLink;
import btools.mapaccess.OsmNode;
import btools.mapaccess.OsmTransferNode;
public abstract class OsmPrePath
{
public abstract class OsmPrePath {
protected OsmNode sourceNode;
protected OsmNode targetNode;
protected OsmLink link;
public OsmPrePath next;
public void init( OsmPath origin, OsmLink link, RoutingContext rc )
{
public void init(OsmPath origin, OsmLink link, RoutingContext rc) {
this.link = link;
this.sourceNode = origin.getTargetNode();
this.targetNode = link.getTarget( sourceNode );
initPrePath(origin, rc );
this.targetNode = link.getTarget(sourceNode);
initPrePath(origin, rc);
}
protected abstract void initPrePath(OsmPath origin, RoutingContext rc );
protected abstract void initPrePath(OsmPath origin, RoutingContext rc);
}

File diff suppressed because it is too large


@@ -11,146 +11,125 @@ import btools.expressions.BExpressionContextNode;
import btools.expressions.BExpressionContextWay;
import btools.expressions.BExpressionMetaData;
public final class ProfileCache
{
public final class ProfileCache {
private static File lastLookupFile;
private static long lastLookupTimestamp;
private BExpressionContextWay expctxWay;
private BExpressionContextNode expctxNode;
private File lastProfileFile;
private long lastProfileTimestamp;
private long lastProfileTimestamp;
private boolean profilesBusy;
private long lastUseTime;
private static ProfileCache[] apc = new ProfileCache[1];
private static boolean debug = Boolean.getBoolean( "debugProfileCache" );
public static synchronized void setSize( int size )
{
private static ProfileCache[] apc = new ProfileCache[1];
private static boolean debug = Boolean.getBoolean("debugProfileCache");
public static synchronized void setSize(int size) {
apc = new ProfileCache[size];
}
public static synchronized boolean parseProfile(RoutingContext rc) {
String profileBaseDir = System.getProperty("profileBaseDir");
File profileDir;
File profileFile;
if (profileBaseDir == null) {
profileDir = new File(rc.localFunction).getParentFile();
profileFile = new File(rc.localFunction);
} else {
profileDir = new File(profileBaseDir);
profileFile = new File(profileDir, rc.localFunction + ".brf");
}
rc.profileTimestamp = profileFile.lastModified() + rc.getKeyValueChecksum() << 24;
File lookupFile = new File(profileDir, "lookups.dat");
// invalidate cache at lookup-table update
if (!(lookupFile.equals(lastLookupFile) && lookupFile.lastModified() == lastLookupTimestamp)) {
if (lastLookupFile != null) {
System.out.println("******** invalidating profile-cache after lookup-file update ******** ");
}
apc = new ProfileCache[apc.length];
lastLookupFile = lookupFile;
lastLookupTimestamp = lookupFile.lastModified();
}
ProfileCache lru = null;
int unusedSlot = -1;
// check for re-use
for (int i = 0; i < apc.length; i++) {
ProfileCache pc = apc[i];
if (pc != null) {
if ((!pc.profilesBusy) && profileFile.equals(pc.lastProfileFile)) {
if (rc.profileTimestamp == pc.lastProfileTimestamp) {
rc.expctxWay = pc.expctxWay;
rc.expctxNode = pc.expctxNode;
rc.readGlobalConfig();
pc.profilesBusy = true;
return true;
}
lru = pc; // name-match but timestamp-mismatch -> we override this one
unusedSlot = -1;
break;
}
if (lru == null || lru.lastUseTime > pc.lastUseTime) {
lru = pc;
}
} else if (unusedSlot < 0) {
unusedSlot = i;
}
BExpressionMetaData meta = new BExpressionMetaData();
rc.expctxWay = new BExpressionContextWay( rc.memoryclass * 512, meta );
rc.expctxNode = new BExpressionContextNode( 0, meta );
rc.expctxNode.setForeignContext( rc.expctxWay );
meta.readMetaData( new File( profileDir, "lookups.dat" ) );
}
rc.expctxWay.parseFile( profileFile, "global" );
rc.expctxNode.parseFile( profileFile, "global" );
BExpressionMetaData meta = new BExpressionMetaData();
rc.readGlobalConfig();
if ( rc.processUnusedTags )
{
rc.expctxWay.setAllTagsUsed();
rc.expctxWay = new BExpressionContextWay(rc.memoryclass * 512, meta);
rc.expctxNode = new BExpressionContextNode(0, meta);
rc.expctxNode.setForeignContext(rc.expctxWay);
meta.readMetaData(new File(profileDir, "lookups.dat"));
rc.expctxWay.parseFile(profileFile, "global", rc.keyValues);
rc.expctxNode.parseFile(profileFile, "global", rc.keyValues);
rc.readGlobalConfig();
if (rc.processUnusedTags) {
rc.expctxWay.setAllTagsUsed();
}
if (lru == null || unusedSlot >= 0) {
lru = new ProfileCache();
if (unusedSlot >= 0) {
apc[unusedSlot] = lru;
if (debug)
System.out.println("******* adding new profile at idx=" + unusedSlot + " for " + profileFile);
}
}
if (lru.lastProfileFile != null) {
if (debug)
System.out.println("******* replacing profile of age " + ((System.currentTimeMillis() - lru.lastUseTime) / 1000L) + " sec " + lru.lastProfileFile + "->" + profileFile);
}
lru.lastProfileTimestamp = rc.profileTimestamp;
lru.lastProfileFile = profileFile;
lru.expctxWay = rc.expctxWay;
lru.expctxNode = rc.expctxNode;
lru.profilesBusy = true;
lru.lastUseTime = System.currentTimeMillis();
return false;
}
public static synchronized void releaseProfile(RoutingContext rc) {
for (int i = 0; i < apc.length; i++) {
ProfileCache pc = apc[i];
if (pc != null) {
// only the thread that holds the cached instance can release it
if (rc.expctxWay == pc.expctxWay && rc.expctxNode == pc.expctxNode) {
pc.profilesBusy = false;
break;
}
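The slot-selection logic in `parseProfile` above works as a tiny fixed-size LRU cache: an idle slot whose profile file and timestamp match is reused directly, an empty slot is preferred for new entries, and otherwise the least recently used slot is evicted. A minimal standalone sketch of that policy (class and field names here are illustrative, not the BRouter API):

```java
// Fixed-slot LRU selection, mirroring the eviction policy of
// ProfileCache.parseProfile (illustrative names, not the BRouter API).
public class SlotLru {
  static class Slot {
    String key;
    long lastUseTime;
  }

  // returns the slot index to (re)use for the given key
  static int selectSlot(Slot[] slots, String key) {
    int lru = -1;
    for (int i = 0; i < slots.length; i++) {
      Slot s = slots[i];
      if (s == null) return i;          // empty slot: use it directly
      if (key.equals(s.key)) return i;  // matching entry: re-use it
      if (lru < 0 || slots[lru].lastUseTime > s.lastUseTime) lru = i; // track oldest
    }
    return lru;                         // no match, no free slot: evict LRU
  }

  public static void main(String[] args) {
    Slot a = new Slot(); a.key = "trekking"; a.lastUseTime = 100;
    Slot b = new Slot(); b.key = "fastbike"; b.lastUseTime = 50;
    Slot[] slots = { a, b };
    System.out.println(selectSlot(slots, "trekking")); // 0: key match
    System.out.println(selectSlot(slots, "car"));      // 1: oldest entry evicted
  }
}
```

The real cache additionally checks a `profilesBusy` flag so a slot held by another thread is never reused, and a timestamp mismatch on a name match forces an in-place overwrite of that slot.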

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -9,41 +9,33 @@ import java.io.File;
import btools.mapaccess.StorageConfigHelper;
public final class RoutingHelper {
public static File getAdditionalMaptoolDir(File segmentDir) {
return StorageConfigHelper.getAdditionalMaptoolDir(segmentDir);
}
public static File getSecondarySegmentDir(File segmentDir) {
return StorageConfigHelper.getSecondarySegmentDir(segmentDir);
}
public static boolean hasDirectoryAnyDatafiles(File segmentDir) {
if (hasAnyDatafiles(segmentDir)) {
return true;
}
// check secondary, too
File secondary = StorageConfigHelper.getSecondarySegmentDir(segmentDir);
if (secondary != null) {
return hasAnyDatafiles(secondary);
}
return false;
}
private static boolean hasAnyDatafiles(File dir) {
String[] fileNames = dir.list();
for (String fileName : fileNames) {
if (fileName.endsWith(".rd5")) return true;
}
return false;
}
}


@@ -1,5 +1,4 @@
package btools.router;
public class RoutingIslandException extends RuntimeException {
}


@@ -0,0 +1,356 @@
package btools.router;
import java.io.UnsupportedEncodingException;
import java.net.URLDecoder;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.StringTokenizer;
public class RoutingParamCollector {
final static boolean DEBUG = false;
/**
* get a list of points and optional extra info for the points
*
* @param lonLats linked list separated by ';' or '|'
* @return a list
*/
public List<OsmNodeNamed> getWayPointList(String lonLats) {
if (lonLats == null) throw new IllegalArgumentException("lonlats parameter not set");
String[] coords = lonLats.split(";|\\|"); // accept both variants
if (coords.length < 1 || !coords[0].contains(","))
throw new IllegalArgumentException("we need one lat/lon point at least!");
List<OsmNodeNamed> wplist = new ArrayList<>();
for (int i = 0; i < coords.length; i++) {
String[] lonLat = coords[i].split(",");
if (lonLat.length < 2)
throw new IllegalArgumentException("we need one lat/lon point at least!");
wplist.add(readPosition(lonLat[0], lonLat[1], "via" + i));
if (lonLat.length > 2) {
if (lonLat[2].equals("d")) {
wplist.get(wplist.size() - 1).direct = true;
} else {
wplist.get(wplist.size() - 1).name = lonLat[2];
}
}
}
if (wplist.get(0).name.startsWith("via")) wplist.get(0).name = "from";
if (wplist.get(wplist.size() - 1).name.startsWith("via")) {
wplist.get(wplist.size() - 1).name = "to";
}
return wplist;
}
/**
* get a list of points (old style, positions only)
*
* @param lons array with longitudes
* @param lats array with latitudes
* @return a list
*/
public List<OsmNodeNamed> readPositions(double[] lons, double[] lats) {
List<OsmNodeNamed> wplist = new ArrayList<>();
if (lats == null || lats.length < 2 || lons == null || lons.length < 2) {
return wplist;
}
for (int i = 0; i < lats.length && i < lons.length; i++) {
OsmNodeNamed n = new OsmNodeNamed();
n.name = "via" + i;
n.ilon = (int) ((lons[i] + 180.) * 1000000. + 0.5);
n.ilat = (int) ((lats[i] + 90.) * 1000000. + 0.5);
wplist.add(n);
}
if (wplist.get(0).name.startsWith("via")) wplist.get(0).name = "from";
if (wplist.get(wplist.size() - 1).name.startsWith("via")) {
wplist.get(wplist.size() - 1).name = "to";
}
return wplist;
}
private OsmNodeNamed readPosition(String vlon, String vlat, String name) {
if (vlon == null) throw new IllegalArgumentException("lon " + name + " not found in input");
if (vlat == null) throw new IllegalArgumentException("lat " + name + " not found in input");
return readPosition(Double.parseDouble(vlon), Double.parseDouble(vlat), name);
}
private OsmNodeNamed readPosition(double lon, double lat, String name) {
OsmNodeNamed n = new OsmNodeNamed();
n.name = name;
n.ilon = (int) ((lon + 180.) * 1000000. + 0.5);
n.ilat = (int) ((lat + 90.) * 1000000. + 0.5);
return n;
}
/**
* read a url like parameter list linked with '&'
*
* @param url parameter list
* @return a hashmap of the parameter
* @throws UnsupportedEncodingException
*/
public Map<String, String> getUrlParams(String url) throws UnsupportedEncodingException {
Map<String, String> params = new HashMap<>();
String decoded = URLDecoder.decode(url, "UTF-8");
StringTokenizer tk = new StringTokenizer(decoded, "?&");
while (tk.hasMoreTokens()) {
String t = tk.nextToken();
StringTokenizer tk2 = new StringTokenizer(t, "=");
if (tk2.hasMoreTokens()) {
String key = tk2.nextToken();
if (tk2.hasMoreTokens()) {
String value = tk2.nextToken();
params.put(key, value);
}
}
}
return params;
}
/**
* fill a parameter map into the routing context
*
* @param rctx the context
* @param wplist the list of way points needed for 'straight' parameter
* @param params the list of parameters
*/
public void setParams(RoutingContext rctx, List<OsmNodeNamed> wplist, Map<String, String> params) {
if (params != null) {
if (params.size() == 0) return;
// prepare nogos extra
if (params.containsKey("profile")) {
rctx.localFunction = params.get("profile");
}
if (params.containsKey("nogoLats") && params.get("nogoLats").length() > 0) {
List<OsmNodeNamed> nogoList = readNogos(params.get("nogoLons"), params.get("nogoLats"), params.get("nogoRadi"));
if (nogoList != null) {
RoutingContext.prepareNogoPoints(nogoList);
if (rctx.nogopoints == null) {
rctx.nogopoints = nogoList;
} else {
rctx.nogopoints.addAll(nogoList);
}
}
params.remove("nogoLats");
params.remove("nogoLons");
params.remove("nogoRadi");
}
if (params.containsKey("nogos")) {
List<OsmNodeNamed> nogoList = readNogoList(params.get("nogos"));
if (nogoList != null) {
RoutingContext.prepareNogoPoints(nogoList);
if (rctx.nogopoints == null) {
rctx.nogopoints = nogoList;
} else {
rctx.nogopoints.addAll(nogoList);
}
}
params.remove("nogos");
}
if (params.containsKey("polylines")) {
List<OsmNodeNamed> result = new ArrayList<>();
parseNogoPolygons(params.get("polylines"), result, false);
if (rctx.nogopoints == null) {
rctx.nogopoints = result;
} else {
rctx.nogopoints.addAll(result);
}
params.remove("polylines");
}
if (params.containsKey("polygons")) {
List<OsmNodeNamed> result = new ArrayList<>();
parseNogoPolygons(params.get("polygons"), result, true);
if (rctx.nogopoints == null) {
rctx.nogopoints = result;
} else {
rctx.nogopoints.addAll(result);
}
params.remove("polygons");
}
for (Map.Entry<String, String> e : params.entrySet()) {
String key = e.getKey();
String value = e.getValue();
if (DEBUG) System.out.println("params " + key + " " + value);
if (key.equals("straight")) {
try {
String[] sa = value.split(",");
for (int i = 0; i < sa.length; i++) {
int v = Integer.parseInt(sa[i]);
if (wplist.size() > v) wplist.get(v).direct = true;
}
} catch (Exception ex) {
System.err.println("error " + ex.getStackTrace()[0].getLineNumber() + " " + ex.getStackTrace()[0] + "\n" + ex);
}
} else if (key.equals("pois")) {
rctx.poipoints = readPoisList(value);
} else if (key.equals("heading")) {
rctx.startDirection = Integer.valueOf(value);
rctx.forceUseStartDirection = true;
} else if (key.equals("direction")) {
rctx.startDirection = Integer.valueOf(value);
} else if (key.equals("alternativeidx")) {
rctx.setAlternativeIdx(Integer.parseInt(value));
} else if (key.equals("turnInstructionMode")) {
rctx.turnInstructionMode = Integer.parseInt(value);
} else if (key.equals("timode")) {
rctx.turnInstructionMode = Integer.parseInt(value);
} else if (key.equals("turnInstructionFormat")) {
if ("osmand".equalsIgnoreCase(value)) {
rctx.turnInstructionMode = 3;
} else if ("locus".equalsIgnoreCase(value)) {
rctx.turnInstructionMode = 2;
}
} else if (key.equals("exportWaypoints")) {
rctx.exportWaypoints = (Integer.parseInt(value) == 1);
} else if (key.equals("format")) {
rctx.outputFormat = value.toLowerCase();
} else if (key.equals("trackFormat")) {
rctx.outputFormat = value.toLowerCase();
} else if (key.startsWith("profile:")) {
if (rctx.keyValues == null) rctx.keyValues = new HashMap<>();
rctx.keyValues.put(key.substring(8), value);
}
// ignore other params
}
}
}
/**
* fill profile parameter list
*
* @param rctx the routing context
* @param params the list of parameters
*/
public void setProfileParams(RoutingContext rctx, Map<String, String> params) {
if (params != null) {
if (params.size() == 0) return;
if (rctx.keyValues == null) rctx.keyValues = new HashMap<>();
for (Map.Entry<String, String> e : params.entrySet()) {
String key = e.getKey();
String value = e.getValue();
if (DEBUG) System.out.println("params " + key + " " + value);
rctx.keyValues.put(key, value);
}
}
}
private void parseNogoPolygons(String polygons, List<OsmNodeNamed> result, boolean closed) {
if (polygons != null) {
String[] polygonList = polygons.split("\\|");
for (int i = 0; i < polygonList.length; i++) {
String[] lonLatList = polygonList[i].split(",");
if (lonLatList.length > 1) {
OsmNogoPolygon polygon = new OsmNogoPolygon(closed);
int j;
for (j = 0; j < 2 * (lonLatList.length / 2) - 1; ) {
String slon = lonLatList[j++];
String slat = lonLatList[j++];
int lon = (int) ((Double.parseDouble(slon) + 180.) * 1000000. + 0.5);
int lat = (int) ((Double.parseDouble(slat) + 90.) * 1000000. + 0.5);
polygon.addVertex(lon, lat);
}
String nogoWeight = "NaN";
if (j < lonLatList.length) {
nogoWeight = lonLatList[j];
}
polygon.nogoWeight = Double.parseDouble(nogoWeight);
if (polygon.points.size() > 0) {
polygon.calcBoundingCircle();
result.add(polygon);
}
}
}
}
}
public List<OsmNodeNamed> readPoisList(String pois) {
// lon,lat,name|...
if (pois == null) return null;
String[] lonLatNameList = pois.split("\\|");
List<OsmNodeNamed> poisList = new ArrayList<>();
for (int i = 0; i < lonLatNameList.length; i++) {
String[] lonLatName = lonLatNameList[i].split(",");
if (lonLatName.length != 3)
continue;
OsmNodeNamed n = new OsmNodeNamed();
n.ilon = (int) ((Double.parseDouble(lonLatName[0]) + 180.) * 1000000. + 0.5);
n.ilat = (int) ((Double.parseDouble(lonLatName[1]) + 90.) * 1000000. + 0.5);
n.name = lonLatName[2];
poisList.add(n);
}
return poisList;
}
public List<OsmNodeNamed> readNogoList(String nogos) {
// lon,lat,radius[,weight]|...
if (nogos == null) return null;
String[] lonLatRadList = nogos.split("\\|");
List<OsmNodeNamed> nogoList = new ArrayList<>();
for (int i = 0; i < lonLatRadList.length; i++) {
String[] lonLatRad = lonLatRadList[i].split(",");
String nogoWeight = "NaN";
if (lonLatRad.length > 3) {
nogoWeight = lonLatRad[3];
}
nogoList.add(readNogo(lonLatRad[0], lonLatRad[1], lonLatRad[2], nogoWeight));
}
return nogoList;
}
public List<OsmNodeNamed> readNogos(String nogoLons, String nogoLats, String nogoRadi) {
if (nogoLons == null || nogoLats == null || nogoRadi == null) return null;
List<OsmNodeNamed> nogoList = new ArrayList<>();
String[] lons = nogoLons.split(",");
String[] lats = nogoLats.split(",");
String[] radi = nogoRadi.split(",");
String nogoWeight = "undefined";
for (int i = 0; i < lons.length && i < lats.length && i < radi.length; i++) {
OsmNodeNamed n = readNogo(lons[i].trim(), lats[i].trim(), radi[i].trim(), nogoWeight);
nogoList.add(n);
}
return nogoList;
}
private OsmNodeNamed readNogo(String lon, String lat, String radius, String nogoWeight) {
double weight = "undefined".equals(nogoWeight) ? Double.NaN : Double.parseDouble(nogoWeight);
return readNogo(Double.parseDouble(lon), Double.parseDouble(lat), (int) Double.parseDouble(radius), weight);
}
private OsmNodeNamed readNogo(double lon, double lat, int radius, double nogoWeight) {
OsmNodeNamed n = new OsmNodeNamed();
n.name = "nogo" + radius;
n.ilon = (int) ((lon + 180.) * 1000000. + 0.5);
n.ilat = (int) ((lat + 90.) * 1000000. + 0.5);
n.isNogo = true;
n.nogoWeight = nogoWeight;
return n;
}
}
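The `readPosition` and `readNogo` helpers above all use the same fixed-point coordinate encoding: degrees are shifted into a positive range (+180 for longitude, +90 for latitude), scaled by 10^6 and rounded to an `int`. A standalone sketch of that encoding and its inverse (class and method names are illustrative, not part of the BRouter API):

```java
// Illustrative sketch of the micro-degree fixed-point encoding used by
// RoutingParamCollector: shift into a positive range, scale by 1e6, round.
public class CoordEncoding {
  static int toIlon(double lon) {
    return (int) ((lon + 180.) * 1000000. + 0.5);
  }

  static int toIlat(double lat) {
    return (int) ((lat + 90.) * 1000000. + 0.5);
  }

  static double fromIlon(int ilon) {
    return ilon / 1000000. - 180.;
  }

  static double fromIlat(int ilat) {
    return ilat / 1000000. - 90.;
  }

  public static void main(String[] args) {
    int ilon = toIlon(4.3517);  // Brussels
    int ilat = toIlat(50.8503);
    // round-trip error is bounded by half a micro-degree
    if (Math.abs(fromIlon(ilon) - 4.3517) > 1e-6) throw new AssertionError();
    if (Math.abs(fromIlat(ilat) - 50.8503) > 1e-6) throw new AssertionError();
    System.out.println(ilon + " " + ilat); // prints "184351700 140850300"
  }
}
```

Storing coordinates as positive `int` micro-degrees keeps node comparisons and distance arithmetic in fast integer math; the `+ 0.5` rounds to the nearest micro-degree instead of truncating.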


@@ -8,83 +8,80 @@ package btools.router;
import btools.mapaccess.OsmNode;
public final class SearchBoundary {
private int minlon0;
private int minlat0;
private int maxlon0;
private int maxlat0;
private int minlon;
private int minlat;
private int maxlon;
private int maxlat;
private int radius;
private OsmNode p;
int direction;
/**
* @param radius Search radius in meters.
*/
public SearchBoundary(OsmNode n, int radius, int direction) {
this.radius = radius;
this.direction = direction;
p = new OsmNode(n.ilon, n.ilat);
int lon = (n.ilon / 5000000) * 5000000;
int lat = (n.ilat / 5000000) * 5000000;
minlon0 = lon - 5000000;
minlat0 = lat - 5000000;
maxlon0 = lon + 10000000;
maxlat0 = lat + 10000000;
minlon = lon - 1000000;
minlat = lat - 1000000;
maxlon = lon + 6000000;
maxlat = lat + 6000000;
}
public static String getFileName(OsmNode n) {
int lon = (n.ilon / 5000000) * 5000000;
int lat = (n.ilat / 5000000) * 5000000;
int dlon = lon / 1000000 - 180;
int dlat = lat / 1000000 - 90;
String slon = dlon < 0 ? "W" + (-dlon) : "E" + dlon;
String slat = dlat < 0 ? "S" + (-dlat) : "N" + dlat;
return slon + "_" + slat + ".trf";
}
public boolean isInBoundary(OsmNode n, int cost) {
if (radius > 0) {
return n.calcDistance(p) < radius;
}
if (cost == 0) {
return n.ilon > minlon0 && n.ilon < maxlon0 && n.ilat > minlat0 && n.ilat < maxlat0;
}
return n.ilon > minlon && n.ilon < maxlon && n.ilat > minlat && n.ilat < maxlat;
}
public int getBoundaryDistance(OsmNode n) {
switch (direction) {
case 0:
return n.calcDistance(new OsmNode(n.ilon, minlat));
case 1:
return n.calcDistance(new OsmNode(minlon, n.ilat));
case 2:
return n.calcDistance(new OsmNode(n.ilon, maxlat));
case 3:
return n.calcDistance(new OsmNode(maxlon, n.ilat));
default:
throw new IllegalArgumentException("undefined direction: " + direction);
}
}
}
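`getFileName` above snaps a node's micro-degree coordinates down to a 5-degree grid and derives the `.trf` file name from the tile's south-west corner in signed degrees. A standalone sketch of the same computation (the helper class is illustrative; the logic mirrors the method shown above):

```java
// Sketch of the 5-degree tile naming used by SearchBoundary.getFileName:
// micro-degree coordinates are snapped down to a 5-degree grid, then
// shifted back to signed degrees for the E/W + N/S file name.
public class TileName {
  static String fileName(int ilon, int ilat) {
    int lon = (ilon / 5000000) * 5000000; // snap to 5-degree grid (micro-degrees)
    int lat = (ilat / 5000000) * 5000000;
    int dlon = lon / 1000000 - 180;       // back to signed degrees
    int dlat = lat / 1000000 - 90;
    String slon = dlon < 0 ? "W" + (-dlon) : "E" + dlon;
    String slat = dlat < 0 ? "S" + (-dlat) : "N" + dlat;
    return slon + "_" + slat + ".trf";
  }

  public static void main(String[] args) {
    // 4.35 E, 50.85 N in micro-degree encoding -> the E0..E5 / N50..N55 tile
    System.out.println(fileName(184351700, 140850300)); // prints "E0_N50.trf"
  }
}
```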


@@ -12,15 +12,12 @@ import btools.expressions.BExpressionContextNode;
import btools.expressions.BExpressionContextWay;
final class StdModel extends OsmPathModel {
public OsmPrePath createPrePath() {
return null;
}
public OsmPath createPath() {
return new StdPath();
}
@@ -29,11 +26,10 @@ final class StdModel extends OsmPathModel
@Override
public void init(BExpressionContextWay expctxWay, BExpressionContextNode expctxNode, Map<String, String> keyValues) {
ctxWay = expctxWay;
ctxNode = expctxNode;
BExpressionContext expctxGlobal = expctxWay; // just one of them...
}


@@ -5,10 +5,7 @@
*/
package btools.router;
import btools.util.FastMath;
final class StdPath extends OsmPath {
/**
* The elevation-hysteresis-buffer (0-10 m)
*/
@@ -19,13 +16,15 @@ final class StdPath extends OsmPath
private float totalEnergy; // total route energy (Joule)
private float elevation_buffer; // just another elevation buffer (for travel time)
private int uphillcostdiv;
private int downhillcostdiv;
// Gravitational constant, g
private static final double GRAVITY = 9.81; // in meters per second^(-2)
@Override
public void init(OsmPath orig) {
StdPath origin = (StdPath) orig;
this.ehbd = origin.ehbd;
this.ehbu = origin.ehbu;
this.totalTime = origin.totalTime;
@@ -34,34 +33,63 @@ final class StdPath extends OsmPath
}
@Override
protected void resetState() {
ehbd = 0;
ehbu = 0;
totalTime = 0.f;
totalEnergy = 0.f;
uphillcostdiv = 0;
downhillcostdiv = 0;
elevation_buffer = 0.f;
}
@Override
protected double processWaySection(RoutingContext rc, double distance, double delta_h, double elevation, double angle, double cosangle, boolean isStartpoint, int nsection, int lastpriorityclassifier) {
// calculate the costfactor inputs
float turncostbase = rc.expctxWay.getTurncost();
float uphillcutoff = rc.expctxWay.getUphillcutoff() * 10000;
float downhillcutoff = rc.expctxWay.getDownhillcutoff() * 10000;
float uphillmaxslope = rc.expctxWay.getUphillmaxslope() * 10000;
float downhillmaxslope = rc.expctxWay.getDownhillmaxslope() * 10000;
float cfup = rc.expctxWay.getUphillCostfactor();
float cfdown = rc.expctxWay.getDownhillCostfactor();
float cf = rc.expctxWay.getCostfactor();
cfup = cfup == 0.f ? cf : cfup;
cfdown = cfdown == 0.f ? cf : cfdown;
downhillcostdiv = (int) rc.expctxWay.getDownhillcost();
if (downhillcostdiv > 0) {
downhillcostdiv = 1000000 / downhillcostdiv;
}
int downhillmaxslopecostdiv = (int) rc.expctxWay.getDownhillmaxslopecost();
if (downhillmaxslopecostdiv > 0) {
downhillmaxslopecostdiv = 1000000 / downhillmaxslopecostdiv;
} else {
// if not given, use legacy behavior
downhillmaxslopecostdiv = downhillcostdiv;
}
uphillcostdiv = (int) rc.expctxWay.getUphillcost();
if (uphillcostdiv > 0) {
uphillcostdiv = 1000000 / uphillcostdiv;
}
int uphillmaxslopecostdiv = (int) rc.expctxWay.getUphillmaxslopecost();
if (uphillmaxslopecostdiv > 0) {
uphillmaxslopecostdiv = 1000000 / uphillmaxslopecostdiv;
} else {
// if not given, use legacy behavior
uphillmaxslopecostdiv = uphillcostdiv;
}
int dist = (int) distance; // legacy arithmetic needs int
// penalty for turning angle
int turncost = (int) ((1. - cosangle) * turncostbase + 0.2); // e.g. turncost=90 -> 90 degree = 90m penalty
if (message != null) {
message.linkturncost += turncost;
message.turnangle = (float) angle;
}
double sectionCost = turncost;
@@ -70,81 +98,78 @@ final class StdPath extends OsmPath
// only the part of the descend that does not fit into the elevation-hysteresis-buffers
// leads to an immediate penalty
int delta_h_micros = (int)(1000000. * delta_h);
ehbd += -delta_h_micros - dist * rc.downhillcutoff;
ehbu += delta_h_micros - dist * rc.uphillcutoff;
int delta_h_micros = (int) (1000000. * delta_h);
ehbd += -delta_h_micros - dist * downhillcutoff;
ehbu += delta_h_micros - dist * uphillcutoff;
float downweight = 0.f;
if (ehbd > rc.elevationpenaltybuffer) {
downweight = 1.f;
int excess = ehbd - rc.elevationpenaltybuffer;
int reduce = dist * rc.elevationbufferreduce;
if (reduce > excess) {
downweight = ((float) excess) / reduce;
reduce = excess;
}
excess = ehbd - rc.elevationmaxbuffer;
if (reduce < excess) {
reduce = excess;
}
ehbd -= reduce;
if ( rc.downhillcostdiv > 0 )
{
int elevationCost = reduce/rc.downhillcostdiv;
float elevationCost = 0.f;
if (downhillcostdiv > 0) {
elevationCost += Math.min(reduce, dist * downhillmaxslope) / downhillcostdiv;
}
if (downhillmaxslopecostdiv > 0) {
elevationCost += Math.max(0, reduce - dist * downhillmaxslope) / downhillmaxslopecostdiv;
}
if (elevationCost > 0) {
sectionCost += elevationCost;
if (message != null) {
message.linkelevationcost += elevationCost;
}
}
}
} else if (ehbd < 0) {
ehbd = 0;
}
float upweight = 0.f;
if (ehbu > rc.elevationpenaltybuffer) {
upweight = 1.f;
int excess = ehbu - rc.elevationpenaltybuffer;
int reduce = dist * rc.elevationbufferreduce;
if (reduce > excess) {
upweight = ((float) excess) / reduce;
reduce = excess;
}
excess = ehbu - rc.elevationmaxbuffer;
if (reduce < excess) {
reduce = excess;
}
ehbu -= reduce;
if ( rc.uphillcostdiv > 0 )
{
int elevationCost = reduce/rc.uphillcostdiv;
float elevationCost = 0.f;
if (uphillcostdiv > 0) {
elevationCost += Math.min(reduce, dist * uphillmaxslope) / uphillcostdiv;
}
if (uphillmaxslopecostdiv > 0) {
elevationCost += Math.max(0, reduce - dist * uphillmaxslope) / uphillmaxslopecostdiv;
}
if (elevationCost > 0) {
sectionCost += elevationCost;
if (message != null) {
message.linkelevationcost += elevationCost;
}
}
}
} else if (ehbu < 0) {
ehbu = 0;
}
// get the effective costfactor (slope dependent)
float costfactor = cfup * upweight + cf * (1.f - upweight - downweight) + cfdown * downweight;
if (message != null) {
message.costfactor = costfactor;
}
@@ -154,22 +179,18 @@ final class StdPath extends OsmPath
}
@Override
protected double processTargetNode(RoutingContext rc) {
// finally add node-costs for target node
if (targetNode.nodeDescription != null) {
boolean nodeAccessGranted = rc.expctxWay.getNodeAccessGranted() != 0.;
rc.expctxNode.evaluate(nodeAccessGranted, targetNode.nodeDescription);
float initialcost = rc.expctxNode.getInitialcost();
if (initialcost >= 1000000.) {
return -1.;
}
if (message != null) {
message.linknodecost += (int) initialcost;
message.nodeKeyValues = rc.expctxNode.getKeyValueDescription(nodeAccessGranted, targetNode.nodeDescription);
}
return initialcost;
}
@@ -177,118 +198,88 @@ final class StdPath extends OsmPath
}
@Override
public int elevationCorrection( RoutingContext rc )
{
return ( rc.downhillcostdiv > 0 ? ehbd/rc.downhillcostdiv : 0 )
+ ( rc.uphillcostdiv > 0 ? ehbu/rc.uphillcostdiv : 0 );
public int elevationCorrection() {
return (downhillcostdiv > 0 ? ehbd / downhillcostdiv : 0)
+ (uphillcostdiv > 0 ? ehbu / uphillcostdiv : 0);
}
@Override
public boolean definitlyWorseThan( OsmPath path, RoutingContext rc )
{
StdPath p = (StdPath)path;
public boolean definitlyWorseThan(OsmPath path) {
StdPath p = (StdPath) path;
int c = p.cost;
if ( rc.downhillcostdiv > 0 )
{
int delta = p.ehbd - ehbd;
if ( delta > 0 ) c += delta/rc.downhillcostdiv;
}
if ( rc.uphillcostdiv > 0 )
{
int delta = p.ehbu - ehbu;
if ( delta > 0 ) c += delta/rc.uphillcostdiv;
}
int c = p.cost;
if (p.downhillcostdiv > 0) {
int delta = p.ehbd / p.downhillcostdiv - (downhillcostdiv > 0 ? ehbd / downhillcostdiv : 0);
if (delta > 0) c += delta;
}
if (p.uphillcostdiv > 0) {
int delta = p.ehbu / p.uphillcostdiv - (uphillcostdiv > 0 ? ehbu / uphillcostdiv : 0);
if (delta > 0) c += delta;
}
return cost > c;
}
private double calcIncline(double dist) {
double min_delta = 3.;
double shift;
if ( elevation_buffer > min_delta )
{
double shift = 0.;
if (elevation_buffer > min_delta) {
shift = -min_delta;
} else if (elevation_buffer < -min_delta) {
shift = min_delta;
}
else if ( elevation_buffer < min_delta )
{
shift = -min_delta;
}
else
{
return 0.;
}
double decayFactor = FastMath.exp( - dist / 100. );
float new_elevation_buffer = (float)( (elevation_buffer+shift) * decayFactor - shift);
double incline = ( elevation_buffer - new_elevation_buffer ) / dist;
double decayFactor = Math.exp(-dist / 100.);
float new_elevation_buffer = (float) ((elevation_buffer + shift) * decayFactor - shift);
double incline = (elevation_buffer - new_elevation_buffer) / dist;
elevation_buffer = new_elevation_buffer;
return incline;
}
@Override
protected void computeKinematic(RoutingContext rc, double dist, double delta_h, boolean detailMode) {
if (!detailMode) {
return;
}
// compute incline
elevation_buffer += delta_h;
double incline = calcIncline(dist);
double wayMaxspeed;
wayMaxspeed = rc.expctxWay.getMaxspeed() / 3.6f;
if (wayMaxspeed == 0)
{
wayMaxspeed = rc.maxSpeed;
double maxSpeed = rc.maxSpeed;
double speedLimit = rc.expctxWay.getMaxspeed() / 3.6f;
if (speedLimit > 0) {
maxSpeed = Math.min(maxSpeed, speedLimit);
}
wayMaxspeed = Math.min(wayMaxspeed,rc.maxSpeed);
double speed = maxSpeed; // Travel speed
double f_roll = rc.totalMass * GRAVITY * (rc.defaultC_r + incline);
if (rc.footMode) {
// Use Tobler's hiking function for walking sections
speed = rc.maxSpeed * Math.exp(-3.5 * Math.abs(incline + 0.05));
} else if (rc.bikeMode) {
speed = solveCubic(rc.S_C_x, f_roll, rc.bikerPower);
speed = Math.min(speed, maxSpeed);
}
float dt = (float) (dist / speed);
totalTime += dt;
// Calc energy assuming biking (no good model yet for hiking)
// (Count only positive, negative would mean breaking to enforce maxspeed)
double energy = dist * (rc.S_C_x * speed * speed + f_roll);
if (energy > 0.) {
totalEnergy += energy;
}
}
private static double solveCubic(double a, double c, double d) {
// Solves a * v^3 + c * v = d with a Newton method
// to get the speed v for the section.
double v = 8.;
boolean findingStartvalue = true;
for (int i = 0; i < 10; i++) {
double y = (a * v * v + c) * v - d;
if (y < .1) {
if (findingStartvalue) {
v *= 2.;
continue;
}
@ -302,14 +293,12 @@ final class StdPath extends OsmPath
}
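solveCubic above finds the section speed from the power balance a·v³ + c·v = d. Since a and c are positive the cubic is monotone with a single positive root, so a plain Newton iteration converges; a standalone sketch (initial guess of 8 m/s as in the routing code, but omitting the original's start-value heuristic):

```java
// Newton iteration for a*v^3 + c*v = d (a, c > 0, so the cubic is monotone
// and has exactly one positive root). Standalone sketch, not the original code.
public class CubicSketch {
  static double solveCubic(double a, double c, double d) {
    double v = 8.; // initial guess, as in the routing code
    for (int i = 0; i < 50; i++) {
      double y = (a * v * v + c) * v - d; // f(v)
      double yPrime = 3. * a * v * v + c; // f'(v)
      double step = y / yPrime;
      v -= step;
      if (Math.abs(step) < 1e-9) break;
    }
    return v;
  }

  public static void main(String[] args) {
    // solve 2*v^3 + 3*v = 63  ->  v = 3
    double v = solveCubic(2., 3., 63.);
    assert Math.abs(v - 3.) < 1e-6;
    System.out.println("v=" + v);
  }
}
```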
@Override
public double getTotalTime() {
return totalTime;
}
@Override
public double getTotalEnergy() {
return totalEnergy;
}
}
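The foot branch in computeKinematic uses Tobler's hiking function, speed = base · exp(-3.5 · |incline + 0.05|), which peaks on a slight downhill. A quick numeric sketch (the base speed value is illustrative):

```java
// Tobler's hiking function as used in the foot branch above:
// walking speed is maximal at a -5% incline and decays exponentially
// with deviation from it.
public class ToblerSketch {
  static double walkSpeed(double baseSpeed, double incline) {
    return baseSpeed * Math.exp(-3.5 * Math.abs(incline + 0.05));
  }

  public static void main(String[] args) {
    double base = 1.667; // ~6 km/h in m/s, an illustrative base speed
    // the maximum is reached on a slight (-5%) downhill...
    assert walkSpeed(base, -0.05) > walkSpeed(base, 0.0);
    // ...and climbing is slower than walking on flat ground
    assert walkSpeed(base, 0.10) < walkSpeed(base, 0.0);
  }
}
```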


@ -0,0 +1,59 @@
package btools.router;
import java.util.Map;
public class SuspectInfo {
public static final int TRIGGER_DEAD_END = 1;
public static final int TRIGGER_DEAD_START = 2;
public static final int TRIGGER_NODE_BLOCK = 4;
public static final int TRIGGER_BAD_ACCESS = 8;
public static final int TRIGGER_UNK_ACCESS = 16;
public static final int TRIGGER_SHARP_EXIT = 32;
public static final int TRIGGER_SHARP_ENTRY = 64;
public static final int TRIGGER_SHARP_LINK = 128;
public static final int TRIGGER_BAD_TR = 256;
public int prio;
public int triggers;
public static void addSuspect(Map<Long, SuspectInfo> map, long id, int prio, int trigger) {
Long iD = id;
SuspectInfo info = map.get(iD);
if (info == null) {
info = new SuspectInfo();
map.put(iD, info);
}
info.prio = Math.max(info.prio, prio);
info.triggers |= trigger;
}
public static SuspectInfo addTrigger(SuspectInfo old, int prio, int trigger) {
if (old == null) {
old = new SuspectInfo();
}
old.prio = Math.max(old.prio, prio);
old.triggers |= trigger;
return old;
}
public static String getTriggerText(int triggers) {
StringBuilder sb = new StringBuilder();
addText(sb, "dead-end", triggers, TRIGGER_DEAD_END);
addText(sb, "dead-start", triggers, TRIGGER_DEAD_START);
addText(sb, "node-block", triggers, TRIGGER_NODE_BLOCK);
addText(sb, "bad-access", triggers, TRIGGER_BAD_ACCESS);
addText(sb, "unknown-access", triggers, TRIGGER_UNK_ACCESS);
addText(sb, "sharp-exit", triggers, TRIGGER_SHARP_EXIT);
addText(sb, "sharp-entry", triggers, TRIGGER_SHARP_ENTRY);
addText(sb, "sharp-link", triggers, TRIGGER_SHARP_LINK);
addText(sb, "bad-tr", triggers, TRIGGER_BAD_TR);
return sb.toString();
}
private static void addText(StringBuilder sb, String text, int mask, int bit) {
if ((bit & mask) == 0) return;
if (sb.length() > 0) sb.append(",");
sb.append(text);
}
}
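SuspectInfo encodes its triggers as power-of-two bit flags OR-ed into a single int, which getTriggerText then decodes into a comma-separated list. A small standalone sketch of that pattern (hypothetical class, only two of the flags shown):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the bit-flag accumulation used by SuspectInfo: each trigger is a
// power of two, and multiple triggers OR into one int. Hypothetical class.
public class TriggerSketch {
  static final int TRIGGER_DEAD_END = 1;
  static final int TRIGGER_NODE_BLOCK = 4;

  static List<String> decode(int triggers) {
    List<String> names = new ArrayList<>();
    if ((triggers & TRIGGER_DEAD_END) != 0) names.add("dead-end");
    if ((triggers & TRIGGER_NODE_BLOCK) != 0) names.add("node-block");
    return names;
  }

  public static void main(String[] args) {
    int triggers = 0;
    triggers |= TRIGGER_DEAD_END;   // first suspect cause
    triggers |= TRIGGER_NODE_BLOCK; // second cause ORs in without losing the first
    assert decode(triggers).contains("dead-end");
    assert decode(triggers).contains("node-block");
  }
}
```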


@ -9,8 +9,7 @@ package btools.router;
import java.util.ArrayList;
import java.util.List;
public class VoiceHint {
static final int C = 1; // continue (go straight)
static final int TL = 2; // turn left
static final int TSLL = 3; // turn slightly left
@ -20,11 +19,13 @@ public class VoiceHint
static final int TSHR = 7; // turn sharply right
static final int KL = 8; // keep left
static final int KR = 9; // keep right
static final int TLU = 10; // U-turn
static final int TRU = 11; // Right U-turn
static final int OFFR = 12; // Off route
static final int RNDB = 13; // Roundabout
static final int RNLB = 14; // Roundabout left
static final int TU = 15; // 180 degree u-turn
static final int BL = 16; // Beeline routing
int ilon;
int ilat;
@ -36,280 +37,566 @@ public class VoiceHint
double distanceToNext;
int indexInTrack;
public float getTime() {
return oldWay == null ? 0.f : oldWay.time;
}
float angle = Float.MAX_VALUE;
boolean turnAngleConsumed;
boolean needsRealTurn;
int maxBadPrio = -1;
int roundaboutExit;
boolean isRoundabout() {
return roundaboutExit != 0;
}
public void addBadWay(MessageData badWay) {
if (badWay == null) {
return;
}
if (badWays == null) {
badWays = new ArrayList<>();
}
badWays.add(badWay);
}
public int getJsonCommandIndex() {
switch (cmd) {
case TLU:
return 10;
case TU:
return 15;
case TSHL:
return 4;
case TL:
return 2;
case TSLL:
return 3;
case KL:
return 8;
case C:
return 1;
case KR:
return 9;
case TSLR:
return 6;
case TR:
return 5;
case TSHR:
return 7;
case TRU:
return 11;
case RNDB:
return 13;
case RNLB:
return 14;
case BL:
return 16;
case OFFR:
return 12;
default:
throw new IllegalArgumentException("unknown command: " + cmd);
}
}
public int getExitNumber() {
return roundaboutExit;
}
/*
* used by comment style, osmand style
*/
public String getCommandString() {
switch (cmd) {
case TLU:
return "TU"; // should be changed to TLU when osmand uses new voice hint constants
case TU:
return "TU";
case TSHL:
return "TSHL";
case TL:
return "TL";
case TSLL:
return "TSLL";
case KL:
return "KL";
case C:
return "C";
case KR:
return "KR";
case TSLR:
return "TSLR";
case TR:
return "TR";
case TSHR:
return "TSHR";
case TRU:
return "TRU";
case RNDB:
return "RNDB" + roundaboutExit;
case RNLB:
return "RNLB" + (-roundaboutExit);
case BL:
return "BL";
case OFFR:
return "OFFR";
default:
throw new IllegalArgumentException("unknown command: " + cmd);
}
}
/*
* used by trkpt/sym style
*/
public String getCommandString(int c) {
switch (c) {
case TLU:
return "TLU";
case TU:
return "TU";
case TSHL:
return "TSHL";
case TL:
return "TL";
case TSLL:
return "TSLL";
case KL:
return "KL";
case C:
return "C";
case KR:
return "KR";
case TSLR:
return "TSLR";
case TR:
return "TR";
case TSHR:
return "TSHR";
case TRU:
return "TRU";
case RNDB:
return "RNDB" + roundaboutExit;
case RNLB:
return "RNLB" + (-roundaboutExit);
case BL:
return "BL";
case OFFR:
return "OFFR";
default:
return "unknown command: " + c;
}
}
/*
* used by gpsies style
*/
public String getSymbolString() {
switch (cmd) {
case TLU:
return "TU";
case TU:
return "TU";
case TSHL:
return "TSHL";
case TL:
return "Left";
case TSLL:
return "TSLL";
case KL:
return "TSLL"; // ?
case C:
return "Straight";
case KR:
return "TSLR"; // ?
case TSLR:
return "TSLR";
case TR:
return "Right";
case TSHR:
return "TSHR";
case TRU:
return "TU";
case RNDB:
return "RNDB" + roundaboutExit;
case RNLB:
return "RNLB" + (-roundaboutExit);
case BL:
return "BL";
case OFFR:
return "OFFR";
default:
throw new IllegalArgumentException("unknown command: " + cmd);
}
}
/*
* used by new locus trkpt style
*/
public String getLocusSymbolString() {
switch (cmd) {
case TLU:
return "u-turn_left";
case TU:
return "u-turn";
case TSHL:
return "left_sharp";
case TL:
return "left";
case TSLL:
return "left_slight";
case KL:
return "stay_left"; // ?
case C:
return "straight";
case KR:
return "stay_right"; // ?
case TSLR:
return "right_slight";
case TR:
return "right";
case TSHR:
return "right_sharp";
case TRU:
return "u-turn_right";
case RNDB:
return "roundabout_e" + roundaboutExit;
case RNLB:
return "roundabout_e" + (-roundaboutExit);
case BL:
return "beeline";
default:
throw new IllegalArgumentException("unknown command: " + cmd);
}
}
/*
* used by osmand style
*/
public String getMessageString() {
switch (cmd) {
case TLU:
return "u-turn"; // should be changed to u-turn-left when osmand uses new voice hint constants
case TU:
return "u-turn";
case TSHL:
return "sharp left";
case TL:
return "left";
case TSLL:
return "slight left";
case KL:
return "keep left";
case C:
return "straight";
case KR:
return "keep right";
case TSLR:
return "slight right";
case TR:
return "right";
case TSHR:
return "sharp right";
case TRU:
return "u-turn"; // should be changed to u-turn-right when osmand uses new voice hint constants
case RNDB:
return "Take exit " + roundaboutExit;
case RNLB:
return "Take exit " + (-roundaboutExit);
default:
throw new IllegalArgumentException("unknown command: " + cmd);
}
}
/*
* used by old locus style
*/
public int getLocusAction() {
switch (cmd) {
case TLU:
return 13;
case TU:
return 12;
case TSHL:
return 5;
case TL:
return 4;
case TSLL:
return 3;
case KL:
return 9; // ?
case C:
return 1;
case KR:
return 10; // ?
case TSLR:
return 6;
case TR:
return 7;
case TSHR:
return 8;
case TRU:
return 14;
case RNDB:
return 26 + roundaboutExit;
case RNLB:
return 26 - roundaboutExit;
default:
throw new IllegalArgumentException("unknown command: " + cmd);
}
}
/*
* used by orux style
*/
public int getOruxAction() {
switch (cmd) {
case TLU:
return 1003;
case TU:
return 1003;
case TSHL:
return 1019;
case TL:
return 1000;
case TSLL:
return 1017;
case KL:
return 1015; // ?
case C:
return 1002;
case KR:
return 1014; // ?
case TSLR:
return 1016;
case TR:
return 1001;
case TSHR:
return 1018;
case TRU:
return 1003;
case RNDB:
return 1008 + roundaboutExit;
case RNLB:
return 1008 + roundaboutExit;
default:
throw new IllegalArgumentException("unknown command: " + cmd);
}
}
/*
* used by cruiser, equivalent to getCommandString() - osmand style - when osmand changes the voice hint constants
*/
public String getCruiserCommandString() {
switch (cmd) {
case TLU:
return "TLU";
case TU:
return "TU";
case TSHL:
return "TSHL";
case TL:
return "TL";
case TSLL:
return "TSLL";
case KL:
return "KL";
case C:
return "C";
case KR:
return "KR";
case TSLR:
return "TSLR";
case TR:
return "TR";
case TSHR:
return "TSHR";
case TRU:
return "TRU";
case RNDB:
return "RNDB" + roundaboutExit;
case RNLB:
return "RNLB" + (-roundaboutExit);
case BL:
return "BL";
case OFFR:
return "OFFR";
default:
throw new IllegalArgumentException("unknown command: " + cmd);
}
}
/*
* used by cruiser, equivalent to getMessageString() - osmand style - when osmand changes the voice hint constants
*/
public String getCruiserMessageString() {
switch (cmd) {
case TLU:
return "u-turn left";
case TU:
return "u-turn";
case TSHL:
return "sharp left";
case TL:
return "left";
case TSLL:
return "slight left";
case KL:
return "keep left";
case C:
return "straight";
case KR:
return "keep right";
case TSLR:
return "slight right";
case TR:
return "right";
case TSHR:
return "sharp right";
case TRU:
return "u-turn right";
case RNDB:
return "take exit " + roundaboutExit;
case RNLB:
return "take exit " + (-roundaboutExit);
case BL:
return "beeline";
case OFFR:
return "offroad";
default:
throw new IllegalArgumentException("unknown command: " + cmd);
}
}
public void calcCommand() {
float lowerBadWayAngle = -181;
float higherBadWayAngle = 181;
if (badWays != null) {
for (MessageData badWay : badWays) {
if (badWay.isBadOneway()) {
continue;
}
if (lowerBadWayAngle < badWay.turnangle && badWay.turnangle < goodWay.turnangle) {
lowerBadWayAngle = badWay.turnangle;
}
if (higherBadWayAngle > badWay.turnangle && badWay.turnangle > goodWay.turnangle) {
higherBadWayAngle = badWay.turnangle;
}
}
}
float cmdAngle = angle;
// fall back to local angle if otherwise inconsistent
//if ( lowerBadWayAngle > angle || higherBadWayAngle < angle )
//{
//cmdAngle = goodWay.turnangle;
//}
if (angle == Float.MAX_VALUE) {
cmdAngle = goodWay.turnangle;
}
if (cmd == BL) return;
if (roundaboutExit > 0) {
cmd = RNDB;
}
} else if (roundaboutExit < 0) {
cmd = RNLB;
} else if (is180DegAngle(cmdAngle) && cmdAngle <= -179.f && higherBadWayAngle == 181.f && lowerBadWayAngle == -181.f) {
cmd = TU;
} else if (cmdAngle < -159.f) {
cmd = TLU;
} else if (cmdAngle < -135.f) {
cmd = TSHL;
} else if (cmdAngle < -45.f) {
// a TL can be pushed in either direction by a close-by alternative
if (cmdAngle < -95.f && higherBadWayAngle < -30.f && lowerBadWayAngle < -180.f) {
cmd = TSHL;
} else if (cmdAngle > -85.f && lowerBadWayAngle > -180.f && higherBadWayAngle > -10.f) {
cmd = TSLL;
} else {
if (cmdAngle < -110.f) {
cmd = TSHL;
} else if (cmdAngle > -60.f) {
cmd = TSLL;
} else {
cmd = TL;
}
}
} else if (cmdAngle < -21.f) {
if (cmd != KR) { // don't overwrite KR with TSLL
cmd = TSLL;
}
} else if (cmdAngle < -5.f) {
if (lowerBadWayAngle < -100.f && higherBadWayAngle < 45.f) {
cmd = TSLL;
} else if (lowerBadWayAngle >= -100.f && higherBadWayAngle < 45.f) {
cmd = KL;
} else {
cmd = C;
}
} else if (cmdAngle < 5.f) {
if (lowerBadWayAngle > -30.f) {
cmd = KR;
} else if (higherBadWayAngle < 30.f) {
cmd = KL;
} else {
cmd = C;
}
} else if (cmdAngle < 21.f) {
// a TR can be pushed in either direction by a close-by alternative
if (lowerBadWayAngle > -45.f && higherBadWayAngle > 100.f) {
cmd = TSLR;
} else if (lowerBadWayAngle > -45.f && higherBadWayAngle <= 100.f) {
cmd = KR;
} else {
cmd = C;
}
} else if (cmdAngle < 45.f) {
cmd = TSLR;
} else if (cmdAngle < 135.f) {
if (cmdAngle < 85.f && higherBadWayAngle < 180.f && lowerBadWayAngle < 10.f) {
cmd = TSLR;
} else if (cmdAngle > 95.f && lowerBadWayAngle > 30.f && higherBadWayAngle > 180.f) {
cmd = TSHR;
} else {
if (cmdAngle > 110.) {
cmd = TSHR;
} else if (cmdAngle < 60.) {
cmd = TSLR;
} else {
cmd = TR;
}
}
} else if (cmdAngle < 159.f) {
cmd = TSHR;
} else if (is180DegAngle(cmdAngle) && cmdAngle >= 179.f && higherBadWayAngle == 181.f && lowerBadWayAngle == -181.f) {
cmd = TU;
} else {
cmd = TRU;
}
}
static boolean is180DegAngle(float angle) {
return (Math.abs(angle) <= 180.f && Math.abs(angle) >= 179.f);
}
public String formatGeometry() {
float oldPrio = oldWay == null ? 0.f : oldWay.priorityclassifier;
StringBuilder sb = new StringBuilder(30);
sb.append(' ').append((int) oldPrio);
appendTurnGeometry(sb, goodWay);
if (badWays != null) {
for (MessageData badWay : badWays) {
sb.append(" ");
appendTurnGeometry(sb, badWay);
}
}
return sb.toString();
}
private void appendTurnGeometry(StringBuilder sb, MessageData msg) {
sb.append("(").append((int) (msg.turnangle + 0.5)).append(")").append((int) (msg.priorityclassifier));
}
}
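calcCommand above maps a turn angle (degrees, negative = left) onto a command through threshold buckets, then nudges the result based on close-by alternative exits. A simplified sketch of just the main buckets, ignoring the bad-way adjustments and the 180-degree special cases:

```java
// Simplified angle-to-command bucketing mirroring the main thresholds of
// calcCommand() above, without the bad-way (alternative exit) adjustments.
public class TurnBucketSketch {
  static String command(float angle) {
    if (angle < -159.f) return "TLU";  // left u-turn
    if (angle < -135.f) return "TSHL"; // sharp left
    if (angle < -45.f) return "TL";    // left
    if (angle < -21.f) return "TSLL";  // slight left
    if (angle < 21.f) return "C";      // straight
    if (angle < 45.f) return "TSLR";   // slight right
    if (angle < 135.f) return "TR";    // right
    if (angle < 159.f) return "TSHR";  // sharp right
    return "TRU";                      // right u-turn
  }

  public static void main(String[] args) {
    assert command(0.f).equals("C");
    assert command(-90.f).equals("TL");
    assert command(150.f).equals("TSHR");
  }
}
```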


@ -9,30 +9,51 @@ package btools.router;
import java.util.ArrayList;
import java.util.List;
public class VoiceHintList {
static final int TRANS_MODE_NONE = 0;
static final int TRANS_MODE_FOOT = 1;
static final int TRANS_MODE_BIKE = 2;
static final int TRANS_MODE_CAR = 3;
private int transportMode = TRANS_MODE_BIKE;
int turnInstructionMode;
List<VoiceHint> list = new ArrayList<>();
public void setTransportMode(boolean isCar, boolean isBike) {
transportMode = isCar ? TRANS_MODE_CAR : (isBike ? TRANS_MODE_BIKE : TRANS_MODE_FOOT);
}
public void setTransportMode(int mode) {
transportMode = mode;
}
public String getTransportMode() {
String ret;
switch (transportMode) {
case TRANS_MODE_FOOT:
ret = "foot";
break;
case TRANS_MODE_CAR:
ret = "car";
break;
case TRANS_MODE_BIKE:
default:
ret = "bike";
break;
}
return ret;
}
public int transportMode() {
return transportMode;
}
public int getLocusRouteType() {
if (transportMode == TRANS_MODE_CAR) {
return 0;
}
if (transportMode == TRANS_MODE_BIKE) {
return 5;
}
return 3; // foot


@ -8,26 +8,27 @@ package btools.router;
import java.util.ArrayList;
import java.util.List;
public final class VoiceHintProcessor {
double SIGNIFICANT_ANGLE = 22.5;
double INTERNAL_CATCHING_RANGE = 2.;
// private double catchingRange; // range to catch angles and merge turns
private boolean explicitRoundabouts;
private int transportMode;
public VoiceHintProcessor(double catchingRange, boolean explicitRoundabouts, int transportMode) {
// this.catchingRange = catchingRange;
this.explicitRoundabouts = explicitRoundabouts;
this.transportMode = transportMode;
}
private float sumNonConsumedWithinCatchingRange(List<VoiceHint> inputs, int offset) {
double distance = 0.;
float angle = 0.f;
while (offset >= 0 && distance < INTERNAL_CATCHING_RANGE) {
VoiceHint input = inputs.get(offset--);
if (input.turnAngleConsumed) {
break;
}
angle += input.goodWay.turnangle;
@ -44,10 +45,10 @@ public final class VoiceHintProcessor
* order (from target to start), but output is
* returned in travel-direction and only for
* those nodes that trigger a voice hint.
*
* <p>
* Input objects are expected for every segment
* of the track, also for those without a junction
*
* <p>
* VoiceHint objects in the output list are enriched
* by the voice-command, the total angle and the distance
* to the next hint
@ -55,56 +56,91 @@ public final class VoiceHintProcessor
* @param inputs tracknodes, in reverse order
* @return voice hints, in forward order
*/
public List<VoiceHint> process(List<VoiceHint> inputs) {
List<VoiceHint> results = new ArrayList<>();
double distance = 0.;
float roundAboutTurnAngle = 0.f; // sums up angles in roundabout
int roundaboutExit = 0;
int roundaboudStartIdx = -1;
for (int hintIdx = 0; hintIdx < inputs.size(); hintIdx++) {
VoiceHint input = inputs.get(hintIdx);
if (input.cmd == VoiceHint.BL) {
results.add(input);
continue;
}
float turnAngle = input.goodWay.turnangle;
distance += input.goodWay.linkdist;
int currentPrio = input.goodWay.getPrio();
int oldPrio = input.oldWay.getPrio();
int minPrio = Math.min(oldPrio, currentPrio);
boolean isLink2Highway = input.oldWay.isLinktType() && !input.goodWay.isLinktType();
boolean isHighway2Link = !input.oldWay.isLinktType() && input.goodWay.isLinktType();
if (explicitRoundabouts && input.oldWay.isRoundabout()) {
if (roundaboudStartIdx == -1) roundaboudStartIdx = hintIdx;
roundAboutTurnAngle += sumNonConsumedWithinCatchingRange(inputs, hintIdx);
if (roundaboudStartIdx == hintIdx) {
if (input.badWays != null) {
// remove goodWay
roundAboutTurnAngle -= input.goodWay.turnangle;
// add a badWay
for (MessageData badWay : input.badWays) {
if (!badWay.isBadOneway()) roundAboutTurnAngle += badWay.turnangle;
}
}
}
boolean isExit = roundaboutExit == 0; // exit point is always exit
if (input.badWays != null) {
for (MessageData badWay : input.badWays) {
if (!badWay.isBadOneway() &&
badWay.isGoodForCars()) {
isExit = true;
}
}
}
if (isExit) {
roundaboutExit++;
}
continue;
}
if (roundaboutExit > 0) {
//roundAboutTurnAngle += sumNonConsumedWithinCatchingRange(inputs, hintIdx);
//double startTurn = (roundaboudStartIdx != -1 ? inputs.get(roundaboudStartIdx + 1).goodWay.turnangle : turnAngle);
input.angle = roundAboutTurnAngle;
input.goodWay.turnangle = roundAboutTurnAngle;
input.distanceToNext = distance;
//input.roundaboutExit = startTurn < 0 ? roundaboutExit : -roundaboutExit;
input.roundaboutExit = roundAboutTurnAngle < 0 ? roundaboutExit : -roundaboutExit;
float tmpangle = 0;
VoiceHint tmpRndAbt = new VoiceHint();
tmpRndAbt.badWays = new ArrayList<>();
for (int i = hintIdx-1; i > roundaboudStartIdx; i--) {
VoiceHint vh = inputs.get(i);
tmpangle += inputs.get(i).goodWay.turnangle;
if (vh.badWays != null) {
for (MessageData badWay : vh.badWays) {
if (!badWay.isBadOneway()) {
MessageData md = new MessageData();
md.linkdist = vh.goodWay.linkdist;
md.priorityclassifier = vh.goodWay.priorityclassifier;
md.turnangle = tmpangle;
tmpRndAbt.badWays.add(md);
}
}
}
}
distance = 0.;
input.badWays = tmpRndAbt.badWays;
results.add(input);
roundAboutTurnAngle = 0.f;
roundaboutExit = 0;
roundaboudStartIdx = -1;
continue;
}
int maxPrioAll = -1; // max prio of all detours
@ -114,134 +150,265 @@ public final class VoiceHintProcessor
float minAngle = 180.f;
float minAbsAngeRaw = 180.f;
boolean isBadwayLink = false;
if (input.badWays != null) {
for (MessageData badWay : input.badWays) {
int badPrio = badWay.getPrio();
float badTurn = badWay.turnangle;
if (badWay.isLinktType()) {
isBadwayLink = true;
}
boolean isBadHighway2Link = !input.oldWay.isLinktType() && badWay.isLinktType();
if (badPrio > maxPrioAll && !isBadHighway2Link) {
maxPrioAll = badPrio;
input.maxBadPrio = Math.max(input.maxBadPrio, badPrio);
}
if (badPrio < minPrio) {
continue; // ignore low prio ways
}
if (badWay.isBadOneway()) {
continue; // ignore wrong oneways
}
if (Math.abs(badTurn) - Math.abs(turnAngle) > 80.f) {
continue; // ways from the back should not trigger a slight turn
}
if (badWay.costfactor < 20.f && Math.abs(badTurn) < minAbsAngeRaw) {
minAbsAngeRaw = Math.abs(badTurn);
}
if (badPrio > maxPrioCandidates) {
maxPrioCandidates = badPrio;
input.maxBadPrio = Math.max(input.maxBadPrio, badPrio);
}
if (badTurn > maxAngle) {
maxAngle = badTurn;
}
if (badTurn < minAngle) {
minAngle = badTurn;
}
}
}
// boolean hasSomethingMoreStraight = (Math.abs(turnAngle) - minAbsAngeRaw) > 20.;
boolean hasSomethingMoreStraight = (Math.abs(turnAngle - minAbsAngeRaw)) > 20. && input.badWays != null; // && !ignoreBadway;
// unconditional triggers are all junctions with
// - higher detour prios than the minimum route prio (except link->highway junctions)
// - or candidate detours with higher prio then the route exit leg
boolean unconditionalTrigger = hasSomethingMoreStraight ||
(maxPrioAll > minPrio && !isLink2Highway) ||
(maxPrioCandidates > currentPrio) ||
VoiceHint.is180DegAngle(turnAngle) ||
(!isHighway2Link && isBadwayLink && Math.abs(turnAngle) > 5.f) ||
(isHighway2Link && !isBadwayLink && Math.abs(turnAngle) < 5.f);
// conditional triggers (=real turning angle required) are junctions
// with candidate detours equal in priority than the route exit leg
boolean conditionalTrigger = maxPrioCandidates >= minPrio;
if (unconditionalTrigger || conditionalTrigger) {
input.angle = turnAngle;
input.calcCommand();
boolean isStraight = input.cmd == VoiceHint.C;
input.needsRealTurn = (!unconditionalTrigger) && isStraight;
// check for KR/KL
if (Math.abs(turnAngle) > 5.) { // don't use too small angles
if (maxAngle < turnAngle && maxAngle > turnAngle - 45.f - (Math.max(turnAngle, 0.f))) {
input.cmd = VoiceHint.KR;
}
if (minAngle > turnAngle && minAngle < turnAngle + 45.f - (Math.min(turnAngle, 0.f))) {
input.cmd = VoiceHint.KL;
}
}
input.angle = sumNonConsumedWithinCatchingRange(inputs, hintIdx);
input.distanceToNext = distance;
distance = 0.;
results.add(input);
}
if (results.size() > 0 && distance < INTERNAL_CATCHING_RANGE) { //catchingRange
results.get(results.size() - 1).angle += sumNonConsumedWithinCatchingRange(inputs, hintIdx);
}
}
// go through the hint list again in reverse order (=travel direction)
// and filter out non-significant hints and hints too close to its predecessor
List<VoiceHint> results2 = new ArrayList<>();
int i = results.size();
while (i > 0) {
VoiceHint hint = results.get(--i);
if (hint.cmd == 0) {
hint.calcCommand();
}
if (!(hint.needsRealTurn && (hint.cmd == VoiceHint.C || hint.cmd == VoiceHint.BL))) {
double dist = hint.distanceToNext;
// sum up other hints within the catching range (e.g. 40m)
while (dist < INTERNAL_CATCHING_RANGE && i > 0) {
VoiceHint h2 = results.get(i - 1);
dist = h2.distanceToNext;
hint.distanceToNext += dist;
hint.angle += h2.angle;
i--;
if (h2.isRoundabout()) { // if we hit a roundabout, use that as the trigger
h2.angle = hint.angle;
hint = h2;
break;
}
}
if (!explicitRoundabouts) {
hint.roundaboutExit = 0; // use an angular hint instead
}
hint.calcCommand();
results2.add(hint);
} else if (hint.cmd == VoiceHint.BL) {
results2.add(hint);
} else {
if (results2.size() > 0)
results2.get(results2.size() - 1).distanceToNext += hint.distanceToNext;
}
}
return results2;
}
public List<VoiceHint> postProcess(List<VoiceHint> inputs, double catchingRange, double minRange) {
List<VoiceHint> results = new ArrayList<>();
double distance = 0;
VoiceHint inputLast = null;
VoiceHint inputLastSaved = null;
for (int hintIdx = 0; hintIdx < inputs.size(); hintIdx++) {
VoiceHint input = inputs.get(hintIdx);
VoiceHint nextInput = null;
if (hintIdx + 1 < inputs.size()) {
nextInput = inputs.get(hintIdx + 1);
}
if (nextInput == null) {
if (input.cmd == VoiceHint.C && !input.goodWay.isLinktType()) {
if (input.goodWay.getPrio() < input.maxBadPrio && (inputLastSaved != null && inputLastSaved.distanceToNext > catchingRange)) {
results.add(input);
} else {
if (inputLast != null) { // if dropped, add its distance to the last hint
inputLast.distanceToNext += input.distanceToNext;
}
continue;
}
} else {
results.add(input);
}
} else {
if ((inputLastSaved != null && inputLastSaved.distanceToNext > catchingRange) || input.distanceToNext > catchingRange) {
if (input.cmd == VoiceHint.C && !input.goodWay.isLinktType()) {
if (input.goodWay.getPrio() < input.maxBadPrio
&& (inputLastSaved != null && inputLastSaved.distanceToNext > minRange)
&& (input.distanceToNext > minRange)) {
// add only based on way priority
results.add(input);
inputLastSaved = input;
} else {
if (inputLastSaved != null) { // if dropped, add its distance to the last saved hint
inputLastSaved.distanceToNext += input.distanceToNext;
}
}
} else {
// add all others
// ignore motorway / primary continue
if (((input.goodWay.getPrio() != 28) &&
(input.goodWay.getPrio() != 30) &&
(input.goodWay.getPrio() != 26))
|| input.isRoundabout()
|| Math.abs(input.angle) > 21.f) {
results.add(input);
inputLastSaved = input;
} else {
if (inputLastSaved != null) { // if dropped, add its distance to the last saved hint
inputLastSaved.distanceToNext += input.distanceToNext;
}
}
}
} else if (input.distanceToNext < catchingRange) {
double dist = input.distanceToNext;
float angles = input.angle;
int i = 1;
boolean save = false;
dist += nextInput.distanceToNext;
angles += nextInput.angle;
if (input.cmd == VoiceHint.C && !input.goodWay.isLinktType()) {
if (input.goodWay.getPrio() < input.maxBadPrio) {
if (inputLastSaved != null && inputLastSaved.cmd != VoiceHint.C
&& (inputLastSaved != null && inputLastSaved.distanceToNext > minRange)
&& transportMode != VoiceHintList.TRANS_MODE_CAR) {
// add when straight and not linktype
// and last vh not straight
save = true;
// remove when next straight and not linktype
if (nextInput != null &&
nextInput.cmd == VoiceHint.C &&
!nextInput.goodWay.isLinktType()) {
input.distanceToNext += nextInput.distanceToNext;
hintIdx++;
}
}
} else {
if (inputLastSaved != null) { // if dropped, add its distance to the last saved hint
inputLastSaved.distanceToNext += input.distanceToNext;
}
}
} else if (VoiceHint.is180DegAngle(input.angle)) {
// add u-turn, 180 degree
save = true;
} else if (transportMode == VoiceHintList.TRANS_MODE_CAR && Math.abs(angles) > 180 - SIGNIFICANT_ANGLE) {
// add when in car mode and a u-turn; collects e.g. two left turns within range
input.angle = angles;
input.calcCommand();
input.distanceToNext += nextInput.distanceToNext;
save = true;
hintIdx++;
} else if (Math.abs(angles) < SIGNIFICANT_ANGLE && input.distanceToNext < minRange) {
input.angle = angles;
input.calcCommand();
input.distanceToNext += nextInput.distanceToNext;
save = true;
hintIdx++;
} else if (Math.abs(input.angle) > SIGNIFICANT_ANGLE) {
// add when angle above 22.5 deg
save = true;
} else if (Math.abs(input.angle) < SIGNIFICANT_ANGLE) {
// add when angle below 22.5 deg ???
// save = true;
} else {
// otherwise ignore but add distance to next
if (nextInput != null) { // if dropped, add its distance to the next hint
nextInput.distanceToNext += input.distanceToNext;
}
save = false;
}
if (save) {
results.add(input); // add when last
inputLastSaved = input;
}
} else {
results.add(input);
inputLastSaved = input;
}
}
inputLast = input;
}
return results;
}
}
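The catching-range handling above folds consecutive voice hints that lie closer together than the range, summing their distances and angles into one hint. A minimal standalone sketch of that folding (class and field names are hypothetical; the real code also iterates backward and special-cases roundabouts):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical simplification of the catching-range merge: while the gap to
// the next hint is below the range, its distance and angle are summed into
// the current hint.
public class CatchingRangeSketch {
  static final double CATCHING_RANGE = 40.0; // meters, per the "e.g. 40m" comment

  // each entry is {distanceToNext, angle}; returns the folded hints
  static List<double[]> merge(List<double[]> hints) {
    List<double[]> out = new ArrayList<>();
    int i = 0;
    while (i < hints.size()) {
      double total = hints.get(i)[0];
      double angle = hints.get(i)[1];
      double last = total;
      // fold successors while the last consumed gap is inside the range
      while (last < CATCHING_RANGE && i + 1 < hints.size()) {
        double[] next = hints.get(++i);
        last = next[0];
        total += next[0];
        angle += next[1];
      }
      out.add(new double[]{total, angle});
      i++;
    }
    return out;
  }

  public static void main(String[] args) {
    List<double[]> hints = new ArrayList<>();
    hints.add(new double[]{10, 45});
    hints.add(new double[]{100, -30});
    hints.add(new double[]{5, 90});
    hints.add(new double[]{10, 20});
    for (double[] h : merge(hints)) {
      System.out.println(h[0] + " " + h[1]); // prints "110.0 15.0" then "15.0 110.0"
    }
  }
}
```

Note that, as in the original, the range check uses the gap of the most recently consumed hint, so a distant hint can still be absorbed when the gap leading to it was small.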

View file

@@ -1,22 +1,18 @@
package btools.router;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;
import btools.util.CheapRuler;
public class OsmNodeNamedTest {
static int toOsmLon(double lon) {
return (int) ((lon + 180.) / CheapRuler.ILATLNG_TO_LATLNG + 0.5);
}
static int toOsmLat(double lat) {
return (int) ((lat + 90.) / CheapRuler.ILATLNG_TO_LATLNG + 0.5);
}
@Test

View file

@@ -1,6 +1,6 @@
/**********************************************************************************************
Copyright (C) 2018 Norbert Truchsess norbert.truchsess@t-online.de
**********************************************************************************************/
package btools.router;
import static org.junit.Assert.assertEquals;
@@ -11,7 +11,6 @@ import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;
import btools.router.OsmNogoPolygon.Point;
import btools.util.CheapRuler;
public class OsmNogoPolygonTest {
@@ -22,26 +21,26 @@ public class OsmNogoPolygonTest {
static OsmNogoPolygon polygon;
static OsmNogoPolygon polyline;
static final double[] lons = {1.0, 1.0, 0.5, 0.5, 1.0, 1.0, -1.1, -1.0};
static final double[] lats = {-1.0, -0.1, -0.1, 0.1, 0.1, 1.0, 1.1, -1.0};
static int toOsmLon(double lon, int offset_x) {
return (int) ((lon + 180.) * 1000000. + 0.5) + offset_x; // see ServerHandler.readPosition()
}
static int toOsmLat(double lat, int offset_y) {
return (int) ((lat + 90.) * 1000000. + 0.5) + offset_y;
}
@BeforeClass
public static void setUp() throws Exception {
polygon = new OsmNogoPolygon(true);
for (int i = 0; i < lons.length; i++) {
polygon.addVertex(toOsmLon(lons[i], OFFSET_X), toOsmLat(lats[i], OFFSET_Y));
}
polyline = new OsmNogoPolygon(false);
for (int i = 0; i < lons.length; i++) {
polyline.addVertex(toOsmLon(lons[i], OFFSET_X), toOsmLat(lats[i], OFFSET_Y));
}
}
@@ -51,162 +50,162 @@ public class OsmNogoPolygonTest {
@Test
public void testCalcBoundingCircle() {
double[] lonlat2m = CheapRuler.getLonLatToMeterScales(polygon.ilat);
double dlon2m = lonlat2m[0];
double dlat2m = lonlat2m[1];
polygon.calcBoundingCircle();
double r = polygon.radius;
for (int i = 0; i < lons.length; i++) {
double dpx = (toOsmLon(lons[i], OFFSET_X) - polygon.ilon) * dlon2m;
double dpy = (toOsmLat(lats[i], OFFSET_Y) - polygon.ilat) * dlat2m;
double r1 = Math.sqrt(dpx * dpx + dpy * dpy);
double diff = r - r1;
assertTrue("i: " + i + " r(" + r + ") >= r1(" + r1 + ")", diff >= 0);
}
polyline.calcBoundingCircle();
r = polyline.radius;
for (int i = 0; i < lons.length; i++) {
double dpx = (toOsmLon(lons[i], OFFSET_X) - polyline.ilon) * dlon2m;
double dpy = (toOsmLat(lats[i], OFFSET_Y) - polyline.ilat) * dlat2m;
double r1 = Math.sqrt(dpx * dpx + dpy * dpy);
double diff = r - r1;
assertTrue("i: " + i + " r(" + r + ") >= r1(" + r1 + ")", diff >= 0);
}
}
@Test
public void testIsWithin() {
double[] plons = {0.0, 0.5, 1.0, -1.5, -0.5, 1.0, 1.0, 0.5, 0.5, 0.5};
double[] plats = {0.0, 1.5, 0.0, 0.5, -1.5, -1.0, -0.1, -0.1, 0.0, 0.1};
boolean[] within = {true, false, false, false, false, true, true, true, true, true};
for (int i = 0; i < plons.length; i++) {
assertEquals("(" + plons[i] + "," + plats[i] + ")", within[i], polygon.isWithin(toOsmLon(plons[i], OFFSET_X), toOsmLat(plats[i], OFFSET_Y)));
}
}
@Test
public void testIntersectsPolygon() {
double[] p0lons = {0.0, 1.0, -0.5, 0.5, 0.7, 0.7, 0.7, -1.5, -1.5, 0.0};
double[] p0lats = {0.0, 0.0, 0.5, 0.5, 0.5, 0.05, 0.05, -1.5, 0.2, 0.0};
double[] p1lons = {0.0, 1.0, 0.5, 1.0, 0.7, 0.7, 0.7, -0.5, -0.2, 0.5};
double[] p1lats = {0.0, 0.0, 0.5, 0.5, -0.5, -0.5, -0.05, -0.5, 1.5, -1.5};
boolean[] within = {false, false, false, true, true, true, false, true, true, true};
for (int i = 0; i < p0lons.length; i++) {
assertEquals("(" + p0lons[i] + "," + p0lats[i] + ")-(" + p1lons[i] + "," + p1lats[i] + ")", within[i], polygon.intersects(toOsmLon(p0lons[i], OFFSET_X), toOsmLat(p0lats[i], OFFSET_Y), toOsmLon(p1lons[i], OFFSET_X), toOsmLat(p1lats[i], OFFSET_Y)));
}
}
@Test
public void testIntersectsPolyline() {
double[] p0lons = {0.0, 1.0, -0.5, 0.5, 0.7, 0.7, 0.7, -1.5, -1.5, 0.0};
double[] p0lats = {0.0, 0.0, 0.5, 0.5, 0.5, 0.05, 0.05, -1.5, 0.2, 0.0};
double[] p1lons = {0.0, 1.0, 0.5, 1.0, 0.7, 0.7, 0.7, -0.5, -0.2, 0.5};
double[] p1lats = {0.0, 0.0, 0.5, 0.5, -0.5, -0.5, -0.05, -0.5, 1.5, -1.5};
boolean[] within = {false, false, false, true, true, true, false, true, true, false};
for (int i = 0; i < p0lons.length; i++) {
assertEquals("(" + p0lons[i] + "," + p0lats[i] + ")-(" + p1lons[i] + "," + p1lats[i] + ")", within[i], polyline.intersects(toOsmLon(p0lons[i], OFFSET_X), toOsmLat(p0lats[i], OFFSET_Y), toOsmLon(p1lons[i], OFFSET_X), toOsmLat(p1lats[i], OFFSET_Y)));
}
}
@Test
public void testBelongsToLine() {
assertTrue(OsmNogoPolygon.isOnLine(10, 10, 10, 10, 10, 20));
assertTrue(OsmNogoPolygon.isOnLine(10, 10, 10, 10, 20, 10));
assertTrue(OsmNogoPolygon.isOnLine(10, 10, 20, 10, 10, 10));
assertTrue(OsmNogoPolygon.isOnLine(10, 10, 10, 20, 10, 10));
assertTrue(OsmNogoPolygon.isOnLine(10, 15, 10, 10, 10, 20));
assertTrue(OsmNogoPolygon.isOnLine(15, 10, 10, 10, 20, 10));
assertTrue(OsmNogoPolygon.isOnLine(10, 10, 10, 10, 20, 30));
assertTrue(OsmNogoPolygon.isOnLine(20, 30, 10, 10, 20, 30));
assertTrue(OsmNogoPolygon.isOnLine(15, 20, 10, 10, 20, 30));
assertFalse(OsmNogoPolygon.isOnLine(11, 11, 10, 10, 10, 20));
assertFalse(OsmNogoPolygon.isOnLine(11, 11, 10, 10, 20, 10));
assertFalse(OsmNogoPolygon.isOnLine(15, 21, 10, 10, 20, 30));
assertFalse(OsmNogoPolygon.isOnLine(15, 19, 10, 10, 20, 30));
assertFalse(OsmNogoPolygon.isOnLine(0, -10, 10, 10, 20, 30));
assertFalse(OsmNogoPolygon.isOnLine(30, 50, 10, 10, 20, 30));
}
@Test
public void testDistanceWithinPolygon() {
// Testing polygon
final double[] lons = {2.333523, 2.333432, 2.333833, 2.333983, 2.334815, 2.334766};
final double[] lats = {48.823778, 48.824091, 48.82389, 48.824165, 48.824232, 48.82384};
OsmNogoPolygon polygon = new OsmNogoPolygon(true);
for (int i = 0; i < lons.length; i++) {
polygon.addVertex(toOsmLon(lons[i], 0), toOsmLat(lats[i], 0));
}
OsmNogoPolygon polyline = new OsmNogoPolygon(false);
for (int i = 0; i < lons.length; i++) {
polyline.addVertex(toOsmLon(lons[i], 0), toOsmLat(lats[i], 0));
}
// Check with a segment with a single intersection with the polygon
int lon1 = toOsmLon(2.33308732509613, 0);
int lat1 = toOsmLat(48.8238790443901, 0);
int lon2 = toOsmLon(2.33378201723099, 0);
int lat2 = toOsmLat(48.8239585098974, 0);
assertEquals(
"Should give the correct length for a segment with a single intersection",
17.5,
polygon.distanceWithinPolygon(lon1, lat1, lon2, lat2),
0.05 * 17.5
);
// Check with a segment crossing multiple times the polygon
lon2 = toOsmLon(2.33488172292709, 0);
lat2 = toOsmLat(48.8240891862353, 0);
assertEquals(
"Should give the correct length for a segment with multiple intersections",
85,
polygon.distanceWithinPolygon(lon1, lat1, lon2, lat2),
0.05 * 85
);
// Check that it works when a point is within the polygon
lon2 = toOsmLon(2.33433187007904, 0);
lat2 = toOsmLat(48.8240238480664, 0);
assertEquals(
"Should give the correct length when last point is within the polygon",
50,
polygon.distanceWithinPolygon(lon1, lat1, lon2, lat2),
0.05 * 50
);
lon1 = toOsmLon(2.33433187007904, 0);
lat1 = toOsmLat(48.8240238480664, 0);
lon2 = toOsmLon(2.33488172292709, 0);
lat2 = toOsmLat(48.8240891862353, 0);
assertEquals(
"Should give the correct length when first point is within the polygon",
35,
polygon.distanceWithinPolygon(lon1, lat1, lon2, lat2),
0.05 * 35
);
lon1 = toOsmLon(2.333523, 0);
lat1 = toOsmLat(48.823778, 0);
lon2 = toOsmLon(2.333432, 0);
lat2 = toOsmLat(48.824091, 0);
assertEquals(
"Should give the correct length if the segment overlaps with an edge of the polygon",
CheapRuler.distance(lon1, lat1, lon2, lat2),
polygon.distanceWithinPolygon(lon1, lat1, lon2, lat2),
0.05 * CheapRuler.distance(lon1, lat1, lon2, lat2)
);
lon1 = toOsmLon(2.333523, 0);
lat1 = toOsmLat(48.823778, 0);
lon2 = toOsmLon(2.3334775, 0);
lat2 = toOsmLat(48.8239345, 0);
assertEquals(
"Should give the correct length if the segment overlaps with a polyline",
CheapRuler.distance(lon1, lat1, lon2, lat2),
polyline.distanceWithinPolygon(lon1, lat1, lon2, lat2),
0.05 * CheapRuler.distance(lon1, lat1, lon2, lat2)
);
}
}

View file

@@ -0,0 +1,68 @@
package btools.router;
import org.junit.Assert;
import org.junit.Test;
import java.io.UnsupportedEncodingException;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
public class RouteParamTest {
@Test(expected = IllegalArgumentException.class)
public void readWptsNull() {
RoutingParamCollector rpc = new RoutingParamCollector();
List<OsmNodeNamed> map = rpc.getWayPointList(null);
Assert.assertEquals("result content null", 0, map.size());
}
@Test
public void readWpts() {
String data = "1.0,1.2;2.0,2.2";
RoutingParamCollector rpc = new RoutingParamCollector();
List<OsmNodeNamed> map = rpc.getWayPointList(data);
Assert.assertEquals("result content 1 ", 2, map.size());
data = "1.0,1.1|2.0,2.2|3.0,3.3";
map = rpc.getWayPointList(data);
Assert.assertEquals("result content 2 ", 3, map.size());
data = "1.0,1.2,Name;2.0,2.2";
map = rpc.getWayPointList(data);
Assert.assertEquals("result content 3 ", "Name", map.get(0).name);
data = "1.0,1.2,d;2.0,2.2";
map = rpc.getWayPointList(data);
Assert.assertTrue("result content 4 ", map.get(0).direct);
}
@Test
public void readUrlParams() throws UnsupportedEncodingException {
String url = "lonlats=1,1;2,2&profile=test&more=1";
RoutingParamCollector rpc = new RoutingParamCollector();
Map<String, String> map = rpc.getUrlParams(url);
Assert.assertEquals("result content ", 3, map.size());
}
@Test
public void readParamsFromList() throws UnsupportedEncodingException {
Map<String, String> params = new HashMap<>();
params.put("timode", "3");
RoutingContext rc = new RoutingContext();
RoutingParamCollector rpc = new RoutingParamCollector();
rpc.setParams(rc, null, params);
Assert.assertEquals("result content timode ", 3, rc.turnInstructionMode);
}
}
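The waypoint strings exercised above use `lon,lat[,name]` tokens separated by `;` or `|`, with the name `d` marking a direct waypoint. A minimal sketch of such parsing (class and field names are hypothetical stand-ins, not BRouter's actual RoutingParamCollector):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for a getWayPointList-style parser:
// tokens are separated by ';' or '|', each token is lon,lat[,name],
// and the name "d" flags a direct (beeline) waypoint as in the test data.
public class WayPointSketch {
  static class WayPoint {
    double lon, lat;
    String name;
    boolean direct;
  }

  static List<WayPoint> parse(String data) {
    if (data == null) throw new IllegalArgumentException("no waypoints");
    List<WayPoint> list = new ArrayList<>();
    for (String token : data.split("[;|]")) { // both separators accepted
      String[] parts = token.split(",");
      WayPoint wp = new WayPoint();
      wp.lon = Double.parseDouble(parts[0]);
      wp.lat = Double.parseDouble(parts[1]);
      if (parts.length > 2) {
        if ("d".equals(parts[2])) wp.direct = true; // direct-routing marker
        else wp.name = parts[2];
      }
      list.add(wp);
    }
    return list;
  }

  public static void main(String[] args) {
    System.out.println(parse("1.0,1.2;2.0,2.2").size());          // prints 2
    System.out.println(parse("1.0,1.2,Name;2.0,2.2").get(0).name); // prints Name
  }
}
```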

View file

@@ -0,0 +1,86 @@
package btools.router;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import java.io.File;
import java.net.URL;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
public class RoutingEngineTest {
private File workingDir;
@Before
public void before() {
URL resulturl = this.getClass().getResource("/testtrack0.gpx");
Assert.assertNotNull("reference result not found: ", resulturl);
File resultfile = new File(resulturl.getFile());
workingDir = resultfile.getParentFile();
}
@Test
public void routeCrossingSegmentBorder() {
String msg = calcRoute(8.720897, 50.002515, 8.723658, 49.997510, "testtrack", new RoutingContext());
// error message from router?
Assert.assertNull("routing failed: " + msg, msg);
// if the track didn't change, we expect the first alternative also
File a1 = new File(workingDir, "testtrack1.gpx");
a1.deleteOnExit();
Assert.assertTrue("result content mismatch", a1.exists());
}
@Test
public void routeDestinationPointFarOff() {
String msg = calcRoute(8.720897, 50.002515, 16.723658, 49.997510, "notrack", new RoutingContext());
Assert.assertTrue(msg, msg != null && msg.contains("not found"));
}
@Test
public void overrideParam() {
RoutingContext rctx = new RoutingContext();
rctx.keyValues = new HashMap<>();
rctx.keyValues.put("avoid_unsafe", "1.0");
String msg = calcRoute(8.723037, 50.000491, 8.712737, 50.002899, "paramTrack", rctx);
Assert.assertNull("routing failed: " + msg, msg);
File trackFile = new File(workingDir, "paramTrack1.gpx");
trackFile.deleteOnExit();
Assert.assertTrue("result content mismatch", trackFile.exists());
}
private String calcRoute(double flon, double flat, double tlon, double tlat, String trackname, RoutingContext rctx) {
String wd = workingDir.getAbsolutePath();
List<OsmNodeNamed> wplist = new ArrayList<>();
OsmNodeNamed n;
n = new OsmNodeNamed();
n.name = "from";
n.ilon = 180000000 + (int) (flon * 1000000 + 0.5);
n.ilat = 90000000 + (int) (flat * 1000000 + 0.5);
wplist.add(n);
n = new OsmNodeNamed();
n.name = "to";
n.ilon = 180000000 + (int) (tlon * 1000000 + 0.5);
n.ilat = 90000000 + (int) (tlat * 1000000 + 0.5);
wplist.add(n);
rctx.localFunction = wd + "/../../../../misc/profiles2/trekking.brf";
RoutingEngine re = new RoutingEngine(
wd + "/" + trackname,
wd + "/" + trackname,
new File(wd, "/../../../../brouter-map-creator/build/resources/test/tmp/segments"),
wplist,
rctx);
re.doRun(0);
return re.getErrorMessage();
}
}
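The calcRoute helper above shifts longitude/latitude into non-negative integer micro-degrees (`ilon`/`ilat`) before handing them to the engine. A minimal sketch of that conversion, mirroring the arithmetic in the test (class name hypothetical):

```java
public class CoordSketch {
  // Positions are stored as micro-degrees shifted to be non-negative:
  // ilon = (lon + 180) * 1e6, ilat = (lat + 90) * 1e6, rounded to nearest int,
  // matching "180000000 + (int) (flon * 1000000 + 0.5)" in the test above.
  static int toIlon(double lon) {
    return 180000000 + (int) (lon * 1000000 + 0.5);
  }

  static int toIlat(double lat) {
    return 90000000 + (int) (lat * 1000000 + 0.5);
  }

  public static void main(String[] args) {
    System.out.println(toIlon(8.720897));  // prints 188720897
    System.out.println(toIlat(50.002515)); // prints 140002515
  }
}
```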

View file

@@ -0,0 +1,67 @@
<?xml version="1.0" encoding="UTF-8"?>
<!-- track-length = 1570 filtered ascend = 4 plain-ascend = -15 cost=2840 energy=.0kwh time=3m 45s -->
<gpx
xmlns="http://www.topografix.com/GPX/1/1"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.topografix.com/GPX/1/1 http://www.topografix.com/GPX/1/1/gpx.xsd"
creator="BRouter-1.7.0" version="1.1">
<trk>
<name>brouter_trekking_0</name>
<trkseg>
<trkpt lon="8.723027" lat="50.000499"><ele>175.25</ele></trkpt>
<trkpt lon="8.723285" lat="50.000610"><ele>176.75</ele></trkpt>
<trkpt lon="8.724003" lat="50.000939"><ele>179.25</ele></trkpt>
<trkpt lon="8.723553" lat="50.001028"><ele>177.5</ele></trkpt>
<trkpt lon="8.723041" lat="50.001194"><ele>174.5</ele></trkpt>
<trkpt lon="8.722781" lat="50.001312"><ele>173.25</ele></trkpt>
<trkpt lon="8.722027" lat="50.001834"><ele>169.5</ele></trkpt>
<trkpt lon="8.721982" lat="50.001865"><ele>169.5</ele></trkpt>
<trkpt lon="8.722320" lat="50.002050"><ele>171.0</ele></trkpt>
<trkpt lon="8.722449" lat="50.002197"><ele>171.25</ele></trkpt>
<trkpt lon="8.722497" lat="50.002337"><ele>171.25</ele></trkpt>
<trkpt lon="8.722506" lat="50.002463"><ele>171.0</ele></trkpt>
<trkpt lon="8.722486" lat="50.002697"><ele>170.25</ele></trkpt>
<trkpt lon="8.721892" lat="50.002631"><ele>167.75</ele></trkpt>
<trkpt lon="8.721836" lat="50.002624"><ele>167.5</ele></trkpt>
<trkpt lon="8.721209" lat="50.002553"><ele>165.25</ele></trkpt>
<trkpt lon="8.721118" lat="50.002538"><ele>164.75</ele></trkpt>
<trkpt lon="8.721021" lat="50.002493"><ele>164.5</ele></trkpt>
<trkpt lon="8.720994" lat="50.002509"><ele>164.25</ele></trkpt>
<trkpt lon="8.720960" lat="50.002518"><ele>164.25</ele></trkpt>
<trkpt lon="8.720888" lat="50.002517"><ele>163.75</ele></trkpt>
<trkpt lon="8.720853" lat="50.002586"><ele>163.75</ele></trkpt>
<trkpt lon="8.720782" lat="50.002704"><ele>163.25</ele></trkpt>
<trkpt lon="8.720554" lat="50.002937"><ele>162.25</ele></trkpt>
<trkpt lon="8.720469" lat="50.003004"><ele>162.0</ele></trkpt>
<trkpt lon="8.718899" lat="50.003724"><ele>160.0</ele></trkpt>
<trkpt lon="8.718254" lat="50.004051"><ele>159.25</ele></trkpt>
<trkpt lon="8.718123" lat="50.004087"><ele>159.0</ele></trkpt>
<trkpt lon="8.717543" lat="50.004244"><ele>159.0</ele></trkpt>
<trkpt lon="8.717181" lat="50.004357"><ele>159.0</ele></trkpt>
<trkpt lon="8.716729" lat="50.004515"><ele>158.0</ele></trkpt>
<trkpt lon="8.716463" lat="50.004600"><ele>157.5</ele></trkpt>
<trkpt lon="8.715713" lat="50.004799"><ele>156.25</ele></trkpt>
<trkpt lon="8.715490" lat="50.004843"><ele>156.0</ele></trkpt>
<trkpt lon="8.714977" lat="50.004918"><ele>155.25</ele></trkpt>
<trkpt lon="8.714539" lat="50.005012"><ele>154.25</ele></trkpt>
<trkpt lon="8.713784" lat="50.005136"><ele>152.5</ele></trkpt>
<trkpt lon="8.713582" lat="50.005177"><ele>152.5</ele></trkpt>
<trkpt lon="8.713316" lat="50.005086"><ele>153.0</ele></trkpt>
<trkpt lon="8.713067" lat="50.005001"><ele>153.25</ele></trkpt>
<trkpt lon="8.712848" lat="50.004896"><ele>153.75</ele></trkpt>
<trkpt lon="8.712781" lat="50.004859"><ele>154.0</ele></trkpt>
<trkpt lon="8.712667" lat="50.004765"><ele>154.25</ele></trkpt>
<trkpt lon="8.712563" lat="50.004683"><ele>154.5</ele></trkpt>
<trkpt lon="8.712154" lat="50.004321"><ele>156.25</ele></trkpt>
<trkpt lon="8.712066" lat="50.004245"><ele>156.5</ele></trkpt>
<trkpt lon="8.713422" lat="50.003599"><ele>158.5</ele></trkpt>
<trkpt lon="8.713452" lat="50.003572"><ele>158.5</ele></trkpt>
<trkpt lon="8.713592" lat="50.003347"><ele>158.5</ele></trkpt>
<trkpt lon="8.713620" lat="50.003326"><ele>158.5</ele></trkpt>
<trkpt lon="8.713956" lat="50.003142"><ele>158.5</ele></trkpt>
<trkpt lon="8.713468" lat="50.002781"><ele>159.5</ele></trkpt>
<trkpt lon="8.713293" lat="50.002684"><ele>159.75</ele></trkpt>
<trkpt lon="8.712770" lat="50.002929"><ele>159.5</ele></trkpt>
</trkseg>
</trk>
</gpx>

View file

@@ -0,0 +1,64 @@
<?xml version="1.0" encoding="UTF-8"?>
<!-- track-length = 736 filtered ascend = 0 plain-ascend = 0 cost=736 energy=.0kwh time=1m 53s -->
<gpx
xmlns="http://www.topografix.com/GPX/1/1"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.topografix.com/GPX/1/1 http://www.topografix.com/GPX/1/1/gpx.xsd"
creator="BRouter-1.6.3" version="1.1">
<trk>
<name>brouter_trekking_0</name>
<trkseg>
<trkpt lon="8.720895" lat="50.002517"></trkpt>
<trkpt lon="8.720888" lat="50.002517"></trkpt>
<trkpt lon="8.720832" lat="50.002494"></trkpt>
<trkpt lon="8.720814" lat="50.002476"></trkpt>
<trkpt lon="8.720802" lat="50.002433"></trkpt>
<trkpt lon="8.720810" lat="50.002412"></trkpt>
<trkpt lon="8.720682" lat="50.002377"></trkpt>
<trkpt lon="8.720553" lat="50.002342"></trkpt>
<trkpt lon="8.720339" lat="50.002251"></trkpt>
<trkpt lon="8.720068" lat="50.002110"></trkpt>
<trkpt lon="8.719973" lat="50.002051"></trkpt>
<trkpt lon="8.719838" lat="50.001948"></trkpt>
<trkpt lon="8.719759" lat="50.001864"></trkpt>
<trkpt lon="8.719712" lat="50.001780"></trkpt>
<trkpt lon="8.719678" lat="50.001789"></trkpt>
<trkpt lon="8.719641" lat="50.001790"></trkpt>
<trkpt lon="8.719600" lat="50.001783"></trkpt>
<trkpt lon="8.719564" lat="50.001768"></trkpt>
<trkpt lon="8.719539" lat="50.001745"></trkpt>
<trkpt lon="8.719527" lat="50.001719"></trkpt>
<trkpt lon="8.719530" lat="50.001692"></trkpt>
<trkpt lon="8.719546" lat="50.001667"></trkpt>
<trkpt lon="8.719574" lat="50.001647"></trkpt>
<trkpt lon="8.719610" lat="50.001634"></trkpt>
<trkpt lon="8.719652" lat="50.001630"></trkpt>
<trkpt lon="8.719693" lat="50.001634"></trkpt>
<trkpt lon="8.719730" lat="50.001647"></trkpt>
<trkpt lon="8.719788" lat="50.001576"></trkpt>
<trkpt lon="8.719887" lat="50.001483"></trkpt>
<trkpt lon="8.719994" lat="50.001425"></trkpt>
<trkpt lon="8.720206" lat="50.001297"></trkpt>
<trkpt lon="8.720324" lat="50.001211"></trkpt>
<trkpt lon="8.720403" lat="50.001137"></trkpt>
<trkpt lon="8.720482" lat="50.001041"></trkpt>
<trkpt lon="8.720539" lat="50.000948"></trkpt>
<trkpt lon="8.720600" lat="50.000799"></trkpt>
<trkpt lon="8.720672" lat="50.000551"></trkpt>
<trkpt lon="8.720760" lat="50.000387"></trkpt>
<trkpt lon="8.720921" lat="50.000228"></trkpt>
<trkpt lon="8.721129" lat="50.000074"></trkpt>
<trkpt lon="8.721391" lat="49.999871"></trkpt>
<trkpt lon="8.721602" lat="49.999714"></trkpt>
<trkpt lon="8.722176" lat="49.999309"></trkpt>
<trkpt lon="8.722416" lat="49.999100"></trkpt>
<trkpt lon="8.722474" lat="49.999038"></trkpt>
<trkpt lon="8.722547" lat="49.998975"></trkpt>
<trkpt lon="8.722669" lat="49.998853"></trkpt>
<trkpt lon="8.723033" lat="49.998411"></trkpt>
<trkpt lon="8.723136" lat="49.998266"></trkpt>
<trkpt lon="8.723659" lat="49.997526"></trkpt>
<trkpt lon="8.723669" lat="49.997514"></trkpt>
</trkseg>
</trk>
</gpx>

View file

@@ -1 +0,0 @@
/build/

View file

@@ -1,9 +1,8 @@
plugins {
id 'java-library'
id 'brouter.library-conventions'
}
dependencies {
implementation project(':brouter-util')
implementation project(':brouter-codec')
testImplementation 'junit:junit:4.13.1'
}

View file

@@ -1,3 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<manifest package="btools.expressions" />

View file

@@ -1,300 +1,406 @@
package btools.expressions;
import java.util.StringTokenizer;
final class BExpression
{
private static final int OR_EXP = 10;
private static final int AND_EXP = 11;
private static final int NOT_EXP = 12;
private static final int ADD_EXP = 20;
private static final int MULTIPLY_EXP = 21;
private static final int MAX_EXP = 22;
private static final int EQUAL_EXP = 23;
private static final int GREATER_EXP = 24;
private static final int MIN_EXP = 25;
private static final int SUB_EXP = 26;
private static final int LESSER_EXP = 27;
private static final int XOR_EXP = 28;
private static final int SWITCH_EXP = 30;
private static final int ASSIGN_EXP = 31;
private static final int LOOKUP_EXP = 32;
private static final int NUMBER_EXP = 33;
private static final int VARIABLE_EXP = 34;
private static final int FOREIGN_VARIABLE_EXP = 35;
private static final int VARIABLE_GET_EXP = 36;
private int typ;
private BExpression op1;
private BExpression op2;
private BExpression op3;
private float numberValue;
private int variableIdx;
private int lookupNameIdx;
private int[] lookupValueIdxArray;
// Parse the expression and all subexpression
public static BExpression parse( BExpressionContext ctx, int level ) throws Exception
{
return parse( ctx, level, null );
}
private static BExpression parse( BExpressionContext ctx, int level, String optionalToken ) throws Exception
{
boolean brackets = false;
String operator = ctx.parseToken();
if ( optionalToken != null && optionalToken.equals( operator ) )
{
operator = ctx.parseToken();
}
if ( "(".equals( operator ) )
{
brackets = true;
operator = ctx.parseToken();
}
if ( operator == null )
{
if ( level == 0 ) return null;
else throw new IllegalArgumentException( "unexpected end of file" );
}
if ( level == 0 )
{
if ( !"assign".equals( operator ) )
{
throw new IllegalArgumentException( "operator " + operator + " is invalid on toplevel (only 'assign' allowed)" );
}
}
BExpression exp = new BExpression();
int nops = 3;
boolean ifThenElse = false;
if ( "switch".equals( operator ) )
{
exp.typ = SWITCH_EXP;
}
else if ( "if".equals( operator ) )
{
exp.typ = SWITCH_EXP;
ifThenElse = true;
}
else
{
nops = 2; // check binary expressions
if ( "or".equals( operator ) )
{
exp.typ = OR_EXP;
}
else if ( "and".equals( operator ) )
{
exp.typ = AND_EXP;
}
else if ( "multiply".equals( operator ) )
{
exp.typ = MULTIPLY_EXP;
}
else if ( "add".equals( operator ) )
{
exp.typ = ADD_EXP;
}
else if ( "max".equals( operator ) )
{
exp.typ = MAX_EXP;
}
else if ( "min".equals( operator ) )
{
exp.typ = MIN_EXP;
}
else if ( "equal".equals( operator ) )
{
exp.typ = EQUAL_EXP;
}
else if ( "greater".equals( operator ) )
{
exp.typ = GREATER_EXP;
}
else if ( "sub".equals( operator ) )
{
exp.typ = SUB_EXP;
}
else if ( "lesser".equals( operator ) )
{
exp.typ = LESSER_EXP;
}
else if ( "xor".equals( operator ) )
{
exp.typ = XOR_EXP;
}
else
{
nops = 1; // check unary expressions
if ( "assign".equals( operator ) )
{
if ( level > 0 ) throw new IllegalArgumentException( "assign operator within expression" );
exp.typ = ASSIGN_EXP;
String variable = ctx.parseToken();
if ( variable == null ) throw new IllegalArgumentException( "unexpected end of file" );
if ( variable.indexOf( '=' ) >= 0 ) throw new IllegalArgumentException( "variable name cannot contain '=': " + variable );
if ( variable.indexOf( ':' ) >= 0 ) throw new IllegalArgumentException( "cannot assign context-prefixed variable: " + variable );
exp.variableIdx = ctx.getVariableIdx( variable, true );
if ( exp.variableIdx < ctx.getMinWriteIdx() ) throw new IllegalArgumentException( "cannot assign to readonly variable " + variable );
}
else if ( "not".equals( operator ) )
{
exp.typ = NOT_EXP;
}
else
{
nops = 0; // check elementary expressions
int idx = operator.indexOf( '=' );
if ( idx >= 0 )
{
exp.typ = LOOKUP_EXP;
String name = operator.substring( 0, idx );
String values = operator.substring( idx+1 );
exp.lookupNameIdx = ctx.getLookupNameIdx( name );
if ( exp.lookupNameIdx < 0 )
{
throw new IllegalArgumentException( "unknown lookup name: " + name );
}
ctx.markLookupIdxUsed( exp.lookupNameIdx );
StringTokenizer tk = new StringTokenizer( values, "|" );
int nt = tk.countTokens();
int nt2 = nt == 0 ? 1 : nt;
exp.lookupValueIdxArray = new int[nt2];
for( int ti=0; ti<nt2; ti++ )
{
String value = ti < nt ? tk.nextToken() : "";
exp.lookupValueIdxArray[ti] = ctx.getLookupValueIdx( exp.lookupNameIdx, value );
if ( exp.lookupValueIdxArray[ti] < 0 )
{
throw new IllegalArgumentException( "unknown lookup value: " + value );
}
}
}
else if ( ( idx = operator.indexOf( ':' ) ) >= 0 )
{
/*
use of variable values
assign no_height
switch and not maxheight=
lesser v:maxheight my_height true
false
*/
if (operator.startsWith("v:")) {
String name = operator.substring(2);
exp.typ = VARIABLE_GET_EXP;
exp.lookupNameIdx = ctx.getLookupNameIdx( name );
} else {
String context = operator.substring( 0, idx );
String varname = operator.substring( idx+1 );
exp.typ = FOREIGN_VARIABLE_EXP;
exp.variableIdx = ctx.getForeignVariableIdx( context, varname );
}
}
else if ( (idx = ctx.getVariableIdx( operator, false )) >= 0 )
{
exp.typ = VARIABLE_EXP;
exp.variableIdx = idx;
}
else if ( "true".equals( operator ) )
{
exp.numberValue = 1.f;
exp.typ = NUMBER_EXP;
}
else if ( "false".equals( operator ) )
{
exp.numberValue = 0.f;
exp.typ = NUMBER_EXP;
}
else
{
try
{
exp.numberValue = Float.parseFloat( operator );
exp.typ = NUMBER_EXP;
}
catch( NumberFormatException nfe )
{
throw new IllegalArgumentException( "unknown expression: " + operator );
}
}
}
}
}
// parse operands
if ( nops > 0 )
{
exp.op1 = BExpression.parse( ctx, level+1, exp.typ == ASSIGN_EXP ? "=" : null );
}
if ( nops > 1 )
{
if ( ifThenElse ) checkExpectedToken( ctx, "then" );
exp.op2 = BExpression.parse( ctx, level+1, null );
}
if ( nops > 2 )
{
if ( ifThenElse ) checkExpectedToken( ctx, "else" );
exp.op3 = BExpression.parse( ctx, level+1, null );
}
if ( brackets )
{
checkExpectedToken( ctx, ")" );
}
return exp;
}
private static void checkExpectedToken( BExpressionContext ctx, String expected ) throws Exception
{
String token = ctx.parseToken();
if ( ! expected.equals( token ) )
{
throw new IllegalArgumentException( "unexpected token: " + token + ", expected: " + expected );
}
}
// Evaluate the expression
public float evaluate( BExpressionContext ctx )
{
switch( typ )
{
case OR_EXP: return op1.evaluate(ctx) != 0.f ? 1.f : ( op2.evaluate(ctx) != 0.f ? 1.f : 0.f );
case XOR_EXP: return ( (op1.evaluate(ctx) != 0.f) ^ ( op2.evaluate(ctx) != 0.f ) ? 1.f : 0.f );
case AND_EXP: return op1.evaluate(ctx) != 0.f ? ( op2.evaluate(ctx) != 0.f ? 1.f : 0.f ) : 0.f;
case ADD_EXP: return op1.evaluate(ctx) + op2.evaluate(ctx);
case SUB_EXP: return op1.evaluate(ctx) - op2.evaluate(ctx);
case MULTIPLY_EXP: return op1.evaluate(ctx) * op2.evaluate(ctx);
case MAX_EXP: return max( op1.evaluate(ctx), op2.evaluate(ctx) );
case MIN_EXP: return min( op1.evaluate(ctx), op2.evaluate(ctx) );
case EQUAL_EXP: return op1.evaluate(ctx) == op2.evaluate(ctx) ? 1.f : 0.f;
case GREATER_EXP: return op1.evaluate(ctx) > op2.evaluate(ctx) ? 1.f : 0.f;
case LESSER_EXP: return op1.evaluate(ctx) < op2.evaluate(ctx) ? 1.f : 0.f;
case SWITCH_EXP: return op1.evaluate(ctx) != 0.f ? op2.evaluate(ctx) : op3.evaluate(ctx);
case ASSIGN_EXP: return ctx.assign( variableIdx, op1.evaluate(ctx) );
case LOOKUP_EXP: return ctx.getLookupMatch( lookupNameIdx, lookupValueIdxArray );
case NUMBER_EXP: return numberValue;
case VARIABLE_EXP: return ctx.getVariableValue( variableIdx );
case FOREIGN_VARIABLE_EXP: return ctx.getForeignVariableValue( variableIdx );
case VARIABLE_GET_EXP: return ctx.getLookupValue(lookupNameIdx);
case NOT_EXP: return op1.evaluate(ctx) == 0.f ? 1.f : 0.f;
default: throw new IllegalArgumentException( "unknown op-code: " + typ );
}
}
private float max( float v1, float v2 )
{
return v1 > v2 ? v1 : v2;
}
private float min( float v1, float v2 )
{
return v1 < v2 ? v1 : v2;
}
}
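The `LOOKUP_EXP` branch above splits a token such as `highway=primary|secondary` into a lookup name and a list of values, where an empty value list (e.g. `maxheight=`) still yields one empty value. A standalone sketch of that tokenization (class and method names here are illustrative, not BRouter API):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.StringTokenizer;

// Sketch of the lookup-token split done in the LOOKUP_EXP branch above.
final class LookupTokenSketch {
  // part before '=' is the lookup name
  static String name(String token) {
    return token.substring(0, token.indexOf('='));
  }

  // part after '=' is a '|'-separated value list;
  // an empty list ("maxheight=") still produces one empty value
  static List<String> values(String token) {
    String values = token.substring(token.indexOf('=') + 1);
    StringTokenizer tk = new StringTokenizer(values, "|");
    int nt = tk.countTokens();
    int nt2 = nt == 0 ? 1 : nt;
    List<String> result = new ArrayList<>();
    for (int ti = 0; ti < nt2; ti++) {
      result.add(ti < nt ? tk.nextToken() : "");
    }
    return result;
  }
}
```

The "empty list becomes one empty value" rule lets a profile test for the mere absence of a tag value with `maxheight=`.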
package btools.expressions;
import java.util.StringTokenizer;
final class BExpression {
private static final int OR_EXP = 10;
private static final int AND_EXP = 11;
private static final int NOT_EXP = 12;
private static final int ADD_EXP = 20;
private static final int MULTIPLY_EXP = 21;
private static final int DIVIDE_EXP = 22;
private static final int MAX_EXP = 23;
private static final int EQUAL_EXP = 24;
private static final int GREATER_EXP = 25;
private static final int MIN_EXP = 26;
private static final int SUB_EXP = 27;
private static final int LESSER_EXP = 28;
private static final int XOR_EXP = 29;
private static final int SWITCH_EXP = 30;
private static final int ASSIGN_EXP = 31;
private static final int LOOKUP_EXP = 32;
private static final int NUMBER_EXP = 33;
private static final int VARIABLE_EXP = 34;
private static final int FOREIGN_VARIABLE_EXP = 35;
private static final int VARIABLE_GET_EXP = 36;
private int typ;
private BExpression op1;
private BExpression op2;
private BExpression op3;
private float numberValue;
private int variableIdx;
private int lookupNameIdx = -1;
private int[] lookupValueIdxArray;
private boolean doNotChange;
// Parse the expression and all subexpressions
public static BExpression parse(BExpressionContext ctx, int level) throws Exception {
return parse(ctx, level, null);
}
private static BExpression parse(BExpressionContext ctx, int level, String optionalToken) throws Exception {
BExpression e = parseRaw(ctx, level, optionalToken);
if (e == null) {
return null;
}
if (ASSIGN_EXP == e.typ) {
// manage assigned and injected values
BExpression assignedBefore = ctx.lastAssignedExpression.get(e.variableIdx);
if (assignedBefore != null && assignedBefore.doNotChange) {
e.op1 = assignedBefore; // was injected as key-value
e.op1.doNotChange = false; // protect just once, can be changed in a second assignment
}
ctx.lastAssignedExpression.set(e.variableIdx, e.op1);
}
else if (!ctx.skipConstantExpressionOptimizations) {
// try to simplify the expression
if (VARIABLE_EXP == e.typ) {
BExpression ae = ctx.lastAssignedExpression.get(e.variableIdx);
if (ae != null && ae.typ == NUMBER_EXP) {
e = ae;
}
} else {
BExpression eCollapsed = e.tryCollapse();
if (e != eCollapsed) {
e = eCollapsed; // allow breakpoint..
}
BExpression eEvaluated = e.tryEvaluateConstant();
if (e != eEvaluated) {
e = eEvaluated; // allow breakpoint..
}
}
}
if (level == 0) {
// mark the used lookups only after the
// expression is collapsed, so that lookups
// appearing in the profile but deactivated
// by constant expressions are not marked as used
int nodeCount = e.markLookupIdxUsed(ctx);
ctx.expressionNodeCount += nodeCount;
}
return e;
}
private int markLookupIdxUsed(BExpressionContext ctx) {
int nodeCount = 1;
if (lookupNameIdx >= 0) {
ctx.markLookupIdxUsed(lookupNameIdx);
}
if (op1 != null) {
nodeCount += op1.markLookupIdxUsed(ctx);
}
if (op2 != null) {
nodeCount += op2.markLookupIdxUsed(ctx);
}
if (op3 != null) {
nodeCount += op3.markLookupIdxUsed(ctx);
}
return nodeCount;
}
private static BExpression parseRaw(BExpressionContext ctx, int level, String optionalToken) throws Exception {
boolean brackets = false;
String operator = ctx.parseToken();
if (optionalToken != null && optionalToken.equals(operator)) {
operator = ctx.parseToken();
}
if ("(".equals(operator)) {
brackets = true;
operator = ctx.parseToken();
}
if (operator == null) {
if (level == 0) return null;
else throw new IllegalArgumentException("unexpected end of file");
}
if (level == 0) {
if (!"assign".equals(operator)) {
throw new IllegalArgumentException("operator " + operator + " is invalid on toplevel (only 'assign' allowed)");
}
}
BExpression exp = new BExpression();
int nops = 3;
boolean ifThenElse = false;
if ("switch".equals(operator)) {
exp.typ = SWITCH_EXP;
} else if ("if".equals(operator)) {
exp.typ = SWITCH_EXP;
ifThenElse = true;
} else {
nops = 2; // check binary expressions
if ("or".equals(operator)) {
exp.typ = OR_EXP;
} else if ("and".equals(operator)) {
exp.typ = AND_EXP;
} else if ("multiply".equals(operator)) {
exp.typ = MULTIPLY_EXP;
} else if ("divide".equals(operator)) {
exp.typ = DIVIDE_EXP;
} else if ("add".equals(operator)) {
exp.typ = ADD_EXP;
} else if ("max".equals(operator)) {
exp.typ = MAX_EXP;
} else if ("min".equals(operator)) {
exp.typ = MIN_EXP;
} else if ("equal".equals(operator)) {
exp.typ = EQUAL_EXP;
} else if ("greater".equals(operator)) {
exp.typ = GREATER_EXP;
} else if ("sub".equals(operator)) {
exp.typ = SUB_EXP;
} else if ("lesser".equals(operator)) {
exp.typ = LESSER_EXP;
} else if ("xor".equals(operator)) {
exp.typ = XOR_EXP;
} else {
nops = 1; // check unary expressions
if ("assign".equals(operator)) {
if (level > 0) throw new IllegalArgumentException("assign operator within expression");
exp.typ = ASSIGN_EXP;
String variable = ctx.parseToken();
if (variable == null) throw new IllegalArgumentException("unexpected end of file");
if (variable.indexOf('=') >= 0)
throw new IllegalArgumentException("variable name cannot contain '=': " + variable);
if (variable.indexOf(':') >= 0)
throw new IllegalArgumentException("cannot assign context-prefixed variable: " + variable);
exp.variableIdx = ctx.getVariableIdx(variable, true);
if (exp.variableIdx < ctx.getMinWriteIdx())
throw new IllegalArgumentException("cannot assign to readonly variable " + variable);
} else if ("not".equals(operator)) {
exp.typ = NOT_EXP;
} else {
nops = 0; // check elementary expressions
int idx = operator.indexOf('=');
if (idx >= 0) {
exp.typ = LOOKUP_EXP;
String name = operator.substring(0, idx);
String values = operator.substring(idx + 1);
exp.lookupNameIdx = ctx.getLookupNameIdx(name);
if (exp.lookupNameIdx < 0) {
throw new IllegalArgumentException("unknown lookup name: " + name);
}
StringTokenizer tk = new StringTokenizer(values, "|");
int nt = tk.countTokens();
int nt2 = nt == 0 ? 1 : nt;
exp.lookupValueIdxArray = new int[nt2];
for (int ti = 0; ti < nt2; ti++) {
String value = ti < nt ? tk.nextToken() : "";
exp.lookupValueIdxArray[ti] = ctx.getLookupValueIdx(exp.lookupNameIdx, value);
if (exp.lookupValueIdxArray[ti] < 0) {
throw new IllegalArgumentException("unknown lookup value: " + value);
}
}
} else if ((idx = operator.indexOf(':')) >= 0) {
/*
use of variable values
assign no_height
switch and not maxheight=
lesser v:maxheight my_height true
false
*/
if (operator.startsWith("v:")) {
String name = operator.substring(2);
exp.typ = VARIABLE_GET_EXP;
exp.lookupNameIdx = ctx.getLookupNameIdx(name);
} else {
String context = operator.substring(0, idx);
String varname = operator.substring(idx + 1);
exp.typ = FOREIGN_VARIABLE_EXP;
exp.variableIdx = ctx.getForeignVariableIdx(context, varname);
}
} else if ((idx = ctx.getVariableIdx(operator, false)) >= 0) {
exp.typ = VARIABLE_EXP;
exp.variableIdx = idx;
} else if ("true".equals(operator)) {
exp.numberValue = 1.f;
exp.typ = NUMBER_EXP;
} else if ("false".equals(operator)) {
exp.numberValue = 0.f;
exp.typ = NUMBER_EXP;
} else {
try {
exp.numberValue = Float.parseFloat(operator);
exp.typ = NUMBER_EXP;
} catch (NumberFormatException nfe) {
throw new IllegalArgumentException("unknown expression: " + operator);
}
}
}
}
}
// parse operands
if (nops > 0) {
exp.op1 = parse(ctx, level + 1, exp.typ == ASSIGN_EXP ? "=" : null);
}
if (nops > 1) {
if (ifThenElse) checkExpectedToken(ctx, "then");
exp.op2 = parse(ctx, level + 1, null);
}
if (nops > 2) {
if (ifThenElse) checkExpectedToken(ctx, "else");
exp.op3 = parse(ctx, level + 1, null);
}
if (brackets) {
checkExpectedToken(ctx, ")");
}
return exp;
}
private static void checkExpectedToken(BExpressionContext ctx, String expected) throws Exception {
String token = ctx.parseToken();
if (!expected.equals(token)) {
throw new IllegalArgumentException("unexpected token: " + token + ", expected: " + expected);
}
}
// Evaluate the expression
public float evaluate(BExpressionContext ctx) {
switch (typ) {
case OR_EXP:
return op1.evaluate(ctx) != 0.f ? 1.f : (op2.evaluate(ctx) != 0.f ? 1.f : 0.f);
case XOR_EXP:
return ((op1.evaluate(ctx) != 0.f) ^ (op2.evaluate(ctx) != 0.f) ? 1.f : 0.f);
case AND_EXP:
return op1.evaluate(ctx) != 0.f ? (op2.evaluate(ctx) != 0.f ? 1.f : 0.f) : 0.f;
case ADD_EXP:
return op1.evaluate(ctx) + op2.evaluate(ctx);
case SUB_EXP:
return op1.evaluate(ctx) - op2.evaluate(ctx);
case MULTIPLY_EXP:
return op1.evaluate(ctx) * op2.evaluate(ctx);
case DIVIDE_EXP:
return divide(op1.evaluate(ctx), op2.evaluate(ctx));
case MAX_EXP:
return max(op1.evaluate(ctx), op2.evaluate(ctx));
case MIN_EXP:
return min(op1.evaluate(ctx), op2.evaluate(ctx));
case EQUAL_EXP:
return op1.evaluate(ctx) == op2.evaluate(ctx) ? 1.f : 0.f;
case GREATER_EXP:
return op1.evaluate(ctx) > op2.evaluate(ctx) ? 1.f : 0.f;
case LESSER_EXP:
return op1.evaluate(ctx) < op2.evaluate(ctx) ? 1.f : 0.f;
case SWITCH_EXP:
return op1.evaluate(ctx) != 0.f ? op2.evaluate(ctx) : op3.evaluate(ctx);
case ASSIGN_EXP:
return ctx.assign(variableIdx, op1.evaluate(ctx));
case LOOKUP_EXP:
return ctx.getLookupMatch(lookupNameIdx, lookupValueIdxArray);
case NUMBER_EXP:
return numberValue;
case VARIABLE_EXP:
return ctx.getVariableValue(variableIdx);
case FOREIGN_VARIABLE_EXP:
return ctx.getForeignVariableValue(variableIdx);
case VARIABLE_GET_EXP:
return ctx.getLookupValue(lookupNameIdx);
case NOT_EXP:
return op1.evaluate(ctx) == 0.f ? 1.f : 0.f;
default:
throw new IllegalArgumentException("unknown op-code: " + typ);
}
}
// Try to collapse the expression
// if logically possible
private BExpression tryCollapse() {
switch (typ) {
case OR_EXP:
return NUMBER_EXP == op1.typ ?
(op1.numberValue != 0.f ? op1 : op2)
: (NUMBER_EXP == op2.typ ?
(op2.numberValue != 0.f ? op2 : op1)
: this);
case AND_EXP:
return NUMBER_EXP == op1.typ ?
(op1.numberValue == 0.f ? op1 : op2)
: (NUMBER_EXP == op2.typ ?
(op2.numberValue == 0.f ? op2 : op1)
: this);
case ADD_EXP:
return NUMBER_EXP == op1.typ ?
(op1.numberValue == 0.f ? op2 : this)
: (NUMBER_EXP == op2.typ ?
(op2.numberValue == 0.f ? op1 : this)
: this);
case SWITCH_EXP:
return NUMBER_EXP == op1.typ ?
(op1.numberValue == 0.f ? op3 : op2) : this;
default:
return this;
}
}
// Try to evaluate the expression
// if all operands are constant
private BExpression tryEvaluateConstant() {
if (op1 != null && NUMBER_EXP == op1.typ
&& (op2 == null || NUMBER_EXP == op2.typ)
&& (op3 == null || NUMBER_EXP == op3.typ)) {
BExpression exp = new BExpression();
exp.typ = NUMBER_EXP;
exp.numberValue = evaluate(null);
return exp;
}
return this;
}
private float max(float v1, float v2) {
return v1 > v2 ? v1 : v2;
}
private float min(float v1, float v2) {
return v1 < v2 ? v1 : v2;
}
private float divide(float v1, float v2) {
if (v2 == 0f) throw new IllegalArgumentException("div by zero");
return v1 / v2;
}
@Override
public String toString() {
if (typ == NUMBER_EXP) {
return "" + numberValue;
}
if (typ == VARIABLE_EXP) {
return "vidx=" + variableIdx;
}
StringBuilder sb = new StringBuilder("typ=" + typ + " ops=(");
addOp(sb, op1);
addOp(sb, op2);
addOp(sb, op3);
sb.append(')');
return sb.toString();
}
private void addOp(StringBuilder sb, BExpression e) {
if (e != null) {
sb.append('[').append(e.toString()).append(']');
}
}
static BExpression createAssignExpressionFromKeyValue(BExpressionContext ctx, String key, String value) {
BExpression e = new BExpression();
e.typ = ASSIGN_EXP;
e.variableIdx = ctx.getVariableIdx(key, true);
e.op1 = new BExpression();
e.op1.typ = NUMBER_EXP;
e.op1.numberValue = Float.parseFloat(value);
e.op1.doNotChange = true;
ctx.lastAssignedExpression.set(e.variableIdx, e.op1);
return e;
}
}
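The new `tryCollapse()`/`tryEvaluateConstant()` pair implements a simple two-step constant folding: first drop an operand when the other one already decides the result, then evaluate eagerly when all remaining operands are constants. A minimal standalone sketch of the same idea, reduced to `add` and `and` (class and constant names are illustrative only, not BRouter API):

```java
// Minimal constant-folding sketch mirroring tryCollapse()/tryEvaluateConstant().
final class FoldSketch {
  static final int NUM = 0, ADD = 1, AND = 2;
  int typ;
  float value;
  FoldSketch op1, op2;

  static FoldSketch num(float v) {
    FoldSketch e = new FoldSketch();
    e.typ = NUM;
    e.value = v;
    return e;
  }

  static FoldSketch bin(int t, FoldSketch a, FoldSketch b) {
    FoldSketch e = new FoldSketch();
    e.typ = t;
    e.op1 = a;
    e.op2 = b;
    return e;
  }

  // Step 1: drop an operand when the other one decides the result.
  FoldSketch tryCollapse() {
    if (typ == AND) {
      if (op1.typ == NUM) return op1.value == 0.f ? op1 : op2;
      if (op2.typ == NUM) return op2.value == 0.f ? op2 : op1;
    } else if (typ == ADD) {
      if (op1.typ == NUM && op1.value == 0.f) return op2;
      if (op2.typ == NUM && op2.value == 0.f) return op1;
    }
    return this;
  }

  // Step 2: evaluate eagerly when both operands are constants.
  FoldSketch tryEvaluateConstant() {
    if (op1 != null && op1.typ == NUM && op2 != null && op2.typ == NUM) {
      switch (typ) {
        case ADD: return num(op1.value + op2.value);
        case AND: return num(op1.value != 0.f && op2.value != 0.f ? 1.f : 0.f);
      }
    }
    return this;
  }
}
```

Applied bottom-up during parsing (as in `parse()` above), this shrinks the expression tree before any evaluation happens, which is why the node count is tracked only after collapsing.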


@@ -7,32 +7,29 @@
package btools.expressions;
public final class BExpressionContextNode extends BExpressionContext
{
public final class BExpressionContextNode extends BExpressionContext {
private static String[] buildInVariables =
{ "initialcost" };
protected String[] getBuildInVariableNames()
{
{"initialcost"};
protected String[] getBuildInVariableNames() {
return buildInVariables;
}
public float getInitialcost() { return getBuildInVariable(0); }
public float getInitialcost() {
return getBuildInVariable(0);
}
public BExpressionContextNode( BExpressionMetaData meta )
{
super( "node", meta );
public BExpressionContextNode(BExpressionMetaData meta) {
super("node", meta);
}
/**
* Create an Expression-Context for way context
*
* @param hashSize size of hashmap for result caching
* @param hashSize size of hashmap for result caching
*/
public BExpressionContextNode( int hashSize, BExpressionMetaData meta )
{
super( "node", hashSize, meta );
public BExpressionContextNode(int hashSize, BExpressionMetaData meta) {
super("node", hashSize, meta);
}
}


@@ -8,66 +8,125 @@ package btools.expressions;
import btools.codec.TagValueValidator;
public final class BExpressionContextWay extends BExpressionContext implements TagValueValidator
{
public final class BExpressionContextWay extends BExpressionContext implements TagValueValidator {
private boolean decodeForbidden = true;
private static String[] buildInVariables =
{ "costfactor", "turncost", "uphillcostfactor", "downhillcostfactor", "initialcost", "nodeaccessgranted", "initialclassifier", "trafficsourcedensity", "istrafficbackbone", "priorityclassifier", "classifiermask", "maxspeed" };
protected String[] getBuildInVariableNames()
{
{"costfactor", "turncost", "uphillcostfactor", "downhillcostfactor", "initialcost", "nodeaccessgranted", "initialclassifier", "trafficsourcedensity", "istrafficbackbone", "priorityclassifier", "classifiermask", "maxspeed", "uphillcost", "downhillcost", "uphillcutoff", "downhillcutoff", "uphillmaxslope", "downhillmaxslope", "uphillmaxslopecost", "downhillmaxslopecost"};
protected String[] getBuildInVariableNames() {
return buildInVariables;
}
public float getCostfactor() { return getBuildInVariable(0); }
public float getTurncost() { return getBuildInVariable(1); }
public float getUphillCostfactor() { return getBuildInVariable(2); }
public float getDownhillCostfactor() { return getBuildInVariable(3); }
public float getInitialcost() { return getBuildInVariable(4); }
public float getNodeAccessGranted() { return getBuildInVariable(5); }
public float getInitialClassifier() { return getBuildInVariable(6); }
public float getTrafficSourceDensity() { return getBuildInVariable(7); }
public float getIsTrafficBackbone() { return getBuildInVariable(8); }
public float getPriorityClassifier() { return getBuildInVariable(9); }
public float getClassifierMask() { return getBuildInVariable(10); }
public float getMaxspeed() { return getBuildInVariable(11); }
public float getCostfactor() {
return getBuildInVariable(0);
}
public BExpressionContextWay( BExpressionMetaData meta )
{
super( "way", meta );
public float getTurncost() {
return getBuildInVariable(1);
}
public float getUphillCostfactor() {
return getBuildInVariable(2);
}
public float getDownhillCostfactor() {
return getBuildInVariable(3);
}
public float getInitialcost() {
return getBuildInVariable(4);
}
public float getNodeAccessGranted() {
return getBuildInVariable(5);
}
public float getInitialClassifier() {
return getBuildInVariable(6);
}
public float getTrafficSourceDensity() {
return getBuildInVariable(7);
}
public float getIsTrafficBackbone() {
return getBuildInVariable(8);
}
public float getPriorityClassifier() {
return getBuildInVariable(9);
}
public float getClassifierMask() {
return getBuildInVariable(10);
}
public float getMaxspeed() {
return getBuildInVariable(11);
}
public float getUphillcost() {
return getBuildInVariable(12);
}
public float getDownhillcost() {
return getBuildInVariable(13);
}
public float getUphillcutoff() {
return getBuildInVariable(14);
}
public float getDownhillcutoff() {
return getBuildInVariable(15);
}
public float getUphillmaxslope() {
return getBuildInVariable(16);
}
public float getDownhillmaxslope() {
return getBuildInVariable(17);
}
public float getUphillmaxslopecost() {
return getBuildInVariable(18);
}
public float getDownhillmaxslopecost() {
return getBuildInVariable(19);
}
public BExpressionContextWay(BExpressionMetaData meta) {
super("way", meta);
}
/**
* Create an Expression-Context for way context
*
* @param hashSize size of hashmap for result caching
* @param hashSize size of hashmap for result caching
*/
public BExpressionContextWay( int hashSize, BExpressionMetaData meta )
{
super( "way", hashSize, meta );
public BExpressionContextWay(int hashSize, BExpressionMetaData meta) {
super("way", hashSize, meta);
}
@Override
public int accessType( byte[] description )
{
evaluate( false, description );
public int accessType(byte[] description) {
evaluate(false, description);
float minCostFactor = getCostfactor();
if ( minCostFactor >= 9999.f )
{
if (minCostFactor >= 9999.f) {
setInverseVars();
float reverseCostFactor = getCostfactor();
if ( reverseCostFactor < minCostFactor )
{
if (reverseCostFactor < minCostFactor) {
minCostFactor = reverseCostFactor;
}
}
return minCostFactor < 9999.f ? 2 : decodeForbidden ? (minCostFactor < 10000.f ? 1 : 0) : 0;
}
@Override
public void setDecodeForbidden( boolean decodeForbidden )
{
this.decodeForbidden= decodeForbidden;
public void setDecodeForbidden(boolean decodeForbidden) {
this.decodeForbidden = decodeForbidden;
}
}
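The nested ternary at the end of `accessType()` is hard to read in one line. A behavior-identical restructuring of just that return expression, for clarity (the exact meaning of the 0/1/2 codes is not documented in this diff, so the comments below are an inference from the surrounding code, not authoritative):

```java
// Restructured sketch of the return expression in accessType() above.
// Behavior-identical to:
//   minCostFactor < 9999.f ? 2 : decodeForbidden ? (minCostFactor < 10000.f ? 1 : 0) : 0
final class AccessTypeSketch {
  static int accessType(float minCostFactor, boolean decodeForbidden) {
    if (minCostFactor < 9999.f) {
      return 2; // passable in at least one direction
    }
    if (!decodeForbidden) {
      return 0; // forbidden ways are not decoded at all
    }
    // forbidden, but below the hard cutoff: keep it decodable
    return minCostFactor < 10000.f ? 1 : 0;
  }
}
```

Spelling the branches out makes the 9999/10000 cutoffs visible as two distinct thresholds rather than one opaque expression.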


@@ -1,66 +1,57 @@
/**
* A lookup value with optional aliases
*
* toString just gives the primary value,
* equals just compares against primary value
* matches() also compares aliases
*
* @author ab
*/
package btools.expressions;
import java.util.ArrayList;
final class BExpressionLookupValue
{
String value;
ArrayList<String> aliases;
@Override
public String toString()
{
return value;
}
public BExpressionLookupValue( String value )
{
this.value = value;
}
public void addAlias( String alias )
{
if ( aliases == null ) aliases = new ArrayList<String>();
aliases.add( alias );
}
@Override
public boolean equals( Object o )
{
if ( o instanceof String )
{
String v = (String)o;
return value.equals( v );
}
if ( o instanceof BExpressionLookupValue )
{
BExpressionLookupValue v = (BExpressionLookupValue)o;
return value.equals( v.value );
}
return false;
}
public boolean matches( String s )
{
if ( value.equals( s ) ) return true;
if ( aliases != null )
{
for( String alias : aliases )
{
if ( alias.equals( s ) ) return true;
}
}
return false;
}
}
/**
* A lookup value with optional aliases
* <p>
* toString just gives the primary value,
* equals just compares against primary value
* matches() also compares aliases
*
* @author ab
*/
package btools.expressions;
import java.util.ArrayList;
import java.util.List;
final class BExpressionLookupValue {
String value;
List<String> aliases;
@Override
public String toString() {
return value;
}
public BExpressionLookupValue(String value) {
this.value = value;
}
public void addAlias(String alias) {
if (aliases == null) aliases = new ArrayList<>();
aliases.add(alias);
}
@Override
public boolean equals(Object o) {
if (o instanceof String) {
String v = (String) o;
return value.equals(v);
}
if (o instanceof BExpressionLookupValue) {
BExpressionLookupValue v = (BExpressionLookupValue) o;
return value.equals(v.value);
}
return false;
}
public boolean matches(String s) {
if (value.equals(s)) return true;
if (aliases != null) {
for (String alias : aliases) {
if (alias.equals(s)) return true;
}
}
return false;
}
}


@@ -9,81 +9,67 @@ package btools.expressions;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.StringTokenizer;
import java.util.TreeMap;
import btools.util.BitCoderContext;
import btools.util.Crc32;
public final class BExpressionMetaData
{
private static final String CONTEXT_TAG = "---context:";
public final class BExpressionMetaData {
private static final String CONTEXT_TAG = "---context:";
private static final String VERSION_TAG = "---lookupversion:";
private static final String MINOR_VERSION_TAG = "---minorversion:";
private static final String VARLENGTH_TAG = "---readvarlength";
private static final String MIN_APP_VERSION_TAG = "---minappversion:";
public short lookupVersion = -1;
public short lookupMinorVersion = -1;
public short minAppVersion = -1;
private HashMap<String,BExpressionContext> listeners = new HashMap<String,BExpressionContext>();
public void registerListener( String context, BExpressionContext ctx )
{
listeners.put( context, ctx );
private Map<String, BExpressionContext> listeners = new HashMap<>();
public void registerListener(String context, BExpressionContext ctx) {
listeners.put(context, ctx);
}
public void readMetaData( File lookupsFile )
{
try
{
BufferedReader br = new BufferedReader( new FileReader( lookupsFile ) );
BExpressionContext ctx = null;
for(;;)
{
String line = br.readLine();
if ( line == null ) break;
line = line.trim();
if ( line.length() == 0 || line.startsWith( "#" ) ) continue;
if ( line.startsWith( CONTEXT_TAG ) )
{
ctx = listeners.get( line.substring( CONTEXT_TAG.length() ) );
continue;
public void readMetaData(File lookupsFile) {
try {
BufferedReader br = new BufferedReader(new FileReader(lookupsFile));
BExpressionContext ctx = null;
for (; ; ) {
String line = br.readLine();
if (line == null) break;
line = line.trim();
if (line.length() == 0 || line.startsWith("#")) continue;
if (line.startsWith(CONTEXT_TAG)) {
ctx = listeners.get(line.substring(CONTEXT_TAG.length()));
continue;
}
if (line.startsWith(VERSION_TAG)) {
lookupVersion = Short.parseShort(line.substring(VERSION_TAG.length()));
continue;
}
if (line.startsWith(MINOR_VERSION_TAG)) {
lookupMinorVersion = Short.parseShort(line.substring(MINOR_VERSION_TAG.length()));
continue;
}
if (line.startsWith(MIN_APP_VERSION_TAG)) {
minAppVersion = Short.parseShort(line.substring(MIN_APP_VERSION_TAG.length()));
continue;
}
if (line.startsWith(VARLENGTH_TAG)) { // tag removed...
continue;
}
if (ctx != null) ctx.parseMetaLine(line);
}
if ( line.startsWith( VERSION_TAG ) )
{
lookupVersion = Short.parseShort( line.substring( VERSION_TAG.length() ) );
continue;
br.close();
for (BExpressionContext c : listeners.values()) {
c.finishMetaParsing();
}
if ( line.startsWith( MINOR_VERSION_TAG ) )
{
lookupMinorVersion = Short.parseShort( line.substring( MINOR_VERSION_TAG.length() ) );
continue;
}
if ( line.startsWith( VARLENGTH_TAG ) ) // tag removed...
{
continue;
}
if ( ctx != null ) ctx.parseMetaLine( line );
} catch (Exception e) {
throw new RuntimeException(e);
}
br.close();
for( BExpressionContext c : listeners.values() )
{
c.finishMetaParsing();
}
}
catch( Exception e )
{
throw new RuntimeException( e );
}
}
}
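`readMetaData()` above dispatches each line of the lookups file on a `---tag:` prefix and parses the remainder as a value. A standalone sketch of that prefix dispatch for the version tag (the tag constants mirror those in `BExpressionMetaData`; the class and method are illustrative only):

```java
// Sketch of the "---tag:value" line dispatch used in readMetaData() above.
final class MetaLineSketch {
  static final String VERSION_TAG = "---lookupversion:";
  static final String MINOR_VERSION_TAG = "---minorversion:";

  // returns the parsed version, or -1 if the line carries a different tag
  static short parseVersion(String line, String tag) {
    line = line.trim();
    if (line.isEmpty() || line.startsWith("#")) {
      return -1; // comments and blank lines are skipped
    }
    if (line.startsWith(tag)) {
      return Short.parseShort(line.substring(tag.length()));
    }
    return -1;
  }
}
```

Because every tag is a line prefix, new tags (like the retired `---readvarlength`) can be added or ignored without changing the file format.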


@@ -4,29 +4,24 @@ import java.util.Arrays;
import btools.util.LruMapNode;
public final class CacheNode extends LruMapNode
{
public final class CacheNode extends LruMapNode {
byte[] ab;
float[] vars;
@Override
public int hashCode()
{
public int hashCode() {
return hash;
}
@Override
public boolean equals( Object o )
{
public boolean equals(Object o) {
CacheNode n = (CacheNode) o;
if ( hash != n.hash )
{
if (hash != n.hash) {
return false;
}
if ( ab == null )
{
if (ab == null) {
return true; // hack: null = crc match only
}
return Arrays.equals( ab, n.ab );
return Arrays.equals(ab, n.ab);
}
}


@@ -0,0 +1,49 @@
package btools.expressions;
import java.io.File;
public class IntegrityCheckProfile {
public static void main(final String[] args) {
if (args.length != 2) {
System.out.println("usage: java IntegrityCheckProfile <lookup-file> <profile-folder>");
return;
}
IntegrityCheckProfile test = new IntegrityCheckProfile();
try {
File lookupFile = new File(args[0]);
File profileDir = new File(args[1]);
test.integrityTestProfiles(lookupFile, profileDir);
} catch (Exception e) {
System.err.println(e.getMessage());
}
}
public void integrityTestProfiles(File lookupFile, File profileDir) {
File[] files = profileDir.listFiles();
if (files == null) {
System.err.println("no files " + profileDir);
return;
}
if (!lookupFile.exists()) {
System.err.println("no lookup file " + lookupFile);
return;
}
for (File f : files) {
if (f.getName().endsWith(".brf")) {
BExpressionMetaData meta = new BExpressionMetaData();
BExpressionContext expctxWay = new BExpressionContextWay(meta);
BExpressionContext expctxNode = new BExpressionContextNode(meta);
meta.readMetaData(lookupFile);
expctxNode.setForeignContext(expctxWay);
expctxWay.parseFile(f, "global");
expctxNode.parseFile(f, "global");
System.out.println("test " + meta.lookupVersion + "." + meta.lookupMinorVersion + " " + f);
}
}
}
}


@@ -3,45 +3,50 @@ package btools.expressions;
import java.io.File;
import java.util.Random;
public final class ProfileComparator
{
public static void main( String[] args )
{
if ( args.length != 4 )
{
System.out.println( "usage: java ProfileComparator <lookup-file> <profile1> <profile2> <nsamples>" );
public final class ProfileComparator {
public static void main(String[] args) {
if (args.length != 4) {
System.out.println("usage: java ProfileComparator <lookup-file> <profile1> <profile2> <nsamples>");
return;
}
File lookupFile = new File( args[0] );
File profile1File = new File( args[1] );
File profile2File = new File( args[2] );
int nsamples = Integer.parseInt( args[3] );
testContext( lookupFile, profile1File, profile2File, nsamples, false );
testContext( lookupFile, profile1File, profile2File, nsamples, true );
File lookupFile = new File(args[0]);
File profile1File = new File(args[1]);
File profile2File = new File(args[2]);
int nsamples = Integer.parseInt(args[3]);
testContext(lookupFile, profile1File, profile2File, nsamples, false);
testContext(lookupFile, profile1File, profile2File, nsamples, true);
}
private static void testContext( File lookupFile, File profile1File, File profile2File, int nsamples, boolean nodeContext )
{
private static void testContext(File lookupFile, File profile1File, File profile2File, int nsamples, boolean nodeContext) {
// read lookup.dat + profiles
BExpressionMetaData meta1 = new BExpressionMetaData();
BExpressionMetaData meta2 = new BExpressionMetaData();
BExpressionContext expctx1 = nodeContext ? new BExpressionContextNode( meta1 ) : new BExpressionContextWay( meta1 );
BExpressionContext expctx2 = nodeContext ? new BExpressionContextNode( meta2 ) : new BExpressionContextWay( meta2 );
meta1.readMetaData( lookupFile );
meta2.readMetaData( lookupFile );
expctx1.parseFile( profile1File, "global" );
expctx2.parseFile( profile2File, "global" );
BExpressionContext expctx1 = nodeContext ? new BExpressionContextNode(meta1) : new BExpressionContextWay(meta1);
BExpressionContext expctx2 = nodeContext ? new BExpressionContextNode(meta2) : new BExpressionContextWay(meta2);
// if same profiles, compare different optimization levels
if (profile1File.getName().equals(profile2File.getName())) {
expctx2.skipConstantExpressionOptimizations = true;
}
meta1.readMetaData(lookupFile);
meta2.readMetaData(lookupFile);
expctx1.parseFile(profile1File, "global");
System.out.println("usedTags1=" + expctx1.usedTagList());
expctx2.parseFile(profile2File, "global");
System.out.println("usedTags2=" + expctx2.usedTagList());
System.out.println("nodeContext=" + nodeContext + " nodeCount1=" + expctx1.expressionNodeCount + " nodeCount2=" + expctx2.expressionNodeCount);
Random rnd = new Random();
for (int i = 0; i < nsamples; i++) {
int[] data = expctx1.generateRandomValues(rnd);
expctx1.evaluate(data);
expctx2.evaluate(data);
expctx1.assertAllVariablesEqual(expctx2);
}
}
}
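ProfileComparator is a randomized differential test: it feeds identical random inputs to two expression contexts and asserts that every variable comes out equal. A self-contained sketch of that pattern with stand-in evaluators (`evalA`/`evalB` are hypothetical, not BRouter APIs):

```java
import java.util.Random;

public class DifferentialTestSketch {
    // Two implementations that must agree; stand-ins for the optimized
    // and unoptimized expression contexts compared above.
    static int evalA(int x) { return x * 4; }          // "optimized" form
    static int evalB(int x) { return x + x + x + x; }  // "unoptimized" form

    static boolean agreeOnRandomSamples(int nsamples, long seed) {
        Random rnd = new Random(seed); // fixed seed keeps the run reproducible
        for (int i = 0; i < nsamples; i++) {
            int x = rnd.nextInt(1000);
            if (evalA(x) != evalB(x)) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        if (!agreeOnRandomSamples(10000, 17464L)) throw new AssertionError("implementations diverge");
        System.out.println("ok");
    }
}
```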

@@ -4,24 +4,20 @@ import java.util.Arrays;
import btools.util.LruMapNode;
public final class VarWrapper extends LruMapNode {
float[] vars;
@Override
public int hashCode() {
return hash;
}
@Override
public boolean equals(Object o) {
VarWrapper n = (VarWrapper) o;
if (hash != n.hash) {
return false;
}
return Arrays.equals(vars, n.vars);
}
}

@@ -0,0 +1,50 @@
package btools.expressions;
import org.junit.Assert;
import org.junit.Test;
import java.io.File;
import java.util.Random;
import java.util.Map;
import java.util.HashMap;
public class ConstantOptimizerTest {
@Test
public void compareOptimizerModesTest() {
File lookupFile = new File(getClass().getResource("/lookups_test.dat").getPath());
File profileFile = new File(getClass().getResource("/profile_test.brf").getPath());
BExpressionMetaData meta1 = new BExpressionMetaData();
BExpressionMetaData meta2 = new BExpressionMetaData();
BExpressionContext expctx1 = new BExpressionContextWay(meta1);
BExpressionContext expctx2 = new BExpressionContextWay(meta2);
expctx2.skipConstantExpressionOptimizations = true;
Map<String, String> keyValue = new HashMap<>();
keyValue.put("global_inject1", "5");
keyValue.put("global_inject2", "6");
keyValue.put("global_inject3", "7");
meta1.readMetaData(lookupFile);
meta2.readMetaData(lookupFile);
expctx1.parseFile(profileFile, "global", keyValue);
expctx2.parseFile(profileFile, "global", keyValue);
float d = 0.0001f;
Assert.assertEquals(5f, expctx1.getVariableValue("global_inject1", 0f), d);
Assert.assertEquals(9f, expctx1.getVariableValue("global_inject2", 0f), d); // should be modified by the 2nd assign!
Assert.assertEquals(7f, expctx1.getVariableValue("global_inject3", 0f), d);
Assert.assertEquals(3f, expctx1.getVariableValue("global_inject4", 3f), d); // un-assigned
Assert.assertTrue("expected far fewer expression nodes when optimized", expctx2.expressionNodeCount - expctx1.expressionNodeCount >= 311 - 144);
Random rnd = new Random(17464); // fixed seed for unit test...
for (int i = 0; i < 10000; i++) {
int[] data = expctx1.generateRandomValues(rnd);
expctx1.evaluate(data);
expctx2.evaluate(data);
expctx1.assertAllVariablesEqual(expctx2);
}
}
}

@@ -1,60 +1,57 @@
package btools.expressions;
import java.io.File;
import java.net.URL;
import org.junit.Assert;
import org.junit.Test;
public class EncodeDecodeTest {
@Test
public void encodeDecodeTest() {
URL testpurl = this.getClass().getResource("/dummy.txt");
File workingDir = new File(testpurl.getFile()).getParentFile();
File profileDir = new File(workingDir, "/../../../../misc/profiles2");
//File lookupFile = new File( profileDir, "lookups.dat" );
// add a test lookup
URL testlookup = this.getClass().getResource("/lookups_test.dat");
File lookupFile = new File(testlookup.getPath());
// read lookup.dat + trekking.brf
BExpressionMetaData meta = new BExpressionMetaData();
BExpressionContextWay expctxWay = new BExpressionContextWay(meta);
meta.readMetaData(lookupFile);
expctxWay.parseFile(new File(profileDir, "trekking.brf"), "global");
String[] tags = {
"highway=residential",
"oneway=yes",
"depth=1'6\"",
// "depth=6 feet",
"maxheight=5.1m",
"maxdraft=~3 m - 4 m",
"reversedirection=yes"
};
// encode the tags into 64 bit description word
int[] lookupData = expctxWay.createNewLookupData();
for (String arg : tags) {
int idx = arg.indexOf('=');
if (idx < 0)
throw new IllegalArgumentException("bad argument (should be <tag>=<value>): " + arg);
String key = arg.substring(0, idx);
String value = arg.substring(idx + 1);
expctxWay.addLookupValue(key, value, lookupData);
}
byte[] description = expctxWay.encode(lookupData);
// calculate the cost factor from that description
expctxWay.evaluate(true, description); // true = "reversedirection=yes" (not encoded in description anymore)
System.out.println("description: " + expctxWay.getKeyValueDescription(true, description));
float costfactor = expctxWay.getCostfactor();
Assert.assertTrue("costfactor mismatch", Math.abs(costfactor - 5.15) < 0.00001);
}
}

@@ -0,0 +1,33 @@
package btools.expressions;
import static org.junit.Assert.assertNotNull;
import org.junit.Test;
import java.io.File;
import java.io.IOException;
public class IntegrityCheckProfileTest {
@Test
public void integrityTestProfiles() throws IOException {
File workingDir = new File(".").getCanonicalFile();
File profileDir = new File(workingDir, "../misc/profiles2");
File[] files = profileDir.listFiles();
assertNotNull("Missing profiles", files);
for (File f : files) {
if (f.getName().endsWith(".brf")) {
BExpressionMetaData meta = new BExpressionMetaData();
BExpressionContext expctxWay = new BExpressionContextWay(meta);
BExpressionContext expctxNode = new BExpressionContextNode(meta);
meta.readMetaData(new File(profileDir, "lookups.dat"));
expctxNode.setForeignContext(expctxWay);
expctxWay.parseFile(f, "global");
expctxNode.parseFile(f, "global");
}
}
}
}

@@ -687,6 +687,35 @@ construction;0000000037 driveway
construction;0000000021 mini_roundabout
construction;0000000020 turning_loop
estimated_forest_class;0000000001 1
estimated_forest_class;0000000001 2
estimated_forest_class;0000000001 3
estimated_forest_class;0000000001 4
estimated_forest_class;0000000001 5
estimated_forest_class;0000000001 6
estimated_noise_class;0000000001 1
estimated_noise_class;0000000001 2
estimated_noise_class;0000000001 3
estimated_noise_class;0000000001 4
estimated_noise_class;0000000001 5
estimated_noise_class;0000000001 6
estimated_river_class;0000000001 1
estimated_river_class;0000000001 2
estimated_river_class;0000000001 3
estimated_river_class;0000000001 4
estimated_river_class;0000000001 5
estimated_river_class;0000000001 6
estimated_town_class;0000000001 1
estimated_town_class;0000000001 2
estimated_town_class;0000000001 3
estimated_town_class;0000000001 4
estimated_town_class;0000000001 5
estimated_town_class;0000000001 6
---context:node
highway;0001314954 bus_stop

@@ -0,0 +1,88 @@
---context:global # following code refers to global config
assign global_false = false
assign global_true = true
assign global_and = and false global_true
assign global_inject1 = 5
assign global_inject2 = 13
assign global_inject2 = add global_inject2 3
assign global_or = or ( or global_true global_false ) ( or global_false global_true )
assign global_and = and ( and global_true true ) false
---context:way # following code refers to way-tags
assign v = highway=primary
assign w = surface=asphalt
# test constant or/and
assign costfactor =
add multiply 1 or 1 1
add multiply 2 or 1 0
add multiply 4 or 0 1
add multiply 8 or 0 0
add multiply 16 and 1 1
add multiply 32 and 1 1
add multiply 64 and 1 1
multiply 128 and 1 1
# test variable or
assign turncost =
add multiply 1 or v 1
add multiply 2 or v 0
add multiply 4 or 0 v
add multiply 8 or 1 v
multiply 16 or v w
# test variable and
assign uphillcostfactor =
add multiply 1 and v 1
add multiply 2 and v 0
add multiply 4 and 0 v
add multiply 8 and 1 v
multiply 16 and v w
# test add
assign downhillcostfactor =
add multiply 1 add 1 1
add multiply 2 add 1 0
add multiply 4 add 0 1
add multiply 8 add 0 0
add multiply 16 add v 1
add multiply 32 add v 0
add multiply 64 add 1 v
multiply 128 add 0 v
# test max
assign initialcost =
add multiply 1 max 1 2
add multiply 2 max multiply 2 v 1
add multiply 4 max 1 multiply 2 v
multiply 8 max multiply 2 v v
# test switch
assign initialclassifier =
add multiply 1 switch 1 1 0
add multiply 2 switch 0 1 0
add multiply 4 switch 1 0 1
add multiply 8 switch 0 0 1
add multiply 16 switch v 1 1
add multiply 32 switch v 0 1
add multiply 64 switch v 1 0
add multiply 128 switch v 0 1
multiply 256 switch 1 v w
# test global calcs
assign priorityclassifier =
add multiply 1 global_false
add multiply 2 global_true
add multiply 4 global_and
add multiply 8 global_inject1
add multiply 16 global_inject2
add multiply 32 global_or
multiply 64 global_and
---context:node # following code refers to node tags
assign initialcost = 1
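The test profile above multiplies each boolean sub-test by a distinct power of two, so one numeric variable encodes the outcome of every sub-test at once (e.g. the four or-tests contribute 1 + 2 + 4 + 0 = 7 to costfactor). A minimal Java sketch of that packing idea (class and method names are illustrative, not BRouter code):

```java
public class BitPackSketch {
    // Pack several boolean checks into one int: check i contributes 2^i when true,
    // so a single equality test verifies all checks at once.
    static int pack(boolean... checks) {
        int packed = 0;
        for (int i = 0; i < checks.length; i++) {
            if (checks[i]) packed |= (1 << i);
        }
        return packed;
    }

    public static void main(String[] args) {
        // or(1,1), or(1,0), or(0,1) are true, or(0,0) is false -> bits 0,1,2 set
        int packed = pack(true, true, true, false);
        if (packed != 7) throw new AssertionError("expected 7, got " + packed);
        System.out.println("ok");
    }
}
```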

@@ -1 +0,0 @@
/build/

@@ -1,12 +1,11 @@
plugins {
id 'java-library'
id 'brouter.application-conventions'
}
dependencies {
implementation project(':brouter-codec')
implementation project(':brouter-util')
implementation project(':brouter-expressions')
testImplementation('junit:junit:4.13.1')
implementation group: 'org.openstreetmap.osmosis', name: 'osmosis-osm-binary', version: '0.48.3'
}

@@ -0,0 +1,258 @@
package btools.mapcreator;
import com.google.protobuf.InvalidProtocolBufferException;
import org.openstreetmap.osmosis.osmbinary.Fileformat;
import org.openstreetmap.osmosis.osmbinary.Osmformat;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.zip.DataFormatException;
import java.util.zip.Inflater;
import btools.util.LongList;
/**
* Converts PBF block data into decoded entities ready to be passed into an Osmosis pipeline. This
* class is designed to be passed into a pool of worker threads to allow multi-threaded decoding.
* <p/>
*
* @author Brett Henderson
*/
public class BPbfBlobDecoder {
private String blobType;
private byte[] rawBlob;
private OsmParser parser;
/**
* Creates a new instance.
* <p/>
*
* @param blobType The type of blob.
* @param rawBlob The raw data of the blob.
* @param parser   The parser receiving the decoded entities.
*/
public BPbfBlobDecoder(String blobType, byte[] rawBlob, OsmParser parser) {
this.blobType = blobType;
this.rawBlob = rawBlob;
this.parser = parser;
}
public void process() throws Exception {
if ("OSMHeader".equals(blobType)) {
processOsmHeader(readBlobContent());
} else if ("OSMData".equals(blobType)) {
processOsmPrimitives(readBlobContent());
} else {
System.out.println("Skipping unrecognised blob type " + blobType);
}
}
private byte[] readBlobContent() throws IOException {
Fileformat.Blob blob = Fileformat.Blob.parseFrom(rawBlob);
byte[] blobData;
if (blob.hasRaw()) {
blobData = blob.getRaw().toByteArray();
} else if (blob.hasZlibData()) {
Inflater inflater = new Inflater();
inflater.setInput(blob.getZlibData().toByteArray());
blobData = new byte[blob.getRawSize()];
try {
inflater.inflate(blobData);
} catch (DataFormatException e) {
throw new RuntimeException("Unable to decompress PBF blob.", e);
}
if (!inflater.finished()) {
throw new RuntimeException("PBF blob contains incomplete compressed data.");
}
} else {
throw new RuntimeException("PBF blob uses unsupported compression, only raw or zlib may be used.");
}
return blobData;
}
private void processOsmHeader(byte[] data) throws InvalidProtocolBufferException {
Osmformat.HeaderBlock header = Osmformat.HeaderBlock.parseFrom(data);
// Build the list of active and unsupported features in the file.
List<String> supportedFeatures = Arrays.asList("OsmSchema-V0.6", "DenseNodes");
List<String> activeFeatures = new ArrayList<>();
List<String> unsupportedFeatures = new ArrayList<>();
for (String feature : header.getRequiredFeaturesList()) {
if (supportedFeatures.contains(feature)) {
activeFeatures.add(feature);
} else {
unsupportedFeatures.add(feature);
}
}
// We can't continue if there are any unsupported features. We wait
// until now so that we can display all unsupported features instead of
// just the first one we encounter.
if (unsupportedFeatures.size() > 0) {
throw new RuntimeException("PBF file contains unsupported features " + unsupportedFeatures);
}
}
private Map<String, String> buildTags(List<Integer> keys, List<Integer> values, BPbfFieldDecoder fieldDecoder) {
Iterator<Integer> keyIterator = keys.iterator();
Iterator<Integer> valueIterator = values.iterator();
if (keyIterator.hasNext()) {
Map<String, String> tags = new HashMap<>();
while (keyIterator.hasNext()) {
String key = fieldDecoder.decodeString(keyIterator.next());
String value = fieldDecoder.decodeString(valueIterator.next());
tags.put(key, value);
}
return tags;
}
return null;
}
private void processNodes(List<Osmformat.Node> nodes, BPbfFieldDecoder fieldDecoder) {
for (Osmformat.Node node : nodes) {
Map<String, String> tags = buildTags(node.getKeysList(), node.getValsList(), fieldDecoder);
parser.addNode(node.getId(), tags, fieldDecoder.decodeLatitude(node.getLat()), fieldDecoder.decodeLongitude(node.getLon()));
}
}
private void processNodes(Osmformat.DenseNodes nodes, BPbfFieldDecoder fieldDecoder) {
List<Long> idList = nodes.getIdList();
List<Long> latList = nodes.getLatList();
List<Long> lonList = nodes.getLonList();
Iterator<Integer> keysValuesIterator = nodes.getKeysValsList().iterator();
long nodeId = 0;
long latitude = 0;
long longitude = 0;
for (int i = 0; i < idList.size(); i++) {
// Delta decode node fields.
nodeId += idList.get(i);
latitude += latList.get(i);
longitude += lonList.get(i);
// Build the tags. The key and value string indexes are sequential
// in the same PBF array. Each set of tags is delimited by an index
// with a value of 0.
Map<String, String> tags = null;
while (keysValuesIterator.hasNext()) {
int keyIndex = keysValuesIterator.next();
if (keyIndex == 0) {
break;
}
int valueIndex = keysValuesIterator.next();
if (tags == null) {
tags = new HashMap<>();
}
tags.put(fieldDecoder.decodeString(keyIndex), fieldDecoder.decodeString(valueIndex));
}
parser.addNode(nodeId, tags, ((double) latitude) / 10000000, ((double) longitude) / 10000000);
}
}
private void processWays(List<Osmformat.Way> ways, BPbfFieldDecoder fieldDecoder) {
for (Osmformat.Way way : ways) {
Map<String, String> tags = buildTags(way.getKeysList(), way.getValsList(), fieldDecoder);
// Build up the list of way nodes for the way. The node ids are
// delta encoded meaning that each id is stored as a delta against
// the previous one.
long nodeId = 0;
LongList wayNodes = new LongList(16);
for (long nodeIdOffset : way.getRefsList()) {
nodeId += nodeIdOffset;
wayNodes.add(nodeId);
}
parser.addWay(way.getId(), tags, wayNodes);
}
}
private LongList fromWid;
private LongList toWid;
private LongList viaNid;
private LongList addLong(LongList ll, long l) {
if (ll == null) {
ll = new LongList(1);
}
ll.add(l);
return ll;
}
private LongList buildRelationMembers(
List<Long> memberIds, List<Integer> memberRoles, List<Osmformat.Relation.MemberType> memberTypes,
BPbfFieldDecoder fieldDecoder) {
LongList wayIds = new LongList(16);
fromWid = toWid = viaNid = null;
Iterator<Long> memberIdIterator = memberIds.iterator();
Iterator<Integer> memberRoleIterator = memberRoles.iterator();
Iterator<Osmformat.Relation.MemberType> memberTypeIterator = memberTypes.iterator();
// Build up the list of relation members for the way. The member ids are
// delta encoded meaning that each id is stored as a delta against
// the previous one.
long refId = 0;
while (memberIdIterator.hasNext()) {
Osmformat.Relation.MemberType memberType = memberTypeIterator.next();
refId += memberIdIterator.next();
String role = fieldDecoder.decodeString(memberRoleIterator.next());
if (memberType == Osmformat.Relation.MemberType.WAY) { // currently just waymembers
wayIds.add(refId);
if ("from".equals(role)) fromWid = addLong(fromWid, refId);
if ("to".equals(role)) toWid = addLong(toWid, refId);
}
if (memberType == Osmformat.Relation.MemberType.NODE) { // via node members only
if ("via".equals(role)) viaNid = addLong(viaNid, refId);
}
}
return wayIds;
}
private void processRelations(List<Osmformat.Relation> relations, BPbfFieldDecoder fieldDecoder) {
for (Osmformat.Relation relation : relations) {
Map<String, String> tags = buildTags(relation.getKeysList(), relation.getValsList(), fieldDecoder);
LongList wayIds = buildRelationMembers(relation.getMemidsList(), relation.getRolesSidList(),
relation.getTypesList(), fieldDecoder);
parser.addRelation(relation.getId(), tags, wayIds, fromWid, toWid, viaNid);
}
}
private void processOsmPrimitives(byte[] data) throws InvalidProtocolBufferException {
Osmformat.PrimitiveBlock block = Osmformat.PrimitiveBlock.parseFrom(data);
BPbfFieldDecoder fieldDecoder = new BPbfFieldDecoder(block);
for (Osmformat.PrimitiveGroup primitiveGroup : block.getPrimitivegroupList()) {
processNodes(primitiveGroup.getDense(), fieldDecoder);
processNodes(primitiveGroup.getNodesList(), fieldDecoder);
processWays(primitiveGroup.getWaysList(), fieldDecoder);
processRelations(primitiveGroup.getRelationsList(), fieldDecoder);
}
}
}
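The dense-node branch above delta-decodes the id/lat/lon arrays: each stored value is an offset against the previous one, reconstructed by a running sum. A standalone sketch of that decode (illustrative names, not part of the decoder):

```java
public class DeltaDecodeSketch {
    // Dense nodes store ids (and coordinates) as deltas against the previous
    // value; decoding is a running sum over the delta array.
    static long[] decodeDeltas(long[] deltas) {
        long[] out = new long[deltas.length];
        long acc = 0;
        for (int i = 0; i < deltas.length; i++) {
            acc += deltas[i];
            out[i] = acc;
        }
        return out;
    }

    public static void main(String[] args) {
        long[] ids = decodeDeltas(new long[]{100, 1, 1, 5}); // -> 100, 101, 102, 107
        if (ids[3] != 107) throw new AssertionError();
        System.out.println("ok");
    }
}
```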

@@ -0,0 +1,84 @@
package btools.mapcreator;
import java.util.Date;
import org.openstreetmap.osmosis.osmbinary.Osmformat;
/**
* Manages decoding of the lower level PBF data structures.
* <p/>
*
* @author Brett Henderson
* <p/>
*/
public class BPbfFieldDecoder {
private static final double COORDINATE_SCALING_FACTOR = 0.000000001;
private String[] strings;
private int coordGranularity;
private long coordLatitudeOffset;
private long coordLongitudeOffset;
private int dateGranularity;
/**
* Creates a new instance.
* <p/>
*
* @param primitiveBlock The primitive block containing the fields to be decoded.
*/
public BPbfFieldDecoder(Osmformat.PrimitiveBlock primitiveBlock) {
this.coordGranularity = primitiveBlock.getGranularity();
this.coordLatitudeOffset = primitiveBlock.getLatOffset();
this.coordLongitudeOffset = primitiveBlock.getLonOffset();
this.dateGranularity = primitiveBlock.getDateGranularity();
Osmformat.StringTable stringTable = primitiveBlock.getStringtable();
strings = new String[stringTable.getSCount()];
for (int i = 0; i < strings.length; i++) {
strings[i] = stringTable.getS(i).toStringUtf8();
}
}
/**
* Decodes a raw latitude value into degrees.
* <p/>
*
* @param rawLatitude The PBF encoded value.
* @return The latitude in degrees.
*/
public double decodeLatitude(long rawLatitude) {
return COORDINATE_SCALING_FACTOR * (coordLatitudeOffset + (coordGranularity * rawLatitude));
}
/**
* Decodes a raw longitude value into degrees.
* <p/>
*
* @param rawLongitude The PBF encoded value.
* @return The longitude in degrees.
*/
public double decodeLongitude(long rawLongitude) {
return COORDINATE_SCALING_FACTOR * (coordLongitudeOffset + (coordGranularity * rawLongitude));
}
/**
* Decodes a raw timestamp value into a Date.
* <p/>
*
* @param rawTimestamp The PBF encoded timestamp.
* @return The timestamp as a Date.
*/
public Date decodeTimestamp(long rawTimestamp) {
return new Date(dateGranularity * rawTimestamp);
}
/**
* Decodes a raw string-table index into a String.
* <p/>
*
* @param rawString The PBF string-table index.
* @return The decoded String.
*/
public String decodeString(int rawString) {
return strings[rawString];
}
}
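decodeLatitude/decodeLongitude above implement the PBF coordinate formula: degrees = 1e-9 * (offset + granularity * raw). A small sketch of that arithmetic, using the format's default granularity of 100 nanodegrees (illustrative class name):

```java
public class CoordDecodeSketch {
    static final double SCALE = 0.000000001; // nanodegrees to degrees, as in the decoder above

    // degrees = 1e-9 * (offset + granularity * raw)
    static double decode(long offset, int granularity, long raw) {
        return SCALE * (offset + (long) granularity * raw);
    }

    public static void main(String[] args) {
        // With the PBF default granularity of 100 and no offset, raw 520000000 -> 52.0 degrees.
        double deg = decode(0L, 100, 520_000_000L);
        if (Math.abs(deg - 52.0) > 1e-9) throw new AssertionError();
        System.out.println("ok");
    }
}
```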

@@ -1,220 +0,0 @@
package btools.mapcreator;
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.DataInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
public class ConvertLidarTile
{
public static int NROWS;
public static int NCOLS;
public static final short NODATA2 = -32767; // hgt-formats nodata
public static final short NODATA = Short.MIN_VALUE;
static short[] imagePixels;
private static void readHgtZip( String filename, int rowOffset, int colOffset ) throws Exception
{
ZipInputStream zis = new ZipInputStream( new BufferedInputStream( new FileInputStream( filename ) ) );
try
{
for ( ;; )
{
ZipEntry ze = zis.getNextEntry();
if ( ze.getName().endsWith( ".hgt" ) )
{
readHgtFromStream( zis, rowOffset, colOffset );
return;
}
}
}
finally
{
zis.close();
}
}
private static void readHgtFromStream( InputStream is, int rowOffset, int colOffset )
throws Exception
{
DataInputStream dis = new DataInputStream( new BufferedInputStream( is ) );
for ( int ir = 0; ir < 1201; ir++ )
{
int row = rowOffset + ir;
for ( int ic = 0; ic < 1201; ic++ )
{
int col = colOffset + ic;
int i1 = dis.read(); // msb first!
int i0 = dis.read();
if ( i0 == -1 || i1 == -1 )
throw new RuntimeException( "unexpected end of file reading hgt entry!" );
short val = (short) ( ( i1 << 8 ) | i0 );
if ( val == NODATA2 )
{
val = NODATA;
}
setPixel( row, col, val );
}
}
}
private static void setPixel( int row, int col, short val )
{
if ( row >= 0 && row < NROWS && col >= 0 && col < NCOLS )
{
imagePixels[row * NCOLS + col] = val;
}
}
private static short getPixel( int row, int col )
{
if ( row >= 0 && row < NROWS && col >= 0 && col < NCOLS )
{
return imagePixels[row * NCOLS + col];
}
return NODATA;
}
public static void doConvert( String inputDir, int lonDegreeStart, int latDegreeStart, String outputFile ) throws Exception
{
int extraBorder = 0;
NROWS = 5 * 1200 + 1 + 2 * extraBorder;
NCOLS = 5 * 1200 + 1 + 2 * extraBorder;
imagePixels = new short[NROWS * NCOLS]; // 650 MB !
// prefill as NODATA
for ( int row = 0; row < NROWS; row++ )
{
for ( int col = 0; col < NCOLS; col++ )
{
imagePixels[row * NCOLS + col] = NODATA;
}
}
for ( int latIdx = -1; latIdx <= 5; latIdx++ )
{
int latDegree = latDegreeStart + latIdx;
int rowOffset = extraBorder + ( 4 - latIdx ) * 1200;
for ( int lonIdx = -1; lonIdx <= 5; lonIdx++ )
{
int lonDegree = lonDegreeStart + lonIdx;
int colOffset = extraBorder + lonIdx * 1200;
String filename = inputDir + "/" + formatLat( latDegree ) + formatLon( lonDegree ) + ".zip";
File f = new File( filename );
if ( f.exists() && f.length() > 0 )
{
System.out.println( "exist: " + filename );
readHgtZip( filename, rowOffset, colOffset );
}
else
{
System.out.println( "none : " + filename );
}
}
}
boolean halfCol5 = false; // no halfcol tiles in lidar data (?)
SrtmRaster raster = new SrtmRaster();
raster.nrows = NROWS;
raster.ncols = NCOLS;
raster.halfcol = halfCol5;
raster.noDataValue = NODATA;
raster.cellsize = 1 / 1200.;
raster.xllcorner = lonDegreeStart - ( 0.5 + extraBorder ) * raster.cellsize;
raster.yllcorner = latDegreeStart - ( 0.5 + extraBorder ) * raster.cellsize;
raster.eval_array = imagePixels;
// encode the raster
OutputStream os = new BufferedOutputStream( new FileOutputStream( outputFile ) );
new RasterCoder().encodeRaster( raster, os );
os.close();
// decode the raster
InputStream is = new BufferedInputStream( new FileInputStream( outputFile ) );
SrtmRaster raster2 = new RasterCoder().decodeRaster( is );
is.close();
short[] pix2 = raster2.eval_array;
if ( pix2.length != imagePixels.length )
throw new RuntimeException( "length mismatch!" );
// compare decoding result
for ( int row = 0; row < NROWS; row++ )
{
int colstep = halfCol5 ? 2 : 1;
for ( int col = 0; col < NCOLS; col += colstep )
{
int idx = row * NCOLS + col;
short p2 = pix2[idx];
if ( p2 != imagePixels[idx] )
{
throw new RuntimeException( "content mismatch: p2=" + p2 + " p1=" + imagePixels[idx] );
}
}
}
}
private static String formatLon( int lon )
{
if ( lon >= 180 )
lon -= 180; // TODO: W180 or E180 ?
String s = "E";
if ( lon < 0 )
{
lon = -lon;
s = "E";
}
String n = "000" + lon;
return s + n.substring( n.length() - 3 );
}
private static String formatLat( int lat )
{
String s = "N";
if ( lat < 0 )
{
lat = -lat;
s = "S";
}
String n = "00" + lat;
return s + n.substring( n.length() - 2 );
}
public static void main( String[] args ) throws Exception
{
String filename90 = args[0];
String filename30 = filename90.substring( 0, filename90.length() - 3 ) + "bef";
int srtmLonIdx = Integer.parseInt( filename90.substring( 5, 7 ).toLowerCase() );
int srtmLatIdx = Integer.parseInt( filename90.substring( 8, 10 ).toLowerCase() );
int ilon_base = ( srtmLonIdx - 1 ) * 5 - 180;
int ilat_base = 150 - srtmLatIdx * 5 - 90;
doConvert( args[1], ilon_base, ilat_base, filename30 );
}
}
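readHgtFromStream above assembles each elevation sample from two bytes, MSB first, and maps the .hgt nodata marker -32767 to its own NODATA constant. A sketch of that byte assembly (illustrative names):

```java
public class HgtSampleSketch {
    // .hgt files store elevations as big-endian signed 16-bit values (msb first).
    static short toSigned16(int msb, int lsb) {
        return (short) ((msb << 8) | lsb);
    }

    public static void main(String[] args) {
        if (toSigned16(0x01, 0x02) != 258) throw new AssertionError();
        // 0x8001 is -32767 as a signed short: the hgt nodata marker used above.
        if (toSigned16(0x80, 0x01) != -32767) throw new AssertionError();
        System.out.println("ok");
    }
}
```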

@@ -1,311 +0,0 @@
package btools.mapcreator;
import java.io.*;
import java.util.zip.*;
public class ConvertSrtmTile
{
public static int NROWS;
public static int NCOLS;
public static final short SKIPDATA = -32766; // >50 degree skipped pixel
public static final short NODATA2 = -32767; // bil-formats nodata
public static final short NODATA = Short.MIN_VALUE;
static short[] imagePixels;
public static int[] diffs = new int[100];
private static void readBilZip( String filename, int rowOffset, int colOffset, boolean halfCols ) throws Exception
{
ZipInputStream zis = new ZipInputStream( new BufferedInputStream( new FileInputStream( filename ) ) );
try
{
for ( ;; )
{
ZipEntry ze = zis.getNextEntry();
if ( ze.getName().endsWith( ".bil" ) )
{
readBilFromStream( zis, rowOffset, colOffset, halfCols );
return;
}
}
}
finally
{
zis.close();
}
}
private static void readBilFromStream( InputStream is, int rowOffset, int colOffset, boolean halfCols )
throws Exception
{
DataInputStream dis = new DataInputStream( new BufferedInputStream( is ) );
for ( int ir = 0; ir < 3601; ir++ )
{
int row = rowOffset + ir;
for ( int ic = 0; ic < 3601; ic++ )
{
int col = colOffset + ic;
if ( ( ic % 2 ) == 1 && halfCols )
{
if ( getPixel( row, col ) == NODATA )
{
setPixel( row, col, SKIPDATA );
}
continue;
}
int i0 = dis.read();
int i1 = dis.read();
if ( i0 == -1 || i1 == -1 )
throw new RuntimeException( "unexpected end of file reading bil entry!" );
short val = (short) ( ( i1 << 8 ) | i0 );
if ( val == NODATA2 )
{
val = NODATA;
}
setPixel( row, col, val );
}
}
}
private static void setPixel( int row, int col, short val )
{
if ( row >= 0 && row < NROWS && col >= 0 && col < NCOLS )
{
imagePixels[row * NCOLS + col] = val;
}
}
private static short getPixel( int row, int col )
{
if ( row >= 0 && row < NROWS && col >= 0 && col < NCOLS )
{
return imagePixels[row * NCOLS + col];
}
return NODATA;
}
public static void doConvert( String inputDir, String v1Dir, int lonDegreeStart, int latDegreeStart, String outputFile, SrtmRaster raster90 ) throws Exception
{
int extraBorder = 10;
int datacells = 0;
int mismatches = 0;
NROWS = 5 * 3600 + 1 + 2 * extraBorder;
NCOLS = 5 * 3600 + 1 + 2 * extraBorder;
imagePixels = new short[NROWS * NCOLS]; // 650 MB !
// prefill as NODATA
for ( int row = 0; row < NROWS; row++ )
{
for ( int col = 0; col < NCOLS; col++ )
{
imagePixels[row * NCOLS + col] = NODATA;
}
}
for ( int latIdx = -1; latIdx <= 5; latIdx++ )
{
int latDegree = latDegreeStart + latIdx;
int rowOffset = extraBorder + ( 4 - latIdx ) * 3600;
for ( int lonIdx = -1; lonIdx <= 5; lonIdx++ )
{
int lonDegree = lonDegreeStart + lonIdx;
int colOffset = extraBorder + lonIdx * 3600;
String filename = inputDir + "/" + formatLat( latDegree ) + "_" + formatLon( lonDegree ) + "_1arc_v3_bil.zip";
File f = new File( filename );
if ( f.exists() && f.length() > 0 )
{
System.out.println( "exist: " + filename );
boolean halfCol = latDegree >= 50 || latDegree < -50;
readBilZip( filename, rowOffset, colOffset, halfCol );
}
else
{
System.out.println( "none : " + filename );
}
}
}
boolean halfCol5 = latDegreeStart >= 50 || latDegreeStart < -50;
for ( int row90 = 0; row90 < 6001; row90++ )
{
int crow = 3 * row90 + extraBorder; // center row of 3x3
for ( int col90 = 0; col90 < 6001; col90++ )
{
int ccol = 3 * col90 + extraBorder; // center col of 3x3
// evaluate 3x3 area
if ( raster90 != null && (!halfCol5 || (col90 % 2) == 0 ) )
{
short v90 = raster90.eval_array[row90 * 6001 + col90];
int sum = 0;
int nodatas = 0;
int datas = 0;
int colstep = halfCol5 ? 2 : 1;
for ( int row = crow - 1; row <= crow + 1; row++ )
{
for ( int col = ccol - colstep; col <= ccol + colstep; col += colstep )
{
short v30 = imagePixels[row * NCOLS + col];
if ( v30 == NODATA )
{
nodatas++;
}
else if ( v30 != SKIPDATA )
{
sum += v30;
datas++;
}
}
}
boolean doReplace = nodatas > 0 || v90 == NODATA || datas < 7;
if ( !doReplace )
{
datacells++;
int diff = sum - datas * v90;
if ( diff < -4 || diff > 4 )
{
doReplace = true;
mismatches++;
}
if ( diff > -50 && diff < 50 && ( row90 % 1200 ) != 0 && ( col90 % 1200 ) != 0 )
{
diffs[diff + 50]++;
}
}
if ( doReplace )
{
for ( int row = crow - 1; row <= crow + 1; row++ )
{
for ( int col = ccol - colstep; col <= ccol + colstep; col += colstep )
{
imagePixels[row * NCOLS + col] = v90;
}
}
}
}
}
}
SrtmRaster raster = new SrtmRaster();
raster.nrows = NROWS;
raster.ncols = NCOLS;
raster.halfcol = halfCol5;
raster.noDataValue = NODATA;
raster.cellsize = 1 / 3600.;
raster.xllcorner = lonDegreeStart - ( 0.5 + extraBorder ) * raster.cellsize;
raster.yllcorner = latDegreeStart - ( 0.5 + extraBorder ) * raster.cellsize;
raster.eval_array = imagePixels;
// encode the raster
OutputStream os = new BufferedOutputStream( new FileOutputStream( outputFile ) );
new RasterCoder().encodeRaster( raster, os );
os.close();
// decode the raster
InputStream is = new BufferedInputStream( new FileInputStream( outputFile ) );
SrtmRaster raster2 = new RasterCoder().decodeRaster( is );
is.close();
short[] pix2 = raster2.eval_array;
if ( pix2.length != imagePixels.length )
throw new RuntimeException( "length mismatch!" );
// compare decoding result
for ( int row = 0; row < NROWS; row++ )
{
int colstep = halfCol5 ? 2 : 1;
for ( int col = 0; col < NCOLS; col += colstep )
{
int idx = row * NCOLS + col;
if ( imagePixels[idx] == SKIPDATA )
{
continue;
}
short p2 = pix2[idx];
if ( p2 > SKIPDATA )
{
p2 /= 2;
}
if ( p2 != imagePixels[idx] )
{
throw new RuntimeException( "content mismatch!" );
}
}
}
for(int i=1; i<100;i++) System.out.println( "diff[" + (i-50) + "] = " + diffs[i] );
System.out.println( "datacells=" + datacells + " mismatch%=" + (100.*mismatches)/datacells );
btools.util.MixCoderDataOutputStream.stats();
// test( raster );
// raster.calcWeights( 50. );
// test( raster );
// 39828330 &lon=3115280&layer=OpenStreetMap
}
private static void test( SrtmRaster raster )
{
int lat0 = 39828330;
int lon0 = 3115280;
for ( int iy = -9; iy <= 9; iy++ )
{
StringBuilder sb = new StringBuilder();
for ( int ix = -9; ix <= 9; ix++ )
{
int lat = lat0 + 90000000 - 100 * iy;
int lon = lon0 + 180000000 + 100 * ix;
int ival = (int) ( raster.getElevation( lon, lat ) / 4. );
String sval = " " + ival;
sb.append( sval.substring( sval.length() - 4 ) );
}
System.out.println( sb );
System.out.println();
}
}
private static String formatLon( int lon )
{
if ( lon >= 180 )
lon -= 180; // TODO: w180 or e180?
String s = "e";
if ( lon < 0 )
{
lon = -lon;
s = "w";
}
String n = "000" + lon;
return s + n.substring( n.length() - 3 );
}
private static String formatLat( int lat )
{
String s = "n";
if ( lat < 0 )
{
lat = -lat;
s = "s";
}
String n = "00" + lat;
return s + n.substring( n.length() - 2 );
}
}
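The `formatLon`/`formatLat` helpers above build hgt-style tile names such as `n47e008`: a hemisphere letter plus zero-padded degrees. A compact equivalent using `String.format` (the class name is illustrative, and the `lon >= 180` wrap-around of the original is omitted):

```java
// hgt tile naming as in formatLat/formatLon: hemisphere letter plus
// zero-padded degrees, e.g. n47e008 for the tile at 47N, 8E.
public class HgtNameDemo {
    static String tileName(int lat, int lon) {
        String ns = lat < 0 ? "s" : "n";
        String ew = lon < 0 ? "w" : "e";
        return String.format("%s%02d%s%03d", ns, Math.abs(lat), ew, Math.abs(lon));
    }

    public static void main(String[] args) {
        if (!tileName(47, 8).equals("n47e008")) throw new AssertionError();
        if (!tileName(-12, -77).equals("s12w077")) throw new AssertionError();
        System.out.println(tileName(47, 8));
    }
}
```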


@@ -1,60 +0,0 @@
package btools.mapcreator;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
public class ConvertUrlList
{
public static final short NODATA = -32767;
public static void main( String[] args ) throws Exception
{
BufferedReader br = new BufferedReader( new FileReader( args[0] ) );
for ( ;; )
{
String line = br.readLine();
if ( line == null )
{
break;
}
int idx1 = line.indexOf( "srtm_" );
if ( idx1 < 0 )
{
continue;
}
String filename90 = line.substring( idx1 );
String filename30 = filename90.substring( 0, filename90.length() - 3 ) + "bef";
if ( new File( filename30 ).exists() )
{
continue;
}
// int srtmLonIdx = (ilon+5000000)/5000000; -> ilon = (srtmLonIdx-1)*5
// int srtmLatIdx = (154999999-ilat)/5000000; -> ilat = 155 - srtmLatIdx*5
int srtmLonIdx = Integer.parseInt( filename90.substring( 5, 7 ).toLowerCase() );
int srtmLatIdx = Integer.parseInt( filename90.substring( 8, 10 ).toLowerCase() );
int ilon_base = ( srtmLonIdx - 1 ) * 5 - 180;
int ilat_base = 150 - srtmLatIdx * 5 - 90;
SrtmRaster raster90 = null;
File file90 = new File( new File( args[1] ), filename90 );
if ( file90.exists() )
{
System.out.println( "reading " + file90 );
raster90 = new SrtmData( file90 ).getRaster();
}
ConvertSrtmTile.doConvert( args[2], args[3], ilon_base, ilat_base, filename30, raster90 );
}
br.close();
}
}


@@ -0,0 +1,335 @@
package btools.mapcreator;
import java.awt.Color;
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferInt;
import java.io.BufferedInputStream;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.FileReader;
import java.io.InputStream;
import java.util.HashMap;
import java.util.Map;
import java.util.Map.Entry;
import java.util.TreeMap;
import javax.imageio.ImageIO;
public class CreateElevationRasterImage {
final static boolean DEBUG = false;
int[] data;
ElevationRaster lastSrtmRaster;
Map<String, ElevationRaster> srtmmap;
int lastSrtmLonIdx;
int lastSrtmLatIdx;
short maxElev = Short.MIN_VALUE;
short minElev = Short.MAX_VALUE;
String srtmdir;
boolean missingData;
Map<Short, Color> colorMap;
private void createImage(double lon, double lat, String dir, String imageName, int maxX, int maxY, int downscale, String format, String colors) throws Exception {
srtmdir = dir;
if (colors != null) {
loadColors(colors);
}
if (format.equals("hgt")) {
createImageFromHgt(lon, lat, dir, imageName, maxX, maxY);
return;
}
if (!format.equals("bef")) {
System.out.println("wrong format (bef|hgt)");
return;
}
srtmmap = new HashMap<>();
lastSrtmLonIdx = -1;
lastSrtmLatIdx = -1;
lastSrtmRaster = null;
NodeData n = new NodeData(1, lon, lat);
ElevationRaster srtm = srtmForNode(n.ilon, n.ilat);
if (srtm == null) {
System.out.println("no data");
return;
}
System.out.println("srtm " + srtm.toString());
//System.out.println("srtm elev " + srtm.getElevation(n.ilon, n.ilat));
double[] pos = getElevationPos(srtm, n.ilon, n.ilat);
//System.out.println("srtm pos " + Math.round(pos[0]) + " " + Math.round(pos[1]));
short[] raster = srtm.eval_array;
int rasterX = srtm.ncols;
int rasterY = srtm.nrows;
int tileSize = 1000 / downscale;
int sizeX = (maxX);
int sizeY = (maxY);
int[] imgraster = new int[sizeX * sizeY];
for (int y = 0; y < sizeY; y++) {
for (int x = 0; x < sizeX; x++) {
//short e = getElevationXY(srtm, pos[0] + (sizeY - y) * downscale, pos[1] + (x * downscale));
short e = get(srtm, (int) Math.round(pos[0]) + (sizeY - y), x + (int) Math.round(pos[1]));
if (e != Short.MIN_VALUE && e < minElev) minElev = e;
if (e != Short.MIN_VALUE && e > maxElev) maxElev = e;
if (e == Short.MIN_VALUE) {
imgraster[sizeY * y + x] = 0xffff;
} else {
//imgraster[sizeY * y + x] = getColorForHeight((short)(e/4)); //(int)(e/4.);
imgraster[sizeY * y + x] = getColorForHeight(e);
}
}
}
System.out.println("srtm target " + sizeX + " " + sizeY + " (" + rasterX + " " + rasterY + ")" + " min " + minElev + " max " + maxElev);
if (DEBUG) {
String out = "short ";
for (int i = 0; i < 100; i++) {
out += " " + get(srtm, sizeY - 0, i);
}
System.out.println(out);
}
BufferedImage argbImage = new BufferedImage(sizeX, sizeY, BufferedImage.TYPE_INT_ARGB);
data = ((DataBufferInt) argbImage.getRaster().getDataBuffer()).getData();
for (int y = 0; y < sizeY; y++) {
for (int x = 0; x < sizeX; x++) {
int v0 = imgraster[sizeX * y + x];
int rgb;
if (v0 != 0xffff)
rgb = 0xff000000 | v0; //(v0 << 8);
else
rgb = 0xff000000;
data[y * sizeX + x] = rgb;
}
}
ImageIO.write(argbImage, "png", new FileOutputStream(imageName));
}
private void createImageFromHgt(double lon, double lat, String dir, String imageName, int maxX, int maxY) throws Exception {
HgtReader rdr = new HgtReader(dir);
short[] data = rdr.getElevationDataFromHgt(lat, lon);
if (data == null) {
System.out.println("no data");
return;
}
int size = (data != null ? data.length : 0);
int rowlen = (int) Math.sqrt(size);
int sizeX = (maxX);
int sizeY = (maxY);
int[] imgraster = new int[sizeX * sizeY];
for (int y = 0; y < sizeY; y++) {
for (int x = 0; x < sizeX; x++) {
short e = data[(rowlen * y) + x];
if (e != HgtReader.HGT_VOID && e < minElev) minElev = e;
if (e != HgtReader.HGT_VOID && e > maxElev) maxElev = e;
if (e == HgtReader.HGT_VOID) {
imgraster[sizeY * y + x] = 0xffff;
} else if (e == 0) {
imgraster[sizeY * y + x] = 0xffff;
} else {
imgraster[sizeY * y + x] = getColorForHeight((short) (e));
}
}
}
System.out.println("hgt size " + rowlen + " x " + rowlen + " min " + minElev + " max " + maxElev);
if (DEBUG) {
String out = "short ";
for (int i = 0; i < 100; i++) {
out += " " + data[i];
}
System.out.println(out);
}
BufferedImage argbImage = new BufferedImage(sizeX, sizeY, BufferedImage.TYPE_INT_ARGB);
int[] idata = ((DataBufferInt) argbImage.getRaster().getDataBuffer()).getData();
for (int y = 0; y < sizeY; y++) {
for (int x = 0; x < sizeX; x++) {
int v0 = imgraster[sizeX * y + x];
int rgb;
if (v0 != 0xffff)
rgb = 0xff000000 | v0; //(v0 << 8);
else
rgb = 0xff000000;
idata[y * sizeX + x] = rgb;
}
}
ImageIO.write(argbImage, "png", new FileOutputStream(imageName));
}
private void loadColors(String colors) {
if (DEBUG) System.out.println("colors=" + colors);
File colFile = new File(colors);
if (colFile.exists()) {
BufferedReader reader = null;
colorMap = new TreeMap<>();
try {
reader = new BufferedReader(new FileReader(colors));
String line = reader.readLine();
while (line != null) {
if (DEBUG) System.out.println(line);
String[] sa = line.split(",");
if (!line.startsWith("#") && sa.length == 4) {
short e = Short.parseShort(sa[0].trim());
short r = Short.parseShort(sa[1].trim());
short g = Short.parseShort(sa[2].trim());
short b = Short.parseShort(sa[3].trim());
colorMap.put(e, new Color(r, g, b));
}
// read next line
line = reader.readLine();
}
} catch (Exception e) {
e.printStackTrace();
colorMap = null;
} finally {
if (reader != null) {
try {
reader.close();
} catch (Exception e) {
e.printStackTrace();
}
}
}
} else {
System.out.println("color file " + colors + " not found");
}
}
public double[] getElevationPos(ElevationRaster srtm, int ilon, int ilat) {
double lon = ilon / 1000000. - 180.;
double lat = ilat / 1000000. - 90.;
double dcol = (lon - srtm.xllcorner) / srtm.cellsize - 0.5;
double drow = (lat - srtm.yllcorner) / srtm.cellsize - 0.5;
int row = (int) drow;
int col = (int) dcol;
if (col < 0) col = 0;
if (row < 0) row = 0;
return new double[]{drow, dcol};
}
private short get(ElevationRaster srtm, int r, int c) {
short e = srtm.eval_array[(srtm.nrows - 1 - r) * srtm.ncols + c];
if (e == Short.MIN_VALUE) missingData = true;
return e;
}
public short getElevationXY(ElevationRaster srtm, double drow, double dcol) {
int row = (int) drow;
int col = (int) dcol;
if (col < 0) col = 0;
if (col >= srtm.ncols - 1) col = srtm.ncols - 2;
if (row < 0) row = 0;
if (row >= srtm.nrows - 1) row = srtm.nrows - 2;
double wrow = drow - row;
double wcol = dcol - col;
missingData = false;
double eval = (1. - wrow) * (1. - wcol) * get(srtm, row, col)
+ (wrow) * (1. - wcol) * get(srtm, row + 1, col)
+ (1. - wrow) * (wcol) * get(srtm, row, col + 1)
+ (wrow) * (wcol) * get(srtm, row + 1, col + 1);
return missingData ? Short.MIN_VALUE : (short) (eval * 4);
}
int getColorForHeight(short h) {
if (colorMap == null) {
colorMap = new TreeMap<>();
colorMap.put((short) 0, new Color(102, 153, 153));
colorMap.put((short) 1, new Color(0, 102, 0));
colorMap.put((short) 500, new Color(251, 255, 128));
colorMap.put((short) 1200, new Color(224, 108, 31));
colorMap.put((short) 2500, new Color(200, 55, 55));
colorMap.put((short) 4000, new Color(215, 244, 244));
colorMap.put((short) 8000, new Color(255, 244, 244));
}
Color lastColor = null;
short lastKey = 0;
for (Entry<Short, Color> entry : colorMap.entrySet()) {
short key = entry.getKey();
Color value = entry.getValue();
if (key == h) return value.getRGB();
if (lastColor != null && lastKey < h && key > h) {
double between = (double) (h - lastKey) / (key - lastKey);
return mixColors(value, lastColor, between);
}
lastColor = value;
lastKey = key;
}
return 0;
}
public int mixColors(Color color1, Color color2, double percent) {
double inverse_percent = 1.0 - percent;
int redPart = (int) (color1.getRed() * percent + color2.getRed() * inverse_percent);
int greenPart = (int) (color1.getGreen() * percent + color2.getGreen() * inverse_percent);
int bluePart = (int) (color1.getBlue() * percent + color2.getBlue() * inverse_percent);
return new Color(redPart, greenPart, bluePart).getRGB();
}
private ElevationRaster srtmForNode(int ilon, int ilat) throws Exception {
int srtmLonIdx = (ilon + 5000000) / 5000000;
int srtmLatIdx = (654999999 - ilat) / 5000000 - 100; // ugly negative rounding...
if (srtmLonIdx == lastSrtmLonIdx && srtmLatIdx == lastSrtmLatIdx) {
return lastSrtmRaster;
}
lastSrtmLonIdx = srtmLonIdx;
lastSrtmLatIdx = srtmLatIdx;
String slonidx = "0" + srtmLonIdx;
String slatidx = "0" + srtmLatIdx;
String filename = "srtm_" + slonidx.substring(slonidx.length() - 2) + "_" + slatidx.substring(slatidx.length() - 2);
lastSrtmRaster = srtmmap.get(filename);
if (lastSrtmRaster == null && !srtmmap.containsKey(filename)) {
File f = new File(new File(srtmdir), filename + ".bef");
if (f.exists()) {
System.out.println("*** reading: " + f);
try {
InputStream isc = new BufferedInputStream(new FileInputStream(f));
lastSrtmRaster = new ElevationRasterCoder().decodeRaster(isc);
isc.close();
} catch (Exception e) {
System.out.println("**** ERROR reading " + f + " ****");
}
srtmmap.put(filename, lastSrtmRaster);
return lastSrtmRaster;
}
srtmmap.put(filename, lastSrtmRaster);
}
return lastSrtmRaster;
}
public static void main(String[] args) throws Exception {
if (args.length < 6) {
System.out.println("usage: java CreateElevationRasterImage <lon> <lat> <srtm-folder> <imageFileName> <maxX> <maxY> <downscale> [type] [color_file]");
System.out.println("\nwhere: type = [bef|hgt] downscale = [1|2|4|..]");
return;
}
String format = args.length >= 8 ? args[7] : "bef";
String colors = args.length == 9 ? args[8] : null;
new CreateElevationRasterImage().createImage(Double.parseDouble(args[0]), Double.parseDouble(args[1]), args[2], args[3],
Integer.parseInt(args[4]), Integer.parseInt(args[5]), Integer.parseInt(args[6]), format, colors);
}
}
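`getColorForHeight` above walks the sorted color map and linearly mixes the two anchors bracketing the elevation. A compact sketch of the same piecewise-linear ramp on a single channel, using `TreeMap` navigation methods instead of the original's linear entry walk (the anchor values below are illustrative, not BRouter's defaults):

```java
import java.util.TreeMap;

// Piecewise-linear color ramp as in getColorForHeight: find the two
// anchor elevations bracketing h and mix their values proportionally.
// Anchor values below are illustrative, not BRouter's defaults.
public class ColorRampDemo {
    static final TreeMap<Integer, Integer> ramp = new TreeMap<>();
    static {
        ramp.put(0, 0);      // sea level -> channel value 0
        ramp.put(1000, 100); // 1000 m    -> channel value 100
        ramp.put(3000, 200); // 3000 m    -> channel value 200
    }

    static int channelForHeight(int h) {
        Integer lo = ramp.floorKey(h), hi = ramp.ceilingKey(h);
        if (lo == null) return ramp.firstEntry().getValue();  // below lowest anchor
        if (hi == null) return ramp.lastEntry().getValue();   // above highest anchor
        if (lo.equals(hi)) return ramp.get(lo);               // exact anchor hit
        double between = (double) (h - lo) / (hi - lo);
        return (int) (ramp.get(lo) * (1. - between) + ramp.get(hi) * between);
    }

    public static void main(String[] args) {
        if (channelForHeight(500) != 50) throw new AssertionError();
        if (channelForHeight(2000) != 150) throw new AssertionError();
        System.out.println("500m -> " + channelForHeight(500));
    }
}
```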


@@ -5,80 +5,67 @@
*/
package btools.mapcreator;
import java.util.ArrayList;
import java.util.List;
import btools.util.CheapRuler;
public class DPFilter
{
public class DPFilter {
private static double dp_sql_threshold = 0.4 * 0.4;
/*
* for each node (except first+last), possibly set the DP_SURVIVOR_BIT
*/
public static void doDPFilter( ArrayList<OsmNodeP> nodes )
{
public static void doDPFilter(List<OsmNodeP> nodes) {
int first = 0;
int last = nodes.size()-1;
while( first < last && (nodes.get(first+1).bits & OsmNodeP.DP_SURVIVOR_BIT) != 0 )
{
int last = nodes.size() - 1;
while (first < last && (nodes.get(first + 1).bits & OsmNodeP.DP_SURVIVOR_BIT) != 0) {
first++;
}
while( first < last && (nodes.get(last-1).bits & OsmNodeP.DP_SURVIVOR_BIT) != 0 )
{
while (first < last && (nodes.get(last - 1).bits & OsmNodeP.DP_SURVIVOR_BIT) != 0) {
last--;
}
if ( last - first > 1 )
{
doDPFilter( nodes, first, last );
if (last - first > 1) {
doDPFilter(nodes, first, last);
}
}
public static void doDPFilter( ArrayList<OsmNodeP> nodes, int first, int last )
{
public static void doDPFilter(List<OsmNodeP> nodes, int first, int last) {
double maxSqDist = -1.;
int index = -1;
OsmNodeP p1 = nodes.get( first );
OsmNodeP p2 = nodes.get( last );
OsmNodeP p1 = nodes.get(first);
OsmNodeP p2 = nodes.get(last);
double[] lonlat2m = CheapRuler.getLonLatToMeterScales( (p1.ilat+p2.ilat) >> 1 );
double[] lonlat2m = CheapRuler.getLonLatToMeterScales((p1.ilat + p2.ilat) >> 1);
double dlon2m = lonlat2m[0];
double dlat2m = lonlat2m[1];
double dx = (p2.ilon - p1.ilon) * dlon2m;
double dy = (p2.ilat - p1.ilat) * dlat2m;
double d2 = dx * dx + dy * dy;
for ( int i = first + 1; i < last; i++ )
{
OsmNodeP p = nodes.get( i );
for (int i = first + 1; i < last; i++) {
OsmNodeP p = nodes.get(i);
double t = 0.;
if ( d2 != 0f )
{
t = ( ( p.ilon - p1.ilon ) * dlon2m * dx + ( p.ilat - p1.ilat ) * dlat2m * dy ) / d2;
t = t > 1. ? 1. : ( t < 0. ? 0. : t );
if (d2 != 0f) {
t = ((p.ilon - p1.ilon) * dlon2m * dx + (p.ilat - p1.ilat) * dlat2m * dy) / d2;
t = t > 1. ? 1. : (t < 0. ? 0. : t);
}
double dx2 = (p.ilon - ( p1.ilon + t*( p2.ilon - p1.ilon ) ) ) * dlon2m;
double dy2 = (p.ilat - ( p1.ilat + t*( p2.ilat - p1.ilat ) ) ) * dlat2m;
double dx2 = (p.ilon - (p1.ilon + t * (p2.ilon - p1.ilon))) * dlon2m;
double dy2 = (p.ilat - (p1.ilat + t * (p2.ilat - p1.ilat))) * dlat2m;
double sqDist = dx2 * dx2 + dy2 * dy2;
if ( sqDist > maxSqDist )
{
if (sqDist > maxSqDist) {
index = i;
maxSqDist = sqDist;
}
}
if ( index >= 0 )
{
if ( index - first > 1 )
{
doDPFilter( nodes, first, index );
if (index >= 0) {
if (index - first > 1) {
doDPFilter(nodes, first, index);
}
if ( maxSqDist >= dp_sql_threshold )
{
nodes.get( index ).bits |= OsmNodeP.DP_SURVIVOR_BIT;
if (maxSqDist >= dp_sql_threshold) {
nodes.get(index).bits |= OsmNodeP.DP_SURVIVOR_BIT;
}
if ( last - index > 1 )
{
doDPFilter( nodes, index, last );
if (last - index > 1) {
doDPFilter(nodes, index, last);
}
}
}
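`DPFilter` is a Douglas-Peucker pass that marks survivors via a bit instead of deleting nodes, measuring the clamped point-to-segment distance against a squared threshold. A minimal standalone sketch of the same recursion on plain x/y coordinates (BRouter uses `CheapRuler` meter scaling on integer lat/lon instead; the class and test data here are made up):

```java
// Minimal Douglas-Peucker sketch: marks which points survive
// simplification for a given tolerance. Plain x/y coordinates are
// assumed here; BRouter's DPFilter uses CheapRuler meter scaling.
public class DouglasPeuckerDemo {
    public static boolean[] survivors(double[][] pts, double tolerance) {
        boolean[] keep = new boolean[pts.length];
        keep[0] = keep[pts.length - 1] = true; // endpoints always survive
        mark(pts, 0, pts.length - 1, tolerance * tolerance, keep);
        return keep;
    }

    private static void mark(double[][] p, int first, int last, double sqTol, boolean[] keep) {
        if (last - first < 2) return;
        double maxSqDist = -1.;
        int index = -1;
        double dx = p[last][0] - p[first][0];
        double dy = p[last][1] - p[first][1];
        double d2 = dx * dx + dy * dy;
        for (int i = first + 1; i < last; i++) {
            // project point i onto the chord, clamped to [0,1]
            double t = 0.;
            if (d2 != 0.) {
                t = ((p[i][0] - p[first][0]) * dx + (p[i][1] - p[first][1]) * dy) / d2;
                t = Math.max(0., Math.min(1., t));
            }
            double ddx = p[i][0] - (p[first][0] + t * dx);
            double ddy = p[i][1] - (p[first][1] + t * dy);
            double sq = ddx * ddx + ddy * ddy;
            if (sq > maxSqDist) { maxSqDist = sq; index = i; }
        }
        if (index >= 0) {
            mark(p, first, index, sqTol, keep);
            if (maxSqDist >= sqTol) keep[index] = true; // farthest point survives
            mark(p, index, last, sqTol, keep);
        }
    }

    public static void main(String[] args) {
        double[][] line = { {0, 0}, {1, 1.05}, {2, 2}, {3, 0.95}, {4, 0} };
        boolean[] keep = survivors(line, 0.4);
        // the spike at (2,2) survives; the near-collinear neighbors do not
        if (!keep[2] || keep[1] || keep[3]) throw new AssertionError();
        System.out.println(java.util.Arrays.toString(keep));
    }
}
```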


@@ -0,0 +1,199 @@
/**
* DatabasePseudoTagProvider reads Pseudo Tags from a database and adds them
* to the osm-data
*/
package btools.mapcreator;
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;
import btools.util.CompactLongMap;
import btools.util.FrozenLongMap;
public class DatabasePseudoTagProvider {
private long cntOsmWays = 0L;
private long cntWayModified = 0L;
private Map<String, Long> pseudoTagsFound = new HashMap<>();
FrozenLongMap<Map<String, String>> dbData;
public static void main(String[] args) {
String jdbcurl = args[0];
String filename = args[1];
try (Connection conn = DriverManager.getConnection(jdbcurl);
BufferedWriter bw = new BufferedWriter(new OutputStreamWriter(
filename.endsWith(".gz") ? new GZIPOutputStream(new FileOutputStream(filename)) : new FileOutputStream(filename)))) {
conn.setAutoCommit(false);
System.out.println("DatabasePseudoTagProvider dumping data from " + jdbcurl + " to file " + filename);
bw.write("losmid;noise_class;river_class;forest_class;town_class;traffic_class\n");
String sql_all_tags = "SELECT * from all_tags";
try(PreparedStatement psAllTags = conn.prepareStatement(sql_all_tags)) {
psAllTags.setFetchSize(100);
// process the results
ResultSet rs = psAllTags.executeQuery();
long dbRows = 0L;
while (rs.next()) {
StringBuilder line = new StringBuilder();
line.append(rs.getLong("losmid"));
appendDBTag(line, rs, "noise_class");
appendDBTag(line, rs, "river_class");
appendDBTag(line, rs, "forest_class");
appendDBTag(line, rs, "town_class");
appendDBTag(line, rs, "traffic_class");
line.append('\n');
bw.write(line.toString());
dbRows++;
if (dbRows % 1000000L == 0L) {
System.out.println(".. from database: rows =" + dbRows);
}
}
}
} catch (SQLException g) {
System.err.format("DatabasePseudoTagProvider execute sql .. SQL State: %s\n%s\n", g.getSQLState(), g.getMessage());
System.exit(1);
} catch (Exception f) {
f.printStackTrace();
System.exit(1);
}
}
private static void appendDBTag(StringBuilder sb, ResultSet rs, String name) throws SQLException {
sb.append(';');
String v = rs.getString(name);
if (v != null) {
sb.append(v);
}
}
public DatabasePseudoTagProvider(String filename) {
try (BufferedReader br = new BufferedReader(new InputStreamReader(
filename.endsWith(".gz") ? new GZIPInputStream(new FileInputStream(filename)) : new FileInputStream(filename)))) {
System.out.println("DatabasePseudoTagProvider reading from file: " + filename);
br.readLine(); // skip header line
Map<Map<String, String>, Map<String, String>> mapUnifier = new HashMap<>();
CompactLongMap<Map<String, String>> data = new CompactLongMap<>();
long dbRows = 0L;
for (;;) {
String line = br.readLine();
if (line == null) {
break;
}
List<String> tokens = tokenize(line);
long osm_id = Long.parseLong(tokens.get(0));
Map<String, String> row = new HashMap<>(5);
addTag(row, tokens.get(1), "estimated_noise_class");
addTag(row, tokens.get(2), "estimated_river_class");
addTag(row, tokens.get(3), "estimated_forest_class");
addTag(row, tokens.get(4), "estimated_town_class");
addTag(row, tokens.get(5), "estimated_traffic_class");
// apply the instance-unifier for the row-map
Map<String, String> knownRow = mapUnifier.get(row);
if (knownRow != null) {
row = knownRow;
} else {
mapUnifier.put(row, row);
}
data.put(osm_id, row);
dbRows++;
if (dbRows % 1000000L == 0L) {
System.out.println(".. from database: rows =" + data.size() + " unique rows=" + mapUnifier.size());
}
}
System.out.println("freezing result map..");
dbData = new FrozenLongMap<>(data);
System.out.println("read from file: rows =" + dbData.size() + " unique rows=" + mapUnifier.size());
} catch (Exception f) {
f.printStackTrace();
System.exit(1);
}
}
// use our own tokenizer, as String.split, StringTokenizer
// etc. have issues with empty elements
private List<String> tokenize(String s) {
List<String> l = new ArrayList<>();
StringBuilder sb = new StringBuilder();
for (int i=0; i<s.length(); i++) {
char c = s.charAt(i);
if (c == ';') {
l.add(sb.toString());
sb.setLength(0);
} else {
sb.append(c);
}
}
l.add(sb.toString());
return l;
}
private static void addTag(Map<String, String> row, String s, String name) {
if (!s.isEmpty()) {
row.put(name, s);
}
}
public void addTags(long osm_id, Map<String, String> map) {
if (map == null || !map.containsKey("highway")) {
return;
}
cntOsmWays++;
if ((cntOsmWays % 1000000L) == 0) {
String out = "Osm Ways processed=" + cntOsmWays + " way modifs=" + cntWayModified;
for (String key : pseudoTagsFound.keySet()) {
out += " " + key + "=" + pseudoTagsFound.get(key);
}
System.out.println(out);
}
Map<String, String> dbTags = dbData.get(osm_id);
if (dbTags == null) {
return;
}
cntWayModified++;
for (String key : dbTags.keySet()) {
map.put(key, dbTags.get(key));
Long cnt = pseudoTagsFound.get(key);
if (cnt == null) {
cnt = 0L;
}
pseudoTagsFound.put(key, cnt + 1L);
}
}
}
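The comment on the hand-rolled `tokenize` hints at a real pitfall: `String.split` drops trailing empty fields by default, which would misalign the semicolon-separated columns whenever the last class values are empty. A small standalone check (the class name and sample line are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

// Demonstrates why DatabasePseudoTagProvider rolls its own tokenizer:
// String.split(";") silently drops trailing empty fields.
public class TokenizeDemo {
    static List<String> tokenize(String s) {
        List<String> l = new ArrayList<>();
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < s.length(); i++) {
            char c = s.charAt(i);
            if (c == ';') {
                l.add(sb.toString());
                sb.setLength(0);
            } else {
                sb.append(c);
            }
        }
        l.add(sb.toString()); // field after the last separator, possibly empty
        return l;
    }

    public static void main(String[] args) {
        String line = "4711;2;;;1;"; // 6 columns, last one empty
        String[] split = line.split(";");     // trailing empties dropped -> 5
        List<String> tokens = tokenize(line); // all 6 fields kept
        if (split.length != 5 || tokens.size() != 6) throw new AssertionError();
        System.out.println("split=" + split.length + " tokenize=" + tokens.size());
    }
}
```

Passing a negative limit, as in `line.split(";", -1)`, would also preserve trailing empty fields; the explicit loop just makes the column semantics obvious.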


@@ -0,0 +1,264 @@
package btools.mapcreator;
import btools.util.ReducedMedianFilter;
/**
* Container for an elevation raster and its metadata
*
* @author ab
*/
public class ElevationRaster {
public int ncols;
public int nrows;
public boolean halfcol;
public double xllcorner;
public double yllcorner;
public double cellsize;
public short[] eval_array;
public short noDataValue;
public boolean usingWeights = false;
private boolean missingData = false;
public short getElevation(int ilon, int ilat) {
double lon = ilon / 1000000. - 180.;
double lat = ilat / 1000000. - 90.;
if (usingWeights) {
return getElevationFromShiftWeights(lon, lat);
}
// no weights calculated, use 2d linear interpolation
double dcol = (lon - xllcorner) / cellsize - 0.5;
double drow = (lat - yllcorner) / cellsize - 0.5;
int row = (int) drow;
int col = (int) dcol;
if (col < 0) col = 0;
if (col >= ncols - 1) col = ncols - 2;
if (row < 0) row = 0;
if (row >= nrows - 1) row = nrows - 2;
double wrow = drow - row;
double wcol = dcol - col;
missingData = false;
// System.out.println( "wrow=" + wrow + " wcol=" + wcol + " row=" + row + " col=" + col );
double eval = (1. - wrow) * (1. - wcol) * get(row, col)
+ (wrow) * (1. - wcol) * get(row + 1, col)
+ (1. - wrow) * (wcol) * get(row, col + 1)
+ (wrow) * (wcol) * get(row + 1, col + 1);
// System.out.println( "eval=" + eval );
return missingData ? Short.MIN_VALUE : (short) (eval * 4);
}
private short get(int r, int c) {
short e = eval_array[(nrows - 1 - r) * ncols + c];
if (e == Short.MIN_VALUE) missingData = true;
return e;
}
private short getElevationFromShiftWeights(double lon, double lat) {
// calc lat-idx and -weight
double alat = lat < 0. ? -lat : lat;
alat /= 5.;
int latIdx = (int) alat;
double wlat = alat - latIdx;
double dcol = (lon - xllcorner) / cellsize;
double drow = (lat - yllcorner) / cellsize;
int row = (int) drow;
int col = (int) dcol;
double dgx = (dcol - col) * gridSteps;
double dgy = (drow - row) * gridSteps;
// System.out.println( "wrow=" + wrow + " wcol=" + wcol + " row=" + row + " col=" + col );
int gx = (int) (dgx);
int gy = (int) (dgy);
double wx = dgx - gx;
double wy = dgy - gy;
double w00 = (1. - wx) * (1. - wy);
double w01 = (1. - wx) * (wy);
double w10 = (wx) * (1. - wy);
double w11 = (wx) * (wy);
Weights[][] w0 = getWeights(latIdx);
Weights[][] w1 = getWeights(latIdx + 1);
missingData = false;
double m0 = w00 * getElevation(w0[gx][gy], row, col)
+ w01 * getElevation(w0[gx][gy + 1], row, col)
+ w10 * getElevation(w0[gx + 1][gy], row, col)
+ w11 * getElevation(w0[gx + 1][gy + 1], row, col);
double m1 = w00 * getElevation(w1[gx][gy], row, col)
+ w01 * getElevation(w1[gx][gy + 1], row, col)
+ w10 * getElevation(w1[gx + 1][gy], row, col)
+ w11 * getElevation(w1[gx + 1][gy + 1], row, col);
if (missingData) return Short.MIN_VALUE;
double m = (1. - wlat) * m0 + wlat * m1;
return (short) (m * 2);
}
private ReducedMedianFilter rmf = new ReducedMedianFilter(256);
private double getElevation(Weights w, int row, int col) {
if (missingData) {
return 0.;
}
int nx = w.nx;
int ny = w.ny;
int mx = nx / 2; // mean pixels
int my = ny / 2;
// System.out.println( "nx="+ nx + " ny=" + ny );
rmf.reset();
for (int ix = 0; ix < nx; ix++) {
for (int iy = 0; iy < ny; iy++) {
short val = get(row + iy - my, col + ix - mx);
rmf.addSample(w.getWeight(ix, iy), val);
}
}
return missingData ? 0. : rmf.calcEdgeReducedMedian(filterCenterFraction);
}
private static class Weights {
int nx;
int ny;
double[] weights;
long total = 0;
Weights(int nx, int ny) {
this.nx = nx;
this.ny = ny;
weights = new double[nx * ny];
}
void inc(int ix, int iy) {
weights[iy * nx + ix] += 1.;
total++;
}
void normalize(boolean verbose) {
for (int iy = 0; iy < ny; iy++) {
StringBuilder sb = verbose ? new StringBuilder() : null;
for (int ix = 0; ix < nx; ix++) {
weights[iy * nx + ix] /= total;
if (sb != null) {
int iweight = (int) (1000 * weights[iy * nx + ix] + 0.5);
String sval = " " + iweight;
sb.append(sval.substring(sval.length() - 4));
}
}
if (sb != null) {
System.out.println(sb);
System.out.println();
}
}
}
double getWeight(int ix, int iy) {
return weights[iy * nx + ix];
}
}
private static int gridSteps = 10;
private static Weights[][][] allShiftWeights = new Weights[17][][];
private static double filterCenterFraction = 0.2;
private static double filterDiscRadius = 4.999; // in pixels
static {
String sRadius = System.getProperty("filterDiscRadius");
if (sRadius != null && sRadius.length() > 0) {
filterDiscRadius = Integer.parseInt(sRadius);
System.out.println("using filterDiscRadius = " + filterDiscRadius);
}
String sFraction = System.getProperty("filterCenterFraction");
if (sFraction != null && sFraction.length() > 0) {
filterCenterFraction = Integer.parseInt(sFraction) / 100.;
System.out.println("using filterCenterFraction = " + filterCenterFraction);
}
}
// calculate interpolation weights from the overlap of a probe disc of given radius at given latitude
// ( latIndex = 0 -> 0 deg, latIndex = 16 -> 80 degree)
private static Weights[][] getWeights(int latIndex) {
int idx = latIndex < 16 ? latIndex : 16;
Weights[][] res = allShiftWeights[idx];
if (res == null) {
res = calcWeights(idx);
allShiftWeights[idx] = res;
}
return res;
}
private static Weights[][] calcWeights(int latIndex) {
double coslat = Math.cos(latIndex * 5. / 57.3);
// radius in pixel units
double ry = filterDiscRadius;
double rx = ry / coslat;
// gridsize is 2*radius + 1 cell
int nx = ((int) rx) * 2 + 3;
int ny = ((int) ry) * 2 + 3;
System.out.println("nx=" + nx + " ny=" + ny);
int mx = nx / 2; // mean pixels
int my = ny / 2;
// create a matrix for the relative inter-grid position
Weights[][] shiftWeights = new Weights[gridSteps + 1][];
// loop over the inter-grid positions
for (int gx = 0; gx <= gridSteps; gx++) {
shiftWeights[gx] = new Weights[gridSteps + 1];
double x0 = mx + ((double) gx) / gridSteps;
for (int gy = 0; gy <= gridSteps; gy++) {
double y0 = my + ((double) gy) / gridSteps;
// create the weight-matrix
Weights weights = new Weights(nx, ny);
shiftWeights[gx][gy] = weights;
double sampleStep = 0.001;
for (double x = -1. + sampleStep / 2.; x < 1.; x += sampleStep) {
double mx2 = 1. - x * x;
int x_idx = (int) (x0 + x * rx);
for (double y = -1. + sampleStep / 2.; y < 1.; y += sampleStep) {
if (y * y > mx2) {
continue;
}
// we are in the ellipse, see what pixel we are on
int y_idx = (int) (y0 + y * ry);
weights.inc(x_idx, y_idx);
}
}
weights.normalize(true);
}
}
return shiftWeights;
}
@Override
public String toString() {
return ncols + "," + nrows + "," + halfcol + "," + xllcorner + "," + yllcorner + "," + cellsize + "," + noDataValue + "," + usingWeights;
}
}
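The non-weighted path of `getElevation` above is plain bilinear interpolation: clamp the fractional (row, col) position into the grid and blend the four surrounding cells by their distances. A self-contained sketch of that blend (direct grid indexing here; the real class additionally flips rows via `nrows - 1 - r`, propagates a nodata flag, and scales the result by 4):

```java
// Bilinear interpolation as in ElevationRaster.getElevation's
// non-weighted path: blend the four grid cells around a fractional
// (row, col) position. Grid values below are made up.
public class BilinearDemo {
    static double bilinear(double[][] grid, double drow, double dcol) {
        int row = (int) drow, col = (int) dcol;
        // clamp so that row+1 / col+1 stay inside the grid
        row = Math.max(0, Math.min(grid.length - 2, row));
        col = Math.max(0, Math.min(grid[0].length - 2, col));
        double wrow = drow - row, wcol = dcol - col;
        return (1. - wrow) * (1. - wcol) * grid[row][col]
             + wrow * (1. - wcol) * grid[row + 1][col]
             + (1. - wrow) * wcol * grid[row][col + 1]
             + wrow * wcol * grid[row + 1][col + 1];
    }

    public static void main(String[] args) {
        double[][] grid = { {100, 200}, {300, 400} };
        // dead center of the cell -> average of all four corners
        double v = bilinear(grid, 0.5, 0.5);
        if (Math.abs(v - 250.) > 1e-9) throw new AssertionError();
        System.out.println("center value = " + v);
    }
}
```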


@@ -1,16 +1,20 @@
package btools.mapcreator;
import java.io.*;
import btools.util.*;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import btools.util.MixCoderDataInputStream;
import btools.util.MixCoderDataOutputStream;
//
// Encode/decode a raster
//
public class RasterCoder
{
public void encodeRaster(SrtmRaster raster, OutputStream os) throws IOException
{
public class ElevationRasterCoder {
public void encodeRaster(ElevationRaster raster, OutputStream os) throws IOException {
DataOutputStream dos = new DataOutputStream(os);
long t0 = System.currentTimeMillis();
@@ -29,13 +33,12 @@ public class RasterCoder
System.out.println("finished encoding in " + (t1 - t0) + " ms");
}
public SrtmRaster decodeRaster(InputStream is) throws IOException
{
public ElevationRaster decodeRaster(InputStream is) throws IOException {
DataInputStream dis = new DataInputStream(is);
long t0 = System.currentTimeMillis();
SrtmRaster raster = new SrtmRaster();
ElevationRaster raster = new ElevationRaster();
raster.ncols = dis.readInt();
raster.nrows = dis.readInt();
raster.halfcol = dis.readBoolean();
@@ -46,78 +49,65 @@ public class RasterCoder
raster.eval_array = new short[raster.ncols * raster.nrows];
_decodeRaster(raster, is);
raster.usingWeights = raster.ncols > 6001;
raster.usingWeights = false; // raster.ncols > 6001;
long t1 = System.currentTimeMillis();
System.out.println("finished decoding in " + (t1 - t0) + " ms ncols=" + raster.ncols + " nrows=" + raster.nrows );
System.out.println("finished decoding in " + (t1 - t0) + " ms ncols=" + raster.ncols + " nrows=" + raster.nrows);
return raster;
}
private void _encodeRaster(SrtmRaster raster, OutputStream os) throws IOException
{
private void _encodeRaster(ElevationRaster raster, OutputStream os) throws IOException {
MixCoderDataOutputStream mco = new MixCoderDataOutputStream(os);
int nrows = raster.nrows;
int ncols = raster.ncols;
short[] pixels = raster.eval_array;
int colstep = raster.halfcol ? 2 : 1;
for (int row = 0; row < nrows; row++)
{
for (int row = 0; row < nrows; row++) {
short lastval = Short.MIN_VALUE; // nodata
for (int col = 0; col < ncols; col += colstep )
{
for (int col = 0; col < ncols; col += colstep) {
short val = pixels[row * ncols + col];
if ( val == -32766 )
{
if (val == -32766) {
val = lastval; // replace remaining (border) skips
}
else
{
} else {
lastval = val;
}
// remap nodata
int code = val == Short.MIN_VALUE ? -1 : ( val < 0 ? val-1 : val );
mco.writeMixed( code );
int code = val == Short.MIN_VALUE ? -1 : (val < 0 ? val - 1 : val);
mco.writeMixed(code);
}
}
mco.flush();
}
private void _decodeRaster(SrtmRaster raster, InputStream is) throws IOException
{
private void _decodeRaster(ElevationRaster raster, InputStream is) throws IOException {
MixCoderDataInputStream mci = new MixCoderDataInputStream(is);
int nrows = raster.nrows;
int ncols = raster.ncols;
short[] pixels = raster.eval_array;
int colstep = raster.halfcol ? 2 : 1;
for (int row = 0; row < nrows; row++)
{
for (int col = 0; col < ncols; col += colstep )
{
for (int row = 0; row < nrows; row++) {
for (int col = 0; col < ncols; col += colstep) {
int code = mci.readMixed();
// remap nodata
int v30 = code == -1 ? Short.MIN_VALUE : ( code < 0 ? code + 1 : code );
if ( raster.usingWeights && v30 > -32766 )
{
int v30 = code == -1 ? Short.MIN_VALUE : (code < 0 ? code + 1 : code);
if (raster.usingWeights && v30 > -32766) {
v30 *= 2;
}
pixels[row * ncols + col] = (short) ( v30 );
}
pixels[row * ncols + col] = (short) (v30);
}
if ( raster.halfcol )
{
for (int col = 1; col < ncols-1; col += colstep )
{
int l = (int)pixels[row * ncols + col - 1];
int r = (int)pixels[row * ncols + col + 1];
if (raster.halfcol) {
for (int col = 1; col < ncols - 1; col += colstep) {
int l = (int) pixels[row * ncols + col - 1];
int r = (int) pixels[row * ncols + col + 1];
short v30 = Short.MIN_VALUE; // nodata
if ( l > -32766 && r > -32766 )
{
v30 = (short)((l+r)/2);
if (l > -32766 && r > -32766) {
v30 = (short) ((l + r) / 2);
}
pixels[row * ncols + col] = v30;
}


@@ -0,0 +1,545 @@
package btools.mapcreator;
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.BufferedReader;
import java.io.DataInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.util.Arrays;
import java.util.Locale;
import java.util.StringTokenizer;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
public class ElevationRasterTileConverter {
public static final boolean DEBUG = false;
public static final short NODATA2 = -32767; // hgt-formats nodata
public static final short NODATA = Short.MIN_VALUE;
private static final String HGT_FILE_EXT = ".hgt";
private static final int HGT_BORDER_OVERLAP = 1;
private static final int HGT_3ASEC_ROWS = 1201; // 3 arc second resolution (90m)
private static final int HGT_3ASEC_FILE_SIZE = HGT_3ASEC_ROWS * HGT_3ASEC_ROWS * Short.BYTES;
private static final int HGT_1ASEC_ROWS = 3601; // 1 arc second resolution (30m)
private static final int SRTM3_ROW_LENGTH = 1200; // number of elevation values per line
private static final int SRTM1_ROW_LENGTH = 3600;
private static final boolean SRTM_NO_ZERO = true;
private int NROWS;
private int NCOLS;
private int ROW_LENGTH;
private short[] imagePixels;
/**
 * This generates elevation raster files with a 5x5 degree scope.
 * The output can be for 1sec (18000x18000 points)
 * or for 3sec (6000x6000 points).
 * When using 1sec input files, an area that is not found can be filled from the 3sec pool.
 * The input can be 1x1 degree 1sec/3sec hgt files (also packed as zip)
 * or 5x5 degree 3sec asc files (delivered as zip).
* Arguments for single file generation:
* ElevationRasterTileConverter <srtm-filename | all> <hgt-data-dir> <srtm-output-dir> [arc seconds (1 or 3,default=3)] [hgt-fallback-data-dir]
* Samples
* $ ... ElevationRasterTileConverter srtm_34_-1 ./srtm/hgt3sec ./srtm/srtm3_bef
* $ ... ElevationRasterTileConverter srtm_34_-1 ./srtm/hgt1sec ./srtm/srtm1_bef 1
* $ ... ElevationRasterTileConverter srtm_34_-1 ./srtm/hgt1sec ./srtm/srtm1_bef 1 ./srtm/hgt3sec
* <p>
* Arguments for multi file generation (world wide):
* $ ... ElevationRasterTileConverter all ./srtm/hgt3sec ./srtm/srtm3_bef
* $ ... ElevationRasterTileConverter all ./srtm/hgt1sec ./srtm/srtm1_bef 1
* $ ... ElevationRasterTileConverter all ./srtm/hgt1sec ./srtm/srtm1_bef 1 ./srtm/hgt3sec
*
* @param args
* @throws Exception
*/
public static void main(String[] args) throws Exception {
if (args.length == 3 || args.length == 4 || args.length == 5) {
String filename90 = args[0];
if ("all".equals(filename90)) {
//if (DEBUG)
System.out.println("raster convert all ");
new ElevationRasterTileConverter().doConvertAll(args[1], args[2], (args.length > 3 ? args[3] : null), (args.length == 5 ? args[4] : null));
return;
}
// old filenames only
String filename30 = filename90 + ".bef"; //filename90.substring(0, filename90.length() - 3) + "bef";
int srtmLonIdx = Integer.parseInt(filename90.substring(5, 7).toLowerCase());
int srtmLatIdx = Integer.parseInt(filename90.substring(8, 10).toLowerCase());
int ilon_base = (srtmLonIdx - 1) * 5 - 180;
int ilat_base = 150 - srtmLatIdx * 5 - 90;
int row_length = SRTM3_ROW_LENGTH;
String fallbackdir = null;
if (args.length > 3) {
row_length = (Integer.parseInt(args[3]) == 1 ? SRTM1_ROW_LENGTH : SRTM3_ROW_LENGTH);
fallbackdir = (args.length == 5 ? args[4] : null);
}
//if (DEBUG)
System.out.println("raster convert " + ilon_base + " " + ilat_base + " from " + srtmLonIdx + " " + srtmLatIdx + " f: " + filename90 + " rowl " + row_length);
new ElevationRasterTileConverter().doConvert(args[1], ilon_base, ilat_base, args[2] + "/" + filename30, row_length, fallbackdir);
} else {
System.out.println("usage: java ElevationRasterTileConverter <srtm-filename> <hgt-data-dir> <srtm-output-dir> [arc seconds (1 or 3, default=3)] [hgt-fallback-data-dir]");
System.out.println("or: java ElevationRasterTileConverter all <hgt-data-dir> <srtm-output-dir> [arc seconds (1 or 3, default=3)] [hgt-fallback-data-dir]");
}
}
private void doConvertAll(String hgtdata, String outdir, String rlen, String hgtfallbackdata) throws Exception {
int row_length = SRTM3_ROW_LENGTH;
if (rlen != null) {
row_length = (Integer.parseInt(rlen) == 1 ? SRTM1_ROW_LENGTH : SRTM3_ROW_LENGTH);
}
String filename30;
for (int ilon_base = -180; ilon_base < 180; ilon_base += 5) {
for (int ilat_base = 85; ilat_base > -90; ilat_base -= 5) {
if (PosUnifier.UseRasterRd5FileName) {
filename30 = genFilenameRd5(ilon_base, ilat_base);
} else {
filename30 = genFilenameOld(ilon_base, ilat_base);
}
if (DEBUG)
System.out.println("lidar convert all: " + filename30);
doConvert(hgtdata, ilon_base, ilat_base, outdir + "/" + filename30, row_length, hgtfallbackdata);
}
}
}
static String genFilenameOld(int ilon_base, int ilat_base) {
int srtmLonIdx = ((ilon_base + 180) / 5) + 1;
int srtmLatIdx = (60 - ilat_base) / 5;
return String.format(Locale.US, "srtm_%02d_%02d.bef", srtmLonIdx, srtmLatIdx);
}
static String genFilenameRd5(int ilon_base, int ilat_base) {
return String.format("srtm_%s_%s.bef", ilon_base < 0 ? "W" + (-ilon_base) : "E" + ilon_base,
ilat_base < 0 ? "S" + (-ilat_base) : "N" + ilat_base);
}
private void readHgtZip(String filename, int rowOffset, int colOffset, int row_length, int scale) throws Exception {
ZipInputStream zis = new ZipInputStream(new BufferedInputStream(new FileInputStream(filename)));
try {
for (; ; ) {
ZipEntry ze = zis.getNextEntry();
if (ze == null) break;
if (ze.getName().toLowerCase().endsWith(HGT_FILE_EXT)) {
readHgtFromStream(zis, rowOffset, colOffset, row_length, scale);
return;
}
}
} finally {
zis.close();
}
}
private void readHgtFromStream(InputStream is, int rowOffset, int colOffset, int rowLength, int scale)
throws Exception {
DataInputStream dis = new DataInputStream(new BufferedInputStream(is));
for (int ir = 0; ir < rowLength; ir++) {
int row = rowOffset + ir * scale;
for (int ic = 0; ic < rowLength; ic++) {
int col = colOffset + ic * scale;
int i1 = dis.read(); // msb first!
int i0 = dis.read();
if (i0 == -1 || i1 == -1)
throw new RuntimeException("unexpected end of file reading hgt entry!");
short val = (short) ((i1 << 8) | i0);
if (val == NODATA2) {
val = NODATA;
}
if (scale == 3) {
setPixel(row, col, val);
setPixel(row + 1, col, val);
setPixel(row + 2, col, val);
setPixel(row, col + 1, val);
setPixel(row + 1, col + 1, val);
setPixel(row + 2, col + 1, val);
setPixel(row, col + 2, val);
setPixel(row + 1, col + 2, val);
setPixel(row + 2, col + 2, val);
} else {
setPixel(row, col, val);
}
}
}
}
private void readHgtFile(File file, int rowOffset, int colOffset, int row_length, int scale)
throws Exception {
if (DEBUG)
System.out.println("read: " + file + " " + row_length);
FileInputStream fis = new FileInputStream(file);
try {
readHgtFromStream(fis, rowOffset, colOffset, row_length, scale);
} finally {
fis.close();
}
}
/*
private void readFallbackFile(File file, int rowOffset, int colOffset, int row_length)
throws Exception {
int rowLength;
int scale;
if (file.length() > HGT_3ASEC_FILE_SIZE) {
rowLength = HGT_1ASEC_ROWS;
scale = 1;
} else {
rowLength = HGT_3ASEC_ROWS;
scale = 3;
}
if (DEBUG)
System.out.println("read fallback: " + file + " " + rowLength);
FileInputStream fis = new FileInputStream(file);
try {
readHgtFromStream(fis, rowOffset, colOffset, rowLength, scale);
} finally {
fis.close();
}
}
*/
private void readAscZip(File file, ElevationRaster raster) throws Exception {
ZipInputStream zis = new ZipInputStream(new BufferedInputStream(new FileInputStream(file)));
try {
for (; ; ) {
ZipEntry ze = zis.getNextEntry();
if (ze == null) break; // no .asc entry found in the archive
if (ze.getName().endsWith(".asc")) {
readAscFromStream(zis, raster);
return;
}
}
} finally {
zis.close();
}
}
private String secondToken(String s) {
StringTokenizer tk = new StringTokenizer(s, " ");
tk.nextToken();
return tk.nextToken();
}
private void readAscFromStream(InputStream is, ElevationRaster raster) throws Exception {
BufferedReader br = new BufferedReader(new InputStreamReader(is));
int linenr = 0;
for (; ; ) {
linenr++;
if (linenr <= 6) {
String line = br.readLine();
if (linenr == 1)
raster.ncols = Integer.parseInt(secondToken(line));
else if (linenr == 2)
raster.nrows = Integer.parseInt(secondToken(line));
else if (linenr == 3)
raster.xllcorner = Double.parseDouble(secondToken(line));
else if (linenr == 4)
raster.yllcorner = Double.parseDouble(secondToken(line));
else if (linenr == 5)
raster.cellsize = Double.parseDouble(secondToken(line));
else if (linenr == 6) {
// nodata ignored here ( < -250 assumed nodata... )
// raster.noDataValue = Short.parseShort( secondToken( line ) );
raster.eval_array = new short[raster.ncols * raster.nrows];
}
} else {
int row = 0;
int col = 0;
int n = 0;
boolean negative = false;
for (; ; ) {
int c = br.read();
if (c < 0)
break;
if (c == ' ') {
if (negative)
n = -n;
short val = n < -250 ? Short.MIN_VALUE : (short) (n);
raster.eval_array[row * raster.ncols + col] = val;
if (++col == raster.ncols) {
col = 0;
++row;
}
n = 0;
negative = false;
} else if (c >= '0' && c <= '9') {
n = 10 * n + (c - '0');
} else if (c == '-') {
negative = true;
}
}
break;
}
}
br.close();
}
private void setPixel(int row, int col, short val) {
if (row >= 0 && row < NROWS && col >= 0 && col < NCOLS) {
imagePixels[row * NCOLS + col] = val;
}
}
private short getPixel(int row, int col) {
if (row >= 0 && row < NROWS && col >= 0 && col < NCOLS) {
return imagePixels[row * NCOLS + col];
}
return NODATA;
}
public void doConvert(String inputDir, int lonDegreeStart, int latDegreeStart, String outputFile, int row_length, String hgtfallbackdata) throws Exception {
int extraBorder = 0;
//List<String> foundList = new ArrayList<>();
//List<String> notfoundList = new ArrayList<>();
boolean hgtfound = false;
boolean ascfound = false;
String filename = null;
//if (row_length == SRTM1_ROW_LENGTH)
{
// check for sources w/o border
for (int latIdx = 0; latIdx < 5; latIdx++) {
int latDegree = latDegreeStart + latIdx;
for (int lonIdx = 0; lonIdx < 5; lonIdx++) {
int lonDegree = lonDegreeStart + lonIdx;
filename = inputDir + "/" + formatLat(latDegree) + formatLon(lonDegree) + ".zip";
File f = new File(filename);
if (f.exists() && f.length() > 0) {
hgtfound = true;
break;
}
filename = filename.substring(0, filename.length() - 4) + ".hgt";
f = new File(filename);
if (f.exists() && f.length() > 0) {
hgtfound = true;
break;
}
}
}
if (!hgtfound) {
filename = inputDir + "/" + genFilenameOld(lonDegreeStart, latDegreeStart).substring(0, 10) + ".zip";
File f = new File(filename);
if (f.exists() && f.length() > 0) {
ascfound = true;
}
}
}
if (hgtfound) { // init when found
NROWS = 5 * row_length + 1 + 2 * extraBorder;
NCOLS = 5 * row_length + 1 + 2 * extraBorder;
imagePixels = new short[NROWS * NCOLS]; // 650 MB !
// prefill as NODATA
Arrays.fill(imagePixels, NODATA);
} else if (!ascfound) {
if (DEBUG)
System.out.println("none data: " + lonDegreeStart + " " + latDegreeStart);
return;
}
if (hgtfound) {
for (int latIdx = -1; latIdx <= 5; latIdx++) {
int latDegree = latDegreeStart + latIdx;
int rowOffset = extraBorder + (4 - latIdx) * row_length;
for (int lonIdx = -1; lonIdx <= 5; lonIdx++) {
int lonDegree = lonDegreeStart + lonIdx;
int colOffset = extraBorder + lonIdx * row_length;
filename = inputDir + "/" + formatLat(latDegree) + formatLon(lonDegree) + ".zip";
File f = new File(filename);
if (f.exists() && f.length() > 0) {
if (DEBUG)
System.out.println("exist: " + filename);
readHgtZip(filename, rowOffset, colOffset, row_length + 1, 1);
continue;
}
filename = filename.substring(0, filename.length() - 4) + ".hgt";
f = new File(filename);
if (f.exists() && f.length() > 0) {
if (DEBUG)
System.out.println("exist: " + filename);
readHgtFile(f, rowOffset, colOffset, row_length + 1, 1);
continue;
} else {
if (hgtfallbackdata != null) {
filename = hgtfallbackdata + "/" + formatLat(latDegree) + formatLon(lonDegree) + ".hgt";
f = new File(filename);
if (f.exists() && f.length() > 0) {
readHgtFile(f, rowOffset, colOffset, SRTM3_ROW_LENGTH + 1, 3);
continue;
}
filename = filename.substring(0, filename.length() - 4) + ".zip";
f = new File(filename);
if (f.exists() && f.length() > 0) {
readHgtZip(filename, rowOffset, colOffset, SRTM3_ROW_LENGTH + 1, 3);
} else {
if (DEBUG)
System.out.println("none : " + filename);
}
}
}
}
}
// post fill zero
if (SRTM_NO_ZERO) {
for (int row = 0; row < NROWS; row++) {
for (int col = 0; col < NCOLS; col++) {
if (imagePixels[row * NCOLS + col] == 0) imagePixels[row * NCOLS + col] = NODATA;
}
}
}
}
boolean halfCol5 = false; // no halfcol tiles in lidar data (?)
ElevationRaster raster = new ElevationRaster();
if (hgtfound) {
raster.nrows = NROWS;
raster.ncols = NCOLS;
raster.halfcol = halfCol5;
raster.noDataValue = NODATA;
raster.cellsize = 1. / row_length;
raster.xllcorner = lonDegreeStart - (0.5 + extraBorder) * raster.cellsize;
raster.yllcorner = latDegreeStart - (0.5 + extraBorder) * raster.cellsize;
raster.eval_array = imagePixels;
}
if (ascfound) {
File f = new File(filename);
readAscZip(f, raster);
}
// encode the raster
OutputStream os = new BufferedOutputStream(new FileOutputStream(outputFile));
new ElevationRasterCoder().encodeRaster(raster, os);
os.close();
// decode the raster
InputStream is = new BufferedInputStream(new FileInputStream(outputFile));
ElevationRaster raster2 = new ElevationRasterCoder().decodeRaster(is);
is.close();
short[] pix2 = raster2.eval_array;
if (pix2.length != raster.eval_array.length)
throw new RuntimeException("length mismatch!");
// compare decoding result
for (int row = 0; row < raster.nrows; row++) {
int colstep = halfCol5 ? 2 : 1;
for (int col = 0; col < raster.ncols; col += colstep) {
int idx = row * raster.ncols + col;
short p2 = pix2[idx];
if (p2 != raster.eval_array[idx]) {
throw new RuntimeException("content mismatch: p2=" + p2 + " p1=" + raster.eval_array[idx]);
}
}
}
imagePixels = null;
}
private static String formatLon(int lon) {
if (lon >= 180)
lon -= 180; // TODO: W180 or E180 ?
String s = "E";
if (lon < 0) {
lon = -lon;
s = "W";
}
String n = "000" + lon;
return s + n.substring(n.length() - 3);
}
private static String formatLat(int lat) {
String s = "N";
if (lat < 0) {
lat = -lat;
s = "S";
}
String n = "00" + lat;
return s + n.substring(n.length() - 2);
}
public ElevationRaster getRaster(File f, double lon, double lat) throws Exception {
long fileSize;
InputStream inputStream;
if (f.getName().toLowerCase().endsWith(".zip")) {
ZipInputStream zis = new ZipInputStream(new BufferedInputStream(new FileInputStream(f)));
for (; ; ) {
ZipEntry ze = zis.getNextEntry();
if (ze == null) {
throw new FileNotFoundException(f.getName() + " doesn't contain a " + HGT_FILE_EXT + " file.");
}
if (ze.getName().toLowerCase().endsWith(HGT_FILE_EXT)) {
fileSize = ze.getSize();
inputStream = zis;
break;
}
}
} else {
fileSize = f.length();
inputStream = new FileInputStream(f);
}
int rowLength;
if (fileSize > HGT_3ASEC_FILE_SIZE) {
rowLength = HGT_1ASEC_ROWS;
} else {
rowLength = HGT_3ASEC_ROWS;
}
// stay at 1 deg * 1 deg raster
NROWS = rowLength;
NCOLS = rowLength;
imagePixels = new short[NROWS * NCOLS];
// prefill as NODATA
Arrays.fill(imagePixels, NODATA);
readHgtFromStream(inputStream, 0, 0, rowLength, 1);
inputStream.close();
ElevationRaster raster = new ElevationRaster();
raster.nrows = NROWS;
raster.ncols = NCOLS;
raster.halfcol = false; // assume full resolution
raster.noDataValue = NODATA;
raster.cellsize = 1. / (double) (rowLength - HGT_BORDER_OVERLAP);
raster.xllcorner = (int) (lon < 0 ? lon - 1 : lon); //onDegreeStart - raster.cellsize;
raster.yllcorner = (int) (lat < 0 ? lat - 1 : lat); //latDegreeStart - raster.cellsize;
raster.eval_array = imagePixels;
return raster;
}
}
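The two output-tile naming schemes used by ElevationRasterTileConverter (genFilenameOld and genFilenameRd5) can be exercised in isolation. This is a minimal sketch that copies their arithmetic; the class and method names here are illustrative and not part of the repository:

```java
import java.util.Locale;

// Standalone sketch of the two .bef tile-naming schemes
// (logic mirrors genFilenameOld / genFilenameRd5 above).
public class TileNames {
  // CGIAR-style index: lon counted in 5-degree steps from -180, lat from +60
  static String oldStyle(int ilonBase, int ilatBase) {
    int srtmLonIdx = ((ilonBase + 180) / 5) + 1;
    int srtmLatIdx = (60 - ilatBase) / 5;
    return String.format(Locale.US, "srtm_%02d_%02d.bef", srtmLonIdx, srtmLatIdx);
  }

  // rd5-style index: hemisphere letter plus the lower-left degree value
  static String rd5Style(int ilonBase, int ilatBase) {
    return String.format("srtm_%s_%s.bef",
        ilonBase < 0 ? "W" + (-ilonBase) : "E" + ilonBase,
        ilatBase < 0 ? "S" + (-ilatBase) : "N" + ilatBase);
  }

  public static void main(String[] args) {
    // lower-left corner 10E / 45N
    System.out.println(oldStyle(10, 45)); // srtm_39_03.bef
    System.out.println(rd5Style(10, 45)); // srtm_E10_N45.bef
  }
}
```

Both schemes address the same 5x5 degree tile; which one is emitted depends on the PosUnifier.UseRasterRd5FileName flag seen in doConvertAll.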


@@ -0,0 +1,342 @@
// License: GPL. For details, see LICENSE file.
package btools.mapcreator;
import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.InputStream;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.ShortBuffer;
import java.nio.channels.FileChannel;
import java.util.HashMap;
import java.util.Map;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
/**
* adapted from https://github.com/JOSM/josm-plugins/blob/master/ElevationProfile/src/org/openstreetmap/josm/plugins/elevation/HgtReader.java
* <p>
* Class HgtReader reads data from SRTM HGT files. Currently this class is restricted to a resolution of 3 arc seconds.
* <p>
* SRTM data files are available at the <a href="http://dds.cr.usgs.gov/srtm/version2_1/SRTM3">NASA SRTM site</a>
*
* @author Oliver Wieland &lt;oliver.wieland@online.de&gt;
*/
public class HgtReader {
final static boolean DEBUG = false;
private static final int SECONDS_PER_MINUTE = 60;
public static final String HGT_EXT = ".hgt";
public static final String ZIP_EXT = ".zip";
// alter these values for different SRTM resolutions
public static final int HGT3_RES = 3; // resolution in arc seconds
public static final int HGT3_ROW_LENGTH = 1201; // number of elevation values per line
public static final int HGT_VOID = -32768; // magic number which indicates 'void data' in HGT file
public static final int HGT1_RES = 1; // <<- The new SRTM is 1-ARCSEC
public static final int HGT1_ROW_LENGTH = 3601; //-- New file resolution is 3601x3601
/**
* The 'no elevation' data magic.
*/
public static double NO_ELEVATION = Double.NaN;
private static String srtmFolder = "";
private static final Map<String, ShortBuffer> cache = new HashMap<>();
public HgtReader(String folder) {
srtmFolder = folder;
}
public static double getElevationFromHgt(double lat, double lon) {
try {
String file = getHgtFileName(lat, lon);
if (DEBUG) System.out.println("HGT buffer " + file + " for " + lat + " " + lon);
// given area in cache?
if (!cache.containsKey(file)) {
// fill initial cache value. If no file is found, then
// we use it as a marker to indicate 'file has been searched
// but is not there'
cache.put(file, null);
// Try all resource directories
//for (String location : Main.pref.getAllPossiblePreferenceDirs())
{
String fullPath = new File(srtmFolder, file + HGT_EXT).getPath();
File f = new File(fullPath);
if (f.exists()) {
// found something: read HGT file...
ShortBuffer data = readHgtFile(fullPath);
// ... and store result in cache
cache.put(file, data);
//break;
} else {
fullPath = new File(srtmFolder, file + ZIP_EXT).getPath();
f = new File(fullPath);
if (f.exists()) {
ZipInputStream zis = new ZipInputStream(new BufferedInputStream(new FileInputStream(f)));
try {
for (; ; ) {
ZipEntry ze = zis.getNextEntry();
if (ze == null) break;
if (ze.getName().toLowerCase().endsWith(HGT_EXT)) {
// System.out.println("read zip " + ze.getName());
ShortBuffer data = readHgtStream(zis);
// ... and store result in cache
cache.put(file, data);
break;
}
zis.closeEntry();
}
} finally {
zis.close();
}
}
}
System.out.println("*** reading: " + f.getName() + " " + cache.get(file));
}
}
// read elevation value
return readElevation(lat, lon);
} catch (FileNotFoundException e) {
System.err.println("HGT Get elevation " + lat + ", " + lon + " failed: => " + e.getMessage());
// no problem... file not there
return NO_ELEVATION;
} catch (Exception ioe) {
// oops...
ioe.printStackTrace(System.err);
// fallback
return NO_ELEVATION;
}
}
public static short[] getElevationDataFromHgt(double lat, double lon) {
try {
if (lon < 0) lon += 1;
if (lat < 0) lat += 1;
String file = getHgtFileName(lat, lon);
if (DEBUG) System.out.println("HGT buffer " + file + " for " + lat + " " + lon);
ShortBuffer data = null;
// Try all resource directories
//for (String location : Main.pref.getAllPossiblePreferenceDirs())
String fullPath = new File(srtmFolder, file + HGT_EXT).getPath();
File f = new File(fullPath);
if (f.exists()) {
// found something: read HGT file...
data = readHgtFile(fullPath);
} else {
fullPath = new File(srtmFolder, file + ZIP_EXT).getPath();
f = new File(fullPath);
if (f.exists()) {
ZipInputStream zis = new ZipInputStream(new BufferedInputStream(new FileInputStream(f)));
try {
for (; ; ) {
ZipEntry ze = zis.getNextEntry();
if (ze == null) break;
if (ze.getName().toLowerCase().endsWith(HGT_EXT)) {
// System.out.println("read zip " + ze.getName());
data = readHgtStream(zis);
break;
}
zis.closeEntry();
}
} finally {
zis.close();
}
}
}
System.out.println("*** reading: " + f.getName() + " " + (data != null ? data.limit() : -1));
if (data != null) {
short[] array = new short[data.limit()];
data.get(array);
return array;
}
return null;
} catch (FileNotFoundException e) {
System.err.println("HGT Get elevation " + lat + ", " + lon + " failed: => " + e.getMessage());
// no problem... file not there
return null;
} catch (Exception ioe) {
// oops...
ioe.printStackTrace(System.err);
// fallback
return null;
}
}
@SuppressWarnings("resource")
private static ShortBuffer readHgtFile(String file) throws Exception {
if (file == null) throw new Exception("no hgt file " + file);
FileChannel fc = null;
ShortBuffer sb = null;
try {
// Eclipse complains here about resource leak on 'fc' - even with 'finally' clause???
fc = new FileInputStream(file).getChannel();
// choose the right endianness
ByteBuffer bb = ByteBuffer.allocateDirect((int) fc.size());
while (bb.remaining() > 0) fc.read(bb);
bb.flip();
//sb = bb.order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
sb = bb.order(ByteOrder.BIG_ENDIAN).asShortBuffer();
} finally {
if (fc != null) fc.close();
}
return sb;
}
// @SuppressWarnings("resource")
private static ShortBuffer readHgtStream(InputStream zis) throws Exception {
if (zis == null) throw new Exception("no hgt stream ");
ShortBuffer sb = null;
try {
// choose the right endianness
byte[] bytes = zis.readAllBytes();
ByteBuffer bb = ByteBuffer.allocate(bytes.length);
bb.put(bytes, 0, bytes.length);
//while (bb.remaining() > 0) zis.read(bb, 0, size);
//ByteBuffer bb = ByteBuffer.allocate(zis.available());
//Channels.newChannel(zis).read(bb);
bb.flip();
//sb = bb.order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
sb = bb.order(ByteOrder.BIG_ENDIAN).asShortBuffer();
} finally {
}
return sb;
}
/**
* Reads the elevation value for the given coordinate.
* <p>
* See also <a href="http://gis.stackexchange.com/questions/43743/how-to-extract-elevation-from-hgt-file">stackexchange.com</a>
*
 * @param lat the latitude of the coordinate to get the elevation data for
 * @param lon the longitude of the coordinate
* @return the elevation value or <code>Double.NaN</code>, if no value is present
*/
public static double readElevation(double lat, double lon) {
String tag = getHgtFileName(lat, lon);
ShortBuffer sb = cache.get(tag);
if (sb == null) {
return NO_ELEVATION;
}
if (DEBUG) System.out.println("HGT buffer size " + sb.capacity() + " limit " + sb.limit());
try {
int rowLength = HGT3_ROW_LENGTH;
int resolution = HGT3_RES;
if (sb.capacity() > (HGT3_ROW_LENGTH * HGT3_ROW_LENGTH)) {
rowLength = HGT1_ROW_LENGTH;
resolution = HGT1_RES;
}
// see http://gis.stackexchange.com/questions/43743/how-to-extract-elevation-from-hgt-file
double fLat = frac(lat) * SECONDS_PER_MINUTE;
double fLon = frac(lon) * SECONDS_PER_MINUTE;
// compute offset within HGT file
int row = (int) Math.round((fLat) * SECONDS_PER_MINUTE / resolution);
int col = (int) Math.round((fLon) * SECONDS_PER_MINUTE / resolution);
if (lon < 0) col = rowLength - col - 1;
if (lat > 0) row = rowLength - row - 1;
//row = rowLength - row;
int cell = (rowLength * (row)) + col;
//int cell = ((rowLength * (latitude)) + longitude);
if (DEBUG)
System.out.println("Read HGT elevation data from row/col/cell " + row + "," + col + ", " + cell + ", " + sb.limit());
// valid position in buffer?
if (cell < sb.limit()) {
short ele = sb.get(cell);
// check for data voids
if (ele == HGT_VOID) {
return NO_ELEVATION;
} else {
return ele;
}
} else {
return NO_ELEVATION;
}
} catch (Exception e) {
System.err.println("error at " + lon + " " + lat + " ");
e.printStackTrace();
}
return NO_ELEVATION;
}
/**
* Gets the associated HGT file name for the given way point. Usually the
* format is <tt>[N|S]nn[W|E]mmm.hgt</tt> where <i>nn</i> is the integral latitude
* without decimals and <i>mmm</i> is the longitude.
*
 * @param llat the latitude of the coordinate to get the filename for
 * @param llon the longitude of the coordinate
* @return the file name of the HGT file
*/
public static String getHgtFileName(double llat, double llon) {
int lat = (int) llat;
int lon = (int) llon;
String latPref = "N";
if (lat < 0) {
latPref = "S";
lat = -lat + 1;
}
String lonPref = "E";
if (lon < 0) {
lonPref = "W";
lon = -lon + 1;
}
return String.format("%s%02d%s%03d", latPref, lat, lonPref, lon);
}
public static double frac(double d) {
long iPart;
double fPart;
// Get user input
iPart = (long) d;
fPart = d - iPart;
return Math.abs(fPart);
}
public static void clear() {
if (cache != null) {
cache.clear();
}
}
public static void main(String[] args) throws Exception {
System.out.println("*** HGT position values and enhance elevation");
if (args.length == 3) {
HgtReader elevReader = new HgtReader(args[0]);
double lon = Double.parseDouble(args[1]);
double lat = Double.parseDouble(args[2]);
// check hgt direct
double elev = elevReader.getElevationFromHgt(lat, lon);
System.out.println("-----> elv for hgt " + lat + ", " + lon + " = " + elev);
}
}
}
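HgtReader relies on the SRTM .hgt layout: 16-bit signed samples, most significant byte first, in a square grid of 1201 (3 arc second) or 3601 (1 arc second) rows. A minimal sketch of that decoding and the row/column-to-cell mapping, with illustrative names not taken from the repository:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Sketch of the .hgt sample decoding assumed by HgtReader:
// big-endian 16-bit values, addressed as row * rowLength + col.
public class HgtSample {
  static final short VOID = -32768; // matches HGT_VOID above

  // decode one elevation sample from its two raw bytes (MSB first)
  static short decode(int msb, int lsb) {
    return (short) ((msb << 8) | (lsb & 0xff));
  }

  // buffer index of a (row, col) position in a tile of the given width
  static int cell(int row, int col, int rowLength) {
    return row * rowLength + col;
  }

  public static void main(String[] args) {
    // two raw bytes 0x01 0x2C encode 300 m
    ByteBuffer bb = ByteBuffer.wrap(new byte[]{0x01, 0x2C});
    short viaBuffer = bb.order(ByteOrder.BIG_ENDIAN).asShortBuffer().get(0);
    System.out.println(viaBuffer + " " + decode(0x01, 0x2C));
  }
}
```

This is the same big-endian interpretation that readHgtFile applies via ByteOrder.BIG_ENDIAN, and that ElevationRasterTileConverter applies byte-by-byte in readHgtFromStream.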


@@ -1,171 +1,159 @@
/**
* common base class for the map-filters
*
* @author ab
*/
package btools.mapcreator;
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.HashMap;
import btools.util.DiffCoderDataOutputStream;
public abstract class MapCreatorBase implements WayListener, NodeListener, RelationListener
{
private DiffCoderDataOutputStream[] tileOutStreams;
protected File outTileDir;
protected HashMap<String,String> tags;
public void putTag( String key, String value )
{
if ( tags == null ) tags = new HashMap<String,String>();
tags.put( key, value );
}
public String getTag( String key )
{
return tags == null ? null : tags.get( key );
}
public HashMap<String,String> getTagsOrNull()
{
return tags;
}
public void setTags( HashMap<String,String> tags )
{
this.tags = tags;
}
protected static long readId( DataInputStream is) throws IOException
{
int offset = is.readByte();
if ( offset == 32 ) return -1;
long i = is.readInt();
i = i << 5;
return i | offset;
}
protected static void writeId( DataOutputStream o, long id ) throws IOException
{
if ( id == -1 )
{
o.writeByte( 32 );
return;
}
int offset = (int)( id & 0x1f );
int i = (int)( id >> 5 );
o.writeByte( offset );
o.writeInt( i );
}
protected static File[] sortBySizeAsc( File[] files )
{
int n = files.length;
long[] sizes = new long[n];
File[] sorted = new File[n];
for( int i=0; i<n; i++ ) sizes[i] = files[i].length();
for(int nf=0; nf<n; nf++)
{
int idx = -1;
long min = -1;
for( int i=0; i<n; i++ )
{
if ( sizes[i] != -1 && ( idx == -1 || sizes[i] < min ) )
{
min = sizes[i];
idx = i;
}
}
sizes[idx] = -1;
sorted[nf] = files[idx];
}
return sorted;
}
protected File fileFromTemplate( File template, File dir, String suffix )
{
String filename = template.getName();
filename = filename.substring( 0, filename.length() - 3 ) + suffix;
return new File( dir, filename );
}
protected DataInputStream createInStream( File inFile ) throws IOException
{
return new DataInputStream( new BufferedInputStream ( new FileInputStream( inFile ) ) );
}
protected DiffCoderDataOutputStream createOutStream( File outFile ) throws IOException
{
return new DiffCoderDataOutputStream( new BufferedOutputStream( new FileOutputStream( outFile ) ) );
}
protected DiffCoderDataOutputStream getOutStreamForTile( int tileIndex ) throws Exception
{
if ( tileOutStreams == null )
{
tileOutStreams = new DiffCoderDataOutputStream[64];
}
if ( tileOutStreams[tileIndex] == null )
{
tileOutStreams[tileIndex] = createOutStream( new File( outTileDir, getNameForTile( tileIndex ) ) );
}
return tileOutStreams[tileIndex];
}
protected String getNameForTile( int tileIndex )
{
throw new IllegalArgumentException( "getNameForTile not implemented" );
}
protected void closeTileOutStreams() throws Exception
{
if ( tileOutStreams == null )
{
return;
}
for( int tileIndex=0; tileIndex<tileOutStreams.length; tileIndex++ )
{
if ( tileOutStreams[tileIndex] != null ) tileOutStreams[tileIndex].close();
tileOutStreams[tileIndex] = null;
}
}
// interface dummies
@Override
public void nodeFileStart( File nodefile ) throws Exception {}
@Override
public void nextNode( NodeData n ) throws Exception {}
@Override
public void nodeFileEnd( File nodefile ) throws Exception {}
@Override
public boolean wayFileStart( File wayfile ) throws Exception { return true; }
@Override
public void nextWay( WayData data ) throws Exception {}
@Override
public void wayFileEnd( File wayfile ) throws Exception {}
@Override
public void nextRelation( RelationData data ) throws Exception {}
@Override
public void nextRestriction( RelationData data, long fromWid, long toWid, long viaNid ) throws Exception {}
}
/**
* common base class for the map-filters
*
* @author ab
*/
package btools.mapcreator;
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import btools.util.DiffCoderDataOutputStream;
public abstract class MapCreatorBase implements WayListener, NodeListener, RelationListener {
private DiffCoderDataOutputStream[] tileOutStreams;
protected File outTileDir;
protected Map<String, String> tags;
public void putTag(String key, String value) {
if (tags == null) tags = new HashMap<>();
tags.put(key, value);
}
public String getTag(String key) {
return tags == null ? null : tags.get(key);
}
public Map<String, String> getTagsOrNull() {
return tags;
}
public void setTags(Map<String, String> tags) {
this.tags = tags;
}
protected static long readId(DataInputStream is) throws IOException {
int offset = is.readByte();
if (offset == 32) return -1;
long i = is.readInt();
i = i << 5;
return i | offset;
}
protected static void writeId(DataOutputStream o, long id) throws IOException {
if (id == -1) {
o.writeByte(32);
return;
}
int offset = (int) (id & 0x1f);
int i = (int) (id >> 5);
o.writeByte(offset);
o.writeInt(i);
}
protected static File[] sortBySizeAsc(File[] files) {
int n = files.length;
long[] sizes = new long[n];
File[] sorted = new File[n];
for (int i = 0; i < n; i++) sizes[i] = files[i].length();
for (int nf = 0; nf < n; nf++) {
int idx = -1;
long min = -1;
for (int i = 0; i < n; i++) {
if (sizes[i] != -1 && (idx == -1 || sizes[i] < min)) {
min = sizes[i];
idx = i;
}
}
sizes[idx] = -1;
sorted[nf] = files[idx];
}
return sorted;
}
protected File fileFromTemplate(File template, File dir, String suffix) {
String filename = template.getName();
filename = filename.substring(0, filename.length() - 3) + suffix;
return new File(dir, filename);
}
protected DataInputStream createInStream(File inFile) throws IOException {
return new DataInputStream(new BufferedInputStream(new FileInputStream(inFile)));
}
protected DiffCoderDataOutputStream createOutStream(File outFile) throws IOException {
return new DiffCoderDataOutputStream(new BufferedOutputStream(new FileOutputStream(outFile)));
}
protected DiffCoderDataOutputStream getOutStreamForTile(int tileIndex) throws Exception {
if (tileOutStreams == null) {
tileOutStreams = new DiffCoderDataOutputStream[64];
}
if (tileOutStreams[tileIndex] == null) {
tileOutStreams[tileIndex] = createOutStream(new File(outTileDir, getNameForTile(tileIndex)));
}
return tileOutStreams[tileIndex];
}
protected String getNameForTile(int tileIndex) {
throw new IllegalArgumentException("getNameForTile not implemented");
}
protected void closeTileOutStreams() throws Exception {
if (tileOutStreams == null) {
return;
}
for (int tileIndex = 0; tileIndex < tileOutStreams.length; tileIndex++) {
if (tileOutStreams[tileIndex] != null) tileOutStreams[tileIndex].close();
tileOutStreams[tileIndex] = null;
}
}
// interface dummies
@Override
public void nodeFileStart(File nodefile) throws Exception {
}
@Override
public void nextNode(NodeData n) throws Exception {
}
@Override
public void nodeFileEnd(File nodefile) throws Exception {
}
@Override
public boolean wayFileStart(File wayfile) throws Exception {
return true;
}
@Override
public void nextWay(WayData data) throws Exception {
}
@Override
public void wayFileEnd(File wayfile) throws Exception {
}
@Override
public void nextRelation(RelationData data) throws Exception {
}
@Override
public void nextRestriction(RelationData data, long fromWid, long toWid, long viaNid) throws Exception {
}
}
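The readId/writeId pair in MapCreatorBase implements a small split encoding: the low 5 bits of the id go into one byte, the remaining bits into an int, and id == -1 is the single marker byte 32. A self-contained round-trip sketch (class name IdCodec is illustrative):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

// Round-trip sketch of MapCreatorBase's id encoding.
public class IdCodec {
  static void writeId(DataOutputStream o, long id) throws IOException {
    if (id == -1) { o.writeByte(32); return; } // 32 cannot collide with a 5-bit offset
    o.writeByte((int) (id & 0x1f)); // low 5 bits
    o.writeInt((int) (id >> 5));    // remaining bits
  }

  static long readId(DataInputStream is) throws IOException {
    int offset = is.readByte();
    if (offset == 32) return -1;
    return (((long) is.readInt()) << 5) | offset;
  }

  public static void main(String[] args) throws IOException {
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    DataOutputStream o = new DataOutputStream(bos);
    writeId(o, 123456789L);
    writeId(o, -1L);
    DataInputStream in = new DataInputStream(new ByteArrayInputStream(bos.toByteArray()));
    System.out.println(readId(in) + " " + readId(in));
  }
}
```

The marker works because a genuine low-5-bit offset is always in 0..31, so the byte value 32 is unambiguous.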


@@ -1,84 +1,75 @@
package btools.mapcreator;
import java.io.File;
/**
* NodeCutter does 1 step in map-processing:
*
* - cuts the 45*30 node tiles into 5*5 pieces
*
* @author ab
*/
public class NodeCutter extends MapCreatorBase
{
private int lonoffset;
private int latoffset;
public static void main(String[] args) throws Exception
{
System.out.println("*** NodeCutter: Cut big node-tiles into 5x5 tiles");
if (args.length != 2)
{
System.out.println("usage: java NodeCutter <node-tiles-in> <node-tiles-out>" );
return;
}
new NodeCutter().process( new File( args[0] ), new File( args[1] ) );
}
public void init( File nodeTilesOut )
{
this.outTileDir = nodeTilesOut;
}
public void process( File nodeTilesIn, File nodeTilesOut ) throws Exception
{
init( nodeTilesOut );
new NodeIterator( this, true ).processDir( nodeTilesIn, ".tlf" );
}
@Override
public void nodeFileStart( File nodefile ) throws Exception
{
lonoffset = -1;
latoffset = -1;
}
@Override
public void nextNode( NodeData n ) throws Exception
{
n.writeTo( getOutStreamForTile( getTileIndex( n.ilon, n.ilat ) ) );
}
@Override
public void nodeFileEnd( File nodeFile ) throws Exception
{
closeTileOutStreams();
}
private int getTileIndex( int ilon, int ilat )
{
int lonoff = (ilon / 45000000 ) * 45;
int latoff = (ilat / 30000000 ) * 30;
if ( lonoffset == -1 ) lonoffset = lonoff;
if ( latoffset == -1 ) latoffset = latoff;
if ( lonoff != lonoffset || latoff != latoffset )
throw new IllegalArgumentException( "inconsistent node: " + ilon + " " + ilat );
int lon = (ilon / 5000000) % 9;
int lat = (ilat / 5000000) % 6;
if ( lon < 0 || lon > 8 || lat < 0 || lat > 5 ) throw new IllegalArgumentException( "illegal pos: " + ilon + "," + ilat );
return lon*6 + lat;
}
protected String getNameForTile( int tileIndex )
{
int lon = (tileIndex / 6 ) * 5 + lonoffset - 180;
int lat = (tileIndex % 6 ) * 5 + latoffset - 90;
String slon = lon < 0 ? "W" + (-lon) : "E" + lon;
String slat = lat < 0 ? "S" + (-lat) : "N" + lat;
return slon + "_" + slat + ".n5d";
}
}
package btools.mapcreator;
import java.io.File;
/**
* NodeCutter does 1 step in map-processing:
* <p>
* - cuts the 45*30 node tiles into 5*5 pieces
*
* @author ab
*/
public class NodeCutter extends MapCreatorBase {
private int lonoffset;
private int latoffset;
public static void main(String[] args) throws Exception {
System.out.println("*** NodeCutter: Cut big node-tiles into 5x5 tiles");
if (args.length != 2) {
System.out.println("usage: java NodeCutter <node-tiles-in> <node-tiles-out>");
return;
}
new NodeCutter().process(new File(args[0]), new File(args[1]));
}
public void init(File nodeTilesOut) {
this.outTileDir = nodeTilesOut;
}
public void process(File nodeTilesIn, File nodeTilesOut) throws Exception {
init(nodeTilesOut);
new NodeIterator(this, true).processDir(nodeTilesIn, ".tlf");
}
@Override
public void nodeFileStart(File nodefile) throws Exception {
lonoffset = -1;
latoffset = -1;
}
@Override
public void nextNode(NodeData n) throws Exception {
n.writeTo(getOutStreamForTile(getTileIndex(n.ilon, n.ilat)));
}
@Override
public void nodeFileEnd(File nodeFile) throws Exception {
closeTileOutStreams();
}
private int getTileIndex(int ilon, int ilat) {
int lonoff = (ilon / 45000000) * 45;
int latoff = (ilat / 30000000) * 30;
if (lonoffset == -1) lonoffset = lonoff;
if (latoffset == -1) latoffset = latoff;
if (lonoff != lonoffset || latoff != latoffset)
throw new IllegalArgumentException("inconsistent node: " + ilon + " " + ilat);
int lon = (ilon / 5000000) % 9;
int lat = (ilat / 5000000) % 6;
if (lon < 0 || lon > 8 || lat < 0 || lat > 5)
throw new IllegalArgumentException("illegal pos: " + ilon + "," + ilat);
return lon * 6 + lat;
}
protected String getNameForTile(int tileIndex) {
int lon = (tileIndex / 6) * 5 + lonoffset - 180;
int lat = (tileIndex % 6) * 5 + latoffset - 90;
String slon = lon < 0 ? "W" + (-lon) : "E" + lon;
String slat = lat < 0 ? "S" + (-lat) : "N" + lat;
return slon + "_" + slat + ".n5d";
}
}
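The tile arithmetic in `getTileIndex` and `getNameForTile` can be re-derived in a self-contained demo. The class below is hypothetical (not from the repo) but reproduces the same formulas: internal coordinates are micro-degrees shifted by 180/90 degrees so they are non-negative, the macro-tile offset is a multiple of 45/30 degrees, and the 5x5-degree sub-tile index packs 9 columns and 6 rows as `lon * 6 + lat`.

```java
// Hypothetical standalone re-derivation of NodeCutter's tile math.
public class TileMathDemo {
  // base offset of the enclosing 45x30-degree macro-tile, in (shifted) degrees
  static int lonOffset(int ilon) { return (ilon / 45_000_000) * 45; }
  static int latOffset(int ilat) { return (ilat / 30_000_000) * 30; }

  // index of the 5x5-degree sub-tile inside the macro-tile: 9 columns x 6 rows
  static int tileIndex(int ilon, int ilat) {
    int lon = (ilon / 5_000_000) % 9;
    int lat = (ilat / 5_000_000) % 6;
    return lon * 6 + lat;
  }

  static String nameForTile(int tileIndex, int lonoffset, int latoffset) {
    int lon = (tileIndex / 6) * 5 + lonoffset - 180; // undo the +180 shift
    int lat = (tileIndex % 6) * 5 + latoffset - 90;  // undo the +90 shift
    String slon = lon < 0 ? "W" + (-lon) : "E" + lon;
    String slat = lat < 0 ? "S" + (-lat) : "N" + lat;
    return slon + "_" + slat + ".n5d";
  }

  public static void main(String[] args) {
    // node at lon=8.5E, lat=47.5N, encoded as shifted micro-degrees
    int ilon = (int) ((8.5 + 180.0) * 1_000_000 + 0.5);  // 188_500_000
    int ilat = (int) ((47.5 + 90.0) * 1_000_000 + 0.5);  // 137_500_000
    int idx = tileIndex(ilon, ilat);
    System.out.println(idx + " " + nameForTile(idx, lonOffset(ilon), latOffset(ilat)));
    // prints: 9 E5_N45.n5d
  }
}
```

So a node at 8.5E/47.5N lands in sub-tile 9 of its macro-tile and is written to `E5_N45.n5d`, the 5x5-degree tile covering 5-10E, 45-50N.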


@ -1,46 +1,49 @@
package btools.mapcreator;
import btools.util.DiffCoderDataInputStream;
import btools.util.DiffCoderDataOutputStream;
/**
* Container for node data on the preprocessor level
*
* @author ab
*/
public class NodeData extends MapCreatorBase
{
public long nid;
public int ilon;
public int ilat;
public byte[] description;
public short selev = Short.MIN_VALUE;
public NodeData( long id, double lon, double lat )
{
nid = id;
ilat = (int)( ( lat + 90. )*1000000. + 0.5);
ilon = (int)( ( lon + 180. )*1000000. + 0.5);
}
public NodeData( DiffCoderDataInputStream dis ) throws Exception
{
nid = dis.readDiffed( 0 );
ilon = (int)dis.readDiffed( 1 );
ilat = (int)dis.readDiffed( 2 );
int mode = dis.readByte();
if ( ( mode & 1 ) != 0 ) { int dlen = dis.readShort(); description = new byte[dlen]; dis.readFully( description ); }
if ( ( mode & 2 ) != 0 ) selev = dis.readShort();
}
public void writeTo( DiffCoderDataOutputStream dos ) throws Exception
{
dos.writeDiffed( nid, 0 );
dos.writeDiffed( ilon, 1 );
dos.writeDiffed( ilat, 2 );
int mode = (description == null ? 0 : 1 ) | ( selev == Short.MIN_VALUE ? 0 : 2 );
dos.writeByte( (byte) mode);
if ( ( mode & 1 ) != 0 ) { dos.writeShort( description.length ); dos.write( description ); }
if ( ( mode & 2 ) != 0 ) dos.writeShort( selev );
}
}
package btools.mapcreator;
import btools.util.DiffCoderDataInputStream;
import btools.util.DiffCoderDataOutputStream;
/**
* Container for node data on the preprocessor level
*
* @author ab
*/
public class NodeData extends MapCreatorBase {
public long nid;
public int ilon;
public int ilat;
public byte[] description;
public short selev = Short.MIN_VALUE;
public NodeData(long id, double lon, double lat) {
nid = id;
ilat = (int) ((lat + 90.) * 1000000. + 0.5);
ilon = (int) ((lon + 180.) * 1000000. + 0.5);
}
public NodeData(DiffCoderDataInputStream dis) throws Exception {
nid = dis.readDiffed(0);
ilon = (int) dis.readDiffed(1);
ilat = (int) dis.readDiffed(2);
int mode = dis.readByte();
if ((mode & 1) != 0) {
int dlen = dis.readShort();
description = new byte[dlen];
dis.readFully(description);
}
if ((mode & 2) != 0) selev = dis.readShort();
}
public void writeTo(DiffCoderDataOutputStream dos) throws Exception {
dos.writeDiffed(nid, 0);
dos.writeDiffed(ilon, 1);
dos.writeDiffed(ilat, 2);
int mode = (description == null ? 0 : 1) | (selev == Short.MIN_VALUE ? 0 : 2);
dos.writeByte((byte) mode);
if ((mode & 1) != 0) {
dos.writeShort(description.length);
dos.write(description);
}
if ((mode & 2) != 0) dos.writeShort(selev);
}
}
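Two details of `NodeData` are worth isolating: the fixed-point coordinate encoding (degrees to shifted micro-degrees, with `+ 0.5` for rounding) and the `mode` presence bitmask that decides which optional fields follow in the stream. The demo class below is a hypothetical sketch of just those two pieces.

```java
// Hypothetical sketch of NodeData's coordinate encoding and presence bitmask.
public class NodeEncodingDemo {
  // degrees -> micro-degrees, shifted so every stored value is non-negative
  static int encodeLat(double lat) { return (int) ((lat + 90.0) * 1_000_000.0 + 0.5); }
  static int encodeLon(double lon) { return (int) ((lon + 180.0) * 1_000_000.0 + 0.5); }

  // bit 0: a description byte-array is present; bit 1: an elevation is present
  // (Short.MIN_VALUE serves as the "no elevation" sentinel)
  static int mode(byte[] description, short selev) {
    return (description == null ? 0 : 1) | (selev == Short.MIN_VALUE ? 0 : 2);
  }

  public static void main(String[] args) {
    System.out.println(encodeLon(8.5));                  // 188500000
    System.out.println(mode(null, Short.MIN_VALUE));     // 0: nothing present
    System.out.println(mode(new byte[4], (short) 420));  // 3: both present
  }
}
```

The reader side mirrors the mask exactly: it reads a description only when bit 0 is set and an elevation only when bit 1 is set, so absent fields cost zero bytes on disk.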


@ -1,92 +1,80 @@
package btools.mapcreator;
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import btools.util.DenseLongMap;
import btools.util.DiffCoderDataOutputStream;
import btools.util.TinyDenseLongMap;
/**
* NodeFilter does 1 step in map-processing:
*
* - filters out unused nodes according to the way file
*
* @author ab
*/
public class NodeFilter extends MapCreatorBase
{
private DiffCoderDataOutputStream nodesOutStream;
private File nodeTilesOut;
protected DenseLongMap nodebitmap;
public static void main(String[] args) throws Exception
{
System.out.println("*** NodeFilter: Filter way related nodes");
if (args.length != 3)
{
System.out.println("usage: java NodeFilter <node-tiles-in> <way-file-in> <node-tiles-out>" );
return;
}
new NodeFilter().process( new File( args[0] ), new File( args[1] ), new File( args[2] ) );
}
public void init() throws Exception
{
nodebitmap = Boolean.getBoolean( "useDenseMaps" ) ? new DenseLongMap( 512 ) : new TinyDenseLongMap();
}
public void process( File nodeTilesIn, File wayFileIn, File nodeTilesOut ) throws Exception
{
init();
this.nodeTilesOut = nodeTilesOut;
// read the wayfile into a bitmap of used nodes
new WayIterator( this, false ).processFile( wayFileIn );
// finally filter all node files
new NodeIterator( this, true ).processDir( nodeTilesIn, ".tls" );
}
@Override
public void nextWay( WayData data ) throws Exception
{
int nnodes = data.nodes.size();
for (int i=0; i<nnodes; i++ )
{
nodebitmap.put( data.nodes.get(i), 0 );
}
}
@Override
public void nodeFileStart( File nodefile ) throws Exception
{
String filename = nodefile.getName();
File outfile = new File( nodeTilesOut, filename );
nodesOutStream = new DiffCoderDataOutputStream( new BufferedOutputStream ( new FileOutputStream( outfile ) ) );
}
@Override
public void nextNode( NodeData n ) throws Exception
{
if ( isRelevant( n ) )
{
n.writeTo( nodesOutStream );
}
}
public boolean isRelevant( NodeData n )
{
// check if node passes bitmap
return nodebitmap.getInt( n.nid ) == 0; // 0 -> bit set, -1 -> unset
}
@Override
public void nodeFileEnd( File nodeFile ) throws Exception
{
nodesOutStream.close();
}
}
package btools.mapcreator;
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import btools.util.DenseLongMap;
import btools.util.DiffCoderDataOutputStream;
import btools.util.TinyDenseLongMap;
/**
* NodeFilter does 1 step in map-processing:
* <p>
* - filters out unused nodes according to the way file
*
* @author ab
*/
public class NodeFilter extends MapCreatorBase {
private DiffCoderDataOutputStream nodesOutStream;
private File nodeTilesOut;
protected DenseLongMap nodebitmap;
public static void main(String[] args) throws Exception {
System.out.println("*** NodeFilter: Filter way related nodes");
if (args.length != 3) {
System.out.println("usage: java NodeFilter <node-tiles-in> <way-file-in> <node-tiles-out>");
return;
}
new NodeFilter().process(new File(args[0]), new File(args[1]), new File(args[2]));
}
public void init() throws Exception {
nodebitmap = Boolean.getBoolean("useDenseMaps") ? new DenseLongMap(512) : new TinyDenseLongMap();
}
public void process(File nodeTilesIn, File wayFileIn, File nodeTilesOut) throws Exception {
init();
this.nodeTilesOut = nodeTilesOut;
// read the wayfile into a bitmap of used nodes
new WayIterator(this, false).processFile(wayFileIn);
// finally filter all node files
new NodeIterator(this, true).processDir(nodeTilesIn, ".tls");
}
@Override
public void nextWay(WayData data) throws Exception {
int nnodes = data.nodes.size();
for (int i = 0; i < nnodes; i++) {
nodebitmap.put(data.nodes.get(i), 0);
}
}
@Override
public void nodeFileStart(File nodefile) throws Exception {
String filename = nodefile.getName();
File outfile = new File(nodeTilesOut, filename);
nodesOutStream = new DiffCoderDataOutputStream(new BufferedOutputStream(new FileOutputStream(outfile)));
}
@Override
public void nextNode(NodeData n) throws Exception {
if (isRelevant(n)) {
n.writeTo(nodesOutStream);
}
}
public boolean isRelevant(NodeData n) {
// check if node passes bitmap
return nodebitmap.getInt(n.nid) == 0; // 0 -> bit set, -1 -> unset
}
@Override
public void nodeFileEnd(File nodeFile) throws Exception {
nodesOutStream.close();
}
}
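The filter is a two-pass algorithm: pass 1 (`nextWay`) marks every node id referenced by a way, pass 2 (`nextNode` plus `isRelevant`) keeps only marked nodes. The hypothetical sketch below shows the idea with a plain `HashSet`; the real class uses `DenseLongMap`/`TinyDenseLongMap` because a boxed set does not scale to planet-sized id ranges.

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical two-pass filter sketch (HashSet stands in for DenseLongMap).
public class FilterSketch {
  static String keepReferenced(long[][] ways, long[] nodes) {
    Set<Long> used = new HashSet<>();
    for (long[] way : ways)               // pass 1: like nextWay()
      for (long nid : way) used.add(nid);
    StringBuilder kept = new StringBuilder();
    for (long nid : nodes)                // pass 2: like nextNode() + isRelevant()
      if (used.contains(nid)) kept.append(nid).append(' ');
    return kept.toString().trim();
  }

  public static void main(String[] args) {
    long[][] ways = { {1, 2, 3}, {3, 4} };     // node ids per way
    long[] nodes = { 1, 2, 3, 4, 5, 6 };       // all node ids in the tiles
    System.out.println(keepReferenced(ways, nodes)); // prints: 1 2 3 4
  }
}
```

Note the pass order is fixed: the way file must be fully consumed before any node tile is filtered, which is exactly why `process` runs the `WayIterator` before the `NodeIterator`.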


@ -1,71 +1,59 @@
package btools.mapcreator;
import java.io.BufferedInputStream;
import java.io.EOFException;
import java.io.File;
import java.io.FileInputStream;
import btools.util.DiffCoderDataInputStream;
/**
* Iterate over a single nodefile or a directory
* of nodetiles and feed the nodes to the callback listener
*
* @author ab
*/
public class NodeIterator extends MapCreatorBase
{
private NodeListener listener;
private boolean delete;
public NodeIterator( NodeListener nodeListener, boolean deleteAfterReading )
{
listener = nodeListener;
delete = deleteAfterReading;
}
public void processDir( File indir, String inSuffix ) throws Exception
{
if ( !indir.isDirectory() )
{
throw new IllegalArgumentException( "not a directory: " + indir );
}
File[] af = sortBySizeAsc( indir.listFiles() );
for( int i=0; i<af.length; i++ )
{
File nodefile = af[i];
if ( nodefile.getName().endsWith( inSuffix ) )
{
processFile( nodefile );
}
}
}
public void processFile(File nodefile) throws Exception
{
System.out.println( "*** NodeIterator reading: " + nodefile );
listener.nodeFileStart( nodefile );
DiffCoderDataInputStream di = new DiffCoderDataInputStream( new BufferedInputStream ( new FileInputStream( nodefile ) ) );
try
{
for(;;)
{
NodeData n = new NodeData( di );
listener.nextNode( n );
}
}
catch( EOFException eof )
{
di.close();
}
listener.nodeFileEnd( nodefile );
if ( delete && "true".equals( System.getProperty( "deletetmpfiles" ) ))
{
nodefile.delete();
}
}
}
package btools.mapcreator;
import java.io.BufferedInputStream;
import java.io.EOFException;
import java.io.File;
import java.io.FileInputStream;
import btools.util.DiffCoderDataInputStream;
/**
* Iterate over a single nodefile or a directory
* of nodetiles and feed the nodes to the callback listener
*
* @author ab
*/
public class NodeIterator extends MapCreatorBase {
private NodeListener listener;
private boolean delete;
public NodeIterator(NodeListener nodeListener, boolean deleteAfterReading) {
listener = nodeListener;
delete = deleteAfterReading;
}
public void processDir(File indir, String inSuffix) throws Exception {
if (!indir.isDirectory()) {
throw new IllegalArgumentException("not a directory: " + indir);
}
File[] af = sortBySizeAsc(indir.listFiles());
for (int i = 0; i < af.length; i++) {
File nodefile = af[i];
if (nodefile.getName().endsWith(inSuffix)) {
processFile(nodefile);
}
}
}
public void processFile(File nodefile) throws Exception {
System.out.println("*** NodeIterator reading: " + nodefile);
listener.nodeFileStart(nodefile);
DiffCoderDataInputStream di = new DiffCoderDataInputStream(new BufferedInputStream(new FileInputStream(nodefile)));
try {
for (; ; ) {
NodeData n = new NodeData(di);
listener.nextNode(n);
}
} catch (EOFException eof) {
di.close();
}
listener.nodeFileEnd(nodefile);
if (delete && "true".equals(System.getProperty("deletetmpfiles"))) {
nodefile.delete();
}
}
}
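`processFile` reads records in an endless loop and treats `EOFException` as the normal terminator, because the record stream carries no count header. The hypothetical demo below isolates that read-until-EOF pattern over plain ints instead of `NodeData` records.

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;

// Hypothetical standalone demo of the read-until-EOFException loop.
public class EofLoopDemo {
  static int sumInts(byte[] data) throws IOException {
    int sum = 0;
    DataInputStream dis = new DataInputStream(new ByteArrayInputStream(data));
    try {
      for (;;) sum += dis.readInt(); // read records until the stream runs out
    } catch (EOFException eof) {
      dis.close();                   // normal termination, not an error
    }
    return sum;
  }

  public static void main(String[] args) throws IOException {
    // two big-endian ints: 1 and 2
    System.out.println(sumInts(new byte[] {0, 0, 0, 1, 0, 0, 0, 2})); // prints: 3
  }
}
```

One caveat of the pattern: an `EOFException` thrown mid-record (a truncated file) is indistinguishable from a clean end, so it silently drops a partial trailing record rather than reporting corruption.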

Some files were not shown because too many files have changed in this diff.