The deadend trimming is too enthusiastic, getting rid of some unsnapped
cycleways and things connecting to the map border. Will iterate on it
this week; net benefit for now.
dramatically improve time to import and edit maps.
The fix helps all maps that use extremely high edge weights to prevent
people from cutting through private roads. There may be a more robust
fast_paths fix later, but I want to reap the benefits for tomorrow's
release.
The dramatic numbers:
- importing huge_seattle: 893s down to 108s
- editing huge_seattle: 102s down to 19s
Query speeds didn't appear to substantially change.
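A minimal sketch of the high-edge-weight idea, using the fast_paths
crate's public API; the node IDs, weights, and the PRIVATE_ROAD_PENALTY
multiplier here are made up for illustration, not the importer's actual
values.

```rust
use fast_paths::InputGraph;

// Hypothetical penalty factor; the real importer picks its own value.
const PRIVATE_ROAD_PENALTY: usize = 1_000;

fn main() {
    let mut input_graph = InputGraph::new();
    // A normal road segment, with weight proportional to travel time.
    input_graph.add_edge(0, 1, 30);
    // A private road: same physical cost, but inflated so routes only use
    // it when there's no alternative.
    input_graph.add_edge(1, 2, 30 * PRIVATE_ROAD_PENALTY);
    // A longer detour that stays on public roads.
    input_graph.add_edge(0, 2, 90);
    input_graph.freeze();

    // Build the contraction hierarchy, then query.
    let fast_graph = fast_paths::prepare(&input_graph);
    if let Some(path) = fast_paths::calc_path(&fast_graph, 0, 2) {
        // The detour wins, because the private road is so expensive.
        println!("weight {}, nodes {:?}", path.get_weight(), path.get_nodes());
    }
}
```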
lane that they're stuck behind. Only record a risk exposure event
the first time, but let passing happen anywhere. #382
Also add scenario name to PrebakeSummary, to disambiguate the Poundbury
results.
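A minimal sketch of recording a risk exposure event only the first time
a given pair is involved; the ID types here are hypothetical stand-ins
for the simulation's own.

```rust
use std::collections::HashSet;

// Hypothetical IDs; the real simulation has its own agent types.
type CarID = usize;
type BikeID = usize;

#[derive(Default)]
struct ExposureTracker {
    seen: HashSet<(CarID, BikeID)>,
}

impl ExposureTracker {
    /// Returns true only the first time this pair is recorded, so the same
    /// car repeatedly stuck behind the same bike doesn't inflate the count.
    /// Passing itself isn't restricted here at all.
    fn record(&mut self, car: CarID, bike: BikeID) -> bool {
        self.seen.insert((car, bike))
    }
}
```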
fixes a very dramatic problem in the Green Lake map.
Regenerating everything...
Also added total trip time to the prebaked summary, to get a quick sense
of whether a change helps or hurts overall, and to have a record in
version control.
- handle when the equiv_pos of a driveway gets too close to the edge of
another lane
- make the updater workflow handle files from S3 that are a bit older
- remove pathfinding_avoiding_roads
- strip out old vehicle capping from map edit JSON, then fix up
proposals
- delete old capping API example
- temporarily give up on phinney; it starts gridlocking
- add broadmoor proposal link in-game
- arrays are now iterable directly
- switch to using BTree{Set,Map}::retain! (sketched below, after this list)
- a round of clippy
- regenerate scenarios and prebaked data; not sure why, but there's a
diff
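The retain switch, sketched with made-up data: BTreeMap::retain prunes
entries in place with one call, instead of collecting keys and removing
them in a second pass.

```rust
use std::collections::BTreeMap;

fn main() {
    // Made-up data: trip IDs mapped to durations in seconds.
    let mut durations: BTreeMap<u32, f64> =
        vec![(1, 30.0), (2, 900.0), (3, 15.0)].into_iter().collect();

    // Keep only trips lasting at least a minute; everything else is
    // dropped in place, with no intermediate Vec of keys to remove.
    durations.retain(|_id, secs| *secs >= 60.0);

    assert_eq!(durations.len(), 1);
}
```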
for people that leave one border, then re-enter a different one. #664
Alternative considered: insert a dummy remote trip between the two
borders. We used to do something like this, at the cost of non-trivial
code complexity and of having to explain the trip in the UI.
Regenerated all scenarios and prebaked data.
- Modest file size increase, as expected. Montlake scenario goes from
1.3MB to 1.5MB, downtown from 37MB to 43MB, all Seattle scenarios from
280MB to 327MB
- Eyeballed a few of the previously broken trips; they work now!
- Montlake goes from 3127 cancelled trips to just 302!
- No new gridlock, except in Rainier Valley -- disabling that for now
- I discovered missing validation in Poundbury for no-op trips between
the same building. I'll filter those out and restore prebaked results
there in a followup.
* More conventional spelling of acronym identifiers
* `new` -> `new_state`
* Remove redundant field name
* Remove needless `collect`
* `to_controls` -> `make_controls`
* Simplify long if statement
* Fix module inception
* Simplify chained if let
* Return directly instead of creating a binding
* Disable clippy warning about `borrow` method
Implementing the `Borrow` trait instead would result in excessive type
annotations. (A sketch of this trade-off follows after this list.)
* Fix most remaining clippy warnings
* Allow clippy::type_complexity
* Fix bad merge from web editor
* Run cargo fmt
* Suppress large_enum_variant warnings
* Rename FYI state to ShowMessage
* Fix upper_case_acronyms warnings
Co-authored-by: Dustin Carlino <dabreegster@gmail.com>
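A hypothetical sketch of the `borrow` decision above: keep an inherent
method named `borrow` and allow the lint (assumed here to be
clippy::should_implement_trait), rather than implementing
std::borrow::Borrow, which would force type annotations at call sites.
The type and field are invented.

```rust
struct Wrapper {
    inner: String,
}

impl Wrapper {
    // Clippy flags inherent methods that shadow std trait methods, but
    // implementing Borrow<str> would make callers spell out the borrowed
    // type everywhere, so the warning is allowed instead.
    #[allow(clippy::should_implement_trait)]
    pub fn borrow(&self) -> &str {
        &self.inner
    }
}

fn main() {
    let w = Wrapper { inner: "hi".to_string() };
    assert_eq!(w.borrow(), "hi");
}
```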
"rise / run" calculation used the trimmed road center-lines, which don't
match up with the elevation at each original intersection point.
Also handle infinity in the output and reduce the resolution of the
query from every 1m to every 5m.
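A minimal sketch of the rise / run idea with the infinity guard; the
function and clamp bounds are invented, not the map model's actual code.

```rust
// Incline of a road as a percentage: elevation change divided by length.
fn percent_incline(rise_meters: f64, run_meters: f64) -> f64 {
    if run_meters <= f64::EPSILON {
        // A zero-length segment would divide to infinity; treat it as flat.
        return 0.0;
    }
    let incline = 100.0 * rise_meters / run_meters;
    // Clamp values that mismatched elevation data can produce.
    incline.clamp(-100.0, 100.0)
}
```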
Regenerate all maps due to the map format change. Try bringing in
elevation data for all of Seattle using the LIDAR source, since
the data quality assessed in eldang/elevation_lookups#12 seems to be
similar, and LIDAR is way faster than contours.
City names are now disambiguated by a two-letter country code. This
commit handles almost everything needed to make this transition. Main
next steps are fixing up map edits automatically and making the city
picker UI understand the extra level of hierarchy.
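A tiny sketch of the extra level of hierarchy; the field names and
example values are assumptions, not the map model's actual types.

```rust
// Hypothetical representation of a disambiguated city name.
struct CityName {
    /// Two-letter country code, e.g. "gb" or "us".
    country: String,
    /// The city itself, e.g. "leeds" or "seattle".
    city: String,
}

fn main() {
    let city = CityName {
        country: "gb".to_string(),
        city: "leeds".to_string(),
    };
    // Paths and UI grouping can now key on (country, city).
    println!("{}/{}", city.country, city.city);
}
```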
A little bit of fallout: lakeslice gridlocks again; this regression is
actually from the recent traffic signal changes, but I'm just now
regenerating everything. Will fix soon.
source to augment the ones in OSM. For
https://github.com/cyipt/actdev/issues/53 -- sometimes the buildings
just haven't been mapped in OSM yet, other times the buildings are part
of a future development site. In either case, we can procedurally
generate some houses, so this is a way to include them in the map.
Start doing this for Chapelford. But first, adjust the generated house
sizes -- they were WAY too tiny.
Also prep for [rebuild] [release]
Originally, the intention of the deleted calls was to not interrupt
Timer progress bars with warnings. But the output of things like the
importer is impossible to read anyway. Strongly considering explicitly
sending logs and timing info to separate places and using something like
multitail for live progress.
Unplumb timer from LOADS of places that just needed it for logging.
Small adjustments to unzoomed rendering and stop sign placement.
Regenerate all maps because of the format change, but only Cambridge
changes. Since we're doing this anyway, also pull in leisure=garden.
widgetry, geom, and abstutil may wind up on crates.io in some form to
let other projects use widgetry. abstio has A/B Street-specific tricks
for reading data on native/web. Note widgetry still depends on abstio;
I'll figure out how to clean that up next.
degrees to 30 degrees. It works around the issue in #428, but it doesn't
solve the root cause there, so the unit test is also adjusted to provide
a way to solve the harder problem.
Regenerated all maps accordingly. Many traffic signals improved, with a
straight turn now marked protected instead of being permitted as a
"right turn."
The schedule validation changes slightly. No-op trips between the same
origin/destination are now an error and get filtered out.
huge_seattle scenario goes from 129MB to 110MB with the redundant
endpoints removed.
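A minimal sketch of the new validation rule; the types are invented
stand-ins for the scenario's actual ones.

```rust
#[derive(PartialEq, Debug, Clone, Copy)]
struct BuildingID(usize);

struct Trip {
    from: BuildingID,
    to: BuildingID,
}

// Flag trips between the same building as errors and drop them.
fn filter_noop_trips(trips: Vec<Trip>) -> Vec<Trip> {
    trips
        .into_iter()
        .filter(|trip| {
            if trip.from == trip.to {
                eprintln!("skipping no-op trip at {:?}", trip.from);
                false
            } else {
                true
            }
        })
        .collect()
}
```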
remote trip goes between two locations off-map, specified just by a GPS
coordinate. The trips aren't simulated at all. They were originally
added to support Orestis's pandemic model, to handle transmission
off-map in shared buildings. This work has died off, there are no other
anticipated use cases for remote trips, and they complicate bigger
refactorings. #258
This also has the nice side effect of substantially reducing scenario
size -- huge_seattle from 177MB to 147MB. That unused metadata was
expensive!