# Importing the Database

The following instructions explain how to create a Nominatim database
from an OSM planet file and how to keep the database up to date. It
is assumed that you have already successfully installed the Nominatim
software itself. If not, return to the [installation page](Installation.md).

## Configuration setup in settings/local.php

The Nominatim server can be customized via the file `settings/local.php`
in the build directory. Note that this is a PHP file, so it must always
start like this:

    <?php

without any leading spaces.

There are lots of configuration settings you can tweak. Have a look
at `settings/default.php` for a full list. Most should have a sensible default.
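
As an illustration, a `settings/local.php` that overrides a single option might look like this (the flatnode path is only an example; the option itself is explained in the next section):

```php
<?php
// Example settings/local.php. Each @define here overrides the value of
// the same constant from settings/default.php.
@define('CONST_Osm2pgsql_Flatnode_File', '/path/to/flatnode.file');
```

The `@` suppresses PHP's warning when a constant is defined twice; the first definition encountered wins, which is why settings placed here take precedence over the defaults.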

#### Flatnode files

If you plan to import a large dataset (e.g. Europe, North America, planet),
you should also enable flatnode storage of node locations. With this
setting enabled, node coordinates are stored in a simple file instead
of the database. This will save you import time and disk storage.
Add to your `settings/local.php`:

    @define('CONST_Osm2pgsql_Flatnode_File', '/path/to/flatnode.file');

Replace the second part with a suitable path on your system and make sure
the directory exists. There should be at least 64GB of free space.
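
A quick pre-flight check along these lines can catch a missing directory or insufficient space before the import starts (sketch; the default path below is an example, use the same path as in your `local.php`):

```shell
# Sketch: make sure the flatnode location exists and has enough room.
FLATNODE_DIR="${FLATNODE_DIR:-/tmp/nominatim-flatnode}"
mkdir -p "$FLATNODE_DIR"
# Available space in kilobytes on the filesystem holding the directory.
avail_kb=$(df -Pk "$FLATNODE_DIR" | awk 'NR==2 {print $4}')
required_kb=$((64 * 1024 * 1024))   # 64GB expressed in kilobytes
if [ "$avail_kb" -lt "$required_kb" ]; then
    echo "warning: less than 64GB free for the flatnode file"
fi
```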

## Downloading additional data

### Wikipedia/Wikidata rankings

Wikipedia can be used as an optional auxiliary data source to help indicate
the importance of OSM features. Nominatim will work without this information
but it will improve the quality of the results if this is installed.
This data is available as a binary download:

    cd $NOMINATIM_SOURCE_DIR/data
    wget https://www.nominatim.org/data/wikimedia-importance.sql.gz

The file is about 400MB and adds around 4GB to the Nominatim database.

!!! tip
    If you forgot to download the wikipedia rankings, you can also add
    importances after the import. Download the files, then run
    `./utils/setup.php --import-wikipedia-articles`
    and `./utils/update.php --recompute-importance`.

### Great Britain, USA postcodes

Nominatim can use postcodes from an external source to improve searches that
involve a GB or US postcode. This data can optionally be downloaded:

    cd $NOMINATIM_SOURCE_DIR/data
    wget https://www.nominatim.org/data/gb_postcode_data.sql.gz
    wget https://www.nominatim.org/data/us_postcode_data.sql.gz

## Choosing the Data to Import

In its default setup Nominatim is configured to import the full OSM data
set for the entire planet. Such a setup requires a powerful machine with
at least 64GB of RAM and around 800GB of SSD storage. Depending on your
use case there are various ways to reduce the amount of data imported. This
section discusses these methods. They can also be combined.

### Using an extract

If you only need geocoding for a smaller region, then precomputed extracts
are a good way to reduce the database size and import time.
[Geofabrik](https://download.geofabrik.de) offers extracts for most countries.
They even have daily updates which can be used with the update process described
below. There are also
[other providers for extracts](https://wiki.openstreetmap.org/wiki/Planet.osm#Downloading).

Please be aware that some extracts are not cut exactly along the country
boundaries. As a result some parts of the boundary may be missing, which means
that Nominatim cannot compute the areas for some administrative areas.

### Dropping Data Required for Dynamic Updates

About half of the data in Nominatim's database is not really used for serving
the API. It is only there to allow the data to be updated from the latest
changes from OSM. For many uses these dynamic updates are not really required.
If you don't plan to apply updates, the dynamic part of the database can be
safely dropped using the following command:

```
./utils/setup.php --drop
```

Note that you still need to provide sufficient disk space for the initial
import. So this option is particularly interesting if you plan to transfer the
database or reuse the space later.

### Reverse-only Imports

If you only want to use the Nominatim database for reverse lookups or
if you plan to use the installation only for exports to a
[photon](https://photon.komoot.de/) database, then you can set up a database
without search indexes. Add `--reverse-only` to your setup command above.

This saves about 5% of disk space.

### Filtering Imported Data

Nominatim normally sets up a full search database containing administrative
boundaries, places, streets, addresses and POI data. There are also other
import styles available which only read selected data:

* **settings/import-admin.style**
  Only import administrative boundaries and places.
* **settings/import-street.style**
  Like the admin style but also adds streets.
* **settings/import-address.style**
  Import all data necessary to compute addresses down to house number level.
* **settings/import-full.style**
  Default style that also includes points of interest.
* **settings/import-extratags.style**
  Like the full style but also adds most of the OSM tags into the extratags
  column.

The style can be changed with the configuration `CONST_Import_Style`.
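
For example, selecting the address style in `settings/local.php` might look like this (a sketch, assuming the style path is built from `CONST_BasePath` the same way the defaults in `settings/default.php` do):

```php
<?php
// settings/local.php (sketch): import addresses but skip most POI data.
@define('CONST_Import_Style', CONST_BasePath.'/settings/import-address.style');
```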

To give you an idea of the impact of using the different styles, the table
below gives rough estimates of the final database size after import of a
2018 planet and after using the `--drop` option. It also shows the time
needed for the import on a machine with 64GB RAM, 4 CPUs and SSDs. Note that
the given sizes are just an estimate meant for comparison of style requirements.
Your planet import is likely to be larger as the OSM data grows with time.

style     | Import time  | DB size    | after drop
----------|--------------|------------|------------
admin     | 5h           | 190 GB     | 20 GB
street    | 42h          | 400 GB     | 180 GB
address   | 59h          | 500 GB     | 260 GB
full      | 80h          | 575 GB     | 300 GB
extratags | 80h          | 585 GB     | 310 GB

You can also customize the styles further. For a description of the
style format see [the development section](../develop/Import.md).

## Initial import of the data

!!! danger "Important"
    First try the import with a small extract, for example from
    [Geofabrik](https://download.geofabrik.de).

Download the data to import and load the data with the following command
from the build directory:

```sh
./utils/setup.php --osm-file <data file> --all 2>&1 | tee setup.log
```

***Note for full planet imports:*** Even on a perfectly configured machine
the import of a full planet takes at least 2 days. Once you see messages
with `Rank .. ETA` appear, the indexing process has started. This part takes
the most time. There are 30 ranks to process. Ranks 26 and 30 are the most
complex. They each take about a third of the total import time. If you have
not reached rank 26 after two days of import, it is worth revisiting your
system configuration as it may not be optimal for the import.
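
A quick way to check how far indexing has progressed is to pull the last rank message out of `setup.log`. A sketch, demonstrated here against a fabricated log excerpt (point the `grep` at your real log instead):

```shell
# Sketch: extract the most recent rank from the import log.
# The sample lines below are fabricated for illustration only.
cat > sample-setup.log <<'EOF'
Starting rank 25 (using 4 threads)
Rank 25 ETA (seconds): 120
Starting rank 26 (using 4 threads)
Rank 26 ETA (seconds): 86400
EOF
grep -o 'Rank [0-9]*' sample-setup.log | tail -n 1
```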

### Notes on memory usage

In the first step of the import Nominatim uses osm2pgsql to load the OSM data
into the PostgreSQL database. This step is very demanding in terms of RAM usage.
osm2pgsql and PostgreSQL are running in parallel at this point. PostgreSQL
blocks at least the part of RAM that has been configured with the
`shared_buffers` parameter during [PostgreSQL tuning](Installation.md#postgresql-tuning)
and needs some memory on top of that. osm2pgsql needs at least 2GB of RAM for
its internal data structures, potentially more when it has to process very large
relations. In addition it needs to maintain a cache for node locations. The size
of this cache can be configured with the parameter `--osm2pgsql-cache`.

When importing with a flatnode file, it is best to disable the node cache
completely and leave the memory for the flatnode file. Nominatim will do this
by default, so you do not need to configure anything in this case.

For imports without a flatnode file, set `--osm2pgsql-cache` approximately to
the size of the OSM pbf file (in MB) you are importing. Make sure you leave
enough RAM for PostgreSQL and osm2pgsql as mentioned above. If the system starts
swapping or you are getting out-of-memory errors, reduce the cache size or
even consider using a flatnode file.
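
Under this rule of thumb, the cache value is just the pbf size rounded up to whole megabytes. A sketch with a hypothetical ~1.2GB extract (the byte count stands in for the output of `stat -c %s yourfile.pbf`):

```shell
# Sketch: derive --osm2pgsql-cache from the pbf size, rounded up to MB.
pbf_bytes=1258291200                        # example: a ~1.2GB extract
cache_mb=$(( (pbf_bytes + 1048575) / 1048576 ))
echo "$cache_mb"   # -> 1200
```

The resulting number would then be passed to the setup command, e.g. `--osm2pgsql-cache 1200`.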

### Verify the import

Run this script to verify that all required tables and indices were created successfully:

```sh
./utils/check_import_finished.php
```

### Setting up the website

Run the following command to set up the configuration file for the website,
`settings/settings-frontend.php`. These settings are used by the `website/*.php` files.

```sh
./utils/setup.php --setup-website
```

!!! Note
    This step is not necessary if you used the `--all` option when setting up the DB.

Now you can try out your installation by running:

```sh
make serve
```

This runs a small test server normally used for development. You can use it
to verify that your installation is working. Go to
`http://localhost:8088/status.php` and you should see the message `OK`.
You can also run a search query, e.g. `http://localhost:8088/search.php?q=Berlin`.

To run Nominatim via webservers like Apache or nginx, please read the
[Deployment chapter](Deployment.md).

## Tuning the database

Accurate word frequency information for search terms helps PostgreSQL's query
planner to make the right decisions. Recomputing them can improve the performance
of forward geocoding, in particular under high load. To recompute word counts run:

```sh
./utils/update.php --recompute-word-counts
```

This will take a couple of hours for a full planet installation. You can
also defer that step to a later point in time when you realise that
performance becomes an issue. Just make sure that updates are stopped before
running this function.

If you want to be able to search for places by their type through
[special key phrases](https://wiki.openstreetmap.org/wiki/Nominatim/Special_Phrases)
you also need to enable these key phrases like this:

    ./utils/specialphrases.php --wiki-import > specialphrases.sql
    psql -d nominatim -f specialphrases.sql

Note that this command downloads the phrases from the wiki link above. You
need internet access for this step.

## Installing Tiger housenumber data for the US

Nominatim is able to use the official [TIGER](https://www.census.gov/geographies/mapping-files/time-series/geo/tiger-line-file.html)
address set to complement the OSM house number data in the US. You can add
TIGER data to your own Nominatim instance by following these steps. The
entire US adds about 10GB to your database.

1. Get preprocessed TIGER 2019 data and unpack it into the
   data directory in your Nominatim sources:

        cd Nominatim/data
        wget https://nominatim.org/data/tiger2019-nominatim-preprocessed.tar.gz
        tar xf tiger2019-nominatim-preprocessed.tar.gz

2. Import the data into your Nominatim database:

        ./utils/setup.php --import-tiger-data

3. Enable use of the Tiger data in your `settings/local.php` by adding:

        @define('CONST_Use_US_Tiger_Data', true);

4. Apply the new settings:

    ```sh
    ./utils/setup.php --create-functions --enable-diff-updates --create-partition-functions
    ```

See the [developer's guide](../develop/data-sources.md#us-census-tiger) for more
information on how the data got preprocessed.