mirror of https://github.com/simonmichael/hledger.git
;doc: import: edits
parent 210f28a7b5
commit 2a67aa327b
@@ -49,9 +49,11 @@ Tips:
 you can reduce the chance of this happening in new transactions by importing more often.
 (If it happens in old transactions, that's harmless.)
 
-Note this is just one kind of "deduplication": avoiding reprocessing the same dates across successive runs.
-`import` doesn't detect other kinds of duplication, such as the same transaction appearing multiple times within a single run.
-(Because that sometimes happens legitimately in real-world data.)
+Note this is just one kind of "deduplication": not reprocessing the same dates across successive runs.
+`import` doesn't detect other kinds of duplication, such as
+the same transaction appearing multiple times within a single run,
+or a new transaction that looks identical to a transaction already in the journal.
+(Because these can happen legitimately in real-world data.)
 
 Here's a situation where you need to run `import` with care:
 say you download but forget to import `bank.1.csv`, and a week later you download `bank.2.csv` with some overlapping data.
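For context (not part of this commit), here is a minimal shell sketch of the date-based deduplication described in the text above, using import's documented --dry-run preview flag; `bank.csv` is a placeholder filename:

    $ hledger import --dry-run bank.csv   # preview which transactions would be imported, without importing
    $ hledger import bank.csv             # import the new transactions; the latest date seen is recorded in .latest.bank.csv
    $ hledger import bank.csv             # a second run imports nothing new: dates up to the recorded date are skipped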