
Record already exists for end dated transaction. #2237

Closed
falcon397 opened this issue Apr 20, 2019 · 5 comments
Labels
stale No replies or comments. Will be auto-closed in 14 days.

Comments

@falcon397

Bug description
I am running Firefly III version 4.7.17, and my problem is:

I imported a CSV with about half a month of transactions. I then realized I wasn't applying my rules/budget correctly, so I deleted the records, pointed the rule at the correct budget, and re-imported the CSV, expecting the rules to be reapplied and the budget issue fixed.

When I re-import the CSV, I'm told the items could not be imported because they already exist, and none of the records in the CSV are added. I checked MySQL, and the records are correctly end-dated in the transaction table.

Steps to reproduce
I have not been able to move past this or undo it.

Expected behavior
I would expect the app to see that the record is end-dated and create a new record in the transactions table.

Extra info
Docker container, Linux host, Chrome Browser, MySQL 5.7 DB.

Bonus points
[2019-04-20 23:20:16] local.WARNING: Row #24 seems to be a duplicate entry and will be ignored.

[2019-04-20 23:20:16] local.INFO: Found a transaction journal with an existing hash: 29b00fac3455699d5f0096fc6662d36253abdfa2b83077417723d7550943e2fb

[2019-04-20 23:20:16] local.INFO: Transaction is a duplicate, and will not be imported (the hash exists). {"existing":3372,"description":"Withdrawal-ACH-A-1146275 WEBAcorns Investing (Transfer)","amount":"-10.5200","date":"2019-04-16 00:00:00"}

[2019-04-20 23:20:16] local.WARNING: Row #25 seems to be a duplicate entry and will be ignored.

[2019-04-20 23:20:16] local.INFO: Found a transaction journal with an existing hash: 6e9642ae846bd174a6f58d568f0a4f8c102bcd78b8e527f3bc9d986edf2accd6

[2019-04-20 23:20:16] local.INFO: Transaction is a duplicate, and will not be imported (the hash exists). {"existing":3373,"description":"Withdrawal at ARCO#83095FAYZANULLAY F 0 WILLITS CA US","amount":"-11.2600","date":"2019-04-16 00:00:00"}

[2019-04-20 23:20:16] local.WARNING: Row #26 seems to be a duplicate entry and will be ignored.

[2019-04-20 23:20:16] local.INFO: Found a transaction journal with an existing hash: 2d2d2212dcb1eb5a217f7aa43af1cc47f991e4386480bf0cf3f0c1e67dcfacbc

[2019-04-20 23:20:16] local.INFO: Transaction is a duplicate, and will not be imported (the hash exists). {"existing":3374,"description":"Withdrawal at WINCO FOODS #53 200 Blu 0 Folsom CA US","amount":"-244.5300","date":"2019-04-17 00:00:00"}

[2019-04-20 23:20:16] local.WARNING: Row #27 seems to be a duplicate entry and will be ignored.

[2019-04-20 23:20:16] local.INFO: Found a transaction journal with an existing hash: 068d90acff735953af9d9565bd4e18101f0d677098ecd69505c3da3ba04ce4b6

[2019-04-20 23:20:16] local.INFO: Transaction is a duplicate, and will not be imported (the hash exists). {"existing":3375,"description":"Withdrawal-ACH-A-930321 WEBCITI CARD ONLINE (PAYMENT)","amount":"-4672.0800","date":"2019-04-18 00:00:00"}

[2019-04-20 23:20:16] local.WARNING: Row #28 seems to be a duplicate entry and will be ignored.

[2019-04-20 23:20:16] local.INFO: Found a transaction journal with an existing hash: 14ea02bbf768058d892e98625232c21ea157ab6f20c07377957dc8900e47c402

[2019-04-20 23:20:16] local.INFO: Transaction is a duplicate, and will not be imported (the hash exists). {"existing":3376,"description":"Withdrawal-ACH-A-1146275 WEBAcorns Investing (Transfer)","amount":"-16.8200","date":"2019-04-18 00:00:00"}

(screenshot attached)

@JC5
Member

JC5 commented Apr 21, 2019

Firefly III will never re-import rows, even when they have been deleted. You'll have to remove the offending rows from your database manually, from the "transaction_journals" table.

This is by design. It is annoying when you're testing your import, but it gives you distinct advantages later on:

  1. You can safely import overlapping files, because existing transactions will be ignored.
  2. You can delete transactions (many banks deliver "filler lines" with account info or notices) without them being re-imported the next time.
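The log lines above show how this works: the importer hashes each row's content and refuses any row whose hash already exists in `transaction_journals`, even if the matching journal was deleted. A minimal sketch of that idea (field names and the in-memory store are hypothetical, not Firefly III's actual code):

```python
import hashlib
import json

# Hypothetical in-memory stand-in for the hash column in transaction_journals.
# In Firefly III the hashes persist even after a journal is (soft-)deleted.
existing_hashes = set()

def row_hash(row: dict) -> str:
    """Hash the row's content so identical rows always produce the same digest."""
    canonical = json.dumps(row, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def import_row(row: dict) -> bool:
    """Return True if the row was imported, False if skipped as a duplicate."""
    h = row_hash(row)
    if h in existing_hashes:
        return False  # hash already known: treat as duplicate, even post-delete
    existing_hashes.add(h)
    return True

row = {"description": "Withdrawal-ACH-A-1146275", "amount": "-10.52", "date": "2019-04-16"}
assert import_row(row) is True   # first import succeeds
assert import_row(row) is False  # re-import of the same row is skipped
```

Because the hash is derived purely from the row's content, deleting the transaction does not help: the same CSV row hashes to the same value and is skipped again.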

@JC5 JC5 added the question label Apr 21, 2019
@falcon397
Author

falcon397 commented Apr 22, 2019

Makes sense. In my case I am updating my data because I went back into my online banking UI and applied categories to records that didn't have them. It doesn't seem unreasonable that mistakes in categorization happen from time to time and that a user would need to re-import an entire data set. What prevents Firefly from detecting a duplicate record and asking the user whether to update it with the new data or skip it? I read the note about your limited capacity to support a fully featured import function, but at the very least giving users the choice to update or skip a record would make the import process much more flexible, especially since records appear to be irreplaceable.

It is a bit confusing: I've re-imported a lot of data before, and this is the first time I've had this problem re-importing the same CSV. Is this a new feature? I have gone into the asset accounts, selected all the records I want to remove, deleted them, and re-uploaded the CSVs so many times it haunts my dreams. I didn't do anything different this time, but now it doesn't work. I did update my Docker container to latest over the weekend, so I suspected it was a newer feature. But now I'm curious: you said it works this way by design, yet that has not been my experience in past versions of the application.

Also, on imports of larger files, the page can time out after a minute or so. Is there a way to make the page timeout a Docker variable? Typically, after a timeout I'll follow along with the Docker logs to verify that Firefly III processes everything correctly. I can refresh the timed-out page and the service goes back to the Dashboard screen, but watching the logs is the only way to know whether the import finished successfully.
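The update-or-skip flow suggested above could look something like this: hash only the identity fields (description, amount, date) so that metadata such as the category doesn't affect the digest, then let the caller decide what happens on a collision. This is purely an illustrative sketch of the proposal, not an existing Firefly III feature:

```python
import hashlib
import json

# Illustrative store mapping content hash -> stored transaction (hypothetical).
store: dict = {}

IDENTITY_FIELDS = ("description", "amount", "date")

def content_hash(row: dict) -> str:
    """Hash only the identity fields, so category/budget edits don't change it."""
    identity = {k: row[k] for k in IDENTITY_FIELDS}
    return hashlib.sha256(json.dumps(identity, sort_keys=True).encode()).hexdigest()

def import_row(row: dict, on_duplicate: str = "skip") -> str:
    """Import a row; on a duplicate hash either 'skip' it or 'update' its metadata."""
    h = content_hash(row)
    if h in store:
        if on_duplicate == "update":
            store[h].update(row)  # refresh category/budget on the existing record
            return "updated"
        return "skipped"
    store[h] = dict(row)
    return "imported"
```

With `on_duplicate="update"`, re-importing a corrected CSV would refresh the categories on the existing records instead of silently ignoring every row.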

@JC5
Member

JC5 commented Apr 24, 2019

It's always been there. I'm not entirely sure why it hasn't triggered before. I'll do some digging to be sure.

The time-out can't be helped, I'm afraid. It's already at 5 minutes, and the JavaScript code on the import page should prevent it from timing out; the script should always be able to pick up the current status, even if the page times out.

@falcon397
Author

A 5-minute timeout is not my experience. My files are around 50 records. I have had a few really big ones, such as an initial import of my banking going back 18 months, where I expected a timeout. But most of my files cover a single month and average fewer than 50 transactions; processing takes maybe 30 seconds, and in that time the page times out more often than not.

@stale

stale bot commented May 1, 2019

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the stale No replies or comments. Will be auto-closed in 14 days. label May 1, 2019
@stale stale bot closed this as completed May 8, 2019
@lock lock bot locked as resolved and limited conversation to collaborators Jan 19, 2020