Cannot send several transactions with the same details even though they are different transactions


I have had a case where I had two transactions with exactly the same details (two bus journeys on the same day). When I try to import these into QuickFile using the API with the “ignore duplicate” option switched on, the import fails.

How can I still do this without turning off the “ignore duplicate” switch?


If you mean the Bank_CreateTransaction operation, then it looks like you can specify the duplicate filter option as part of the API call rather than relying on what’s set at the account level. I haven’t tried this myself, but passing an explicit false for DuplicateFilterOn might help.
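For illustration, a request body along these lines should do it. This is only a sketch: the exact field names (Header/Body, DuplicateFilterOn, TransactionsList, the account number and MD5 values) are assumptions you should check against the Bank_CreateTransaction schema before relying on them.

```python
import json

def build_request(acc_number: str, md5_value: str, transactions: list) -> str:
    """Sketch of a Bank_CreateTransaction request body with the duplicate
    filter explicitly disabled for this call. Field names are assumptions
    -- verify them against the actual Bank_CreateTransaction schema."""
    payload = {
        "payload": {
            "Header": {
                "MessageType": "Request",
                "Authentication": {
                    "AccNumber": acc_number,  # your QuickFile account number
                    "MD5Value": md5_value,    # auth hash per the API docs
                },
            },
            "Body": {
                "BankAccountNumber": "1200",  # assumed nominal code for the bank account
                "DuplicateFilterOn": False,   # explicit per-call override of the filter
                "TransactionsList": {"Transaction": transactions},
            },
        }
    }
    return json.dumps(payload)

# Two identical bus journeys on the same day, as in the question.
journeys = [
    {"Date": "2016-05-10", "Reference": "Bus journey", "Amount": -2.50},
    {"Date": "2016-05-10", "Reference": "Bus journey", "Amount": -2.50},
]
body = build_request("6131400000", "0" * 32, journeys)
```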

If the two transactions have the same details then they are duplicates as far as the API is concerned, and the only way to import them both would be to disable the duplicate filter.

Thank you for your reply.

Yes, I was referring to that operation. The transaction time is obviously different.

My understanding of the duplicate filter is that it lets you reimport the whole bank data without creating duplicates; it should not apply to new transactions with different transaction times.

I don’t work for QuickFile so I don’t know the database schema, but I don’t believe they store the time for bank transactions even if it is available in the source data. Certainly in the UI the transaction date only goes down to a granularity of one day.

For bank transactions the duplication checker works by checking for existing transactions where three things match:

  • Date (time isn’t taken into account)
  • Description
  • Value

If all three match then it’s considered a duplicate. However, if two matching transactions arrive in the same upload (e.g. the same statement upload) then they’re not treated as duplicates of each other, as they come from the same source.
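The check described above amounts to something like the following. This is a sketch of the behaviour as described in this thread, not QuickFile’s actual implementation:

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class BankTransaction:
    tx_date: date      # only the date is compared; time-of-day is ignored
    description: str
    amount: float
    source_id: str     # e.g. a statement-upload ID or API request ID

def is_duplicate(new: BankTransaction, existing: list) -> bool:
    """A transaction is a duplicate if an existing transaction from a
    *different* source matches on all three of date, description and value.
    Matches from the same source (same upload) are allowed through."""
    return any(
        old.tx_date == new.tx_date
        and old.description == new.description
        and old.amount == new.amount
        and old.source_id != new.source_id  # same-source matches are not flagged
        for old in existing
    )
```

So two identical bus journeys in one statement upload both import, while the same transaction arriving later from a different source is filtered out.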

With the API the same would apply. However, I will need to double check: since each API request is executed as it’s received, each request would count as a separate source.

Turning the duplicate checker off should resolve this, but please let us know if you’re seeing something different.

The schema suggests you can supply up to 100 transactions in one API call.

Hi @ian_roberts

Yes, that’s correct. I’m just checking how the duplicate filter is managed with these if it’s set to on. I would imagine it’s the same as I posted above (e.g. from the same API call it’s classed as “one source”), but I’m confirming with our development team to be sure.

Just to confirm - at the moment these transactions seem to be checked against each other for duplicates, even in the same API call. I have however asked for this to be changed to match the CSV imports.

OK thanks! Let me know when it is fixed.

@cedev - This should now be resolved as part of our latest release. Could you try it again please and let me know if that works?

To clarify, if you have 2 transactions that do match (same date, description and value), they shouldn’t be flagged as duplicates as they would be part of the same API call (and therefore it’s likely they would have been checked beforehand).

I’ll give it a try.

Yes, that was exactly the case.