published by whitemice on Tue, 01/24/2017 - 14:45
You have a text file with four million records and you want to load this data into a table in an SQLite database. But some of the records are duplicates (based on certain fields) and the file is not ordered. Given the size of the data, loading the entire file into memory doesn't work very well, and given the number of records, checking for an existing duplicate at every insert is prohibitively slow. What does work quite well is to load all the data first and then deduplicate the table.
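One way to express that deduplication in SQLite is a single DELETE that keeps one row per group of the fields that define a duplicate, leaning on SQLite's implicit rowid column. A minimal sketch, assuming a hypothetical table named records in which two rows count as duplicates when field1 and field2 both match:

```sql
-- Keep the first-loaded row (lowest rowid) in each group of the
-- duplicate-defining fields; delete every other row in the group.
-- "records", "field1", and "field2" are placeholder names.
DELETE FROM records
WHERE rowid NOT IN (
    SELECT MIN(rowid)
    FROM records
    GROUP BY field1, field2
);
```

On a table of this size, creating an index on the grouping fields first (CREATE INDEX idx_dedupe ON records (field1, field2);) can speed up the GROUP BY considerably; the index can be dropped afterward if it isn't otherwise needed.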