The merged save file is 5 GB, and the merge process takes up about 14 GB of RAM.
I think the most obvious solution is to sort the files from biggest to smallest when they are merged, or to use only small save files.
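The idea of merging from biggest to smallest can be sketched as a simple ordering step before the merge loop: the largest hashtable is loaded once and the smaller files are folded into it. This is only an illustration of the ordering, not the actual Kangaroo merge code; `merge_order` is a hypothetical helper name.

```python
import os

def merge_order(paths):
    # Sort work files from biggest to smallest so the merge starts from
    # the largest hashtable and folds the smaller files into it,
    # instead of repeatedly growing a small table.
    return sorted(paths, key=os.path.getsize, reverse=True)
```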
On the other hand, I think the -ws flag is problematic when combined with -wsplit, since it generates larger files than necessary. Do you think it would be worthwhile to separate the DPs and the kangaroos into different save files?
As the next improvements, I will work on improving the export of the DPs, and on the possibility of raising the DP bits in a save file to reduce its size if we chose too low a DP value. It could also be interesting to strip the distances from a save file so it can be shared without giving away the prize.
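Raising the DP bits after the fact works because a point that qualifies under a stricter DP criterion also qualifies under a looser one, so re-filtering only discards entries. A minimal sketch, assuming DPs are selected by requiring the top DP bits of the x-coordinate to be zero (which is how Kangaroo-style DP masks are usually defined; the exact on-disk layout of the save file is not shown here):

```python
def refilter_dps(xs, old_dp, new_dp, bits=256):
    # Keep only the distinguished points that still qualify under the
    # higher DP value: the top new_dp bits of x must be zero.
    # Every point kept here was already a DP under old_dp.
    assert new_dp >= old_dp
    mask = ((1 << new_dp) - 1) << (bits - new_dp)
    return [x for x in xs if x & mask == 0]
```

The trade-off is that the discarded DPs represent lost work, so this only makes sense when the file size matters more than the lost points.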
I tested dir merge on a PC with 24 GB of RAM and 10 dir files that were probably 500 MB apiece, but I didn't check the RAM usage.
Alek76's version is similar to what you are talking about as far as separating files. In his current version, 8 text files are generated: 4 tame and 4 wild. I modified it a little and used 2 tame and 4 wild. Then he has a Python comparator that compares all the files to check for a solved key. I have been trying to figure out how to merge that with JP's (this) version, but I can't read the files well enough to understand how to build the Python comparator.
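The comparator side can be sketched independently of the exact file layout. The version below assumes a hypothetical text format where each line holds a hex x-coordinate and a hex distance; the real Alek76 and JLP formats may differ, so this only shows the collision-detection logic (same x seen from both a tame and a wild file):

```python
def find_collision(tame_lines, wild_lines):
    # Hypothetical line format: "<x_hex> <distance_hex>".
    # Index the tame DPs by x, then scan the wild DPs for a match.
    tames = {}
    for line in tame_lines:
        x, d = line.split()
        tames[x] = int(d, 16)
    for line in wild_lines:
        x, d = line.split()
        if x in tames:
            # Collision found: the private key is recovered from the two
            # distances (the exact formula depends on how the tame and
            # wild start points were chosen).
            return x, tames[x], int(d, 16)
    return None
```

In practice each of the tame files would be indexed into the same dictionary before scanning the wild files, so all 8 (or 6) files are compared in one pass.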