@WanderingPhilospher Please explain: if I divide a public key whose range is 2^255 by 0, 1, 2, 3, 4, 5, ... up to 2^155, is it much more likely to end up with a range of 2^100 than, for example, to divide a public key with a range of 2^255 by 2^235 and end up with the smallest range of 2^20?
I don't really understand number theory. Does all this have to do with modular multiplication? That is, is getting 2^100 more efficient and more probable in theory because the modulus is 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141?
I'm not sure I understand your question 100%, but:
dividing doesn't change the probability. It just shrinks the search range BUT multiplies the number of pubkeys you have to search for in that smaller range.
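To make that trade-off concrete, here is a minimal scalar-only sketch. It works on the private-key side for illustration (on the curve you would do the same arithmetic on the public key point, multiplying by the modular inverse of k); the values d and k are made-up examples, and n is the secp256k1 group order from the question above.

```python
# Why dividing shrinks the range but multiplies the candidates.
# secp256k1 group order (the modulus asked about above):
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

d = 0xDEADBEEF        # hypothetical private key, somewhere in [0, 2**32)
k = 1000              # how much we want to shrink that 2**32 range by

inv_k = pow(k, -1, n)  # "dividing" mod n means multiplying by the inverse

# (d - i) / k mod n for every possible remainder i. In practice you can't
# see d, so you must generate ALL k candidate pubkeys, (P - i*G) * inv_k
# for i = 0 .. k-1, and search for every one of them in the small range.
candidates = [(d - i) * inv_k % n for i in range(k)]

small_range = 2**32 // k + 1
hits = [c for c in candidates if c < small_range]

# Exactly one candidate (the one with i = d mod k) lands in the shrunken
# range, and it equals d // k; the other k-1 land somewhere in [0, n).
print(len(hits), hits[0] == d // k)
```

So the range really does get k times smaller, but only at the cost of having k pubkeys to look for, which is the trade-off described above.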
With Pollard's Kangaroo, starting from scratch (meaning no previously saved tame and wild points), I think it will take you longer to solve, unless you get lucky and the pubkey is toward the beginning of the range.
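For anyone unfamiliar with the tame/wild terminology, here is a toy kangaroo sketch in a small multiplicative group. It is not how the real tools work (they run on secp256k1 and use distinguished points instead of storing every tame position, which only scales at toy sizes); all parameters here are made up, relying on the known fact that 3 is a primitive root mod 65537.

```python
# Toy Pollard's kangaroo: a "tame" walk from a known exponent and a
# "wild" walk from the unknown target, using the same pseudorandom jumps
# so the two paths merge once they touch.
p = 65537          # prime; 3 is a known primitive root mod 65537
g = 3
b = 4096           # we know the secret exponent x lies in [0, b]
x = 2891           # the "private key" we pretend not to know
h = pow(g, x, p)   # the "public key"

def jump(y):
    # pseudorandom jump size determined only by the current position
    return 1 << (y % 4)          # jumps of 1, 2, 4 or 8

# Tame kangaroo: starts at the top of the range (known exponent b) and
# records every position it lands on together with its travelled distance.
tame = {}
pos, dist = pow(g, b, p), 0
for _ in range(3000):
    tame[pos] = dist
    j = jump(pos)
    pos = pos * pow(g, j, p) % p
    dist += j

# Wild kangaroo: starts at the unknown h and hops with the SAME jump
# function; when it lands on a recorded tame position, the distances
# reveal the secret exponent.
pos, dist = h, 0
found = None
for _ in range(50000):
    if pos in tame:
        found = (b + tame[pos] - dist) % (p - 1)
        break
    j = jump(pos)
    pos = pos * pow(g, j, p) % p
    dist += j

print(found)   # recovers x
```

Saved tames from a previous run are exactly the `tame` dictionary here: reuse it and the wild walk is all that remains, which is why starting from scratch is the slow case.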
With BSGS, you save a little time searching, say, 100 pubkeys versus 1 pubkey, because you don't have to generate the baby steps 100 times (the most time-consuming part of BSGS).
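A toy baby-step/giant-step sketch makes that reuse visible: the baby-step table is built once (the expensive part) and then queried for as many targets as you like. The small group and the example exponents below are made up for illustration.

```python
# Toy BSGS in the multiplicative group mod a small prime.
from math import isqrt

p = 65537
g = 3                       # primitive root mod 65537
order = p - 1

m = isqrt(order) + 1        # about sqrt(order) baby steps

# Baby steps: g^0 .. g^(m-1) -> exponent. Built ONCE (the slow part).
baby = {}
e = 1
for j in range(m):
    baby[e] = j
    e = e * g % p

giant = pow(g, -m, p)       # g^(-m), the giant-step multiplier

def bsgs(h):
    """Find x with g^x = h (mod p), reusing the prebuilt baby table."""
    y = h
    for i in range(m):
        if y in baby:
            return i * m + baby[y]
        y = y * giant % p
    return None

# Solving many targets only pays the table cost once:
targets = [pow(g, e, p) for e in (123, 4567, 65000)]
print([bsgs(h) for h in targets])   # -> [123, 4567, 65000]
```

Each extra pubkey only adds the (cheap) giant-step loop, which is why 100 pubkeys cost barely more than 1 once the table exists.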
With brute force like bitcrack, it would save time, but the more pubkeys you have in your input file, the lower the overall speed, so there is very little speedup, if any. Bitcrack is also exponentially slower than Kangaroo or BSGS even when searching for a single pubkey/address, so it would not be an option for me at all.
But the biggest thing to remember is whether you can even write that many pubkeys to a file. If you want to shrink the range by more than 2^26, it starts to take a while, and becomes unfeasible, to write all of those pubkeys to an input file. Just my 2 cents.
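The back-of-the-envelope file-size arithmetic behind that 2^26 figure: a compressed secp256k1 pubkey is 33 bytes, i.e. 66 hex characters per line in a text input file (67 bytes with the newline, which is the assumption below).

```python
# Rough input-file size for 2**split_bits pubkeys stored as hex lines.
def input_file_bytes(split_bits, bytes_per_line=67):
    """66 hex chars per compressed pubkey plus a newline, assumed."""
    return (1 << split_bits) * bytes_per_line

for bits in (20, 26, 30, 40):
    gib = input_file_bytes(bits) / 2**30
    print(f"2^{bits} pubkeys -> ~{gib:,.1f} GiB")
```

At 2^26 keys you are already writing roughly 4 GiB of text, and every extra bit doubles it, which is why splitting much past that stops being practical.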