Hashing algorithms are irreversible because they 'lose' data. Consider some simple circuit logic:
1101001101010110 XOR 0110101000011000 = 1011100101001110. However, given only 1011100101001110, you couldn't tell me, with certainty, which two inputs were XOR'd together. To simplify this further, take the example of multiplication. If I give you 6 and 9, you can quickly tell me '54'. However, if I gave you 54, you couldn't tell me which two numbers I multiplied together. You could guess, and you could give me a list of possibilities, and one of them would likely be correct. Now consider doing this 'stacked' thousands of times. The idea is that, as you try to guess all the possible inputs, and all the possible inputs that could have created those inputs, your search area grows exponentially. This is, of course, a very loose simplification of the internals of a hashing function, but it gives an idea of how they work. If someone did figure out some sort of shortcut for the algorithm, they could mine much faster, but this isn't a likely possibility. And if someone were able to find some method of actually reversing SHA256, then a coin using multiple hashes would certainly be more secure.
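To make the XOR point concrete, here's a minimal sketch (using 4-bit values for brevity) showing that every possible first input yields a valid pair producing the same output, so the output alone can't tell you which pair was used:

```python
# Enumerate all 4-bit input pairs whose XOR equals a given output.
# For any first input a, the second input must be a ^ output,
# since a ^ (a ^ output) == output. The output alone cannot
# identify which of the 16 pairs was actually used.
output = 0b1011
pairs = [(a, a ^ output) for a in range(16)]

print(len(pairs))  # 16 candidate pairs, all equally valid
assert all(a ^ b == output for a, b in pairs)
```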
For example, say we were just considering a normal password, rather than something for bitcoin. Say our password is 'passw0rd12345' (pretty great, right?). Hashing functions aim to behave like one-to-one mappings onto their output space, but they can't truly be one-to-one, since the inputs outnumber the outputs. If a hash output is 16 hexadecimal characters long, it has a total of 16^16 possible outputs. However, you need to understand the idea of collisions, and the idea of entropy. Entropy is the 'randomness', for lack of a better term, of a system. It's a measure of chaos. If I have an ACTUALLY random string of 100 characters, each of which can be either an a or a b, I have 100 bits of entropy, or a sample space of 2^100. Now, SHA256 would have 256 bits of entropy (16^64 = 2^256 possible outputs) if it existed as a perfect hashing function. A hashing function can only reduce entropy. If my input has more than 256 bits of entropy, it is automatically reduced to the entropy of the hashing function's output. Since SHA256 isn't perfect and doesn't fully live up to its potential entropy (aka it has collisions, where two inputs give the same output, although none has yet been found), each SHA256(string) has some loss of entropy. If we continually feed data into a SHA256 loop (SHA256(SHA256(SHA256(SHA256......(string)....)))) we slowly lose entropy.
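This shrinking effect can be made visible with a toy model. The sketch below truncates SHA-256 to a single byte so that collisions (which no one has found for full SHA-256) actually show up; the truncation just stands in for a hash function's imperfection:

```python
import hashlib

def tiny_hash(data: bytes) -> bytes:
    """Toy 1-byte hash: SHA-256 truncated so collisions become visible."""
    return hashlib.sha256(data).digest()[:1]

# One round: feed in every possible 1-byte value. Some inputs collide,
# so the set of reachable outputs is smaller than the set of inputs.
inputs = {bytes([i]) for i in range(256)}
outputs = {tiny_hash(x) for x in inputs}
print(len(inputs), len(outputs))  # fewer outputs than inputs

# Iterating the hash keeps shrinking the reachable set -- entropy loss.
reachable = inputs
for _ in range(10):
    reachable = {tiny_hash(x) for x in reachable}
print(len(reachable))
```

The same thing happens with full SHA-256, just far too slowly to observe directly, which is why deep hash loops only lose entropy gradually.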
Having six or seven hashing functions would ensure that, if some of the hashing functions were compromised, the network would continue to function properly. For example, if you had a(b(c(d(e(string))))), where a, b, c, d, and e are all hashing functions, and b was compromised, that quite clearly doesn't compromise the entire system. But if you get too complex with the layering of hashing functions, you can carelessly lose entropy. To make such a coin, someone would have to make sure that all the hashing algorithms provide the required entropy, and that none of them critically reduces entropy due to a bug. It's certainly an idea worth implementing for someone who has the skills and knowledge, but it would just have to be done carefully.
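A layered construction like a(b(c(d(e(string))))) can be sketched with real functions from Python's hashlib; the particular choice and order of algorithms here is purely illustrative:

```python
import hashlib

# Illustrative layering: each function's digest feeds the next.
# If any single layer were broken, an attacker would still face
# the remaining layers. Order and choice of functions are arbitrary.
LAYERS = [hashlib.sha256, hashlib.sha3_256, hashlib.blake2s, hashlib.sha512]

def layered_hash(data: bytes) -> bytes:
    for h in LAYERS:
        data = h(data).digest()
    return data

digest = layered_hash(b"passw0rd12345")
print(digest.hex())
print(len(digest))  # 64 bytes: the final layer here is SHA-512
```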
Additionally, I think the idea of multiple algorithms has another interesting use--ASIC-resistant networks. Imagine a coin which could reorder the hashing algorithms and vary their complexity, output length, entropy, etc. based on blockchain data. For example, say we have six hashing functions, a-f. For 100 blocks, the network could use a(c(d(e(a(c(d(b(a(f(f(e(f(c(info)))))))))))))). Then, the network could switch to a(d(f(e(c(a(f(d(b(b(b(b(a(c(info)))))))))))))). It would be hard to design an ASIC to do this. However, it's still perfectly doable. Now, imagine if the network could change the actual algorithm behind a. Maybe a would have an injected starting value of the last 16 bits of the retarget block. Or perhaps the order of XOR, AND, and OR in the algorithm would be remixed depending on orders dictated by some pre-determined number of bits in the retarget block. Every client would be able to calculate the required hashing algorithm data and could mine with it. GPUs and CPUs would adapt to changes fairly easily, although mining software might be a bit of a PITA to write. However, creating an ASIC capable of changing the very circuit logic that backs the algorithms would be nearly impossible, given the static nature of ASIC chips.
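A rough sketch of the reordering idea: derive the schedule of hash functions from bits of the previous retarget block's hash. The pool of six functions, the 14-deep schedule, and the derivation rule below are all made-up illustrative choices, not a real coin's design:

```python
import hashlib

# Hypothetical pool of six hash functions, playing the role of a-f.
FUNCS = {
    0: hashlib.sha256,   1: hashlib.sha512,  2: hashlib.sha3_256,
    3: hashlib.sha3_512, 4: hashlib.blake2b, 5: hashlib.blake2s,
}

def schedule_from_block(block_hash: bytes, length: int = 14) -> list:
    """Derive an ordering of the six functions from retarget-block bytes."""
    return [FUNCS[b % 6] for b in block_hash[:length]]

def mixed_hash(data: bytes, schedule) -> bytes:
    """Apply the derived schedule of hash functions in sequence."""
    for h in schedule:
        data = h(data).digest()
    return data

# Every client can recompute the same schedule from public chain data.
prev_block = hashlib.sha256(b"retarget block 100").digest()
sched = schedule_from_block(prev_block)
print(mixed_hash(b"info", sched).hex())
```

Since the schedule is a pure function of on-chain data, every node derives the same ordering independently, which is what lets the whole network switch in lockstep at each retarget.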
People keep complaining about not having a way to be ASIC-resistant. If a developer was willing to put in the time and money, they could certainly create a coin which could never have ASICs created for it (unless the idea of ASICs were reinvented).
What a great post that is!! Hopefully people will read it and become more educated! Kudos!