I boldfaced to emphasize the importance of circuitry rather than quantum effects. I don't think they are contradictory statements, I just do not understand:
1) Why serial processing with large enough memory can be a proxy for parallel processing (I may not understand the role of the large enough memory here, so please explain)
How much do you know about single-tape Turing machines? Nothing could be more serial. And they are capable of replicating the behavior of any computer, parallel or serial.
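To make the "serial proxy for parallel" point concrete, here is a minimal sketch (my own illustration, not anything from this discussion) of a strictly serial loop emulating one simultaneous update of many units. The network, the threshold rule, and all names are invented for the example; the point is only where the "large enough memory" comes in.

```python
def parallel_step(state, weights):
    """One 'parallel' tick of a toy network, computed serially.

    The extra memory is the whole trick: we keep a second copy of
    the entire old state (double buffering), so every unit is
    updated as if all units fired simultaneously, even though a
    single processor visits them one at a time.
    """
    new_state = [0.0] * len(state)              # the second copy = the "large memory"
    for i in range(len(state)):                 # strictly one unit at a time
        total = sum(w * s for w, s in zip(weights[i], state))
        new_state[i] = 1.0 if total > 0.5 else 0.0
    return new_state

# Two units that excite each other; activity hops back and forth
# exactly as it would on truly parallel hardware.
weights = [[0.0, 1.0],
           [1.0, 0.0]]
state = [1.0, 0.0]
state = parallel_step(state, weights)
```

Without the second buffer, unit 1 would see unit 0's already-updated value and the serial order would leak into the result; with it, the serial machine's output is indistinguishable from a parallel one's.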
2) How such a system would be able to replicate the dynamics.
Neural circuitry is not at all static. See this video of spines/filopodia changing over the course of hours. So, any simulation would also have to account for these dynamics, and alter its algorithms with respect to previous activity as time passes. That is in addition to all the receptor desensitization and trafficking, etc. If you take the simulation out long enough, it should also account for the susceptibility of certain cell types to different types of damage, e.g. mutations (due to differing epigenetic patterns, oxidative stress), etc., as the brain ages. Astrocytes, microglia, and the vasculature also play crucial roles in maintaining and altering neural circuitry.
Any Turing machine (serial) could replicate all of the above, to any degree desired. Memory is the only limitation.
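The same point can be sketched for the dynamics objection: on a serial machine, the circuitry itself is just more stored state, updated by the same loop. Below is a hypothetical toy (my own, not from this discussion) where a simple Hebbian-style rule stands in for spine growth and retraction; it is not a model of real biology, only a demonstration that structural change poses no special problem for serial hardware.

```python
def simulate(state, weights, ticks, learning_rate=0.1):
    """Serially simulate a toy network whose wiring changes over time.

    Each tick has two phases, both run one element at a time:
      1) activity update (double-buffered, as in any serial emulation
         of a parallel step);
      2) structural update: a toy Hebbian rule standing in for spine
         dynamics -- co-active units strengthen their connection.
    Receptor trafficking, damage, glial effects, etc. would just be
    more variables updated in the same way.
    """
    for _ in range(ticks):
        # Phase 1: compute all new activities from the OLD state.
        new_state = []
        for i in range(len(state)):
            total = sum(w * s for w, s in zip(weights[i], state))
            new_state.append(1.0 if total > 0.5 else 0.0)
        # Phase 2: the "circuitry" itself changes as a function of activity.
        for i in range(len(state)):
            for j in range(len(state)):
                if i != j:
                    weights[i][j] += learning_rate * new_state[i] * state[j]
        state = new_state
    return state, weights
```

Nothing in the loop requires parallel or plastic hardware; the only cost of adding more dynamics is more memory and more serial steps per tick.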
So, I have my doubts that any static hardware (even using accurate dynamic algorithms) could be used to properly mimic a system like this. I am not saying these models are not useful, just that a "silicon brain", as you put it, would be a fundamentally different phenomenon. Therefore, any experience this silicon brain had of qualia would necessarily be fundamentally different.
It's difficult to answer your statement above when it conflicts with what I have been saying, yet, in its last sentence, concludes what I concluded. The first sentence of your statement is not well grounded; see my statements above about Turing machines. Your conclusions, however ill-founded, basically say what I said: one doubts the sense of such a machine being conscious. Unfortunately, your statements below confound the issue, due to your incorrect assumptions as to what consciousness is and what qualia are.
Essentially, the term qualia refers to an emergent property of a certain type of complex system (the brain). It is the way the brain responds to sensory input.
You are completely wrong here. That is not what qualia refers to. The way the brain responds to sensory input is not qualia; what you have described is a physical process. Qualia refers to the accompanying experience of the way things seem, which correlates with different processes occurring within the brain, typically a distributed activation of a subset of the brain's neurons. You might want to get familiar with the concept of Philosophical Zombies and their conceivability. Whether you buy into the conceivability of Philosophical Zombies or not is irrelevant; you still must understand what they are to fully discuss qualia. Note that we're not even talking about the possibility of Philosophical Zombies actually existing, merely their conceivability.
If we expand this to include silicon brains and countries, we are saying that qualia refers to an emergent property of a complex system that arises due to the system "sensing" a change in its environment.
Again, not quite right, but closer. You must detach qualia from the term "sensing". Sensing refers to a change in state due to incoming stimulation; that is not qualia. Qualia is the experience which accompanies sensing either internal or external changes in state.
According to that definition, any suitably complex system that changes in response to its environment will experience qualia, and can be deemed "conscious".
Nobody is really making any such claim. If it were so simple, we wouldn't have such discussions, and people such as Chalmers, Dennett, Penrose and Hofstadter would be writing basic texts of how things work, rather than how they hypothesize what might be happening.
Suitably complex can reasonably be defined as "equal to or more complex" than the human brain. So, then consciousness is a word we use to describe the complexity and responsiveness of a system to its environment;
Wrong. Consciousness is not a word we use to describe the complexity and responsiveness of a system to its environment.