Everyone writes something different here, which may also be due to how the question was phrased. Let me rephrase it: for syncing the blockchain, which combinations of CPU speed, internet speed, and hard drive speed are well matched? How can I investigate the relationships between them?
Summary: there is no generic rule - one would need to understand how the initial load of the blockchain works :-)
There are too many variables. One would have to set up a weighted matrix with all the dependencies... hence you only find general info, like this:
My Bitcoin Core node (a fully validating node) fetches all transactions from its peers and verifies them for correctness (see here:
https://en.bitcoin.it/wiki/Protocol_rules#.22tx.22_messages).
This involves many, many lookups of previous transactions in the local blockchain, hence the dependency on the hard disk. I have seen situations where an SSD provided a five-fold speedup.
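To see why validation is so lookup-bound, here is a minimal sketch with toy data structures (hypothetical simplification, not Bitcoin Core's actual code): every input of a transaction forces a lookup of a previously created output, and on a real node each of those lookups can hit the on-disk chainstate database.

```python
# Toy illustration: validating a transaction means looking up every
# referenced previous output first. On a real node these lookups hit
# the chainstate database on disk, which is why disk speed matters.
# (Hypothetical simplified structures, not Bitcoin Core's real code.)

# "UTXO set": maps (txid, output_index) -> value in satoshis
utxo_set = {
    ("aa11", 0): 50_000,
    ("bb22", 1): 30_000,
}

def validate_tx(inputs, outputs):
    """Check that all inputs exist and that inputs cover outputs."""
    total_in = 0
    for prev_txid, index in inputs:
        value = utxo_set.get((prev_txid, index))  # one lookup per input
        if value is None:
            return False  # references an unknown or already-spent output
        total_in += value
    return total_in >= sum(outputs)

print(validate_tx([("aa11", 0)], [40_000]))  # True: valid spend
print(validate_tx([("zz99", 0)], [10_000]))  # False: unknown input
```

With millions of transactions, these random-access lookups dominate, which is why an SSD helps so much more than a faster CPU.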
With a quad-core CPU, the calculations are usually not the limiting factor. I had it running on a four-year-old Macintosh, and the CPU was never the bottleneck: verification was always faster than the data delivered by the other peers.
On bandwidth: I had a 10 Mbit/s connection and never ran into congestion. Au contraire: I didn't have "enough" peers to keep data flowing continuously.
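For a rough lower bound you can at least compute the pure download time: chain size divided by line speed. A small sketch (the 150 GB chain size is a hypothetical example, not a current figure):

```python
def pure_download_hours(chain_gb, line_mbit_s):
    """Lower bound: hours to download chain_gb gigabytes at line_mbit_s
    Mbit/s, ignoring verification, disk I/O, and peer availability."""
    bits = chain_gb * 8e9              # gigabytes -> bits
    seconds = bits / (line_mbit_s * 1e6)
    return seconds / 3600

# e.g. a hypothetical 150 GB chain on a 10 Mbit/s line:
print(round(pure_download_hours(150, 10), 1))  # 33.3 hours, download alone
# ... and on a 2 Mbit/s line:
print(round(pure_download_hours(150, 2), 1))   # 166.7 hours
```

The real sync time is much longer, because verification and disk I/O run on top of this - which is exactly what the Raspberry Pi experiment below showed.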
Then I did the opposite: I took a Raspberry Pi 2, attached it to a 2 Mbit/s line, and let it run for 3 weeks before it was nearly in sync with the blockchain. Then came the era of full blocks (near the end of 2017), and suddenly the RasPi would never completely sync. The CPU was probably too slow to verify the tx data, and the external disk (no SSD!) attached via USB could not cope with the number of requests.

I set up a monitor and measured CPU, processes (load averages), disk I/O, and network I/O. I could see that disk I/O never went down to zero, and the load average never dropped below 1.0 - it mostly stayed between 3 and 4. So there were many processes in the run queue that could not be scheduled on the CPU, because other tasks occupied it.
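The monitoring itself can be as simple as periodically sampling the load average in a loop. A minimal Unix-only sketch (os.getloadavg works on Linux and macOS, not on Windows; interval and sample count are arbitrary choices):

```python
import os
import time

def sample_load(samples=5, interval=1.0):
    """Record the 1-minute load average `samples` times, `interval`
    seconds apart. On the RasPi this never dropped below 1.0,
    showing the CPU run queue was permanently backed up.
    (Unix-only: os.getloadavg is unavailable on Windows.)"""
    readings = []
    for _ in range(samples):
        one_min, _, _ = os.getloadavg()
        readings.append(one_min)
        time.sleep(interval)
    return readings

loads = sample_load(samples=3, interval=0.5)
print("min load:", min(loads), "- saturated:", min(loads) >= 1.0)
```

For disk and network I/O you would additionally read the relevant counters (on Linux, e.g. /proc/diskstats), but the principle is the same: sample, log, and look for resources that never go idle.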
Summary: you have to build a test setup and measure all the details.
I believe the Core team does something similar; I recall they have a process that (re-)loads the blockchain on a daily basis (to ensure correctness of the system). But I haven't seen any statistics.