I really do like the idea of Vertcoin. Still, two things come to my mind:
1.) Claiming something is "ASIC-proof". No algorithm is that; you can always develop an ASIC. GPUs are ASICs, after all. Take the design of a GPU, strip everything not needed for Vertcoin mining, and that's your VTC ASIC. Of course, that ASIC would be far more complex than any SHA/BTC ASIC and would require loads of fast memory, making it very expensive (compared to SHA and even scrypt/LTC ASICs). But that leads straight to my second point.
2.) How bad are ASICs after all? With SHA, the entry barrier for developing an ASIC is low. With VTC, it would be much higher. Imagine VTC were as popular as BTC is now. I guarantee you that at least one person or group would be working on a VTC ASIC. And that's the problem: only a few entities would have the funds to do so, far fewer than with Bitcoin. In the end, this would lead to more centralization, not less.
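To put rough numbers on the "loads of fast memory" argument: scrypt's scratchpad grows linearly with N (roughly 128 · r · N bytes per hashing core), so every bump of the N-factor doubles the RAM an ASIC has to place next to each core. A minimal sketch, assuming the common N = 2^(n_factor + 1) convention; the N-factor values below are illustrative, not Vertcoin's actual schedule:

```python
def scrypt_memory_bytes(n_factor, r=1):
    """Approximate scrypt scratchpad size: 128 * r * N bytes,
    where N = 2 ** (n_factor + 1)."""
    n = 2 ** (n_factor + 1)
    return 128 * r * n

# Illustrative comparison: Litecoin-style scrypt uses N=1024
# (n_factor 9, ~128 KiB); higher N-factors balloon the per-core RAM.
for nf in (9, 11, 15):
    kib = scrypt_memory_bytes(nf) / 1024
    print(f"N-factor {nf:2d}: ~{kib:,.0f} KiB of scratchpad per core")
```

A SHA-256 core needs essentially no memory, which is why BTC ASICs are cheap to design by comparison; here the memory dominates the die area and the cost.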
I don't like that "hobby mining" is gone in the Bitcoin world, and I don't like how overpriced current ASICs are.
But I still think the Bitcoin network is better off with all the ASICs than the VTC network would be.
No worries. The coin is software; ASICs are hardware. A simple algorithm tweak proposed by the VTC developers and agreed upon by the community will shake off an ASIC. Even a change to the N-factor schedule would wreak havoc on ASIC makers' business plans. As long as Vertans say ASICs are not welcome here, we'll be able to preserve the "hobby mining" angle for a good while.
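The N-factor schedule mentioned above is just a function from block time to a memory parameter, which is why changing it is a pure software tweak that strands any ASIC built for a fixed scratchpad size. A sketch of a time-based schedule in the spirit of scrypt-adaptive-N; all constants here (genesis timestamp, bounds, doubling period) are made up for illustration, not Vertcoin's real values:

```python
# Hypothetical schedule constants -- illustrative only.
GENESIS_TIME = 1_389_306_217            # assumed genesis timestamp (Unix time)
MIN_N_FACTOR = 10                       # N-factor at launch
MAX_N_FACTOR = 30                       # hard ceiling
DOUBLING_PERIOD = 2 * 365 * 24 * 3600   # raise N-factor by 1 every ~2 years

def n_factor(block_time):
    """Return the N-factor in force at a given block timestamp.
    N itself is 2 ** (n_factor + 1), so each +1 step doubles the
    memory an ASIC must provision per hashing core."""
    elapsed = max(0, block_time - GENESIS_TIME)
    steps = elapsed // DOUBLING_PERIOD
    return min(MAX_N_FACTOR, MIN_N_FACTOR + steps)
```

The point is that a hard fork only has to change these constants (or the function itself) and every node follows along, while hardware with its scratchpad RAM sized for the old N cannot.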
Edit: BTW I don't like the ASICs at all, mostly because of the incredible disruption they caused. There aren't many companies that can make ASICs this complex, and when they do have working units they extensively "test" them before shipping them out. That's what I think BFL and the like did.
HinnomTX,
ShadesOfMarble does have a point though, IMO. ASICs (and FPGAs) sit somewhere between hardware and software. The GPU (the core itself) is basically an ASIC, just with lots of functionality that's not required for mining.
So if you could strip the unneeded parts from a graphics card's GPU, you could create a mining-specific PCB with higher efficiency and probably lower power requirements. And you could change the N-factor just as easily.
I think that would require an incredible amount of engineering though; you'd basically be doing AMD's and Nvidia's work. To me that doesn't sound feasible.
Unless AMD and Nvidia themselves started producing those miners. Then again, the downside of anything application-specific is that it's completely useless if this whole *coin world collapses.
I don't think that will happen, but for a company this is probably a big risk.