The 750Ti still seems like a really good choice imo.
I guess prices will also go down a bit now that new cards are showing up?
I'm buying used 750Tis myself for like 80 bucks.
All my 750Tis produce around 7.1-7.3 MH/s @ Quark, super stable.
True. But when you consider that you can only fit so many cards in one box, performance per card starts to matter a bit more. One box with 6x 960 should get about the same (probably better) hash/watt as two boxes with 6x 750Ti each, but with a significantly lower upfront cost, since you only buy one motherboard, CPU and PSU instead of two.
You have to think outside of the box!
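To make that concrete, here's a quick back-of-the-envelope comparison in Python. The ~7.2 MH/s @ Quark figure for the 750Ti comes from this thread; the 960 hashrate, per-card wattages and the per-box overhead are illustrative guesses, so plug in your own measured numbers.

Code:
def box(cards, mhs_per_card, watts_per_card, base_watts=60):
    # hashrate (MH/s) and wall power (W) for one rig;
    # base_watts covers CPU/board/PSU overhead (assumed)
    return cards * mhs_per_card, cards * watts_per_card + base_watts

h960, w960 = box(6, 14.0, 120)  # 6x 960: assumed 14 MH/s and 120 W per card
h750, w750 = box(6, 7.2, 55)    # 6x 750Ti: ~7.2 MH/s from this thread, 55 W assumed

print(f"one box, 6x 960    : {h960:.1f} MH/s, {w960} W, {h960 / w960:.3f} MH/W")
print(f"two boxes, 6x 750Ti: {2 * h750:.1f} MH/s, {2 * w750} W, {h750 / w750:.3f} MH/W")

With these (assumed) numbers the hash/watt comes out roughly equal, but the 960 box needs half as many boards, PSUs and frames.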
-- SNIP --
Hahaha! Nice setup! How is the cooling in open air like that?
The "case" wasn't really the point of my comment though. CPUs, motherboards and all of the rest of the stuff is mostly irrelevant when it comes to hashrate. So, IMO, spending money and power on them is wasteful. I just snagged a
http://www.asrock.com/mb/Intel/Q1900M. Performance is the same on that puny 10W quad-core with a 960 and 750Ti installed as it is in my workstation with 84W i7-4770, 32G RAM, etc. Now if I can get a hold of some PCIe port multipliers and scale that out to 12 GPUs without performance degradation, I think I'll be onto something...
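For perspective, here's a minimal sketch of what that CPU swap alone saves. The TDPs are from the spec sheets (actual wall draw will differ), and the electricity price is an assumption:

Code:
# Yearly savings from running the board on a 10 W Celeron J1900
# instead of an 84 W i7-4770. TDP difference, not measured draw.
watts_saved = 84 - 10
kwh_per_year = watts_saved * 24 * 365 / 1000
price_per_kwh = 0.12  # assumed electricity price, USD
print(f"{kwh_per_year:.0f} kWh/year, ~${kwh_per_year * price_per_kwh:.0f}/year saved")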
Cooling is no problem at all! In the summer I have an oscillating fan blowing on the rig. In the winter it's balls-to-the-wall mining! It is cold in the garage! If you get a port multiplier working, please let me know how. I will definitely add more GPUs. However, I read Windows can only handle 8 GPUs. Do you know differently?
it seems that with a mod in windows you can add more ... linux handles it most easily ...
i cant remember exactly where the post is - but i know it was in the sgminer-dev thread that the guy proved it can be done ... have a read through, as im not in a position to find the exact post - apologies ...
port multipliers are great IF the system is capable of handling them ... i have tested a few - and run into dead ends with the drivers ( stock and without modification ) for amd - but have no idea about nvidia ...
after the amount of work involved in trying to set it all up - and i dont mean a few hours - it was more beneficial for me to spend the time ( and money ) on growing the farm in other ways ... even at the cost of a little bit more power for more systems - and i mean a 'little bit' more power ...
from what ive seen also - if the miner has the cpu actively mining on thefarm - it has very little effect on the hashrate - but a HUGE effect on the power consumption ... cpus chew too much power for the little gain that you get from them ...
when you consider that two of our gigabyte 750ti oc lp cards chew through about 100 watts of power and output about 5700 kH/s - while the cpu alone chews through 95 watts on its own and increases the hashrate by approx 10 kH/s ... id rather put another two gpus in ...
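as a worked example - here is that maths in a few lines of python - using only the numbers from this post ( your cards and cpu will differ ) ...

Code:
# Two Gigabyte 750Ti OC LP cards: ~100 W for ~5700 kH/s.
# The actively mining CPU: ~95 W for only ~10 kH/s more.
# All figures are from this post, not general benchmarks.
gpu_pair = 5700 / 100   # = 57 kH/s per watt
cpu      = 10 / 95      # ~= 0.1 kH/s per watt
print(f"gpu pair: {gpu_pair:.1f} kH/s per W")
print(f"cpu     : {cpu:.2f} kH/s per W (~{gpu_pair / cpu:.0f}x worse)")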
for the moment though - density is the limiting factor BUT is not an issue for thefarm ... the rebuild continues ...
btw ... the further you delve into a 'custom' system that requires mods and changes - the more issues you will have when it comes time for maintenance and break-fixing ... its why we stay with stock and standard for a 7 minute replacement ( including the OS, using images ) rather than a hairpulling hour or more fixing a break ... all this at the small cost of some extra electricity used by more systems ...
so agreed that this method WILL use more power per machine - but there is VERY little downtime using a stock system ... which means the machine gets back up and mining much more quickly ...
thats a compromise we are willing to accept ...
#crysx
This fork of ccminer supports 32 GPUs in one rig. I have only tested it successfully with 7 cards. Motherboards that have 7 PCIe slots are more expensive than the BTC boards with 6 slots. I can get two 6-slot boards for the price of one 7-slot board.
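If you're scaling up like that, it's worth sanity-checking that the OS actually sees every card before pointing ccminer at them. A minimal Python sketch, assuming the NVIDIA driver and its stock nvidia-smi tool are installed (nothing here is specific to this fork):

Code:
# List every GPU the driver can see; the indexes are what you
# would pass to ccminer's -d flag to pick devices.
import subprocess

out = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=index,name", "--format=csv,noheader"],
    text=True,
)
gpus = [line.split(", ", 1) for line in out.strip().splitlines()]
for idx, name in gpus:
    print(f"GPU {idx}: {name}")
print(f"{len(gpus)} GPUs visible")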
You should pick up a cheap bitcoin board like this:
http://www.asrock.com/mb/Intel/H61%20Pro%20BTC/
http://www.asrock.com/mb/Intel/H81%20Pro%20BTC/
And then buy a cheap Celeron CPU. 4 GB of RAM is enough.
With the H61 and H81 you don't need powered riser cables, since you plug extra power directly into the boards.
For a 6x Ti rig, cheap risers like this one are enough:
http://www.dx.com/p/pci-e-1x-to-16x-riser-card-extension-cable-15-5cm-length-100061#.Ve02T_ntlBc
sp ...
those motherboards ( http://www.asrock.com/mb/Intel/H81%20Pro%20BTC/ ) are the boards we are rebuilding the farm with ... we have 40 of them ...
these are intel based systems and do indeed have 6 pcie ports of various speeds ( x1 x8 x16 ) ...
the other motherboard is an amd one and as far as im aware it only has 5 pcie slots ... even though it states 6 ... i dunno ...
they have now been discontinued - so we purchased all the stock of the h81 pro intel board they had here ... there are still many floating around though - so we will pick up a few more if we can ...
they are reasonably reliable - but dont have the same quality as higher end boards - which is why i must disagree with you on the power side of it ...
having the cards plugged INTO the board is the only way to access the power from the motherboard ... they are way too close together and you cant fit the 6th gpu this way - so you only end up with 5 ( or 4 if you use large cards ) ...
when you are using risers - those cheap ones are VERY fickle ... unless you are using pcie x16 risers - you will NOT get the full 'on board power' that is supplied by the motherboard ...
we opted the other way - and use pcie x1 usb 3.0 powered risers - which means less load on the two 4pin molex connectors on the motherboard ( which have been known to melt under high load - like on thefarm ) - AND you can fit all 6 gpus on the board ...
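a rough power-budget sketch of that trade-off in python ... the per-card draw and the molex ceiling here are assumptions - not measurements from thefarm - so plug in your own numbers ...

Code:
# Why powered risers take the strain off the board.
CARD_W  = 55    # assumed draw of one 750Ti, all via the slot (no 6-pin)
MOLEX_W = 132   # commonly quoted ceiling for one 4-pin molex (assumed)
cards   = 6

board_load = cards * CARD_W   # unpowered risers: everything via the board
per_molex  = board_load / 2   # boards like the H81 Pro BTC feed slots from two molex
print(f"unpowered risers: {board_load} W via the board, {per_molex:.0f} W per molex")
print(f"powered usb risers: ~0 W via the board - cards fed straight from the PSU")

with those assumptions each molex would be carrying well over its ceiling - which lines up with why they melt ...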
with your ccminer fork ( ccminer-spmod ) - they run very well ... we dont oc though - as stock rates are what keep these things going without touching them ... a new aluminium frame has been designed and is almost finished ( this has taken MONTHS to get right ) - it will house the motherboard and gpus and risers and psu and ssd and all cables in an orderly fashion ... if there is any downtime - they are easily accessible and can be unplugged and replaced / repaired ...
those cable risers are deadly for the system if you are unlucky enough to have one that is faulty - especially if it has no power boost soldered to it ( and even those are quite flakey at best ) ... some will run for months as ours did when we started with the amd gigabyte 7970 and 280x oc cards - but heat and consistent use batter these cables to bits ... hence why it is highly advisable to spend that extra cash on the pcie x1 usb 3.0 risers ... apart from being longer and much more sturdy - they can be cable managed much more readily ... a huge thing if you need order and stability ...
we chose the gigabyte 750ti oc lp cards to run within thefarm - as you are well aware ... but if we wanted to upgrade all the gpus to - say - gigabyte 980ti oc cards? ... they draw a lot more power than the 750ti and generate a lot more heat ... so the little wired risers will get overloaded and wont cope - as well as becoming quite brittle and hard to use over time ... there is no such issue with the usb risers - they have been running non stop for a long time now ...
this is just MY opinion - as many have theirs - and as you have yours ... this is the methodology i use in our farm - and thefarm couldnt cope in its present state had it been built any other way - especially as we are growing now ...
btw - awesome work on the latest commit - and to the devs that have contributed ... it even compiles properly under cuda 7.0.28 and fedora 22 x64 ... the hashrate is lower - but it compiles and x11 has no cpu validation errors ...
will be showing my appreciation when things settle down here a little and im not bleeding btc to get things done ...
...
#crysx