
Topic: Data Center Mining Garage and Man Mining Cave - page 7. (Read 49463 times)

legendary
Activity: 2800
Merit: 1012
Get Paid Crypto To Walk or Drive


@ Debitme - That's definitely the issue so no worries.  





HUZZAH!!!  It is working!  Thank you for the help!
rjg
newbie
Activity: 29
Merit: 0
Wow...  Thanks for sharing all of your experiences and great ideas.  I am learning quite a bit by reading and following along.

I do have a question that seems appropriate based on your garage experiences.  I am in Little Rock, AR, so a bit further north than you but seemingly just as hot, just not nearly as humid (speaking from my experience visiting Houston in July  Shocked ).

I currently have 48 GPUs in the garage (double car) in an open air config and will be adding another 24 GPUs before my power capacity is tapped out.  I am using metal bladed fans blowing on the GPUs which seems to be doing the job at this point with the garage door cracked open about 8" or so and the attic access opened up.

I have ordered and received 4 of the Windeevent (https://windeevent.com/) units and I am trying to determine the sizing for a fan in the ceiling of the garage to exhaust the hot air into the attic.  I have been looking specifically at the QuietCool (https://quietcoolsystems.com/) whole house fans due to their easy installation, warranty, and very efficient motors.

The information I have found suggests that I will need about 1.5 sq ft of intake for each 1,500 CFM. At this point, with the 4 vents, I have just over 4 sq ft, so I have enough intake to support a bit over 3,000 CFM.

Can you offer suggestions on how many CFM I should be shooting for? I am thinking at least 1,500, and more likely something around 3,000, for the estimated 3,500 cu ft of garage volume.
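As a rough way to sanity-check the sizing, here is a minimal sketch in Python that works through the numbers above. The 3,500 cu ft volume, the ~4 sq ft of intake, and the 1.5 sq ft per 1,500 CFM rule are taken from this post; the 60 air-changes-per-hour target is only an assumed figure for illustration, so adjust it to taste.

```python
# Rough fan-sizing sketch (not a definitive method): estimate exhaust CFM from
# garage volume and an assumed air-change target, then check it against the
# "1.5 sq ft of intake per 1,500 CFM" rule quoted above.

GARAGE_VOLUME_CUFT = 3500   # estimated garage volume (from the post)
INTAKE_AREA_SQFT = 4.0      # four vents, just over 4 sq ft (from the post)
TARGET_ACH = 60             # assumed target air changes per hour (hypothetical)

# CFM needed to hit the target air-change rate
required_cfm = GARAGE_VOLUME_CUFT * TARGET_ACH / 60

# Maximum CFM the intake area supports under the 1.5 sq ft per 1,500 CFM rule
intake_limited_cfm = INTAKE_AREA_SQFT / 1.5 * 1500

print(f"CFM for {TARGET_ACH} air changes/hour: {required_cfm:.0f}")
print(f"CFM supported by {INTAKE_AREA_SQFT:.1f} sq ft of intake: {intake_limited_cfm:.0f}")
print(f"Fan size to shoot for: {min(required_cfm, intake_limited_cfm):.0f} CFM")
```

With those inputs it lands around 3,500 CFM, limited by the fan rather than the intake, which is in the same range as the 3,000 CFM guess above.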

Thanks in advance for your help!

Robert
legendary
Activity: 1834
Merit: 1080
---- winter*juvia -----

I have definitely considered doing this, but was advised against it by people MUCH smarter than me Wink  To be honest, getting a 7-GPU rig stable can be challenging enough at times, and it limits how much OC'ing you can do on individual cards as it is, so I am happy with them as they stand without another avenue for frustration, instability, or failures. If you do implement it on a larger scale, let me know how it works for you!

I can't agree more with you, finksky. 7+ GPU rigs are very hard, if not impossible, to make stable, and they usually require a much higher-end motherboard (to have more PCIe lanes), so you lose quite a bit of the cost you save, not counting the hundreds of headaches. 6x is the way to go IMHO  Wink

I think the 7-GPU rig really depends on the GPU type.
More forgiving cards like the R7 370s are not so bad, and that is the GPU config I got working on an H81 Pro BTC mobo with a 3-to-1 PCI splitter. This was one of my first rigs after the 390/290 rigs (370s were on sale cheap back then). The rig is still in production, and although it's not really that much of a speedster vs. 480s, Nanos, 390s, and 290s, it's still a decent rig with amazing uptime: very stable, and it doesn't even go beyond 65°C.

Attempting 7 or 8 GPUs under Windows or Linux with 480s or Nanos, I didn't get past the first attempt. Too many variables and hacks to consider to make it work. Then came the Pandaminer with 8x 480Ms -- now that's a beast. Suddenly there is no challenge in building 7/8-riggers anymore...
sr. member
Activity: 414
Merit: 251
@ bhugatti - I never even tried further, as I couldn't get Windows to see it or power the GPU, even with the M.2 adapter powered from Molex and my riser powered by Molex.  My buddy told me I need to disconnect another riser, do this, do that, change the BIOS, use Linux, blah blah.  Then I thought about instability and decided it was better not to waste my time.  It does work with Linux; there are people who have it running on the Gaming 5.

I am easily sidetracked.  It might be fun to re-investigate at a later date for one of my play rigs, but right now I just don't have any cycles.  The one you show on eBay doesn't appear to have the Molex power cable; you will need that.

@ m1n1ngP4d4w4n - Yeah, for the most part, 6-GPU rigs are the way to go for open air.  They're very easy to troubleshoot, and you lower your chance of having one GPU with a crap ASIC that can't handle the ROMs.  However, for server racks and large deployments, we have to squeeze the max out of each case due to the higher cost per rig and the space, network, rack, power cables, etc. required.  For the most optimal temperatures in server cases, 6 GPUs is also better.

However, for my setup the difference across 60 rigs when using the 7-GPU config vs. 6-GPU is over $5K in savings, plus one less rack.  Very surprising, right? Considering that the MSI is $65 more expensive than the ASRock.  However, you're able to squeeze in one more GPU, and the underlying set of parts needed to support each rig (MB, PSU, CPU, SSD, RAM, server case, power cable, rack, etc.) makes it more efficient to use 7-GPU vs. 6-GPU configs.  So it's definitely great value, and I get a more premium MB with the super valuable BIOS Board Explorer feature.  I can also reboot any of the rigs without needing emulators on the MSI MB vs. the ASRock.  If I do need high res, I just connect the monitor cable and it will auto-switch to high res.  The benefit of the 6-GPU config is better stability and cooling due to less heat and more spacing.
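To make the 7-GPU vs. 6-GPU trade-off concrete, here is a back-of-the-envelope sketch in Python. Only the $65 board premium and the 60-rig scale come from this post; the total GPU count and the per-rig overhead are hypothetical placeholders, so substitute your own prices.

```python
# Back-of-the-envelope sketch of the 7-GPU vs. 6-GPU build math described above.
# The $65 board premium is from the post; TOTAL_GPUS and PER_RIG_OVERHEAD are
# hypothetical placeholders, not the author's actual figures.
import math

TOTAL_GPUS = 420          # e.g. 60 rigs x 7 GPUs (illustrative)
PER_RIG_OVERHEAD = 900    # assumed: PSU, CPU, SSD, RAM, case, cables, rack share
BOARD_PREMIUM_7GPU = 65   # MSI board ~$65 more than the ASRock (from the post)

def build_cost(gpus_per_rig, board_premium=0.0):
    """Return (rig count, total non-GPU cost) for a given GPUs-per-rig layout."""
    rigs = math.ceil(TOTAL_GPUS / gpus_per_rig)
    return rigs, rigs * (PER_RIG_OVERHEAD + board_premium)

rigs7, cost7 = build_cost(7, BOARD_PREMIUM_7GPU)
rigs6, cost6 = build_cost(6)

print(f"7-GPU config: {rigs7} rigs, ${cost7:,.0f} in non-GPU hardware")
print(f"6-GPU config: {rigs6} rigs, ${cost6:,.0f} in non-GPU hardware")
print(f"7-GPU savings: ${cost6 - cost7:,.0f} and {rigs6 - rigs7} fewer rigs")
```

With those placeholder numbers it works out to roughly $5K saved and 10 fewer rigs, which is the same ballpark as the savings quoted above.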

Unfortunately, I do love the Furies, so for those rigs I still use the ASRock H81 with emulators.  The reason for that is the spacing required for the Furies: they are longer cards, so the thinner H81 helps them fit better.  I can squeeze a Gaming 5 in, but again I can only fit 6 GPUs, so it's a waste.  My open-air config is mostly H81, as the saving there is less substantial and I started off with a ton of H81s.  As usual, there's never a one-size-fits-all solution.  =)  I can't wait for the Biostar MB to come, as we do need a good alternative to the H81, which is always sold out, and the Gaming 5, which has gone up from $90 used and $119 new to $110-119 used and over $149 new.

P.S. I love ASRock RMA; they are very good and fast, unlike what you read in the reviews.  MSI RMA..........hmmmm..........caveat emptor.


full member
Activity: 154
Merit: 100
@Yun, I have no plans for an 8-GPU rig, but can you tell me which M.2-to-PCIe card you got to work on the MSI Z97? I bought one but it did not work.  I would like the ability to give it another try in case I have a bad PCIe slot on a board.

This one?
http://www.ebay.com/itm/Q13025-WBTUO-LM-141X-V1-0-Drive-M-2-NGFF-to-PCI-E-X4-Adapter-Card-for-Desktop-PC-/351333328902
full member
Activity: 224
Merit: 100
CryptoLearner
@ m1n1ngP4d4w4n - Great idea, but that method does add cost (labels) and more time to implement. The larger FONT would definitely help many people with poorer vision, which will be me soon.  However, I prefer the K.I.S.S. method; keeping things simple also helps reduce cost.  Labels are not cheap.  I ended up buying multiple label makers, as they were selling for $9.99 to $15 on sale with the sample roll, vs. new rolls costing $27.95 to $34.95!!!  It's ridiculous what these companies are doing.  Much worse than normal printers.

Oh, I wasn't thinking of doing proper labels. We print them on very cheap paper, like 5 labels per sheet, and print in eco/gray, so the cost is barely anything Smiley. Can't say the same about time, it does consume a bit of it, but we save 10x that time when we look for stuff. We bend the paper and insert it into the box opening, so there's also no need to fasten it and it doesn't damage the box.
sr. member
Activity: 414
Merit: 251
@Vapourminer - All three are good: Leviton, Eaton Ultra, or Square D, so whatever is available at a good price is a winner.  Any brand is still better than zero protection.  My family members just started adding them too.  At $100 for the hardware, if you can DIY, it's a no-brainer.

@ Debitme - That's definitely the issue so no worries.  

@ m1n1ngP4d4w4n - Great idea, but that method does add cost (labels) and more time to implement. The larger FONT would definitely help many people with poorer vision, which will be me soon.  However, I prefer the K.I.S.S. method; keeping things simple also helps reduce cost.  Labels are not cheap.  I ended up buying multiple label makers, as they were selling for $9.99 to $15 on sale with the sample roll, vs. new rolls costing $27.95 to $34.95!!!  It's ridiculous what these companies are doing.  Much worse than normal printers.



@ Finksy - Yeah, it's never a good idea to do 8 GPUs unless they're direct-attached with no risers needed, like the Pandas.  More components, especially crappy risers, means a higher chance of one GPU taking the entire rig down.  I have the M.2 adapter, and you don't need a very expensive MB to support 8 GPUs.  The MSI Gaming 5 can do that.  You will need Linux for that, I believe.

My M.2.  I was going to mess with it for fun but decided against it.  It's hard enough keeping a rig stable with 7 GPUs.








full member
Activity: 224
Merit: 100
CryptoLearner

I have definitely considered doing this, but was advised against it by people MUCH smarter than me Wink  To be honest, getting a 7-GPU rig stable can be challenging enough at times, and it limits how much OC'ing you can do on individual cards as it is, so I am happy with them as they stand without another avenue for frustration, instability, or failures. If you do implement it on a larger scale, let me know how it works for you!

I can't agree more with you, finksky. 7+ GPU rigs are very hard, if not impossible, to make stable, and they usually require a much higher-end motherboard (to have more PCIe lanes), so you lose quite a bit of the cost you save, not counting the hundreds of headaches. 6x is the way to go IMHO  Wink
legendary
Activity: 1022
Merit: 1003
Looking at that motherboard, I see an unused M.2 slot next to the far-left PCIe slot.  Why not get an M.2-to-PCIe adapter and another riser, and that motherboard could then run 8 GPUs?  I have seen these adapters for $10.  This works for PCIe M.2 slots, not SATA ones, of course, but a PCIe x4 M.2 slot is more than enough to drive an x1 riser.

I'm considering trying this but have not had a chance to do so.  Wanted to know if others have done so, and if there's some reason it wouldn't work.  (Might require a Z270 MB to have enough PCIe lanes in the chipset.)

I have definitely considered doing this, but was advised against it by people MUCH smarter than me Wink  To be honest, getting a 7-GPU rig stable can be challenging enough at times, and it limits how much OC'ing you can do on individual cards as it is, so I am happy with them as they stand without another avenue for frustration, instability, or failures. If you do implement it on a larger scale, let me know how it works for you!
full member
Activity: 224
Merit: 100
CryptoLearner
legendary
Activity: 2800
Merit: 1012
Get Paid Crypto To Walk or Drive
It absolutely would.  ReLive drivers don't play well with modded BIOSes.  They bring back the whole signature mess.  DON'T USE IT.

If I recall, it's this signature-signing mess that prompted me to use smOS..... I sleep better now with Linux.

If you ask me.... I don't even know what the most recent version of the Crimson driver is...  Grin

Gotta love the ReLive drivers... NOT. I have a strap-modded 470 and tried the ReLive 17.1.x driver. It installed to the end with NO error message but didn't actually install the driver. Like, wtf?? The least it could have done is toss a message saying "driver not installed".

Back to 16.10.x for me.

Sounds good, I will give this one a try this evening.  Thanks for the help, guys!
legendary
Activity: 4354
Merit: 3614
what is this "brake pedal" you speak of?
It absolutely would.  ReLive drivers don't play well with modded BIOSes.  They bring back the whole signature mess.  DON'T USE IT.

If I recall, it's this signature-signing mess that prompted me to use smOS..... I sleep better now with Linux.

If you ask me.... I don't even know what the most recent version of the Crimson driver is...  Grin

Gotta love the ReLive drivers... NOT. I have a strap-modded 470 and tried the ReLive 17.1.x driver. It installed to the end with NO error message but didn't actually install the driver. Like, wtf?? The least it could have done is toss a message saying "driver not installed".

Back to 16.10.x for me.
legendary
Activity: 4354
Merit: 3614
what is this "brake pedal" you speak of?
@ vapourminer - Square D makes a good one too.  I could have this side-mounted as well, but it would require drilling through the metal panel, which I don't want.  The wires are very short and won't reach the top.  My electrician assures me it's high quality and I don't need to look at it daily.  He said he will be happy to come open the panel and check it for me every quarter if it makes me happy.  =)

@yun9999

I chose the Square D as my electrician recommended it. It has 2 green "I'm OK" lights visible from the outside of the case; if either goes off, it's time to replace the MOVs (which are user-replaceable).

My 200-amp panel has no real room in it for an internal unit, so...

I love the whole-house suppressor bit, as it also protects my 220 V hybrid hot water heater (air-source heat pump based) and my 220 V geothermal (ground-source heat pump) system.

[A bit off topic follows, unless you will get backup power in your setup.]

We also put in a 7,000-watt Generac transfer switch to go with the 8,500-watt key-start Briggs & Stratton generator while we were at it. Peace of mind is worth it. Winter gets COLD here in the northeast with no power, not to mention having A/C in the summer when the power drops out. We have several power failures per year here, some several hours long. That's what we get for living in the sticks.

The best part about the generator and transfer switch: my wife can switch the whole thing over safely, start to finish, with no help needed, and with no risk to her or the linemen working out on the poles.
sr. member
Activity: 414
Merit: 251
Best to save up for Vega 10.  Fewer rigs to manage if budget is not an issue.  Fewer cases to buy/build, and fewer MBs, CPUs, SSDs, and memory kits to buy, and they should keep their value easily for at least 12-18 months.  Worst-case scenario, one would always be able to mine for at least 3 months with the high-end cards and then dump them for almost-free mining.  This is granted you're able to get them at launch.  This has been true for most high-end GPUs.  My entire 6th rack is reserved for Vega 10.  HBM IS THAT GOOD!  Let's hope it lives up to the hype, but so far we're already hearing it's going to kick the 1080's butt.  HBM, unlike normal memory, can be tweaked to use very, very little power without affecting mining hashpower much.

P.S. If one looks carefully to the left of the MSI GPU box rack, you will see an entire array of FURY boxes.  Yes, they also use HBM.  =)
legendary
Activity: 1834
Merit: 1080
---- winter*juvia -----
It absolutely would.  ReLive drivers don't play well with modded BIOSes.  They bring back the whole signature mess.  DON'T USE IT.

If I recall, it's this signature-signing mess that prompted me to use smOS..... I sleep better now with Linux.

If you ask me.... I don't even know what the most recent version of the Crimson driver is...  Grin
legendary
Activity: 1834
Merit: 1080
---- winter*juvia -----
@ groovy1962 - It really depends.  If you have to use box fans, those will be louder than the fans used on the server rack rigs, so it wouldn't matter.  If you're not using box fans, open air is of course much quieter.  The fans are what make all the noise, so you can easily control that variable.  If heat is not the issue, I would do open air all the way.  If I were to build a long shed, it would be open air 100%.  Cheaper, faster to install, faster to troubleshoot, etc.  A server rack is not cheap.  It's only necessary to fix one issue...........HEAT.  The extra side benefit is SEXINESS.  But it's very expensive and time-consuming.

P.S. I can't wait for VEGA 10........ It's going to be such a huge game changer with that HBM memory!!!  That's the same ridiculously fast memory that allows the slow Nano card to kill everything out there for ZEC at low wattage.  June can't come soon enough; I have one entire rack just waiting for VEGA 10.  It had better live up to the hype!  No more GPU shopping for me.  Time to save up for VEGA 10!

My problem in the warehouse is heat. With full-on XMR mining now, it's not so bad. But for "mobility" reasons (my warehouse is on lease, so I will need a good strategy to move the gear when I need to), I have been itching to rackmount all of the current rigs (all open cage, btw)... Reading yun's project finally inspired me to get a few Rosewill chassis and server racks and start building.

I almost sold off my Nanos to GPUshack a few months ago to buy 470s -- I am so glad I did not do that.

These Nanos with HBM are absolute speedsters at a stock 150 W. For such small cards, they're easily the fastest cards I have in the farm for any memory-intensive algos like ZEC and ETH.
sr. member
Activity: 414
Merit: 251
It absolutely would.  ReLive drivers don't play well with modded BIOSes.  They bring back the whole signature mess.  DON'T USE IT.
legendary
Activity: 2800
Merit: 1012
Get Paid Crypto To Walk or Drive
@Debitme - First, use DDU to uninstall all the junk from your system.

http://www.guru3d.com/files-details/display-driver-uninstaller-download.html

Then use a driver that is a pre-ReLive version.  Best to use the Crimson Edition 16.11.5 Hotfix for the RX series.

More info here, but best to use DDU:
https://www.drivereasy.com/knowledge/fix-code-43-error-device-manager/

I had tried the Display Driver Uninstaller and then reinstalled the drivers. I used the most recent ones and can do the process again tomorrow with older drivers, but I wouldn't think the driver version would be the cause of this issue?
sr. member
Activity: 414
Merit: 251
@Debitme - First, use DDU to uninstall all the junk from your system.

http://www.guru3d.com/files-details/display-driver-uninstaller-download.html

Then use a driver that is a pre-ReLive version.  Best to use the Crimson Edition 16.11.5 Hotfix for the RX series.

More info here, but best to use DDU:
https://www.drivereasy.com/knowledge/fix-code-43-error-device-manager/
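If you want to see which cards are throwing Code 43 without clicking through Device Manager, here is a minimal sketch that queries the standard Win32_VideoController WMI class from Python. It assumes Windows and the third-party "wmi" package (pip install wmi); it only reports the error code, it doesn't fix anything.

```python
# Minimal sketch: list video adapters and flag any reporting Code 43.
# Assumes Windows and the third-party "wmi" package (pip install wmi).
import wmi

c = wmi.WMI()
for gpu in c.Win32_VideoController():
    code = gpu.ConfigManagerErrorCode  # 0 = working, 43 = "has reported problems"
    status = "OK" if code == 0 else f"error code {code}"
    print(f"{gpu.Name}: {status} (driver {gpu.DriverVersion})")
```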
legendary
Activity: 2800
Merit: 1012
Get Paid Crypto To Walk or Drive
Have you had any issues with cards saying, "Windows has stopped this device because it has reported problems (Code 43)"?  I haven't had this before, and my new 5-card rig has 3 cards working perfectly and 2 that keep saying this.  I keep uninstalling and reinstalling drivers and trying them in a different order, but nothing seems to fix it.  They are five 470 Red Devils on a 1000-watt power supply, so power consumption shouldn't be an issue.

Any ideas or other things to try?