Topic: Hardware suggestions required (Read 1448 times)

legendary
Activity: 1498
Merit: 1030
May 26, 2016, 01:53:24 AM
#18
Ah, but that was a rare case of the parts BEING a deal - I'm pretty sure Newegg just wanted to blow the things out 'cause they were wasting warehouse space.

 I've had too many issues with eBay to trust them - starting with their INSISTENCE on forcing PayRipoffPal down your throat.

 Did some digging around on Amazon, and found the prices on the components were just as high or higher - it appears that used R9 series are going back up again with the Ethereum surge.

 *shrug*

 I did the due diligence well enough to be comfortable that I saved a fair bit of money over trying to assemble from parts - the used parts market near me is SMALL on anything reasonably current, and too many of the FEW sellers tend to want way too much (as an example, I've got local sellers trying to sell USED GTX 970s for less than $20 less than Newegg wants for NEW ones - those folks in particular are in for a rude shock in a couple of days).


 One of the things I actually miss about spending most of my life in or near a major city is that the used market and such was a LOT bigger.


 Once I move, I get to start looking at doing open-case builds - probably. No roommate with a cat any more (I like the cat, but it DOES like to jump on and chew on anything resembling a cord that's bouncing around in the air, and it loves to attack fans that aren't enclosed somehow).

legendary
Activity: 1456
Merit: 1000
May 25, 2016, 04:13:29 PM
#17
Quote
Right now is a good time to WAIT on any NVidia GTX 9xx purchase - the GTX 1080 is due out FRIDAY, and that should put some serious price pressure on the high-end and mid-range current 9xx series - probably won't affect the 950, and little if any effect on the 960 though.

 Then June 10 comes the 1070 - and THAT should have some trickle-down effect on the 960 as well, though again the 950 probably won't notice much if at all.


 Forget about the http://www.newegg.com/Product/Product.aspx?Item=59-258-006 triple-280X system though, I appear to have gotten Newegg's last one. 8-)


 $800 was too cheap to pass up given the components - 3x$200ish vid cards, $200+ power supply ALONE - then add in the motherboard with 3xPCI-E 16 slots (among others), HD, RAM, and a BIG high-end case. Worst situation, I can probably part out the thing for $1000+ anytime in the next couple months - but I'd be more inclined to swap cards around, drop my trio of 7xxx series at some point, and put 3+ GTX 1080s or 1070s in this thing eventually.

 Up side for me - my existing X240s should work in that machine when I have motherboards start dying, so I can kick that Sempron 145 to the door - or swap CPUs around before that if I end up Folding on that machine eventually. Not sure about the 5050e or 4850e CPUs I have, depends on which MB it is - but I've got enough X240s that shouldn't matter.



Another reason I would forget about a setup like that is that I really like open-air setups for mining. If done right, you can have air pushing through your cards much more easily than if they're inside of a PC case.

I am lucky that I still had a decent amount of the parts from LTC mining. I had kept pretty much everything except GPUs; I had sold those on eBay quite a while ago. But I think you can find "deals" on individual parts in most cases and come in cheaper than that pre-made one.
legendary
Activity: 1498
Merit: 1030
May 25, 2016, 02:05:34 AM
#16
Right now is a good time to WAIT on any NVidia GTX 9xx purchase - the GTX 1080 is due out FRIDAY, and that should put some serious price pressure on the high-end and mid-range current 9xx series - probably won't affect the 950, and little if any effect on the 960 though.

 Then June 10 comes the 1070 - and THAT should have some trickle-down effect on the 960 as well, though again the 950 probably won't notice much if at all.


 Forget about the http://www.newegg.com/Product/Product.aspx?Item=59-258-006 triple-280X system though, I appear to have gotten Newegg's last one. 8-)


 $800 was too cheap to pass up given the components - 3x$200ish vid cards, $200+ power supply ALONE - then add in the motherboard with 3xPCI-E 16 slots (among others), HD, RAM, and a BIG high-end case. Worst situation, I can probably part out the thing for $1000+ anytime in the next couple months - but I'd be more inclined to swap cards around, drop my trio of 7xxx series at some point, and put 3+ GTX 1080s or 1070s in this thing eventually.

 Up side for me - my existing X240s should work in that machine when I have motherboards start dying, so I can kick that Sempron 145 to the door - or swap CPUs around before that if I end up Folding on that machine eventually. Not sure about the 5050e or 4850e CPUs I have, depends on which MB it is - but I've got enough X240s that shouldn't matter.

legendary
Activity: 3248
Merit: 1070
May 25, 2016, 12:53:16 AM
#15
You can buy second-hand NVidia cards (I can find a G1 Gaming 970 for $270 shipped, and for other models you can probably get a lower price, much lower) and they would be equal to AMD in that case, so not much of a difference there.

But the only good NVidia GPU is the 970; with AMD you have more choice.
legendary
Activity: 1498
Merit: 1030
May 24, 2016, 01:14:25 AM
#14

Quote
Since AMD and Nvidia also have their own specialties, would it be possible to run a rig that has both brands of cards?


 It's possible, but it's a bit of a pain to get set up and working with both types of GPU in the same system at the same time.

 I've got a couple systems running an AMD APU with NVidia cards, but the APU is set up to do something else (generally RC5-72) while the NVidia cards are doing the heavy lifting on the main thing the machines are doing (ETH mining for now, Folding@Home eventually).

 ASICs only blow GPUs out of the water on SHA256 (Bitcoin and alts), Scrypt (Litecoin, Doge, and alts), and are starting to on X11 (Dash and alts) - but there aren't many in the wild on X11 yet.

Other algorithms are either GPU-based or CPU-based, and unlikely to attract ASIC attention due to relatively low market cap or other factors.

Ethereum won't be mineable long enough to make the development time AND COST worthwhile for someone to create an ASIC for it (the plan is for it to go Proof of Stake sometime in 2017) - though in theory I could see someone setting up an FPGA to mine Ethereum, I don't think any commonly-available FPGAs have enough RAM to work.
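To put a rough number on that RAM requirement, the Ethash dataset (DAG) a miner has to hold in memory grows per epoch. This is a sketch from the spec's published constants; the real algorithm also rounds each epoch's size down to a prime multiple of the mix size, which this ignores (the error is well under 1%), and the block number is only a rough figure for the period.

```python
# Rough Ethash DAG size estimate - the dataset an ETH miner must hold in
# GPU (or FPGA) memory. The prime-rounding step in the real algorithm is
# ignored here, so this slightly overestimates.
DATASET_BYTES_INIT = 1 << 30    # 1 GiB at epoch 0
DATASET_BYTES_GROWTH = 1 << 23  # ~8 MiB added per epoch
EPOCH_LENGTH = 30000            # blocks per epoch

def approx_dag_gib(block_number):
    epoch = block_number // EPOCH_LENGTH
    return (DATASET_BYTES_INIT + DATASET_BYTES_GROWTH * epoch) / (1 << 30)

# The chain was somewhere around block 1,500,000 (epoch 50) in May 2016:
print(round(approx_dag_gib(1500000), 2))  # 1.39
```

So already around 1.4 GiB and climbing - which is why on-chip or tightly-coupled FPGA RAM doesn't cut it.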

 NVidia can match AMD very closely on a hash/W basis for ETH mining - but the cards to do so will be more expensive for the same hash rate; AMD wins hands down on hash/$. The only realistic reason to go NVidia for Ethereum mining is if you have some other usage after ETH becomes unprofitable where NVidia is significantly better than AMD, *OR* you already HAVE the NVidia cards to use.
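The hash/$ vs hash/W distinction is easy to see with a quick back-of-envelope. Every number below is an illustrative placeholder, not a measured or quoted figure:

```python
# Toy hash/$ vs hash/W comparison for ETH mining.
# All card numbers are made-up placeholders for illustration only.
cards = {
    # name: (hashrate in MH/s, street price in USD, board power in W)
    "AMD R9 380":     (20.0, 180.0, 190.0),
    "NVidia GTX 970": (18.0, 280.0, 150.0),
}

for name, (mh, usd, watts) in cards.items():
    print("%s: %.3f MH/$, %.3f MH/W" % (name, mh / usd, mh / watts))

# With these placeholder numbers the AMD card wins clearly on MH/$,
# while the NVidia card edges ahead on MH/W - the pattern described above.
```

Which metric matters depends on whether your constraint is up-front cash or wall power.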

legendary
Activity: 1456
Merit: 1000
May 23, 2016, 08:22:00 PM
#13
Quote
Right, after doing some more research, it seems that GPU mining is rather pointless, given that ASICs completely blow GPUs out of the water. I originally thought ASICs were only just more power efficient, helping people cover their electricity costs a little better.

I think after spending some cash on a good gaming rig, I'll use the remainder to grab some ASICs instead.

Assuming Bitcoin is still the go-to cryptocurrency, it's time for me to browse some threads!

Your research must have missed the part about different algos. ASICs do blow GPUs out of the water on the algos they can run, such as Bitcoin's SHA256. But there are coins like Ethereum that do not have ASICs, so GPUs remain very profitable; currently they are a good investment in a lot of cases.

But right now Eth is pretty much the only GPU coin worth mining. I suggest doing a little more reading on the different algos.
newbie
Activity: 5
Merit: 0
May 23, 2016, 07:35:12 PM
#12
Right, after doing some more research, it seems that GPU mining is rather pointless, given that ASICs completely blow GPUs out of the water. I originally thought ASICs were only just more power efficient, helping people cover their electricity costs a little better.

I think after spending some cash on a good gaming rig, I'll use the remainder to grab some ASICs instead.

Assuming Bitcoin is still the go-to cryptocurrency, it's time for me to browse some threads!
legendary
Activity: 1302
Merit: 1068
May 23, 2016, 06:26:46 PM
#11
Quote
Nvidia is good at some things (They rock Folding), they are poor at others (they do NOT rock Ethereum mining on a hash/$ basis though they CAN compete on hash/watt), and they absolutely SUCK compared to AMD at RC5-72.

 You have to look at SPECIFICS before you can decide which way to go.

 TFlops is a totally artificial rating that has very little Real World value as a comparative tool.


Just to be clear: for Eth, a 380 is equivalent to a 970, but cheaper. With AMD you can get more hash per card at lower efficiency, still at a better price... or better efficiency than Nvidia at less hash per card. So all in all, it's easy to say Nvidia does not have its place in Eth mining currently.

So instead of trying to get your magic car that does everything. Get a plane, get a car, get a bicycle, use the one you need for the right task. The right tool for the right job.

Okay, makes perfect sense. Personal use stuff, separate. Computing applications, separate. Could then do W10 for personal and Linux for computing. Since AMD and Nvidia also have their own specialties, would it be possible to run a rig that has both brands of cards? I know that for obvious reasons, SLI and crossfire don't mix, but neither have any use in these fields anyway. How about simply running as standalone cards? Should theoretically work, no? Just need drivers for both brands. That would then allow me to dedicate AMD cards to mining, and Nvidia cards to Folding. Or would I need to dedicate one system to AMD, one to Nvidia, and one for personal?

Glad you understand. W10 can work for a mining rig, but it's not good for serial deployments. With Linux you can just set up an image and then slap it on the next HDD/SSD when you build a new rig.

Mixing AMD and Nvidia is possible, but again, you're creating "engineering problems" with your "car". Definitely possible, though. However, I don't know if you would get the best stability.

Best to do a separate Nvidia rig. That's what I do, though in 2-3 years I have not been in a situation where building an Nvidia rig was better. However, for a personal PC on Windows, I want to get something like a 1080 Ti for gaming; mining-wise, Nvidia is generally lacking, sadly.
newbie
Activity: 5
Merit: 0
May 23, 2016, 07:45:48 AM
#10
So instead of trying to get your magic car that does everything. Get a plane, get a car, get a bicycle, use the one you need for the right task. The right tool for the right job.

Okay, makes perfect sense. Personal use stuff, separate. Computing applications, separate. Could then do W10 for personal and Linux for computing. Since AMD and Nvidia also have their own specialties, would it be possible to run a rig that has both brands of cards? I know that for obvious reasons, SLI and crossfire don't mix, but neither have any use in these fields anyway. How about simply running as standalone cards? Should theoretically work, no? Just need drivers for both brands. That would then allow me to dedicate AMD cards to mining, and Nvidia cards to Folding. Or would I need to dedicate one system to AMD, one to Nvidia, and one for personal?
legendary
Activity: 1498
Merit: 1030
May 23, 2016, 04:12:04 AM
#9
Nvidia is good at some things (They rock Folding), they are poor at others (they do NOT rock Ethereum mining on a hash/$ basis though they CAN compete on hash/watt), and they absolutely SUCK compared to AMD at RC5-72.

 You have to look at SPECIFICS before you can decide which way to go.

 TFlops is a totally artificial rating that has very little Real World value as a comparative tool.
legendary
Activity: 1302
Merit: 1068
May 22, 2016, 11:51:48 PM
#8
You're confusing things. Mining speed isn't the same as TFLOPS. There's a software/hardware interaction thing; Nvidia sucks at OpenCL.

Now the thing is, you seem to not even have picked a road to go.

I'm going to have to use an analogy, since it doesn't seem like what I'm saying is getting through to you.

You're thinking of getting a car with everything. You want a standard-sized car, but now you want to add seats. You want to add an electric engine. You want to add solar power. You want it to run on water, you also want it to go on roads, in the air.

Right now, you want something that is going to do everything. You want a dinosaur on a laser on a rocket shark. http://azeroth.metblogs.com/files/2010/08/undeadonraptoronsharkwithfrickinlaserbeams.jpg

There are tons of issues that come with wanting to do what you want to do. For instance, you think Nvidia is better, but you say you want to mine. Do you want the most TFLOPS, or do you want a machine that will give you the highest hash per watt?

You need to decide what you want to mine in the first place. If you want to mine right now, you need AMD.

You don't use splitters because they bring more costs and hardware issues. They bring engineering issues when you're adding things to your "car". For starters, you seem to want Windows, but Windows supports a limited number of GPUs. Also, each rig requires a license, so mining rigs are mass-deployed on Linux. Although W10 does support 6 GPUs.

So do you want Windows or Linux?

So I'll tell you again: get yourself a workstation. Get yourself something else for mining. You can mine on your PC/workstation, but what's going to make a good workstation isn't going to make a great mining rig.

So instead of trying to get your magic car that does everything. Get a plane, get a car, get a bicycle, use the one you need for the right task. The right tool for the right job.

Right now you're trying to imagine a rig and spend something like $8,000 USD on some monster machine that's going to be improper and an engineering nightmare, instead of getting 2-3 PCs for half the price that will each do their designated task better than your omni-PC.
newbie
Activity: 5
Merit: 0
May 22, 2016, 06:54:44 PM
#7
Quote
-You can get much better efficiency... 50% more hash/W or so depending on which GPU you pick. If you want to buy a monster PSU to target 50% load and get an extra 1-2% of power saved... that's a bad idea, but you should go Sidehack's way. You'd probably need ... a 2000 or one of those 2880-watt PSUs if you want to run 380s or 390s at 50%, for marginal gain.

The efficiency at 80%-90% load is just fine IMO.

Right, I'll keep that in mind. So that then brings me back to the mainstream vs server PSUs. Since a server PSU was suggested, I have to ask where one would find such high wattage units for such low prices. $150 for a 1200w server PSU at platinum efficiency seems too good to be true. Newegg definitely doesn't sell server PSUs at such good prices.

Quote
-I can't tell you anything about the Pascal cards, but currently mining with Nvidia is neither efficient nor cost-effective. The flavor of the day is Ethereum. The workstation cards are not mining cards. I don't know about the multi-thousand-USD cards currently, but the old-gen ones don't work out versus their standard GPU counterparts.

Huh...I thought Nvidia was currently leading in the FLOPS per watt field. The R9 390s certainly weren't power efficient.

Quote
-Yes, they have enough PCI-e slots; look them up yourself. That is what pretty much everyone would use for 6 slots.
https://pcpartpicker.com/part/asrock-motherboard-h81probtc

Any 1151 boards? Had originally planned for a Skylake. Since I plan to run DC projects as well, some GPU applications require a combination of CPU and GPU to run.

Quote
-I would not set up a mining rig and use it for everyday things, for several reasons: noise, temps, load, stability, and OS. You're talking about getting more than one rig's worth of GPUs, so I strongly recommend you decide what you want to do and stick to it, rather than doing everything at the same time and spending 10k on hardware for not very good throughput.

I had planned on water cooling, and my building has a powerful HVAC system that can keep temperatures at 14 degrees. That said, I suppose stability might be the limiting factor for me, so I should probably take that into consideration.

EDIT: Also, no thoughts on that PCI-e splitter I linked in the first post? If that AsRock could expand GPU capacity from 6 to 9 or 12, it would make management easier by only requiring one dedicated system (forget my personal use, I'm more convinced now that it should be separate). Plus, without the need for additional mobos, CPUs, drives and RAM, it should come out to about the same price as the purchase cost of the splitter.
legendary
Activity: 4256
Merit: 8551
'The right to privacy matters'
May 22, 2016, 10:10:33 AM
#6
Quote
Righty, I've been looking into dropping $10k on a custom project, and I wanted some advice. Now I'm going to come clean and say that the original purpose of this system is for GPU distributed computing applications, but I may dabble in mining as well at some point, since I have free electricity. Since I assume mining and DC are more or less identical when it comes to the execution of hardware setups, you guys would be the top experts on these matters.

First off, my limitations are regarding how much power my home's circuits can take, so despite free electricity, I still require high efficiency. It would be the difference between, say, 10 GPUs, or 11. Now, after browsing a few random threads here, I see that there are plenty of components that won't be found in a mainstream hardware store, such as the flexible risers using USB 3.0 cables, so I assume I'll come across hardware I may not have seen before.

At present, I'm looking at using two gaming motherboards with 6 PCI-e slots each (a combined total from x1 and x16 slots). Is there a better way? Can I use a single board to handle every GPU in some way, essentially having more GPUs than there are PCI-e slots? I've heard of PCI-e backplanes, but to my understanding, they require special host boards, and the costs involved with that would basically leave me with no cash to actually put a single GPU in. I've seen this crop up in discussions before. Does anyone have experience with them, and if they actually work for purposes such as this? Last I heard, they were around the $200 mark, so I figured one board with two of these would be cheaper than two boards, and the required CPUs and RAM. Plus, it would be more power efficient.

Now speaking of power, at this moment, the plan I have is to grab four of those 1600 watt titanium efficiency PSUs from EVGA. They are a little over $400 each, but are the most efficient mainstream PSUs to date. Keyword here is 'mainstream'. If there's specialised hardware lurking on this side of the hardware world that can output similar amounts of efficient power, and do it for the same or lower price, please do point me in the right direction.

Also, since I mentioned the risers earlier, I do have a question regarding their usage. What's the maximum length of the cable used? As I mentioned, this is a very custom project, and the entire system will be wall-mounted, so some of the GPUs may be located a little further from the mobo than they normally are in GPU farms. What are the performance limitations here, and are there any additional precautions that have to be taken?

Do not spend a dime without first going to this guy's thread.


https://bitcointalksearch.org/topic/post-pics-of-your-custom-gpu-rig-frames-1470700


You could get 4 rigs of 6 cards from him for your 10k.

You would have over 520 MH/s mining ETH. I believe he would discount you.


Many other good ideas in this thread. BTW, your thread should be in Altcoin, since GPUs are for altcoins.
legendary
Activity: 1302
Merit: 1068
May 21, 2016, 08:48:22 PM
#5
Quote
What about dedicated 12v power supplies? Not necessarily designed for computers, but any supply that can just output nothing but 12v. Since PCI-e power cables are just a bunch of 12v and ground cables, it should theoretically work, right? Then just have a standard ATX supply for the mobo, drives, etc.

If not, do you suggest any particular sites to purchase those sorts of server supplies from? Newegg doesn't show efficiency in its power search for server supplies.

The cables that the PSU comes with will support the PSU and the riser connectors just fine.

I was referring to the USB cable. Power transmission should be fine over long distances, as long as the cables aren't overly thin and cause voltage drop.

Quote
Get an EVGA P2 1200 or a 1300 G2 max; just use more power-efficient GPUs. The 1600s' cost is exorbitant. The only way you'd need a 1600 is if you went for 6x 290s or such. You could look into the 380; that would fit well. The 370 is more power-efficient still, however.

My intention was to try to run each PSU at roughly half load for maximum efficiency, hence why it might seem a little excessive. I'm also looking at the upcoming Pascal cards. I don't mind snatching up 8 to 10 1080s. Though one issue I have is that these are gaming cards. Aren't there cards more optimised for pure processing? Workstation cards, right? Though those are more expensive than they're worth, right?

Quote
Mobo... did you say $200? You're looking at the wrong boards. You don't need anything fancy here. A $60 ASRock H81 or H97, etc., is fine here. Cheapest CPU, cheapest RAM.

Do they have enough PCI-e slots?

Also, if possible, I'd prefer this to be all in one, used both for computing, as well as personal use. Should be possible, right?

-You can get much better efficiency... 50% more hash/W or so depending on which GPU you pick. If you want to buy a monster PSU to target 50% load and get an extra 1-2% of power saved... that's a bad idea, but you should go Sidehack's way. You'd probably need ... a 2000 or one of those 2880-watt PSUs if you want to run 380s or 390s at 50%, for marginal gain.

The efficiency at 80%-90% load is just fine IMO.

-I'd deploy more rigs that are more efficient instead of doing that. I'd get a G2 1000 and go with low-TDP cards long before going your route. You can get a 1200 P2 if you want the slightly better efficiency, and you will need 240V for the best efficiency. But I don't think you will recoup the 1200 P2's extra cost at typical prices.
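The arithmetic behind "don't chase 50% load" is worth seeing once. The efficiency values below are made-up but plausible for a Platinum-class unit, not figures from any specific datasheet:

```python
# Wall-power math behind "80-90% load is fine": an oversized PSU loafing
# at 50% load is only a point or two more efficient than a right-sized
# one near 90%. Efficiency values are illustrative placeholders.
def wall_watts(dc_load_w, efficiency):
    # Power drawn from the outlet = DC power delivered / efficiency.
    return dc_load_w / efficiency

RIG_DC_DRAW = 900.0  # watts the GPUs + board actually pull (hypothetical)

right_sized = wall_watts(RIG_DC_DRAW, 0.92)  # ~1000 W unit near 90% load
oversized   = wall_watts(RIG_DC_DRAW, 0.94)  # ~1800 W unit near 50% load
print(round(right_sized - oversized, 1))  # about 21 W saved, i.e. marginal
```

Roughly 20 W saved on a 900 W rig; nowhere near enough to pay back the price gap to a monster PSU.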

-USB cables don't transmit the power, they would melt if you tried.

-I can't tell you anything about the Pascal cards, but currently mining with Nvidia is neither efficient nor cost-effective. The flavor of the day is Ethereum. The workstation cards are not mining cards. I don't know about the multi-thousand-USD cards currently, but the old-gen ones don't work out versus their standard GPU counterparts.

-Yes, they have enough PCI-e slots; look them up yourself. That is what pretty much everyone would use for 6 slots.
https://pcpartpicker.com/part/asrock-motherboard-h81probtc

-I would not set up a mining rig and use it for everyday things, for several reasons: noise, temps, load, stability, and OS. You're talking about getting more than one rig's worth of GPUs, so I strongly recommend you decide what you want to do and stick to it, rather than doing everything at the same time and spending 10k on hardware for not very good throughput.

I'd decide what I want to invest now on a per-mining-rig basis, and see what the best solution for the moment is. Then decide what I want for a PC, and not mix the two.
newbie
Activity: 5
Merit: 0
May 21, 2016, 06:10:48 PM
#4
What about dedicated 12v power supplies? Not necessarily designed for computers, but any supply that can just output nothing but 12v. Since PCI-e power cables are just a bunch of 12v and ground cables, it should theoretically work, right? Then just have a standard ATX supply for the mobo, drives, etc.

If not, do you suggest any particular sites to purchase those sorts of server supplies from? Newegg doesn't show efficiency in its power search for server supplies.

The cables that the PSU comes with will support the PSU and the riser connectors just fine.

I was referring to the USB cable. Power transmission should be fine over long distances, as long as the cables aren't overly thin and cause voltage drop.
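For anyone wanting to sanity-check the voltage-drop worry, V = I × R with standard copper resistance values gets close enough. The wire resistances below are standard copper figures by gauge; the current and cable length are hypothetical:

```python
# Sanity check on 12 V voltage drop over a long power run:
# drop = I * R, with R from copper resistance per metre times the
# round-trip length (current flows out on +12 V and back on ground).
# Resistances are standard copper values; current/length are hypothetical.
OHMS_PER_METRE = {18: 0.0210, 16: 0.0132}  # approx, by AWG

def voltage_drop(amps, one_way_metres, awg):
    return amps * (2.0 * one_way_metres) * OHMS_PER_METRE[awg]

# ~75 W over one 12 V feed (6.25 A) through 0.6 m of 18 AWG wire:
drop = voltage_drop(6.25, 0.6, 18)
print(round(drop, 2))  # ~0.16 V, a bit over 1% of 12 V
```

So over short runs with sensible gauge the drop is small; it's long thin wires at high current where it starts to bite.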

Quote
Get an EVGA P2 1200 or a 1300 G2 max; just use more power-efficient GPUs. The 1600s' cost is exorbitant. The only way you'd need a 1600 is if you went for 6x 290s or such. You could look into the 380; that would fit well. The 370 is more power-efficient still, however.

My intention was to try to run each PSU at roughly half load for maximum efficiency, hence why it might seem a little excessive. I'm also looking at the upcoming Pascal cards. I don't mind snatching up 8 to 10 1080s. Though one issue I have is that these are gaming cards. Aren't there cards more optimised for pure processing? Workstation cards, right? Though those are more expensive than they're worth, right?

Quote
Mobo... did you say $200? You're looking at the wrong boards. You don't need anything fancy here. A $60 ASRock H81 or H97, etc., is fine here. Cheapest CPU, cheapest RAM.

Do they have enough PCI-e slots?

Also, if possible, I'd prefer this to be all in one, used both for computing, as well as personal use. Should be possible, right?
legendary
Activity: 3374
Merit: 1859
Curmudgeonly hardware guy
May 21, 2016, 05:42:15 PM
#3
You could probably get a 1200W platinum-rated server PSU with breakout board and custom cabling for $150 or less. A good board would allow you to turn on the PSU from an external signal, so you could use a small cheap ATX for the motherboard and big PSUs for additional 12V rails. You could also use a picoPSU for your motherboard driven directly off the server supply.
legendary
Activity: 1302
Merit: 1068
May 21, 2016, 04:52:32 PM
#2
The cables that the PSU comes with will support the PSU and the riser connectors just fine.

Risers are on sale everywhere. Get the powered USB ones.

Get an EVGA P2 1200 or a 1300 G2 max; just use more power-efficient GPUs. The 1600s' cost is exorbitant. The only way you'd need a 1600 is if you went for 6x 290s or such. You could look into the 380; that would fit well. The 370 is more power-efficient still, however.

Mobo... did you say $200? You're looking at the wrong boards. You don't need anything fancy here. A $60 ASRock H81 or H97, etc., is fine here. Cheapest CPU, cheapest RAM.
newbie
Activity: 5
Merit: 0
May 21, 2016, 04:39:26 PM
#1
Righty, I've been looking into dropping $10k on a custom project, and I wanted some advice. Now I'm going to come clean and say that the original purpose of this system is for GPU distributed computing applications, but I may dabble in mining as well at some point, since I have free electricity. Since I assume mining and DC are more or less identical when it comes to the execution of hardware setups, you guys would be the top experts on these matters.

First off, my limitations are regarding how much power my home's circuits can take, so despite free electricity, I still require high efficiency. It would be the difference between, say, 10 GPUs, or 11. Now, after browsing a few random threads here, I see that there are plenty of components that won't be found in a mainstream hardware store, such as the flexible risers using USB 3.0 cables, so I assume I'll come across hardware I may not have seen before.
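That "10 GPUs or 11" framing can be sketched directly: per-card wall draw includes PSU conversion loss, so efficiency buys extra cards even when the electricity itself is free. The breaker rating, card wattage, and efficiency numbers below are illustrative placeholders:

```python
# How many cards one household circuit can feed. Wall draw per card
# includes PSU conversion loss, so PSU efficiency matters even with
# free electricity. All numbers here are illustrative placeholders.
def gpus_per_circuit(breaker_amps, volts, card_watts, psu_efficiency,
                     derate=0.80):  # common 80% continuous-load derate
    usable_watts = breaker_amps * volts * derate
    wall_watts_per_card = card_watts / psu_efficiency
    return int(usable_watts // wall_watts_per_card)

# Same 15 A / 120 V circuit and 190 W cards, two PSU efficiencies:
print(gpus_per_circuit(15, 120, 190, 0.88))  # 6 cards
print(gpus_per_circuit(15, 120, 190, 0.94))  # 7 cards
```

A few points of PSU efficiency is exactly the difference of one card per circuit in this toy case.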

At present, I'm looking at using two gaming motherboards with 6 PCI-e slots each (a combined total from x1 and x16 slots). Is there a better way? Can I use a single board to handle every GPU in some way, essentially having more GPUs than there are PCI-e slots? I've heard of PCI-e backplanes, but to my understanding, they require special host boards, and the costs involved with that would basically leave me with no cash to actually put a single GPU in. I've seen this crop up in discussions before. Does anyone have experience with them, and if they actually work for purposes such as this? Last I heard, they were around the $200 mark, so I figured one board with two of these would be cheaper than two boards, and the required CPUs and RAM. Plus, it would be more power efficient.

Now speaking of power, at this moment, the plan I have is to grab four of those 1600 watt titanium efficiency PSUs from EVGA. They are a little over $400 each, but are the most efficient mainstream PSUs to date. Keyword here is 'mainstream'. If there's specialised hardware lurking on this side of the hardware world that can output similar amounts of efficient power, and do it for the same or lower price, please do point me in the right direction.

Also, since I mentioned the risers earlier, I do have a question regarding their usage. What's the maximum length of the cable used? As I mentioned, this is a very custom project, and the entire system will be wall-mounted, so some of the GPUs may be located a little further from the mobo than they normally are in GPU farms. What are the performance limitations here, and are there any additional precautions that have to be taken?