
Topic: Fourth alt coin thread last three got oversized. - page 50. (Read 108938 times)

full member
Activity: 238
Merit: 100
FedEx is out for delivery---wooohoooo.....looking forward to posting results this evening.

It's always Christmas when a delivery is on the way  Cheesy

newbie
Activity: 27
Merit: 0
FedEx is out for delivery---wooohoooo.....looking forward to posting results this evening.
full member
Activity: 238
Merit: 100
Phil, which of these cards is better, in your opinion?
ASUS ROG GeForce GTX 1080 Ti STRIX

MSI NVidia GeForce GTX 1080 Ti Gaming
full member
Activity: 322
Merit: 233

I have found that

70% TDP
+200 core
-200 memory
75% fan

is good, but not every card likes the +200 core.
I have also found an intensity of 24 in the .bat file is more stable.

How does that 70% TDP translate to actual watts?
Please consult nvidia-smi or Nvidia Inspector.
I am also interested in optimizing the ccminer suite for the best Sol/W.
Thanks.

Answer pretty please.

It all depends on which card you have; each GPU has a different watt value for its TDP. My MSI cards are rated at 285 W TDP, but my latest Gigabyte cards are rated at 255 W TDP... so look up the specs on your card. If your TDP is, let's say, 250 W, then you take 250 x 0.70 = 175 W, which is the actual wattage it should be running at.
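If you want to check or set this directly, here's a minimal .bat sketch using nvidia-smi (not from the thread; GPU index 0 and the 175 W cap are placeholder examples, and changing the limit needs an elevated prompt):

Code:
rem Sketch: show each card's rated power limit and its live draw
nvidia-smi --query-gpu=index,name,power.limit,power.draw --format=csv

rem Example math: a 250 W card at 70% TDP -> 250 x 0.70 = 175 W
rem -i selects the GPU index, -pl sets an absolute power limit in watts
nvidia-smi -i 0 -pl 175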
sr. member
Activity: 308
Merit: 250

I have found that

70% TDP
+200 core
-200 memory
75% fan

is good, but not every card likes the +200 core.
I have also found an intensity of 24 in the .bat file is more stable.

How does that 70% TDP translate to actual watts?
Please consult nvidia-smi or Nvidia Inspector.
I am also interested in optimizing the ccminer suite for the best Sol/W.
Thanks.

Answer pretty please.
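For reference, a minimal ccminer launch .bat with an explicit intensity looks like the sketch below (the algorithm, pool URL/port, and wallet are placeholders, not settings from this thread):

Code:
@echo off
rem ccminer launch sketch; the pool, port, and wallet below are placeholders
rem -a algorithm, -o pool URL, -u wallet/worker, -p password, -i intensity
ccminer-x64.exe -a skein -o stratum+tcp://pool.example.com:4933 -u YOUR_WALLET -p x -i 24
pause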
legendary
Activity: 1498
Merit: 1030
Quick question, guys:

Is it okay to mix cards from different companies in one rig? I've got access to several 570s/580s from local stores, ranging from XFX, Sapphire, and Gigabyte... Wouldn't I encounter driver issues if I put all of them in one rig?

If they all factory clock the same, there should be zero issues.

If the factory clocks or "stock" TDP differ, they'll still work, but you have to be extra careful to adjust them separately to get optimal results.
Stability should be the same as a non-mixed rig if you run them at stock clocks or close.



 
full member
Activity: 322
Merit: 233
So today I got some good info on the use of the PCIe 1-to-3 hubs available online...


This sounds very likely to be true.

Might get a couple of those to play with and give it a shot.

Ya, I have one on the way, because if I can get 8+ GPUs to run off one single mobo... that would cut my cost a lot. I have Ryzen in my main PC, which apparently skimped on PCIe lanes, so it seems none of the configurations would allow more than 6 on it, but some of the i5/i7 chips seem capable of running 8+, if I'm understanding what he is telling me correctly...
full member
Activity: 238
Merit: 100
FYI, zpool has been holding a steady reported hashrate for about 4 hours now.
full member
Activity: 238
Merit: 100
So today I got some good info on the use of the PCIe 1-to-3 hubs available online...


This sounds very likely to be true.

Might get a couple of those to play with and give it a shot.
legendary
Activity: 4326
Merit: 8899
'The right to privacy matters'
So today I got some good info on the use of the PCIe 1-to-3 hubs available online...


This sounds very likely to be true.
newbie
Activity: 38
Merit: 0
Quick question, guys:

Is it okay to mix cards from different companies in one rig? I've got access to several 570s/580s from local stores, ranging from XFX, Sapphire, and Gigabyte... Wouldn't I encounter driver issues if I put all of them in one rig?

Go for it Smiley

Mixing Nvidia and AMD is even possible, but it's not usually recommended.

So even among 570s/580s it would work, but mixing them isn't really recommended?
full member
Activity: 322
Merit: 233
So today I got some good info on the use of the PCIe 1-to-3 hubs available online...

So over in the miner rig porn pages, there is a fellow who builds a lot of single-mobo, 8-12 GPU rigs, and I just didn't understand how he was managing all of this, so I messaged him, and this is what he told me...

It really all comes down to the CPU that you are using... For example, Intel's LGA-1156, LGA-1155, and LGA-1150 sockets have a single 16x PCIe port on the CPU. This port can be broken into three subports in the following configurations: 16/0/0, 8/8/0, 8/4/4. Each subport can be down-negotiated independently, so it's possible to use configurations such as 4/4/4, 4/1/4, and 8/0/1. It is not, however, possible to connect more than three PCIe devices to this port without using an external switching chip.

Intel's Platform Controller Hub has an additional 8x PCIe 2.0 port which can be broken down into 8 individual 1x ports, which enables the use of numerous low-bandwidth peripherals and add-in chips. Most motherboards contain a 4x port and up to three 1x ports. These ports often share bandwidth (on motherboards that contain an M.2 connector, it shares bandwidth as well), which means that the 4x port can operate in either 4x mode or 1x mode. If none of the 1x ports are populated, it will operate in 4x mode; if any of the 1x ports are populated, it will operate in 1x mode, for a total of four 1x connections to the PCH.

So to run 8 GPUs you have to purchase a CPU that supports at least 8 independent ports. Not all CPUs that claim 16 PCIe channels work well with multiple independent configurations; mainly server and more expensive CPUs are the ones that properly handle 8 independent channels communicating at one time...

So, for example... the Intel i3-6100 is rated at 16 PCIe channels max: 1x16, 2x8, or 1x8+2x4.
So in theory the x16 slot can be broken into 3 independent channels and all other ports used at 1x, but many motherboards share the bandwidth between the x8 slot and the x16 slot to save money, so you first have to figure out how the motherboard is wired in order to correctly adjust the BIOS settings for it to run. The motherboards that don't share bandwidth let you run 3 GPUs off the x16 slot in (4/4/4), then 2 additional off the x8 slot in (1/1), then utilize the remaining x1 slots as singles.

Most people fail to use these hubs correctly because they try to use CPUs with basic PCH control, which doesn't support more than 1 independent channel per port.

The other reason they fail is that they have to view the video output to see what they are doing, so they hook a monitor up to the computer, which, depending on your GPU, auto-changes the port priority. Most AMD cards require a minimum of x4 to work correctly, while most modern Nvidia cards require x8 to work correctly. So they hook 8 GPUs to the computer, go to boot it up, and the PCH auto-mandates, let's say, x8 to the viewing GPU on boot-up. On the main x16 slot this forces (8/0/1) mode, so the video will show up but not all GPUs will show up on the system, or the mobo will fail to boot because you're demanding too many channels on a mode that has a null (0) channel. You have to set up the BIOS via remote access to properly run 8+ GPUs on modern mobos.
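Quick tally of that example: 3 GPUs off the x16 slot in (4/4/4), plus 2 off the x8 slot in (1/1), plus the remaining x1 slots gets you to 8 cards. On Nvidia cards you can sanity-check what each GPU actually negotiated after boot with a query like this sketch (standard nvidia-smi fields; AMD cards need a different tool such as GPU-Z):

Code:
rem Sketch: show the PCIe link width each GPU negotiated vs. its maximum
nvidia-smi --query-gpu=index,name,pcie.link.width.current,pcie.link.width.max --format=csv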
sr. member
Activity: 349
Merit: 250
Quick question, guys:

Is it okay to mix cards from different companies in one rig? I've got access to several 570s/580s from local stores, ranging from XFX, Sapphire, and Gigabyte... Wouldn't I encounter driver issues if I put all of them in one rig?

Go for it Smiley

Mixing Nvidia and AMD is even possible, but it's not usually recommended.
newbie
Activity: 38
Merit: 0
Quick question, guys:

Is it okay to mix cards from different companies in one rig? I've got access to several 570s/580s from local stores, ranging from XFX, Sapphire, and Gigabyte... Wouldn't I encounter driver issues if I put all of them in one rig?
full member
Activity: 322
Merit: 233

Man, that turned out good, but I see you inserted them into the mobo directly instead of mounting them horizontally in the extender frame...

Well, if I mounted them in the extender frame I would need risers, and the extender only holds 2, I think, not sure. If I need risers, I might as well do a riser build and not bother with the Thermaltake frame Wink

I really wish I could get a handful of these cases... but the company that makes them has no USA-based supplier, and it's $159 + $166 shipping to Florida for 1, or $318 + $290 for 2 of them... the shipping alone is unrealistic...


full member
Activity: 350
Merit: 100

Man, that turned out good, but I see you inserted them into the mobo directly instead of mounting them horizontally in the extender frame...

Well, if I mounted them in the extender frame I would need risers, and the extender only holds 2, I think, not sure. If I need risers, I might as well do a riser build and not bother with the Thermaltake frame Wink
full member
Activity: 322
Merit: 233

Phil, would this work with your 2x 1080 Ti setup + 1080 Mini instead of the plywood? Same efficiency?



Man, that turned out good, but I see you inserted them into the mobo directly instead of mounting them horizontally in the extender frame...
full member
Activity: 322
Merit: 233
@Vosk... Are you just posting the webpage address that your pic is on?

You need to click on your pic and then "copy image location" and paste that into IMG tags.

Code:
 [img]https://i.imgur.com/QZs1bOn.png[/img]

Doing this. Thanks, though.

@flminer...read my post #931 in this thread to fix your photo display problem Smiley

Think I just need to figure out what size to adjust to so it will be accepted. Saw Phil suggest this to someone, so I'll keep fiddling with the size till I get it. Wink

Appreciate the info anyway.



No problem with your photo size, see: Smiley







Quote
If you don't place an intensity in the batch file, what does it default to in the miner?

Read the text when ccminer first starts up; it will say what the intensity setting is. The default is around 27-29, which is too much for my computer.




Not sure what the deal is then, 'cause I used the insert-picture button on one and typed it in on the other like you said. It just doesn't want my pics, I guess  Huh

Had dropped my intensity down to 25 when I was having problems with the miner closing unexpectedly. Then I figured out I was using too much of my CPU for mining and choking down the PCIe lanes. I have stopped CPU mining altogether now and have the intensity back up to 27, slowly raising it to see how high I can get before crashing. With 5x 1080 Ti and 1x 1080, raising the intensity by one bumps the hashrate by 150 MH/s on skein.

Edit: just saw this at bottom of page....

http://imgur.com/a/Sni5B





I don't understand what or how you're using your CPU to mine... my CPU peaks at maybe 5% at most while the rig sits there mining away...
full member
Activity: 238
Merit: 100
Up to intensity 28 now; so far so good.
http://imgur.com/a/jbjfM
legendary
Activity: 1498
Merit: 1030
@philipma1957 or anyone else with 1070 experience

I'm building a 7x 1070 riser rig and have the GPU choice down to these two cards at roughly the same price (the G1 about US$15/card more than the Mini):

Gigabyte GTX 1070 Mini ITX OC
Gigabyte GTX 1070 G1 Gaming 8G

They both look like they have decent thermal management and Samsung RAM (going by reviews), but I wonder what other pros/cons I should consider?

The G1 clocks higher and should have a 180 W TDP vs. 155 W on the ITX.

I like my ITX cards (just put another one on order yesterday, intended to replace a 950), but they mostly shine in situations where you NEED a short card for thermal management - I don't recommend them for riser builds, as you can get more performance out of full-length cards.


