
Topic: BAMT - Easy persistent USB key based linux for dedicated miners/mining farms - page 13.

hero member
Activity: 518
Merit: 500
Have you considered a somewhat more recent kernel? It would be nice to see a 3.2.x kernel + display driver 12.1 + SDK 2.5. The 12.1 driver also supports kernel 3.3, but 3.2 is stable and fast.

Pro:
- More network chipsets are supported
- Many more wifi chipsets are supported
- Fewer problems with current mainboard chipsets
- Better speed on USB 3.0 ports (many use USB drives and don't just dd the image onto an SSD/HDD)
- Better support for current 6 Gbit SATA chipsets; some only work with a more recent kernel
- Fewer problems and better speed on Sandy Bridge
- ...

Cons:
- about 15 minutes of work (normally)


I am myself a kernel developer for a Linux distribution.


Greetings

PS: I definitely prefer the 64-bit version. I would like to use 8, and when possible 12 or more, GPUs on my rig. At the moment I am stuck with the problem that I can only use 7 GPUs and have to buy more boards + RAM + CPUs ...
I have no initial problem with using any given Linux version if it is proven reliable and does not negatively affect mining performance.  As BAMT is a very special-purpose Linux with only one real function, all choices are made based on how they affect that single role.

BAMT is almost exclusively used on USB keys, although it can be used from HDD.  Using a HDD is wasteful of power, and using an SSD for a mining rig would just be silly.  So.. support for SATA is not important.  I am not aware of anyone reporting that their network hardware isn't supported by the current kernel, nor am I aware of any issues with motherboard support that have been attributed to the kernel, but who knows.

For the mining software, it seems to be the general consensus that ATI driver 11.6 and SDK 2.4 offer the best fit/performance for a majority of GPUs.  Mining performance with the 12.x drivers suffers, as it does with the newer SDK.  Some cards like SDK 2.1, but it's only a few from my impression.

If this is incorrect, someone please point to reliable evidence to the contrary.  I am happy to put in whatever versions of driver/SDK/etc.; I just try to guess what will be best for most people.  Of course you can install any alternate driver or SDK you prefer at any time anyway.



The network stack in newer kernels has been optimized. On new AND old network cards you should see an improvement in the amount of work done in a given time X; compare that with the 2.6.32 kernel currently in use and you should see the difference. And it is not only the network optimization that would add speed; other parts add speed and stability as well.
I checked which kernel versions the 11.6 driver supports: it added support for 2.6.39 and 3.0.
With the current 3.0.21 (or as recent as possible from the Debian packages) you should get the most out of the 11.6 driver + 2.4 SDK combination.

And this might also fix the problems on 64-bit!

It would be nice if you could check it out soon.

Thanks a lot!
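
For what it's worth, pulling a newer kernel from the Debian packages looked roughly like this at the time. This is only a sketch assuming a squeeze base with backports enabled; the repo line and metapackage name are assumptions, so check what your release actually offers:

# add the backports repo, then look up and install a newer kernel image
echo "deb http://backports.debian.org/debian-backports squeeze-backports main" >> /etc/apt/sources.list
apt-get update
apt-cache search linux-image            # pick the 3.2.x image for your arch from this list
apt-get -t squeeze-backports install linux-image-amd64   # metapackage name assumed; substitute the one you found
reboot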

I think this is BS. Improvement in network speed? LOL.

2.6.32 is MUCH more stable than the 3.0 kernel, and for mining stability is essential and tested software is crucial. 3.0 is shaky ground.

If you don't like this policy then feel free to make your own distro, share it with us, and also maintain it while working for almost nothing ;)

Kernel 3.0 for mining? You must seriously be joking ...

hero member
Activity: 616
Merit: 506
Currently they don't; I think that is why he said "when possible".  There is no technical reason under 64-bit drivers (and their petabyte-sized VM space) that a system couldn't support 9+ GPUs.

Missed that. But I doubt AMD would lift this limit in the foreseeable future; people using more than 2-4 GPUs are already a niche.

GPGPU is becoming more common in analytical servers and supercomputer nodes.  Granted, these are niche markets, but hopefully the demands of industrial GPGPU users will force AMD to separate the drivers from xorg so the X server isn't even needed.

Yes. This would be HEAVEN if true. Why have the X server active? It is just dumb programming by the AMD bastards. Either Nvidia or this would make my day.

AFAIK there even seem to be people on the official AMD forums demanding this (something about Ubuntu 12.04 not using the X server, or something like that).

Maybe luck is with us and AMD removes this stupid "feature", or the Nvidia Kepler cards turn out to be excellent miners.

dropping X would be nice, but some X setups are much better than others..  when I first created BAMT I experimented with various ways to provide X and a window manager, or just X, etc.

Gnome = ruined for mining; it kills as much as 10% of hash on gpu0 and hurts the others as well just by running.
I tried a few of the lightweight WMs and also bare X.  The one I ended up using, LXDE, had no measurable loss compared to bare X.  The others did, although I must admit I didn't spend a lot of time tweaking anything once I noticed LXDE was essentially zero cost.
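
For the curious, a rough way to compare the overhead yourself, assuming xterm and the LXDE packages are installed (start one session, note the hash rate, then try the other):

# bare X with just a terminal, no window manager
xinit /usr/bin/xterm -- :0
# full LXDE session for comparison
xinit /usr/bin/startlxde -- :0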


hero member
Activity: 518
Merit: 500
Currently they don't; I think that is why he said "when possible".  There is no technical reason under 64-bit drivers (and their petabyte-sized VM space) that a system couldn't support 9+ GPUs.

Missed that. But I doubt AMD would lift this limit in the foreseeable future; people using more than 2-4 GPUs are already a niche.

GPGPU is becoming more common in analytical servers and supercomputer nodes.  Granted, these are niche markets, but hopefully the demands of industrial GPGPU users will force AMD to separate the drivers from xorg so the X server isn't even needed.

Yes. This would be HEAVEN if true. Why have the X server active? It is just dumb programming by the AMD bastards. Either Nvidia or this would make my day.

AFAIK there even seem to be people on the official AMD forums demanding this (something about Ubuntu 12.04 not using the X server, or something like that).

Maybe luck is with us and AMD removes this stupid "feature", or the Nvidia Kepler cards turn out to be excellent miners.
hero member
Activity: 616
Merit: 506
Currently they don't; I think that is why he said "when possible".  There is no technical reason under 64-bit drivers (and their petabyte-sized VM space) that a system couldn't support 9+ GPUs.

Missed that. But I doubt AMD would lift this limit in the foreseeable future; people using more than 2-4 GPUs are already a niche.

I am pretty sure there is no technical reason the 32-bit driver cannot support 8 GPUs too.. it runs 7 fine after all.  But who knows, modern PCs are complicated beasts.

In any case.. I suspect that what works and what doesn't comes down to some amount of luck, maybe more than to technical reasons.  I read somewhere that ATI only officially supports 3 or 4 GPUs, and anything more is not something they care to be officially involved in supporting.

If only the 32-bit driver would support that 8th GPU, I could make the people who don't want 64-bit happy and the people who do want 8 GPUs happy, and everything would be happy in the secret BAMT laboratory.
But it's been bugged since at least 10.12 or 11.1, the end of 2010/beginning of 2011, and it's still bugged in 12.1.  So... I don't think we're going to see that fixed any time soon.
donator
Activity: 1218
Merit: 1079
Gerald Davis
Currently they don't; I think that is why he said "when possible".  There is no technical reason under 64-bit drivers (and their petabyte-sized VM space) that a system couldn't support 9+ GPUs.

Missed that. But I doubt AMD would lift this limit in the foreseeable future; people using more than 2-4 GPUs are already a niche.

GPGPU is becoming more common in analytical servers and supercomputer nodes.  Granted, these are niche markets, but hopefully the demands of industrial GPGPU users will force AMD to separate the drivers from xorg so the X server isn't even needed.
legendary
Activity: 3472
Merit: 1724
Currently they don't; I think that is why he said "when possible".  There is no technical reason under 64-bit drivers (and their petabyte-sized VM space) that a system couldn't support 9+ GPUs.

Missed that. But I doubt AMD would lift this limit in the foreseeable future; people using more than 2-4 GPUs are already a niche.
donator
Activity: 798
Merit: 500
For the mining software, it seems to be the general consensus that ATI driver 11.6 and SDK 2.4 offer the best fit/performance for a majority of GPUs.  Mining performance with the 12.x drivers suffers, as it does with the newer SDK.  Some cards like SDK 2.1, but it's only a few from my impression.

If this is incorrect, someone please point to reliable evidence to the contrary.  I am happy to put in whatever versions of driver/SDK/etc.; I just try to guess what will be best for most people.  Of course you can install any alternate driver or SDK you prefer at any time anyway.

I agree; across all generations of GPU, 11.6 & 2.4 tend to be a good compromise.  5970s seem to like 2.1, but it is a small increase and 6000-series cards can't run it.

I can see an issue coming up as more 7000-series cards hit mining rigs, but I've run this on 5 different 5000- and 6000-series card models with no issues.  So I would say: if it ain't broke, don't fix it.
hero member
Activity: 616
Merit: 506
Sunday night a rig I had been running for a very long time went down. I was tired and not in the mood to reinstall 11.04, the ATI driver, OpenCL, a miner...

Then I thought about BAMT... It was my first experience with this project. I started the download and went for a quick shower and a snack downstairs. After that, I copied the image onto an 8 GB flash drive by following the instructions on the website.

sudo dd if=bamt_v0.4b.img of=/dev/sdc bs=4096 conv=notrunc,noerror

I logged into BAMT
ssh root@<ip of your miner>  //password is changeme
passwd //set a new password
vi /etc/bamt/bamt.conf
vi /etc/bamt/pools
/etc/init.d/mine restart

After a few seconds, I started to hear the fans going faster! Simply WOW. It worked the first time. Thank you, BAMT!!!

good to hear.

be sure to run the fixer:

(as root, always be root)

/opt/bamt/fixer

and hit 1 or Enter a lot.  maybe even read the notes, if you're bored.  several important bug fixes and many nice new features are in the fix line.
donator
Activity: 1218
Merit: 1079
Gerald Davis
For the mining software, it seems to be the general consensus that ATI driver 11.6 and SDK 2.4 offer the best fit/performance for a majority of GPUs.  Mining performance with the 12.x drivers suffers, as it does with the newer SDK.  Some cards like SDK 2.1, but it's only a few from my impression.

If this is incorrect, someone please point to reliable evidence to the contrary.  I am happy to put in whatever versions of driver/SDK/etc.; I just try to guess what will be best for most people.  Of course you can install any alternate driver or SDK you prefer at any time anyway.

I agree; across all generations of GPU, 11.6 & 2.4 tend to be a good compromise.  5970s seem to like 2.1, but it is a small increase and 6000-series cards can't run it.
donator
Activity: 1218
Merit: 1079
Gerald Davis
PS: I definitely prefer the 64-bit version. I would like to use 8, and when possible 12 or more, GPUs on my rig. At the moment I am stuck with the problem that I can only use 7 GPUs and have to buy more boards + RAM + CPUs ...

12 or more GPUs??? AFAIK AMD drivers only support 8 GPUs, or is it different on Linux? ;]

Currently they don't; I think that is why he said "when possible".  There is no technical reason under 64-bit drivers (and their petabyte-sized VM space) that a system couldn't support 9+ GPUs.
hero member
Activity: 637
Merit: 502
Sunday night a rig I had been running for a very long time went down. I was tired and not in the mood to reinstall 11.04, the ATI driver, OpenCL, a miner...

Then I thought about BAMT... It was my first experience with this project. I started the download and went for a quick shower and a snack downstairs. After that, I copied the image onto an 8 GB flash drive by following the instructions on the website.

sudo dd if=bamt_v0.4b.img of=/dev/sdc bs=4096 conv=notrunc,noerror

I logged into BAMT
ssh root@<ip of your miner>  //password is changeme
passwd //set a new password
vi /etc/bamt/bamt.conf
vi /etc/bamt/pools
/etc/init.d/mine restart

After a few seconds, I started to hear the fans going faster! Simply WOW. It worked the first time. Thank you, BAMT!!!
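
One tip if you try this: double-check which device node the flash drive got before running dd, since a wrong of= will happily overwrite a real disk. The /dev/sdc above is just whatever my system assigned; something like this helps confirm yours:

dmesg | tail     # the last lines show the name the kernel gave the key you just plugged in
sudo fdisk -l    # confirm the size listed for that device matches your flash drive
sync             # after dd finishes, flush buffers before pulling the key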
legendary
Activity: 3472
Merit: 1724
PS: I definitely prefer the 64-bit version. I would like to use 8, and when possible 12 or more, GPUs on my rig. At the moment I am stuck with the problem that I can only use 7 GPUs and have to buy more boards + RAM + CPUs ...

12 or more GPUs??? AFAIK AMD drivers only support 8 GPUs, or is it different on Linux? ;]
hero member
Activity: 616
Merit: 506
Have you considered a somewhat more recent kernel? It would be nice to see a 3.2.x kernel + display driver 12.1 + SDK 2.5. The 12.1 driver also supports kernel 3.3, but 3.2 is stable and fast.

Pro:
- More network chipsets are supported
- Many more wifi chipsets are supported
- Fewer problems with current mainboard chipsets
- Better speed on USB 3.0 ports (many use USB drives and don't just dd the image onto an SSD/HDD)
- Better support for current 6 Gbit SATA chipsets; some only work with a more recent kernel
- Fewer problems and better speed on Sandy Bridge
- ...

Cons:
- about 15 minutes of work (normally)


I am myself a kernel developer for a Linux distribution.


Greetings

PS: I definitely prefer the 64-bit version. I would like to use 8, and when possible 12 or more, GPUs on my rig. At the moment I am stuck with the problem that I can only use 7 GPUs and have to buy more boards + RAM + CPUs ...
I have no initial problem with using any given Linux version if it is proven reliable and does not negatively affect mining performance.  As BAMT is a very special-purpose Linux with only one real function, all choices are made based on how they affect that single role.

BAMT is almost exclusively used on USB keys, although it can be used from HDD.  Using a HDD is wasteful of power, and using an SSD for a mining rig would just be silly.  So.. support for SATA is not important.  I am not aware of anyone reporting that their network hardware isn't supported by the current kernel, nor am I aware of any issues with motherboard support that have been attributed to the kernel, but who knows.

For the mining software, it seems to be the general consensus that ATI driver 11.6 and SDK 2.4 offer the best fit/performance for a majority of GPUs.  Mining performance with the 12.x drivers suffers, as it does with the newer SDK.  Some cards like SDK 2.1, but it's only a few from my impression.

If this is incorrect, someone please point to reliable evidence to the contrary.  I am happy to put in whatever versions of driver/SDK/etc.; I just try to guess what will be best for most people.  Of course you can install any alternate driver or SDK you prefer at any time anyway.
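For anyone who does want to experiment, a manual Catalyst swap on a Linux box of that era went roughly like this. The filename below is the 11.6 installer as an example, this assumes X is stopped first, and it ignores whatever helper tooling BAMT itself may ship for this:

# as root, with X stopped (however your setup launches it):
sh ./ati-driver-installer-11-6-x86.x86_64.run    # run the installer for whichever version you downloaded
aticonfig --initial -f --adapter=all             # regenerate xorg.conf entries for every adapter
reboot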

hero member
Activity: 956
Merit: 1001
A bug that I've noticed (not sure if it's related to Linux):

If you plug a monitor into a video card while BAMT is running, everything works fine.  But when you unplug it, the web-status-thingy-ma-jig spits out an error until it's restarted.  It's a bunch of []{}\/ and |'s with a reference to status.pl inside it.

Anyone else have this issue?

By hooking up the monitor, are you moving or shifting the GPU card in the PCIe slot?
sr. member
Activity: 349
Merit: 250
So who / what is holding up the move to 64 bits? :D

I would love 32 bits but I don't wanna hold up that awesomeness that is 0.5!

The "hold up" I guess is that neither option seems very good.  Mostly got reports of bad experiences with a 64 bit image (exact same software as 32 bit, just 64 bit kernel, ATI libs, driver). 
Seemed to create lots of new problems for people.  Meanwhile we know the 32 bit driver has one specific problem: it only works with 7 gpus max.

So... both options kind of suck, that is the problem.  Not sure what to do about it.



Yeah, from what you described just there, I still think go for 32-bit for wider compatibility and stability.

Who has more than 7 GPUs in a single rig anyway!?

I have 8 GPUs in 1 rig...
hero member
Activity: 616
Merit: 506
Also #2, it is highly recommended that you assign static IP addresses using DHCP reservations if you must have them for some reason.  Maintaining a bunch of manually assigned and manually configured machines is a real waste of time, as you are probably finding out.

Yeah, DHCP reservations make node assignment much easier.  Every image is exactly the same and you have all the advantages of a static IP.  If you forget the IP assignments you can look them up in the router.

I haven't used it yet, but remote config combined with DHCP reservations = no machine-specific data on any image.  Slap a USB drive with the farm's "generic" image into any machine and it "should" just work.

yes, the new autoconf stuff is all based around the DHCP concept; no configuration needs to be done on any specific node at all.   using manually configured IPs (or manually configured *anything*) is a step in the wrong direction.
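
For reference, a reservation expressed in dnsmasq terms is a single line like the one below; the MAC, hostname, and IP here are made-up examples, and most consumer routers expose the same idea through their web UI instead:

# /etc/dnsmasq.conf on the DHCP server: always hand this MAC the same address
dhcp-host=00:11:22:33:44:55,miner01,192.168.1.101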
donator
Activity: 1218
Merit: 1079
Gerald Davis
Also #2, it is highly recommended that you assign static IP addresses using DHCP reservations if you must have them for some reason.  Maintaining a bunch of manually assigned and manually configured machines is a real waste of time, as you are probably finding out.

Yeah, DHCP reservations make node assignment much easier.  Every image is exactly the same and you have all the advantages of a static IP.  If you forget the IP assignments you can look them up in the router.

I haven't used it yet, but remote config combined with DHCP reservations = no machine-specific data on any image.  Slap a USB drive with the farm's "generic" image into any machine and it "should" just work.
hero member
Activity: 518
Merit: 500
So who / what is holding up the move to 64 bits? :D

I would love 32 bits but I don't wanna hold up that awesomeness that is 0.5!

The "hold up" I guess is that neither option seems very good.  Mostly got reports of bad experiences with a 64 bit image (exact same software as 32 bit, just 64 bit kernel, ATI libs, driver). 
Seemed to create lots of new problems for people.  Meanwhile we know the 32 bit driver has one specific problem: it only works with 7 gpus max.

So... both options kind of suck, that is the problem.  Not sure what to do about it.



Yeah, from what you described just there, I still think go for 32-bit for wider compatibility and stability.

Who has more than 7 GPUs in a single rig anyway!?
hero member
Activity: 616
Merit: 506
So who / what is holding up the move to 64 bits? :D

I would love 32 bits but I don't wanna hold up that awesomeness that is 0.5!

The "hold up" I guess is that neither option seems very good.  Mostly got reports of bad experiences with a 64 bit image (exact same software as 32 bit, just 64 bit kernel, ATI libs, driver). 
Seemed to create lots of new problems for people.  Meanwhile we know the 32 bit driver has one specific problem: it only works with 7 gpus max.

So... both options kind of suck, that is the problem.  Not sure what to do about it.

hero member
Activity: 518
Merit: 500
So who / what is holding up the move to 64 bits? :D

I would love 32 bits but I don't wanna hold up that awesomeness that is 0.5!