Topic: Demand for computing power (Read 209 times)

sr. member
Activity: 560
Merit: 250
January 16, 2020, 01:59:33 AM
#20
Hi,

I have a question about the global demand for computing power. Over the last few years, artificial intelligence and cryptocurrencies have been very popular topics. Millions of GPUs have been used for cryptocurrency mining. I also have a few GPU mining rigs at home.

What do you think about the prospects for cryptocurrency mining (GPUs/ASICs/FPGAs) over the long term (3, 5, or 10 years)?

Thanks
Demand will definitely be huge in the future, as Bitcoin's price is increasing and the halving event is also coming. GPU sales should go smoothly, because mining is a pretty great business: buying demand is huge and costs are low. I therefore think demand will grow, and this will be an opportunity for GPU suppliers, especially Nvidia.
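
For context on the halving mentioned above: the block subsidy halves every 210,000 blocks, which is easy to compute. A minimal Python sketch of the rule (a real node works in integer satoshis with a bit shift; floats here are just for illustration):

Code:
# Bitcoin block subsidy: 50 BTC at launch, halved every 210,000 blocks.
def block_subsidy(height: int) -> float:
    halvings = height // 210_000
    if halvings >= 64:   # after 64 halvings the subsidy is zero
        return 0.0
    return 50.0 / (2 ** halvings)

# The 2020 halving at block 630,000 cuts the subsidy from 12.5 to 6.25 BTC:
print(block_subsidy(629_999))  # 12.5
print(block_subsidy(630_000))  # 6.25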
copper member
Activity: 2940
Merit: 1280
https://linktr.ee/crwthopia
January 16, 2020, 12:59:34 AM
#19
The demand for computing power has boggled my mind as well, knowing that the difficulty is always increasing. So would it be feasible for miners to have a change of heart and dedicate their hash rate to another cryptocurrency? I think miners have the capacity to change the leading cryptocurrency, but we'll have to see.

Just like Hydrogen said, Moore's law is something to keep an eye on. Maybe there's a study on whether the increasing performance of GPUs and CPUs offsets the increase in mining difficulty? Perhaps someone could point one out.
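
On the difficulty question: Bitcoin retargets every 2016 blocks so that blocks keep arriving roughly every ten minutes however fast the hardware gets, so better GPUs and ASICs raise difficulty rather than outrun it. A simplified Python sketch of the adjustment rule (a real node works on the compact integer target, not a float difficulty):

Code:
# Bitcoin retargets difficulty every 2016 blocks, aiming at 10-minute blocks.
EXPECTED_TIMESPAN = 2016 * 600  # seconds in an ideal retarget window

def next_difficulty(old_difficulty: float, actual_timespan_s: float) -> float:
    # The protocol clamps the adjustment to a factor of 4 in either direction.
    clamped = max(EXPECTED_TIMESPAN / 4, min(actual_timespan_s, EXPECTED_TIMESPAN * 4))
    return old_difficulty * EXPECTED_TIMESPAN / clamped

# If hashrate doubles, the window closes twice as fast and difficulty doubles:
print(next_difficulty(1.0, EXPECTED_TIMESPAN / 2))  # 2.0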
hero member
Activity: 1120
Merit: 553
Filipino Translator 🇵🇭
January 16, 2020, 12:46:08 AM
#18
The most neglected aspect of the discussion is usually Moore's law: the number of transistors in a given area doubles roughly every two years. As the transistor density of GPUs and ASICs increases, performance per watt rises dramatically, leading to a double whammy: not only do new GPUs and ASICs perform better, they also consume less electricity.

True. About 2-3 years ago, top GPUs (RX 570/GTX 1080 Ti) were built on 14-16 nm process technology. Now AMD has released new GPUs on a 7 nm process (RX 5500/5700); they are more efficient and use less power. The same trend holds for Bitmain ASICs: the Antminer S9 (16 nm) in 2017 vs. the Antminer S17 (7 nm) in 2019.

What if the new AMD Threadripper were used to mine Bitcoin and other cryptocurrencies? Since mining faces problems, especially the efficiency of its electricity consumption, we could use today's more powerful rigs to mine Bitcoin. In addition, instead of increasing hash power, we could help the system itself consume less energy and process blocks more efficiently through different consensus algorithms.
copper member
Activity: 2324
Merit: 2142
Slots Enthusiast & Expert
January 16, 2020, 12:18:03 AM
#17
Is it possible to utilize the computing power of GPUs in a way other than cryptocurrency mining? For example, for artificial intelligence, scientific computing, graphics rendering, etc. In other words, is there someone in the world who wants to buy the computing power of a huge number of GPUs?
At the moment, AFAIK, a mining rig's GPUs can only be used for mining, and you can rent them out on NiceHash.
I don't think they can be used in other applications, since the GPU itself is not efficient. You need specialized hardware that does specific tasks efficiently to earn a profit.

A home PC and a commercial GPU do what's best for home users, but you need specialized hardware to perform better on the tasks mentioned.
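
For anyone weighing this up: GPU mining revenue is essentially your share of the network hashrate times the daily block rewards, minus electricity. Here's a toy Python estimate; every number in it is a made-up placeholder, not real market data:

Code:
# Toy daily mining-profit estimate; all inputs below are hypothetical.
def daily_profit(my_hashrate, net_hashrate, block_reward, blocks_per_day,
                 coin_price, rig_watts, power_price_kwh):
    revenue = (my_hashrate / net_hashrate) * block_reward * blocks_per_day * coin_price
    power_cost = rig_watts / 1000 * 24 * power_price_kwh
    return revenue - power_cost

# E.g. a 180 MH/s rig against a 180 TH/s network at $0.10/kWh:
print(daily_profit(180e6, 180e12, 2.0, 6500, 150.0, 900, 0.10))  # ~ -0.21, a small loss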
Ucy
sr. member
Activity: 2674
Merit: 403
Compare rates on different exchanges & swap.
January 14, 2020, 02:22:52 PM
#16
I guess it will be possible in the future to get paid to do serious decentralized/blockchain computations. It looks like that is where the crypto community is heading.
It's like hitting two targets with one stone.
legendary
Activity: 2100
Merit: 1058
January 14, 2020, 07:42:14 AM
#15
There have been a couple of coins that actually did something similar to this, and they failed to win contracts. As many of you know, the value of XRP comes from the deals Ripple makes with banks; well, technically it doesn't, but people believe that anyway.

So, if you have a coin that collects computing power from miners and rents it out to artificial intelligence companies, or to any company that actually needs computing power, it could make money and share that money with the people who contributed the power. That has been tried, but those coins lacked deals with the companies that need computing power, so there was no money-making opportunity; people moved back to things they knew would make money, and the projects died.
newbie
Activity: 20
Merit: 1
January 14, 2020, 02:53:04 AM
#14
The most neglected aspect of the discussion is usually Moore's law: the number of transistors in a given area doubles roughly every two years. As the transistor density of GPUs and ASICs increases, performance per watt rises dramatically, leading to a double whammy: not only do new GPUs and ASICs perform better, they also consume less electricity.

True. About 2-3 years ago, top GPUs (RX 570/GTX 1080 Ti) were built on 14-16 nm process technology. Now AMD has released new GPUs on a 7 nm process (RX 5500/5700); they are more efficient and use less power. The same trend holds for Bitmain ASICs: the Antminer S9 (16 nm) in 2017 vs. the Antminer S17 (7 nm) in 2019.
legendary
Activity: 2562
Merit: 1441
January 13, 2020, 08:30:26 PM
#13
There has been experimentation with server farms and data center technology. Years back, Google loaded a floating barge full of container-based servers to see if they could save on cooling costs by using water-based cooling.

https://en.wikipedia.org/wiki/Google_barges

The sector revolves around comparisons of FLOPS per watt, total FLOPS, those types of metrics.
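
As a quick illustration of that metric (the function name and both sets of card numbers here are hypothetical):

Code:
# FLOPS per watt is just throughput divided by power draw.
def flops_per_watt(tflops: float, watts: float) -> float:
    return tflops * 1e12 / watts

# A hypothetical older 11 TFLOPS / 250 W card vs a newer 9 TFLOPS / 150 W one:
print(f"{flops_per_watt(11, 250):.2e}")  # 4.40e+10
print(f"{flops_per_watt(9, 150):.2e}")   # 6.00e+10 -- more efficient despite fewer total FLOPS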

At the moment, nations like Iceland are investing in the server farm/data center industry on the basis that their cooler climate gives them an advantage in power-cost savings.

The most neglected aspect of the discussion is usually Moore's law: the number of transistors in a given area doubles roughly every two years. As the transistor density of GPUs and ASICs increases, performance per watt rises dramatically, leading to a double whammy: not only do new GPUs and ASICs perform better, they also consume less electricity.
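
To put rough numbers on that, here is a back-of-the-envelope sketch; it assumes ideal scaling, which marketing node names (16 nm, 7 nm) only loosely track:

Code:
# Back-of-the-envelope Moore's-law math: density doubling every ~2 years.
years = 3
density_gain = 2 ** (years / 2)   # ~2.8x more transistors per area in 3 years
# Naive area scaling between process nodes, assuming features shrink ideally:
node_gain = (16 / 7) ** 2         # ~5.2x going from a 16 nm to a 7 nm node
print(f"{density_gain:.1f}x in {years} years, ~{node_gain:.1f}x from 16 nm to 7 nm")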

It should also be mentioned that software carries good potential to improve the viability of fields like AI, as well as data centers and server farms. Code optimization and improved software-engineering paradigms could make a substantial difference, although I don't know that we'll see much of that, with branches like code optimization being a dying if not lost art.
newbie
Activity: 20
Merit: 1
January 13, 2020, 01:44:29 PM
#12
It would be nice if there were a second use for old ASIC chips and/or GPU cards, but most of these cards and chips have been pushed to their end, so the market for them would be small. ::)

That's why I'm asking about the demand for computing power :) It would be great to create a marketplace where people could buy computing power for AI, scientific calculations, and so on. Such a marketplace might help bring old GPUs back to life. The problem here is the demand for that computing power.

I bought GPU cards from some altcoin miners that went bust a few years ago, and I made a good profit selling them to gamers, because demand for gaming cards was high and supply was too low. Things have changed since then: demand for cards from altcoin mining has declined and manufacturers have increased supply, so the second-hand market was destroyed. :(

After cryptocurrency prices dropped, the used-GPU market was flooded with cheap cards from miners, so prices fell by more than half. I've also bought a few used GPUs for my mining rigs :)
legendary
Activity: 3542
Merit: 1965
Leading Crypto Sports Betting & Casino Platform
January 13, 2020, 08:40:38 AM
#11
It would be nice if there were a second use for old ASIC chips and/or GPU cards, but most of these cards and chips have been pushed to their end, so the market for them would be small. ::)

I bought GPU cards from some altcoin miners that went bust a few years ago, and I made a good profit selling them to gamers, because demand for gaming cards was high and supply was too low. Things have changed since then: demand for cards from altcoin mining has declined and manufacturers have increased supply, so the second-hand market was destroyed. :(
newbie
Activity: 20
Merit: 1
January 13, 2020, 08:29:56 AM
#10
I don't understand why you left CPU mining out. CPUs are everywhere, and there will be many, many more in the near future. They will have to be active, and the best way to be sure they are active is to have them do something. Mining is the perfect thing for them to do, much better than some useless test.

Sorry about that :) CPU mining is a good thing, but there are not many algorithms/cryptocurrencies designed for CPU mining. Yes, there is Monero/RandomX, but over the last few years the best profitability has been on GPUs/ASICs.
full member
Activity: 1442
Merit: 153
★Bitvest.io★ Play Plinko or Invest!
January 13, 2020, 07:32:35 AM
#9
GPUs are and were already used heavily for data mining. Not sure about deep-learning AI, but pattern matching and protein virtualization have both been used on GPUs by universities with success. Stanford had Folding@home, which you might be interested in looking at (folding is taking a large series of numbers and reducing it into one meaningful output). These tasks would take anywhere from around half an hour to around twelve hours and would run while your computer sat idle (reassigned after 2 weeks if still incomplete).

Similar GPU farms are used by big pharma in the US and by government research in the UK, but these probably won't resemble standard domestic GPUs.
We already develop technologies like computers, gadgets, weapons, and more, each invented for a reason. Computer hardware is built not only for computing itself but also for earning income, and GPUs are very popular for this: they are used for online mining and data mining. Many people who have a hard time at work do this, buying a powerful GPU and letting it do the data mining.
copper member
Activity: 2856
Merit: 3071
https://bit.ly/387FXHi lightning theory
January 13, 2020, 07:19:49 AM
#8
GPUs are and were already used heavily for data mining. Not sure about deep-learning AI, but pattern matching and protein virtualization have both been used on GPUs by universities with success. Stanford had Folding@home, which you might be interested in looking at (folding is taking a large series of numbers and reducing it into one meaningful output).

He is interested in the $ aspect; Folding@home is not an attractive option.

That's why I said looking at, NOT DOING!

It's a proof of concept in itself at least and demonstrates one use case...
legendary
Activity: 2912
Merit: 6403
Blackjack.fun
January 13, 2020, 05:54:53 AM
#7
GPUs are and were already used heavily for data mining. Not sure about deep-learning AI, but pattern matching and protein virtualization have both been used on GPUs by universities with success. Stanford had Folding@home, which you might be interested in looking at (folding is taking a large series of numbers and reducing it into one meaningful output).

He is interested in the $ aspect; Folding@home is not an attractive option.

Mining will stop when all the bitcoins are mined, so I think we won't see much of a rise in that sector :(

You've been around here for almost four years and you come up with something even new users know is stupid?
Seriously, do some research before typing something like this.

In other words, is there someone in the world who wants to buy the computing power of a huge number of GPUs?
Do you think it is enough for artificial intelligence tasks to buy a few GPUs, or to rent them somewhere in the cloud (AWS, GCP, etc.)?

Yes, there are companies that will buy computing power at any time.
But not from home users, and not 100 GPUs in some shed.
You would have to be a certified business, and you would have to provide proof of reliability, backup solutions, and much, much more.

What you can try is browsing through all the so-called rental websites where you rent out your GPU or CPU power for pennies once a month, and in 99% of cases you don't get paid.

legendary
Activity: 2730
Merit: 1288
January 12, 2020, 11:29:25 AM
#6
What do you think about the prospects for cryptocurrency mining (GPUs/ASICs/FPGAs) over the long term (3, 5, or 10 years)?

I don't understand why you left CPU mining out. CPUs are everywhere, and there will be many, many more in the near future. They will have to be active, and the best way to be sure they are active is to have them do something. Mining is the perfect thing for them to do, much better than some useless test.
hero member
Activity: 1890
Merit: 831
January 12, 2020, 09:45:58 AM
#5
All of them are already being used. Artificial intelligence is something that could bring catastrophic changes to society; we should know that even scientists are very scared of it. There was a robot displaying this, and the first thing the company did was shut it down. People are afraid that it would actually make things worse and, at the same time, try to dominate humans. I think computing power will keep rising, the way supercomputers keep getting built; consider that there was a time when people had 200 MB of RAM and 100 MB of ROM, and look at today.
It will of course rise, and this power is still being used for various things.
Mining will stop when all the bitcoins are mined, so I think we won't see much of a rise in that sector :(
hero member
Activity: 1750
Merit: 589
January 12, 2020, 09:38:46 AM
#4
Yes, I think they were originally made for such jobs, computers in general, that is: to solve or calculate innumerable equations and formulas that the human mind would take too long to work through, so that research and studies can progress. Unfortunately, for AI there isn't much to show, since the most advanced AI of today is just a bot with access to large data stores that it matches and searches for anything it is asked, which is good, but not good enough IMO. Mining farms should still stand for years to come, since constant development will probably keep increasing their speed, so I doubt mining will die anytime soon.
hero member
Activity: 2702
Merit: 672
I don't request loans~
January 12, 2020, 06:25:18 AM
#3
It's possible, but only in part. Think of a problem that would take humans an impossible amount of time to solve, but that a computer can handle, because it performs millions of calculations in a few minutes; in a sense, that was the idea behind the first computers. Humans think up the logic, and the GPU performs that logic over and over, millions of times. As for crypto mining, the process will probably continue at least at a minimum until the BTC supply runs out, i.e., around when the last halving occurs (probably).
copper member
Activity: 2856
Merit: 3071
https://bit.ly/387FXHi lightning theory
January 12, 2020, 05:21:40 AM
#2
GPUs are and were already used heavily for data mining. Not sure about deep-learning AI, but pattern matching and protein virtualization have both been used on GPUs by universities with success. Stanford had Folding@home, which you might be interested in looking at (folding is taking a large series of numbers and reducing it into one meaningful output). These tasks would take anywhere from around half an hour to around twelve hours and would run while your computer sat idle (reassigned after 2 weeks if still incomplete).

Similar GPU farms are used by big pharma in the US and by government research in the UK, but these probably won't resemble standard domestic GPUs.
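
The "large series of numbers reduced into one output" pattern described above is a fold/reduction, which is exactly what parallelizes well across GPU threads. A minimal plain-Python sketch of the shape (real Folding@home work units are molecular-dynamics simulations, not a simple sum; the sample values are invented):

Code:
from functools import reduce

# A fold/reduction: many inputs collapse into one result via an associative op.
samples = [0.21, 0.35, 0.18, 0.26]           # hypothetical per-chunk results
total = reduce(lambda acc, x: acc + x, samples, 0.0)
print(total)  # ~1.0
# Because addition is associative, chunks can be reduced in parallel across
# thousands of GPU threads and then combined at the end.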
newbie
Activity: 20
Merit: 1
January 12, 2020, 05:15:35 AM
#1
Hi,

I have a question about the global demand for computing power. Over the last few years, artificial intelligence and cryptocurrencies have been very popular topics. Millions of GPUs have been used for cryptocurrency mining. I also have a few GPU mining rigs at home.

Is it possible to utilize the computing power of GPUs in a way other than cryptocurrency mining? For example, for artificial intelligence, scientific computing, graphics rendering, etc. In other words, is there someone in the world who wants to buy the computing power of a huge number of GPUs?

Do you think it is enough for artificial intelligence tasks to buy a few GPUs, or to rent them somewhere in the cloud (AWS, GCP, etc.)?
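
To help frame the buy-vs-rent question, here is a crude break-even sketch; the card price, cloud rate, and power cost are all hypothetical:

Code:
# Crude buy-vs-rent break-even in days of 24/7 utilization; inputs are hypothetical.
def breakeven_days(gpu_price, cloud_rate_per_hour, local_power_per_hour=0.03):
    hourly_saving = cloud_rate_per_hour - local_power_per_hour
    return gpu_price / (hourly_saving * 24)

# E.g. a $700 card vs a $0.50/hour cloud GPU instance:
print(f"{breakeven_days(700, 0.50):.0f} days")  # ~62 days, ignoring depreciation and downtime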

What do you think about the prospects for cryptocurrency mining (GPUs/ASICs/FPGAs) over the long term (3, 5, or 10 years)?

Thanks