
Topic: delete (Read 9586 times)

hero member
Activity: 518
Merit: 500
October 22, 2011, 05:58:34 PM
The speed of light might not be the ultimate limit; they still haven't figured out whether that neutrino really moved under the known laws or whether something unexpected took place.

They didn't account for the relative motion of the satellite.

http://www.technologyreview.com/blog/arxiv/27260/?ref=rss
Okay, more OT since the original topic is boring anyway:

That report is HIGHLY dubious. To point out just one thing, AFAIK the OPERA team did not use GPS satellites for timing as claimed by the author; they used atomic clocks on the ground, and these atomic clocks were synchronized by moving a third atomic clock from one location to the other and back. The question of relativistic effects was raised at the first press conference, and the researchers were adamant ALL effects had been taken into account. Considering this was a 3-year experiment with highly unusual results, to assume the entire team forgot something as basic as... relativity, is just not likely. BTW, it also appears (if the internet is to be believed) that the author is a crackpot. I can't vouch for that, but given that he wrote the report alone and works for a "department of artificial intelligence", I wouldn't take this as fact until it's confirmed.
hero member
Activity: 616
Merit: 500
Firstbits.com/1fg4i :)
October 22, 2011, 04:27:46 PM
The speed of light might not be the ultimate limit; they still haven't figured out whether that neutrino really moved under the known laws or whether something unexpected took place.

They didn't account for the relative motion of the satellite.

http://www.technologyreview.com/blog/arxiv/27260/?ref=rss
Aww, that's disappointing...

I guess I was getting outdated info...
sr. member
Activity: 294
Merit: 252
October 22, 2011, 03:44:56 PM
The speed of light might not be the ultimate limit; they still haven't figured out whether that neutrino really moved under the known laws or whether something unexpected took place.

They didn't account for the relative motion of the satellite.

http://www.technologyreview.com/blog/arxiv/27260/?ref=rss
hero member
Activity: 616
Merit: 500
Firstbits.com/1fg4i :)
October 22, 2011, 03:22:33 PM
The speed of light might not be the ultimate limit; they still haven't figured out whether that neutrino really moved under the known laws or whether something unexpected took place.
hero member
Activity: 540
Merit: 500
The future begins today
October 22, 2011, 12:17:17 PM


Be careful: BTC-E is full of SolidCoin trolls, such as Ten98 and a few others.
For your safety, avoid them.
hero member
Activity: 518
Merit: 500
October 22, 2011, 10:36:12 AM
There are so many things that we could do better if only computers were another million times faster.

Actually, I'm not so sure. For the past decade I have been waiting for that "killer app" to make use of our computing resources, but it isn't happening. Ten years ago we heard all those promises of voice recognition, AI and what not. I'll see if I can find a funny slide I saw from Intel making 10-year predictions; I found it laughable at the time, and I think it's hilarious now. In reality those problems have proven to be software, not hardware, problems, and we haven't progressed much since 2000. Basically our PCs are still doing the exact same things they were doing 10 years ago, and a need for faster processing for the most part just isn't there. For the bulk of daily tasks, our current hardware, or even old hardware, is more than fast enough, given efficient software.

It's therefore no surprise that the current trends are away from high-end PCs, towards slower but more mobile, easier-to-use and usually more affordable devices. First the netbook rage, now smartphones, tablets, and upcoming smartbooks. For those tasks these machines are too slow to handle, there is always the cloud. Even gaming is moving to the cloud; look up OnLive or Otoy.

I'm willing to bet that in 10 years, the majority of us will use computing devices that are barely faster than our current desktops. They might even be slower. They will no doubt be more useful, have better software and connectivity, be more portable and tap into performance remotely when needed. But significantly better raw performance? I doubt it.

Anyway, that's not to say I declare Moore's Law dead yet. It will, however, be used primarily to obtain better energy efficiency. The same chips that power your iPhone or Android phone will end up in huge numbers in datacenters, and those datacenters will do ever more of our number crunching. PCs will become like Unix workstations: rare (and expensive) dinosaurs.
donator
Activity: 1218
Merit: 1079
Gerald Davis
October 22, 2011, 09:06:09 AM
That limit is quite simply, the speed of light.
At 4GHz a wave can only travel ~75mm each cycle, so the paths within a CPU have an upper bound on length that is already close to needing to be taken into consideration.

Also, IIRC the propagation speed of electricity in microprocessors is slower than the speed of light, so that size limit is actually smaller.  Still, thanks for reminding me of the propagation limit.  Even if you could take a chip to 6GHz, 8GHz, or 10GHz, it would have to be an incredibly small chip to reach speeds that high.
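The figures being discussed here are easy to check: distance per cycle is just c divided by the clock frequency. A quick sketch (the 0.5c on-chip signal speed is an assumption for illustration; the real fraction depends on the process and interconnect, not a figure from this thread):

```python
C = 299_792_458       # speed of light in vacuum, m/s
SIGNAL_FRACTION = 0.5 # assumed fraction of c for on-chip signals (illustrative)

def distance_per_cycle_mm(freq_hz, fraction=1.0):
    """Maximum distance a signal can travel in one clock cycle, in mm."""
    return (C * fraction / freq_hz) * 1000

for ghz in (4, 6, 8, 10):
    f = ghz * 1e9
    # At 4 GHz this prints ~75 mm in vacuum, matching the figure above;
    # at 10 GHz the vacuum bound already shrinks to ~30 mm.
    print(f"{ghz} GHz: {distance_per_cycle_mm(f):.1f} mm in vacuum, "
          f"~{distance_per_cycle_mm(f, SIGNAL_FRACTION):.1f} mm on-chip")
```

As the loop shows, the bound shrinks linearly with frequency, which is the point being made: higher clocks force smaller dies.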

Which is why chips will likely stay in the 3GHz to 4GHz range for the next couple of generations and use architectural improvements, increased high-speed cache, and more cores to COST EFFECTIVELY raise computational power.  Bulldozer has 8/4 cores (8 integer cores and 4 shared floating-point cores).  The next-gen Bulldozer planned for 2013 will have 12/6 cores.

People sometimes forget Moore's law is about doubling transistor count AT THE LOWEST COST.  Sure, you could make a chip that is 4000 cores running at 6GHz and the size of a pizza box, but that would have nothing to do with Moore's law, since the yield would be incredibly bad and the cost astronomical.
sr. member
Activity: 294
Merit: 252
October 21, 2011, 06:12:55 PM
Who knows what the next one will be.

I'm thinking 3D chip features.
legendary
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
October 21, 2011, 05:14:46 PM
Actually the interesting (and obvious) issue with Moore's law is that, of course, with a single technology there will eventually be a convergence to some physical limit.
However, the solution to that problem is a change in direction or a change in technology.
CPUs already do have a processing-speed limit.
That limit is, quite simply, the speed of light.
At 4GHz a wave can only travel ~75mm each cycle, so the paths within a CPU have an upper bound on length that is already close to needing to be taken into consideration.
The solution that GPU mining already shows is multiple cores.
The next batch of CPUs from ATI are already out with 8 CPU cores.
The top 69xx ATI GPUs have roughly 1000 cores in them.
That's the current solution to the issue.
Who knows what the next one will be.
full member
Activity: 154
Merit: 100
October 21, 2011, 05:10:52 PM
Beyond two decades is harder to know because it will require a complete shift from silicon chips to something else.  Still, taking this full circle back to the original tangent: computing power will continue to grow exponentially over the next two decades.  Any economic model that assumes otherwise won't survive the short term to reach any long term where computing-power growth slows down.

I've been following death and viper's discussion. While I think y'all essentially agree on the curve Moore's Law will take, I also can't wrap my head around how betting against Moore's Law in the short term with SC's inflation algorithm makes any sense.
donator
Activity: 1218
Merit: 1079
Gerald Davis
October 21, 2011, 05:02:13 PM
If Moore's law is broken it will be because of an inability to sustain it, not because of a lack of desire.
There are so many things that we could do better if only computers were another million times faster.

OK, one last post since I'm an addict... THIS was what I was trying to argue, alongside where that unsustainability will come from... desire was nowhere in my argument; you implied it.

I didn't imply anything; you said the human race will turn away from increasing computing performance to pursue other things.

I also conditioned that Moore's law wouldn't be broken in the near term.  Eventually we won't be able to sustain it, but I think we've got another two decades at least.  There is sufficient demand that the prize for faster and cheaper computing power means it will get sufficient resources to solve problems that might derail it.

Beyond two decades is harder to know because it will require a complete shift from silicon chips to something else.  Still, taking this full circle back to the original tangent: computing power will continue to grow exponentially over the next two decades.  Any economic model that assumes otherwise won't survive the short term to reach any long term where computing-power growth slows down.
donator
Activity: 1218
Merit: 1079
Gerald Davis
October 21, 2011, 04:52:56 PM
If Moore's law is broken it will be because of an inability to sustain it, not because of a lack of desire.
There are so many things that we could do better if only computers were another million times faster.

I don't think you will find any researcher in any field using parallel computing who would say "yup, we would be able to do X faster if we slowed down computing growth and focused on X".

It is because of massive computational growth that we are able to advance X (yes, I used a variable because it applies almost universally to any research).  Every potential problem you named is solved FASTER, not SLOWER, with a massive increase in global computing power.

sr. member
Activity: 294
Merit: 252
October 21, 2011, 04:43:54 PM
Really? You think at some point we're just going to say "naaaah, computers are fast enough, let's not bother making them better"?

lol

Nope, no one will say it; it will just slow down for the very reasons I have already described....

Now c'mon, you're just trolling now.

Are you talking about your flawed idea of how humans research and develop (faster computers and better models will allow us to develop those other fields better and faster), or your flawed understanding of paradigms of accelerating change (when 2D transistors can no longer be shrunk, we'll be on to the next paradigm)?
donator
Activity: 1218
Merit: 1079
Gerald Davis
October 21, 2011, 04:43:21 PM
There you go; while not understanding what I am saying, you give another possible example of what I am talking about: if much of our human resources are directed at bio-engineering, there will be less thrown at pure computing horsepower breakthroughs. And while computational power will likely be needed for certain future studies such as bio-engineering, if the focus of humanity drifts far enough away, Moore's law becomes broken.

Hardly.

1) The entire human race in aggregate is capable of researching more than one thing at a time.  As a % of global GDP, the amount of R&D spent on faster chips is a rounding error.  We managed to get a million-fold increase in computation power per unit of cost, and it didn't require some huge fraction of global research.

2) The only reason we are even beginning to unlock the secrets of bio-engineering is because of massively parallel supercomputers.  With computers a billion times faster, we are MORE likely, not less likely, to make meaningful breakthroughs.  If anything, a stagnation in usable computing power means a stagnation in global research.  Not just in bio-engineering but in dozens of other sciences.

Quote
But I think energy will likely be the next big revolution/age to happen, because energy is the crucial resource necessary to feed and operate an overpopulated world, and to help take humanity beyond the borders of our skies.  We'll likely see wind, hydro and solar energy sources nearing 100% efficiency, smarter and safer harnessing of nuclear energy (fission and maybe fusion), space-based energy farms, smaller and stronger battery systems, superconductor breakthroughs, the reemergence of safe wireless energy transfer, new propulsion systems very much unlike what we see in the various vehicles we have today, the harnessing of gravitational energy... who knows; the mind of humans knows no limits.

All of which will require more and more and more computing power for cheaper and cheaper and cheaper.


Then again, the Fermi Paradox suggests that on a long enough timeline it may not matter (however, that is just too depressing, so I choose to discount it) :D
donator
Activity: 1218
Merit: 1079
Gerald Davis
October 21, 2011, 04:38:43 PM
Yeah, utter foolishness.

The modern world is only possible because computers have gotten 1 MILLION times faster than the 8080 microprocessor built by Intel 40 years ago.  Entire tech trees have opened up in every possible field because of that explosion in computational power.  The fact that he considers it "the frivolity of buying the latest and greatest computer tech" is a luddite point of view.

Those "alternate forms of science" are already ongoing.  Massively parallel computing power has led to a better understanding of the human brain, mapping of the human genome, better understanding of our planet, and decoding data from radio telescopes.  Those are just the "grand projects".

My life is improved because I have a TiVo and don't need to time my day around a TV schedule.  Without a million-fold increase in computational power, even if someone had theorized the idea of a TiVo, it would never have been practical or feasible.

The idea that we will STOP DOING THAT COMPUTER THING and concentrate on other stuff is stupid.  Sorry, can't think of a nicer word.  We have made breakthroughs at an astonishing rate BECAUSE OF COMPUTING POWER, and more (hopefully millions or billions of times more) computing power in the future will enable a faster, not slower, rate of discovery.
legendary
Activity: 1764
Merit: 1015
October 21, 2011, 04:33:48 PM
#99
the world will enter a new age where human resources are directed at a new frontier, not straight up computing horse power

Really? You think at some point we're just going to say "naaaah, computers are fast enough, let's not bother making them better"?

lol
I think there will be a point where computers will be "infinitely fast".
Or so I have read. :)
sr. member
Activity: 294
Merit: 252
October 21, 2011, 04:29:05 PM
#98
the world will enter a new age where human resources are directed at a new frontier, not straight up computing horse power

Really? You think at some point we're just going to say "naaaah, computers are fast enough, let's not bother making them better"?

lol
donator
Activity: 1218
Merit: 1079
Gerald Davis
October 21, 2011, 04:00:01 PM
#97
I have a hard time doing something on this scale without a Carl Sagan-ish view of the future and without leaving something realistic from which they can grow.

Do you believe that we will have as much technological progress from now until 2111 as we did from 1911 until now?

I know it was directed at lemonade man, but given that technological progress is accelerating, we should have far more progress in the next century than in the prior one.  A large component of that is the free flow of information.  From the printing press -> mass printing -> digital records -> computer networks, the ease with which information can be shared is continually increasing.  Social changes like the rise of open source and distributed projects fit into that too.  There is less re-inventing of the wheel, more forward progress.

Personally I think we are at the Commodore 64 stage in the rise of bio-engineering.  100 years from now people will look back and consider our understanding of bio-chemistry to be so primitive.  I mean, look at pharmaceuticals today.  We try a bunch of compounds; most do nothing, some work but not well, others are dangerous; maybe 1 in 10,000 is useful and safe enough to market.  The reason is that our understanding of how various compounds affect the body is very limited, so it is more a "poke it and see what happens" model of research.
sr. member
Activity: 294
Merit: 252
October 21, 2011, 03:48:54 PM
#96
I have a hard time doing something on this scale without a Carl Sagan-ish view of the future and without leaving something realistic from which they can grow.

Do you believe that we will have as much technological progress from now until 2111 as we did from 1911 until now?
member
Activity: 112
Merit: 11
Hillariously voracious
October 21, 2011, 03:24:08 PM
#95
So, as I originally stated, I think the most PROBABLE scenario is that we continue to have exponential growth for the near term.  Any economic model should be based on the most probable scenario, not the collapse of civilization as we know it.

So my assumptions about your beliefs are wrong, but the reality is even worse... you are only planning for the now and current, not something viable for hundreds of years and hopefully longer...  I don't let short-term greed and the impulse factor cloud my judgement that the world needs something better that will last well beyond my lifetime.  Making a buck now is great, but giving something that gives my great-great-grandchildren a better chance at a good economic future and freedom is far greater.

With all due respect, planning thirty years ahead is problematic to the point of intractability.

Planning for hundreds of years ahead borders on delusional: consider the world of 1911, and you will realize that our not-so-ancient past is more alien to us than most "alien civilizations" contrived by science-fiction authors (even good authors), and this process is accelerating.

It is not unlikely that our grandchildren will be almost incapable of relating to us and our concerns due to changes in lifestyle and psychology (the inverse will also be true).