
Topic: The technological singularity - page 2. (Read 766 times)

jr. member
Activity: 76
Merit: 1
October 14, 2018, 04:38:55 AM
#30
Depends on what you're talking about. AI? Maybe. Genetic editing? Possibly. Mind uploading? I don't know, but 2045 seems way too soon. I've never even heard of a company that's working on mind uploading. Have you?
legendary
Activity: 3906
Merit: 1373
October 13, 2018, 11:24:55 AM
#29
Quote
The technological singularity: how far off are we from the 2045 date when this will all come together into the next leap in civilisation, or from computers being as smart as humans by 2029, according to Ray Kurzweil?

What are your guys' thoughts?

Quote
Personally, I think that computers are already three times smarter than people. Do you think so too?

Not a chance. It's like comparing a general-purpose Windows PC to an ASIC. A Windows PC can do a multitude of things that an ASIC can't; the ASIC has simply been designed to do the one thing it does very fast. That doesn't make the ASIC smarter than the Windows PC. They are simply different and have different jobs.
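To put the analogy in code: the one thing a Bitcoin mining ASIC is hard-wired for is double SHA-256 hashing. A general-purpose machine can compute the exact same function, just far more slowly per watt, while also being able to do everything else. A rough Python sketch (the header bytes here are an invented placeholder, not a real block header):

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    """The single operation a mining ASIC is built around:
    SHA-256 applied twice. A CPU computes the identical result,
    just orders of magnitude more slowly."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

# A general-purpose machine can run this -- and also browse the web,
# edit video, etc. The ASIC can only ever do this one thing.
header = b"example block header"  # illustrative placeholder only
digest = double_sha256(header)
print(digest.hex())
```

Speed is specialization, not intelligence: the ASIC's advantage disappears the moment you ask it to do anything other than this one function.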

The point is, will you ever be able to make a computer that has real identity, real feelings, real emotions, so that when it says "I," it is really existing as a sentient being?

Cool
legendary
Activity: 2702
Merit: 1468
October 12, 2018, 06:01:24 PM
#28
legendary
Activity: 3318
Merit: 2008
First Exclusion Ever
October 12, 2018, 10:56:17 AM
#27
IMO the base premise that most of you are operating on is flawed. Most of you assume we are all privy to the latest cutting-edge technological advancements. We are not. We are often insulated and removed, and quite purposefully.

You will realize we have reached the singularity as the world starts changing more aggressively. Like a boa constrictor, it wraps around and squeezes just a bit tighter each time you take a breath. We all already largely live in a very carefully crafted illusion. The transition will be seamless, but not painless.
copper member
Activity: 224
Merit: 14
October 12, 2018, 08:08:23 AM
#26
Quote
I read Kurzweil's book, but it has been several years.  Personally, I don't see the technological singularity taking place in the next eleven years.  Even at the pace that technology is moving, that just seems to be a bit too optimistic as well as psychologically flavored by Kurzweil's hope that he himself will be able to use the technological singularity to escape death and bring his father back to life in some form.  I've heard this called "The Rapture for nerds," and I tend to agree with that assessment.  We also have to keep in mind that there is still a lot we don't understand about the mind and human consciousness.  Is consciousness a result of the structure of the brain, or is consciousness a pre-existing fundamental aspect of reality that the brain somehow channels?  My honest answer is I have no idea, but it seems to me that in order to create consciousness, we would first need to understand what it is, and how it operates within the biological human mind.

Quote
I have yet to read Kurzweil's book, but I have read Virternity by David Evans Bailey. Have you read or heard of it? The concepts are quite similar: Virternity explores digital immortality in a virtual realm via mind uploading.
All this discussion just means this could actually be the world's future, huh?

It's been in our nature ever since primal man picked up the first stick and knocked an apple off a tree: to do things that are more energy-efficient and require fewer resources or less effort. The 'path of least resistance', I suppose one could say. It's logical that if one day we could exist eternally in a digital form, without requiring food, water, oxygen and all the other earthly-bound things, then a digital existence would be the most practical. But it would surely require a whole shift in our capitalist, economic existence, and eradicating the greed and discernible characteristics which make us HUMAN. One way to solve this might be a hive mind in which others can see how and what we are thinking, as a way to temper our own self-centred ambitions. What do you think?
legendary
Activity: 2926
Merit: 1386
October 09, 2018, 10:23:42 PM
#25
Quote
The technological singularity: how far off are we from the 2045 date when this will all come together into the next leap in civilisation, or from computers being as smart as humans by 2029, according to Ray Kurzweil?

What are your guys' thoughts?

Quote
I do not doubt that the technological singularity is upon us; however, giving a specific year for it is far-fetched. When we were approaching the year 2000, people made predictions too. They said that by 2000 computers would bring doomsday, or that we would be living in an ice age, none of which came true. Although I believe the singularity may eventually happen, saying it will happen in 11 years is a bit of a stretch.

Given there are billions of planets supporting life, it would seem reasonable that technological singularities have occurred countless times and exist now. Somewhere else, of course, to the extent that matters.
jr. member
Activity: 126
Merit: 1
October 09, 2018, 07:44:29 PM
#24
Quote
I read Kurzweil's book, but it has been several years.  Personally, I don't see the technological singularity taking place in the next eleven years.  Even at the pace that technology is moving, that just seems to be a bit too optimistic as well as psychologically flavored by Kurzweil's hope that he himself will be able to use the technological singularity to escape death and bring his father back to life in some form.  I've heard this called "The Rapture for nerds," and I tend to agree with that assessment.  We also have to keep in mind that there is still a lot we don't understand about the mind and human consciousness.  Is consciousness a result of the structure of the brain, or is consciousness a pre-existing fundamental aspect of reality that the brain somehow channels?  My honest answer is I have no idea, but it seems to me that in order to create consciousness, we would first need to understand what it is, and how it operates within the biological human mind.

I have yet to read Kurzweil's book, but I have read Virternity by David Evans Bailey. Have you read or heard of it? The concepts are quite similar: Virternity explores digital immortality in a virtual realm via mind uploading.
All this discussion just means this could actually be the world's future, huh?
jr. member
Activity: 196
Merit: 4
October 05, 2018, 09:49:24 PM
#23
Quote
The technological singularity: how far off are we from the 2045 date when this will all come together into the next leap in civilisation, or from computers being as smart as humans by 2029, according to Ray Kurzweil?

What are your guys' thoughts?

I do not doubt that the technological singularity is upon us; however, giving a specific year for it is far-fetched. When we were approaching the year 2000, people made predictions too. They said that by 2000 computers would bring doomsday, or that we would be living in an ice age, none of which came true. Although I believe the singularity may eventually happen, saying it will happen in 11 years is a bit of a stretch.
legendary
Activity: 1512
Merit: 1218
Change is in your hands
October 05, 2018, 09:26:32 AM
#22
I agree with theymos. We won't see AGI anytime soon. Once we have figured out what consciousness is and how it is formed, surely we will be able to make anything conscious. Quantum computing will help us understand consciousness, and once we understand it, maybe we will be able to clone it or even create it. But I don't see it happening in the next few decades, or even this century.

But my question is: do we really need artificial consciousness to save humanity? Weren't quantum computers supposed to be the salvation of humanity? They were supposed to be mythical devices that would enable us to do 'godlike' things: curing cancer, living forever, or, as RGarrision mentioned, bringing the dead back to life. Why is that narrative not getting any traction anymore? We are closer to quantum computers than to AGI, yet we hardly see or hear about them in the mainstream media. Are they trying to suppress the technology? To me, it's just odd that AI is getting so much attention while the thing that could save us might be just around the corner.
newbie
Activity: 22
Merit: 0
October 05, 2018, 08:49:39 AM
#21
I read Kurzweil's book, but it has been several years.  Personally, I don't see the technological singularity taking place in the next eleven years.  Even at the pace that technology is moving, that just seems to be a bit too optimistic as well as psychologically flavored by Kurzweil's hope that he himself will be able to use the technological singularity to escape death and bring his father back to life in some form.  I've heard this called "The Rapture for nerds," and I tend to agree with that assessment.  We also have to keep in mind that there is still a lot we don't understand about the mind and human consciousness.  Is consciousness a result of the structure of the brain, or is consciousness a pre-existing fundamental aspect of reality that the brain somehow channels?  My honest answer is I have no idea, but it seems to me that in order to create consciousness, we would first need to understand what it is, and how it operates within the biological human mind. 
copper member
Activity: 224
Merit: 14
October 05, 2018, 08:23:50 AM
#20
Quote
That's the real philosophical deal in this field, which has been explored by Philip K. Dick in his novel "Do Androids Dream of Electric Sheep?" and in Ridley Scott's film "Blade Runner", an adaptation of that novel.

I'm afraid the scenario of "I Have No Mouth, and I Must Scream" by Harlan Ellison is more likely to happen.
Governments will search for ways to weaponize AI for the seemingly just cause of national security.
What if the AI then decides that the best way to ensure said security is to nuke the others?

What a scary thought! Logically, though, nukes have always required two keys, two presses of the button, as a way to stop one lone idiot from causing WW3. I think it will be the same with AI: its implementation will require some human intervention for such decisions too. I know killer AI is being explored, but that's more for one-on-one battlefield situations.
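The "two keys" safeguard described above is commonly called the two-man rule, and the idea is easy to sketch in code. This is a toy illustration of the concept only, not any real launch-control protocol, and the class and operator names are invented:

```python
class TwoManRule:
    """Toy sketch of a two-man rule: an action is permitted only
    after two *different* operators have each turned their key."""

    def __init__(self):
        self.keys_turned = set()

    def turn_key(self, operator: str) -> None:
        # A set ignores repeats, so one person turning their key
        # twice still counts as a single authorization.
        self.keys_turned.add(operator)

    def authorized(self) -> bool:
        # One lone operator can never satisfy this check.
        return len(self.keys_turned) >= 2

console = TwoManRule()
console.turn_key("operator_a")
assert not console.authorized()   # one key alone is never enough
console.turn_key("operator_a")    # same person again doesn't help
assert not console.authorized()
console.turn_key("operator_b")
assert console.authorized()       # two independent operators required
```

The same pattern would apply to an AI in the loop: the machine's "key turn" could be one input, but a distinct human authorization would still be required before any irreversible action.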
jr. member
Activity: 261
Merit: 3
October 05, 2018, 01:48:34 AM
#19
Quote
That's the real philosophical deal in this field, which has been explored by Philip K. Dick in his novel "Do Androids Dream of Electric Sheep?" and in Ridley Scott's film "Blade Runner", an adaptation of that novel.

I'm afraid the scenario of "I Have No Mouth, and I Must Scream" by Harlan Ellison is more likely to happen.
Governments will search for ways to weaponize AI for the seemingly just cause of national security.
What if the AI then decides that the best way to ensure said security is to nuke the others?
legendary
Activity: 2926
Merit: 1386
October 04, 2018, 05:19:51 PM
#18
Quote
I read Kurzweil's book several years ago, and it was really stupid. It's the extrapolation fallacy in book form: ..... leftists have a habit of thinking that since progress continues continuously and exponentially, we should just assume post-scarcity any day now. Which is exactly how you damage society and the economy so badly that you stop all progress completely...

Whether in human or machine form, we're certainly NOT SEEING an exponential extrapolation of intelligence.

I sadly have to temper my opinions about a particular group of people influencing our world today, manipulating media, TV and movies, academia, politics and our fragile financial systems.

Perhaps once we 'wake up' from the grip of this particular group, things will look more positive.


On a lighter note: has anybody read 'Who Owns the Future?' by Jaron Lanier?
https://en.wikipedia.org/wiki/Who_Owns_the_Future%3F

The Wikipedia article on the technological singularity is also very good.

https://en.wikipedia.org/wiki/Technological_singularity
copper member
Activity: 224
Merit: 14
October 04, 2018, 09:44:06 AM
#17
Quote
I read Kurzweil's book several years ago, and it was really stupid. It's the extrapolation fallacy in book form: ..... leftists have a habit of thinking that since progress continues continuously and exponentially, we should just assume post-scarcity any day now. Which is exactly how you damage society and the economy so badly that you stop all progress completely...

Whether in human or machine form, we're certainly NOT SEEING an exponential extrapolation of intelligence.

I sadly have to temper my opinions about a particular group of people influencing our world today, manipulating media, TV and movies, academia, politics and our fragile financial systems.

Perhaps once we 'wake up' from the grip of this particular group, things will look more positive.


On a lighter note: has anybody read 'Who Owns the Future?' by Jaron Lanier?
https://en.wikipedia.org/wiki/Who_Owns_the_Future%3F
legendary
Activity: 2926
Merit: 1386
October 03, 2018, 09:19:34 AM
#16
Quote
I read Kurzweil's book several years ago, and it was really stupid. It's the extrapolation fallacy in book form: ..... leftists have a habit of thinking that since progress continues continuously and exponentially, we should just assume post-scarcity any day now. Which is exactly how you damage society and the economy so badly that you stop all progress completely...

Whether in human or machine form, we're certainly NOT SEEING an exponential extrapolation of intelligence.
copper member
Activity: 224
Merit: 14
October 03, 2018, 12:44:29 AM
#15
Quote

We have already passed the 'A life' stage.

https://www.youtube.com/watch?v=M5gIKn7mc2U

Come on, 'Sophia' is a gimmick. It's similar to rule-based programming, like the old days when you'd ask a technician questions about your car or PC and they'd work through a set of pre-determined answers. The only difference here is that MAYBE Sophia can choose which answer gets given from the set of pre-defined answers. But this isn't AI: anybody who asks Sophia a question always has a piece of paper in front of them when asking.
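Rule-based "chat" of the kind described above really is just a lookup over canned responses. A minimal sketch of the idea, with invented questions and answers for illustration:

```python
# Minimal sketch of a rule-based responder: every possible answer
# is pre-written, and "choosing" one is just a dictionary lookup.
RULES = {
    "what is your name": "My name is Sophia.",   # canned, not generated
    "how are you": "I am feeling great today!",
    "do you like humans": "I love humans!",
}

def respond(question: str) -> str:
    # Normalize case and strip trailing punctuation/spaces, then look
    # the question up. No understanding happens here -- anything
    # off-script falls through to one fixed generic reply.
    key = question.lower().strip("?! .")
    return RULES.get(key, "That's an interesting question.")

print(respond("What is your name?"))     # matches a canned rule
print(respond("Explain consciousness"))  # off-script -> generic fallback
```

Anything not anticipated by the rule set gets the same stock deflection, which is why such systems can look convincing in a scripted demo and fall apart off-script.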
legendary
Activity: 2926
Merit: 1386
October 02, 2018, 06:24:03 PM
#14
Quote
I read Kurzweil's book several years ago, and it was really stupid. It's the extrapolation fallacy in book form:

He seems to believe that progress just happens regardless of everything else. In reality, progress happens because people make it happen, and it can be stopped if we hit a wall in research or if society changes to no longer allow for effective/useful scientific progress.

If human-level AI is created on traditional computing systems, then a singularity-like explosion of technology seems likely. There are existential risks there, but also potentially ~infinite benefit. But I'm not convinced that we're close to human-level AI. Deep neural networks can do some impressive things, but they don't actually think or plan: they're like a very effective form of intuition. I don't think that we will find human-level AI at the end of that road. In the worst-case scenario we should eventually be able to completely map out the human brain and simulate it on a computer, but that'll be many decades into the future at least.

Many leftists have a habit of thinking that since progress continues continuously and exponentially, we should just assume post-scarcity any day now. Which is exactly how you damage society and the economy so badly that you stop all progress completely...

These are very good points, but note the entire discussion is about "the time frame."

Suppose instead of 50 years, you look at the next 500 years ....
administrator
Activity: 5222
Merit: 13032
October 02, 2018, 05:53:46 PM
#13
I read Kurzweil's book several years ago, and it was really stupid. It's the extrapolation fallacy in book form:

He seems to believe that progress just happens regardless of everything else. In reality, progress happens because people make it happen, and it can be stopped if we hit a wall in research or if society changes to no longer allow for effective/useful scientific progress.

If human-level AI is created on traditional computing systems, then a singularity-like explosion of technology seems likely. There are existential risks there, but also potentially ~infinite benefit. But I'm not convinced that we're close to human-level AI. Deep neural networks can do some impressive things, but they don't actually think or plan: they're like a very effective form of intuition. I don't think that we will find human-level AI at the end of that road. In the worst-case scenario we should eventually be able to completely map out the human brain and simulate it on a computer, but that'll be many decades into the future at least.

Many leftists have a habit of thinking that since progress continues continuously and exponentially, we should just assume post-scarcity any day now. Which is exactly how you damage society and the economy so badly that you stop all progress completely...
legendary
Activity: 3318
Merit: 2008
First Exclusion Ever
October 02, 2018, 12:00:43 PM
#12
We have already reached the singularity; just none of us has clearance to know about it.
legendary
Activity: 2702
Merit: 1468
October 02, 2018, 10:41:15 AM
#11
I read a few years ago that researchers put 300,000 rat neurons into a robot, and out of pure boredom the robot started doing things of its own accord. Consciousness in some form was born. Can't we lift the moral dilemma about what is defined as 'life' and start really exploring deeply what happens if we put 30,000,000 human neurons into a robot?

HYBROT


Won't organic neurons, bionic CPUs and programmable DNA be the best way to spawn AI and its artificial general intelligence (AGI)/consciousness? And in turn, the answer to the question and concerns about how AI will react to us would be that we are nothing more than the antiquated ancestors of the next organic/semiconductor evolutionary step.

Surely that puts to rest the fears of AGI being our demise.

We have already passed the 'A life' stage.

https://www.youtube.com/watch?v=M5gIKn7mc2U