
Topic: Machines and money - page 11. (Read 12830 times)

hero member
Activity: 742
Merit: 526
March 09, 2015, 01:50:15 PM
#55
I once read that the first thing a machine does is create a more powerful, better version of itself.

So a machine will probably make a better Bitcoin.

So if a machine creates an even more powerful or better version of itself, why would you say it will create a better Bitcoin? Is the machine you're talking about Bitcoin itself? Huh
No, just stop it. We'll all die without ever seeing AI; it will never be a problem for us. The people of the future are the ones who will have Skynet problems, not us.

Seeing that our technology develops rapidly every day, it is not too far from reality that we may live to see a true AI. Maybe we won't see it for a long time, but we can still see the concepts of it before saying goodbye to this world.

If by a true AI you mean a self-aware machine, this may never happen at all. Not that I'm implicitly referring to the existence of a soul and such, but even if we are, nevertheless, able to recreate a self-aware mind in a machine somehow (as we basically do in our children), we may still not be able to understand what self-awareness conceptually is from a scientific point of view.
legendary
Activity: 3542
Merit: 1352
Cashback 15%
March 09, 2015, 01:40:41 PM
#54
I once read that the first thing a machine does is create a more powerful, better version of itself.

So a machine will probably make a better Bitcoin.

So if a machine creates an even more powerful or better version of itself, why would you say it will create a better Bitcoin? Is the machine you're talking about Bitcoin itself? Huh
No, just stop it. We'll all die without ever seeing AI; it will never be a problem for us. The people of the future are the ones who will have Skynet problems, not us.

Seeing that our technology develops rapidly every day, it is not too far from reality that we may live to see a true AI. Maybe we won't see it for a long time, but we can still see the concepts of it before saying goodbye to this world.
hero member
Activity: 770
Merit: 509
March 09, 2015, 11:49:34 AM
#53
I once read that the first thing a machine does is create a more powerful, better version of itself.

So a machine will probably make a better Bitcoin.

So if a machine creates an even more powerful or better version of itself, why would you say it will create a better Bitcoin? Is the machine you're talking about Bitcoin itself? Huh
No, just stop it. We'll all die without ever seeing AI; it will never be a problem for us. The people of the future are the ones who will have Skynet problems, not us.
legendary
Activity: 3248
Merit: 1070
March 08, 2015, 10:40:26 AM
#52
I once read that the first thing a machine does is create a more powerful, better version of itself.

So a machine will probably make a better Bitcoin.

So if a machine creates an even more powerful or better version of itself, why would you say it will create a better Bitcoin? Is the machine you're talking about Bitcoin itself? Huh

No, you didn't understand. I mean that a machine's aim is to make itself ever more powerful, to always enhance what it is, so if you "give" it Bitcoin, it will make it better.
legendary
Activity: 3542
Merit: 1352
Cashback 15%
March 08, 2015, 10:22:57 AM
#51
I once read that the first thing a machine does is create a more powerful, better version of itself.

So a machine will probably make a better Bitcoin.
You mean they would altruistically agree on an altcoin that benefits all machines?

Not really, because it would come from their own construction, and it would be better in every way than the last one (Bitcoin in this case).
What I am asking is whether all machines would agree to use one altcoin. Don't you think that if they didn't like Bitcoin, they would just make thousands of altcoins, each to their own liking?

It depends on what they want, though. If they develop their own programs that want an altcoin, then they would probably create altcoins to their own liking.
hero member
Activity: 742
Merit: 526
March 08, 2015, 10:08:01 AM
#50
Why would it want to be everywhere and comprehend everything? Human actions are driven by emotions and feelings in an effort to avoid pain and derive pleasure, as much as possible. The desire to learn new things is no exception. If you don't provide needs and means for their satisfaction, your thinking machine will just sit where you leave it, in a state of self-contemplation (of sorts).

Because it's in our nature to learn and improve, a sentient machine might want to do the same. Knowledge helps you survive, and the need to survive is the most basic one.

It is our nature, as you yourself said (for better survival), but why would a thinking machine possess the same qualities that a human has? My point is that your machine won't have any desires if all you create is bare self-awareness. It wouldn't care whether it survived or not. I doubt that it would even understand the concept of life and death and, unless you provide it with memory, its own existence as such. You know that you didn't exist before being born or conceived (in fact, before becoming conscious) only from external sources. Internally, there is no before you become conscious and no after you cease to be.

I'd think that because I've never met any intelligent beings besides other humans. I assume that since both people and animals have these basic instincts, an artificial brain might also form them.
In my view, a self-aware robot would want to acquire basic knowledge: what it is and where it is, why it was built and by whom.

Memory is a good point here. In the early stages the machine would probably be guided by its creator and share his life experience, which is another troubling aspect. An intelligent machine would probably not only take pure facts and compare them but also draw its own conclusions, like a child.

In fact, you don't need self-awareness in a machine to make it draw its own conclusions. Neural networks are capable of doing just that, though they don't in the least possess consciousness. Thoughts can be effectively emulated with respect to what can be considered their end result, i.e. a conclusion.
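
To illustrate that last point with a minimal sketch (not from the original thread; numpy, the network size, the seed, and the learning rate are all illustrative assumptions): the toy script below trains a tiny feed-forward network to "conclude" the XOR of its two inputs by nothing more than gradient descent on a pile of weights, with no self-awareness anywhere in the loop.

Code:
import numpy as np

# Toy data: two binary inputs and their XOR as the target "conclusion".
rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 4 units; sizes and learning rate are arbitrary choices.
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(5000):
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # the network's "conclusion"
    d_out = out - y               # cross-entropy gradient at the sigmoid output
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(out.round(2))  # typically close to [[0], [1], [1], [0]]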
legendary
Activity: 3542
Merit: 1352
Cashback 15%
March 08, 2015, 09:55:20 AM
#49
I once read that the first thing a machine does is create a more powerful, better version of itself.

So a machine will probably make a better Bitcoin.

So if a machine creates an even more powerful or better version of itself, why would you say it will create a better Bitcoin? Is the machine you're talking about Bitcoin itself? Huh
legendary
Activity: 1001
Merit: 1005
March 08, 2015, 08:10:14 AM
#48
Interesting article, thanks for the link. It's kind of already happening; take the Willy bot, for example.
legendary
Activity: 3248
Merit: 1070
March 08, 2015, 05:25:51 AM
#47
I once read that the first thing a machine does is create a more powerful, better version of itself.

So a machine will probably make a better Bitcoin.
You mean they would altruistically agree on an altcoin that benefits all machines?

Not really, because it would come from their own construction, and it would be better in every way than the last one (Bitcoin in this case).
What I am asking is whether all machines would agree to use one altcoin. Don't you think that if they didn't like Bitcoin, they would just make thousands of altcoins, each to their own liking?

I'm more inclined to think that there will be one core which rules them all, so those machines would have to accept what the core does; there won't be an altcoin spam fest.
donator
Activity: 1736
Merit: 1014
Let's talk governance, lipstick, and pigs.
March 08, 2015, 03:47:06 AM
#46
I once read that the first thing a machine does is create a more powerful, better version of itself.

So a machine will probably make a better Bitcoin.
You mean they would altruistically agree on an altcoin that benefits all machines?

Not really, because it would come from their own construction, and it would be better in every way than the last one (Bitcoin in this case).
What I am asking is whether all machines would agree to use one altcoin. Don't you think that if they didn't like Bitcoin, they would just make thousands of altcoins, each to their own liking?
legendary
Activity: 3248
Merit: 1070
March 08, 2015, 03:10:56 AM
#45
I once read that the first thing a machine does is create a more powerful, better version of itself.

So a machine will probably make a better Bitcoin.
You mean they would altruistically agree on an altcoin that benefits all machines?

Not really, because it would come from their own construction, and it would be better in every way than the last one (Bitcoin in this case).
hero member
Activity: 742
Merit: 526
March 08, 2015, 03:00:45 AM
#44
Those machines were broken or defective; they didn't go wild. Besides, they all had safety mechanisms that the operator failed to use, such as the neutral gear and the brakes.

And you've just touched on the great unknown, the concept of a thinking machine: a machine that would want to improve itself and become conscious. Such a machine wouldn't live by the rules; it would write its own programs. Think Skynet or that thing from The Matrix, a machine that is selfish and doesn't care about people's lives; it just takes what it needs and uses it as it pleases. It wants to conduct experiments to learn, and it will use you as a subject, a slave, an organ donor, a living hard drive, you name it.

Before you say it's science fiction and will never happen, think about the needs of such a machine. If it becomes conscious, it will want to be everywhere and comprehend everything. It will do anything to learn and won't care about ethics or morality.

Why would it want to be everywhere and comprehend everything? Human actions are driven by emotions and feelings in an effort to avoid pain and derive pleasure, as much as possible. The desire to learn new things is no exception. If you don't provide needs and means for their satisfaction, your thinking machine will just sit where you leave it, in a state of self-contemplation (of sorts).

Because it's in our nature to learn and improve, a sentient machine might want to do the same. Knowledge helps you survive, and the need to survive is the most basic one.

It is our nature, as you yourself said (for better survival), but why would a thinking machine possess the same qualities that a human has? My point is that your machine won't have any desires if all you create is bare self-awareness. It wouldn't care whether it survived or not. I doubt that it would even understand the concept of life and death and, unless you provide it with memory, its own existence as such. You know that you didn't exist before being born or conceived (in fact, before becoming conscious) only from external sources. Internally, there is no before you become conscious and no after you cease to be.
donator
Activity: 1736
Merit: 1014
Let's talk governance, lipstick, and pigs.
March 07, 2015, 10:11:01 PM
#43
I once read that the first thing a machine does is create a more powerful, better version of itself.

So a machine will probably make a better Bitcoin.
You mean they would altruistically agree on an altcoin that benefits all machines?
hero member
Activity: 742
Merit: 526
March 07, 2015, 04:19:57 PM
#42
I once read that the first thing a machine does is create a more powerful, better version of itself.

So a machine will probably make a better Bitcoin.
It's unpredictable what a true AI would do. But it's so sci-fi that it's kind of a waste of time. We are light years away from a human-like robot with AI.

The sleep of reason produces monsters, while imagination abandoned by reason produces impossible monsters.
legendary
Activity: 1204
Merit: 1028
March 07, 2015, 03:26:30 PM
#41
I once read that the first thing a machine does is create a more powerful, better version of itself.

So a machine will probably make a better Bitcoin.
It's unpredictable what a true AI would do. But it's so sci-fi that it's kind of a waste of time. We are light years away from a human-like robot with AI.
newbie
Activity: 29
Merit: 0
March 07, 2015, 03:00:56 PM
#40
Why would it want to be everywhere and comprehend everything? Human actions are driven by emotions and feelings in an effort to avoid pain and derive pleasure, as much as possible. The desire to learn new things is no exception. If you don't provide needs and means for their satisfaction, your thinking machine will just sit where you leave it, in a state of self-contemplation (of sorts).

This. So much this. Human desires and goals are shaped by the process of evolution; that's why we're selfish. If an artificial intelligence is "simply" created without selfish desires or goals, it's in my opinion very likely that it will be the most benevolent creature to ever exist.
legendary
Activity: 3248
Merit: 1070
March 07, 2015, 02:01:48 PM
#39
I once read that the first thing a machine does is create a more powerful, better version of itself.

So a machine will probably make a better Bitcoin.
hero member
Activity: 742
Merit: 526
March 07, 2015, 01:45:11 PM
#38
Those machines were broken or defective; they didn't go wild. Besides, they all had safety mechanisms that the operator failed to use, such as the neutral gear and the brakes.

And you've just touched on the great unknown, the concept of a thinking machine: a machine that would want to improve itself and become conscious. Such a machine wouldn't live by the rules; it would write its own programs. Think Skynet or that thing from The Matrix, a machine that is selfish and doesn't care about people's lives; it just takes what it needs and uses it as it pleases. It wants to conduct experiments to learn, and it will use you as a subject, a slave, an organ donor, a living hard drive, you name it.

Before you say it's science fiction and will never happen, think about the needs of such a machine. If it becomes conscious, it will want to be everywhere and comprehend everything. It will do anything to learn and won't care about ethics or morality.

Why would it want to be everywhere and comprehend everything? Human actions are driven by emotions and feelings in an effort to avoid pain and derive pleasure, as much as possible. The desire to learn new things is no exception. If you don't provide needs and means for their satisfaction, your thinking machine will just sit where you leave it, in a state of self-contemplation (of sorts).

Machines, on the other hand, are driven by their own programs (or the programs that Man has put into them).

We don't know what consciousness is (and probably never will find out), but it can be said with certainty that it has nothing to do with programming. In any case, self-awareness (machine or otherwise) per se doesn't pose any threat to human existence.
legendary
Activity: 3542
Merit: 1352
Cashback 15%
March 07, 2015, 01:24:00 PM
#37
Those machines were broken or defective; they didn't go wild. Besides, they all had safety mechanisms that the operator failed to use, such as the neutral gear and the brakes.

And you've just touched on the great unknown, the concept of a thinking machine: a machine that would want to improve itself and become conscious. Such a machine wouldn't live by the rules; it would write its own programs. Think Skynet or that thing from The Matrix, a machine that is selfish and doesn't care about people's lives; it just takes what it needs and uses it as it pleases. It wants to conduct experiments to learn, and it will use you as a subject, a slave, an organ donor, a living hard drive, you name it.

Before you say it's science fiction and will never happen, think about the needs of such a machine. If it becomes conscious, it will want to be everywhere and comprehend everything. It will do anything to learn and won't care about ethics or morality.

Why would it want to be everywhere and comprehend everything? Human actions are driven by emotions and feelings in an effort to avoid pain and derive pleasure, as much as possible. The desire to learn new things is no exception. If you don't provide needs and means for their satisfaction, your thinking machine will just sit where you leave it, in a state of self-contemplation (of sorts).

Machines, on the other hand, are driven by their own programs (or the programs that Man has put into them).
legendary
Activity: 3542
Merit: 1352
Cashback 15%
March 07, 2015, 01:19:31 PM
#36
Those machines were broken or defective; they didn't go wild. Besides, they all had safety mechanisms that the operator failed to use, such as the neutral gear and the brakes.

And you've just touched on the great unknown, the concept of a thinking machine: a machine that would want to improve itself and become conscious. Such a machine wouldn't live by the rules; it would write its own programs. Think Skynet or that thing from The Matrix, a machine that is selfish and doesn't care about people's lives; it just takes what it needs and uses it as it pleases. It wants to conduct experiments to learn, and it will use you as a subject, a slave, an organ donor, a living hard drive, you name it.

Before you say it's science fiction and will never happen, think about the needs of such a machine. If it becomes conscious, it will want to be everywhere and comprehend everything. It will do anything to learn and won't care about ethics or morality.

This just sent chills down my spine. Uncontrollable machines would be really scary to have, especially if they gained consciousness. If a machine became sentient, it wouldn't care whether there is such a thing as ethics, morality, or, most importantly, emotions. It would do whatever it wants to do.