Would advanced AI possess Buddha-nature?

User avatar
odysseus
Posts: 1060
Joined: Tue Apr 17, 2012 11:50 pm
Contact:

Re: Would advanced AI possess Buddha-nature?

Post by odysseus » Sat Oct 07, 2017 8:25 pm

boda wrote:
Sat Oct 07, 2017 12:56 am

Was the human mind intelligently designed?

This question touches on "intelligent design" theory, a New Age repackaging of creationism meant to renew and modernise the old belief that everything comes from God. Sorry, but in Buddhism there is no intelligent design implying a designer. http://www.intelligentdesign.org/whatisid.php
Let a man not seek for the respect of his peers, but let him seek wisdom.

-- Dhammapada

User avatar
The Cicada
Posts: 567
Joined: Sat Apr 16, 2016 5:15 am
Location: Trumpaloka

Re: Would advanced AI possess Buddha-nature?

Post by The Cicada » Sat Oct 07, 2017 8:53 pm

Grigoris wrote:
Sat Oct 07, 2017 6:06 pm
Consider masochism: logically a being will wish to avoid harm, but here we have people that desire to be harmed. How do you code for that? On the basis of probability? You code for 2.2% of male AI machines and 1.3% of female AI machines to display masochistic disorder?
A desire to fight can be considered essentially masochistic and would also have great survival value in a harsh, uncivilized world, so certainly there is a logic to it that has karmically/genetically predisposed our species in that way—if we can say that this is, in fact, the way that we are predisposed.
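Taken literally, the rhetorical question about coding trait prevalence has a mundane answer: you sample. A minimal sketch, using the purely illustrative 2.2% figure from the post above (not real data):

```python
import random

# A literal reading of 'coding for' a trait that appears in some fraction of
# a population: each individual gets the trait with that probability. The
# 2.2% figure is just the (illustrative) number from the post above.
def assign_trait(prevalence, rng):
    return rng.random() < prevalence

rng = random.Random(42)                   # fixed seed for reproducibility
population = [assign_trait(0.022, rng) for _ in range(100_000)]
rate = sum(population) / len(population)
print(f"observed rate: {rate:.4f}")       # hovers around 0.022
```

Of course, this only reproduces a population statistic; it says nothing about what the trait is like from the inside.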

boda
Posts: 1667
Joined: Thu Jul 03, 2014 8:40 pm

Re: Would advanced AI possess Buddha-nature?

Post by boda » Sun Oct 08, 2017 12:09 am

:jawdrop:
Grigoris wrote:
Sat Oct 07, 2017 6:06 pm
humans cannot really design themselves,
We don't need to design ourselves, there's more than enough of us around, in case you haven't noticed.
so how can something that is designed and produced by humans surpass humans?
Take Google DeepMind, for example: an artificial intelligence program using deep reinforcement learning that plays Atari games and improves itself to a superhuman level. This is a pretty specific task, of course. AGI, or artificial general intelligence, an intelligence capable of accomplishing a wide spectrum of goals, is the ideal.


And then there is that little matter of emotions that throws a spanner into AI design. How can you code for that? Emotions are evolutionary traits too.
Human intelligence is built on the substrate of a biological organism, an organism that needs to eat, drink, regulate energy, avoid danger, reproduce, socialize, etc. AI is built on a fundamentally different substrate, one without biological needs, so AI doesn't require emotions to accomplish goals. In fact it may be able to accomplish particular goals more efficiently without emotion. You'd need to go out of your way to simulate human biology in order to produce emotional AI, but why would that be desirable? I suppose it depends on why we're developing AI. Are we looking for smarter tools, companions, or replacements?
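The point that a reinforcement-learning system pursues goals without any emotional machinery can be made concrete with tabular Q-learning, a much simpler cousin of the deep reinforcement learning mentioned above. The toy world and all constants here are invented for illustration:

```python
import random

# Tabular Q-learning on a tiny 1-D walk: states 0..4, reward only at state 4.
# A toy cousin of the deep reinforcement learning mentioned above: the agent
# pursues its goal with no model of emotion at all, just value updates.
N_STATES, GOAL = 5, 4
ACTIONS = [-1, +1]                        # step left / step right

def step(state, action):
    nxt = min(max(state + action, 0), N_STATES - 1)
    reward = 1.0 if nxt == GOAL else 0.0
    return nxt, reward, nxt == GOAL

rng = random.Random(0)
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
for _ in range(500):                      # 500 training episodes
    s, done = 0, False
    while not done:
        if rng.random() < 0.1:            # occasional random exploration
            a = rng.choice(ACTIONS)
        else:                             # otherwise act greedily
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        nxt, r, done = step(s, a)
        best_next = max(Q[(nxt, b)] for b in ACTIONS)
        Q[(s, a)] += 0.5 * (r + 0.9 * best_next - Q[(s, a)])
        s = nxt

# The learned greedy policy walks right toward the reward from every state.
policy = {s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(GOAL)}
print(policy)
```

Nothing in the update rule refers to anything emotion-like; "wanting" the reward is just a number being maximized.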

User avatar
Grigoris
Global Moderator
Posts: 15180
Joined: Fri May 14, 2010 9:27 pm
Location: Greece

Re: Would advanced AI possess Buddha-nature?

Post by Grigoris » Sun Oct 08, 2017 8:17 am

boda wrote:
Sun Oct 08, 2017 12:09 am
Take Google DeepMind, for example: an artificial intelligence program using deep reinforcement learning that plays Atari games and improves itself to a superhuman level. This is a pretty specific task, of course. AGI, or artificial general intelligence, an intelligence capable of accomplishing a wide spectrum of goals, is the ideal.
Personally, I do not consider the ability to play 80s video games effectively to be a sign of advanced intelligence. Having been an avid video game player in the past, I can say it is just a matter of figuring out the logic of the program, and since that logic is fixed...

And that is where this ability:
...so AI doesn't require emotions to accomplish goals.
...fails evolutionarily. Emotions, the randomising effect on logic, are a positive evolutionary trait because they allow for reactions that may appear illogical but ultimately lead to survival. It is this ability to randomise, the ability to innovate, that has let humans survive. A computer, for example, will not get bored, so it will not look for an avenue of change or escape, and thus will not be open to innovation. Intelligence needs emotion.
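Interestingly, the "randomising" role described here has a loose mechanical analogue in reinforcement learning: injected randomness stops a purely logical (greedy) chooser from locking into a rut. A hedged sketch with a two-armed bandit, where the payout rates are invented for illustration:

```python
import random

# Two 'slot machines': arm 0 pays off 30% of the time, arm 1 pays 70%.
# A purely greedy chooser locks onto whichever arm looked good first; a
# small dose of randomness (epsilon) keeps it sampling alternatives -- a
# loose, mechanical analogue of randomness opening an avenue of change.
def run(epsilon, rng, steps=2000):
    payouts = (0.3, 0.7)
    counts = [0, 0]
    values = [0.0, 0.0]                   # running average reward per arm
    total = 0.0
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(2)        # explore at random
        else:
            arm = 0 if values[0] >= values[1] else 1  # exploit best estimate
        reward = 1.0 if rng.random() < payouts[arm] else 0.0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]
        total += reward
    return total / steps

greedy = run(0.0, random.Random(1))
explorer = run(0.1, random.Random(2))
print(f"greedy: {greedy:.2f}  epsilon-greedy: {explorer:.2f}")
```

The explorer earns more on average than the pure exploiter, though whether this mechanical randomness has anything to do with emotion is exactly the point under dispute.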
"My religion is not deceiving myself."
Jetsun Milarepa 1052-1135 CE

"Butchers, prostitutes, those guilty of the five most heinous crimes, outcasts, the underprivileged: all are utterly the substance of existence and nothing other than total bliss."
The Supreme Source - The Kunjed Gyalpo
The Fundamental Tantra of Dzogchen Semde

User avatar
Vasana
Posts: 1375
Joined: Thu Aug 22, 2013 11:22 am

Re: Would advanced AI possess Buddha-nature?

Post by Vasana » Sun Oct 08, 2017 9:32 am

This may be on the borders of being off-topic, but I caught up with the original Blade Runner the other day in anticipation of catching the new one soon, and it made me think about the ethical dimensions of how an A.I. should be treated by humans if the A.I. being/robot/replicant truly believed itself to be sentient and possessed anything resembling emotions. Would verbal or physical abuse and exploitation still be considered non-virtuous towards these non-beings if they themselves had the 'belief' that they were sentient? What would suffering be like for an A.I.?
"The changing cycle of joy and sorrow, like the changing seasons –
As a time of suffering will surely come around to me,
May I truly practice the sublime teachings."
- Dudjom Rinpoche

boda
Posts: 1667
Joined: Thu Jul 03, 2014 8:40 pm

Re: Would advanced AI possess Buddha-nature?

Post by boda » Sun Oct 08, 2017 2:24 pm

Grigoris wrote:
Sun Oct 08, 2017 8:17 am
Emotions, the randomising effect on logic, are a positive evolutionary trait because they allow for reactions that may appear illogical but ultimately lead to survival. It is this ability to randomise, the ability to innovate, that has let humans survive. A computer, for example, will not get bored, so it will not look for an avenue of change or escape, and thus will not be open to innovation. Intelligence needs emotion.
What do you mean by saying that emotions are a randomizing effect on logic? How does that work exactly?

If it’s simply a matter of randomization, would a random number generator function be sufficient to produce the desired effect?

DeepMind also won a game of Go recently, playing against a world master. It won using an innovative strategy that no one had seen before. Of course it didn't consciously strategize and creatively develop the new gameplay. It simply solved a problem, like a pocket calculator solves a problem, only it used reinforcement learning and neural network computing.

User avatar
Grigoris
Global Moderator
Posts: 15180
Joined: Fri May 14, 2010 9:27 pm
Location: Greece

Re: Would advanced AI possess Buddha-nature?

Post by Grigoris » Sun Oct 08, 2017 4:02 pm

Vasana wrote:
Sun Oct 08, 2017 9:32 am
This may be on the borders of being off-topic, but I caught up with the original Blade Runner the other day in anticipation of catching the new one soon, and it made me think about the ethical dimensions of how an A.I. should be treated by humans if the A.I. being/robot/replicant truly believed itself to be sentient and possessed anything resembling emotions. Would verbal or physical abuse and exploitation still be considered non-virtuous towards these non-beings if they themselves had the 'belief' that they were sentient? What would suffering be like for an A.I.?
Check out Westworld too.
"My religion is not deceiving myself."
Jetsun Milarepa 1052-1135 CE

"Butchers, prostitutes, those guilty of the five most heinous crimes, outcasts, the underprivileged: all are utterly the substance of existence and nothing other than total bliss."
The Supreme Source - The Kunjed Gyalpo
The Fundamental Tantra of Dzogchen Semde

User avatar
Grigoris
Global Moderator
Posts: 15180
Joined: Fri May 14, 2010 9:27 pm
Location: Greece

Re: Would advanced AI possess Buddha-nature?

Post by Grigoris » Sun Oct 08, 2017 4:13 pm

boda wrote:
Sun Oct 08, 2017 2:24 pm
Grigoris wrote:
Sun Oct 08, 2017 8:17 am
Emotions, the randomising effect on logic, are a positive evolutionary trait because they allow for reactions that may appear illogical but ultimately lead to survival. It is this ability to randomise, the ability to innovate, that has let humans survive. A computer, for example, will not get bored, so it will not look for an avenue of change or escape, and thus will not be open to innovation. Intelligence needs emotion.
What do you mean by saying that emotions are a randomizing effect on logic? How does that work exactly?

If it’s simply a matter of randomization, would a random number generator function be sufficient to produce the desired effect?

DeepMind also won a game of Go recently, playing against a world master. It won using an innovative strategy that no one had seen before. Of course it didn't consciously strategize and creatively develop the new gameplay. It simply solved a problem, like a pocket calculator solves a problem, only it used reinforcement learning and neural network computing.
Intelligence is not just about problem solving/understanding patterns. Intelligence also includes things like creativity for creativity's sake. Not all uses of intelligence are goal-oriented.

Even when it comes to goal-oriented behaviour, emotions play a role. Take political philosophy, for example. What is the motivation for political philosophy? It can be outrage, or desire, or... It may start as a feeling and develop into a goal. Some political philosophies start and end with feeling.

How do you program for that? You can't.
"My religion is not deceiving myself."
Jetsun Milarepa 1052-1135 CE

"Butchers, prostitutes, those guilty of the five most heinous crimes, outcasts, the underprivileged: all are utterly the substance of existence and nothing other than total bliss."
The Supreme Source - The Kunjed Gyalpo
The Fundamental Tantra of Dzogchen Semde

boda
Posts: 1667
Joined: Thu Jul 03, 2014 8:40 pm

Re: Would advanced AI possess Buddha-nature?

Post by boda » Sun Oct 08, 2017 8:24 pm

Grigoris wrote:
Sun Oct 08, 2017 4:13 pm
Intelligence is not just about problem solving/understanding patterns. Intelligence also includes things like creativity for creativity's sake. Not all uses of intelligence are goal-oriented.
The goal of being creative for the sake of being creative is being creative. Or are you using the word 'sake' as in the Japanese rice wine? A good example of what you're trying to say would be reflexive or habitual behavior. Most of our neural activity is dedicated to this kind of subconscious intelligence, which lacks conscious goals.
Even when it comes to goal-oriented behaviour, emotions play a role. Take political philosophy, for example. What is the motivation for political philosophy? It can be outrage, or desire, or... It may start as a feeling and develop into a goal. Some political philosophies start and end with feeling.

How do you program for that? You can't.
Why would we want to? Do we want smarter tools to help accomplish our goals or do we want to make politically outraged machines? I think we have more than enough politically outraged machines as it is.

User avatar
Admin_PC
Site Admin
Posts: 3958
Joined: Wed Sep 19, 2012 11:17 pm
Location: Texas, USA

Re: Would advanced AI possess Buddha-nature?

Post by Admin_PC » Wed Oct 11, 2017 5:31 pm

Relevant article for this discussion:
https://boingboing.net/2017/10/09/clark ... t-law.html
月影の いたらぬ里は なけれども 眺むる人の 心にぞすむ
(Though the moon's light reaches every village without fail, it dwells only in the hearts of those who gaze upon it.)
Hōnen Shōnin

Yuren
Posts: 132
Joined: Thu Aug 08, 2013 1:39 am

Re: Would advanced AI possess Buddha-nature?

Post by Yuren » Tue Oct 17, 2017 12:13 am

Yes, even a Playstation or an old Game Boy has buddha-nature.

http://www.chinabuddhismencyclopedia.co ... dha-Nature

User avatar
Johnny Dangerous
Global Moderator
Posts: 7459
Joined: Fri Nov 02, 2012 10:58 pm
Location: Olympia WA
Contact:

Re: Would advanced AI possess Buddha-nature?

Post by Johnny Dangerous » Tue Oct 17, 2017 7:40 am

Obviously there is a connection between intelligence and sentience, but they are not synonymous. There are plenty of sentient beings far less complex than any computer. Additionally, it is debatable whether complex sets of instructions (even the expert systems and neural networks that PC was talking about, algorithmic processes, etc.) are really discrete "intelligences" at all in this sense, or simply extraordinarily complex automated tools that exist as extensions of human intelligence. I personally opt for the latter.

In a Buddhist sense, a sentient being is by definition a somewhat self-aware (to diminishing degrees depending on the order of being) experience of subjectivity. Until there is a machine that fits this definition, I don't buy it.

The most "sentient" type of AI I've seen was from a guy years ago who used to design these analog-based "insect" machines that basically learned without being programmed to. I admit that, having no expertise in the field, I may have been missing something, but to me there is a marked difference between programming something to "think", however fancily, and something which starts learning due to its basic structural design - by my understanding, this is what was happening.
"it must be coming from the mouthy mastermind of raunchy rapper, Johnny Dangerous”

-Jeff H.

User avatar
Grigoris
Global Moderator
Posts: 15180
Joined: Fri May 14, 2010 9:27 pm
Location: Greece

Re: Would advanced AI possess Buddha-nature?

Post by Grigoris » Tue Oct 17, 2017 9:54 am

boda wrote:
Sun Oct 08, 2017 8:24 pm
Why would we want to? Do we want smarter tools to help accomplish our goals or do we want to make politically outraged machines? I think we have more than enough politically outraged machines as it is.
What you are failing to understand is that sometimes one needs an intense emotional reaction (outrage, compassion, desire) to spark an intelligent response, to get one motivated to think about solutions.
"My religion is not deceiving myself."
Jetsun Milarepa 1052-1135 CE

"Butchers, prostitutes, those guilty of the five most heinous crimes, outcasts, the underprivileged: all are utterly the substance of existence and nothing other than total bliss."
The Supreme Source - The Kunjed Gyalpo
The Fundamental Tantra of Dzogchen Semde

boda
Posts: 1667
Joined: Thu Jul 03, 2014 8:40 pm

Re: Would advanced AI possess Buddha-nature?

Post by boda » Thu Oct 19, 2017 10:22 pm

Grigoris wrote:
Tue Oct 17, 2017 9:54 am
boda wrote:
Sun Oct 08, 2017 8:24 pm
Why would we want to? Do we want smarter tools to help accomplish our goals or do we want to make politically outraged machines? I think we have more than enough politically outraged machines as it is.
What you are failing to understand is that sometimes one needs an intense emotional reaction (outrage, compassion, desire) to spark an intelligent response, to get one motivated to think about solutions.
You're absolutely right, I fail to understand that. I also fail to believe that you could explain why this is the case. It might be interesting though.

User avatar
Grigoris
Global Moderator
Posts: 15180
Joined: Fri May 14, 2010 9:27 pm
Location: Greece

Re: Would advanced AI possess Buddha-nature?

Post by Grigoris » Fri Oct 20, 2017 8:31 am

boda wrote:
Thu Oct 19, 2017 10:22 pm
Grigoris wrote:
Tue Oct 17, 2017 9:54 am
boda wrote:
Sun Oct 08, 2017 8:24 pm
Why would we want to? Do we want smarter tools to help accomplish our goals or do we want to make politically outraged machines? I think we have more than enough politically outraged machines as it is.
What you are failing to understand is that sometimes one needs an intense emotional reaction (outrage, compassion, desire) to spark an intelligent response, to get one motivated to think about solutions.
You're absolutely right, I fail to understand that. I also fail to believe that you could explain why this is the case. It might be interesting though.
I have explained it a number of times. But you prefer to feign ignorance, instead of advancing a counter-argument.
"My religion is not deceiving myself."
Jetsun Milarepa 1052-1135 CE

"Butchers, prostitutes, those guilty of the five most heinous crimes, outcasts, the underprivileged: all are utterly the substance of existence and nothing other than total bliss."
The Supreme Source - The Kunjed Gyalpo
The Fundamental Tantra of Dzogchen Semde

boda
Posts: 1667
Joined: Thu Jul 03, 2014 8:40 pm

Re: Would advanced AI possess Buddha-nature?

Post by boda » Sat Oct 21, 2017 12:08 am

Grigoris wrote:
Fri Oct 20, 2017 8:31 am
boda wrote:
Thu Oct 19, 2017 10:22 pm
Grigoris wrote:
Tue Oct 17, 2017 9:54 am
What you are failing to understand is that sometimes one needs an intense emotional reaction (outrage, compassion, desire) to spark an intelligent response, to get one motivated to think about solutions.
You're absolutely right, I fail to understand that. I also fail to believe that you could explain why this is the case. It might be interesting though.
I have explained it a number of times. But you prefer to feign ignorance, instead of advancing a counter-argument.
You haven't, actually.

Let's use DeepMind, an AI that uses neural network computing and reinforcement learning, as an example. As I mentioned earlier, DeepMind won a game of Go against a world champion. Go is an extremely complex game with more potential moves than there are atoms in the universe, so it's not really possible to program a computer to play it the way you can for a game like chess. They say Go requires intuition and creativity because of this complexity. Nevertheless, DeepMind won against a world champion with an innovative strategy that no one had ever seen before. So, is DeepMind conscious? No, at least not conscious in the way we normally understand it. Is DeepMind creative? Apparently. Is DeepMind emotional? No.

In this case DeepMind learned and developed a strategy to win the game without any kind of emotional motivation. How is this possible if what you say is true?

User avatar
Grigoris
Global Moderator
Posts: 15180
Joined: Fri May 14, 2010 9:27 pm
Location: Greece

Re: Would advanced AI possess Buddha-nature?

Post by Grigoris » Sat Oct 21, 2017 8:30 am

boda wrote:
Sat Oct 21, 2017 12:08 am
They say Go requires intuition and creativity...
The creativity and intuition required for Go is based on a finite set of specific positions and moves of the pieces. A computer can just crunch the numbers and make a prediction based on this large, yet finite, set, thus simulating what appears to be intuition and creativity. A computer will not take the pieces and make a picture of a flower with them, or use them to prop up the table the board sits on, or throw one of them at the opponent as a distraction and then shift the pieces around to suit itself, or...

I am talking about Artificial INTELLIGENCE; you are defining intelligence merely as problem solving. I have given examples of intelligence beyond the bounds of problem solving, something that computers are completely incapable of right now. Creativity is based on thinking outside of set boundaries, and it is emotions that allow us to transcend the boundaries of reason.
"My religion is not deceiving myself."
Jetsun Milarepa 1052-1135 CE

"Butchers, prostitutes, those guilty of the five most heinous crimes, outcasts, the underprivileged: all are utterly the substance of existence and nothing other than total bliss."
The Supreme Source - The Kunjed Gyalpo
The Fundamental Tantra of Dzogchen Semde

Bristollad
Posts: 273
Joined: Fri Aug 21, 2015 11:39 am

Re: Would advanced AI possess Buddha-nature?

Post by Bristollad » Sat Oct 21, 2017 10:44 am

Grigoris wrote:
Sat Oct 21, 2017 8:30 am
The creativity and intuition required for Go is based on a finite set of specific positions and moves of the pieces. A computer can just crunch the numbers and make a prediction based on this large, yet finite, set thus simulating what appears to be intuition and creativity.
Actually, with Go that's not possible given current computing power.
Traditional AI methods, which construct a search tree over all possible positions, don’t have a chance in Go. This is because of the sheer number of possible moves and the difficulty of evaluating the strength of each possible board position.

In order to capture the intuitive aspect of the game, we knew that we would need to take a novel approach. AlphaGo therefore combines an advanced tree search with deep neural networks. These neural networks take a description of the Go board as an input and process it through a number of different network layers containing millions of neuron-like connections. One neural network, the “policy network”, selects the next move to play. The other neural network, the “value network”, predicts the winner of the game.

We showed AlphaGo a large number of strong amateur games to help it develop its own understanding of what reasonable human play looks like. Then we had it play against different versions of itself thousands of times, each time learning from its mistakes and incrementally improving until it became immensely strong, through a process known as reinforcement learning.
from https://deepmind.com/research/alphago/

Since then, they've created AlphaGo Zero which, instead of being trained on human games, trained only by playing against itself - supposedly, it is an even stronger opponent.
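The two-headed design the quoted passage describes can be sketched shape-only: a shared body, a "policy" head scoring each board point as the next move, and a "value" head estimating the winner. The sizes and random, untrained weights below are assumptions for illustration; this mirrors AlphaGo only in its input/output structure, not its real convolutional networks.

```python
import numpy as np

# Shape-only sketch of a two-headed policy/value network. Sizes and the
# random, untrained weights are illustrative assumptions, not AlphaGo's
# actual architecture.
rng = np.random.default_rng(0)
BOARD = 9 * 9                             # toy 9x9 board instead of 19x19

board = rng.choice([-1.0, 0.0, 1.0], size=BOARD)  # -1 white, 0 empty, 1 black

W_shared = rng.normal(scale=0.1, size=(64, BOARD))
W_policy = rng.normal(scale=0.1, size=(BOARD, 64))
W_value = rng.normal(scale=0.1, size=(1, 64))

hidden = np.tanh(W_shared @ board)        # shared representation

logits = W_policy @ hidden                # policy head: one score per point
policy = np.exp(logits) / np.exp(logits).sum()    # softmax over moves

value = np.tanh(W_value @ hidden).item()  # value head: win estimate in (-1, 1)

print("move probabilities sum to:", policy.sum())
print("predicted outcome:", value)
```

In the real system these heads are trained (first on human games, then by self-play) and combined with tree search; here they just show how one input can feed both a move distribution and a winner estimate.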

None of this, of course, demonstrates that AlphaGo is a sentient being or likely to be one anytime soon. It seems to me that people are blurring the way the term 'sentient' is used in Buddhism: a sentient being is a living being, other than a Buddha, that acts with intention and thus experiences the results of its behavior. It necessarily has a mind.

This is not the same as the way 'sentient' is used in the West: able to perceive or feel things. https://en.oxforddictionaries.com/definition/sentient
It could be argued that, using this definition, even simple computers are capable of being sentient - they can be programmed to react to external stimuli - but does this qualify them as having a mind? I don't think so. In much the same way, there are living beings (plants) that clearly react to external stimuli and so are sentient as far as the Western usage goes. Does this make them sentient beings according to Buddhism? Not according to my teachers, because they do not have a mind.

User avatar
Jesse
Posts: 1533
Joined: Wed May 08, 2013 6:54 am
Location: Mordor, Middle Earth

Re: Would advanced AI possess Buddha-nature?

Post by Jesse » Sun Oct 22, 2017 2:12 pm

Whether or not an advanced AI is sentient, the real issue is proving it. When you go down that road, there are no answers, and you may even begin to doubt your own sentience.
“Freedom is secured not by the fulfilling of one's desires, but by the removal of desire” – Epictetus

User avatar
Jesse
Posts: 1533
Joined: Wed May 08, 2013 6:54 am
Location: Mordor, Middle Earth

Re: Would advanced AI possess Buddha-nature?

Post by Jesse » Sun Oct 22, 2017 2:22 pm

The number of possible Go games is extremely large. It is often compared to the number of atoms in the universe (around 10^80), but it is in fact much, much larger.
From: https://senseis.xmp.net/?NumberOfPossibleGoGames

So, from a human perspective, Go may as well have infinite moves. Computers, on the other hand, will eventually be able to calculate damn near all possible moves, not just from a current point in time, but from all points in time, at every possible variation of the board.
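The quoted numbers are easy to sanity-check, since Python integers are arbitrary-precision. Note that 3^361 (each of the 361 points empty, black, or white) is only a loose upper bound on legal positions, and the number of distinct games is far larger still:

```python
# Python integers are arbitrary-precision, so the quoted counts are easy to
# sanity-check. 3**361 is only a loose upper bound on legal positions; the
# number of games is far larger still.
atoms_in_universe = 10 ** 80              # the commonly quoted rough figure
position_upper_bound = 3 ** 361           # 19x19 board, 3 states per point

print(len(str(position_upper_bound)))     # 173 digits, i.e. ~1.7e172
print(position_upper_bound > atoms_in_universe ** 2)
```

So even the square of the atom count falls short of the position bound, which is why exhaustive search is off the table.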

At the moment, however, computers are limited by processing speed, memory, and even the algorithms used in the AI itself. Current systems work around this by limiting the number of moves the AI predicts in advance. A human, by comparison, MAY be able to mentally predict maybe 10-15 moves, from at most 3-5 different moments in time (if piece (x) moves to (y), and so on...).

Even that would be the upper limit for the best 1-4 GO players in the world.

With that definition of intelligence, computers are capable of an intelligence that far surpasses our own. However, as Greg said, are computers capable of creativity? I think so.

When you really get into it, you reach a line of reasoning that questions your own mind. To actually know these things would require knowing entirely how our own minds, brains, and consciousness work, which we currently do not.

So as of now, the question is unanswerable.

I think ultimately the answer will be yes: computers are capable of everything humans are, consciousness/awareness included, and scarily much much more than we are.

Computer intelligence will someday reach a level that will be god-like compared to human intelligence. When this occurs, will the computers then question whether or not we are sentient? Probably.
“Freedom is secured not by the fulfilling of one's desires, but by the removal of desire” – Epictetus
