Godhood - A dilemma

SLAM: debunk creationism, pseudoscience, and superstitions. Discuss logic and morality.

Moderator: Alyrium Denryle

Simon_Jester
Emperor's Hand
Posts: 30165
Joined: 2009-05-23 07:29pm

Re: Godhood - A dilemma

Post by Simon_Jester »

Singular Intellect wrote:Assuming one dismisses the possibility of altering one's self to 'micromanage' the universe as easily as one currently operates their heart rhythm, what's wrong with delegating 'micromanagement' to newly created entities that exist solely for that purpose? If you're effectively 'omnipotent', failure to reorganize the universe into a perfect one is due solely to an inferior mind and imagination.
Why?

What obligation would you have to edit reality into whatever happens to suit you? Do you have some duty to do what you, specifically you, would want done on account of you got the phenomenal cosmic power and other people who would act differently didn't?

How, exactly, is it a failing of mind and imagination to decide that it honestly doesn't make any difference to me whether other people comply with my image of how they ought to behave?
This space dedicated to Vasily Arkhipov
Singular Intellect
Jedi Council Member
Posts: 2392
Joined: 2006-09-19 03:12pm
Location: Calgary, Alberta, Canada

Re: Godhood - A dilemma

Post by Singular Intellect »

Simon_Jester wrote:Why?

What obligation would you have to edit reality into whatever happens to suit you? Do you have some duty to do what you, specifically you, would want done on account of you got the phenomenal cosmic power and other people who would act differently didn't?

How, exactly, is it a failing of mind and imagination to decide that it honestly doesn't make any difference to me whether other people comply with my image of how they ought to behave?
Allow me to quickly explain what I would do if gifted with absolute godlike power over the universe.

Every being/person in existence would effectively become their own god over their own universe(s) and self, and be able to interact, learn and experience any other person's universe(s) at will. There would be only one unbreakable and unbendable rule: no interaction on any level can be achieved without freely given consent and desire.

Presto, an absolutely perfect universe by any measure since every person's universe caters to their subjective taste. The only people who can rationally object to it are those who've lost the ability to control and manipulate others into doing anything they do not wish to do.
"Now let us be clear, my friends. The fruits of our science that you receive and the many millions of benefits that justify them, are a gift. Be grateful. Or be silent." -Modified Quote
Simon_Jester
Emperor's Hand
Posts: 30165
Joined: 2009-05-23 07:29pm

Re: Godhood - A dilemma

Post by Simon_Jester »

I'm not sure I agree- seven billion solipsists is not my idea of a good universe.

So much of the human condition, the good along with the bad, hinges on the existence of some kind of nonconsensual interaction- your ability to do things that displease others, their ability to make their displeasure known. Healthy humans are social animals, and interact meaningfully with both each other and their environment.

Interactions which don't have consequences, which do not affect and are not affected by an external reality that lives on independent of your own whims, aren't meaningful in my book.

And that's not me saying I ought to be in a position to judge what the external realities are or ought to be. But taking the libertarian position to its logical extreme (only interactions desired by both parties are possible) doesn't strike me as a reliable recipe for an ideal version of reality. It strikes me as a recipe for seven billion asocial lunatics with variations on the theme of "God complex."

Which, well. Let's just say that any competent god will tend to remake reality in their own image, and I don't want to be remembered as having that as my image.
Singular Intellect
Jedi Council Member
Posts: 2392
Joined: 2006-09-19 03:12pm
Location: Calgary, Alberta, Canada

Re: Godhood - A dilemma

Post by Singular Intellect »

Simon_Jester wrote:I'm not sure I agree- seven billion solipsists is not my idea of a good universe.

So much of the human condition, the good along with the bad, hinges on the existence of some kind of nonconsensual interaction- your ability to do things that displease others, their ability to make their displeasure known. Healthy humans are social animals, and interact meaningfully with both each other and their environment.
That's your opinion and subjective take on things.

You can build your own little universe to whatever whim you desire, make it an 'external reality', wipe your own memory and go about existence with all the particular hardships, annoyances and horrors you see fit to inflict upon yourself. You just can't make anyone else do that, which goes back to my 'perfect universe' assertion.
Interactions which don't have consequences, which do not affect and are not affected by an external reality that lives on independent of your own whims, aren't meaningful in my book.
And anyone sharing that point of view can either build their own universe version or plop themselves into yours. Presto, problem solved: you now live in your perfect universe that includes all your personal criteria for a 'meaningful' existence. You can even ensure you never know the universe you're existing in is one you yourself created, so you can thoroughly enjoy bitching about that late taxi or whatever you need to ascribe 'meaning' to.
And that's not me saying I ought to be in a position to judge what the external realities are or ought to be. But taking the libertarian position to its logical extreme (only interactions desired by both parties are possible) doesn't strike me as a reliable recipe for an ideal version of reality. It strikes me as a recipe for seven billion asocial lunatics with variations on the theme of "God complex."

Which, well. Let's just say that any competent god will tend to remake reality in their own image, and I don't want to be remembered as having that as my image.
If you wish to construct a universe of some nature that allows you to smugly claim you're not falling for any 'god complex' mentality, so be it. The perfect model I proposed easily permits that, you just can't force anyone else to subscribe to it.
Simon_Jester
Emperor's Hand
Posts: 30165
Joined: 2009-05-23 07:29pm

Re: Godhood - A dilemma

Post by Simon_Jester »

Singular Intellect wrote:
Simon_Jester wrote:I'm not sure I agree- seven billion solipsists is not my idea of a good universe.

So much of the human condition, the good along with the bad, hinges on the existence of some kind of nonconsensual interaction- your ability to do things that displease others, their ability to make their displeasure known. Healthy humans are social animals, and interact meaningfully with both each other and their environment.
That's your opinion and subjective take on things.

You can build your own little universe to whatever whim you desire, make it an 'external reality', wipe your own memory and go about existence with all the particular hardships, annoyances and horrors you see fit to inflict upon yourself. You just can't make anyone else do that, which goes back to my 'perfect universe' assertion.
Thing is, you are inherently declaring your notion of how people ought to interact as trumping mine- which sort of defeats your purpose.

Another obvious problem is that as far as I can tell, you're letting me put people in "my own little universe..." who are capable of interacting with each other in ways they didn't agree on in advance. If you ignore that, you might as well not have bothered creating a non-intersecting "bubble child" universe for every individual. If you don't, then you are really pushing that whole "my home is my castle" ability to affect 'my' universe.
Interactions which don't have consequences, which do not affect and are not affected by an external reality that lives on independent of your own whims, aren't meaningful in my book.
And anyone sharing that point of view can either build their own universe version or plop themselves into yours. Presto, problem solved: you now live in your perfect universe that includes all your personal criteria for a 'meaningful' existence. You can even ensure you never know the universe you're existing in is one you yourself created, so you can thoroughly enjoy bitching about that late taxi or whatever you need to ascribe 'meaning' to.
My argument is that you're doing people no favors by sticking them in personal bubble universes- not that I have the choice of setting up some kind of perverse universe you stole from Haruhi Suzumiya, but that the people who sign up for this will inevitably and predictably lose important things about themselves.

Basically, your argument boils down to the idea that free will is more important than everything else, so you should be trying to maximize the number of beings who get to do whatever they want. There are so many problems with this, so many unexamined assumptions, that I don't know where to begin. One obvious starting point is the current quote of the week:

"To American exceptionalists, "freedom" means being able to do what you want unencumbered by obligations to your fellow citizens. It is a definition of freedom the rest of the world finds bewildering."

I find it a bit bewildering too.

Another, more significant objection is: how do you define volition? What beings are capable of having a free will which must be honored? Does a baby get this kind of perfect universe where nothing they don't want ever happens to them? Because a baby whose every wish is gratified is not, and will never become, a mature, developed mind. Babies aren't human; they become human by a process of socialization. You are not doing a baby a favor by granting their every whim and desire, because they are unable to wish or desire their own growing-up.

So, do babies get this total freedom from anyone making decisions about what should happen to them?
And that's not me saying I ought to be in a position to judge what the external realities are or ought to be. But taking the libertarian position to its logical extreme (only interactions desired by both parties are possible) doesn't strike me as a reliable recipe for an ideal version of reality. It strikes me as a recipe for seven billion asocial lunatics with variations on the theme of "God complex."

Which, well. Let's just say that any competent god will tend to remake reality in their own image, and I don't want to be remembered as having that as my image.
If you wish to construct a universe of some nature that allows you to smugly claim you're not falling for any 'god complex' mentality, so be it. The perfect model I proposed easily permits that, you just can't force anyone else to subscribe to it.
No, because you're still entirely missing my point: You are not doing people a favor by gratifying their desires in this way.

Granting everyone the maximal amount of ability to do what they want without being encumbered by obligations to other people is not the same thing as creating a perfect universe. It's far too simplistic.

I'd argue that the fact that you can come up with such a short description of what "perfection" looks like should warn you that you haven't put a large enough volume of good enough thought into it.

It is not about my personal sense of a god complex, or lack thereof, or anything about me. It is a general observation about people, and about the dangers of viewing self-gratification as the supreme virtue.
Singular Intellect
Jedi Council Member
Posts: 2392
Joined: 2006-09-19 03:12pm
Location: Calgary, Alberta, Canada

Re: Godhood - A dilemma

Post by Singular Intellect »

Simon_Jester wrote:Thing is, you are inherently declaring your notion of how people ought to interact as trumping mine- which sort of defeats your purpose.
No I'm not. People can interact at any level and to any degree they personally wish.
Another obvious problem is that as far as I can tell, you're letting me put people in "my own little universe..." who are capable of interacting with each other in ways they didn't agree on in advance.
False. All interactions must be completely voluntary.
If you ignore that, you might as well not have bothered creating a non-intersecting "bubble child" universe for every individual. If you don't, then you are really pushing that whole "my home is my castle" ability to affect 'my' universe.
Think of it as networked virtual reality. You can pull the plug at any time you desire, but while plugged in you can interact to any mutually agreed level. You could have bloodthirsty wars with horror and pain to any degree you imagine; the only difference is no one will be there who doesn't wish to experience that kind of reality.
My argument is that you're doing people no favors by sticking them in personal bubble universes-
'Bubble universes' they can connect to any number of others to any degree they wish, so long as the criterion of freely given consent is met by every party.
not that I have the choice of setting up some kind of perverse universe you stole from Haruhi Suzumiya, but that the people who sign up for this will inevitably and predictably lose important things about themselves.

Basically, your argument boils down to the idea that free will is more important than everything else, so you should be trying to maximize the number of beings who get to do whatever they want. There are so many problems with this, so many unexamined assumptions, that I don't know where to begin. One obvious starting point is the current quote of the week:

"To American exceptionalists, "freedom" means being able to do what you want unencumbered by obligations to your fellow citizens. It is a definition of freedom the rest of the world finds bewildering."

I find it a bit bewildering too.
What obligations? Obligations no longer exist in my scenario, because there is absolutely nothing you can give another person they cannot provide themselves. The only exception is your social interaction, which must be entirely voluntary and at whatever level the parties decide to go with.
Another, more significant objection is: how do you define volition? What beings are capable of having a free will which must be honored? Does a baby get this kind of perfect universe where nothing they don't want ever happens to them? Because a baby whose every wish is gratified is not, and will never become, a mature, developed mind. Babies aren't human; they become human by a process of socialization. You are not doing a baby a favor by granting their every whim and desire, because they are unable to wish or desire their own growing-up.
Young minds will develop in the universe(s) created by their parents until they reach a point of self-awareness and independence to decide things for themselves. The only real difference will be that the parents will have vastly greater control over the universe they bring their child up in, but zero ability to limit their child to it.
So, do babies get this total freedom from anyone making decisions about what should happen to them?
Once they're capable of making decisions and have a degree of self awareness making that possible, absolutely.
Granting everyone the maximal amount of ability to do what they want without being encumbered by obligations to other people is not the same thing as creating a perfect universe. It's far too simplistic.

I'd argue that the fact that you can come up with such a short description of what "perfection" looks like should warn you that you haven't put a large enough volume of good enough thought into it.

It is not about my personal sense of a god complex, or lack thereof, or anything about me. It is a general observation about people, and about the dangers of viewing self-gratification as the supreme virtue.
No one will be obligated to consider or subscribe to your opinion on the matter, nor will you be obligated to anyone else's. So what's the problem? So far your objection boils down to "I don't think it's a good idea because I say so" and my response is "I'll let everyone decide for themselves; your opinion means fuck all to anyone but you."
Simon_Jester
Emperor's Hand
Posts: 30165
Joined: 2009-05-23 07:29pm

Re: Godhood - A dilemma

Post by Simon_Jester »

I am now making a conscious effort to reduce the scale of the quote spaghetti.
Singular Intellect wrote:No I'm not. People can interact at any level and to any degree they personally wish... False. All interactions must be completely voluntary...

'Bubble universes' they can connect to any number of others to any degree they wish, so long as the criteria of freely given consent is met by every party.
Ability to share the bubbles doesn't mean you're not living in a bubble. Because this entire... I'm going to call it a 'society' for lack of a better term... consists of people who have every incentive to become a pack of petulant masturbating man-children.

Any kind of psychological growth or discipline they might display will be blind luck; the norm would probably be for interactions to stay short: the moment people lose the ability to agree to disagree, they take their respective toys and part ways... because it's so much easier living in a fantasy world where you don't have to interact with those pesky other sentients, as long as you can have good enough VR to drown out whatever residual social-animal instincts in your head are screaming at you.
"To American exceptionalists, "freedom" means being able to do what you want unencumbered by obligations to your fellow citizens. It is a definition of freedom the rest of the world finds bewildering."

I find it a bit bewildering too.
What obligations? Obligations no longer exist in my scenario, because there is absolutely nothing you can give another person they cannot provide themselves. The only exception is your social interaction, which must be entirely voluntary and at whatever level the parties decide to go with.
You have merely confirmed my existing opinion about your position- you honestly do not perceive that this is a problem, that there might actually be something undesirable about deliberately creating a world in which human beings are not bound by any obligations of any kind. Note that this includes more than financial or physical things; it also involves the social obligations that civilize us, that render us able to be civil to each other.

How is being "free" of such obligations to other people intrinsically a goal that it's worth optimizing the universe to obtain?
Another, more significant objection is: how do you define volition? What beings are capable of having a free will which must be honored? Does a baby get this kind of perfect universe where nothing they don't want ever happens to them? Because a baby whose every wish is gratified is not, and will never become, a mature, developed mind. Babies aren't human; they become human by a process of socialization. You are not doing a baby a favor by granting their every whim and desire, because they are unable to wish or desire their own growing-up.
Young minds will develop in the universe(s) created by their parents until they reach a point of self-awareness and independence to decide things for themselves.
That is no answer- where, and how, do you draw that line? Is a six month old infant capable of making their own decisions? A three year old? A six year old? A twelve year old? An eighteen year old?

When in the development of human thought do we grant people this supreme power to gratify their wishes and reject any interaction with other people that does not gratify their wishes? Giving that power to an infant means the infant never grows up- the same for a toddler, or even an adolescent.

Who gets to judge "maturity" here? The question of who is mature enough to be trusted with power over their own lives is not an easy or self-evident one, even when the amount of power being given to them is relatively limited.

And what if there are layers above "adult" maturity? It's not hard to find physically grown men and women who are most emphatically not full-grown psychologically. Giving many adults all their wishes would be a disaster, for their own self-actualization's sake, even if you can suppress any physical consequences of doing so.

What makes you think that the minimal unit of "mature enough to have a right to see all their wishes granted" is, say, twelve or fifteen or eighteen years old?

I'm concerned about the logical path of development for people in such a scenario- it seems too likely to collapse into the aforesaid petulant masturbating man-children, because it's too easy to retreat from anyone whose behavior displeases you, even when it's your own damn fault.
Granting everyone the maximal amount of ability to do what they want without being encumbered by obligations to other people is not the same thing as creating a perfect universe. It's far too simplistic.

I'd argue that the fact that you can come up with such a short description of what "perfection" looks like should warn you that you haven't put a large enough volume of good enough thought into it.

It is not about my personal sense of a god complex, or lack thereof, or anything about me. It is a general observation about people, and about the dangers of viewing self-gratification as the supreme virtue.
No one will be obligated to consider or subscribe to your opinion on the matter, nor will you be obligated to anyone else's. So what's the problem? So far your objection boils down to "I don't think it's a good idea because I say so" and my response is "I'll let everyone decide for themselves; your opinion means fuck all to anyone but you."
My argument is that you are committing a philosophical error.

See, a philosophical argument takes the form "Belief X is incorrect, or at least of limited validity, because [argument]." You are taking this and saying "Well, it's just your opinion that Belief X is incorrect, so how about I redesign the universe according to Belief X? No one will make you act as if X is correct, after all..."

That is about equivalent to saying "You think two plus two is [number], and I think two plus two is [number], so how about I write all the math textbooks except the one I'll hand to you so you can teach your kids your way."
__________

If I'm wrong and you're right, this is a very odd way to go about it- you're totally abandoning the question of why your ideal universe is a good place, in favor of just saying "well, this is how I'd do it and if you don't like it you'll be free to pretend it never happened."

If you're wrong and I'm right, then you're saying "well, I'll just do this horrible disservice to humanity because my opinion is that it's a good idea, and I will ignore all your arguments about what I should do because I am a hardcore libertarian and everyone should follow their own opinions!"

In neither case does this kind of thinking make any sense.
Singular Intellect
Jedi Council Member
Posts: 2392
Joined: 2006-09-19 03:12pm
Location: Calgary, Alberta, Canada

Re: Godhood - A dilemma

Post by Singular Intellect »

Simon_Jester wrote:Ability to share the bubbles doesn't mean you're not living in a bubble. Because this entire... I'm going to call it a 'society' for lack of a better term... consists of people who have every incentive to become a pack of petulant masturbating man-children.
Only if you disregard the social rewards/penalties of such behavior, which you admitted is one of the core aspects of being human.
Any kind of psychological growth or discipline they might display will be blind luck; the norm would probably be for interactions to be short before they lose the ability to agree to disagree, take their respective toys, and part ways... because it's so much easier living in a fantasy world where you don't have to interact with those pesky other sentients, as long as you can have good enough VR to drown out whatever residual social-animal instincts in your head are screaming at you.
So your objection is that people will become reclusive, while you also admit people are social creatures that very much need to interact and socialize with other humans. Which is it?
You have merely confirmed my existing opinion about your position- you honestly do not perceive that this is a problem, that there might actually be something undesirable about deliberately creating a world in which human beings are not bound by any obligations of any kind. Note that this includes more than financial or physical things; it also involves the social obligations that civilize us, that render us able to be civil to each other.
I already covered this under the 'voluntary interaction only' rule. The social reward of interacting with other people will be the motivator for 'positive' behavior, although 'positive' will really become 'like minded' since the only negative outcome is a loss of social interaction with other persons.

'External uncontrollable reality' will be reduced to interactions with other people, an exceptionally strong motivator for 'positive' behavior.
How is being "free" of such obligations to other people intrinsically a goal that it's worth optimizing the universe to obtain?
See above. I made it abundantly clear social interaction will still go on and be a motivator for behavior, learning and experiences. There is still a major incentive to be 'positive' which is covered under the only existing rule I would implement.
That is no answer- where, and how, do you draw that line? Is a six month old infant capable of making their own decisions? A three year old? A six year old? A twelve year old? An eighteen year old?

When in the development of human thought do we grant people this supreme power to gratify their wishes and reject any interaction with other people that does not gratify their wishes? Giving that power to an infant means the infant never grows up- the same for a toddler, or even an adolescent.

Who gets to judge "maturity" here? The question of who is mature enough to be trusted with power over their own lives is not an easy or self-evident one, even when the amount of power being given to them is relatively limited.

And what if there are layers above "adult" maturity? It's not hard to find physically grown men and women who are most emphatically not full-grown psychologically. Giving many adults all their wishes would be a disaster, for their own self-actualization's sake, even if you can suppress any physical consequences of doing so.
There will be social consequences, that human condition you earlier professed as such an innate aspect of being human.
What makes you think that the minimal unit of "mature enough to have a right to see all their wishes granted" is, say, twelve or fifteen or eighteen years old?

I'm concerned about the logical path of development for people in such a scenario- it seems too likely to collapse into the aforesaid petulant masturbating man-children, because it's too easy to retreat from anyone whose behavior displeases you, even when it's your own damn fault.
Which results in loss of social interactions which you earlier claimed is such a critical and desired aspect of human existence.
My argument is that you are committing a philosophical error.

See, a philosophical argument takes the form "Belief X is incorrect, or at least of limited validity, because [argument]." You are taking this and saying "Well, it's just your opinion that Belief X is incorrect, so how about I redesign the universe according to Belief X? No one will make you act as if X is correct, after all..."

That is about equivalent to saying "You think two plus two is [number], and I think two plus two is [number], so how about I write all the math textbooks except the one I'll hand to you so you can teach your kids your way."
__________

If I'm wrong and you're right, this is a very odd way to go about it- you're totally abandoning the question of why your ideal universe is a good place, in favor of just saying "well, this is how I'd do it and if you don't like it you'll be free to pretend it never happened."

If you're wrong and I'm right, then you're saying "well, I'll just do this horrible disservice to humanity because my opinion is that it's a good idea, and I will ignore all your arguments about what I should do because I am a hardcore libertarian and everyone should follow their own opinions!"

In neither case does this kind of thinking make any sense.
You have yet to raise any valid objection. So far your only point has been claiming people need some kind of external reward/punishment system in order to grow and develop as human beings. That's quite obviously covered under the social interaction rule, since social interaction is very critical to human existence by your own admission. The only rule I proposed also acts as the motivator for positive and cooperative behavior amongst all individuals.

In summary, you've found no flaw in my asserted perfect universe; you've simply further defined its attributes and how it would work.
Garlak
Youngling
Posts: 124
Joined: 2008-10-10 01:08pm
Location: Pale Blue Dot

Re: Godhood - A dilemma

Post by Garlak »

Huh. How much control do parents have over their children? What happens if the two parents disagree on how to raise their child? What if the father wants a girl, and the mother wants a four-armed girl?

How do you handle divorce? Simply clone the child and let each parent raise the child?

How do you resolve differences in solipsistic realities? Some people decide to have a pool party. Another person decides it'd be nice to stop time and murder everyone there. You would say that he can't do that because he doesn't have consent. Alright, but can he go around splashing people in the face? Date-rape drugs?

What about a war simulation? People are prepared to accept permanent death, crippling, etc. Then somebody cheats by stopping time and murdering everyone at once. That's not allowed? Alright, then his guys have gas masks and the others don't and he uses gas weapons. Not allowed?

... Well, he gives himself a minimap and deep-striking capability. Not allowed?

Are you going to cover every eventuality?

Or just declare that people can designate certain areas "like the old reality" or something, except people can say "hold it/rewind/exit" or something? What happens if somebody dies suddenly? Does he have a backup or something? What if he is affected permanently by some mental condition, and he CAN'T undo it because he CAN'T THINK OF UNDOING IT? He still has the power. But not the imagination.


Finally, what if somebody wants to make Fake-People. They'll act like people but not enough to deserve their own godhood.

Also, when do you give someone power over themselves? What if the aforementioned parents decide to mentally retard their child so they'll have permanent total control over them? Or they program him such that while he DOES become a god, he would NEVER be capable of doing anything they don't want him to do.

Do people have privacy, or are you going to be the Supermoderator and will upload 'morality patches' to the universe whenever you or somebody stumble upon something that is reprehensible and breaks the Spirit of the Laws?
I went to the librarian and asked for a book about stars ... And the answer was stunning. It was that the Sun was a star but really close. The stars were suns, but so far away they were just little points of light ... The scale of the universe suddenly opened up to me. It was a kind of religious experience. There was a magnificence to it, a grandeur, a scale which has never left me. Never ever left me.
~Carl Sagan
User avatar
Lord_Of_Change 9
Youngling
Posts: 145
Joined: 2010-08-06 04:49am

Re: Godhood - A dilemma

Post by Lord_Of_Change 9 »

Hmm...this is an interesting dilemma, but there can only be one answer. I'd probably make a terrible god, but the next person this entity makes the offer to might be even worse. For that reason alone I'll have to pick omnipotence and omniscience. Now, what to do?

Step 1. Rename myself. Leviathan is most appropriate.

Step 2. Start smiting various religious fundies with inexplicable deaths, lightning bolts from clear skies, that sort of thing. Make sure not to harm bystanders or politicians.

Step 3. Terraform Mars, the Moon, Venus. Make sure they're suitable for human habitation.

Step 4. Build an immense gilded palace in space, the size of Jerusalem. Make sure nobody notices it before I send it down to Earth.

Step 5. Kill Gaddhafi (or any other dictator) in a highly visible manner, before turning the entire Sahara Desert into arable land. Then send a message to Earth, telling them that I am coming.

Step 6. Descend. My palace descends from the sky a few weeks before the main spectacle, just to confuse people. It lands in an uninhabited wasteland (Arabia's Empty Quarter is most suitable for this). Then, for two weeks, various famous people go into trances, their eyes glowing and their voices obviously not their own, declaring 'Leviathan is coming'. When the time comes, I manifest myself as an immense creature made of light and fire; just to stop anybody from getting ideas, I teleport all NBC weapons into the sun.

Step 7. Keep myself in an aloof 'big brother' role, only intervening when things get seriously bad, with occasional reminders of my presence. Share some of my infinite knowledge with scientists to increase the rate of advancement, and change some physical laws that I don't like (I'm gonna make FTL possible for starters).
User avatar
Lagmonster
Master Control Program
Master Control Program
Posts: 7719
Joined: 2002-07-04 09:53am
Location: Ottawa, Canada

Re: Godhood - A dilemma

Post by Lagmonster »

This is trying to be a philosophical discussion, not a teenager's idiot fantasy. Either use what brains you have or stay out of the thread.
Note: I'm semi-retired from the board, so if you need something, please be patient.
Simon_Jester
Emperor's Hand
Posts: 30165
Joined: 2009-05-23 07:29pm

Re: Godhood - A dilemma

Post by Simon_Jester »

I don't know about you, but if I say I'd make a terrible god, I'd pass it on: "You know who'd make a decent deity, or at least a better god than me? My friend X."

If you honestly can't think of anyone to say that of, you are either very poor in terms of the quality of people you know, or very very self-righteous.

The whole thing reminds me of that Heinlein quote that was quote of the week here about... [checks] a year back:

“Men rarely (if ever) manage to dream up a god superior to themselves. Most gods have the manners and morals of a spoiled child.”
This space dedicated to Vasily Arkhipov
Rossum
Padawan Learner
Posts: 422
Joined: 2010-04-07 04:21pm

Re: Godhood - A dilemma

Post by Rossum »

Hmm... would this omnipotence allow me to raise the dead and/or pass on my omnipotence once I've gained it?

If so, then I first take the power for myself (to avoid it going to waste or going to someone worse than myself), then read up on all the history of the world to find the nicest and sanest people in it. For simplicity's sake I'd resurrect Fred Rogers first.

Then, I'd ask the collection of the nice and sane people what I should use the power for. Basically let them be my conscience until I either learn how to be reliably benevolent or get a better system in place.

Eventually, the best bet would be to get a world government in place (or something like it) where the various problems of humanity can be directed, looked at by intelligent and ethical people, and the really major ones can be directed to me to solve. Humanity should explore the universe (actually, one of the first things I should do is scan the universe to see if there is life out there, primarily looking for threats but also be willing to help them if they are in trouble).


Not sure if I should actually transfer my power to someone else or just reprogram myself to be nice and listen to the appointed human government.
Fry: No! They did it! They blew it up! And then the apes blew up their society too. How could this happen? And then the birds took over and ruined their society. And then the cows. And then... I don't know, is that a slug, maybe? Noooo!

Futurama: The Late Philip J. Fry
User avatar
Crossroads Inc.
Emperor's Hand
Posts: 9233
Joined: 2005-03-20 06:26pm
Location: Defending Sparkeling Bishonen
Contact:

Re: Godhood - A dilemma

Post by Crossroads Inc. »

For newcomers to the thread I would please ask to go through and re-read what has been posted already.

Thus far this is a discussion about the true nature of Godhood. Things like "What should I do" may well become moot, because, as "god" you would "know" everything. It isn't a matter of making small choices like who to ask about what to do, you can recreate Creation with a wave of the hand.

What this is so far is really a discussion about the true moral obligations that being a "god" would entail.

Currently "SI" is debating Simon_Jester about the nature of how such godhood should be utilized.



For me?

I touched on this earlier, but the universe is a hostile place. Someone with "god" powers should not assume 'itself' to be good. Gods are, as others have stated, often bitchy, self-righteous, vengeful, etc. Someone who has such power could spend as long as they wish deciding "how" to use it. They could spend a million years inside a small universe toying with concepts of reality to work out what best to do.

There is so much that would become inconsequential with such a vast expansion of consciousness that it is hard to truly judge how someone would behave. I feel that if this were to happen, the person may simply go off and start a new universe from scratch to experiment with.
Praying is another way of doing nothing helpful
"Congratulations, you get a cookie. You almost got a fundamental English word correct." Pick
"Outlaw star has spaceships that punch eachother" Joviwan
Read "Tales From The Crossroads"!
Read "One Wrong Turn"!
Simon_Jester
Emperor's Hand
Posts: 30165
Joined: 2009-05-23 07:29pm

Re: Godhood - A dilemma

Post by Simon_Jester »

Put simply, I don't like the idea of us saying "a god should transform everything entirely!" Because most of the arguments for that are based on fairly simple, single-variable ethical systems: maximize the number of Happiness Points in the universe, maximize the number of Free Will Points in the universe, something like that.

It's very easy for anyone, even someone who knows very little about the universe we have, to imagine some hypersimplified bubble-world in which the number of Happiness Points or Free Will Points or some such is maximized. These are easy thought-experiments to construct. The question is, is that really what you want? Should it be? Would everyone really want to live in a world where the number of Happiness Points is maximized; does happiness even have value at that point or does it just become an arbitrary number?

Because it is very easy, in this kind of thought experiment, to wind up with a universe that optimizes some arbitrary number at the expense of many things we would normally value, or parts of our experience that a lot of people wouldn't want to lose. And yes, Crossroads, this is something a responsible deity would put a lot of thought into.



I'd argue that any one of us who can say now what they think the whole universe should look like has ambitions far too large-scale compared to the amount of thought they've put into the problem. I am not qualified to decide for myself what the world should be. I feel reasonably competent to decide some of the things it shouldn't be, but that's so much easier it defies description.

To take an example, imagine you had the ability to alter the physical shape of the world, changing the behavior of gravity and atmosphere and so on, such that you could do it without killing anyone or anything. You can decide whether the Earth should be shaped like a sphere, or a cube, or a taco, or any other shape.

Thinking that over, I feel qualified to say that the world shouldn't be a long cylinder the diameter of a tightrope; I wouldn't want to live in such a world because it would be very unpleasant (and difficult to travel on, if you didn't happen to want to spend the rest of your life between your nearest neighbors). That strikes me as a very bad idea.

But at the same time, I wouldn't pretend to be qualified to say "we should turn the world inside out and walk around on the interior of a hollow sphere." I'm confident enough to make negative statements about solutions that I feel would be hurtful, but not enough to make positive statements about what the final outcome ought to be. And I am strongly opposed to the idea of anyone reshaping the world into a geometry that suits them better unless I have tremendous faith in the idea that they have put one hell of a lot more high-quality thought into the problem than I have.

And that generalizes to morals and all other aspects of changing the answers to 'deep' questions with the aim of making the universe better. Anyone who does it bloody well should have put a lot more thought into it than I ever could, and if it is even slightly possible I'd like to see the evidence that they've done it.



This standard of "anyone" includes me too- I am not competent to reshape my world in a way that can reasonably be 'fair' by the standards of my own ethical values, or at least the "better angels of [my] nature."

Now, I can conceive of a being that is capable of doing the job. But it is very difficult for me to imagine such a being, with the level of self-editing ability that the Singularitarians believe is necessary, and with the scale of power to alter reality that a god would need, that wouldn't go completely off the rails and do something that to me feels like an attempt to reshape reality on a Procrustean bed.

There's nothing wrong with the idea of a bed of a certain size, but there's something very wrong with the idea that everything else in the world needs to be reshaped to fit that bed.
This space dedicated to Vasily Arkhipov
User avatar
Purple
Sith Acolyte
Posts: 5233
Joined: 2010-04-20 08:31am
Location: In a purple cube orbiting this planet. Hijacking satellites for an internet connection.

Re: Godhood - A dilemma

Post by Purple »

One thing that strikes me here is that everyone (well, except me) is thinking large. You are thinking of how you would reshape the universe in your image or by your motivation, or alternatively why one should not do that. But why is there some sort of requirement that with infinite power must come infinite control and infinite desire to control?

Personally, I would just treat the universe as my playground like Q does. The only care I would take is not to cause too much harm and to sort of cause good when I feel like it. But I can't really see where the desire to go all out like a megalomaniac and change the fabric of the universe comes from.
It has become clear to me in the previous days that any attempts at reconciliation and explanation with the community here has failed. I have tried my best. I really have. I pored my heart out trying. But it was all for nothing.

You win. There, I have said it.

Now there is only one thing left to do. Let us see if I can sum up the strength needed to end things once and for all.
Eulogy
Jedi Knight
Posts: 959
Joined: 2007-04-28 10:23pm

Re: Godhood - A dilemma

Post by Eulogy »

One thing the universe should be, however, is fairer. Yeah, I know the universe cannot care, but if people started off on an equal footing (i.e. not being born with a genetic defect or getting his or her genitals mutilated) and were not subject to disastrous, cruel twists of fate, I would think that would go a long way towards reducing suffering.

Everything should be fully reversible, so genuine mistakes don't lead to torturous brain cancer or something. However, it shouldn't necessarily be easy. A truly dickheaded move should take its own long quest to undo.

If I was really feeling playful, I'd create alternate dimensions, each with its own physics, and let people make use of and have fun with them. A dimension where anybody can fly, for example. Of course, some of these dimensions could also do double duty as an afterlife, not that people would die if they didn't want to.

I'm using the word should a lot - but the universe has had it coming.

Simon, you could always give people the means to transform themselves.
"A word of advice: next time you post, try not to inadvertently reveal why you've had no success with real women." Darth Wong to Bubble Boy
"I see you do not understand objectivity," said Tom Carder, a fundie fucknut to Darth Wong
Simon_Jester
Emperor's Hand
Posts: 30165
Joined: 2009-05-23 07:29pm

Re: Godhood - A dilemma

Post by Simon_Jester »

You can, but there's so much complexity involved, and a great deal at stake- give people the power to remake themselves and you are in some sense responsible for their development. If one is going to do it, one has to look at the outcomes, at what is being changed and why, and whether there's something worth having on the other side of the metaphorical singularity.

I don't think a god should want to be responsible for creating a new natural order that would be considered accursed or unworthy by decent-minded inhabitants of the old natural order. That imposes a very high standard of due-diligence and care on the god.
This space dedicated to Vasily Arkhipov
User avatar
Sriad
Sith Devotee
Posts: 3028
Joined: 2002-12-02 09:59pm
Location: Colorado

Re: Godhood - A dilemma

Post by Sriad »

A dilemma I realized a while after I first came face-to-face with the "If God is all powerful and good[...]" conundrum is this:

Suppose a being is all-knowing.

Implicit in "all-knowing", such a being has perfect knowledge of all possible realities. Everything from eternally-raped-with-acid-covered-sea-urchins Hell to blissful unending Heaven... but here's the kicker. If it has all the data needed to describe something perfectly (and it does) that's equivalent to the thing ACTUALLY EXISTING by the same logic as it being immaterial whether we live in the universe or a flawless simulation of the universe. If you're getting bamboo slivers shoved under your fingernails it doesn't make a difference whether you're made of atoms, signals in a staggeringly powerful computer, or whatever storage medium omniscient beings use... the data is all that matters.

The only way for war, disease, suffering, etc to be eliminated is for the all-knowing being to eliminate its own knowledge of those things.

I'd need to spend a while trying to figure that one out if I got god-powers.
Simon_Jester
Emperor's Hand
Posts: 30165
Joined: 2009-05-23 07:29pm

Re: Godhood - A dilemma

Post by Simon_Jester »

I don't buy it, Sriad.

Consider, among other things, that there is a difference between knowing the outcome of a simulation and the process of simulation itself. If I can run a perfect world simulator, and I know that if I set up certain initial conditions in the simulation the world will explode, but I do not run the simulation, that is not the same as having blown up a world.

Or if I run a low-granularity simulation, one that does not have the fine-grained detail and self-awareness of the 'real reality' but which is nonetheless useful for predictive purposes.

In general, there has to be some difference between the map and the territory; knowing that a thing will happen is not the same as the thing having already happened.
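To put the map/territory point in concrete terms, here's a toy sketch (purely an analogy of my own, nothing rigorous): a closed-form prediction gives you the outcome of a process without ever constructing its intermediate states, while a step-by-step simulation builds every one of them.

```python
# Toy analogy: predicting an outcome vs. simulating the process.
# A population that doubles each generation.

def simulate(p0: int, generations: int) -> int:
    """Step through every intermediate state (the 'territory')."""
    p = p0
    for _ in range(generations):
        p *= 2  # each intermediate population is actually constructed
    return p

def predict(p0: int, generations: int) -> int:
    """Closed-form answer (the 'map'): no intermediate state is ever built."""
    return p0 * 2 ** generations

assert simulate(3, 10) == predict(3, 10) == 3072
```

Both routes agree on the outcome, but only `simulate` ever instantiates the intermediate generations; if those states were morally significant, the two routes would not be equivalent.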
This space dedicated to Vasily Arkhipov
User avatar
Sriad
Sith Devotee
Posts: 3028
Joined: 2002-12-02 09:59pm
Location: Colorado

Re: Godhood - A dilemma

Post by Sriad »

The issue isn't running the simulation or not; it's that with omniscience one would know the outcome of every infinitesimal step of the simulation without "running" it. If an omniscient being knows the outcome of a process, doesn't it HAVE to know the process that led to that outcome too? Similarly, granularity of knowledge isn't a refuge because, again, omniscience = perfect knowledge.

I guess in the end there's a moral imperative to limit one's omniscience, which SHOULD be within the scope of the godlike powers on offer here; it's just a hard balance to find, and very counterintuitive that to be good I'd have to limit myself not just in action, but in thought itself, because with that kind of power and intellect backing it up, thought is essentially indistinguishable from action, even if that action can be ret-conned.
Simon_Jester
Emperor's Hand
Posts: 30165
Joined: 2009-05-23 07:29pm

Re: Godhood - A dilemma

Post by Simon_Jester »

Well, to get wonky and philosophical:

If such an omnipotence and omniscience is possible, then we're sort of stuck with the idea that there is some kind of 'superreality': something which can override the normal operating constraints on mind and body. Operating at that level, I strongly suspect one could, for example, run a universe simulator without qualia: none of the entities in the simulation have any internal feeling that corresponds to their external behavior.

This would be perhaps in some sense creepy, equivalent to growing brainless clones of human bodies to do cancer research or something... but it wouldn't raise the question of "are we torturing people by doing this" because the qualia-free inhabitants of the simulation do not perceive torture as something qualitatively unpleasant.

I would argue that there is still a distinction between thought and action, even with a perfectly fine-grained perception of what goes on in one's thoughts; a simulation can be internally real to its occupants but doesn't have to be, and there is still in some significant sense a hierarchy of realities: a simulation within a simulation within a dream within a dream is not on the same level of significance as the thing doing the dreaming.

Put another way, it is conceivable that the thoughts of a strongly-godlike being can be functionally equivalent to reality for ethical purposes (what is unethical in the thoughts is unethical in reality and vice versa)... but this does not have to be inevitably true. Moreover, the existence of omniscience strongly implies an ability to 'cheat,' to obtain infinite information without having to do infinite computation. This means that I may be able to obtain the results of a course of action without having to perform the process that leads to them... in which case no it is not equivalent to having performed the process.

If I can get fruit without planting trees, the existence of the fruit does not imply that I have planted the tree. If I can get information without performing ghoulish experiments, the existence of the information does not imply that I have performed a ghoulish experiment.
This space dedicated to Vasily Arkhipov
User avatar
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs
Contact:

Re: Godhood - A dilemma

Post by Starglider »

Simon_Jester wrote:Operating at that level, I strongly suspect one could, for example, run a universe simulator without qualia: none of the entities in the simulation have any internal feeling that corresponds to their external behavior.
This is more difficult than it sounds. Qualia aren't trivial epiphenomena to be easily dispensed with; they are the reflective perception of the basic functional mechanisms that animal brains use to process sensory data. Those mechanisms are generally the simplest ones that will work, given the available hardware. Alien and artificial intelligences will almost certainly have them too, although of course the details will differ.

It is almost certainly possible to generate intelligent-looking behavior without using mechanisms that constitute qualia, but this is not a low-level universe simulation. This would be an emulation that omits the content representing the brains of intelligent entities and replaces it with something completely different, e.g. a superintelligence using some combination of brute force, interpolation of canned responses and what we'd consider acting to impersonate the intelligent entities.
Put another way, it is conceivable that the thoughts of a strongly-godlike being can be functionally equivalent to reality for ethical purposes
They are. This is the 'internal actor genocide' problem in Friendly AI; when internal simulations are of sufficient fidelity to be sentient, unless special precautions are taken the mere act of thinking about something can create millions of new people and then immediately kill them.
Simon_Jester
Emperor's Hand
Posts: 30165
Joined: 2009-05-23 07:29pm

Re: Godhood - A dilemma

Post by Simon_Jester »

Starglider wrote:
Simon_Jester wrote:Operating at that level, I strongly suspect one could, for example, run a universe simulator without qualia: none of the entities in the simulation have any internal feeling that corresponds to their external behavior.
This is more difficult than it sounds. Qualia aren't trivial epiphenomena to be easily dispensed with; they are the reflective perception of the basic functional mechanisms that animal brains use to process sensory data. Those mechanisms are generally the simplest ones that will work, given the available hardware. Alien and artificial intelligences will almost certainly have them too, although of course the details will differ.

It is almost certainly possible to generate intelligent-looking behavior without using mechanisms that constitute qualia, but this is not a low-level universe simulation. This would be an emulation that omits the content representing the brains of intelligent entities and replaces it with something completely different, e.g. a superintelligence using some combination of brute force, interpolation of canned responses and what we'd consider acting to impersonate the intelligent entities.
Which is where "omnipotent" comes in. Yes, I appreciate the level of difficulty involved, but this is something that is proposed to be done by an extremely potent superintelligence in the first place, so the ethical problem can be circumvented should said supermind feel it necessary.

We're not restricted here to trying to figure out behavior by brute-force simulation of the brain on an atom-by-atom basis.

Moreover, I am still unsure whether, under the (seemingly impossible) condition of omniscience, we must necessarily assume that knowing the outcome of an action implies that the action has been performed in simulation- and therefore, for all intents and purposes, performed in reality because the simulation is perfect by definition.

As to the "internal actor genocide" problem, I find myself compelled to wonder what 'death' means in the context of an internal Sim* when the simulation stops running. There is no qualitative experience of the simulation ending, it just stops, potentially to be restarted later if we remembered to save the simulation state, which under the circumstances we'd damn well better.

If we are inhabitants of a simulated universe, we can't really hope to tell whether our simulation runs continuously or occasionally spends a millennium or two in limbo sitting on the Thumb Drive of God, so to speak.
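
As a toy sketch of that point (purely illustrative, and assuming a deterministic world-rule): a simulation that is snapshotted, left in limbo, and later resumed ends up in exactly the same state as one that ran without interruption, and nothing on the inside records the gap.

```python
import copy

# Toy deterministic "world": the state advances by a fixed rule each tick.
def step(state: dict) -> dict:
    new = copy.deepcopy(state)
    new["tick"] += 1
    new["value"] = (new["value"] * 31 + 7) % 1000  # arbitrary deterministic rule
    return new

def run(state: dict, ticks: int) -> dict:
    for _ in range(ticks):
        state = step(state)
    return state

start = {"tick": 0, "value": 42}

# One run that never pauses...
continuous = run(copy.deepcopy(start), 100)

# ...and one that runs 50 ticks, is snapshotted, sits in limbo, then resumes.
snapshot = run(copy.deepcopy(start), 50)
resumed = run(copy.deepcopy(snapshot), 50)

# From the inside, nothing distinguishes the paused run from the continuous one.
assert resumed == continuous
```

The pause leaves no trace in the state itself; only an observer outside the simulation could tell the two histories apart.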

The internal actor genocide issue would seem to become really significant only when we're talking about creating especially bad simulated worlds. Popping a world similar to Earth into existence doesn't strike me as intrinsically immoral unless the existence of Earth itself is intrinsically a net moral negative. Whereas running an Auschwitz simulation is clearly a net moral negative if your simulated inmates are suffering the way the real ones did.
____________

*My preferred term for the residents of a hypothetical simulation; what can I say, I was playing Will Wright games around the time I turned eight and it stuck.
This space dedicated to Vasily Arkhipov
User avatar
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs
Contact:

Re: Godhood - A dilemma

Post by Starglider »

Simon_Jester wrote:Moreover, I am still unsure whether, under the (seemingly impossible) condition of omniscience, we must necessarily assume that knowing the outcome of an action implies that the action has been performed in simulation- and therefore, for all intents and purposes, performed in reality because the simulation is perfect by definition.
This gets into ontological questions of causality and reality as information content. For various reasons I find the easiest way to rationalise this to be that (simplifying) instances of a pattern occurring lend support to the platonic prototype pattern, where for intelligent beings the amount of support for a causal path translates into the subjective probability of experiencing that event. This seems like the only way to make sense of potentially infinite universes, many worlds, and all the strange paradoxes you get when doing thought experiments on digital intelligences (which can be losslessly copied, repeated, etc).

I don't think it's logically possible to acquire information without giving support to some causal chain from axioms to the desired proof, but certainly a superintelligence could choose which path to take and potentially reduce the support to an infinitesimal level, compared to that provided by the rest of the universe.
There is no qualitative experience of the simulation ending
There is no qualitative experience of sitting on a nuclear weapon when it goes off either, events happen too fast for any kind of neural processing to take place.
it just stops, potentially to be restarted later if we remembered to save the simulation state, which under the circumstances we'd damn well better.
Yes, this is actually one of the arguments for a 'flight recorder' tamper-proof full log for seed AI, in that if it accidentally creates and kills a few trillion people at least we can restore them later. Of course if you believe the dust hypothesis we don't have a moral requirement to do so anyway, but that is of course highly speculative. :)
If we are inhabitants of a simulated universe, we can't really hope to tell whether our simulation runs continuously or occasionally spends a millenium or two in limbo sitting on the Thumb Drive of God, so to speak.
Obviously. Human consciousness is kind of like that anyway considering the sheer amount of physical events that happen between each neuron firing, never mind actual thought.
The internal actor genocide issue would seem to become really significant only when we're talking about creating especially bad simulated worlds.
Well, this depends on whether you think that people, once created, should have a right to continue existing, regardless of whether their death is instantaneous and painless or not. Personally I would say that freedom is the right of all sentient beings; all sapient intelligences should have the choice of whether to continue existing.
Locked