Destructionator XIII wrote:
It's about the process of decision making. In each step, the driver had a reason to make a new decision: either something changed, causing him to react to the change, or someone pointed out an error in his original plan.
A single human driver in such a scenario is making decisions on instinct, and/or making snap judgements. Debate doesn't enter into it - there's no time for it. On the other hand, societies change more slowly.
Applying this to transhumanism: we're already humans, so the decision to change needs a positive reason, and probably a pretty strong one too (how strong depends on the individual).
A shot at immortality feels like a strong enough reason. It's a concept that has intrigued humanity throughout the ages, or at least those who weren't too busy finding their next meal.
If you fail to convince me to remain human, it doesn't matter - I'll naturally stay that way whether I like it or not, unless an active decision is made to change things.
Well, if you want to stay human I don't see the problem with that. It's a phenotype with considerable sentimental value.
Even a list of positives doesn't necessarily hold any weight. "Floor it!" "Why?" "You'll go faster!" Yeah, I guess, but maybe you don't care about going faster, or maybe there's a red light ahead, so you'll just have to stop soon anyway... in which case going faster now doesn't even get you to the destination any faster!
The red light is why I don't expect much societal change from AI and the like. Maybe you can run faster, but if you hit a red light and have to wait anyway... I can still catch up with you.
The red light in the real world might be building new physical infrastructure, getting the materials or data together (for instance, when doing a science experiment, analyzing the data might take a lot less time than taking the measurements), waiting on a team mate, or talking to people to figure out the requirements.
I think this makes the same kind of mistake that people make about technology or evolution. Multicellular organisms haven't supplanted single-celled organisms any more than touchscreens have replaced buttons. Similarly, I don't think AIs or transhumans will necessarily replace humanity; their unique natures will allow them to fill different niches in a larger meta-civilisation.
Of course, the competition aspect Ford Prefect talked about on the last page might convince a lot of individuals to change, which probably changes things about society too... but even that I really doubt will be revolutionary. As a civilization, we'll keep doing basically the same things, just, at most, a wee bit faster on the whole.
"More of the same, but bigger" is not borne out by the history books. We don't build pyramids the size of Everest to bury our god-kings in.
Sure, but suppose you fail to justify the way things are. That isn't going to make things change on its own. You have to convince people to take action to enact change.
But of course. That's why I'm not happy with the transhumanist movement as it currently stands - too many sociopathic libertarians and wannabe supermen.
Does it follow that I reject all authority? Perish the thought. In the matter of boots, I defer to the authority of the boot-maker - Mikhail Bakunin
Capital is reckless of the health or length of life of the laborer, unless under compulsion from society - Karl Marx
Pollution is nothing but the resources we are not harvesting. We allow them to disperse because we've been ignorant of their value - R. Buckminster Fuller
The important thing is not to be human but to be humane - Eliezer S. Yudkowsky

Nova Mundi, my laughable attempt at an original worldbuilding/gameplay project