The Transcension Hypothesis?

SLAM: debunk creationism, pseudoscience, and superstitions. Discuss logic and morality.

Moderator: Alyrium Denryle

cosmicalstorm
Jedi Council Member
Posts: 1642
Joined: 2008-02-14 09:35am

The Transcension Hypothesis?

Post by cosmicalstorm »

I've been looking for a good explanation for the Fermi paradox for a long time. I found this one recently; it's the best explanation I've come across so far. Are there any major holes in his explanation that I'm missing? It's too long to post in quote format, but here is the TL;DR. I know it's a bit "singularity-ish".
Highlights:
• Evolutionary developmental (evo devo) biology is used as a model for universal evolution and development.
• A developmental process is proposed that takes universal intelligence to black hole efficiency and density.
• Unique properties of black holes as attractors for advanced intelligence are reviewed.
• Testable implications of black hole transcension on exoplanet search, METI and SETI are proposed.
• Information theoretic and evo devo arguments for transcension as a solution to the Fermi paradox are proposed.
The Transcension Hypothesis:
Sufficiently Advanced Civilizations May Invariably Leave Our Universe,
and Implications for METI and SETI.

http://accelerating.org/articles/transc ... hesis.html
cosmicalstorm
Jedi Council Member
Posts: 1642
Joined: 2008-02-14 09:35am

Re: The Transcension Hypothesis?

Post by cosmicalstorm »

Nice telescope. :D
Furthermore, as Eshleman (1979, 1991), Maccone (1992, 1998) and others propose, massive objects like our sun are great telescopes for gravitational lensing, for collecting nonlocal information about the universe. Sensors in orbit at a stellar focal distance of 550 AU from our sun would have very high quality nonlocal information streaming into them. Maccone has estimated that such a telescope would allow us to observe planets across our galaxy as if they were in our own solar system, and detect and analyze the faintest EM signals. But as Vidal (2010b) has noted, the resources of our parent star, if absorbed into a black hole instead, would allow an even better telescope to be produced, one with a focal distance only a few kilometers away from the hole (Maccone 2010), and thus the ability to field a far higher density of sensors. Should an intelligent civilization, prior to its final formation as a black hole, construct a focal sphere of tiny orbiting sensors at the appropriate focal distance in normal space, the sensors would stream the ultimate high resolution movie of all future universal activity into it as it instantaneously headed to its gravitationally-determined merger point. Thus we may call the black hole focal sphere an "ultimate learning device," as it would capture as much remaining nonlocal information as may be theoretically possible, in the shortest local time possible, allowing all black hole civilizations to record remaining universal reality as fast as possible, then update their perennially imperfect and incomplete models as best as possible at the merger.
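The 550 AU figure in the quote is easy to sanity-check with the standard minimum focal distance of a gravitational lens, F = R²c²/(4GM), for light grazing the lensing body's limb. A minimal sketch using commonly cited constants (these values are my own, not taken from the quoted article):

```python
# Light grazing the Sun's limb is deflected by theta = 4GM/(c^2 R),
# so grazing rays converge at F = R / theta = R^2 c^2 / (4 G M).
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg
R_sun = 6.957e8      # solar radius, m
AU = 1.496e11        # astronomical unit, m

F = R_sun**2 * c**2 / (4 * G * M_sun)
print(f"Minimum focal distance: {F / AU:.0f} AU")  # ~548 AU, matching the quoted 550 AU
```

Rays passing farther from the limb focus beyond this minimum distance, which is one reason lens-mission proposals target 550 AU or more rather than exactly the grazing-ray focus.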
He's no fan of the Kardashev scale.
Expansion-oriented scholars frequently refer to the Kardashev scale (1964, 1997), which first proposed that growth in a civilization’s total energy use (using first a planet, then a sun, then a galaxy, then the universe's energy budget) is a reasonable metric for development. But if transcension is the fate of all advanced civilizations, total energy use would only be a proxy for early civilization development, and would be a misleading indicator late in development. Total energy use cannot grow exponentially beyond the energy budget of the civilization's star, unless such civilizations enter black holes (inner space) and undergo extreme time dilation, a topic we will discuss shortly. In normal (outer) space, we can expect a civilization's energy use to grow logistically to the limit of local energy resources, due to the vast distances between stars. Barrow (1998) has proposed an anti-Kardashev scale, where the key metric is instead the miniaturization (spatial localization) of a civilization’s engineering, perhaps terminating at the Planck scale. Due to STEM compression, intelligent civilizations can presumably continue to develop exponentially more localized, miniaturized, dense, efficient and complex structures and energy flows to generate greater computational and adaptive capacity, right up to the black hole limit and presumably even beyond, as black hole event horizons in stellar mass and supermassive black holes are still well above the apparent Planck-scale limits of universal structure. Thus, if the hypothesis is correct, the Barrow scale, and more generally, STEM efficiency, STEM density, and computational growth scales would be much more appropriate measures of civilization development.
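To see why the passage says total energy use "cannot grow exponentially beyond the energy budget of the civilization's star", it helps to put rough numbers on the Kardashev rungs. A minimal sketch, using commonly cited figures and an assumed steady 2% annual growth rate (all numbers are my own illustration, not from the article):

```python
import math

# Rough, commonly cited power figures (assumptions for illustration):
P_now = 2e13       # current human power consumption, W
L_sun = 3.8e26     # solar luminosity, W (roughly Kardashev Type II)
L_galaxy = 1e37    # rough Milky Way luminosity, W (roughly Type III)
growth = 0.02      # assumed 2% per year exponential growth

# Years of exponential growth needed to reach each budget: t = ln(P/P0)/ln(1+g)
years_to_sun = math.log(L_sun / P_now) / math.log(1 + growth)
years_to_galaxy = math.log(L_galaxy / P_now) / math.log(1 + growth)
print(f"~{years_to_sun:.0f} years to match the Sun's entire output")
print(f"~{years_to_galaxy:.0f} years to match the galaxy's output")
```

At these assumed rates the stellar budget is exhausted in under two millennia, while the galactic rung sits many thousands of light-years away; that gap between exponential demand and light-speed-limited supply is the core of the logistic-growth point above.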
Simon_Jester
Emperor's Hand
Posts: 30165
Joined: 2009-05-23 07:29pm

Re: The Transcension Hypothesis?

Post by Simon_Jester »

Anyone who thinks the noun for transcending is "transcension" instead of "transcendence" probably isn't bright or educated enough to deserve to be taken seriously. Singularity-wank provides false dignity to a LOT of morons and bozos with power fantasies...
This space dedicated to Vasily Arkhipov
Lord of the Abyss
Village Idiot
Posts: 4046
Joined: 2005-06-15 12:21am
Location: The Abyss

Re: The Transcension Hypothesis?

Post by Lord of the Abyss »

Simon_Jester wrote:Anyone who thinks the noun for transcending is "transcension" instead of "transcendence" probably isn't bright or educated enough to deserve to be taken seriously.
It's not all that rare a usage. "Transcension" seems to be used more to specially refer to the act of transcending something, while "transcendence" is used to refer to the state of having transcended.
"There are two novels that can change a bookish fourteen-year old's life: The Lord of the Rings and Atlas Shrugged. One is a childish fantasy that often engenders a lifelong obsession with its unbelievable heroes, leading to an emotionally stunted, socially crippled adulthood, unable to deal with the real world. The other, of course, involves orcs." - John Rogers
cosmicalstorm
Jedi Council Member
Posts: 1642
Joined: 2008-02-14 09:35am

Re: The Transcension Hypothesis?

Post by cosmicalstorm »

Isn't that a bit of an easy dismissal, Simon? Most of the trends he is referring to are pretty solid. Do you know anyone who has made a more "thorough" dismissal of this concept? I want to see both sides.
GrandMasterTerwynn
Emperor's Hand
Posts: 6787
Joined: 2002-07-29 06:14pm
Location: Somewhere on Earth.

Re: The Transcension Hypothesis?

Post by GrandMasterTerwynn »

cosmicalstorm wrote:I've been looking for a good explanation for the Fermi paradox for a long time. I found this one recently; it's the best explanation I've come across so far. Are there any major holes in his explanation that I'm missing? It's too long to post in quote format, but here is the TL;DR. I know it's a bit "singularity-ish".
Highlights:
• Evolutionary developmental (evo devo) biology is used as a model for universal evolution and development.
• A developmental process is proposed that takes universal intelligence to black hole efficiency and density.
• Unique properties of black holes as attractors for advanced intelligence are reviewed.
• Testable implications of black hole transcension on exoplanet search, METI and SETI are proposed.
• Information theoretic and evo devo arguments for transcension as a solution to the Fermi paradox are proposed.
The Transcension Hypothesis:
Sufficiently Advanced Civilizations May Invariably Leave Our Universe,
and Implications for METI and SETI.

http://accelerating.org/articles/transc ... hesis.html
Here are a couple simple alternatives to this hypothesis.
A) It is difficult to envision evolution selecting for extremely long-term planning. Ergo, species will tend to make short-sighted decisions that condemn them to spend their entire existence trapped at the bottom of their home gravity pit. There is a strong possibility that Humans are utterly mundane in this regard. Our decision to put several hundred million years of biologically produced carbon right back into the atmosphere will certainly place on us a very substantial burden as the rate of climate change starts to surpass our ability to cope with it. Also, our decision to use the energy liberated in burning all that carbon to equip everyone with their own personal pollution machine, a McMansion, and the latest disposable McPhones and other electronic toys ... instead of, say, efficient urban planning and Moon colonies, may lock us on Earth's surface when we wake up one day to realize all the cheap energy is gone, and the cost of producing enough energy just to sustain civilization rises to the point where it excludes humans leaving Earth's surface.
B) It is looking increasingly certain that, apart from some lingering errata around the margins, our understanding of how the universe works achieved the highest possible level by the last half of the twentieth century. It is also looking like we are likely to hit the limits of materials science and engineering by the end of the twenty-first. Which is to say that any alien species in the universe may be limited in what they can achieve by the laws of physics. Which is to say that, for species that manage to not kill themselves, founding interstellar colonies will be bloody expensive. On top of that, miniaturization down to the pixel-level of the universe may not actually be achievable without elf magic. The same goes for the creation of superduperhypermegaquantum Singularity civilization singularities. Of course, this does not discount the possibility that a sufficiently advanced civilization might choose to find a natural black hole and jump into it ... but they'd be doing so simply out of boredom with a thoroughly depressing universe.

Hypotheses A and B are testable. A species that manages not to get itself killed will probably end up consuming all the rocky material in their home system to build lots of solar collectors and orbital habitats ... creating a Dyson swarm (as you can then stick around your home sun pretty much until it gets ready to blow off all its outer layers and become a white dwarf.) A Dyson swarm has a distinctive infrared signature. So we should find them ... but only a handful of them.
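The "distinctive infrared signature" can be made concrete with Wien's displacement law: a swarm intercepting its star's light must re-radiate the same power as waste heat at the shell's much lower temperature. A quick sketch, with assumed (hypothetical) shell temperatures:

```python
# Wien's displacement law: thermal emission peaks at lambda = b / T.
# The shell temperatures below are assumptions for illustration; a swarm
# at roughly Earth's orbital distance would sit near ~300 K.
b_wien = 2.898e-3  # Wien's displacement constant, m*K
peaks_um = {T: b_wien / T * 1e6 for T in (50, 150, 300)}  # peak wavelength, micrometers

for T, peak in sorted(peaks_um.items()):
    print(f"T = {T:3d} K -> thermal emission peaks at {peak:.1f} um (infrared)")
```

A star's worth of luminosity peaking around 10 micrometers instead of in the visible is exactly the anomaly infrared surveys can look for.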
K. A. Pital
Glamorous Commie
Posts: 20813
Joined: 2003-02-26 11:39am
Location: Elysium

Re: The Transcension Hypothesis?

Post by K. A. Pital »

Please; a Kardashev multistellar civilization could easily be built by physically immortal intelligent constructions/beings. The fact that he overlooks such a basic possibility makes the rest of the analysis stupid.
There: churches, rubble, mosques and police stations; there, borders, unaffordable prices and cold quips,
There: swamps, threats, snipers with rifles, papers, nighttime queues and clandestine migrants,
Here: encounters, struggles, synchronized steps, colors, unauthorized huddles,
Migratory birds, networks, information, squares crazy with passion...

...Tranquility is important, but freedom is everything!
Assalti Frontali
Surlethe
HATES GRADING
Posts: 12267
Joined: 2004-12-29 03:41pm

Re: The Transcension Hypothesis?

Post by Surlethe »

"Black hole-like domains"? So he's saying that advanced civilizations become black holes? Don't you need to posit some post-GR physics about the interiors of black holes for that to be even remotely plausible, or am I misunderstanding his point?
A Government founded upon justice, and recognizing the equal rights of all men; claiming no higher authority for its existence, or sanction for its laws, than nature, reason, and the regularly ascertained will of the people; steadily refusing to put its sword and purse in the service of any religious creed or family, is a standing offense to most of the Governments of the world, and to some narrow and bigoted people among ourselves.
F. Douglass
Simon_Jester
Emperor's Hand
Posts: 30165
Joined: 2009-05-23 07:29pm

Re: The Transcension Hypothesis?

Post by Simon_Jester »

cosmicalstorm wrote:Isn't that a bit of an easy dismissal, Simon? Most of the trends he is referring to are pretty solid. Do you know anyone who has made a more "thorough" dismissal of this concept? I want to see both sides.
Despite my screenname, I do occasionally make a jest. ;)

Mostly, that sort of thing simply irritates me. It speaks of the author's pretenses, at least to me. I'm not entirely serious. If you want me to take it more seriously...

I think, if we ever find an alien civilization that has "transcended" by building itself up into a system-spanning computer network, it won't be very impressive compared to the real dreams of singularitarianism. The racial epitaph might read something like:

"Zabriskan Civilization. ~10000 Before [Untranslatable] to 2137 Age of [Untranslatable]. Cause of Death: Uploaded en masse, then vanished into their own World of Warcraft servers and never came back out. I hope they're having fun in there... Wait, what? You think they built a Dyson Swarm? Why? They didn't need that many more solar panels once they tiled the Great Salt Desert to run the server farm; there were only ten billion of them in the first place. And... built their own black hole? Don't be bloody silly!"

The Future in Cyberspace is, I suspect, going to turn out to be rather prosaic, in spite of all the pretty language. The wild card is artificial intelligence, but that is a wild card, as in "cannot predict the effects," not as in "makes our wildest dreams come true!"
This space dedicated to Vasily Arkhipov
Ziggy Stardust
Sith Devotee
Posts: 3114
Joined: 2006-09-10 10:16pm
Location: Research Triangle, NC

Re: The Transcension Hypothesis?

Post by Ziggy Stardust »

Anyone who thinks the noun for transcending is "transcension" instead of "transcendence" probably isn't bright or educated enough to deserve to be taken seriously.
Actually, "transcension" is just as correct as "transcendence," it is just a slightly more archaic derivation. It is also equally correct to say "transcendency" instead of "transcendence."

--------------
A) It is difficult to envision evolution selecting for extremely long-term planning. Ergo, species will tend to make short-sighted decisions that condemn them to spend their entire existence trapped at the bottom of their home gravity pit. There is a strong possibility that Humans are utterly mundane in this regard. Our decision to put several hundred million years of biologically produced carbon right back into the atmosphere will certainly place on us a very substantial burden as the rate of climate change starts to surpass our ability to cope with it. Also, our decision to use the energy liberated in burning all that carbon to equip everyone with their own personal pollution machine, a McMansion, and the latest disposable McPhones and other electronic toys ... instead of, say, efficient urban planning and Moon colonies, may lock us on Earth's surface when we wake up one day to realize all the cheap energy is gone, and the cost of producing enough energy just to sustain civilization rises to the point where it excludes humans leaving Earth's surface.
The reason I dislike this argument is that it pre-supposes that humans have always had access to the evidence that carbon emissions are bad. Long-term planning is a non sequitur when, for a very long period of time, there wasn't a firm scientific understanding of the issue. And things get infinitely more complicated when dealing with the present, in which planning is bogged down by politics and other tangential issues. It is just a vast oversimplification to dismiss it all as simply being an issue of long-term planning, and that this is somehow representative of wider evolutionary trends. I mean, do you really think the Republican aversion to accepting global climate change, and the resulting deadlock in coming up with possible legislative solutions, is due only to an inability to plan long-term and nothing else?
Darth Wong
Sith Lord
Posts: 70028
Joined: 2002-07-03 12:25am
Location: Toronto, Canada

Re: The Transcension Hypothesis?

Post by Darth Wong »

Ziggy Stardust wrote:
Anyone who thinks the noun for transcending is "transcension" instead of "transcendence" probably isn't bright or educated enough to deserve to be taken seriously.
Actually, "transcension" is just as correct as "transcendence," it is just a slightly more archaic derivation. It is also equally correct to say "transcendency" instead of "transcendence."
I propose a new word: "trandescension". It's a combination of "transcendence" and "condescension", and it would describe all the fucking "spiritual" people who think that their beliefs "transcend" reality and are therefore not subject to critical thinking, and who condescendingly talk down to you because they think you don't understand the profundity of it.
"It's not evil for God to do it. Or for someone to do it at God's command."- Jonathan Boyd on baby-killing

"you guys are fascinated with the use of those "rules of logic" to the extent that you don't really want to discussus anything."- GC

"I do not believe Russian Roulette is a stupid act" - Embracer of Darkness

"Viagra commercials appear to save lives" - tharkûn on US health care.

http://www.stardestroyer.net/Mike/RantMode/Blurbs.html
cosmicalstorm
Jedi Council Member
Posts: 1642
Joined: 2008-02-14 09:35am

Re: The Transcension Hypothesis?

Post by cosmicalstorm »

Thanks for the input. I'm something of an info-crack-addict. I'll occasionally let myself drift away in dreams like these. I still like to be optimistic and think that some of the nicer singularity-ish futures will turn out to be true. But don't get me wrong, I do try to be realistic. Perhaps we will all be dead five years from now in an accidental nuclear war, or starving in a climate-damaged ruin of civilization twenty years from now due to our own stupidity.
Simon_Jester
Emperor's Hand
Posts: 30165
Joined: 2009-05-23 07:29pm

Re: The Transcension Hypothesis?

Post by Simon_Jester »

It's not a question of pessimism versus optimism, for me. It's more that I see these people coming up with very wild ideas based on limited, often half-baked understanding of how the universe works. "A black hole telescope could let someone learn everything!" is the sort of thing that sounds a lot more interesting before you sit down and seriously consider what's involved and what you'd get out of it, I think.

And this sort of mindset, which constantly bubbles up on the fringe of Singularitarianism, keeps making me twitchy every time I hear another grand pronouncement. I don't think the future is inherently stupid or dull or inferior. But I do think that fantasizing about what it all looks like in the end state of all technologically-boosted consciousness is a bit pointless. It's not really any different from de Chardin rambling about the "omega point" sixty years ago, only now people mangle physics along with philosophy to do it...
This space dedicated to Vasily Arkhipov
Ziggy Stardust
Sith Devotee
Posts: 3114
Joined: 2006-09-10 10:16pm
Location: Research Triangle, NC

Re: The Transcension Hypothesis?

Post by Ziggy Stardust »

Simon_Jester wrote:It's not a question of pessimism versus optimism, for me. It's more that I see these people coming up with very wild ideas based on limited, often half-baked understanding of how the universe works. "A black hole telescope could let someone learn everything!" is the sort of thing that sounds a lot more interesting before you sit down and seriously consider what's involved and what you'd get out of it, I think.

And this sort of mindset, which constantly bubbles up on the fringe of Singularitarianism, keeps making me twitchy every time I hear another grand pronouncement. I don't think the future is inherently stupid or dull or inferior. But I do think that fantasizing about what it all looks like in the end state of all technologically-boosted consciousness is a bit pointless. It's not really any different from de Chardin rambling about the "omega point" sixty years ago, only now people mangle physics along with philosophy to do it...
The worst, in my book, are Ray Kurzweil's rabid fans. They remind me a lot of Ron Paul supporters, actually, in how stubborn and passionate they are to defend their beliefs, while simultaneously looking down on dissenters with the "trandescension" Darth Wong was talking about. Basically, anyone who subscribes to unlimited exponential technological advancement should do some research before they speak.
Ariphaos
Jedi Council Member
Posts: 1739
Joined: 2005-10-21 02:48am
Location: Twin Cities, MN, USA

Re: The Transcension Hypothesis?

Post by Ariphaos »

cosmicalstorm wrote:Thanks for the input. I'm something of an info-crack-addict. I'll occasionally let myself drift away in dreams like these. I still like to be optimistic and think that some of the nicer singularity-ish futures will turn out to be true. But don't get me wrong, I do try to be realistic. Perhaps we will all be dead five years from now in an accidental nuclear war, or starving in a climate-damaged ruin of civilization twenty years from now due to our own stupidity.
You do not need to resort to schizophrenia-induced gobbledygook to be optimistic.
Give fire to a man, and he will be warm for a day.
Set him on fire, and he will be warm for life.
cosmicalstorm
Jedi Council Member
Posts: 1642
Joined: 2008-02-14 09:35am

Re: The Transcension Hypothesis?

Post by cosmicalstorm »

Simon_Jester wrote:It's not a question of pessimism versus optimism, for me. It's more that I see these people coming up with very wild ideas based on limited, often half-baked understanding of how the universe works. "A black hole telescope could let someone learn everything!" is the sort of thing that sounds a lot more interesting before you sit down and seriously consider what's involved and what you'd get out of it, I think.

And this sort of mindset, which constantly bubbles up on the fringe of Singularitarianism, keeps making me twitchy every time I hear another grand pronouncement. I don't think the future is inherently stupid or dull or inferior. But I do think that fantasizing about what it all looks like in the end state of all technologically-boosted consciousness is a bit pointless. It's not really any different from de Chardin rambling about the "omega point" sixty years ago, only now people mangle physics along with philosophy to do it...
I think this guy was pretty reasonable, unlike most outright kooks he provided some falsifiable ideas that can be tested in a few decades.
To me it is a matter of being pessimistic. The way I see it, we will definitely run into one or several of the following: nanotech, AI, extremely powerful processing. Probably all at the same time, and that is going to be a strange world.

This is, provided that technological development is not halted by a global disaster like a nuclear war or a global totalitarianism.

Personally, I just cannot see a future where the galaxy is filled with monkeys flying around in giant tin cans with their own plumbing and everything brought with them. At least, that seems as unlikely to me as the singularity does. Intelligence created by our brain has no magic; it will be replicated and will move into machinery and lose its biological constraints eventually.

I hope, but don't assume, that continued technological development will be a good thing; maybe it will kill us all and create an entity that pursues some completely ridiculous goal of its own.
Xeriar wrote:
cosmicalstorm wrote:Thanks for the input. I'm something of an info-crack-addict. I'll occasionally let myself drift away in dreams like these. I still like to be optimistic and think that some of the nicer singularity-ish futures will turn out to be true. But don't get me wrong, I do try to be realistic. Perhaps we will all be dead five years from now in an accidental nuclear war, or starving in a climate-damaged ruin of civilization twenty years from now due to our own stupidity.
You do not need to resort to schizophrenia-induced gobbledygook to be optimistic.
Hi Xeriar, I used to follow your threads regarding RKVs and the like with great interest; I'm sad to see you post so seldom these days. What are your own thoughts on the subject? Have you posted them somewhere on this board in the past?
Ariphaos
Jedi Council Member
Posts: 1739
Joined: 2005-10-21 02:48am
Location: Twin Cities, MN, USA
Contact:

Re: The Transcension Hypothesis?

Post by Ariphaos »

Heh.

In terms of 'existential' level threats, wherein humans lose the ability to determine their own destiny, I think we can break them down into four categories, as typically described.
1) Resource exhaustion, and the effects of such waste.
2) Rent-seekers of various sorts, trying to maintain some grip on humanity's future into the next century.
3) People making mistakes. Rogue AGI, Gray Goo, whatever.
4) Disaffected individuals who will have an easier time having a greater and greater potential to do serious damage.

In terms of civilization as a whole, I think 1 is a non-issue. There will of course be massive tolls to be paid for what has been done to the planet - but no resource is going to vanish overnight, and barons of scarce resources like oil will be facing greater political pressure as time goes on, rather than less. By all means, we're going to have to re-terraform Earth before even thinking about doing it to Mars. But the planet is going to be bigger than we are at least until we get the capacity to do something about it, at which point it becomes somewhat moot.

Rent-seekers are only beginning to feel political backlash, but it has clearly begun. I'm going to go on a lark and make a prediction - the next attempt at copyright extension in the EU and US is going to see a backlash strong enough to get seriously noticed by national media, and rather than hedging anything, I'm going to make a firm prediction that the extension acts in both the EU and US will fail. As a bonus prediction, unlike where pressure on SOPA merely stopped the bill, we may see attempts to restore sane copyright terms, and possibly a restructuring of the DMCA (though I doubt the latter).

The real killer here isn't just the rise of an increasingly technocratic populace, but also 3d printing - fabbing. No, you won't have an omnimachine in 40 years. But you might choose to have your shop specialize in making glass. Your friend might specialize in making circuitboards. Another chips. Another makes motors. Another makes casts, to cast steel blocks with. Or whatever.

And this sort of thing is going to be the nail in the coffin of the industrial age. I wouldn't call it post-scarcity, but artificial scarcity is going to become harder to maintain - eventually impossible.

And so what's left is services and people lucky enough to own crazy amounts of land, wires, and wireless spectrum. Against the political will of everyone else.

In short, I'm not seeing a corporate dystopia forming.

For rogue AI / Grey Goo, there are some things to consider.

Hard singularitarians are famous for not grasping the limits of technology. I'd be surprised if there is an actual engineer among them. While it's true that the critics of AGI tend to underestimate our understanding of how intelligence might work and thus our ability to build one... Earth has been infested with viruses and bacteria for over four billion years. And we're still learning about the stresses they cope with, their immense variety, capability for self-maintenance, limited 'social' capacity, and their rapid response to novel pressures.

Members here have commented about how photosynthesis is only 1% efficient. A nanite could surely do more! But, think about it. If you're not using that energy it has to be lost as heat. You're also competing with countless other organisms for resources. Plankton release cloudforming agents in order to reduce the amount of sunlight striking them. They certainly have the capacity to be more efficient as is - but they 'choose' not to be - those that evolve such traits, cook.

While I don't have the math to put 'Grey Goo' as far into the fiction part of science fiction as we can RKVs, I have yet to see grey goo fearmongering from a microbiologist. At least as a personal observation. You can certainly make some forms of 'life' more efficient, but you won't make one to rule the Earth.

So, rogue AGIs.

Some context may be necessary. Take the atom bomb back to World War I, and imagine how that would change the future.

I'm sure if you got the world's smartest minds together back then, you could go through enough experiments that they could conceive of such a device a decade or two early.

That's sort of what is going on right now - we imagine how such an invention might change our lives now, as opposed to how it will change our lives when it finally comes to pass. It is certainly possible; all you have to do is figure out how to think. So much easier said than done. But we ourselves are proof by demonstration that it is possible.

There are some things to consider, even though multicellular creatures - and our resulting brains - have not been through the same scale of evolutionary pressure that single-celled organisms and less have.

Probably the most obvious is that AGI research is not going on in a vacuum. Language development is occurring across the board - bringing everyone more powerful tools, a tighter security environment, etc. I think it's perfectly possible that we may shoot past the point where AGI is feasible for a few years, and then suddenly a few lightbulbs go off at the same time and yes, our world will change drastically. However, with a few million people toying around with it rather than a few, it becomes nigh impossible for one, half, or even 99% of them to be catastrophic.

Even if that doesn't pan out, though, an AGI wouldn't be entering our environment as it is today, but rather one that is faster, more optimized, more secure, and more able to respond than our current infrastructure.

As for how life will change, I fully expect that we'll each have what amounts to a suite of personal assistants helping direct our lives: your secretary, your accountant, your personal trainer, your nutritionist, your therapist, your lawyer, your hands-off doctor, your stylist, your tutor, your critic.

You still direct your life, to be sure. In Solar Storms I call this sort of machine an ordinator - something trusted to coordinate humans. Your personal ordinator is more alert than you are, and more knowledgeable, but probably not much smarter than you if at all. More powerful ordinators would help keep communities running smoothly.

Another factor to consider, of course, is that it's not a given that we'll discover AGI before we're already enhancing ourselves, genetically and/or cybernetically. This also may offset the impact a rogue AGI may be able to have.

The final issue is somewhat more salient and I think it's why some people don't feel the world has much time left. What's to stop a nutjob from making a disease and wiping humans out?

To me, the only serious answer is to implement a full scope of preventative measures - preventing disaffection in the first place, institutionalizing where rehab is not possible, and closing off disease vectors to catch what slips through the cracks. Even if that fails, however, the combined computing resources of future humanity will be available to combat such a plague. It might be that it has to happen once before people decide to make sure it doesn't happen again.

For Solar Storms, I have homo panacea's development start in the mid-2020s (basic treatments such as telomerase extension) and run into the 22nd century (cracking whatever really hard problems there may be), with many people from the mid 20th century living through the 23rd and thus beyond. Gradual brain replacement and subsequent uploading after completion may offer 'true immortality' before then, but I have a feeling it will take several decades between 'first proven possible' and people being even remotely comfortable with that idea. Much more comfortable with the idea of repairing damage.

My pet setting probably leans on the optimistic side. On the other hand, I can see a definite progression on each point, how and why. So I feel more confident about it than 'simulate brain in 2020 because petaflops'.
Give fire to a man, and he will be warm for a day.
Set him on fire, and he will be warm for life.
cosmicalstorm
Jedi Council Member
Posts: 1642
Joined: 2008-02-14 09:35am

Re: The Transcension Hypothesis?

Post by cosmicalstorm »

That was very interesting. I had some similar (but much less "thought out") ideas after reading through Greg Egan's stuff the other year. I'm glad that there is an optimistic side, although I guess I'm going to get swiped by a car or a cancer before I get to enjoy any of this. But if I get to live for another couple of decades, life will be interesting and it feels good to have been born at this strange time.