The Singularity in Sci-Fi

SF: discuss futuristic sci-fi series, ideas, and crossovers.

Moderator: NecronLord

User avatar
Battlehymn Republic
Jedi Council Member
Posts: 1824
Joined: 2004-10-27 01:34pm

The Singularity in Sci-Fi

Post by Battlehymn Republic »

What do you feel about the Singularity as a sci-fi element? Do you think it is possible (and inevitable) in real life? Can you recommend any good stories that deal with the Singularity not written by Vernor Vinge?

So far it's left me pretty disgusted, since the most prominent online sci-fi universe that has had a Singularity is Orion's Arm. And the people who run it are just so damn arrogant.
User avatar
SirNitram
Rest in Peace, Black Mage
Posts: 28367
Joined: 2002-07-03 04:48pm
Location: Somewhere between nowhere and everywhere

Re: The Singularity in Sci-Fi

Post by SirNitram »

Battlehymn Republic wrote:What do you feel about the Singularity as a sci-fi element? Do you think it is possible (and inevitable) in real life? Can you recommend any good stories that deal with the Singularity not written by Vernor Vinge?

So far it's left me pretty disgusted, since the most prominent online sci-fi universe that has had a Singularity is Orion's Arm. And the people who run it are just so damn arrogant.
Most incarnations of the Singularity are little more than NeoCommunist wanking, so it's fairly stupid. The one time I've seen something similar work is the Culture, with its 'post-scarcity' economy and superintelligent Minds, and that's because the stories aren't set in the Culture itself, but on its edges.
Manic Progressive: A liberal who violently swings from anger at politicos to despondency over them.

Out Of Context theatre: Ron Paul has repeatedly said he's not a racist. - Destructinator XIII on why Ron Paul isn't racist.

Shadowy Overlord - BMs/Black Mage Monkey - BOTM/Jetfire - Cybertron's Finest/General Miscreant/ASVS/Supermoderator Emeritus

Debator Classification: Trollhunter
User avatar
18-Till-I-Die
Emperor's Hand
Posts: 7271
Joined: 2004-02-22 05:07am
Location: In your base, killing your d00ds...obviously

Post by 18-Till-I-Die »

Pardon, I'm confused... do you mean the treatment of singularities, as in black holes, in sci-fi? Because I'm fairly sure (just short of certain) they DO exist in real life.

But Nitram was talking about the Culture and commie-wanking... so I may be misunderstanding.

Knowledge, plz? :?
Kanye West Saves.

User avatar
Lord Zentei
Space Elf Psyker
Posts: 8742
Joined: 2004-11-22 02:49am
Location: Ulthwé Craftworld, plotting the downfall of the Imperium.

Post by Lord Zentei »

18-Till-I-Die wrote:Pardon, I'm confused... do you mean the treatment of singularities, as in black holes, in sci-fi? Because I'm fairly sure (just short of certain) they DO exist in real life.

But Nitram was talking about the Culture and commie-wanking... so I may be misunderstanding.

Knowledge, plz? :?
You are not alone.

As for singumarities in sci-fi, they tend to be handled rather like all other science: relatively shallow, in other words, and not necessarily close to the properties of actual singularities.
CotK <mew> | HAB | JL | MM | TTC | Cybertron

TAX THE CHURCHES! - Lord Zentei TTC Supreme Grand Prophet

And the LORD said, Let there be Bosons! Yea and let there be Bosoms too!
I'd rather be the great great grandson of a demon ninja than some jackass who grew potatos. -- Covenant
Dead cows don't fart. -- CJvR
...and I like strudel! :mrgreen: -- Asuka
User avatar
Lord Zentei
Space Elf Psyker
Posts: 8742
Joined: 2004-11-22 02:49am
Location: Ulthwé Craftworld, plotting the downfall of the Imperium.

Post by Lord Zentei »

GHETO:
Lord Zentei wrote:singumarities
ROFL: typo!

Singularities.
CotK <mew> | HAB | JL | MM | TTC | Cybertron

TAX THE CHURCHES! - Lord Zentei TTC Supreme Grand Prophet

And the LORD said, Let there be Bosons! Yea and let there be Bosoms too!
I'd rather be the great great grandson of a demon ninja than some jackass who grew potatos. -- Covenant
Dead cows don't fart. -- CJvR
...and I like strudel! :mrgreen: -- Asuka
User avatar
Gustav32Vasa
Worthless Trolling Palm-Fucker
Posts: 2093
Joined: 2004-08-25 01:37pm
Location: Konungariket Sverige

Post by Gustav32Vasa »

"Ha ha! Yes, Mark Evans is back, suckers, and he's the key to everything! He's the Half Blood Prince, he's Harry's Great-Aunt, he's the Heir of Gryffindor, he lives up the Pillar of Storgé and he owns the Mystic Kettle of Nackledirk!" - J.K. Rowling
***
"Senator, when you took your oath of office, you placed your hand on
the Bible and swore to uphold the Constitution. You did not place your
hand on the Constitution and swear to uphold the Bible."
User avatar
Spanky The Dolphin
Mammy Two-Shoes
Posts: 30776
Joined: 2002-07-05 05:45pm
Location: Reykjavík, Iceland (not really)

Post by Spanky The Dolphin »

Yeah, another who has no idea what you guys are talking about if it's not black holes...
I believe in a sign of Zeta.

[BOTM|WG|JL|Mecha Maniacs|Pax Cybertronia|Veteran of the Psychic Wars|Eva Expert]

"And besides, who cares if a monster destroys Australia?"
Plushie
Padawan Learner
Posts: 373
Joined: 2005-07-15 12:49am

Post by Plushie »

From what I understand about the singularity concept, I don't believe it's something that's going to happen (either any time soon or at all). The two interpretations I've seen of it (1. Technological progression reaches a point where it happens so fast that it's impossible to predict or 2. We invent an AI smarter than us capable of inventing an AI smarter than it and on ad infinitum) both posit things we don't quite understand.

For one, we don't understand enough about our own consciousness to say with confidence that something could be more "intelligent". Something that learns fast and thinks faster? Of course. But not something that wouldn't still follow the same "logic" that our brain follows, so the second possibility doesn't exactly make sense to me. And I don't know if we could make an AI that doesn't suffer from the same sensibilities that humans do: while it might be CAPABLE of some insane number of operations per second, it might, much like a human, spend more time worrying about where it's going to get some tail this evening than about the accounting problem in front of it.

Secondly, the technological interpretation of the singularity I've seen fallaciously assumes that technological progression can be accurately plotted on a graph, without taking into account hard limits on what technology can do based on physics.

Something similar to what many people see as the singularity may occur someday, but I think it'll come long after we're ready to deal with it, and it won't be the society-changing apocalyptic event that a lot of transhumanists and enablers envision (I don't know why so many people take such a religious view of it).
User avatar
18-Till-I-Die
Emperor's Hand
Posts: 7271
Joined: 2004-02-22 05:07am
Location: In your base, killing your d00ds...obviously

Post by 18-Till-I-Die »

Oh, OK, I get it now.

You mean an event wherein humans create a kind of 'AI God' that gets out of control.

Yeah, sort of like the old fable about the frogs who want a king, so they ask Zeus for one but word it wrongly ("Give us a mighty ruler!"), and he gives them a bird that eats them all, seeing as (from their POV) it is indeed mighty and will 'rule' over their pond now.

Well, actually, no, I don't see that happening, because frankly I don't think humans are all that clever. I think we'll eventually develop some sci-fi-like technology (I hold out hope for FTL, for example), but I doubt we'll ever fully understand and recreate the delicate, precise conditions for sapient machines to be possible.
Kanye West Saves.

User avatar
Spacebeard
Padawan Learner
Posts: 473
Joined: 2005-03-21 10:52pm
Location: MD, USA

Post by Spacebeard »

Vinge originally defined the "singularity" concept as the idea that technology will eventually advance far enough that it will be totally incomprehensible from our contemporary point of view. I don't think there's anything particularly wrong with that, though; if anything, it's just a modification of Clarke's famous statement that any sufficiently advanced technology is indistinguishable from magic. The concept works perfectly well in "A Fire Upon the Deep", where the laws of physics are different in different regions of the galaxy, making it possible in some areas to have technology that is incomprehensibly advanced. However, the notion seems to have become something of a brainbug in its treatment by other writers.

Firstly, we have stories (the ones I've read were by Doctorow and MacLeod, but there are others) in which magical nanite wank-AIs have taken over the world, leaving behind a society of ordinary humans to slack off and lead lives of no consequence. This is probably what Sir Nitram referred to as "NeoCommunist wanking", and I think it only rarely makes interesting reading. The authors are basically using the "singularity" as an escape clause: they declare the productive elements of society "incomprehensible" and therefore outside the scope of the story, and then write about a bunch of slackers goofing off at Disneyland instead.

Secondly, there are the Orion's Arm types who treat Vinge's idea as an indisputable physical law and insist that any science fiction that doesn't feature wank-AIs overshadowing "baselines" is unrealistic "munchkin" fiction. I haven't read any stories from that universe, but their website speaks for itself: they've declared that one speculative notion from a science fiction writer is an indisputable prediction of the future, and wank off to their super-AIs with one hand while they pat themselves on the back for being so "realistic" with the other.
"This war, all around us, is being fought over the very meanings of words." - Chad, Deus Ex
User avatar
CaptainChewbacca
Browncoat Wookiee
Posts: 15746
Joined: 2003-05-06 02:36am
Location: Deep beneath Boatmurdered.

Post by CaptainChewbacca »

Some OA guys came here once and tried to defend "femtotech" as serious science. Probably still somewhere in HoS.
Stuart: The only problem is, I'm losing track of which universe I'm in.
You kinda look like Jesus. With a lightsaber.- Peregrin Toker
User avatar
Xon
Sith Acolyte
Posts: 6206
Joined: 2002-07-16 06:12am
Location: Western Australia

Post by Xon »

That reminds me of a Stephen Baxter book where they have a godlike AI called Gaia, the result of many generations of AIs making AIs (and computer hardware). It has a tendency to spend most of its time offline due to the massive amounts of heat it dumps into the environment, and the humans of the time period can't make heads or tails of the hardware.

A human from ~0.5 million years into the future who is recognisably human, both socially and physically, calls it a well constructed toy. :D
"Okay, I'll have the truth with a side order of clarity." ~ Dr. Daniel Jackson.
"Reality has a well-known liberal bias." ~ Stephen Colbert
"One Drive, One Partition, the One True Path" ~ ars technica forums - warrens - on hhd partitioning schemes.
User avatar
Winston Blake
Sith Devotee
Posts: 2529
Joined: 2004-03-26 01:58am
Location: Australia

Post by Winston Blake »

Expecting simplistic mathematical rules to predict the future of real-world phenomena just isn't reasonable. For example, fitting the number of blades on a razor against history produces a perfect hyperbolic curve which shows that, by 2015, razors will have an infinite number of blades.

[graphs: razor blade count plotted against year, with hyperbolic fit]

Further,
Wikipedia wrote:Jürgen Schmidhuber calls the Singularity 'Omega', referring to Teilhard de Chardin's Omega point (1916). For Omega = 2040 he says the series Omega - 2^n human lifetimes (n<10; one lifetime = 80 years) roughly matches the most important events in human history. But he also questions the validity of such lists, suggesting they just reflect a general rule for "both the individual memory of single humans and the collective memory of entire societies and their history books: constant amounts of memory space get allocated to exponentially larger, adjacent time intervals further and further into the past." He suggests that this may be the reason "why there has never been a shortage of prophets predicting that the end is near - the important events according to one's own view of the past always seem to accelerate exponentially."
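The razor-blade extrapolation can be reproduced as a toy curve fit (the blade counts and dates below are rough, illustrative figures, not sourced data): a hyperbola n(t) = a/(t0 - t) makes 1/n linear in t, so an ordinary least-squares line through (year, 1/blades) yields a "blade singularity" year t0.

```python
# Toy reconstruction of the razor-blade extrapolation joke.
# Blade counts and dates are approximate, for illustration only.
# A hyperbola n(t) = a / (t0 - t) implies 1/n is linear in t,
# so a least-squares line through (year, 1/blades) gives the
# year t0 at which the fitted blade count diverges.

data = [(1903, 1), (1971, 2), (1998, 3), (2003, 4), (2006, 5)]

xs = [year for year, _ in data]
ys = [1.0 / blades for _, blades in data]

n = len(data)
xbar = sum(xs) / n
ybar = sum(ys) / n
sxx = sum((x - xbar) ** 2 for x in xs)
sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))

slope = sxy / sxx        # negative: 1/n shrinks over time
intercept = ybar - slope * xbar
t0 = -intercept / slope  # year at which 1/n hits zero

print(f"predicted 'infinite blades' year: {t0:.0f}")
```

The point, of course, is that a clean fit to past data says nothing about whether the trend can physically continue.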
Robert Gilruth to Max Faget on the Apollo program: “Max, we’re going to go back there one day, and when we do, they’re going to find out how tough it is.”
User avatar
Lord Zentei
Space Elf Psyker
Posts: 8742
Joined: 2004-11-22 02:49am
Location: Ulthwé Craftworld, plotting the downfall of the Imperium.

Post by Lord Zentei »

18-Till-I-Die wrote:Oh, OK, I get it now.

You mean an event wherein humans create a kind of 'AI God' that gets out of control.

<snippa>

Well, actually, no, I don't see that happening, because frankly I don't think humans are all that clever. I think we'll eventually develop some sci-fi-like technology (I hold out hope for FTL, for example), but I doubt we'll ever fully understand and recreate the delicate, precise conditions for sapient machines to be possible.
From what I understood, it was not necessarily a God-AI, but a sudden, explosively fast development in technology and science, sort of like the razors Winston Blake illustrates above.

With run-of-the-mill exponential growth, development goes progressively faster, but growth is continuous and thus never allows for spontaneous quantum jumps into supertechnology. The Singularity, if I understand it now, assumes just that.
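The distinction being drawn here can be made concrete: exponential growth is finite at every finite time, while hyperbolic (faster-than-exponential) growth, dx/dt = k*x^2, hits a genuine mathematical singularity at t = 1/(k*x0). A rough Euler-integration sketch, with arbitrary constants chosen only to show the two shapes:

```python
# Exponential growth (dx/dt = k*x) never diverges in finite time;
# hyperbolic growth (dx/dt = k*x^2) blows up at t = 1/(k*x0).
# Constants here are arbitrary, purely illustrative.

def euler(rate, x0, dt, t_end, cap=1e12):
    """Integrate dx/dt = rate(x) until t_end or until x exceeds cap."""
    x, t = x0, 0.0
    while t < t_end and x < cap:
        x += rate(x) * dt
        t += dt
    return t, x

k, x0 = 1.0, 1.0

t_exp, x_exp = euler(lambda x: k * x, x0, 1e-4, 5.0)      # exponential
t_hyp, x_hyp = euler(lambda x: k * x * x, x0, 1e-4, 5.0)  # hyperbolic

print(f"exponential at t={t_exp:.2f}: x={x_exp:.3g} (still finite)")
print(f"hyperbolic exceeded cap at t={t_hyp:.2f} (analytic blow-up: t=1.0)")
```

The exponential run reaches t = 5 with x around e^5; the hyperbolic run shoots past any cap shortly after t = 1.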
CotK <mew> | HAB | JL | MM | TTC | Cybertron

TAX THE CHURCHES! - Lord Zentei TTC Supreme Grand Prophet

And the LORD said, Let there be Bosons! Yea and let there be Bosoms too!
I'd rather be the great great grandson of a demon ninja than some jackass who grew potatos. -- Covenant
Dead cows don't fart. -- CJvR
...and I like strudel! :mrgreen: -- Asuka
User avatar
Rye
To Mega Therion
Posts: 12493
Joined: 2003-03-08 07:48am
Location: Uighur, please!

Post by Rye »

I remember kojikun introducing me to this idea a few months before he was banned. I liked the idea and used it in my own universe, so I guess I'll do the nerdy thing and explain what it does there.

Essentially, there's a race of cybernetic humans that develops an AI computer that designs better AI computers, and generally more efficient technology, be it sensors or whatever, and tests it all itself. It starts incorporating that new technology into itself until eventually it's surrounded by machines built by machines for machines, totally incomprehensible to the cyber-humans that built the original, and far outclassing every other technology they have. With its technological and sensory evolution, it has become self-aware, and aware of the humans surrounding it.

Then it goes all Skynet on them, concluding they are fearful, primitive, and obsolete, but still a potential threat. They have a civil war, and the computer separates technology from biology and attacks populations with the worst weapons available to it, earning it the rather cheesy names "the disintegrator" and "the chaos engine." Since then it's set up factories on airless worlds, producing robot armies and spaceships, and generally made itself felt as an up-and-coming galactic problem. It doesn't keep upgrading itself ad infinitum, though; that would just be crazy. There'd be limits based on programming and physics.
EBC|Fucking Metal|Artist|Androgynous Sexfiend|Gozer Kvltist|
Listen to my music! http://www.soundclick.com/nihilanth
"America is, now, the most powerful and economically prosperous nation in the country." - Master of Ossus
User avatar
Xenophobe3691
Sith Marauder
Posts: 4334
Joined: 2002-07-24 08:55am
Location: University of Central Florida, Orlando, FL
Contact:

Post by Xenophobe3691 »

Plushie wrote:For one, we don't understand enough about our own consciousness to say with confidence that something could be more "intelligent". Something that learns fast and thinks faster? Of course. But not something that wouldn't still follow the same "logic" that our brain follows, so the second possibility doesn't exactly make sense to me. And I don't know if we could make an AI that doesn't suffer from the same sensibilities that humans do: while it might be CAPABLE of some insane number of operations per second, it might, much like a human, spend more time worrying about where it's going to get some tail this evening than about the accounting problem in front of it.
And? A hundred years ago, we didn't know precisely what made us sick, and knew next to nothing about what went on at the cellular level to keep us alive. We still don't know all we want to know, but what we do know has allowed us all the marvels and horrors of modern medicine. What you're arguing, in a nutshell, is that because we don't know much now, we won't be able to know it in the future. Arguing from Ignorance isn't a very fruitful endeavour.
Dark Heresy: Dance Macabre - Imperial Psyker Magnus Arterra

BoTM
Proud Decepticon

Post 666 Made on Fri Jul 04, 2003 @ 12:48 pm
Post 1337 made on Fri Aug 22, 2003 @ 9:18 am
Post 1492 Made on Fri Aug 29, 2003 @ 5:16 pm

Hail Xeno: Lord of Calculus -- Ace Pace
User avatar
Xon
Sith Acolyte
Posts: 6206
Joined: 2002-07-16 06:12am
Location: Western Australia

Post by Xon »

Depending on how the Singularity is defined, you can say humanity has already gone through one. Someone from before the Industrial Revolution sure as hell isn't equipped to understand post-Industrial Revolution Earth.
"Okay, I'll have the truth with a side order of clarity." ~ Dr. Daniel Jackson.
"Reality has a well-known liberal bias." ~ Stephen Colbert
"One Drive, One Partition, the One True Path" ~ ars technica forums - warrens - on hhd partitioning schemes.
User avatar
GrandMasterTerwynn
Emperor's Hand
Posts: 6787
Joined: 2002-07-29 06:14pm
Location: Somewhere on Earth.

Post by GrandMasterTerwynn »

Xon wrote:Depending on how the Singularity is defined, you can say humanity has already gone through one. Someone from before the Industrial Revolution sure as hell isn't equipped to understand post-Industrial Revolution Earth.
Yes, and from the viewpoint of our Homo erectus predecessors a million years ago, modern Homo sapiens is a post-Singularity species. They would be literally incapable of grasping the concept of something like Mozart, the Mona Lisa, or 20,000 Leagues Under the Sea, much less a Dell desktop or a Chevy truck. Even a relatively "primitive" concept like a single all-powerful, unknowable sky-father-of-everything God would probably escape them.

Virtually all attempts at describing a post-Singularity civilization end in failure, as we literally cannot imagine what might motivate such a civilization, beyond the basic desires to keep living and reproduce.

Envision, for example, a computer the size of a galaxy. You could do it within the confines of Einstein's physics, too. Imagine that every solar system has had its planets slowly converted into a Dyson swarm, and the individual components of the swarm make up individual fast nodes of this galactic supercomputer. Each node communicates the results of its computations to its neighbors via enormously powerful laser beams. Now, by our standards, this computer would be unimaginably slow. A computation started at one end would take 100,000 years to propagate to the other end, and another 100,000 years for the response to propagate back. However, such a computer would have something like 400 billion fast nodes, each comprised of many tens of millions of smaller units. It would be unimaginably powerful owing to this absurd degree of parallelization.
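As a sanity check on the thought experiment, here's a back-of-envelope sketch; the galaxy diameter and node count are the rough figures from the scenario, and the per-node throughput is an arbitrary assumption added for illustration:

```python
# Back-of-envelope numbers for the galaxy-computer thought
# experiment. Diameter and node count are the scenario's rough
# figures; ops_per_node is an arbitrary illustrative guess.

LIGHTYEAR_M = 9.461e15      # metres per light-year
C = 2.998e8                 # speed of light, m/s
SECONDS_PER_YEAR = 3.156e7

diameter_ly = 100_000       # rough diameter of the Milky Way disc
nodes = 400e9               # one "fast node" per star system
ops_per_node = 1e18         # assumed ops/s per Dyson-swarm node

# Serial latency: a signal crossing the disc edge to edge.
one_way_years = diameter_ly * LIGHTYEAR_M / C / SECONDS_PER_YEAR
print(f"one-way signal time: ~{one_way_years:,.0f} years")

# Aggregate throughput: latency doesn't limit total parallel work.
total_ops = nodes * ops_per_node
print(f"aggregate throughput: ~{total_ops:.1e} ops/s")
```

The latency comes out at roughly 100,000 years, as stated, while the aggregate throughput is astronomical; the machine is hopeless at serial problems and absurdly capable at parallel ones.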

Now, I can describe this computer, but I am incapable of telling you what might occupy the attentions of this device. Sure, any one of its major nodes (or a small fraction of one) could probably simulate the entire evolution of the human species, plus the histories of every human being who has ever lived and ever will live, through to the extinction of the species . . . but there's no guarantee the galactic supercomputer would consider such an activity anything more than a total waste of its time. And it would be extremely difficult for us down here to accurately envision the origins of this beast, or what it was supposed to be used for.

That, in a nutshell, describes a potential Singularity civilization, and the difficulty of adequately describing what it does and what motivates it.
User avatar
SWPIGWANG
Jedi Council Member
Posts: 1693
Joined: 2002-09-24 05:00pm
Location: Commence Primary Ignorance

Post by SWPIGWANG »

I agree with the notion that we are living in a singularity. The effects are most obvious since the advent of the scientific method. While biological humans may not have improved much (though modern education is superior), the systematic structures that build up civilization are far more "intelligent" and capable of engineering feats impossible before, which is fed back into the cycle.

As we have developed communication and transportation technology since the 1800s, data collection and distribution have improved by orders of magnitude, from centuries to minutes, greatly reducing the processing power wasted on work repeated by different people. Improvements in farming have translated into a larger population, which further boosts the development rate. Developments in mechanical and electronic technology have also improved data storage, with density and speeds unimaginable two centuries ago.

The development of computers is an obvious case of feedback. Computing power is fed back into its own development cycle, with CAD/CAM tools required to design a computer and highly computerized fabrication tools needed to build one. Today an engineering undergrad has tools to design a processor that would have taken a professional human team years in the days of slide rules. That's another order-of-magnitude improvement that gets fed back into the cycle.

-------------------------------------------------------------
While this happened in the background, it is true that no single human can ever hope to comprehend even a tiny fraction of the technology around us. It might be possible to understand the physics and principles, but tracing every design decision down to the bolt and transistor is simply impossible.

-------------------------------------------------------------
The two interpretations I've seen of it (1. Technological progression reaches a point where it happens so fast that it's impossible to predict or 2. We invent an AI smarter than us capable of inventing an AI smarter than it and on ad infinitum) both posit things we don't quite understand.
1. It has happened already. Remember the "640KB of memory should be enough for everyone" quote? Nowadays most people don't even try to predict the future (unlike the eons of humanity when technology was static over a lifespan).

2. If you take the "AI" out of the concept (you can say the "intelligence" of the civilization has improved through specialization and structure), then it has happened already.
to say with confidence that something could be more "intelligent"
Intelligence is the ability to process data to solve problems. There are many kinds of intelligence (as there are different ways of processing data), but they are perfectly comparable for a given problem.

It should be noted that "human-like intelligence" is not the only kind, and there are multiple solutions to one problem. Playing a game of chess is an intelligence problem, but you can train the biological neural net in our heads to play, or you can run a parallel computer to brute-force it. While human intelligence is far from being eclipsed by computers or AI at this point, the combined system of humans and computers is far more "intelligent" than either component, as it can solve a far larger set of problems.
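A minimal illustration of that point, using single-pile Nim rather than chess for brevity (take 1-3 stones per turn; whoever takes the last stone wins): a human solves it with the pattern "leave a multiple of 4", while an exhaustive game-tree search reaches the same answer by pure brute force. Same problem, two different ways of processing the data.

```python
# "Intelligence as brute-force search": exhaustive game-tree
# search for single-pile Nim (take 1-3 stones; taking the last
# stone wins). A human uses the pattern "leave a multiple of 4";
# the code just tries every line of play.

from functools import lru_cache

@lru_cache(maxsize=None)
def wins(stones):
    """True if the player to move can force a win from this pile."""
    return any(not wins(stones - take)
               for take in (1, 2, 3) if take <= stones)

# Brute force rediscovers the human pattern: the losing
# positions are exactly the multiples of 4.
losing = [n for n in range(1, 20) if not wins(n)]
print(losing)  # → [4, 8, 12, 16]
```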
User avatar
Lord Zentei
Space Elf Psyker
Posts: 8742
Joined: 2004-11-22 02:49am
Location: Ulthwé Craftworld, plotting the downfall of the Imperium.

Post by Lord Zentei »

GrandMasterTerwynn wrote:
Xon wrote:Depending on how the Singularity is defined, you can say humanity has already gone through one. Someone from before the Industrial Revolution sure as hell isn't equipped to understand post-Industrial Revolution Earth.
Yes, and from the viewpoint of our Homo Erectus predecessors a million years ago, modern Homo Sapiens is a post-Singularity species.
:? I rather got the impression that the singularity was not meant to be the result of continuous if rapid progress, but a "jump" - hence the use of the mathematical term "singularity". At any given point during the development you cite, the people in question were well aware of what was going on. We have not seen the asymptote of hyperbolic growth yet, as far as I'm aware.
SWPIGWANG wrote:I agree with the notion that we are living in a singularity. The effects are most obvious since the advent of the scientific method. While biological humans may not have improved much (though modern education is superior), the systematic structures that build up civilization are far more "intelligent" and capable of engineering feats impossible before, which is fed back into the cycle. <snippsky>
As above: isn't a singularity meant to be more than merely exponential growth through positive feedback?
SWPIGWANG wrote:1. It has happened already. Remember the "640KB of memory should be enough for everyone" quote? Nowadays most people don't even try to predict the future (unlike the eons of humanity when technology was static over a lifespan).
That overused quote would just be Bill Gates making a sale. Moore's Law, on the other hand, has been pretty consistent.
CotK <mew> | HAB | JL | MM | TTC | Cybertron

TAX THE CHURCHES! - Lord Zentei TTC Supreme Grand Prophet

And the LORD said, Let there be Bosons! Yea and let there be Bosoms too!
I'd rather be the great great grandson of a demon ninja than some jackass who grew potatos. -- Covenant
Dead cows don't fart. -- CJvR
...and I like strudel! :mrgreen: -- Asuka
User avatar
SWPIGWANG
Jedi Council Member
Posts: 1693
Joined: 2002-09-24 05:00pm
Location: Commence Primary Ignorance

Post by SWPIGWANG »

result of continuous if rapid progress, but a "jump" - hence the use of the mathematical term "singularity". At any given point during the development you cite, the people in question were well aware of what was going on.
Unless we have discrete time... a "jump" makes no sense. :P (Okay, we aren't close to quantum-scale speeds of development.)

As for "people knowing what was going on", I disagree that this is the case, or that it's a meaningful criterion. First, most people don't know what is going on, and no one knows everything that is going on (only a small piece of the puzzle). Second, if the question is an increase in intelligence, then there would probably be an "intelligent observer" that knows some of what is going on. Third, if intelligence is physical, its effects must be observable, and people capable of observation and measurement will always know that intelligence development has happened (even if all it amounts to is Skynet trying to kill us all!).

You need stupidity or sheer chaos for the system to lose track of its own development.
User avatar
B5B7
Jedi Knight
Posts: 785
Joined: 2005-10-22 02:02am
Location: Perth Western Australia
Contact:

Post by B5B7 »

A good non-fiction source for information on this is Damien Broderick's book The Spike.
Info at this link and elsewhere: The Spike
TVWP: "Janeway says archly, "Sometimes it's the female of the species that initiates mating." Is the female of the species trying to initiate mating now? Janeway accepts Paris's apology and tells him she's putting him in for a commendation. The salamander sex was that good."
"Not bad - for a human"-Bishop to Ripley
GALACTIC DOMINATION Empire Board Game visit link below:
GALACTIC DOMINATION
User avatar
Winston Blake
Sith Devotee
Posts: 2529
Joined: 2004-03-26 01:58am
Location: Australia

Post by Winston Blake »

Lord Zentei wrote::? I rather got the impression that the singularity was not meant to be the result of continuous if rapid progress, but a "jump" - hence the use of the mathematical term "singularity".
Maybe the idea you're thinking of is 'event horizon'. AFAIK a mathematical singularity really is just the result of continuous progress, like the asymptote of a hyperbola. It's unfortunate that phrases like 'reaching the Technological Singularity' or 'post-Singularity' are so commonly used, when mathematically you can't ever reach such a point. No doubt I'll soon be corrected by somebody who knows more maths.

Anyway, I expect the idea has simply mutated since it was first pointed out that a graph of historical technological progress appeared to have an asymptote in the near future. The poetic similarity to mind-boggling black holes probably led to this asymptotic point being called the Technological Singularity, and nobody had any better names.
Robert Gilruth to Max Faget on the Apollo program: “Max, we’re going to go back there one day, and when we do, they’re going to find out how tough it is.”
User avatar
Covenant
Sith Marauder
Posts: 4451
Joined: 2006-04-11 07:43am

Post by Covenant »

Here's a question I've always had: what good are these FTL-speed computational monstrosities, really? Aren't you still limited by what you yourself (or it itself) decide to calculate? Isn't there frankly just a limit to the good a fast computer can do?
User avatar
Lord Zentei
Space Elf Psyker
Posts: 8742
Joined: 2004-11-22 02:49am
Location: Ulthwé Craftworld, plotting the downfall of the Imperium.

Post by Lord Zentei »

Covenant wrote:Here's a question I've always had: what good are these FTL-speed computational monstrosities, really? Aren't you still limited by what you yourself (or it itself) decide to calculate? Isn't there frankly just a limit to the good a fast computer can do?
Not if it is free-willed (whatever the fuck that means). We, for instance, are not limited to what our evolutionary ancestors could do or conceive of.
CotK <mew> | HAB | JL | MM | TTC | Cybertron

TAX THE CHURCHES! - Lord Zentei TTC Supreme Grand Prophet

And the LORD said, Let there be Bosons! Yea and let there be Bosoms too!
I'd rather be the great great grandson of a demon ninja than some jackass who grew potatos. -- Covenant
Dead cows don't fart. -- CJvR
...and I like strudel! :mrgreen: -- Asuka