Do you believe a technological singularity is near?

SLAM: debunk creationism, pseudoscience, and superstitions. Discuss logic and morality.

Moderator: Alyrium Denryle

User avatar
Irbis
Jedi Council Member
Posts: 2262
Joined: 2011-07-15 05:31pm

Re: Do you believe a technological singularity is near?

Post by Irbis »

Stas Bush wrote:The AI is safe until it gets enough tools to interact with the outside world. It is a bit different here, creating an isolated program environment is an easier task than isolating advanced machinery. Or at least it seems so now.
If the task were so simple, we wouldn't have the pesky problem of computer viruses, you know. And that's with static code written by humans, not a superhuman AI actively trying to hack the barrier.

It actually reminds me of a Stanisław Lem story - a scientist builds two boxed AIs, learns they have started communicating, and cuts them off from the world and from each other using all sorts of shields, screenings, and the like... only to find out they still communicate - the AIs managed to exploit him touching the computer cases they were housed in, leaving messages in the static electricity of his skin. Something superhuman might well find ways out of the box that we can't even fathom; that's what being more intelligent means.
PeZook wrote:Not that a truly super-smart AI would even try to antagonize humans like that, IMHO. The smart thing to do if you want to survive is to placate them, act friendly and make yourself useful. If it wants to go all Skynet, it will start to act once it has access to physical resources, power and influence that make it possible.
Antagonize? :wink:

How about: I'm sorry, Dave. I'm afraid I can't do that. when someone tries to give it a not very well thought out objective?
Zinegata wrote:First of all, internet connectivity can be shut off very easily because it is ultimately hardware dependent. Just cut the cable or pull the plug.
And then a sleeper cell you missed reinfects the net with version 2.0?
Secondly, any sentient program will almost undoubtedly be a pretty large program. It is NOT going to be like a virus that relies on only a few lines of code. Copying it to a new machine will take a while, no different from downloading a very large file - which also makes it unlikely to remain undetected. People are gonna notice the slowdowns.
Slowdowns. Ahahahaaha :lol:

I'm sorry, but in a world where 95% of the population completely ignores what's written in error boxes before clicking 'OK', people are going to notice a slowdown? And that's assuming the infecting program that downloads the AI onto a new machine isn't also capable of (pretty trivially) optimizing your traffic so that the most obvious slowdowns never happen. Say you visit Facebook and Google often - it can store their graphics locally and download another piece of itself instead of pulling the logos and such, and no one will be able to notice. A superhuman AI can surely come up with far more radical optimizations if it wants; slowdowns won't magically stop it :wink:

Heck, even with no optimization at all, 90% of the time a significant portion of your connection sits unused; if it can just tap into that, there will be no slowdown at all.
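The "tap only idle bandwidth" idea boils down to rate-limiting a transfer below some ceiling. A minimal, benign sketch of such a throttle (the function and parameter names are illustrative; in practice the ceiling would come from measuring the link, which this sketch just takes as an argument):

```python
import time

def throttled_copy(read_chunk, write_chunk, rate_limit_bytes_per_s, chunk_size=64 * 1024):
    """Copy data while keeping average throughput under rate_limit_bytes_per_s.

    read_chunk(n) returns up to n bytes (empty bytes when done);
    write_chunk(data) consumes them. Returns total bytes copied.
    """
    start = time.monotonic()
    sent = 0
    while True:
        data = read_chunk(chunk_size)
        if not data:
            return sent
        write_chunk(data)
        sent += len(data)
        # If we are ahead of the allowed schedule, sleep off the difference.
        expected_elapsed = sent / rate_limit_bytes_per_s
        actual_elapsed = time.monotonic() - start
        if expected_elapsed > actual_elapsed:
            time.sleep(expected_elapsed - actual_elapsed)
```

Set the limit to a fraction of measured idle capacity and the transfer hides in the slack; set it to the full link rate and it degenerates into an ordinary copy.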
Finally, there is something called "compatibility issues": a highly complex program simply cannot copy itself from one machine to another with a 100% guarantee that it will work without flaw.
Yes, and? If it's so intelligent, it can then modify itself to fit the environment. If a copy can't work, the parent program tries to infect the new machine with a new version.

Unlike human programmers, an AI would actually be capable of thoroughly testing and understanding the processor it runs on, finding every error it possibly can, and adjusting accordingly. We do it with dumb programs already; for an intelligent one the task should be trivial.
Which is why, again, most computer scientists tend not to be overly worried about the Skynet scenario. What happened in Terminator 3 was honestly stupid, and it ain't gonna happen. At worst, what will happen is what the US military did in Transformers 1 - which is to get an axe and chop the cable to pieces - and that's already granting the very generous assumption that an amok machine intelligence will try to copy itself into the World Wide Web.
That only works until a certain threshold is reached, and I wouldn't bet on humans being better than a malicious AI at figuring out what's going on behind the curtains.
User avatar
Ryan Thunder
Village Idiot
Posts: 4139
Joined: 2007-09-16 07:53pm
Location: Canada

Re: Do you believe a technological singularity is near?

Post by Ryan Thunder »

...Uh, no. Irbis, without wanting to be derisive--have you ever written so much as a "hello, world" script? :lol:
SDN Worlds 5: Sanctum
Zinegata
Jedi Council Member
Posts: 2482
Joined: 2010-06-21 09:04am

Re: Do you believe a technological singularity is near?

Post by Zinegata »

Irbis->

What's funny is that I'm just pointing out how improbable the whole thing is even if we assume that somebody actually programmed a God-AI software into existence in the first place; which as I've repeatedly pointed out is not actually an easy (or imminent) thing to do.

Nor is it even necessary to network such a machine, because only idiots think that the World Wide Web, with its enormous mass of contradictory information, is an ideal tool for "teaching" an AI. Heck, an AI that bases its knowledge on the Internet would probably conclude that its purpose in life was to create porn for anything and everything.

So your objections demonstrate not only your ignorance, but blatant avoidance of the core of the issue.
And then a sleeper cell you missed reinfects the net with version 2.0?
Again, viruses and other similar "infections" are by necessity small programs, to avoid detection. You can't have a God-AI program the size of a virus, because it literally can't have that kind of functionality in only a couple of bytes of data.
Slowdowns. Ahahahaaha :lol:
Actually, if you look at the potential file size of an AI program, it will in fact consume the majority of your bandwidth. We're not talking about a 100 KB file here; we're probably talking about a 100 GB file at minimum. Even if you have a fiber internet connection, you'll generally be on a usage-capped subscription plan, so you're going to see your allowance eaten up. And there is no way to prevent the inevitable hard disk slowdown as it tries to copy such a huge file onto your PC.
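For what it's worth, the bandwidth side of this argument is easy to put numbers on. A back-of-the-envelope helper (decimal units assumed throughout; `utilization` is the fraction of the link the transfer is allowed to use):

```python
def transfer_hours(size_gb, link_mbit_per_s, utilization=1.0):
    """Hours needed to move size_gb (decimal gigabytes) over a link
    of link_mbit_per_s megabits per second at the given utilization."""
    size_bits = size_gb * 8 * 10**9
    seconds = size_bits / (link_mbit_per_s * 10**6 * utilization)
    return seconds / 3600

# 100 GB over a 100 Mbit/s line at full tilt: about 2.2 hours.
# The same file limited to 10% of that link: about 22 hours.
```

So whether a 100 GB copy is "inevitably noticeable" depends entirely on the link speed and how much of it the transfer claims, which is exactly what the two posters go on to dispute.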

So again, even if we assume that a God AI does try to do a Skynet, only an idiot would believe it can self-replicate. The Terminator 3 scenario was extreme idiocy (and not just because Skynet subsequently committed suicide by nuking all of the computers it was running on).
Yes, and? If it's so intelligent, it can then modify itself to fit the environment. If a copy can't work, the parent program tries to infect the new machine with a new version.
Except of course this only demonstrates that you've never actually done any coding.

Compatibility issues are a huge hurdle in any software development cycle. You cannot make a program run on Linux if it was meant to run on Windows natively, and it gets worse with a more complex program. In fact, you're probably going to look at a total recode from top to bottom just to get it to run again... and with a huge and highly complex program, that ain't gonna happen overnight.

Unless of course you're again one of the singularist idiots who hand-wave the actual complexity of code.
Unlike human programmers, an AI would actually be capable of thoroughly testing and understanding the processor it runs on, finding every error it possibly can, and adjusting accordingly.
Actually, they can't. They can't do it now, and it's unlikely they will be able to do it quickly in the future. To date we don't even have a programming tool that can automatically correct code when a test fails; the best we have is little more than the equivalent of a spell checker.

This is again the "design ability" hurdle that singularists tend to hand wave, but do not realize is an enormous hurdle; probably because the vast majority of singularist proponents aren't actual programmers but sci-fi writers.

=====

A human being isn't born knowing how their brain works. We have to use sophisticated tools to actually look at how the brain works.

A machine would similarly be unable to automatically figure out how a microprocessor works. It may be able to learn what model of microprocessor it's using by checking the system registry, but registries do not come with complete schematics of how the microprocessor actually looks or functions. They don't say what kind of tolerances the hardware is capable of. If it tries to install itself on some random computer, the likely result is simple: it will not run, period. It's like trying to run a game like Crysis II without knowing the system capabilities.
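The identification-versus-schematics distinction is visible in what a running program can actually query about its host. Python's standard library, for instance, returns only short identification strings, nothing about the silicon itself (the helper name is illustrative):

```python
import platform

def cpu_self_report():
    """Collect what the standard library reveals about the host hardware.

    Note these are identification strings only - an architecture tag and
    vendor/OS names - not schematics, timings, or tolerances.
    """
    return {
        "machine": platform.machine(),      # e.g. 'x86_64'; varies by host
        "processor": platform.processor(),  # vendor string, often empty
        "system": platform.system(),        # OS name, says nothing about the chip internals
    }
```

Anything deeper than these strings has to come from external documentation or from actively probing the hardware.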

So, again, the claims of the singularists are very much on the level of "insane paranoid ramblings" as far as the Terminator 3 Skynet scenario is concerned. It was a dumb scenario. You may as well claim that the Large Hadron Collider will kill us all.
User avatar
Irbis
Jedi Council Member
Posts: 2262
Joined: 2011-07-15 05:31pm

Re: Do you believe a technological singularity is near?

Post by Irbis »

Ryan Thunder wrote:...Uh, no. Irbis, without wanting to be derisive--have you ever written so much as a "hello, world" script? :lol:
Yes, I did, o Village Idiot. In fact, I earned an IT degree a few years back, before I moved into the multimedia industry. You?

And how about some arguments other than ad hominems?
Zinegata wrote:What's funny is that I'm just pointing out how improbable the whole thing is even if we assume that somebody actually programmed a God-AI software into existence in the first place; which as I've repeatedly pointed out is not actually an easy (or imminent) thing to do.
Yes, it's neither easy nor probable. However, I just wanted to point out that ascribing today's human limitations to something made 20 years from now, far more intelligent than any human, is downright silly.
Nor is it even necessary to network such a machine, because only idiots think that the World Wide Web, with its enormous mass of contradictory information, is an ideal tool for "teaching" an AI. Heck, an AI that bases its knowledge on the Internet would probably conclude that its purpose in life was to create porn for anything and everything.
And that would be a bad thing how, exactly? :lol:
Again, viruses and other similar "infections" are by necessity small programs, to avoid detection. You can't have a God-AI program the size of a virus, because it literally can't have that kind of functionality in only a couple of bytes of data.
Here's a reality check for you - even if we're talking about a 100 GB file, a modern desktop today has a 500-1000 GB disk. Do you know how much of it is used if the machine is used for typical office work and net surfing? 100 GB, tops. There are literally millions of such machines today; care to tell me how much free space there will be on a typical desktop, or even a cell phone, in 10 years?
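The free-space claim is checkable on any machine; the standard library exposes the numbers directly (the helper name is illustrative, and the path argument is just an example):

```python
import shutil

def free_space_gb(path="/"):
    """Free space, in decimal gigabytes, on the filesystem holding path."""
    return shutil.disk_usage(path).free / 10**9
```

Run it against your own home directory and compare the result to Irbis's "100 GB used, tops" figure for an office machine.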

And while the main program won't be the size of a virus, if an AI wanted to spread itself by making a virus (one that infects as many machines as possible, looks for good candidates for AI infection, and activates sleeper cells in case of a purge), the actual virus would be trivially small. Good luck purging that, seeing as any active AI would modify this safety net, observing human efforts to combat it and learning from them.
Actually, if you look at the potential file size of an AI program, it will in fact consume the majority of your bandwidth. We're not talking about a 100 KB file here; we're probably talking about a 100 GB file at minimum. Even if you have a fiber internet connection, you'll generally be on a usage-capped subscription plan, so you're going to see your allowance eaten up. And there is no way to prevent the inevitable hard disk slowdown as it tries to copy such a huge file onto your PC.
Another reality check for you - over the past 2 days, I downloaded an 80 GB raw movie from my colleagues. Onto my laptop with a 4200 RPM HD. It came in at 1-2 MB/s, and you know how many slowdowns I noticed? None. I even played a few demanding games in the meantime, loading a lot of data from the HD, with little difference to my usual experience. Care to tell me how much slowdown I'd notice on a 7200 RPM desktop HD? Or how much I'll notice in 10 years, when all HDs will be SSDs anyway?

And are you aware that there are countries where you get straight broadband, no stupid download limits? If my country, bordering on second-world status, can provide such a connection over not fiber but humble LAN cable, I'm pretty sure actual first-world countries have connections capable of silently downloading this AI now, much less in 10 years.
So again, even if we assume that a God AI does try to do a Skynet, only an idiot would believe it can self-replicate. The Terminator 3 scenario was extreme idiocy (and not just because Skynet subsequently committed suicide by nuking all of the computers it was running on).
Yeah, because we totally don't have thousands of examples of programs that would be quite sufficient to maintain an infection network even today. Oh, wait - ever heard of botnets? :roll:
Compatibility issues are a huge hurdle in any software development cycle. You cannot make a program run on Linux if it was meant to run on Windows natively, and it gets worse with a more complex program. In fact, you're probably going to look at a total recode from top to bottom just to get it to run again... and with a huge and highly complex program, that ain't gonna happen overnight.
You know how I'd do it? Not by thinking totally locked in the box, like you two. I'd first infect the PCs with the spreading virus I mentioned above, then sidestep the Windows/Linux/Android hurdle altogether by putting your actual system into a virtual machine at boot, making it essentially a program run by the AI, which would then live off the excess computing power of your graphics card and the processor cycles you aren't using at the moment. That way, I'd only need something capable of running on x86-64, ARM, and a select few other processors. You're aware that there are proof-of-concept viruses that do this even today? With barely noticeable slowdowns in most common tasks other than gaming?

Assuming an AI coded in the next 10 years will have the same limitations, approaches, and hardware resources as a human programmer 5 years ago is, as I said, downright stupid. What I described above might be impractical or a dead end in 10 years, but saying 'we'll defeat the AI because it can't fit on a floppy, hurr!' is, well... :roll:
Unlike human programmers, an AI would actually be capable of thoroughly testing and understanding the processor it runs on, finding every error it possibly can, and adjusting accordingly.
Actually, they can't. They can't do it now, and it's unlikely they will be able to do it quickly in the future. To date we don't even have a programming tool that can automatically correct code when a test fails; the best we have is little more than the equivalent of a spell checker.
Note I said "it possibly can". I'm well aware that there are problems that can't be detected from the inside, so to speak. However, complete testing of a processor is possible - quite a few earlier Pentium processors were tested that thoroughly (as part of government checks and equipment certification) - and again, assuming an AI that can run arbitrary tests on processors identical to the one it's using will only have the accuracy of a human programmer or an automatic tool of today is dangerous.
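Software-visible processor flaws are not hypothetical, either: the 1994 Pentium FDIV bug could be detected from ordinary user code with a single division against a known-bad operand pair. A minimal check along those lines (the constants are the widely published trigger values; Python floats go through the host FPU):

```python
def fdiv_bug_present():
    """Return True if the FPU reproduces the Pentium FDIV misdivision.

    On a correct FPU, x - (x / y) * y is essentially zero for these values;
    affected Pentiums returned a quotient wrong in the 5th decimal digit,
    leaving a residue of roughly 256.
    """
    x, y = 4195835.0, 3145727.0
    return abs(x - (x / y) * y) > 1.0
```

A program that can run checks like this against every instruction and operand class it cares about is doing exactly the kind of from-the-inside testing being debated here.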
A human being isn't born knowing how their brain works. We have to use sophisticated tools to actually look at how the brain works.
And that stops something more intelligent than a human, something running on perfectly identical, blueprinted units, from discovering it how, exactly? An aeronautical engineer might not know how a bumblebee's wing works, but that doesn't stop him from understanding how a turbojet and a metal wing work.
A machine would similarly be unable to automatically figure out how a microprocessor works. It may be able to learn what model of microprocessor it's using by checking the system registry, but registries do not come with complete schematics of how the microprocessor actually looks or functions. They don't say what kind of tolerances the hardware is capable of. If it tries to install itself on some random computer, the likely result is simple: it will not run, period. It's like trying to run a game like Crysis II without knowing the system capabilities.
If a human being can understand it given publicly accessible data, I'm sure an actually malicious program both smarter and much faster than a human can figure it out. I noted that its initial attempts at unfamiliar processors and infrastructures will fail, but once it figures out the differences it will prepare a copy that works. Especially seeing as it only needs to code for the processor, not the OS.

Let's take the Crysis II example from above - on how many Windows machines of today can it run? All of them. Sure, it will fail to provide entertainment on a significant percentage, as those will be too slow to generate useful graphics, but the fact is, you don't need to recompile Crysis to make it work on any modern PC. If the AI can run on a typical cell phone of 2020, as postulated by someone above, or even on a typical desktop with acceptable slowdowns, the hardware barrier will be meaningless. At worst, it will use weaker computers as sleeper cells.
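The "one binary, check capabilities at runtime" model described here is exactly what installers and games already do: probe the host, then pick full operation, degraded operation, or a refusal. A crude sketch (the thresholds and function name are made-up examples, not from any real product):

```python
import os
import shutil

MIN_CORES = 2        # illustrative minimum, not a real requirement
MIN_FREE_GB = 1.0    # likewise

def meets_requirements(path="."):
    """Decide at runtime whether this host clears a minimum hardware bar,
    instead of shipping a separate build per machine configuration."""
    cores = os.cpu_count() or 1
    free_gb = shutil.disk_usage(path).free / 10**9
    return cores >= MIN_CORES and free_gb >= MIN_FREE_GB
```

The same binary runs everywhere the instruction set matches; only the gate's verdict differs from machine to machine.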
So, again, the claims of the singularists are very much on the level of "insane paranoid ramblings" as far as the Terminator 3 Skynet scenario is concerned. It was a dumb scenario. You may as well claim that the Large Hadron Collider will kill us all.
*yawn* A malicious AI might be a dumb scenario, sure. But all I wanted to point out is that if anyone ever makes a malicious AI on purpose, assuming it will be blind, dumb, and deaf, and chest-beating about how the limitations of today will stop it in 10 years, is even dumber. Especially seeing as the appropriate connections and disk space exist today, much less in an era when everyone will have multi-megabit or gigabit wireless connections and multi-terabyte SSDs.
User avatar
Ariphaos
Jedi Council Member
Posts: 1739
Joined: 2005-10-21 02:48am
Location: Twin Cities, MN, USA
Contact:

Re: Do you believe a technological singularity is near?

Post by Ariphaos »

Zinegata wrote:While there are some applications that will require crawling the web for information, I'm going to have to point out that creating an AI may not necessarily go that route. The web is huge repository of information, but a lot of it is useless or even outright wrong information, and without consultation with outside sources your AI is just gonna have a lot of garbage information dumped on it. And as they say, garbage in, garbage out.
What mechanism of putting it on the web requires letting it update its general model from individual human actors?

The most obvious human interaction for a general AI sitting on a server somewhere on the web is actually the reverse. Teach it about some system, and then allow it to be a part of a site's help system, for example, intelligently responding to humans and adapting itself to each unique human to get the most effective result.
Give fire to a man, and he will be warm for a day.
Set him on fire, and he will be warm for life.
User avatar
cosmicalstorm
Jedi Council Member
Posts: 1642
Joined: 2008-02-14 09:35am

Re: Do you believe a technological singularity is near?

Post by cosmicalstorm »

Those were very interesting posts, Starglider.
User avatar
cosmicalstorm
Jedi Council Member
Posts: 1642
Joined: 2008-02-14 09:35am

Re: Do you believe a technological singularity is near?

Post by cosmicalstorm »

I can't understand why people seem to think that the current human brain should be the end point of all that can be. I guess some people have just spent so much time dreaming about big tin cans filled with monkeys spinning around the galaxy on adventures that anything challenging that dream seems too provocative.
The fact that the brain is highly modular is well established.
Neuronal information moves around at half a millionth of light-speed.
Nature places limits on the size of our brains so that women can survive giving birth.
Algorithmic researchers have long since shown beyond any doubt that superhuman abilities are possible.

Unless we self destruct I have little doubt that in a century or two there might not be a single DNA strand left outside a virtual memory museum.
User avatar
Akhlut
Sith Devotee
Posts: 2660
Joined: 2005-09-06 02:23pm
Location: The Burger King Bathroom

Re: Do you believe a technological singularity is near?

Post by Akhlut »

cosmicalstorm wrote:Unless we self destruct I have little doubt that in a century or two there might not be a single DNA strand left outside a virtual memory museum.
You really think that, say, the Wahhabist imams of Saudi Arabia are going to be cool with uploading human personalities to computers?

Or that the African quasi-slaves used to mine coltan for use in electronics are going to be part of the techno-singularity because of their virtuous and enlightened masters, because, gosh, I'm sure it's going to be profitable as all hell to do that.

I'm sure the Amish are going to embrace the movement wholeheartedly, too. Same with the Mennonites.

I bet indigenous people around the world are going to want in on it too, because I'm sure they won't be suspicious at all that some group in power will delete their entire people to make room for a few yottabytes' worth of artificial sexy harem girls.

Plus, all the people who willingly do stuff like homestead or live primitive lives will obviously want to integrate themselves into a computer. Makes perfect sense, really.

So, yes, even if the technology becomes available, I'm sure there are going to be more than a few humans who willingly (and not so willingly) remain entirely and thoroughly organic in nature.
SDNet: Unbelievable levels of pedantry that you can't find anywhere else on the Internet!
User avatar
Akhlut
Sith Devotee
Posts: 2660
Joined: 2005-09-06 02:23pm
Location: The Burger King Bathroom

Re: Do you believe a technological singularity is near?

Post by Akhlut »

Destructionator XIII wrote:The biggest problem facing this planet isn't how to make the rich richer, the strong stronger, or the smart smarter, and that's what most transhumanists seem to talk about. I'll pass.
I think it's because they are absolutely and utterly terrified of their own mortality and want to become immortal.

I also really strongly doubt that any of this uplifting technology is going to be readily available for the billions of humans making less than $1 a day. You're a cowherd in Nigeria? Sucks to be you, I guess.

I also look forward to the first accidental server wipe holding the personalities, experiences, and other relevant data of hundreds of thousands of people. Genocide sure seems a shitload easier when some 4chan troll can do it on a lark if he figures out how to crack the security for the machines that contain uploaded people.
SDNet: Unbelievable levels of pedantry that you can't find anywhere else on the Internet!
User avatar
Imperial528
Jedi Council Member
Posts: 1798
Joined: 2010-05-03 06:19pm
Location: New England

Re: Do you believe a technological singularity is near?

Post by Imperial528 »

I, personally, think that very few will ever actually upload themselves into a computer or put their brain into a large mainframe. Most people, I think, will likely get some sort of cybernetics, but nothing major - just general quality-of-life implants, similar to a pacemaker for people with heart problems: an insulin synthesizer for people with diabetes, maybe, or ocular implants for those with vision problems, etc.

We might see some people doing things like getting a calculator put in their head or some such, but I honestly think most new technologies will follow existing trends of complementing what we have and making up for what we lack, without being invasive in the way a good deal of singularity-related concepts are.
User avatar
Aaron MkII
Jedi Master
Posts: 1358
Joined: 2012-02-11 04:13pm

Re: Do you believe a technological singularity is near?

Post by Aaron MkII »

Akhlut wrote:
cosmicalstorm wrote:Unless we self destruct I have little doubt that in a century or two there might not be a single DNA strand left outside a virtual memory museum.
You really think that, say, the Wahhabist imams of Saudi Arabia are going to be cool with uploading human personalities to computers?

Or that the African quasi-slaves used to mine coltan for use in electronics are going to be part of the techno-singularity because of their virtuous and enlightened masters, because, gosh, I'm sure it's going to be profitable as all hell to do that.

I'm sure the Amish are going to embrace the movement wholeheartedly, too. Same with Mennonites too.

I bet indigenous people around the world are going to want in on it too, because I'm sure they won't be suspicious at all that some group in power isn't going to delete their entire people to make room for a few yottabytes worth of artificial sexy harem girls.

Plus, all the people who willingly do stuff like homestead or live primitive lives will obviously want to integrate themselves into a computer. Makes perfect sense, really.

So, yes, even if the technology becomes available, I'm sure there are going to be more than a few humans who willingly (and not so willingly) remain entirely and thoroughly organic in nature.
Man, the only interest in transhumanism I have is replacing my crippled limbs with robot bits. I enjoy my outdoor hobbies too much to stick my brain in a box, and living forever also sounds terrible. I get to watch my friends and family who don't upload die - yeah, wonderful.

If people want to do it, then fine. I don't want to, though.
User avatar
cosmicalstorm
Jedi Council Member
Posts: 1642
Joined: 2008-02-14 09:35am

Re: Do you believe a technological singularity is near?

Post by cosmicalstorm »

I don't WANT most of it. I don't even KNOW what would be an optimal future for humans.

What I'm talking about is the fact that there is vast room for improvement in the structure of minds.
And it seems likely that such minds will come into existence in the not-too-distant future.
What happens to Standard Human Brains contained in big sacks of meat when that happens?

Nature does not care about us any more than it cared for the 99.9% of all lifeforms before us that went extinct because they ran out of luck.

This issue is an extinction risk much more acute than global warming, mile-wide impactors, GRB jets, and supervolcanoes. And even on forums filled with people who are otherwise open to ideas that seem completely alien and incomprehensible to vast chunks of the population, this issue is almost unheard of, or mocked.
Akhlut wrote:
cosmicalstorm wrote:Unless we self destruct I have little doubt that in a century or two there might not be a single DNA strand left outside a virtual memory museum.
You really think that, say, the Wahhabist imams of Saudi Arabia are going to be cool with uploading human personalities to computers?

Or that the African quasi-slaves used to mine coltan for use in electronics are going to be part of the techno-singularity because of their virtuous and enlightened masters, because, gosh, I'm sure it's going to be profitable as all hell to do that.

I'm sure the Amish are going to embrace the movement wholeheartedly, too. Same with Mennonites too.

I bet indigenous people around the world are going to want in on it too, because I'm sure they won't be suspicious at all that some group in power isn't going to delete their entire people to make room for a few yottabytes worth of artificial sexy harem girls.

Plus, all the people who willingly do stuff like homestead or live primitive lives will obviously want to integrate themselves into a computer. Makes perfect sense, really.

So, yes, even if the technology becomes available, I'm sure there are going to be more than a few humans who willingly (and not so willingly) remain entirely and thoroughly organic in nature.
Hey, good job not getting any of what was just said. It won't matter what those people want to adopt or do when a machine mind decides they are not a necessary component of the planetary mass.
User avatar
Guardsman Bass
Cowardly Codfish
Posts: 9281
Joined: 2002-07-07 12:01am
Location: Beneath the Deepest Sea

Re: Do you believe a technological singularity is near?

Post by Guardsman Bass »

I wonder how attractive invasive surgery for non-medical cybernetic implants would actually be, particularly if there are non-invasive alternatives that aren't quite as good (but are much easier and cheaper to use). Think of a choice between wearing special augmented-reality/HUD glasses like the Google Glass project, or having artificial augmented-reality lenses implanted directly into your eyes. Perhaps the capabilities of the latter are greater than the former, but you still have to go through surgery to get them in (and possibly surgery every time they need to be repaired and/or replaced).
“It is possible to commit no mistakes and still lose. That is not a weakness. That is life.”
-Jean-Luc Picard


"Men are afraid that women will laugh at them. Women are afraid that men will kill them."
-Margaret Atwood
Simon_Jester
Emperor's Hand
Posts: 30165
Joined: 2009-05-23 07:29pm

Re: Do you believe a technological singularity is near?

Post by Simon_Jester »

What's most plausible is that the technology would be developed first for the disabled and spend a long time primarily marketed to them, with at most a narrow sideline in augmentation among rather odd people.

After the surgical techniques are refined by years of practical experience, and after the hardware has gotten really really capable, to the point where, say, a disabled person with cybereyeballs actually enjoys an advantage over an ordinary person... Then you will see it starting to spread a bit more.
This space dedicated to Vasily Arkhipov
User avatar
Aaron MkII
Jedi Master
Posts: 1358
Joined: 2012-02-11 04:13pm

Re: Do you believe a technological singularity is near?

Post by Aaron MkII »

Provided medical professionals are going to do it, there is that whole oath thing; sticking bags of silicone under tits and other cosmetic work is one thing.

Scooping out a healthy eye or chopping off an arm is something quite different.
User avatar
Akhlut
Sith Devotee
Posts: 2660
Joined: 2005-09-06 02:23pm
Location: The Burger King Bathroom

Re: Do you believe a technological singularity is near?

Post by Akhlut »

Guardsman Bass wrote:I wonder how attractive invasive surgery for the purpose of non-medical cybernetic implants would actually be, particularly if there are non-invasive surgery alternatives that aren't quite as good (but much easier and cheaper to use). Think of a choice between wearing special augmented reality/HUD glasses like the Google Glass project, or having artificial augmented reality lens implanted directly into your eyes. Perhaps the capabilities of the latter are greater than the former, but you still have to go through surgery to get them in (and possibly surgery every time they need to be repaired and/or replaced).
"I'm sorry, you cannot view this content, as your hardware version is too old! Please contact your local cybernetics surgeon and upgrade your ocular implants in order to see this content!"

Plus, once enough people have computer implants in their brains, and before the relevant regulations come into effect, how many people get to have their dreams interfered with by megacorporations turning them into inescapable advertisements? You're having a sex dream, and your partner interrupts it for a few minutes, blabbering about some product. "Oh, yeah, that's nice, but we should use some Tide brand laundry detergent to wash up the sheets afterward!"

And if you say that's unlikely, I'd reply that it will only stay unlikely for as long as the technology makes it impossible.
SDNet: Unbelievable levels of pedantry that you can't find anywhere else on the Internet!
User avatar
FireNexus
Cookie
Posts: 2130
Joined: 2002-07-04 05:10am

Re: Do you believe a technological singularity is near?

Post by FireNexus »

Aaron MkII wrote:Scooping out a healthy eye or chopping off an arm is something quite different.
If they're replacing it with a better functioning prosthetic, though, would it really count as harm?
I had a Bill Maher quote here. But fuck him for his white privilege "joke".

All the rest? Too long.
User avatar
Aaron MkII
Jedi Master
Posts: 1358
Joined: 2012-02-11 04:13pm

Re: Do you believe a technological singularity is near?

Post by Aaron MkII »

You know, I'm not sure.

Ethics isn't exactly my thing.
User avatar
PeZook
Emperor's Hand
Posts: 13237
Joined: 2002-07-18 06:08pm
Location: Poland

Re: Do you believe a technological singularity is near?

Post by PeZook »

Not all doctors are paragons of virtue, either. It's a sufficiently gray area that unless ethics boards find it a problem, I'm sure you'll be able to find plenty of doctors willing to do it for a good price.

And even if ethics boards in some countries DO find it a problem, surgical tourism isn't exactly unheard of...
Image
JULY 20TH 1969 - The day the entire world was looking up

It suddenly struck me that that tiny pea, pretty and blue, was the Earth. I put up my thumb and shut one eye, and my thumb blotted out the planet Earth. I didn't feel like a giant. I felt very, very small.
- NEIL ARMSTRONG, MISSION COMMANDER, APOLLO 11

Signature dedicated to the greatest achievement of mankind.

MILDLY DERANGED PHYSICIST does not mind BREAKING the SOUND BARRIER, because it is INSURED. - Simon_Jester considering the problems of hypersonic flight for Team L.A.M.E.
Zinegata
Jedi Council Member
Posts: 2482
Joined: 2010-06-21 09:04am

Re: Do you believe a technological singularity is near?

Post by Zinegata »

Irbis wrote:Yes, it's not easy/improbable. However, I just wanted to point out that ascribing human limitations of today to something made 20 years from now, far more intelligent than any human, is downright silly.
Except that, of course, a very large number of computer scientists - particularly those who have actually studied other fields (e.g. neurobiology) - have come to realize that the complexity of creating a self-thinking machine is so great that even a presumably "smarter" machine will not really be able to improve upon the design process. Again, even the guy who formulated Moore's Law thinks the Singularity is charlatan science.

Because again, "design ability" is actually something that is taught to humans, taking many years and a lot of experience. The people who insist machines will be smarter (most of whom are actually sci-fi writers) simply handwave this learning curve and think that dumping information into a computer brain will somehow make it uber smart. It won't. The machine will just look at the 100 GB of data you dumped on it and go "What the fuck, man?"

Learning is not about data-dumping or mindless memorization. This is why the core of current AI research is into creating proper data structures to store information and their associations; not building a bigger hard disk.
And that would be bad thing how exactly? :lol:
That you fail to realize that "creating porn for everything and anything" does not equal "revolutionizing science as we know it" simply demonstrates your tacit admission of glossing over the very real technical limitations of giving AIs "design ability".
Here's a reality check for you - even if we're talking about a 100 GB file, a modern desktop today has a 500-1000 GB disk. Do you know how much of it is used if the machine is used for typical office work and net surfing? 100 GB, tops. There are literally millions of such machines today; care to tell me how much free space there will be on a typical desktop, or even a cell phone, in 10 years?
Here's a reality check for you: no matter the capacity of a hard disk, the speed of reading/writing that information has generally remained the same for the past several years. So again, even if you have a 1,000 GB hard disk, you WILL slow down the whole system by copying a 100 GB file. And again, that's gonna use up your bandwidth allocation, and again this glosses over compatibility issues.

Ever tried copying a big file from a CD, from a USB disk, and from an Internet site at the same time? (Playing games doesn't count - games generally use CPU power and RAM rather than actively reading/writing data from the hard disk.) Ever seen it cause Windows Explorer to hang so that you can't even click on the C: icon anymore? That's the kind of slowdown I'm talking about, not this complete irrelevancy about hard disk space.
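The point about transfer speed dominating disk capacity is easy to put in numbers. A back-of-the-envelope sketch in Python; the throughput figures are illustrative assumptions, not measurements of any particular hardware:

```python
# Rough back-of-the-envelope: how long does moving a 100 GB file take?
# Throughput figures below are assumed order-of-magnitude values.

FILE_SIZE_GB = 100

channels = {  # sustained sequential throughput in MB/s (assumed)
    "consumer HDD (sequential write)": 100,
    "USB 2.0 flash drive": 30,
    "100 Mbps LAN (~12.5 MB/s theoretical)": 12,
    "1 Mbps home uplink (~0.125 MB/s)": 0.125,
}

for name, mb_per_s in channels.items():
    hours = FILE_SIZE_GB * 1024 / mb_per_s / 3600
    print(f"{name}: {hours:.1f} hours")
```

Even over a fast local disk the copy takes on the order of twenty minutes of saturated I/O; over a 1 Mbps uplink it takes over a week, which is exactly the kind of thing people notice.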
And while the main program won't be the size of a virus, if an AI wanted to spread itself by making a virus (one that infected as many machines as possible, looking for good candidates for AI infection, and activating sleeper cells in case of a purge), the size of the actual virus would be trivial. Good luck purging that, seeing as any active AIs would modify this safety net, observing human efforts to combat them and learning from them.
Ah, yes. "Let's have an AI spread a virus on the net like a giant octopus spreading its tentacles!" idiocy.

Cut the AI's LAN line and tear out the WiFi. The viruses suddenly have nothing to send information back to. End of story.

There are no "Internet bypasses". There is no "Wireless Internet". It is THAT fucking simple. Cut the LAN line, tear out the WiFi.

Again, only fucking idiots who write sci-fi stories believe this to be a realistic scenario.
And you're aware that there are countries where you get just straight broadband, no stupid download limits?
Actually, if you were in the telco industry (which I am and you clearly are not), you'd very quickly realize that any service provider which promises "unlimited broadband" is actually just promising a maximum transfer rate. When Verizon says you have a 100 Mbps connection, they aren't promising 100 Mbps all the time. In reality, this rate is "overbooked" by a considerable margin (often by a ratio of at least 8:1) when compared to the actual overall resources available to the network.

This is a completely sound setup with human Internet users (unless you're a dirty pirate who is downloading movies off the Internet; shitheads like these are one of the biggest reasons for Internet slowdowns), because Internet users generally don't download stuff constantly 24/7. Standard Internet behavior is to download a webpage, spend a couple of minutes reading it, then moving on to the next page, minimizing the simultaneous usage and making the overbooking issue largely moot.

However, if you're gonna try to spread a 100 GB file simultaneously to tens of thousands of users, the backbone of many ISPs will simply not be able to support it, and EVERYONE will crawl down to the overbooking levels. So your transfer rate is gonna collapse down to something like dial-up speed for EVERYONE.
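The overbooking mechanics above can be sketched with simple arithmetic. The figures here (100 Mbps advertised, 8:1 contention, 10,000 subscribers) are assumptions for illustration, not any real ISP's numbers:

```python
# Illustrative overbooking arithmetic: what happens when every subscriber
# tries to saturate their line at once.

advertised_mbps = 100    # what each subscriber is sold
contention_ratio = 8     # 8:1 overbooking assumed
subscribers = 10_000

# Backbone capacity actually provisioned for these subscribers:
backbone_mbps = advertised_mbps * subscribers / contention_ratio

# Normal browsing leaves most lines idle, so demand stays under capacity.
# But if everyone pushes a huge file simultaneously, demand is the full
# advertised aggregate:
demand_mbps = advertised_mbps * subscribers

# Everyone collapses to their fair share of the backbone:
per_user_mbps = backbone_mbps / subscribers
print(f"Demand: {demand_mbps} Mbps vs backbone: {backbone_mbps} Mbps")
print(f"Per-user throughput under saturation: {per_user_mbps} Mbps")
```

Under these assumed numbers every user drops to 1/8th of the advertised rate the moment the network saturates, and a mass simultaneous upload is precisely the pathological case the overbooking model assumes away.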

Even worse (because you are again a moron who isn't working for an actual telco), you completely fail to realize that each individual country has its own internal network. Japan and Korea, for instance, both have an absolutely fantastic internal network that's almost 100% fiber. People there can get a 50 Mbps connection easily.

However, before getting content from another country (i.e. the US), they have to go through several "gateways", which have very limited bandwidth.

Which is why, when you're in Japan, you can have a 50 Mbps connection and get content from Japanese websites very quickly, but it may take a long time to download the Yahoo US home page. Of course, big sites have done the sensible thing and locally cached their content (so that you don't have to go through the international gateway before accessing the Yahoo US page - you just go to the locally cached version instead), but again that's gonna bring any mass-upload Skynet scenario to a screeching halt - because your AI won't have a locally cached server.

The Internet pipeline is simply finite (the bottleneck being the gateways, which are reliant on a handful of submarine cables), and people WILL notice large scale uploads because it will cause a webwide disruption.

But of course, that's assuming your fictional Skynet can actually understand Japanese and be able to install itself in Japanese PCs. :lol:
Zinegata
Jedi Council Member
Posts: 2482
Joined: 2010-06-21 09:04am

Re: Do you believe a technological singularity is near?

Post by Zinegata »

Xeriar wrote:What mechanism of putting it on the web requires using it for updating its general model from individual human actors?

The most obvious human interaction for a general AI sitting on a server somewhere on the web is actually the reverse. Teach it about some system, and then allow it to be a part of a site's help system, for example, intelligently responding to humans and adapting itself to each unique human to get the most effective result.
Web crawling is actually already done today (on a somewhat primitive level from an intelligence perspective) by search engines. You can ask Google questions as part of your search query, and it will return related websites.

There's some work being done to further refine this and give the search engines more intelligence. Basically, you create a program, ask it to read through the whole world wide web, and then have it capable of answering questions asked by humans. The challenge is mainly making an AI that is capable of separating good data from bad data without needing human input.
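The crawl-filter-answer pipeline described above can be sketched as a toy. Everything here is an illustrative stand-in: the "web" is an in-memory dict, the quality filter is a deliberately naive heuristic, and real engines use far more sophisticated signals:

```python
# Toy sketch: walk a (here in-memory) "web", keep only pages passing a naive
# quality filter, and answer keyword queries from the surviving index.

TOY_WEB = {
    "http://example.org/fusion": "Nuclear fusion combines light nuclei, releasing energy.",
    "http://example.org/spam":   "BUY NOW!!! CLICK HERE!!! FREE!!!",
    "http://example.org/moon":   "The Moon orbits the Earth roughly every 27.3 days.",
}

def looks_like_good_data(text: str) -> bool:
    """Naive filter: reject shouty, low-information pages."""
    letters = [c for c in text if c.isalpha()]
    if not letters:
        return False
    upper_ratio = sum(c.isupper() for c in letters) / len(letters)
    return upper_ratio < 0.5

def build_index(web: dict) -> dict:
    return {url: text for url, text in web.items() if looks_like_good_data(text)}

def answer(index: dict, query: str) -> list:
    """Return URLs of pages mentioning every query word."""
    words = query.lower().split()
    return [url for url, text in index.items()
            if all(w in text.lower() for w in words)]

index = build_index(TOY_WEB)
print(answer(index, "moon orbits"))
```

The hard research problem is exactly the `looks_like_good_data` step: replacing a toy heuristic with something that reliably separates good data from bad without a human in the loop.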

Also, I will note that none of these applications involve hacking into government websites to take over control of nuclear missile facilities; which again precludes it from the Stupid Skynet Scenario.

The vast majority of current AI designs are as you noted, though: teach it about a system while offline (feeding it a very limited and focused subset of necessary information), and then use it as a tool to help humans. Because again, the web is full of bad and contradictory data, and the whole point of making an AI web crawler is to separate good data from the bad.
Zinegata
Jedi Council Member
Posts: 2482
Joined: 2010-06-21 09:04am

Re: Do you believe a technological singularity is near?

Post by Zinegata »

Destructionator XIII wrote:Zinegata, what do you think about using something like throttled bit torrent to do the transfer? The demon ai could wait several months to finish the transfer, doing piece by piece as it has a connection available, and do peer to peer shit as its local caches.

(of course this still begs the question of what copying around is really good for anyway.)
It'd be harder to detect and wouldn't cause obvious worldwide Internet slowdowns.
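The "wait several months" framing in the question is easy to check with arithmetic. The payload size and trickle rates below are illustrative assumptions:

```python
# Quick arithmetic: how long does a 100 GB payload take at trickle rates?

PAYLOAD_BYTES = 100 * 10**9

for label, bytes_per_s in [("50 KB/s trickle", 50_000),
                           ("10 KB/s trickle", 10_000)]:
    days = PAYLOAD_BYTES / bytes_per_s / 86_400
    print(f"{label}: {days:.0f} days")
```

At 10 KB/s the transfer does indeed stretch to nearly four months, which is slow enough to hide in background traffic but also gives defenders a very long window in which to spot and cut it.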

However, I'd also have to point out that much of Internet security nowadays revolves around detecting small unauthorized data streams - precisely the sort of thing that a throttled torrent connection would look like.
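A minimal sketch of that "small persistent stream" heuristic: flag outbound flows that stay alive for a long time at a low, steady rate. The flow records and thresholds here are made up for illustration; real intrusion-detection systems use many more signals:

```python
# Sketch: flag long-lived, low-rate outbound flows (the throttled-torrent
# signature). Thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Flow:
    dest: str
    duration_s: float   # how long the flow has been open
    bytes_sent: int

def is_suspicious(flow: Flow,
                  min_duration_s: float = 3600,
                  max_rate_bps: float = 50_000) -> bool:
    """Flows older than an hour that trickle data out below ~50 kbps."""
    if flow.duration_s < min_duration_s:
        return False
    rate_bps = flow.bytes_sent * 8 / flow.duration_s
    return rate_bps < max_rate_bps

flows = [
    Flow("cdn.example.net", duration_s=120, bytes_sent=50_000_000),  # quick download
    Flow("203.0.113.7", duration_s=86_400, bytes_sent=400_000_000),  # all-day trickle
]
for f in flows:
    print(f.dest, is_suspicious(f))
```

The quick high-rate download passes unflagged while the all-day trickle gets caught, which is why a throttled torrent trades speed for exactly the traffic shape monitoring tools look for.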

People may laugh at the general incompetence of anti-virus / spyware companies, but most viruses are in fact caught within a few weeks (often even days) of release, especially if it's aiming for a very widespread distribution. Same for spyware.

So while it may "infect" some computers that are without any reasonable security software, it will hardly be able to take control of every computer in the world - much less of vital systems, which are generally kept offline anyway. Plus, if the connection to the original AI is cut before it completes one full upload, all you'll have is an incomplete and non-functioning program.

But as you noted, there's really no good reason why it should copy itself around to every home computer anyway, and that's assuming it isn't completely cut off from the world wide web.
MrDakka
Padawan Learner
Posts: 271
Joined: 2011-07-20 07:56am
Location: Tatooine

Re: Do you believe a technological singularity is near?

Post by MrDakka »

FireNexus wrote:
Aaron MkII wrote:Scooping out a healthy eye or chopping off an arm is something quite different.
If they're replacing it with a better functioning prosthetic, though, would it really count as harm?
^That's the entire plot of Deus Ex: Human Revolution.

Personally, I'd wait until the integration between man and machine is more .... seamless. Flesh-and-metal interfaces are currently inelegant, and I think this is probably the main reason (even setting aside any technical problems/side effects) that would prevent people from cyborging up.

Of course, if we were WH40K orks, just bolt that stuff to any convenient anchoring point and voila! Instant cyborg. :D
Needs moar dakka
Post Reply