Harry Potter and the Methods of Rationality
Moderator: Steve
- Jedi Master
- Posts: 1127
- Joined: 2010-06-28 10:19pm
Re: Harry Potter and the Methods of Rationality
That's a relief. He won't accept slash with Malfoy. Too much human contact. The world heaves a sigh of relief.
Re: Harry Potter and the Methods of Rationality
I think this conversation, from Chapter 61, sums up why I love this story nicely:
"Do I really look like a guy with a plan? Y'know what I am? I'm a dog chasing cars. I wouldn't know what to do with one if I caught it! Y'know, I just do things..." --The Joker
- Battlehymn Republic
- Jedi Council Member
- Posts: 1824
- Joined: 2004-10-27 01:34pm
Re: Harry Potter and the Methods of Rationality
It's remarkable how often those who most champion and fetishize rationality, logic, or objectivism become the biggest enemies of those causes.

Lurks-no-More wrote: Yudkowsky is a nut, if an intelligent and occasionally interesting nut, and this particular fic is horribly overrated.
- Ahriman238
- Sith Marauder
- Posts: 4854
- Joined: 2011-04-22 11:04pm
- Location: Ocularis Terribus.
Re: Harry Potter and the Methods of Rationality
Having never met the man, what about him disturbs/annoys all of you so much?

Battlehymn Republic wrote: It's remarkable how often those who most champion and fetishize rationality, logic, or objectivism become the biggest enemies of those causes.

Lurks-no-More wrote: Yudkowsky is a nut, if an intelligent and occasionally interesting nut, and this particular fic is horribly overrated.
"Any plan which requires the direct intervention of any deity to work can be assumed to be a very poor one." - Newbiespud
- Battlehymn Republic
- Jedi Council Member
- Posts: 1824
- Joined: 2004-10-27 01:34pm
Re: Harry Potter and the Methods of Rationality
I don't think it's Yudkowsky per se. LessWrong in general is well-meaning, a community of rationalists trying to fight irrationality, but they fall into cargo cults of their own. RationalWiki covers it quite well. The section about "the Ugly" is the most hilarious, because the community's Singularitarians suddenly fall into a hysterical witch-hunt.
One source of considerable HCM was the "Forbidden Post", or "basilisk". Yudkowsky is interested in what could be described as causality that goes backwards in time: future events "causing" past events[22] by the mechanism of having something in the present simulate what someone will do in the future and using the results in the present, e.g. not giving a gun to someone you predict will shoot you. This gets odd when you imagine super-human intelligences, because their predictions may be near perfect.[23]

Roko (a top contributor at the time) wondered if a future Friendly AI would punish people who didn't do everything in their power to further the AI research from which this AI originated, by at the very least donating all they have to it. He reasoned that every day without AI, bad things happen (150,000+ people die every day, war is fought, millions go hungry) and a future Friendly AI would want to prevent this, so it might punish those who understood the importance of donating but didn't donate all they could. He then wondered if future AIs would be more likely to punish those who had wondered if future AIs would punish them. That final thought proved too much for some LessWrong readers, who then had nightmares about being tortured for not donating enough to SIAI.[24]

Eliezer Yudkowsky replied to Roko's post, calling him names and claiming that posting such things on an Internet forum could have caused incalculable harm to the people who read it. Four hours later, Yudkowsky deleted Roko's post[25] including all comments. Roko left LessWrong, deleting his thousands of posts and comments[26]. (He later briefly returned[27], and posted among other things that "I agree that the post in question should not appear in public"[28] and "I wish I had never learned about any of these ideas"[29].)
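The "gun" example above can be made concrete. The idea is that a prediction made by simulating an agent's future choice feeds back into a present decision, so the future choice effectively "causes" the present one. A minimal toy sketch (all names here are hypothetical, invented for illustration, not anything from LessWrong or the post in question):

```python
# Toy model of prediction-as-backwards-causality: the present decision
# (whether to hand over a gun) is made by simulating what the agent
# would do in the future and using that simulated result now.

def will_shoot(agent):
    """Predict the agent's future choice by simply running its policy."""
    return agent("armed")

def give_gun(agent):
    """Present-day decision, driven by the simulated future outcome."""
    return not will_shoot(agent)

hostile = lambda situation: True    # policy: would shoot if armed
friendly = lambda situation: False  # policy: would not shoot

print(give_gun(hostile))   # False: the predicted future shapes the present
print(give_gun(friendly))  # True
```

With a perfect simulator, the agent's future behavior fully determines the present decision, which is why the thought experiment gets strange once you assume near-perfect superhuman predictors.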
- Lurks-no-More
- Redshirt
- Posts: 40
- Joined: 2010-07-18 05:14am
Re: Harry Potter and the Methods of Rationality
And that sort of stupid bullshit is why, despite having plenty of interest in transhumanist ideas, I can't take the Singularitarians and other such groups seriously.