It might be more fun to say, but "I think like a supervillain" seems like a particularly dangerous cached thought to have, in that it will inspire overconfidence if you /ever actually encounter a supervillain/. A very smart agent whose goals are in direct conflict with yours AND who has no scruples AND hasn't forsworn the dark arts is /not someone you think like/.
Edit: Interestingly, the woman who insisted I take her number was positively disinterested when I did call.
I've always considered "be yourself" to mean "don't pretend to be someone you're not", which is wonderful advice because, unless you're /very/ good at acting, most people will see through your disguise.
There is something to be said for being confident enough not to follow the social script. As with seemingly most things in dating, the particular strategy doesn't matter very much - it's all about the way you portray yourself.
After a friend recommended giving women my number, I have completely stopped asking for theirs. With n=~10, only one has declined, saying that it was my responsibility to take hers. The others all seemed delighted that I was different, and that they were given more direct power over whether or not they'd see me again.
My general advice in this department would be to completely forget that there is a script and simply experiment to see what works for you.
My name is Sandy, and despite being a long-time lurker, meetup organizer and CFAR minicamp alumnus, I've got a giant ugh field around getting involved in the online community. Frankly, it's pretty intimidating and seems like a big barrier to entry - but this welcome thread is definitely a good start :)
IIRC, I was linked to Overcoming Bias through a programming pattern blog in the few months before LW came into existence, and subsequently spent the next three months of my life doing little other than reading the sequences. While it was highly fascinating and seemed good for my cognitive health, I never thought about applying it to /real life/.
Somehow I ended up at CFAR's January minicamp, and my life literally changed. After so many years, CFAR helped me finally internalize the idea that /rationalists should win/. I fully expect the workshop to be the most pivotal event in my entire life, and would wholeheartedly recommend it to absolutely anyone and everyone.
So here's to a new chapter. I'm going to get involved in this community or die trying.
PS: If anyone is in the Kitchener/Waterloo area, they should definitely come out to UW's SLC tonight at 8pm for our LW meetup. I can guarantee you won't be disappointed!
I'm bad at scheduling things and forgot to make a reservation. Tonight we'll be meeting at 9pm instead of 8. I really apologize for any inconvenience this might cause.
tl;dr: The time is changed to 9pm.
I just made the reservation - it's under Sandy, but I'll have a sign too, so don't worry too much about finding us. We're expecting a turnout of about 10 people :)
Honestly, my marginal returns from spending time on LW have dropped drastically since I finished reading the sequences. Attending local meetups was a fun way to meet some like-minded people, but they were inevitably far behind in the sequences and, for the most part, struck me as trying to identify as rationalists rather than trying to become more rational. This strikes me as the crux of the issue: LW has become (slash might have always been) an attractor of nerd social status, which is fine if that's its stated goal, but that doesn't seem to be the case.
Additionally, in the 5 years I've been attending meetups (at least six different ones in three different countries), I've noticed a drastic increase in the levels of weirdness, to the extent that I find myself discouraged from attending because I'd have to deal with these people. This is the point I think Witness was trying to express below, perhaps not in so many words, but I find myself explicitly not liking a lot of the people/memes now associated with LW. This is not good.
I do, however, think a place for cultivating rationality is important to have, and to that end I would suggest using Github as the platform. Imagine some sort of rationality repository (preferably without the LW label), where people could open pull requests for things they're thinking about or working on solving. As an added bonus, you get the ability to track how ideas change over time, can easily fork differing opinions, and get all of the cool things a commit history would do for you. In short: I think having some sort of rationality platform is important, but personally I would do away with the LW culture, keep our identities small, and individually go on our way.
LessWrong has had its time and its place, but its lingering death is probably something we should pay close attention to. As a community experiment, I think the results speak for themselves.