I want to be stronger.

I want to outsource my weaknesses to people who are strong where I am weak, and take on the workload that my strengths can handle.

Like Tim Ferriss and A.J. Jacobs, I'd like to have an assistant working in synergy with me, to greatly increase our productivity.

...and I founded and direct an institute in Brazil (www.ierfh.org; use Google Chrome for translation), the Institute for Ethics, Rationality, and the Future of Humanity, which aims in the same general direction as the Less Wrong community.

We work mostly in two broad areas: transhumanism and effective altruism. We are currently doing outreach and filtering for people who are motivated, smart, and altruistic (it is okay to be altruistic only towards one's future self). Some are researching and lecturing. Since last February, we have grown to 31 members, more than I'd have expected to be interested in these topics in the whole of Brazil, so that was great!

Here is the catch: because I live in Brazil, the assistant does not need to be in India, nor cost as much as one in the US!

But then, where are the best places to find someone with the set of characteristics Luke Muehlhauser was looking for in his assistants? (Link to my Portuguese Facebook share: http://www.facebook.com/gdiego.vichutilitarian/posts/349182458498891)

Ultra-summary of abilities: very good command of English; goals that are either pro-technology or pro-effective-giving; minimally rational; somewhat rich (there is a niche of people in Brazil who work to feel fulfilled more than for money; this correlates strongly with good English skills and all the abilities social class can buy); able to think for themselves when given tasks as abstract as "Find contact info for 5 researchers on happiness who published in 2010-2012, ask them whether they would like to jointly publish on topic X, and tell me their thoughts."

How would you go about finding an assistant for yourself or your institute? How would you optimize for the above-mentioned criteria?

Suppose my current strategy is posting job openings on undergrad bulletin boards. Can you improve on it?

PS: Technically this is not exactly a job offer, because I already know all the Brazilians on Less Wrong, but if you feel like moving to São Paulo and would like to take the job, please do tell me! I recently met a Less Wrong couple moving to Australia to optimize for beautiful landscapes. Coming here means optimizing for a socially relaxing environment, cuddly, happy, huggy people, and delicious yet quite expensive food.


Downvoted for roundaboutness

Pros for upvoting: spreading rationality, positive news on progress.
Pros for downvoting: conflicting use of strength build-up and bypass, low focus on the title issue, post-category confusion (is this an advert, a news report, a job offering, or a methodology query?), low topic cohesion (transhumanism is a tack-on).

Is it just me, or has transhumanism become a taboo word associated with low-status crackpots around here?

Transhumanism is a reason some folk visit here despite being less interested in epistemic rationality as such, so I would expect an enrichment of self-identified transhumanists among the authors of badly received posts. It's also a political tribe for a number of people, so there is room for mind-killing.

However, I'd be curious to see your links to examples.

Quite a few associations rooted in transhumanism have attempted (whether successfully is questionable) to distance themselves from the plain description of their original goals and beliefs, which sounds crazy to a mainstream audience, in an effort to attract more and better-quality funding and following (such as from academia).

Compare:

Longecity, formerly The Immortality Institute

Humanity+, formerly The World Transhumanist Association

The Singularity Institute retains its name, but seems willing to follow suit.

I think I'm observing an emerging pattern where several loaded topics, such as transhumanism and cryonics, have become much more controversial and unfashionable in places that previously championed them. Less Wrong is no exception: there has been concern that such topics may not have a place on a forum devoted to rationality.

You appear to express this connection (transhumanism being unfashionable) yourself in this sentence: "Denotationally crazy political (namely, transhumanist) rhetoric."

Longecity, formerly The Immortality Institute

Huh. Both of these names seem pretty terrible to me. Longecity just sounds peculiar and unmemorable, while "Immortality Institute" seems ridiculously overblown as "a plain description of their original goals and beliefs." And there is a "Life Extension Foundation."

Humanity+, formerly The World Transhumanist Association

I think that they were trying to avoid sounding "anti-human," which "transhumanist" has some connotations of. Those connotations are significantly true if we're talking about allowing serious genetic enhancement or the creation of brain emulations, although I'm pretty sure that transhumanists overwhelmingly want the welfare and survival of existing humans (such as themselves!) protected.

Also, historically, I think that this was a ploy to increase growth and become more fashionable, rather than a reaction to increasing unfashionability.

The Singularity Institute remains named so, but seems willing to follow suit.

The SI is small relative to other forces shaping the popular meaning of the term Singularity, e.g. Kurzweil's books, or people using the term "Singularity" to talk about advances in video game technology (which has actually happened at a major conference). So it's hard to avoid confusion with other meanings of the word, or with different views and organizations (e.g. people confuse SI for Singularity University fairly often).

You appear to express this connection (transhumanism being unfashionable) yourself in this sentence: "Denotationally crazy political (namely, transhumanist) rhetoric."

I would make similar comments about denotationally crazy rhetoric on behalf of other political ideologies like liberalism, conservatism, nationalism, and so on. I was saying first that the claims were factually false, and second that they were being made in an ideologically charged way that it's helpful to avoid. I certainly don't think that transhumanism implies denotational craziness much more than other ideologies. If anything, I'd say the opposite, because of the high average levels of education, intelligence, secularism, and so forth among transhumanists and the concordance of their ethical views with elites along those dimensions who do not self-identify as transhumanist.

Ultra-summary of abilities: very good command of English; goals that are either pro-technology or pro-effective-giving; minimally rational; *somewhat rich* (there is a niche of people in Brazil who work to feel fulfilled more than for money; this correlates strongly with good English skills and all the abilities social class can buy)

(emphasis added)

Is this acceptable now? I suspected some would practice such discrimination privately, but to proclaim it publicly and to expect it to be seen as a fair requirement surprises me.

Wealth is moderately correlated with intelligence/instrumental rationality (especially for those over 30 years of age), so it might work as a decent filter (in conjunction with other metrics) for their purposes.

I'm sure it is correlated. One might even find correlations with other things, such as race and gender... I questioned the fairness of using it as a way to recruit people.

Very good command of English

I don't know what counts as very good English in Brazil, but your post wouldn't pass as written by a native speaker of the language.

I assume that's why he wants someone who can communicate well in English: so that he and his rationality organization don't get disqualified from opportunities simply because of a language barrier.

You're right; I misread the post.