Silk Road drugs market shut down; alleged operator busted.
Bitcoin drops from $125 to $90 in heavy trading.
Edited to add: Well, that was quick. Doesn't look like the bottom fell out.
Edited again: Here's the criminal complaint against the alleged operator. The details at least make sense as a story: in the early days of Silk Road, the alleged operator had really lousy opsec, linking his name to the Silk Road project. Then later, he seems to have got scammed by a guy who first threatened to extort him, then pretended to be a hit-man who would kill the extortionist.
If anyone wants to read all the primary source documents, see http://www.reddit.com/r/SilkRoad/comments/1nmiyb/compiling_all_dprrelevant_pages_suggestions_needed/
I need some advice. I recently moved to a city and I don't know how to stop myself from giving money to strangers! I consider this charity to be questionable and, at the very least, inefficient. But when someone gets my attention, asks me specifically for a certain amount of money, and tells me about themselves, I won't refuse. I don't even feel annoyed that it happened, but I do want it to stop happening. What can I do?
The obvious precommitment to make is to never carry cash. I am strongly considering this and could probably do it, but it is nice to have at least enough for a bus trip, a quick lunch, or some emergency. I have tried keeping a running tally of the number of people refused, with the plan that when it reaches, say, 20, I donate something to a known legitimate charity. While doing so makes me feel better about passing beggars by, it doesn't help once someone gets me one-on-one, so I've never reached that tally without first resetting it by succumbing to someone. Is there some way to not look like an easy mark? Are there any good standard pieces of advice and resources for this?
However, I always find these exchanges to be really fascinating from the ...
The basic answer is not to talk to these people.
Do not answer questions about what time it is, do not enter any conversations at all. At most say "sorry" and walk on.
Just. Do. Not. Talk. To. Them.
Assume that they're scamming. It will often be true, and even when they're honest, giving money to panhandlers is an inefficient use of charity. Remind yourself that you already have a budget for charity and that you're sending it to GiveWell or MIRI or whatever.
An idea: next time, try to estimate how much money such a person makes. As a rough estimate, divide the money you gave them by the length of your interaction. (To get a more precise estimate, you would have to follow them and observe how much other people give them, but that could be pretty dangerous for you.)
Years ago I made a similar estimate for a beggar on a street (people dropped money into his cap, so it was easy to stand nearby, watch for a few minutes, and calculate), and the conclusion was that his income was above average for my country.
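To make the arithmetic concrete, here is a minimal sketch of that back-of-the-envelope estimate; the amounts and durations below are made up for illustration, not observed data.

```python
# Back-of-the-envelope estimate of a panhandler's implied hourly income.
# All amounts and times below are hypothetical.
money_given = 5.00           # dollars you handed over
interaction_minutes = 3.0    # how long the pitch took

implied_hourly_rate = money_given / (interaction_minutes / 60.0)
print(f"Implied rate from your own donation: ${implied_hourly_rate:.2f}/hour")

# The more reliable version from the cap example: watch from nearby and
# divide total observed donations by the observation time.
observed_donations = [1.0, 0.5, 2.0, 1.0]   # dollars dropped in, hypothetical
observation_minutes = 10.0
observed_rate = sum(observed_donations) / (observation_minutes / 60.0)
print(f"Observed rate: ${observed_rate:.2f}/hour")
```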
By the way, these people destroy a lot of social capital by their actions. They make life more difficult for people who genuinely want to ask for the time, or how to get somewhere, or similar things. They condition people against having small talk with people they don't know. -- So if you value people being generally kind to strangers, remember that these scammers make their money by destroying that value.
Interesting statements I ran into regarding the kabuki-theater aspects of the so-called United States federal government shutdown of 2013. This resulted in, among other things, websites being shut down.
A website shouldn't just go down when the people managing it stop working; it's not like they're pedaling away inside the servers. Block the federal highways with army tanks: sorry, the government is closed.
There is a nontrivial segment of the voting public who genuinely believe that money turns into working technology by magical alchemy.
I was interested to know this kind of thing has a name: Washington Monument Syndrome.
The name derives from the National Park Service's alleged habit of saying that any cuts would lead to an immediate closure of the wildly popular Washington Monument.
As a sysadmin, if I were to be furloughed indefinitely I would probably spin down any nontrivial servers. A server that goes wrong and can't be accessed is a really, really, really, really terrible-horrible-no-good-very-bad thing. And things go wrong on a regular basis in normal times; when the government is shut down and a million things that get done every day suddenly stop being done, something somewhere is going to break. Some 12-year-old legacy cron job sitting in an obscure corner of an obscure server, written by a long-departed contractor, is going to notice that the foobar queue is empty, which turns out to be undefined behavior because the foobar queue has always had stuff going through it before, so it executes an else branch it's never had occasion to execute, which sends raw debugging information to a production server because the contractor was bad at things, and also included passwords in their debugging because they were really bad at things...
This is actually a terrible example of Washington Monument Syndrome.
" Hi, Server admin here... We cost money as does our infrastructure, I imagine a site that large costs a very good deal, we aren't talking five bucks on bluehost here.
I am private sector, but if I were to be furloughed for an indeterminate amount of time, there are really two options. Leave things on autopilot until the servers inevitably break or the site crashes, at which point parts or all of it will be left broken without notice or explanation. Or put up a splash page, spin down 99% of my infrastructure (that splash page can run on a five-dollar Bluehost account), and then leave. I won't be able to come in while furloughed to put the site back up after it crashes.
If you really think web apps keep themselves running 24/7 without intervention, then we really have been doing a great job with that illusion, and I guess the sleepless nights have been worth it to be successfully taken for granted."
I've heard several stories in the last few months of former theists becoming atheists after reading The God Delusion or a similar Four Horsemen tract. This conflicts with my prior model of those books as mostly paper applause lights that couldn't possibly change anyone's mind.
Insofar as atheism seems like super-low-hanging fruit on the tree of increased sanity, having an accurate model for what gets people to take a bite might be useful.
Has anyone done any research on what makes former believers drop religion? More generally, any common triggers that lead people to try to get more sane?
Edit: Found a book: Deconversion: Qualitative and Quantitative Results from Cross-Cultural Research in Germany and the United States of America. It's recent (2011) and seems to be the best research on the subject available right now. Does anyone have access to a copy?
I can tell you what triggered me becoming an atheist.
I was reading a lot of Isaac Asimov books, including the non-fiction ones, and gained respect for him. After learning he was an atheist, atheism started being a possibility I considered. From there, I was able to figure out which possibility was right on my own.
This seems to be a trend. I never seriously worried about animals until joining felicifia.org, where a lot of people do. I never seriously considered that wild animals' lives aren't worth living until I found out that some of the people there do. I think it's a lot harder to seriously consider an idea if nobody you respect holds it. Just knowing that a good portion of the population is atheist isn't enough; once you know one person who holds the view, it doesn't matter how many people hold the opposite opinion. You are now capable of considering it.
I didn't think unfriendly AI was a serious risk until I came here, but that might have been more about the arguments. I figured that an AI could just be programmed to do what you tell it to and nothing more (and from there could be given Asimov-style laws). It wasn't until I learned more about the nature of intelligence that I realized this is not likely to be easy. Intelligence is inherently goal-based, and it will maximize whatever utility function you give it.
Theism isn't just about god. It also has social, and therefore strong emotional, consequences. If I stop being a theist, does it mean I will lose my friends, my family will become colder toward me, and I will lose access to the world's widest social networks?
In that case the required new information isn't a disproved miracle or an essay on Occam's razor; those have zero impact on the social consequences. It's more important to get evidence that there are a lot of atheists, that they can be happy, and that some of them are considered very cool even outside of atheist circles. (And after having this evidence, somehow, the essays about Occam's razor become more convincing.)
Or let's look at it from the opposite side: even the most stupid demonstrations of faith send the message that it is socially accepted to be religious; that after joining a religion you will never be alone. Religion is so widespread not because the priests are extra cool or extra intelligent. It's because they are extra visible and extra audacious: they have no problem declaring that everyone who disagrees with them is stupid and evil and will go to hell (or some more polite version of this, which still gets the message across) -- a...
I'm in the process of translating some of the Sequences into French. I have a quick question.
From The Simple Truth:
Mark sighs sadly. “Never mind… it’s obvious you don’t know. Maybe all pebbles are magical to start with, even before they enter the bucket. We could call that position panpebblism.”
This is clearly a joke at the expense of some existing philosophical position called pan[something] but I can't find the full name, which may be necessary to make the joke understandable in French. Can anyone help?
In the past few hours, my total karma score has dropped by fifteen points. It looks like someone is going back through my old comments and downvoting them. A quick sample suggests that they've hit everything I've posted since some time in August, regardless of topic.
Is this happening to anyone else?
Anyone with appropriate access care to investigate?
To whoever's doing this — Here's the signal that your action sends to me: "Someone, about whom all you know is that they have an LW account that they use to abuse the voting system, doesn't like you." This is probably not what you mean to convey, but it's what comes across.
I got an offer of an in-person interview from a tech company on the left coast. They want to know my current salary and expected salary. Position is as a software engineer. Any ideas on the reasonable range? I checked Glassdoor and the numbers for the company in question seem to be 100k and a bit up. I suppose, actually, that this tells me what I need to know, but honestly it feels awfully audacious to ask for twice what I'm making at the moment. On the other hand I don't want to anchor a discussion that may seriously affect my life for the next few years at too small a number. So, I'm seeking validation more than information. Always audacity?
Always ask for as much as you can. Otherwise you are just donating the money to your boss. If you hate having too much money, consider donating to MIRI or CFAR or GiveWell instead. Or just send it to me. (A possible exception is if you work for a charity, in which case asking for less than you could is a kind of donation.)
The five minutes of negotiating your salary are likely to have more impact on your future income than the following years of hard work. Imagine yourself a few years later, trying to get a 10% increase and hearing a lot of bullshit about how the economic situation is difficult (hint: it is always difficult), so you should all just work harder and maybe later, but no promises.
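A toy calculation of how much the starting number matters once ordinary percentage raises compound on top of it; the salaries and raise rate below are made up, not taken from the discussion.

```python
# Hypothetical illustration: a higher negotiated starting salary compounds
# through later percentage raises, so the gap grows rather than shrinks.
def salary_in_year(start, annual_raise, year):
    return start * (1 + annual_raise) ** year

low_offer, high_offer = 100_000, 125_000   # hypothetical starting offers
annual_raise, years = 0.03, 5              # hypothetical 3% raises, 5 years

for year in range(years):
    low = salary_in_year(low_offer, annual_raise, year)
    high = salary_in_year(high_offer, annual_raise, year)
    print(f"year {year}: {low:,.0f} vs {high:,.0f} (gap {high - low:,.0f})")

total_gap = sum(salary_in_year(high_offer, annual_raise, y) -
                salary_in_year(low_offer, annual_raise, y) for y in range(years))
print(f"Cumulative difference over {years} years: {total_gap:,.0f}")
```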
it feels awfully audacious to ask for twice what I'm making at the moment
I know. Been there, twice. (Felt like an idiot after realising that I worked for a quarter of my market price at the first company. Okay, that's exaggerated, because my market price increased with the work experience. But it was probably half of the market price.)
The first time, I was completely inexperienced about negotiating. It went like: "So tell me how much you want." "Uhm, you tell me how much you give peop...
Don't deliberately screw yourself over. Don't accept less than the average for your position, and either point-blank refuse to give them negotiating leverage by telling them your current salary, or lie.
For better, longer advice see [Salary Negotiation for Software Engineers](http://www.kalzumeus.com/2012/01/23/salary-negotiation)
I'm afraid I couldn't quite bring myself to follow all the advice in your link, but at any rate I increased my number to 125k. So, it helped a bit. :)
Look up what Ramit Sethi has to say about salary negotiation. He really outlines how things look from the other side and why asking for your 100k is not nearly as audacious as it seems.
I would like to eventually create a homeschooling repository. Probably with research that might help people in deciding whether or not to homeschool their children, as well as resources and ideas for teaching rationality (and everything else) to children.
I have noticed that there have been several questions in past open threads about homeschooling and unschooling. One of the first things I plan to do is read through all past LessWrong discussions on the topic. I haven't really started researching yet, but I wanted to start by asking if anyone had anything that they think would belong in such a repository.
I would also be interested in hearing any personal opinions on the matter.
Homeschooling is like growing your own food (or doing any other activity where you don't take advantage of division of labor): if you enjoy it, have time for it and are good at it, it's worth trying. Otherwise it's useless frustration.
I couldn't agree more about division of labor in general, but with the current state of the public school system, I do not trust them to do a good job of teaching anything.
I do not have the time or patience for it, and probably am not good at it, but fortunately my partner would be the one teaching.
Mindkilling for utilitarians: Discussion of whether it would have made sense to shut down the government to try to prevent the war in Iraq
More generally, every form of utilitarianism I've seen assumes that you should value people equally, regardless of how close they are to you in your social network. How much damage are you obligated to do to your own society for people who are relatively distant from it?
How can I acquire melatonin without a prescription in the UK? The sites selling it all look very shady to me.
It's melatonin; melatonin is so cheap that you actually wouldn't save much, if any, money by sending your customers fakes. And the effect is clear enough that they'd quickly call you on fakes.
And they may look shady simply because they're not competently run. To give an example, I've been running an ad from a modafinil seller, and as part of the process, I've gotten some data from them - and they're easily costing themselves half their sales due to basic glaring UI issues in their checkout process. It's not that they're scammers: I know they're selling real modafinil from India and are trying to improve. They just suck at it.
If I make a target, but instead of making it a circle, I make it an immeasurable set, and you throw a dart at it, what's the probability of hitting the target?
If you construct a set in real life, then you have to have some way of judging whether the dart is "in" or "out". I reckon that any method you can think of will in fact give a measurable set.
Alternatively, there are several ways of making all sets measurable. One is to reject the Axiom of Choice. The AoC is what's used to construct immeasurable sets. It's consistent in ZF without AoC that all sets are Lebesgue measurable.
If you like the Axiom of Choice, then another alternative is to only demand that your probability measure be finitely additive. Then you can give a "measure" (such finitely additive measures are actually called "charges") such that all sets are measurable. What's more you can make your probability charge agree with Lebesgue measure on the Lebesgue measurable sets. (I think you need AoC for this though.)
In L.J. Savage's "The Foundations of Statistics" the axioms of probability are justified from decision theory. He only ever manages to prove that probability should be finitely additive; so maybe it doesn't have to be countably additive. One bonus of finite additivity for Bayesians is that lots of improper priors become proper. For example, there's a uniform probability charge on the naturals.
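For concreteness, the usual way to build such a charge starts from natural density; this is standard textbook material rather than anything specific to Savage, and extending it to every subset needs a Banach limit, which again relies on a weak choice principle.

```latex
% Natural (asymptotic) density of A \subseteq \mathbb{N}, where the limit exists:
\[
  d(A) \;=\; \lim_{n \to \infty} \frac{\lvert A \cap \{1, \dots, n\} \rvert}{n}.
\]
% Examples: d(\text{even numbers}) = 1/2, d(\text{perfect squares}) = 0,
% and d(\{k\}) = 0 for every single k, so d cannot be countably additive
% (the singletons would have to sum to 1). Applying a Banach limit in place
% of the ordinary limit extends d to a finitely additive probability charge
% defined on every subset of \mathbb{N}.
```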
Topic: Investing
There seems to be a consensus among people who know what they're talking about that the fees you pay on actively managed funds are a waste of money. But I saw some friends arguing about investing on Facebook, with one guy claiming that index funds are not actually the best way to go for diversified investing that doesn't waste any money on fees. Does anyone know if there is anything to this? More specifically, are Vanguard's funds really as cheap as advertised, or is there some catch to them?
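As a rough illustration of why expense ratios get so much attention: the return and fee numbers below are made up, and a fund's actual expense ratio would have to be checked on its own fact sheet.

```python
# Hypothetical illustration of fee drag compounding over a long horizon.
def final_value(principal, gross_return, expense_ratio, years):
    # Crude model: fees simply reduce the annual return.
    return principal * (1 + gross_return - expense_ratio) ** years

principal, gross_return, years = 10_000, 0.07, 30
index_fee, active_fee = 0.001, 0.01     # 0.10% vs 1.00% per year, hypothetical

index_value = final_value(principal, gross_return, index_fee, years)
active_value = final_value(principal, gross_return, active_fee, years)
print(f"index fund:  {index_value:,.0f}")
print(f"active fund: {active_value:,.0f}")
print(f"difference:  {index_value - active_value:,.0f}")
# A ~0.9 percentage point fee gap compounds into a substantial difference
# over 30 years, which is the usual argument for low-cost index funds.
```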
To find previous Open Threads, click on the "open_thread" link in the list of tags below the article. It will show you this page:
http://lesswrong.com/r/discussion/tag/open_thread/
For some reason that I don't understand, the Special threads wiki page has a link to this:
http://lesswrong.com/tag/open_thread/
...but that page doesn't work well.
Am I mistaken, or do the Article Navigation buttons only ever take me to posts in Main, even if I start out from a post in Discussion? Is this deliberate? Why?
Another PT:LoS question. In Chapter 8 ("Sufficiency, Ancillarity and all that"), there's a section on Fisher information. I'm very interested in understanding it, because the concept has come up in important places in my statistics classes without any conceptual discussion of it: it appears in the Cramer-Rao bound and the Jeffreys prior, but it looks so arbitrary to me.
Jaynes's explanation of it as a difference in the information different parameter values give you about large samples is really interesting, but there's one step of the math that I just c...
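For reference, here are the standard definitions the comment is gesturing at; this is textbook material, not Jaynes's specific derivation from Chapter 8.

```latex
% Fisher information of a one-parameter model p(x \mid \theta):
\[
  I(\theta)
  = \mathbb{E}_\theta\!\left[\left(\frac{\partial}{\partial\theta}
      \log p(X \mid \theta)\right)^{\!2}\right]
  = -\,\mathbb{E}_\theta\!\left[\frac{\partial^2}{\partial\theta^2}
      \log p(X \mid \theta)\right]
  \quad \text{(under regularity conditions).}
\]
% Cramer-Rao bound: for any unbiased estimator \hat\theta based on n i.i.d.
% observations,
\[
  \operatorname{Var}(\hat\theta) \;\ge\; \frac{1}{n \, I(\theta)}.
\]
% Jeffreys prior: \pi(\theta) \propto \sqrt{I(\theta)}, chosen because it is
% invariant under reparametrization of \theta.
```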
There is too much unwarranted emphasis on ketosis when it comes to keto diets, rather than on hunger satiation. That might sound like a weird claim since the diet is named after ketosis, but when it comes to the efficacy of the keto diet for weight loss (setting aside potential health or cognitive effects), ketosis itself has little to do with the weight loss. Most attempts to explain the keto diet almost always start with an explanation of what ketosis is, with an emphasis on attaining ketosis rather than on hunger satiation and caloric deficit. Here is the intro excerpt...
Yet another newbie question. What's the rational way to behave in a prediction market where you suspect that other participants might be more informed than you?
Here's a toy model to explain my question. Let's say Alice has flipped a fair coin and will reveal the outcome tomorrow. You participate in a prediction market over the outcome of the coin. The only participant besides you is Bob. Also you know that Alice has flipped another fair coin to decide whether to tell Bob the outcome of the first coin in advance. What trades should you offer to Bob, and wha...
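Here is a sketch of the adverse-selection calculation for this toy model. The bid/ask framing and the assumption that Bob trades one contract whenever it profits him are my additions, not part of the original question.

```python
# Toy model: a fair coin; with probability 1/2, Bob knows the outcome.
# You quote an ask (price at which you'll sell a "pays 1 if heads" contract)
# and a bid (price at which you'll buy one); Bob trades only when it helps him.
def expected_profit(bid, ask):
    # Informed Bob (prob 1/2): buys at `ask` if the coin is heads
    # (your loss: 1 - ask), sells at `bid` if tails (your loss: bid).
    informed = 0.5 * -(1 - ask) + 0.5 * -bid
    # Uninformed Bob (prob 1/2): values the contract at 0.5, so he only
    # trades if one of your quotes is already mispriced against you.
    uninformed = 0.0
    if ask < 0.5:
        uninformed += ask - 0.5   # he buys cheap; you lose in expectation
    if bid > 0.5:
        uninformed += 0.5 - bid   # he sells dear; you lose in expectation
    return 0.5 * informed + 0.5 * uninformed

for bid, ask in [(0.4, 0.6), (0.2, 0.8), (0.0, 1.0)]:
    print(f"bid={bid}, ask={ask}: expected profit {expected_profit(bid, ask):+.3f}")
# With the spread bracketing the fair value 0.5, the uninformed Bob never
# trades, so every fill you get is against the informed Bob and you lose:
# the classic adverse-selection / no-trade conclusion.
```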
If it's worth saying, but not worth its own post (even in Discussion), then it goes here.