Directly visiting http://foretold.io gives an ERR_NAME_NOT_RESOLVED. Can you make it so that foretold.io redirects to www.foretold.io?
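In case it helps, the usual fix is a DNS record for the apex (A or ALIAS/ANAME, depending on the DNS host) plus an HTTP 301 from the apex to www. As a purely illustrative sketch of what the redirect logic should do (real deployments configure this in the web server or DNS host, not application code):

```python
def redirect_location(host, path):
    """Map a bare-apex request to its www equivalent with a 301.
    The hostnames here mirror the foretold.io case; everything else
    about the deployment is assumed, not known."""
    if host == "foretold.io":
        return 301, "https://www.foretold.io" + path
    return 200, None  # already on www (or some other host): serve normally
```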
That's a normal part of life :). Anything that I decide to do on a future day, I'll copy/paste over there, but I usually won't delete the items from the checklist for the day when I didn't complete them (thereby creating a record of things I expected or hoped to do, but didn't).
For instance, at https://github.com/vipulnaik/daily-updates/issues/54 I have two undone items.
There is some related stuff by Carl Shulman here: https://www.greaterwrong.com/posts/QSHwKqyY4GAXKi9tX/a-personal-history-of-involvement-with-effective-altruism#comment-h9YpvcjaLxpr4hd22 that largely agrees with what I said.
My understanding is that Against Malaria Foundation is a relatively small player in the space of ending malaria, and it's not clear the funders who wish to make a significant dent in malaria would choose to donate to AMF.
One of the reasons GiveWell chose AMF is that there's a clear marginal value to small donation amounts in AMF's operational model -- with a few extra million dollars they can finance bednet distribution in another region. It's not necessarily that AMF itself is the most effective charity to donate to for ending malaria -- it's just the one with the best-proven cost-effectiveness for donors at the scale of a few million dollars. But it isn't necessarily the best opportunity for somebody with much larger amounts of money who wants to end malaria.
The main difference I can make out between the EA/GiveWell-sphere and the general global health community is that malaria interventions (specifically ITNs) get much more importance in the EA/GiveWell-sphere, whereas in the general global health spending space, AIDS gets more importance. I've written about this before: http://effective-altruism.com/ea/1f9/the_aidsmalaria_puzzle_bleg/
I tried looking in the IRS Form 990 dataset on Amazon S3, specifically searching the text files for forms published in 2017 and 2016.
I found no match for (case-insensitive) "openai" other than one organization that was clearly different: its name had "openair" in it. Searching (case-insensitive) "open ai" gave matches that all had "open air" or "open aid" in them. So, it seems like either they have a really weird legal name or their Form 990 has not yet been released. Googling didn't reveal any articles of incorporation or legal name.
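The "open air"/"open aid" false positives can be filtered out with a negative lookahead. A sketch of the filter I'd use on the filer names (the surrounding dataset layout -- yearly index files of filer names -- is an assumption on my part):

```python
import re

# Match "openai" or "open ai" case-insensitively, but reject the
# "open air" / "open aid" false positives via a negative lookahead.
# Still a rough filter: e.g. "open aim..." would slip through.
OPENAI_RE = re.compile(r"open\s?ai(?![rd])", re.IGNORECASE)

def matching_filers(names):
    """Return filer names that plausibly refer to OpenAI."""
    return [n for n in names if OPENAI_RE.search(n)]
```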
In my experience, writing full-fledged, thoroughly researched material is pretty time-consuming, and if you push that out to the audience immediately, (1) you've sunk a lot of time and effort that the audience may not appreciate or care about, and (2) you might have too large an inferential gap with the audience for them to meaningfully engage.
The alternative I've been toying with is something like this: when I'm roughly halfway through an investigation, I publish a short post that describes my tentative conclusions, without fully rigorous backing, but with (a) clearly stated conclusions, and (b) enough citations and other signals that there's decent research backing my process. Then I ask people what they think of the thesis, which parts they are interested in, and what they are skeptical of. Then after I finish the rest of the investigation I push a polished writeup only for those parts (for the rest, it's just informal notes + general pointers).
For examples, see https://www.lesserwrong.com/posts/ghBZDavgywxXeqWSe/wikipedia-pageviews-still-in-decline and http://effective-altruism.com/ea/1f9/the_aidsmalaria_puzzle_bleg/ (both are just the first steps of their respective projects).
I feel like this both makes comments more valuable to me and gives more incentive to commenters to share their thoughts, but the jury is still out.
FWIW, my impression is that data on Wikipedia has gotten somewhat more accurate over time, due to the push for more citations, though I think much of this effect occurred before the decline started. I think the push for accuracy has traded off a lot against growth of content (both growth in the number of pages and growth in the amount of data on each page). These are crude impressions (I've read some relevant research, but I don't have strong reason to believe it should be decisive in this evaluation), and I'm curious to hear what specific impressions you have that are contrary to this.
If you have more fine-grained data at your disposal on different topics and how much each has grown or shrunk in terms of number of pages, data available on each page, and accuracy, please share :).
In the case of Wikipedia, I think the aspects of quality that correlate most with explaining pageviews are readily proxied by quantity. Specifically, the main quality factors in people reading a Wikipedia page are (a) the existence of the page (!), and (b) whether the page has the stuff they were looking for. I proxied the first by number of pages, and the second by length of the pages that already existed. Admittedly, there are a lot more subtleties to quality measurement (which I can go into in depth at some other point), some of which can have indirect, long-term effects on pageviews, but on most of these dimensions Wikipedia hasn't declined in the last few years (though I think it has grown more slowly than it would with a less dysfunctional mod culture, and arguably too slowly to keep pace with the competition).
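The two proxies are cheap to compute from any page dump. A minimal sketch, with a hypothetical snapshot standing in for real Wikipedia dump data:

```python
def quantity_proxies(pages):
    """Compute the two quality proxies described above:
    (a) existence -> number of pages, (b) coverage -> mean page length."""
    count = len(pages)
    lengths = [len(text) for text in pages.values()]
    mean_len = sum(lengths) / count if count else 0.0
    return count, mean_len

# Hypothetical snapshot: title -> page text (character count stands in
# for "amount of stuff on the page").
snapshot = {"Malaria": "x" * 120000, "Bednet": "x" * 8000, "ITN": "x" * 3000}
n_pages, avg_len = quantity_proxies(snapshot)
```

Comparing these two numbers across yearly snapshots is the kind of check that underlies the "hasn't declined" claim.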
Great point. As somebody who has been in the crosshairs of Wikipedia mods (see ANI) my bias would push me to agree :). However, despite what I see as problems with Wikipedia mod culture, it remains true that Wikipedia has grown quite a bit, both in number of articles and in length of already existing articles, over the time period when pageviews declined. I suspect the culture is probably a factor in that it represents an opportunity cost: a better culture might have led to an (even) better Wikipedia that would not have declined in pageviews so much, but I don't think the mod culture led to a quality decline per se. In other words, I don't think the mechanism "counterproductive mod culture -> quality decline -> pageview decline" is plausible.