Hello, I am asking for some insights for research I am doing. Can you cite examples of technologies that have been forgotten? What I mean by "forgotten" is not things we no longer know how to do but once did (I suspect there aren't that many), nor things that used to be in use but no longer are (mechanical television), but things that were decently developed (either in theory or in practice) yet never "saw the light of day" anyway.

It's my first time posting, so I won't do much policing on the answers, thanks in advance.


8 Answers

quanticle

I would argue that spaced repetition is one such technology. We've known about forgetting curves and spaced repetition as a way of efficiently memorizing data since at least the '60s, if not before. Yet even today it's hardly used, and if you talk to the average person about spaced repetition, they won't have a clue what you're referring to.

Here we have a really cool technology, which could significantly improve how we learn new information, and it's being used maybe 5% as often as it should be.

It really needs a personal computer to schedule the repetitions, and we're only now getting to the point where every schoolchild having their own handheld computer is a somewhat practical proposition.
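
For concreteness, here is a minimal sketch of the kind of scheduler involved, loosely based on the SM-2 algorithm that SuperMemo and Anki descend from. The constants and quality scale are illustrative simplifications, not the exact published algorithm:

```python
# A simplified SM-2-style spaced-repetition scheduler (illustrative only).
from dataclasses import dataclass

@dataclass
class Card:
    interval: float = 1.0  # days until the next review
    ease: float = 2.5      # growth factor, adjusted by recall quality

def review(card: Card, quality: int) -> Card:
    """quality: 0 (total blank) .. 5 (perfect recall)."""
    if quality < 3:
        card.interval = 1.0                  # lapse: relearn from the start
    else:
        # Harder recalls shrink the ease factor, easier ones grow it.
        card.ease = max(1.3, card.ease + 0.1 - (5 - quality) * 0.08)
        card.interval *= card.ease           # gaps grow geometrically
    return card
```

Nothing here is computationally hard; the problem is purely logistical. Tracking thousands of per-card schedules is exactly the bookkeeping humans are bad at and computers are trivially good at.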

gilch
The Pimsleur series of language courses are just audio, and they use spaced repetition (among other research-backed techniques) without a computer. They've got an app now, but the original tapes would work on a Walkman. You're supposed to do one lesson per day. They've scheduled the material to bring vocabulary words up when you're about to forget them.
rsaarelm
Still worse than a computer, since the tapes can't take feedback on which words you've learned better. It only works if your learning rates for different words are what the tape maker expected. It also won't work for the late stage of spaced repetition, where a well-practiced card might pop up a year after it was last reviewed; the long-lived cards are going to be a very eclectic mix. Then again, school courses usually don't expect you to retain the material past the duration of the course, so this isn't that much of a shortcoming for education.

Ishaan

There's a large class of viable pharmaceuticals that don't see the light of day because they are unpatentable, which leads companies not to fund the clinical trials that would be necessary to clear regulatory approval.

Could you cite any? Or at least point me at some research/source on the subject?

gilch

The Smalltalk programming language and environment was revolutionary at the time and remains highly influential to this day. Lots of later languages have copied some of its features, but none of them really got it right.

The grammar is extremely simple and easy to pick up compared to most industry languages. A famous small program demonstrating all of the language (but not the library) fits on a postcard.

Using the debugger, you can catch an exception, walk up the stack, and correct, recompile, and swap in individual methods while the program is still running. You can save the entire state of the program at any time and resume it later, even on another machine. In almost any other language, you would need an entire OS in a VM to do this.

The tight feedback loop you get from its interactive programming style is arguably superior to almost anything else we have today. Python or ClojureScript, for example, can approach this level of interactivity, but it isn't their default.
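
For comparison, the closest a stock Python workflow gets is something like the sketch below, where `calc` is a hypothetical module under development. Note that, unlike Smalltalk, Python cannot resume the suspended computation in place; it can only inspect the dead stack and re-run from the top:

```python
# Rough Python approximation of Smalltalk's edit-and-continue loop.
# `calc` is a hypothetical module being developed; Smalltalk does all of
# this natively, per-method, without losing the running program's state.
import importlib
import pdb

import calc

try:
    calc.run()
except Exception:
    pdb.post_mortem()                  # walk up the stack of the failure
    input("Edit calc.py, then press Enter to retry... ")
    importlib.reload(calc)             # swap in the corrected definitions
    calc.run()                         # re-run from the top, not resume
```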

Smalltalk's first stable release was in 1980 and we still haven't caught up to its level in industry. It's hard to understand exactly how this happened historically, but it seems to be path dependence based on some combination of (relatively) poor marketing, early mistakes in design, and the limitations of older hardware that could barely handle those features when the industry was first taking off.

But there are open-source Smalltalks now, most deriving from Squeak. Pharo, Dolphin, and Cuis are notable. There is even a VM written in JavaScript so you can try it in your web browser.

gilch

Aerospike engines are supposed to be much more fuel-efficient in atmosphere than conventional bell nozzles.
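
The intuition is visible in the standard thrust equation, F = mdot*v_e + (p_e - p_a)*A_e: a bell nozzle fixes its exit pressure p_e at one design altitude, so it loses thrust whenever the ambient pressure p_a differs, while an aerospike's open plume adjusts to p_a continuously. A toy calculation (the numbers are my illustrative guesses, not from this answer):

```python
# Thrust of a fixed bell nozzle at different ambient pressures.
# F = mdot * v_e + (p_e - p_a) * A_e; all numbers are illustrative.
mdot, v_e = 300.0, 3000.0      # propellant flow (kg/s), exhaust velocity (m/s)
p_e, A_e = 70_000.0, 0.9       # fixed exit pressure (Pa) and exit area (m^2)

for p_a in (101_325.0, 30_000.0, 0.0):   # sea level, ~9 km, vacuum
    thrust = mdot * v_e + (p_e - p_a) * A_e
    print(f"ambient {p_a:>9.0f} Pa -> thrust {thrust / 1000:.0f} kN")
```

An aerospike effectively keeps the exit pressure matched to ambient all the way up, recovering those off-design losses; in this toy example the penalty is only a few percent, which helps explain why nobody has risked a redesign to capture it.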

There are good reasons "rocket science" has become a synonym for "difficult". Nobody wants to take a chance on unproven technology when designing rockets is already hard enough. Not even Elon Musk, at least so far.

Polytopos

Digital knowledge-management tools envisioned in the 1950s and '60s, such as Douglas Engelbart's hyperdocument system, have not been fully implemented (to my knowledge) and certainly not widely embraced. The World Wide Web failed to implement key features from Engelbart's proposal, such as the ability to directly address arbitrary sub-documents, or the ability to live-embed a sub-document inside another document.

Similarly, both Engelbart and Ted Nelson emphasized the importance of hyperlinks being bidirectional, so that a link is browsable from both the source and the target document. In other words, you could look at any webpage and immediately see all the pages that link to it. However, Tim Berners-Lee chose to make web hyperlinks one-directional, from source to target, and we are still stuck with that limitation today. Google's PageRank algorithm gets around this by massively crawling the web and then tracing the back-links through the network, but back-links could have been built into the web as a basic feature available to everybody.
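
The missing feature is easy to state in code: a backlink index is just the forward-link graph inverted. A minimal sketch, with hypothetical page names:

```python
# Invert a forward-link graph into the backlink index the web never
# provided natively. Page names are hypothetical.
from collections import defaultdict

forward_links = {
    "a.html": ["b.html", "c.html"],
    "b.html": ["c.html"],
    "c.html": ["a.html"],
}

backlinks = defaultdict(list)
for source, targets in forward_links.items():
    for target in targets:
        backlinks[target].append(source)

print(backlinks["c.html"])  # ['a.html', 'b.html']
```

The inversion is trivial when one party can see the whole graph at once; the web's problem is that nobody holds all the forward links, which is why it took a crawler on Google's scale to reconstruct them.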

https://www.dougengelbart.org/content/view/156/88/

Ericf

Depending on your sensitivity filter, there are over 300,000 US patents, of which perhaps 10% have been incorporated into a commercially successful product. https://www.uspto.gov/web/offices/ac/ido/oeip/taf/h_counts.htm

A Ray

I think commercial applications of nuclear fission sources are another good example.

Through the 1940s, there were lots of industrial processes and commercial products that used nuclear fission or nuclear materials in some way. Beta sources are good supplies of high-energy electrons (used in a bunch of polymer processes, among other things), and alpha sources are good supplies of positively charged nuclei (used in electrostatic discharge and some sensing applications).

I think one of the big turning points was the Atomic Energy Act, in the US, though international agreements might also be important factors here.

The world seems to have collectively agreed that nuclear risks are high, and we seem to have chosen to restrict proliferation (by regulating production and sale of nuclear materials) -- and as a side effect have "forgotten" the consumer nuclear technology industry.

I am interested in this because it's also an example where we seem to have collectively chosen to stifle/prevent innovation in an area of technology to reduce downside risk (dirty bombs and other nuclear attacks).

I think the EBR-II reactor was a notable example. The government cut funding three years before the completion of the program. Its design is what we now call an "integral fast reactor" (IFR). Its passive safety features demonstrated that it literally cannot melt down, and an IFR design would also produce much less waste than a conventional light-water reactor.

A Ray

I think Google Wave/Apache Wave is a good candidate here, at least for the crowd familiar with it.

Designed to be a new modality of digital communication, it combined features of email, messengers/chat, collaborative document editing, etc.

It got a ton of excitement from a niche crowd while it was in a closed beta.

It never got off the ground, though; less than a year after the beta ended, it was slowly wound down and eventually handed over to Apache.

9 comments

Oh, here's another one: Lisp machines. These were computers with alternative chip designs focused on executing Lisp (or really any functional programming language) rather than procedural code. Had the direction been pursued further, it might have resulted in dramatically different computer architectures from what we use today. Some were built and used, but only in very limited contexts, so I'd say this meets the criterion of "never saw the light of day" in that fewer than 10,000 Lisp machines were ever built.

One that comes to my mind is OpenDoc, a cool and exciting proposal for a way to make editable, generic computer documents that were not application-constrained. The idea was to make documents a cross-platform, operating-system-level responsibility; what we today think of as applications would instead be embedded viewers/editors used when putting different types of "objects" into documents.

We did eventually get something like it: Google Docs, Word, and even web pages generally can embed all kinds of other documents, sometimes with viewing/editing support within the document (you can see images, embed editable spreadsheets, embed editable diagrams, etc.), but with more vendor lock-in, and missing the spirit of vendor openness that OpenDoc intended.

In the same vein as OpenDoc, XMPP and RSS both come to mind. While they "saw the light of day", they never seemed to reach the threshold of popularity necessary for long-term survival, and they're not well supported any more. I would argue that they're both good examples of "left-behind" tech.

Plan 9 from Bell Labs comes to my mind (papers & man pages). By the creators of Unix: tight integration of networks (better than any other system I have seen so far), UTF-8 all the way down, and an interesting concept of per-process inherited namespaces.
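
The namespace idea is worth unpacking: every process gets its own mutable view of the file tree, inherited from its parent, so mounting a remote resource is a cheap, local, per-process operation. A toy model of just the inheritance semantics (not the actual 9P protocol):

```python
# Toy model of Plan 9-style per-process namespaces: a child inherits its
# parent's bindings and can rebind paths without affecting anyone else.
class Process:
    def __init__(self, parent=None):
        # copy-on-fork: the child starts with the parent's view
        self.namespace = dict(parent.namespace) if parent else {}

    def bind(self, path, resource):
        self.namespace[path] = resource  # visible only to this process

parent = Process()
parent.bind("/dev/cons", "local console")

child = Process(parent)
child.bind("/dev/cons", "window on a remote terminal")

print(parent.namespace["/dev/cons"])  # 'local console', unchanged
print(child.namespace["/dev/cons"])   # 'window on a remote terminal'
```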

It used up way too many weirdness points, though, and was fighting the old Worse is Better fight. It lost, and we are left with ugly and crufty unices today.

Another one that comes to mind is Project Xanadu. It was quite similar to the modern web, but a lot more polished and clean in design and concept. It probably failed because of a really late delivery and because it was too slow for the hardware of the time.

I guess that's mostly the problem: ambitious projects use up a lot of weirdness points, and then fail to gain enough traction.

A project that will probably fall into the same category is Urbit. If you know a bit of computer science, the whitepaper is just pure delight; after page 20, I completely lost track. It's fallen victim to weirdness hyperinflation. It looks clean and sane, but I assign ~98% probability that its network will never have more than 50,000 users over the span of one month.

+1 Plan 9.

I think it (weirdly) occupies a strange place with respect to the "forgotten" criterion, in that pieces of it keep getting rediscovered (sometimes multiple times).

I got to work with some of the Plan 9 folks, and they would point out (with citations) when highly regarded OSDI papers described things that had been built (and published) in Plan 9, sometimes 10-20 years prior.

One form of this "forgotten" tech is tech that we keep forgetting and rediscovering, but:

  1. maybe this isn't the type of forget the original question is about, and
  2. possibly academia itself incentivizes this (since if a good idea can be re-used, it yields more than one paper, which is good for grad students and labs that need publications)
Elo

Bone-conduction headphones, though they are still alive and coming back into production (and I would recommend them).

E-cigarettes nearly died because the person who first patented them could not monetise them (I believe); then the patent ran out and people started manufacturing them.

There are lots of devices in TV advertising, like the Slap Chop and steam mops, that seem novel and useful but don't seem to be mainstream.

https://en.wikipedia.org/wiki/Damascus_steel

Could you clarify how Damascus Steel qualifies? As I understand it, the question is asking about technologies which demonstrated promise, but never reached widespread use, and thus languished in obscurity. Damascus Steel was famous and highly prized in medieval Europe. While it was rare and expensive, I'm not sure that it manages to meet the obscurity criterion.