Would it be useful to have a term, analogous to 'hardware', 'software', 'wetware', 'vaporware', etc.[1], which could be used to distinguish learned/discovered components of software, like gradient-trained DNNs, prompt-hacked LLMs, etc.?

EDIT 2024-01-04: my current favourites are 'ML-ware' (HT Shankar), 'fuzzware' (me), and 'hunchware' (Claude), in that order; LW votes concur with 'ML-ware'.

In a lot of conversations with nonexperts, I find that the general notion of AI as being 'programmed' apparently still has a surprisingly strong grip, even after the rise of ML and DL made it even clearer that this is an unhelpful anchor to have. Thane recently expressed a similar view, quite strongly.

David Manheim has a short take, AI is not software, which I think nicely encapsulates some of the important distinctions.

The important thing, for me, is that, in contrast to traditional software, nobody wrote it, the specification is informal at best, and we can't (currently) explain why or how it works. Traditionally, software is 'data you can run', and that data was crafted (substantially) by deliberate human design.
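To make that contrast concrete, here's a toy sketch of my own (nothing canonical about it; the spam features and labels are invented for illustration): the hand-written rule is its own specification, while the fitted model's behaviour lives in parameters nobody wrote. A logistic regression is about the simplest 'learned' thing going, and, as I note below, maybe too simple to deserve the new term, but it shows the shape of the distinction.

```python
# Toy illustration only: the same crude "is this spam?" behaviour, once as
# written software and once as a learned component. The rule below *is*
# its own specification; the fitted model's behaviour is implicit in the
# (made-up) training data and the optimiser that fit it.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Written: a human chose these rules and can explain every branch.
def spam_rule(subject: str) -> bool:
    return "free money" in subject.lower() or subject.isupper()

# Learned: nobody wrote the decision boundary; it was found by fitting.
X = np.array([[3, 1], [0, 0], [5, 2], [1, 0]])  # invented features: exclamation marks, flagged phrases
y = np.array([1, 0, 1, 0])                      # invented labels: spam / not spam
model = LogisticRegression().fit(X, y)

print(spam_rule("FREE MONEY NOW"))        # behaviour traceable to the rule above
print(model.predict(np.array([[4, 2]])))  # behaviour traceable only to data + fitting procedure
```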

A valid answer to this question is, 'no, we do not need such a term, just say, "learned components of software" or similar'.

In practice, we probably wouldn't apply this term to, say, a logistic regression, but maybe?

Some ideas, none of which I like enough yet:

  • netware (seems too NN-specific; also evokes networking which is the wrong concept)
  • dryware (like wetware but... dry)
  • neuroware (too NN-specific; also evokes bio neuro - maybe that's fine)
  • infoware (sounds like just any data though)
  • learnedware/learnware
  • trainware
  • emergeware
  • adaptware
  • implicitware
  • paraware
  • evoware
  • foundware
  • guessware
  • fuzzware
  • noware
  • everyware
  • anyware
  • selfaware
  • please-beware

After a bit of back-and-forth, Claude managed to produce a few which I think are OK, but I'm not very sold on these either:

  • fogware
  • cloudware
  • enigware
  • blurware
  • darkware
  • specware
  • inferware
  • luckware
  • chanceware
  • hunchware

  1. For some illuminating compendia of -ware terms, see wiktionary, computerhope ware jargon, Everyware from rdrop, or gears' shortlist of suggestions. Notably, almost all of these are really semantically <thing>-[soft]ware with the 'soft' elided, e.g. 'spyware' really means 'spy-software'. ↩︎


9 Answers

Shankar Sivarajan

Jan 03, 2024


Why not just "ML-ware"?

It's not specific to neural networks, and it corresponds closely to what most people would refer to as "AI" today, but explicitly excludes handcrafted algorithms. The resemblance to "malware" is serendipitous.

This is simple but surprisingly good, for the reasons you said. It's also easy to say and write. Along with fuzz- and hunch-, this is my favourite candidate so far.

Odd anon

Jan 04, 2024


Brainware.

Brains seem like the closest metaphor one could have for these. Lizards, insects, goldfish, and humans all have brains. We don't know how they work. They can be intelligent, but are not necessarily so. They have opaque convoluted processes inside which are not random, but often have unexpected results. They are not built, they are grown.

They're often quite effective at accomplishing something that would be difficult to do any other way. Their structure is based around neurons of some sort. Input, mystery processes, output. They're "mushy" and don't have clear lines, so much of their insides blur together.

AI companies are growing brainware at larger and larger scales, raising more powerful brainware. Want to understand why the chatbot did something? Try some new techniques for probing its brainware.

This term might make the topic feel more mysterious/magical to some than it otherwise would, which is usually something to avoid when developing terminology, but in this case, people have been treating something mysterious as not mysterious.

I wasn't keen on this at first, but your justification updated me a bit. I think the most important distinction is indeed the 'grown/evolved/trained/found, not crafted' one, and 'brainware' didn't immediately evoke that for me. But you're right: brains are inherently grown, they're very diverse, we can probe them but don't always/ever grok them (yet), their structure is somewhat visible, somewhat opaque, they fit into a larger computational chassis but adapt to their harness somewhat, properties and abilities can be elicited by unexpected inputs, they exhibit various kinds of learning on various timescales, ...

Oliver Sourbut (4mo)
Incidentally, I noticed Yudkowsky uses 'brainware' in a few places (e.g. in conversation with Paul Christiano). But it looks like that's referring to something more analogous to 'architecture and learning algorithms', which I'd put more in the 'software' camp when it comes to the taxonomy I'm pointing at (the 'outer designer' is writing it deliberately).

Shiroe

Jan 03, 2024


"tensorware" sprang to mind

This one independently sprang to mind for me too.

Oliver Sourbut (4mo)
This is nice in its way, and has something going for it, but to me it's far too specific, while also missing the 'how we got this thing' aspect which (I think) is the main reason to emphasise the difference through terminology.

because the goal here is to have a word that people skeptical of the "lifeyness" or "brainyness" of ai will accept to understand that it's not normal software, I really like "moldware" and will be using it until something sticks better. it nicely describes the general nature of function approximators without getting into the weeds of why or how, or claiming function approximators have inherent lifeyness. it also feels like the right amount of decrease in "firmness" after software.

more candidates to reject from, a few favorite picks from asking an llm to dump many suggestions: fit-; contour-; match-; mirror-; conform-; mimic-; map-; cast-; imprint-;

Mold like fungus or mold like sculpt? I like this a bit, and I can imagine it might... grow on me. (yeuch)

Mold-as-in-sculpt has the benefit that it encompasses weirder stuff like prompt-wrangled and scaffolded systems, and also kinda large-scale GOFAI-like things à la MCTS and whatnot.

Mikhail Samin

Jan 04, 2024


Groware/grownware? (Because it’s “grown”, as it’s now popular to describe)

drossbucket

Jan 03, 2024


Oozeware?

faul_sname

Jan 03, 2024

  • Gradientware? Seems verbose, and isn't robust to other ML approaches to fitting data.
  • Datagenicware? Captures the core of what makes them like that, but it's a mouthful.
  • Modelware? I don't love it.
  • Puttyware? Aims to capture the "takes the shape of its surroundings" aspect, though it might be too abstract. Also implies that it will take the shape of its current surroundings, rather than the ones it was built with.
  • Resinware - maybe more evocative of the "was fit very closely to its particular surroundings" idea, but still doesn't seem to capture quite what I want.

I don't really like any of those ideas. I think it's really interesting that 'aware' is so closely related, though. I think the best bet would be based on 'software'. So something like deepsoftware, nextsoftware, nextgenerationsoftware, enhancedsoftware, etc.

Nathaniel Monson

Jan 03, 2024


I like "evolveware" myself.

it's distinctly not evolved. gradients vs selection-crossover-mutate are very different algos.

Nathaniel Monson (4mo)
I agree in the narrow sense that it's different from bio-evolution, but I think it captures something tonally correct anyway.
the gears to ascension (4mo)
this has been an ongoing point of debate recently, and I think we can do much better than incorrect analogy to evolution.
Oliver Sourbut (4mo)
I hate to wheel this out again, but evolution-broadly-construed is actually a very close fit for gradient methods. Agreed, there's a whole lot of specifics in biological natural selection, and a whole lot of specifics in gradient-methods-as-practiced, but they are quite akin really.
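To gesture at the structural kinship, here's a toy sketch (purely illustrative, mine, not from the linked post): both loops below are 'perturb the parameters, keep what does better on the objective'; they differ in how the perturbation is chosen, not in the overall shape of the search.

```python
# Toy illustration only: minimising the same stand-in objective by
# (a) gradient descent and (b) a mutate-and-select loop. Both are
# iterated local search over the same parameter vector.
import numpy as np

rng = np.random.default_rng(0)

def loss(w):
    return float(np.sum((w - 3.0) ** 2))  # stand-in objective

def grad(w):
    return 2.0 * (w - 3.0)  # analytic gradient of the stand-in objective

# (a) Gradient descent: the perturbation is the (negative) local slope.
w = np.zeros(4)
for _ in range(200):
    w = w - 0.05 * grad(w)
print("gradient descent:", loss(w))

# (b) (1+1)-style evolution: the perturbation is random, kept only if it helps.
v = np.zeros(4)
for _ in range(2000):
    child = v + rng.normal(scale=0.1, size=v.shape)
    if loss(child) < loss(v):
        v = child
print("mutate-and-select:", loss(v))

# Both end up near the minimum; the loop structure is the same shape.
```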
the gears to ascension (4mo)
please wheel such things out every time they seem relevant until such time as someone finds a strong argument not to, people underrecommend sturdy work imo. in this case, I think the top comment on that post raises some issues with it that I'd like to see resolved before I'd feel like I could rely on it to be a sturdy generalization. but I appreciate the attempt.
Oliver Sourbut (4mo)
Separately, I'm not a fan of 'evolveware' or 'evoware' in particular, though I can't put my finger on exactly why. Possibly it's because of a connotation of ongoing evolution, which is sorta true in some cases but could be misleading as a signifier. Though the same criticism could be levelled against 'ML-ware', which I like more.
6 comments
Ann (4mo)

Nebulaware ...

Hardware / software is a contrast between 'the physical object computer' and 'not the physical object computer' ... I do think that models are certainly 'not the physical object computer', and what we are actually distinguishing them from is 'programs'.

'Pro-graphein' etymology is 'before-write'. If we look for Greek or Latin roots that are instead something like 'after-write', in a similar contrast (we wrote the program to do the planned thing, we do the <x> to write the unplanned thing), we get options like 'metagram', 'postgram' ... unfortunately clashing with the Instagram wordspace ... or 'postgraph'.

(Existing actual words with similar etymology to what we're looking for with this approach: Epigram, epigraph, metagraph - which arguably is weirdly close in meaning to what we want but would be confusing to override.)

Looking instead to 'code', going back to codex, caudex (tree trunk/stem)... this kind of still works, but let's go for a similar word - folium, folio ...

Alternately 'ramus'/'rami', branch, leading to 'ramification', seems a promising direction in a semantic sense. It has a lot of association with not explicitly planned developments and results. ('Ramagram' is kind of a silly possible word in English though. Then again, a lot of the AI development space has silly words.)

... More a starting point of ideas here than actually having dug up too many good-sounding words.

Ann (4mo)

Going a step further into the etymology of 'program', it comes to mean 'write publicly' or 'written notice', which we could also contrast with roots meaning something else, like 'idi-' from 'idios' for 'private, personal, one's own', or in fact 'privus' itself. (Again, we need to keep clear of actual existing words like 'idiogram'.)

Nice! 'Idioware'? Risks sounding like 'idiotware'...

Ann (4mo)

'Idiomware'? Since idioms are expressions with a meaning that can't be deciphered from the individual words used, and AI models are data with a function that can't be easily deciphered from the specific code used?

@the gears to ascension, could you elaborate on what the ~25% react on 'hardware' in

Would it be useful to have a term, analogous to 'hardware', ...

means? Is it responding to the whole sentence, 'Would it be useful to have...?' or some other proposition?

that was due to a bug in how lesswrong figures out what text a recorded react applies to. I'm not sure which react that was supposed to be, but my reacts weren't valuable enough, so I simply removed them.