Ontology

Zach Stein-Perlman

Ontology is a branch of philosophy. It is concerned with questions including: how can objects be grouped into categories?

An ontology is an answer to that question. An ontology is a collection of sets of objects[1] (or: a collection of sets of points in thingspace). An agent's ontology determines the abstractions it makes.

For example, consider Alice and Bob, two normal English-fluent humans. "Chairs"_Alice is in Alice's ontology; it is (or points to) a set of (possible-)objects (namely, what she considers chairs) that she bundles together. "Chairs"_Bob is in Bob's ontology, and it is a very similar set of objects (what he considers chairs). This overlap makes it easy for Alice to communicate with Bob and to predict how he will make sense of the world.

Why care? Most humans seem to have similar ontologies, but AI systems might have very different ontologies, which could cause surprising behavior. E.g. the panda-gibbon thing. Roughly, if the shared human ontology isn't natural (i.e., learned by default) and is moreover hard to teach an AI, then that AI won't think in terms of the same concepts as humans, which might be bad. See Ontology identification problem; also misgeneralization of concepts or goals.

  1. ^

    "Objects" means possible objects, not objects that really exist.

    An ontology can also include an account of other kinds-of-stuff, like properties, relations, events, substances, and states-of-affairs.
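The definition above (an ontology as a collection of sets of possible objects) can be sketched in code. This is only an illustration, not anything from the literature: the concept names and object IDs below are invented, and Jaccard similarity is just one simple way to quantify how much two agents' concepts overlap.

```python
# Sketch: an ontology as a mapping from concept names to sets of
# (possible) objects. All names and object IDs are hypothetical.

alice = {
    "chairs": frozenset({"armchair", "stool", "office_chair", "beanbag"}),
    "tables": frozenset({"desk", "dining_table"}),
}

bob = {
    "chairs": frozenset({"armchair", "stool", "office_chair", "bench"}),
    "tables": frozenset({"desk", "dining_table", "coffee_table"}),
}

def overlap(a: frozenset, b: frozenset) -> float:
    """Jaccard similarity: |intersection| / |union| of two concepts."""
    union = a | b
    return len(a & b) / len(union) if union else 1.0

# Alice's and Bob's "chairs" concepts mostly coincide (3 shared objects
# out of 5 total), which is what makes communication between them easy.
print(overlap(alice["chairs"], bob["chairs"]))
```

An AI with a very different ontology would score near zero on this kind of comparison for most human concepts, which is one way to picture the communication problem the article describes.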
