I said on Twitter a while back that much of the discussion about "alignment" seems vacuous. After all, alignment to what? 

  • The designer's intent? Often that is precisely the problem with software: it does exactly and literally what the designer programmed, however shortsighted that may be. Plus, some designers may themselves be malevolent. 
  • Aligned with human values? One of the most universal human values is tribalism, including willingness to oppress or kill the outgroup. 
  • Aligned with "doing good things"? Whose definition of "good"? 

My reaction on Twitter:

This post seems even more relevant and true now, in 2023. 

Here's an interesting example of an unusual governance arrangement:

Ostrom is great, obviously. In fact, I had forgotten how thoroughly I summarized a good bit of that literature in a 2001 piece:

I recently stumbled across a sociology classic of sorts: Paul J. DiMaggio and Walter W. Powell, "The Iron Cage Revisited: Institutional Isomorphism and Collective Rationality in Organizational Fields" (American Sociological Review, 1983).

"We ask...why there is such startling homogeneity of organizational forms and practices." 

The main answer: "We identify three mechanisms through which institutional isomorphic change occurs...: 1) coercive isomorphism that stems from political influence and the problem of legitimacy; 2) mimetic isomorphism resulting from standard responses to uncertainty; and 3) normative isomorphism, associated with professionalization."