Stuart Buck

Comments

I said on Twitter a while back that much of the discussion about "alignment" seems vacuous. After all, alignment to what? 

  • The designer's intent? That is often precisely the problem with software: it does exactly and literally what the designer programmed, however shortsighted that intent turns out to be. And some designers may themselves be malevolent. 
  • Human values? One of the most universal human values is tribalism, including a willingness to oppress or kill the outgroup. 
  • "Doing good things"? Whose definition of "good"? 

My reaction on Twitter: https://twitter.com/stuartbuck1/status/1719152472558555481

This post seems even more relevant and true now, in 2023. 

Here's an interesting example of a fairly unusual governance arrangement: https://issues.org/revisiting-nsf-nsb-science-advice-olds-rosenberg-robichaud/

Ostrom is great, obviously. In fact, I had forgotten how thoroughly I summarized a good bit of that literature in a 2001 piece: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=268744

I recently stumbled across a sociology classic of sorts: Paul J. DiMaggio and Walter W. Powell, "The Iron Cage Revisited: Institutional Isomorphism and Collective Rationality in Organizational Fields" (American Sociological Review, 1983). https://www.uio.no/studier/emner/matnat/ifi/INF9200/v10/readings/papers/DeMaggio.pdf

"We ask...why there is such startling homogeneity of organizational forms and practices." 

The main answer: "We identify three mechanisms through which institutional isomorphic change occurs...: 1) coercive isomorphism that stems from political influence and the problem of legitimacy; 2) mimetic isomorphism resulting from standard responses to uncertainty; and 3) normative isomorphism, associated with professionalization."