From the LA Review of Books, excellent read.
A few excerpts:
He pointed to the evolution of the very word “technology” from the 19th to the mid-20th century as particularly revealing in this regard; he argued that the meaning of the word had morphed from “something relatively precise, limited and unimportant to something vague, expansive and highly significant,” laden with both utopic and dystopic import. The word had become “amorphous in the extreme,” a site of semantic confusion — surely a sign, he concluded, that the languages of ordinary life as well as those of the social sciences had “[failed] to keep pace with the reality that needs to be discussed.”
YES! this:
He’s onto something fundamental that’s worth exploring further: scientific knowledge and machines are never just neutral instruments. They embody, express, and naturalize specific cultures — and shape how we live according to the assumptions and priorities of those cultures.
There is another important reason why the algorithm-as-doer is misleading: it conceals the design process of the algorithm, and therefore the human intentions and material conditions that shaped it.
The algorithms considered in these discussions usually use datasets to produce classifications. Burrell’s “opacity” refers to the fact that an output of this sort rarely includes a concrete sense of the original dataset, or of how a given classification was crafted. Opacity can be the outcome of a deliberate choice to hide information.
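To make the point concrete, here is a minimal sketch of what that opacity looks like in practice. The loan-screening framing, the toy data, and the use of scikit-learn are all my own assumptions for illustration, not anything from the article; the point is simply that the output of a trained classifier is a bare label that carries no trace of the dataset or of how the decision was crafted.

```python
# Minimal sketch of "opacity" in Burrell's sense: a trained classifier
# emits a label, but the output alone reveals nothing about the training
# data or about how the decision boundary was drawn.
# (scikit-learn and the loan-screening data are assumptions for illustration.)
from sklearn.ensemble import RandomForestClassifier

# Hypothetical screening data the affected person never sees: [age, income].
X_train = [[35, 52000], [22, 18000], [48, 91000], [29, 24000]]
y_train = [1, 0, 1, 0]  # 1 = approved, 0 = denied

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# All that comes out the other end is the classification itself.
print(model.predict([[31, 30000]]))  # e.g. [0] -- no dataset, no rationale
```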
Link: