Thursday, September 11, 2014

The post-modern politicization of computer science

In an earlier blog post, I wrote about how the recent report from the Executive Office of the President, Big Data: Seizing Opportunities, Preserving Values, promotes the ideology of disparate impact. The report also provides a peculiar, politicized definition of an algorithm:

    In simple terms, an algorithm is defined by a sequence of steps and instructions that can be applied to data. Algorithms generate categories for filtering information, operate on data, look for patterns and relationships, or generally assist in the analysis of information. The steps taken by an algorithm are informed by the author's knowledge, motives, biases, and desired outcomes. The algorithm may not reveal any of those elements, nor may it reveal the probability of a mistaken outcome, arbitrary choice, or the degree of uncertainty in the judgment it produces. So-called "learning algorithms," which underpin everything from recommendation engines to content filters, evolve with the data sets that run through them, assigning different weights to each variable. The final computer-generated product or decision -- used for everything from predicting behavior to denying opportunity -- can mask prejudices while maintaining a patina of scientific objectivity.
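
To see how mundane this actually is, here is a minimal sketch of such a "learning algorithm" (my own illustration, not the report's): a perceptron whose per-variable weights evolve as labeled examples stream through it.

    # Minimal sketch (illustrative only) of a "learning algorithm" that
    # "assigns different weights to each variable" as data runs through it.
    def perceptron_update(weights, features, label, learning_rate=0.1):
        # Predict with the current weights, then learn only from mistakes.
        prediction = 1 if sum(w * x for w, x in zip(weights, features)) > 0 else -1
        if prediction != label:
            weights = [w + learning_rate * label * x
                       for w, x in zip(weights, features)]
        return weights

    # The weights evolve with the data set that runs through them.
    weights = [0.0, 0.0, 0.0]
    for features, label in [([1.0, 0.5, -0.2], 1), ([0.3, -1.0, 0.8], -1)]:
        weights = perceptron_update(weights, features, label)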

The report goes on to say:

    Powerful algorithms can unlock value in the vast troves of information available to businesses, and can help empower consumers, but also raise the potential for encoding discrimination in automated decisions. ... For these reasons, the civil rights community is concerned that such algorithmic decisions raise the specter of "redlining" in the digital economy, the potential to discriminate against the most vulnerable classes in our society under the guise of neutral algorithms.

Thus does the post-modernist poison seep into computer science. Algorithms are "informed by the author's knowledge, motives, biases, and desired outcomes." Under the "guise of neutral algorithms," software "encodes discrimination" and "masks prejudices while maintaining a patina of scientific objectivity." Compare this politically tinged blather with the precise definition of an algorithm provided by Wikipedia:

    In mathematics and computer science, an algorithm is a step-by-step procedure for calculations. Algorithms are used for calculation, data processing, and automated reasoning. An algorithm is an effective method expressed as a finite list of well-defined instructions for calculating a function. Starting from an initial state and initial input (perhaps empty), the instructions describe a computation that, when executed, proceeds through a finite number of well-defined successive states, eventually producing "output" and terminating at a final ending state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomized algorithms, incorporate random input.
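
Note how mechanical this definition is. Euclid's algorithm, for example, is exactly such a finite list of well-defined instructions, and a randomized algorithm differs only in incorporating random input. A short illustration (my own example, not Wikipedia's):

    import random

    def gcd(a, b):
        # Euclid's algorithm: proceeds through well-defined successive
        # states (a, b) and terminates in a finite number of steps.
        while b != 0:
            a, b = b, a % b
        return a

    def estimate_pi(samples=100_000):
        # A randomized algorithm: it incorporates random input, so its
        # state transitions are not deterministic (a Monte Carlo estimate).
        inside = sum(1 for _ in range(samples)
                     if random.random() ** 2 + random.random() ** 2 <= 1.0)
        return 4.0 * inside / samples

    print(gcd(252, 105))  # 21 -- the same output on every run
    print(estimate_pi())  # roughly 3.14159, varying slightly from run to run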

According to post-modernist ideology, it is impossible for any individual to be impartial; there is no objective truth to be discovered; we can never escape our various "identities" (white, black, Latino, Asian); every action we take and every judgment we make is inexorably determined by the prejudices those identities impose on us. And so, when post-modernism looks at computer science, it arrives at a vision in which algorithms can never be objective or impartial, can discover no truths about the world, and merely encode the prejudices of their author (probably a white or Asian male).

It is incumbent on the engineers of Silicon Valley to resist such portrayals of their work. No software engineer worth his salt refuses to fix a bug once it is pointed out to him. But the presumption that software engineers are incapable of rising above their prejudices and write software only to arrive at "desired outcomes" is an insult to the intellectual integrity of the Valley.
