Here on Privacy News Online we’ve written a number of stories about the privacy implications of DNA. There’s an important case going through the California courts at the moment that involves DNA and privacy, but whose ramifications extend far beyond those issues:
“In this case, a defendant was linked to a series of rapes by a DNA matching software program called TrueAllele. The defendant wants to examine how TrueAllele takes in a DNA sample and analyzes potential matches, as part of his challenge to the prosecution’s evidence. However, prosecutors and the manufacturers of TrueAllele’s software argue that the source code is a trade secret, and therefore should not be disclosed to anyone.”
The Electronic Frontier Foundation (EFF) points out that there are two big problems here. One is the basic right of somebody accused of a crime to examine and challenge the evidence being used against them. In this case, that’s not possible, because the manufacturer of the TrueAllele software is unwilling to release the source code that determines whether or not there is a DNA match. Particularly egregious is the fact that the company is claiming that its right to maintain a supposed trade secret outweighs the accused’s right to a fair trial.
But beyond that issue, there is another that is certain to have a big impact on the world of privacy. It involves the increasing use of algorithms to make judgements about us. An algorithm is just a fancy way of saying a set of rules, usually implemented in software that encodes mathematical formulas. The refusal by TrueAllele’s manufacturer is therefore a refusal to permit the accused in the California case to examine and challenge the algorithmic rules that are being applied to him.
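To make that concrete, here is a deliberately simplified sketch, in Python, of what an “algorithmic rule” can look like. Everything in it is invented for illustration: the locus names, the all-or-nothing matching rule, and the scoring function bear no relation to TrueAllele’s actual probabilistic genotyping methods, which are precisely what the defendant is being prevented from examining.

```python
# Illustrative toy example only -- NOT TrueAllele's algorithm.
# A DNA profile is modelled (hypothetically) as a mapping from a
# genetic locus name to the pair of allele values observed there.

def match_score(evidence, suspect):
    """Return the fraction of shared loci at which both alleles agree."""
    shared_loci = evidence.keys() & suspect.keys()
    if not shared_loci:
        return 0.0
    matches = sum(
        1
        for locus in shared_loci
        if sorted(evidence[locus]) == sorted(suspect[locus])
    )
    return matches / len(shared_loci)

# Hypothetical profiles: two of three loci agree, one does not.
evidence_profile = {"D3S1358": (15, 17), "vWA": (14, 16), "FGA": (21, 22)}
suspect_profile = {"D3S1358": (15, 17), "vWA": (14, 16), "FGA": (20, 23)}

score = match_score(evidence_profile, suspect_profile)  # 2 of 3 loci match
```

Even in this toy version, the choices baked into the code matter: which loci count, how a “match” at a locus is defined, what score is considered incriminating. Those are exactly the kinds of rules a defendant would want to scrutinise, and exactly what trade-secret claims keep hidden.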
If this position is allowed to