I found this interesting syndicated article, "Algorithms may echo human bias, study finds", in Today (14 July 2015, page 36).
Basically, it says that algorithms are ultimately created by humans, and so carry human influences and biases along with them. In other words, data analytics algorithms are merely human attempts to model a scenario mathematically with the help of very large amounts of data.
For instance, by applying graph theory to a social network platform, we can assign weights to the links to friends who share common interests with us, and from those weights work out who our closest friends are, who our best friends are, or even who our spouse is. In plain language, we are looking for 'birds of a feather that flock together'. A rough sketch of this idea follows.
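As a toy illustration (my own sketch, not anything from the article), here is one way such a weighting could work in Python, using the Jaccard similarity of two users' interest sets as the link weight. All the names and data are invented for the example:

    # A minimal sketch with invented data: weight each friendship link by
    # the Jaccard similarity of the two users' interest sets, then rank
    # friends by that weight. Higher weight = more 'birds of a feather'.
    interests = {
        "alice": {"hiking", "jazz", "cooking"},
        "bob":   {"hiking", "jazz", "photography"},
        "carol": {"football", "cooking"},
    }
    friends_of = {"alice": ["bob", "carol"]}

    def link_weight(u, v):
        a, b = interests[u], interests[v]
        return len(a & b) / len(a | b)    # Jaccard similarity, in [0, 1]

    def closest_friends(u):
        # Rank u's friends by how strongly their interests overlap with u's.
        return sorted(friends_of[u], key=lambda v: link_weight(u, v), reverse=True)

    print(closest_friends("alice"))   # ['bob', 'carol']: bob shares more interests

A real platform would of course use far richer signals (messages, likes, shared groups), but the idea of ranking by edge weight is the same.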
There is also an algorithm that detects expense claim fraud by analysing the first digit of each expense claim item. If only a few digit values are used, and used very repeatedly, the claimant is flagged for further investigation. This is probably based on the tendency of human beings not to think of broad ranges of numbers when cheating. A sketch of such a check is below.
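For illustration only (again my own sketch, not the actual algorithm the article alludes to), a crude version of that check could count the distinct leading digits across a claimant's items and flag anyone whose amounts cluster on too few of them. Benford's law says that in many naturally occurring data sets the leading digit d appears with probability log10(1 + 1/d), so genuine claims tend to spread across many digits; the threshold below is arbitrary:

    from collections import Counter

    # A minimal sketch with invented data: flag a claimant whose expense
    # amounts use suspiciously few distinct leading digits.
    def first_digit(amount):
        # Strip leading zeros and the decimal point, e.g. 0.37 -> '37' -> 3
        return int(f"{amount:.2f}".lstrip("0.")[0])

    def looks_suspicious(amounts, min_distinct=4):
        digits = Counter(first_digit(a) for a in amounts)
        # Very few distinct leading digits, each used repeatedly -> flag.
        return len(digits) < min_distinct

    claims = [49.90, 45.00, 48.20, 44.75, 47.60, 49.10]  # every item starts with 4
    print(looks_suspicious(claims))   # True: flag for further investigation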
I trust that algorithms for data analytics have a symbiotic relationship with human psychology. So it pays to observe patterns of human thinking through the data they manifest. Maybe some old proverbs can offer inspiration.
--------------------------------------------------------------------
Algorithms may echo human bias, study finds
NEW YORK — There is a widespread belief that software and algorithms that rely on data are objective. But software is not free of human influence. Algorithms are written and maintained by people, and machine-learning algorithms adjust what they do based on people’s behaviour. As a result, algorithms can reinforce human prejudices, researchers say.

A new study by Carnegie Mellon University researchers revealed that Google’s online advertising system showed an ad for high-income jobs to men much more often than women.

Research from the University of Washington also found that a Google Images search for “CEO” produced 11 per cent women, even though 27 per cent of chief executives in the United States are women.

Algorithms, which are instructions written by programmers, are often described as a black box; it is hard to know why websites produce certain results. Often, algorithms and online results reflect people’s attitudes and behaviour. The autocomplete feature on Google is an example — a recent search for “Are transgender” suggested, “Are transgenders going to hell”.

“Even if they are not designed with the intent of discriminating against those groups, if they reproduce social preferences even in a completely rational way, they also reproduce those forms of discrimination,” said Mr David Oppenheimer, who teaches discrimination law at the University of California, Berkeley.

The Carnegie Mellon researchers built a tool to simulate Google users who started with no search history, and then visited employment websites. Later, on a third-party news site, Google showed an ad for a career-coaching service advertising “US$200k+” executive positions 1,852 times to men and 318 times to women. The reason for the difference is unclear. It could have been that the advertiser requested that the ads be targeted towards men, or that the algorithm determined that men were more likely to click on the ads.

Google declined to say how the ad showed up, but said: “Advertisers can choose to target the audience they want to reach, and we have policies that guide the type of interest-based ads that are allowed.”

THE NEW YORK TIMES