It is people who have prepared this fate for algorithms. Algorithms are capable of discriminatory actions

We associate mathematical equations with a somewhat abstract coolness, but also with a dose of reliable, hard knowledge. It turns out, however, that negative potential may be hidden behind a sequence of numbers and symbols.

What causes applications created for a good cause to reveal their dark side?

There can be several answers. One of the first that comes to mind concerns human nature. It is a well-known mechanism: people rely heavily on stereotypes and prejudices in their lives. These may concern other individuals, social groups or the world of values. The cognitive patterns people produce combine perfectly with a lack of imagination and a reluctance to analyze things reliably. The result is an explosive mixture that generates negative situations.

Someone who blindly trusts data from a computer fails to notice the complexity of a situation and eagerly dispenses with any subjective assessment of the event. And then the "action" begins. Action, that is, a big problem for the parties involved in the event.

Algorithms on duty in the police

Police work is an ideal testing ground for smart technologies. Industry representatives know very well that a useful algorithm can sometimes create problems. But let’s be fair: thanks to intelligent data processing, police computers can effectively combine crimes, historical data and the circumstances of events into categories and collections. The usefulness of applications that can discover relationships between places, people, psychological profiles, the time of a crime and the tools used to commit it is indisputable.

Criminologists and data-processing scientists from the University of Memphis turned to software produced by IBM that was intended for predictive analytics. The project team created an analytical mechanism that takes into account variables such as air temperature, maps of the surroundings, clusters of people, the distribution of shops and restaurants, residents’ preferences and current criminal activity. The algorithms involved were to use these variables to identify potential flashpoints in the city. And it actually worked.
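To make the general idea concrete, here is a minimal sketch of this kind of hotspot scoring. It is not the actual IBM system used in Memphis; every feature, number and threshold below is a hypothetical illustration of a classifier that scores city grid cells and flags the highest-scoring ones for extra patrols.

```python
# A minimal, hypothetical sketch of hotspot scoring - NOT the IBM system
# used in Memphis. All features and numbers are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# One row per (grid cell, time window): temperature, nearby bars, nearby
# shops, estimated foot traffic, incidents recorded in the previous week.
X = rng.normal(size=(5000, 5))
# Synthetic label: recent incidents and foot traffic drive future incidents.
y = (0.9 * X[:, 4] + 0.5 * X[:, 3] + rng.normal(scale=0.8, size=5000)) > 1.0

model = LogisticRegression().fit(X, y)

# Score every cell for the next shift and flag the top 20 for extra patrols.
risk = model.predict_proba(X)[:, 1]
hot_cells = np.argsort(risk)[-20:]
print("cells flagged for extra patrols:", hot_cells)
```

The design choice matters as much as the model: whoever decides which features go into such a score, and which historical incidents count as labels, decides what the "hotspots" will look like.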

During tests of the system, it turned out that the future can indeed be predicted with some probability (no precise figures are given here), high enough that police patrols were redirected to these potentially "threatened" places. One commentary notes that, as a result, police response time, counted from the moment an incident was reported to the first response, was shortened threefold.

I imagine that the mere presence of police in these places reduces the occurrence of criminal incidents. This example can be difficult for a layman to grasp. It shows that modern technology is becoming a real innovative "firecracker" that can bring excellent results.

Sometimes, however, something is not right here

The HunchLab program, developed by the startup Azavea and deployed in the United States, analyzes huge amounts and varieties of data (including moon phases) to support police analyses of criminal incidents. As in the previous case, the goal is to create a map of places where the probability of a crime being committed increases.

The program gives weight to, among other things, the location of places such as bars, schools and transit stops in the city. The effects of the program turned out to be positive. Sometimes its analyses were obvious, but often they were surprising. The lower intensity of criminal activity on cold days can be explained simply. But explaining why there had been more thefts of cars parked near schools in Philadelphia was harder.

Could a police officer not equipped with such software ever have come up with the idea of looking for meaning in the school–car theft relationship? So far, we have looked at several positive scenarios. However, it is difficult to pass over the fact that intelligent machines are not only sometimes wrong, they also misinterpret. They often have serious problems analyzing situational context. Just like people.

Doubtful software certainty

In 2016, the independent investigative journalism organization ProPublica described, in its article "Machine Bias", the tendency of American courts to rely on specialized software from Northpointe used for crime risk analysis in the United States. The authors were interested, among other things, in the estimated chances that previously convicted persons would commit further crimes.

The article describes how the software, used widely and willingly by American judges, generated analyses according to which there was a 45 percent chance that previously convicted Black citizens would return to criminal activity. For white defendants, the probability of returning to criminal activity was estimated at only 24 percent.

Added to these conclusions was the algorithmically produced thesis that areas inhabited by Black people are more prone to criminogenic behavior than districts associated with white residents. The truths served up by the software were called into question, and the algorithms ended Northpointe’s analytical career. All because the inferences were based solely on historical data, and the algorithms were not designed to take recent demographic changes into account.
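A toy sketch can show how this kind of disparity arises. This is not Northpointe’s actual model; all variables, names and coefficients are invented. A risk score trained purely on historical records never sees group membership directly, yet a correlated proxy (here a hypothetical "neighborhood arrest rate") carries the historical disparity into the predictions anyway.

```python
# A toy illustration - NOT Northpointe's actual model. All variables and
# coefficients below are invented for demonstration purposes only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 10_000
group = rng.integers(0, 2, n)                 # two demographic groups, 0 and 1
priors = rng.poisson(1.5, n)                  # prior convictions, same distribution for both
# Historically over-policed neighborhoods correlate with group membership.
neigh_rate = 0.3 + 0.4 * group + rng.normal(0, 0.1, n)
# The historical "re-arrest" label reflects policing intensity, not only behavior.
rearrest = (0.4 * priors + 2.0 * neigh_rate + rng.normal(0, 1, n)) > 2.2

X = np.column_stack([priors, neigh_rate])     # group itself is never a feature
score = LogisticRegression().fit(X, rearrest).predict_proba(X)[:, 1]

print("mean risk score, group 0:", round(score[group == 0].mean(), 2))
print("mean risk score, group 1:", round(score[group == 1].mean(), 2))
```

Running the sketch shows the two groups receiving visibly different average scores even though "group" was never an input, which is the mechanism at the heart of the dispute over such tools.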

Algorithms and white faces

Also interesting is the thesis of Cathy O’Neil, who in her book "Weapons of Math Destruction", published in 2016, examined the influence of algorithms on various areas of our lives. The author argues that people tend to trust mathematical formulas too much. And prejudices, she claims, can form in different ways and on many levels.

She also pointed out that negative processes can begin early, even before the collection of the data that will later feed the algorithms. Amazon’s managers experienced this mechanism first-hand. They noticed that their recruitment support programs regularly discriminated against women. In the lists of promising candidates the programs produced, there were always far fewer women. Why? Again, because of historical data in which more men had applied for specific positions. This skewed the balance between female and male hires, subtly favored men, and ultimately produced a mistaken employment policy.

Algorithms are not familiar with cultural changes

The programs were based on algorithms designed at a time when there was a large imbalance in the employment of the two sexes. The overrepresentation of men was characteristic of a specific moment in history, yet algorithms trained on that historical data operated on the "belief" that the world was not changing.
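The mechanism can be illustrated with a small sketch. It is not Amazon’s actual system; every feature, name and number is invented. A screening model trained on historical hires that favored men ends up rewarding gender-correlated proxy features even when candidate skill is identical.

```python
# A minimal, hypothetical sketch - NOT Amazon's actual recruiting system.
# All features and numbers are invented to illustrate the mechanism only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 8_000
is_woman = rng.random(n) < 0.2                  # historically few female applicants
skill = rng.normal(0, 1, n)                     # identically distributed across groups
# A proxy feature (e.g. membership of a male-dominated club) correlates with gender.
proxy = (~is_woman).astype(float) + rng.normal(0, 0.3, n)
# Historical hiring decisions favored men regardless of skill.
hired = (skill + 1.0 * (~is_woman) + rng.normal(0, 0.5, n)) > 1.2

X = np.column_stack([skill, proxy])             # gender itself is never a feature
model = LogisticRegression().fit(X, hired)

# Two candidates with identical skill: the proxy alone changes the score.
cand = np.array([[0.5, 1.0],                    # male-correlated proxy value
                 [0.5, 0.0]])                   # female-correlated proxy value
print(model.predict_proba(cand)[:, 1])          # the first candidate scores higher
```

The point of the sketch is that removing the sensitive attribute from the inputs does not remove the bias: the model simply learns it from whatever else in the data still carries the historical pattern.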

So they acted on the basis of wrong assumptions and simplifications (a Black person is more likely to commit a crime; professionalism is the domain of men).

Worrying questions

It would not be a mistake to suspect that mechanisms analogous to those described above are at work in many areas of professional and private life. In how many cases unknown to us is data organized on the basis of incorrect assumptions? In how many situations do algorithms fail to take economic and cultural changes into account?

"Black box" is a term used to express human helplessness in the face of what may be happening in the "brains" of artificial intelligence. Our ignorance, combined with the increasing independence of algorithms which, as it turns out, are not the alpha and omega, creates a disturbing mix. Algorithmic prejudices will not disappear at the wave of a magic wand.

The key question is: is there a chance that the specialists who design algorithms, algorithms which often design or train subsequent algorithms themselves, will become more aware of their own responsibility and of the fact that algorithmic bias is a simple extension of human attitudes and practices?

Norbert Biedrzycki

Head of Services CEE at Microsoft. He manages Microsoft services in 36 countries, covering business and technology consulting, in particular in areas such as big data and artificial intelligence, business applications, cybersecurity, premium services and cloud. Formerly Vice President of Digital McKinsey, responsible for the CEE region and for services combining strategic consulting with the implementation of advanced IT solutions: from comprehensive digital transformation, through rapid implementation of business applications, big data solutions and analytics, and business applications of artificial intelligence, to blockchain and IoT solutions. Earlier, Norbert was President of the Management Board and CEO of Atos Polska; he also headed ABC Data SA and was President of the Management Board and CEO of Sygnity SA. Before that he worked at McKinsey as a partner and was a director of the consulting services and business development division at Oracle.

Norbert’s passion is the latest technologies: robotics, applications of artificial intelligence, blockchain, VR and AR, the Internet of Things, and their impact on the economy and society. You can read more about this on Norbert’s blog.


https://ift.tt/3eUmb6w