
New dependent rounding technique helps eliminate algorithmic bias

Mon 24 Apr 2017

Researchers from the University of Maryland have developed a novel approach to countering algorithmic bias, aiming to show that algorithms can guarantee fairness in resource allocation, including the flexible resource allocation that arises in cloud-based services.

These results are achieved by applying a two-stage dependent rounding technique, which takes the form of ‘slowing down’ and ‘early stopping’ of algorithmic calculations.

While it may seem on the surface that algorithms, as mathematical constructs, should be free of bias, several studies have shown that human bias can be reinforced in the application of algorithmic clustering and machine learning. For example, a study by ProPublica found that a software program created to predict recidivism among criminal defendants was unfairly biased against African-Americans, assigning black defendants a much higher risk of committing future crimes than Caucasians.

The underlying problem in algorithmic sorting of variables is that outliers tend to be left out of the clusters. The further a data point lies from a cluster’s center, the less likely it is to be included in the cluster that is analyzed, reinforcing a bias against statistical outliers.
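
As a toy illustration with made-up numbers (not data from the study), consider a one-dimensional dataset with two dense groups and one outlier. Measured against centers fitted to the dense groups, the outlier sits far from every center, so any analysis confined to points near a center quietly leaves it out.

```python
# Toy data, invented for illustration: two dense groups and one outlier.
points = [1, 2, 3, 11, 12, 13, 40]   # 40 is the statistical outlier
centers = [2, 12]                    # centers of the two dense groups

# Distance from each point to its nearest center: the outlier is far from
# every center, so a cluster-based analysis tends to leave it out.
for p in points:
    nearest = min(centers, key=lambda c: abs(p - c))
    print(f"{p:>2} -> nearest center {nearest}, distance {abs(p - nearest)}")
```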

The research team addressed the issue of algorithmic fairness, attempting to eliminate bias with a new Symmetric Randomized Dependent Rounding (SRDR) technique. SRDR modifies existing algorithms to update pairs of variables symmetrically, with additional randomization, and then stops the process early while some fractional values remain.
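
The paper’s exact update is not reproduced here, but a minimal Python sketch in the spirit of that description, modelled on classic dependent rounding, might look like the following; the function name pair_update and its details are illustrative assumptions rather than the authors’ code.

```python
import random

def pair_update(x, i, j, rng=random):
    """Symmetric, randomized update of two fractional entries of x.

    Illustrative sketch in the style of classic dependent rounding, not
    the authors' exact SRDR update: the two values move in opposite
    directions by the same amount, the direction is chosen at random,
    and the probabilities are set so that each value is unchanged in
    expectation.
    """
    up = min(1.0 - x[i], x[j])     # furthest x[i] can rise while x[j] falls
    down = min(x[i], 1.0 - x[j])   # furthest x[i] can fall while x[j] rises
    if rng.random() < down / (up + down):
        x[i], x[j] = x[i] + up, x[j] - up
    else:
        x[i], x[j] = x[i] - down, x[j] + down

if __name__ == "__main__":
    vals = [0.3, 0.7]
    pair_update(vals, 0, 1)
    print(vals)   # sum is preserved; at least one entry is now 0 or 1
```

Each such update preserves the pair’s sum and leaves each value unchanged in expectation, while pushing at least one of the two values all the way to 0 or 1.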

The process of adding symmetric updates during sorting is called ‘slowing down’. It does incrementally slow the lightning-fast calculations required to sort data, but the purpose of this ‘slowing down’ is to introduce an additional level of randomization. Pausing the sorting process to tie together two so-far-uncorrelated results, chosen at random, and then re-running the algorithm creates additional clusters that include statistical outliers.

This ensures that data points far from the center are included in the ongoing algorithmic correlation.

Stopping the process early is the second component of the SRDR framework. A standard algorithm will resolve every fractional value completely, but the research shows that there is an earlier stopping point at which the results are valid while the outliers have not been completely eliminated. That remaining fractional amount acts as a cushion, admitting a more diverse set of results without affecting the outcome or effectiveness of the algorithm used.
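
Combining the two ideas, a hedged sketch of the overall loop (again illustrative, not the paper’s algorithm) repeatedly pairs two randomly chosen fractional entries, the ‘slowing down’, and stops while a few fractional entries remain, the ‘early stopping’. The name srdr_round and the leave_fractional parameter are invented for the example.

```python
import random

def srdr_round(x, leave_fractional=2, rng=random):
    """Round values in [0, 1] towards 0/1, but stop early.

    A hedged sketch of the two ideas described in the article, not the
    paper's exact algorithm. Each round pairs two fractional entries
    chosen at random ('slowing down') and resolves them with a
    symmetric, mean-preserving update; the loop stops once at most
    `leave_fractional` entries are still fractional ('early stopping'),
    leaving that fractional mass as a cushion.
    """
    x = list(x)
    while True:
        frac = [k for k, v in enumerate(x) if 0.0 < v < 1.0]
        if len(frac) <= leave_fractional or len(frac) < 2:
            return x                       # stop early and keep the cushion
        i, j = rng.sample(frac, 2)         # two so-far-uncorrelated entries
        up = min(1.0 - x[i], x[j])         # room to raise x[i] / lower x[j]
        down = min(x[i], 1.0 - x[j])       # room to lower x[i] / raise x[j]
        if rng.random() < down / (up + down):
            x[i], x[j] = x[i] + up, x[j] - up
        else:
            x[i], x[j] = x[i] - down, x[j] + down

if __name__ == "__main__":
    # A fractional allocation is rounded until only two entries stay fractional.
    print(srdr_round([0.2, 0.5, 0.9, 0.4, 0.6], leave_fractional=2))
```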

This research can be applied, for example, by a cloud service such as streaming or on-demand video. Such services require flexible facility location, as videos are reshuffled to different locations to improve the end-user experience. Applying the SRDR techniques of ‘slowing down’ and ‘early stopping’ ensures that, with high probability, each customer receives better service than previous approximation ratios guarantee.

Tags: analytics, Cloud, data, data centre, research, U.S.