Building Transparent Ranking Algorithms (Part 3)
Collaboration Suggestions for Software Engineers
Collaborating on Transparency Initiatives
In the last article, we discussed ways to design a product to be more transparent. However, a transparent product by itself does not guarantee customers’ trust. Users build trust from multiple information sources, such as government auditors and cross-industry standards, which can confirm that a particular algorithm is indeed “transparent” in a specific way. The following are a few concrete suggestions for engineers willing to organize beyond their immediate engineering teams.
Help create cross-industry standards for best practices in building transparent ranking algorithms. Existing organizations such as the IEEE, or new ones, could create such transparency standards.
Examples in industry: The Web Accessibility Initiative (WAI), launched in 1997, has published a series of Web Content Accessibility Guidelines (WCAG), the industry standard for designing user interfaces that are accessible to people with disabilities. A similar set of transparency standards may already exist for ranking algorithms that I’m unaware of.
Require engineers to catalog the parameters that go into ranking algorithms, and make these parameters public. Governments can hire auditors to confirm the correctness of this accounting. Just as the FDA requires food manufacturers to provide nutrition labels, software companies can publish their user data collection schemas in a common, standard format.
Examples in industry: In 2018, the EU enacted the world’s most stringent privacy legislation to date, the General Data Protection Regulation (GDPR). In 2021, Amazon was fined $887 million for allegedly violating users’ right to opt out of targeted advertising. In theory, users also have a right to data portability, which allows them to request their data in a “commonly used and machine-readable format”. Of the GDPR’s privacy rights, however, data portability is the least well enforced: no company has yet been fined for violating it. Certain US states, such as California, legally protect a subset of these privacy rights, data portability included, and those protections are drafted and updated by lawyers and software engineers working in the same room.
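To make the “nutrition label” analogy concrete, here is a minimal sketch of what a machine-readable ranking-parameter catalog could look like. No such standard exists today as far as I know; every field name, category, and value below is a hypothetical illustration, not part of the GDPR or any industry specification.

```python
import json

# Hypothetical "ranking transparency label": a public, machine-readable
# catalog of the parameters a ranking algorithm consumes, loosely modeled
# on a food nutrition label. All names here are illustrative.
RANKING_LABEL = {
    "algorithm": "example-feed-ranker",
    "version": "2024-01",
    "parameters": [
        {"name": "recency_hours", "source": "item metadata", "user_controllable": False},
        {"name": "click_history", "source": "user behavior", "user_controllable": True},
        {"name": "ad_bid", "source": "advertiser", "user_controllable": False},
    ],
}

def export_label(label: dict) -> str:
    """Serialize the label into the kind of 'commonly used and
    machine-readable format' the data-portability right describes."""
    return json.dumps(label, indent=2, sort_keys=True)

print(export_label(RANKING_LABEL))
```

Publishing a catalog like this in one agreed-upon schema is what would let third-party auditors compare companies mechanically, the way nutrition labels let regulators compare food products.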
Collaborate with and fund academics to research and evaluate the social impacts of specific product features. This research should help software corporations impartially assess their products for customer sentiment, health, and safety.
Examples in industry: Facebook’s attempts to conduct social science research on its products, while well intentioned (and expensive), ultimately backfired when internal documents were leaked, causing massive fallout in user trust. Examples of university-business collaborations abound, however, so social science research occurs both privately within industry and openly in academia. It’s unclear how software companies decide when to outsource their ethical and social science research.
Read more
Introduction (Part 1)
Product Design (Part 2)
Collaboration Initiatives (Part 3)
To be continued…