Labour has attracted ridicule from online commentators for arguing that the algorithms used to tailor everything from search engine results to workplace practices should be opened up to government regulation.
Opening up the private sector's use of algorithms and data on individuals is to form part of Labour's consultation on industrial strategy, set to be published after the Christmas break, according to Chi Onwurah, the shadow minister who acts as Labour's industrial spokeswoman.
“Algorithms aren’t above the law,” she told The Guardian. “Algorithms are part of our world, so they are subject to regulation, but because they are not transparent, it’s difficult to regulate them effectively.”
In an accompanying letter, she argued that Internet companies should be made to face up to the effects of their data tools.
“Google and others argue their results are a mirror to society, not their responsibility,” she wrote. “Companies such as Google, Facebook and Uber need to take responsibility for the unintended consequences of the algorithms and machine learning that drive their profits.”
She called for “tech-savvy government” that would include “opening up algorithms to regulation as well as legislating for greater consumer ownership of data and control of the advertising revenue it generates”.
Her remarks come amid growing concern over the proliferation of extremist or deliberately falsified information online, including the spread of “fake news” and antisemitic content appearing in automatically generated Google search suggestions.
Facebook, Google and others have said they plan to introduce more human intervention to help curb the spread of deliberate misinformation, hate speech and extremist content, and the search company has taken action to remove offensive content from its search suggestions.
Many readers commenting on Labour’s remarks online said the plan showed a misunderstanding of the nature of the technology involved.
“If people understood better how search engines (not just Google) actually work, then this non-issue would not arise,” one wrote. “It’s not the algorithm, it’s a matter of educating the users of these things.”
“Applying legislation to malleable software artefacts would introduce all sorts of idiocy to the work environment,” another remarked.
Labour is not the only organisation examining whether emerging areas of technology should face more accountability, with EU financial regulators this week launching a consultation on possible changes to the oversight of large-scale data analytics.
Big data practices involve building detailed profiles based on information collected from Internet and mobile devices, and present issues including the exclusion of certain individuals as well as the security risks inherent in storing large amounts of personal data, regulators said.