Edtech firms failing to protect children’s data, say campaigners
An investigation by children’s digital rights charity 5Rights has accused education technology companies of making children’s data vulnerable to commercial exploitation.
Privacy campaigners have accused major edtech players of breaking UK data laws, the Financial Times has reported.
The 5Rights Foundation has conducted research showing how Google and other third parties tracked children's clicks on external links while they were using Google Classroom and ClassDojo. This data can be used to determine preferences and display personalised advertising.
The charity presented the report to the Information Commissioner’s Office and the Department for Education on Wednesday, claiming the companies' opaque privacy terms go against UK data protection law and can confuse teachers.
“The pandemic has both shown the utility of technology and revealed the lack of oversight,” Baroness Beeban Kidron, chair of 5Rights, told the Financial Times.
Edtech, or educational technology, has been one of the most useful tools schools have leveraged during the Covid-19 pandemic to minimise the disruption to students' learning during lockdowns.
However, campaigners have argued that, due to the rush with which some of these programmes were implemented, the consequences of using free software have not been properly considered and regulated. To address this problem, 5Rights has highlighted the importance of adhering to the UK’s Age Appropriate Design Code, known as the Children’s Code.
Introduced in September 2020, the Children's Code is a set of 15 standards designed to prevent the use of children’s data for marketing and advertising. As the code is rooted in the UK GDPR, companies risk fines of up to £17.5m or 4 per cent of their annual global turnover (whichever is higher) for serious failures.
“Children’s rights must be respected and we expect organisations to prove that children’s best interests are a primary concern," Stephen Bonner, the ICO’s executive director of regulatory futures and innovation, said at the time. "The code gives clarity on how organisations can use children’s data in line with the law, and we want to see organisations committed to protecting children through the development of designs and services in accordance with the code.”
The terms of service of most edtech platforms make schools responsible for protecting the privacy of their students. However, 5Rights campaigners have stated that, because of the opaque wording of these terms, schools lack the understanding or resources to fully manage, monitor or fulfil those responsibilities once the software is in place.
In its report, the charity has proposed a solution: requiring edtech providers to "clearly, publicly and transparently" state the nature of the data collected and used about children, with regular independent audits.
“It’s great to have platforms for distance education, but that doesn’t mean they’re a free-for-all. A child’s search terms or browsing history can be used to target commercial products, and that’s not good,” Kidron said. “Instead of being unwelcome in our schools, [tech companies] must be visitors with rules."
The ICO said it was interested in any evidence of edtech not complying with the law.
“The relationship that exists between edtech providers and schools is complex and we have provided tools and resources to support schools and education organisations across the UK,” it added.
The need to protect children's privacy in the digital era has been a hot topic of discussion over the last few years. Last week, a group of campaigners from the NSPCC (National Society for the Prevention of Cruelty to Children) wrote an open letter to the government urging the next UK Prime Minister to prioritise the passage of the Online Safety Bill into law.
The bill, one of the landmark pieces of legislation of Boris Johnson’s government, was scheduled to be passed last July, but it was postponed as a result of Johnson's resignation.
Hailed as groundbreaking regulation of the tech sector, the Online Safety Bill would force social media and other user-generated content-based sites to remove illegal material from their platforms, with a particular emphasis on protecting children from harmful content.
However, IT specialists have criticised the legislation, warning it could weaken free speech. Earlier this month, BCS, The Chartered Institute for IT, published the results of a poll of tech professionals in which some 46 per cent said the bill was not workable, with only 14 per cent believing it was "fit for purpose".