Personal chips get under your skin
We’ve been microchipping animals for decades now, but the idea of routinely implanting subdermal radio-frequency identification circuitry into humans in the workplace raises ethical and trust concerns over just what the tracking data will be used for.
In most respects it’s an everyday picture. A waiter stands by a young couple in a coffee bar with a handheld terminal, ready to take a routine digital payment. The man raises his wrist to the machine. But he’s not wearing a smartwatch or offering a contactless card. He’s completing the transaction using a microchip that sits permanently beneath the skin on his hand.
“I believe that one day implants will be as popular as payment cards,” says Wojciech Paprota, founder of London-based tech start-up Walletmor, who claims to have created the world’s first microchip implant for contactless payments. For the man buying the coffee, the benefits of such technology are seemingly limitless. “Unlike a standard payment card,” says Paprota, “it cannot end up in the wrong hands. It will not fall out of your wallet, and no one will take it from there. The implant cannot be scanned, photographed or hacked.” The passive device can be fitted under local anaesthetic and costs a shade under €200 (£170).
Paprota’s enthusiasm for subdermal near-field communication (NFC) technology is by no means universal, especially when seen from the waiter’s point of view. The recent Citrix Work 2035 survey suggests that attitudes to microchipping humans in the workplace divide along a clear fault-line: C-level executives see the technology as enhancing productivity and the quality of the work experience, while workers themselves feel that having ‘cyborg chips’ embedded in their flesh will strip them of their privacy and pave the way for robots to take their jobs.
While three-quarters of business leaders believe that such subdermal sensors will boost worker performance and productivity by 2035, less than half of workers share that positive view. Only a third of managers were prepared to back up their convictions by having implants themselves, yet more than half of workers said they would embrace the technology, if only to protect their jobs. One of the survey’s top-line findings frames the divide as follows: “Leaders are consistently more positive about the benefits that technology will bring, while workers are more sceptical and concerned about their own role in the changing world of work.” What both parties seem instinctively to agree on is that chip implants cross some sort of boundary, taking wearable electronics to a new level.
We’ve been microchipping animals in the UK for more than three decades. It’s a useful technology application: insert a subdermal radio-frequency identification (RFID) chip somewhere the animal can’t get to it – such as the nape of its neck – and a whole world of digital data opens up. Researchers can monitor behavioural and biological traits, while zoo scientists, conservationists and pet owners can confirm identities and locations, crucial in monitoring the illegal trade in rare wild animals and for keeping tabs on your domestic cat. In the UK it’s a legal requirement to have any dog over the age of eight weeks microchipped (and to keep the stored data up to date), while the government is currently considering the same requirement for felines. Owners are increasingly taking it on themselves to put smartphone-enabled ‘air-tags’ on pets’ collars (as well as their children).
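The chip itself holds little more than an identity number: standard pet implants carry a 15-digit code (per the ISO animal-RFID standards), with a three-digit prefix identifying the issuing manufacturer or country, and everything else – owner details, location history – lives in the registry databases the government now wants to simplify. A minimal, illustrative sketch of parsing such a number (real registries apply further issuer-specific checks):

```python
def parse_microchip(code: str) -> dict:
    """Validate a 15-digit ISO-style pet microchip number and split it
    into its issuer prefix and the animal's unique serial.

    Illustrative only: real registries apply further checks, and prefix
    semantics vary by issuer (ICAR manufacturer codes vs country codes).
    """
    digits = code.replace(" ", "")
    if len(digits) != 15 or not digits.isdigit():
        raise ValueError("ISO pet microchip numbers are 15 decimal digits")
    return {
        "prefix": digits[:3],   # manufacturer or country code
        "serial": digits[3:],   # remaining 12 digits identify the animal
    }

print(parse_microchip("981 098 106 368 001"))
```

The point of the split is that the chip number alone tells a vet or scanner operator which registry to query; the sensitive data never sits on the animal.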
During the Covid-19 pandemic, pet abduction rose to such a scale that the UK’s Department for Environment, Food and Rural Affairs (Defra) pet theft taskforce published a report recommending that the crime of ‘pet abduction’ be elevated from that of property theft to “reflect the true severity of the crime”. When such changes to the law come into force, there will be “new requirements to register additional details, and a single point of access to microchipping databases will support tracking lost and stolen dogs”. RSPCA chief executive Chris Sherwood was “thrilled that the government wants to simplify the microchipping database system”.
As thrilling as the government response to what home secretary Priti Patel describes as “an awful crime” might be, tagging animals to safeguard their emotional, financial, or environmental value is a world away from chipping humans. Throughout the 21st century, the UK’s criminal justice system has used electronic monitoring in curfew and home detention applications as an alternative to custodial sentences, with use of external devices (usually worn on the ankle) rising sharply since their introduction. In the National Audit Office report ‘The Electronic Monitoring of Adult Offenders’, former comptroller and auditor general Sir John Bourn says: “Electronic monitoring represents value for money, providing a cost-effective alternative to custody for offenders who do not pose a risk to the public.”
While implanting subdermal RFID devices or NFC chips in criminals is likely to remain the stuff of science fiction for the foreseeable future, in the world of work that future has arrived. A US software company has already hit the headlines – ‘Three Square Market Implants RFID Chips for Employees’ – for going down the route of putting subdermal chips in the hands of 80 of its staff. Jane Crosby of law firm Hart Brown explains that while these volunteers are now able to “pay for food in the company canteen, open doors, log into their computers and even use office equipment such as photocopiers with the wave of a hand”, the impact of microchipping humans opens a “huge can of ethical worms” while raising questions about personal privacy, the consequences of refusing to be microchipped and what happens when you leave your job.
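One of Crosby’s questions – what happens when you leave your job – has a concrete technical answer: the passive implant typically stores only an identifier, and all policy lives in a backend that maps that identifier to permissions. A minimal sketch of such a lookup (all names and IDs hypothetical, not Three Square Market’s actual system):

```python
# Hypothetical backend for chip-based office access control.
# The implant carries only a unique ID; all policy lives server-side.
PERMISSIONS = {
    "04A1B2C3D4E5F6": {"doors", "canteen", "printers", "login"},
}

def authorise(chip_uid: str, resource: str) -> bool:
    """Return True if the chip's ID is registered for the resource.

    Revoking access when an employee leaves is a server-side delete:
    the chip itself never needs reprogramming or removal.
    """
    return resource in PERMISSIONS.get(chip_uid, set())

print(authorise("04A1B2C3D4E5F6", "canteen"))  # True
print(authorise("04A1B2C3D4E5F6", "payroll"))  # False
```

On this model the ethical questions centre less on the chip than on who controls the server-side database, and what it logs each time a hand is waved.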
That’s without the problematic question of tracking people’s movements without their consent, exactly what type of information is being stored on those chips, and what kind of safeguards are put in place. The difference between implanted chips and curfew-monitoring technology, smartwatches and wearable bio-medical sensors is, says Crosby, that “once it’s in, it’s in”. We might be (often reluctantly) getting used to electronic tracking through CCTV, vehicle monitoring and so on, but with microchipping “we’re crossing a whole different Rubicon”.
Ahmed Banafa, author of ‘Secure and Smart Internet of Things and Blockchain Technology and Applications’, says that for the human chip implant culture “to be accepted and become mainstream, it needs to overcome three challenges: technology, business and society”. Banafa adds that from the technology perspective, implants put us squarely in the orbit of transhumanism (the enhancement of the human condition via emerging technologies). “As a sensor, the chip touches upon your hand, your heart, your brain and the rest of your body. Literally. This new development is set to give a quite different meaning to ‘hacking the body’ or biohacking.” While cyber experts continue to worry about protecting critical infrastructure and mitigating security risks that could harm the economy, says Banafa, implanted chips add new dimensions to the risks and threats of sensor hacking.
The business challenge is that of how to commoditise a technology that could effectively replace traditional forms of identification in shops, offices, airports and hospitals. Added to that, says Banafa, implanted chips “will provide key physical data and further processing of that data in the cloud to deliver business insights, new treatments and better services”, presenting “a huge opportunity in all types of businesses and industries in private and public sectors”.
But before any of this can happen, society needs to come to terms with the new order of things. Banafa predicts that as humans we will grapple with the privacy and security implications that come with a “set of technologies that will become much more personal than your smartphone or cloud storage history”. The tiny chip that sits under your skin, he says, will pose new risks and threats that can only be addressed by the two-pronged mitigation strategy of legislation and consumer trust, “which is built on security, safety and privacy”.
Attitudes to human microchipping vary with geography, and it seems that the nearer you get to Scandinavia, the more likely you are to be comfortable with your subdermal sensor. Stockholm, Sweden’s much-reported capital of human embedded electronics, hosts ‘chipping events’ attended by students such as 19-year-old Olof, who underwent the procedure (which he describes as feeling like being stabbed with a fork) “because it’s cool and it’s something us techies are into because we like being at the forefront of technological developments”. Sandra Haglof, who works for the Stockholm-based event company Eventomatic, says she chose to get the chip because she wanted to be “part of the future”.
Olof plans to copy his public transport pass, keys and credit cards to his new chip. When asked if he’s concerned about having the chip hacked or his movements monitored, he is phlegmatic, stating that all his personal data can be found on the internet, and if anybody wants to track his location, they can already do that via his smartphone.
This laid-back attitude may be why Sweden is becoming the focus of workforce chipping. One of the most publicised case studies is tech start-up Epicenter, which has implanted the technology into 150 of its workforce, who can now open doors, access photocopiers and pay for coffee in the company’s café without scrambling for plastic cards. In stark contrast to the headlines describing NFC chipping as the jackboot of Orwellian surveillance, Epicenter CEO Patrick Mesterton sees the process as making life easier for everyone. “It basically simplifies your life. You can do airline fares with it; you can also go to your local gym. It replaces a lot of things you have other communication devices for, whether it be credit cards, or keys, or things like that.”
As for deciding whether he wanted electronic gadgetry in his body, he overcame his doubts by taking the view that there is a long history of people “implanting things in their bodies like pacemakers to control their heart. That’s a way, way more serious thing than having a small chip that can communicate with devices.”
Moa Petersen, author of ‘The Swedish Microchipping Phenomenon’, estimates that some 6,000 people in Sweden have so far had a chip inserted in their hands. While this number is hardly earth-shattering, it has been increasing daily since the Swedish government’s announcement that the coronavirus vaccine passport requirement is to be put into effect. The reason Sweden has embraced the technology so readily, she says, is that it “is a high-trust society. The most important factor for implant tech taking off here is that Sweden is a powerfully tech-literate society. If you understand how a technology works, you also understand that you need not be afraid of it as long as you use it in the right way.”
The problem is that while subdermal chipping may be plain sailing for self-proclaimed ‘trust societies’, it could present challenges in urban contexts where violent theft is a pressing social issue.
Equally, Biohax CEO Eric Larsen thinks that the Covid-19 pandemic has introduced a higher level of technophobia, with public concern over contact-tracing applications spilling into the microchipping space his company is involved in.
Despite sensing increased resistance to digital surveillance (especially movement tracking) brought about by mistrust of Covid-19 contact-tracing security (and suspicion over data misuse), Larsen is upbeat about microchipping: “It is a step towards the future, although it is already happening. This technology was born to help us, to give us small superpowers. We are not tracking movements, we don’t have a GPS inside, but I think that a lot of people are not aware of that.”
Back at the coffee bar, Wojciech Paprota is optimistic about his company’s contactless payment implant. He says Walletmor’s research shows that in territories where there is already a market acceptance of conventional wearable technology, such as Scandinavia, the UK and Switzerland, “despite conspiracy theories about coronavirus, there are few qualms here whether the device spies on or monitors its user”.