Matt Gracey-McMinn

‘You’ve got criminals thinking this is an easy way to launder money’

Image credit: Nick Smith

Netacea’s head of Threat Research, Matt Gracey-McMinn, explains how automated internet bots are being hijacked by the money-laundering trade. He says eliminating malicious bots is the first step in the defence against organised crime.

“There were a lot of very disappointed online retail clients over Christmas,” says Matt Gracey-McMinn. “They were essentially victims of bot attacks.” Shoppers hoping to buy electronic goods over the internet found premium stock scarce on legitimate vendor sites and were forced to take their trade to reseller sites. This is where they were exposed to extortionate mark-ups that exploited the pre-Christmas surge in demand.

None of this is necessarily illegal, says the head of Threat Research at Manchester-based Netacea, but the power of scalper bots (that digitally jump the customer queue to snap up bulk stock of in-demand products) is now becoming a tool for the murky world of organised crime. “It’s got to the point where the US government is exploring a bill to legislate against the use of bots in purchasing electronics.”

Thirty-year-old Gracey-McMinn says the fact that in most cases such scalper bots are legal is “part of the problem. They are publicly available automated tools to buy things online that can operate far faster than any human could hope to match.” But this type of bot, while irritating, is relatively harmless when compared with other bots that lurk on the internet poised to drain credit cards and break into bank accounts. And that’s just the tip of the iceberg, says Gracey-McMinn, who recites an entire cast-list of questionable bot-related activities, from account takeover to credential stuffing, web scraping to card cracking, loyalty card fraud to skewed market analytics. All of these (and more) constitute the murky and rising bot threat landscape. Mitigation of the problem – a cyber-security approach informally known as ‘bot zapping’ – “has never been needed more to protect customers in today’s digital economy. That’s basically what we do here at Netacea.”

Gracey-McMinn says that while any consumer conducting retail transactions over the internet will be familiar with the sinking feeling that goes with being beaten by a bot in an eBay auction or a virtual festival ticket queue, this isn’t what he’s talking about. Besides, in most cases there’s nothing particularly nefarious going on (although he is keen to stress the UK’s Digital Economy Act 2017 bans bulk buying of tickets using bot technology). As if to underline the activity’s innocuousness, eBay provides bidders with its familiar in-house automated bidding feature, which, as regular eBay users can attest, can lose out to snipers chipping in at the last minute (provided they outbid your maximum).

Apart from events ticket reselling (or ‘scalping’), which can be traced back to 19th-century railroad scams in the USA, “for the individual, it’s more a matter of ethics than law”, says Gracey-McMinn. The big problem, he says, is not so much what happens at the consumer end. It’s that organised crime is exploiting online retail, breaking into streaming services and hitting e-commerce sites with lists of a million stolen credit cards, all as a means of cleaning up dirty cash in money-laundering operations. Gracey-McMinn says this problem is often confused with the modest advantage gained in eBay auctions by using sniper tools. For him, the real issue is “industrial-scale” market manipulation; money laundering, meanwhile, is simply illegal: “always has been, always will be”.

Reflecting on the launch of PlayStation 5, Gracey-McMinn says: “They’ve hardly been available since they were first released over a year ago now. What’s happened is that the scalper bots have dived in, grabbed them and immediately relisted them elsewhere, such as on eBay, Facebook Marketplace, StockX or other retail sites, sometimes for as much as 10 times the recommended retail price.” What this means is that the original purchasing demographic now “either can’t afford to buy the item or will choose not to”.

While this wreaks reputational havoc upon the brands being resold – loyal customers that can’t get hold of their favourite technology upgrades don’t stay loyal for long – on the other end of the spectrum, “you’ve got criminals thinking this is an easy way to launder money. They are virtually guaranteed to get back at least the face value of the dirty cash.” Gracey-McMinn says Netacea has been talking in the UK to MP Douglas Chapman (the SNP’s Small Business, Enterprise and Innovation spokesperson), “about legislating against the problem, and his biggest concern is that of money laundering”. Part of the conversation centred on monopolisation of the heated toilet seat market, “where consumers are being price gouged. I might be the only person ever to talk to an MP about toilet seats.”

‘It’s not the technology that is the issue. It’s the scale of operations’

Matt Gracey-McMinn

The problem is clouded by popular products being targeted by legitimate resellers “large enough to create supply-and-demand issues”. Gracey-McMinn explains that this was blatant during the Covid-19 lockdowns of 2020-21, in which it became apparent that “well-funded organisations would happily put millions of pounds into buying up large amounts of particular products”. While some groups concentrated on cornering the PS5 market or, notoriously, the home-gym equipment market, “we saw others snapping up PPE equipment stock and hand sanitiser”. The resulting scarcity risked compounding the public health problem. “People wanted to buy hand sanitiser because there was coronavirus around. But they couldn’t afford to because it was only available online for ten quid a bottle, as opposed to two.” Another Covid-19 example is how scraper (malicious data-collecting) bots were being deployed to isolate and report back on vaccination slot availability, with the intention of then ‘selling’ these slots on to the public. “Some of these are aggressive – checking with a frequency of say every 0.1 seconds – while others might check only every hour or so.”
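A bot polling every 0.1 seconds is far easier to spot than one checking hourly, and the most common first line of defence against the aggressive kind is per-client rate limiting. As a purely illustrative sketch (the article does not describe Netacea’s internal tooling), a classic token-bucket limiter looks like this:

```python
import time

class TokenBucket:
    """Per-client token bucket: allows bursts up to `capacity`,
    refilling at `rate` tokens per second. Requests without a
    token available are rejected (or queued/challenged)."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Illustrative limits: ~1 request/second with a burst allowance of 5.
bucket = TokenBucket(rate=1.0, capacity=5)
results = [bucket.allow() for _ in range(10)]  # 10 back-to-back requests
```

A scraper firing ten requests back to back exhausts the burst allowance after five and is refused, while an hourly checker never notices the limit exists — which is exactly why, as the article goes on to show, the slower and stealthier bots need behavioural detection rather than simple throttling.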

The cyber-security wing of the Intechnica software development and consulting group, Netacea is a tech start-up with its origins in events ticket sales, specifically the creation of the system that sold Glastonbury 2005 tickets. “That’s where we started, but we’ve moved on to repurposing those tools for the sneaker [sports training shoe] industry.” You’ll have heard reports, says Gracey-McMinn, of “14-year-olds buying rare sneakers from Nike and Adidas and making lots of money by reselling them at a mark-up. These ‘sneaker bots’ are now being used against any e-commerce organisation that has an online catalogue. Since lockdown, these attacks are increasing. As more and more people make more and more money out of this, we’re starting to see the bots get more sophisticated, more people being drawn into the market, and that brings with it organised crime.”

We’re not just talking about cottage-industry scale either: “Some of the more informal community groups will have up to 30,000 people using bots to hammer away at websites to try to get hold of products such as PS5. Some of these organisations are registered with Companies House. But it’s got to the point where suppliers such as Sony are trying to get their retailers to do something about this, while retailers are calling out to governments to help them, saying: ‘you made this illegal in the ticket industry, so why the hell isn’t it illegal in the electronics industry too?’”

The UK government is listening, says Gracey-McMinn, “because we’ve also reached the point where basically no-one can buy anything. During lockdown it spun out of the electronics sector and is now everywhere.”

Stockpiling and hoarding have been standard retail business tactics since commerce began. Ask any rare books dealer how many signed J.K. Rowling copies they have in their vast warehouses along the M4 corridor, and they’ll go very quiet, and quieter still when asked how long they’re going to keep them under wraps to artificially stimulate market demand. There’s nothing wrong with any of this, says Gracey-McMinn, who perhaps surprisingly doesn’t have a problem with such tactics becoming digitally automated. “It’s not the technology itself that is the issue. I think the scale of operations is where the line has been crossed. People have always tried to buy stuff and resell it at a higher value: that’s basically how the economy works. But what we are seeing now, because of these organised groups bashing away at websites all day long, is that traditional sellers are no longer able to sell to their traditional customer bases. They are now selling just to robots, which means that this slight technological advantage is allowing complete monopolisation of certain markets.”

Cyber issues are the modern equivalent of cops and robbers, with the bot itself simply defined by Gracey-McMinn as “an automated process that runs over the internet. It’s a very simple script that’s set up to do a very simple thing. Using the Star Wars analogy, there are Light Side and Dark Side bots. So when we sit in front of our customers, one of the first things we do in bot mitigation is to try to distinguish between the good and the bad guys.” Gracey-McMinn says that if, in the process, they block Google’s good bots “that go around indexing the whole of the internet, then the client isn’t going to be happy because they’d no longer appear in Google searches. On the other hand, you have the bad actors trying to work out how they can use these bots for personal gain.”

A bad bot example is the account checker that exploits data dumps received from cyber breaches, “where usernames and passwords are linked. There are literally billions of them out there now. The way they work is to find a log-in page (say Netflix) and throw every username and password combination at that in the hope that someone has used the same combination before in a previously breached site. When this technique works on a bank account... that’s when it gets more serious. Some of these attacks are quite clever. Once an Amazon vendor account is breached, bots can strip out names, addresses, payment details, phone numbers and so on, as well as reporting back to the attacker that the account has now been taken over and can be used to make purchases.”
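The attack pattern Gracey-McMinn describes — walking a breached list of username/password pairs against a log-in page — leaves a distinctive fingerprint on the defender’s side: one source hammering many *different* accounts in a short window, where a genuine user retries one account. A minimal, illustrative detection sketch (thresholds invented for the example, and no relation to Netacea’s actual product) might look like this:

```python
from collections import defaultdict, deque

# Flag a source IP as a likely credential-stuffing bot when it attempts
# logins against many distinct usernames within a short window.
# A human mistypes one account; a stuffing bot walks a breached list.
WINDOW = 60.0        # seconds (illustrative)
MAX_USERNAMES = 10   # distinct accounts per window (illustrative)

attempts = defaultdict(deque)  # ip -> deque of (timestamp, username)

def record_login_attempt(ip: str, username: str, now: float) -> bool:
    """Return True if this attempt makes the IP look like a stuffing bot."""
    q = attempts[ip]
    q.append((now, username))
    # Drop attempts that have aged out of the sliding window.
    while q and now - q[0][0] > WINDOW:
        q.popleft()
    distinct_accounts = {user for _, user in q}
    return len(distinct_accounts) > MAX_USERNAMES

# A stuffing bot cycling through a breached list trips the flag quickly;
# one user repeatedly retrying their own account never does.
bot_flags = [record_login_attempt("198.51.100.7", f"user{i}", i * 0.1)
             for i in range(50)]
human_flags = [record_login_attempt("203.0.113.9", "alice", i * 0.1)
               for i in range(50)]
```

Real defences are far more layered — stolen lists are sprayed from thousands of residential proxy IPs precisely to defeat this kind of per-IP counting — but the sketch shows the basic signal defenders start from.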

Some of these ‘horrible’ bots will then automatically buy items that have been identified as desirable by the attacker. By the time you get an email alert advising you that someone has logged into your account, and you should reset your password now, says Gracey-McMinn, “it’s already too late”. Even if you react to that email immediately, the chances are you’re shutting the stable door after the horse has bolted, which is why you need to have “good awareness of cyber security at all times. Bot management is a niche area of cyber security.”

All websites, mobile apps and APIs are now a target for malicious attacks by automated bots, says Gracey-McMinn. “This has the effect of putting profits, customers, data and reputation at risk. Without specialist bot protection in place, attacks such as credential stuffing, carding, fake account creation, scraping and scalping will succeed or go undetected.” He explains that on average, somewhere between 10 and 40 per cent of traffic on a typical web-facing system is made up of malicious bots that can appear human and bypass defences that have been put in place to identify them. Netacea’s bot-management protocols “take a new approach to bot detection, spotting known and evolving attacks to ensure that the maximum number of bots are detected with a minimum number of false positives”.

Gracey-McMinn describes this process as analogous to putting up a castle wall in front of the client’s internet infrastructure. As with traditional castles, the wall is strong and thick, but has a gate through which all traffic must pass. “Some of the traffic will be in the form of good bots, which we will allow through, and some of it might even be human traffic.” This is important because real people behave differently to bots, so learning to differentiate between the two is critical in bot mitigation. “We know what good bots look like and what humans look like. We’re familiar with that, and so it becomes easy to identify bad bots that are just using brute force to gain access to a system.”
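One of the simplest behavioural tells between a crude bot and a person is timing: a script in a loop fires requests at machine-regular intervals, while human browsing is bursty and irregular. A toy illustration of that idea — the threshold is invented for the example, and production systems combine dozens of such signals — is to measure the coefficient of variation of the gaps between requests:

```python
import statistics

def looks_like_bot(request_times: list[float],
                   cv_threshold: float = 0.1) -> bool:
    """Toy behavioural check based on request-timing regularity.

    A simple loop-driven bot produces near-constant gaps between
    requests, so the coefficient of variation (stdev / mean) of those
    gaps is tiny; human browsing is far more irregular. The 0.1
    threshold is illustrative, not a tuned production value.
    """
    gaps = [b - a for a, b in zip(request_times, request_times[1:])]
    if len(gaps) < 2:
        return False  # too little evidence either way
    mean_gap = statistics.mean(gaps)
    if mean_gap == 0:
        return True   # simultaneous requests: certainly not a person
    return statistics.stdev(gaps) / mean_gap < cv_threshold

bot_like = looks_like_bot([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])      # metronomic
human_like = looks_like_bot([0.0, 1.7, 2.1, 9.8, 11.2, 30.4])  # bursty, idle
```

As the next section of the article makes clear, this kind of naive signal is exactly what sophisticated attackers now defeat by training bots on recorded human behaviour — which is why detection has become an arms race.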

The scenario becomes more complex when bad bots attempt to disguise themselves by mimicking human behaviour. “What we’re seeing now is that attackers are getting more advanced and are using machine learning and other artificial intelligence techniques to emulate human behaviour over any given website. Attackers literally hire humans and say: ‘go on this website while my bot watches you and learns how you behave.’ Then they train the bot based on these observations.”

Gracey-McMinn says bot zapping is now a technology ‘arms race’, where cyber-security is trying to stay ahead of organised crime by adding new elements of defence to counter the growing sophistication of the bad guys. “To get on top you need to improve detection and break the kill-chain to stop the attack before it is successful.”
