There are already plenty of YouTube videos about this topic, which is an easier format for digesting a summary, but I wanted to cover a few interesting points here anyway.
The general theme is an online scam in which someone uses social media contact to build enough trust to get a victim to make cryptocurrency trading transactions as an investment, which turn out to be fake. Sounds pretty improbable, doesn't it? It is, but the build-up is the crux of it, how they approach it. I received this Line message not so long ago; it works as a good example:
A common intro is a wrong-number sort of message, like this one. You wouldn't necessarily talk to someone messaging the wrong Line account, but you might reply to tell them they've got the wrong number. Often the profile picture is of an attractive woman, which may increase the chances that someone will want to be helpful, to explain the "mistake," or to keep talking beyond that. That one was fairly moderate in emphasizing attractiveness; this might be a better example:
Whatever the starting point is doesn't matter much, because the scam moves on to small talk next. This particular intro opens by asking about travel themes and sharing travel preferences, a good lead-in for social discussion, even with a "wrong number." They would probably tentatively offer to pay for guide services; there would be different ways to bait the hook.
Later it switches to advice about crypto trading, which is kind of a stretch as a transition. The people running the scam work from a script that bridges the gap between topics, cutting and pasting a series of messages that lead there. It's not supposed to make complete sense. This is an interesting YouTube channel summary of how it all works; I'll draw more details from it here, about who is doing this, and from where.
I've talked to one person online who attempted this scam, curious about why an online stranger was talking to me (maybe from a social media contact; I don't remember the starting point). The transition was pretty clumsy in that example; I imagine the approach and the message steps get better dialed in over time. Another source covers a reason why you shouldn't talk to these people just to waste their time, whether out of curiosity or malice, which is more or less what I was doing.
Per that YouTube reference, and at least one other I've watched on this theme, these scams are typically conducted by Chinese criminal organizations, set up and run by enlisting workers housed in isolated compounds, some of whom are essentially captives there, victims of human trafficking. In that sense two crimes are really occurring: the scam itself, and the forced labor supporting it, hosted out of places like Myanmar or Poi Pet, Cambodia. The scam employees / captives might be expected to generate a certain amount of return, and might be punished for low conversion rates, for example by being beaten, so it might be a mistake to try to tie up their time in order to keep others from getting scammed.
Jumping the track a bit for a tangent, I've been to one of those places, to Poi Pet, two years ago.
Poi Pet looks a bit like Thailand, but rougher
development isn't consistent, but they had built out casinos and housing
Poi Pet felt a little off; there were literally shut-down casinos there, part of a boom-and-bust phase from attempting to become the next Macau, which didn't work out. Surely the reasons for that were complex. Development seemed inconsistent; there were plenty of large-scale apartment and office buildings here and there, but then also those closed casinos, and roads between developed areas that weren't finished, at some stage between dirt and pavement. Apparently that business failure, the connection to Chinese interests, and the limitations of local law enforcement and its openness to corruption all combined to make this a viable new criminal industry there.
One video reference on this theme, not the one I've cited, about a Dubai-based operation instead, showed how they can work around using an attractive woman's photo when it's really more often some Chinese guy being held captive. Instead of pulling photos from the internet, which would make it possible to reverse-search the images, a more sophisticated operation can employ someone to play that role, to be the photo model, and even to video chat with scam victims.
That's not a job anyone would seek out, but they can enlist such help the same way they turn up the other workers: make false promises about a much more legitimate position, then set it all up as a difficult situation to get back out of once someone has started.
It's still hard to believe this could work now, isn't it, decades into people running all sorts of online scams? Even if you somehow thought you were talking to an attractive woman who had become something of an online friend, would you really start investing in cryptocurrency, which is typically a scam no matter the starting point? Hopefully not.
But they build up to that sort of thing using layered deception. The videos describe how they have people start by investing very little money, $100, and then witness how easy it is to trade and apparently earn returns. It just turns out those returns are fake, generated by an app created for this deception that mirrors the look and feel of real trading apps. If you could supposedly see the person you're talking to on a video call, and then seemingly generate rapid profits on a small investment, it might make more sense to go further. Not for someone sensible, but part of this is the victim believing what they already want to believe, partly tied to the attractive image.
Back to the message starting points, the ones I never responded to: those were presented as ordinary, attractive women, just planning a trip, per the shared context. Supposedly organic discussion would lead to the topic of crypto trading, following a script, with plenty of allowance for varied responses, all making it seem more believable. The context built in some cover for rough English use and slight inconsistency; from the looks of it those women were supposedly visiting from China.
This kind of thing might work well initiated through a dating site or an online pen-pals site, right? I explored an example of the latter at one point, Interpals, but lost interest relatively quickly, partly because I drew more contact from scammers than genuine users. I did make one online friend there, an older Chinese guy in Malaysia, and we talked about local culture issues and changes in modern society over a number of years. Some other contacts were at least real people, but it all went nowhere.
Someone once commented in a discussion that they could tell if a woman was a scammer online, because any woman talking to them would have to be one. Unfortunately that's kind of how you could tell who was who on that pen-pals sort of site. If a random woman who looks like a model in her profile photo starts a conversation, with very little personal background listed, that's a scam.
I'm an admin for a large Facebook tea group, and Facebook is being populated by these sorts of profiles now; they're joining groups to look more legitimate. They're easy to spot for a similar reason: the photos are almost all of attractive women, and the limited details don't add up, even within the two or three background items shown for group approval. Most went to a university like Harvard, often work for Facebook, and come from places like California City, which is a real place, but someone drawing on minimal knowledge of the US would have said LA instead. More sophisticated fake profiles draw on obsolete or inactive older real profiles. That way a lot of the content can actually be real and consistent; it just won't show years of recent activity, which is also the case for many real Facebook users.
There's not much of a conclusion here. It goes without saying that you shouldn't send money or start investments based on advice from a random online contact. These scams will keep changing form too; they'll figure out the next way to extract money from a stranger through limited conversation under a set of false premises. Later on chatbots will be doing this instead of human-trafficking captives. Eventually they'll even be able to video chat.