TikTok’s Invisible Challenge hack tricked thousands of users into unknowingly downloading malware. Platforms like TikTok, where content goes viral within hours, are ideal for attackers seeking instant reach: the app itself does the work of spreading the lure with minimal effort on their part. But the people these hackers targeted face a moral dilemma of their own.
Disconnect from the Internet!
#invisiblefilter is a new TikTok trend built around the Invisible Body effect; it currently has 27.3 million views. The filter replaces the user’s body with a blurred silhouette, and some creators have posted videos of themselves nude, trusting the filter to hide their real bodies.
The comment sections of these videos are often filled with questions about how to remove the filter – questions that attackers are very quick to answer.
The attackers claim to have software that removes the Invisible Body filter, available through a Discord server called Space Unfilter. Once connected to the server, visitors are shown nude videos supposedly obtained with the help of the program.
Primed by this, when a user receives a private message containing a link to a GitHub repository, its legitimacy is rarely questioned. The repository, however, hosts malware: a Python package whose installation file deploys “WASP Stealer (Discord Token Grabber)”.
The malware steals passwords stored on the device, Discord account credentials, bank card details, and cryptocurrency wallets. The threat remains active because the malware’s author simply moves the server and the repository each time they are taken down.
World’s smallest violin?
The malware is a real threat, as the number of users connecting to the server shows – but how much sympathy do its victims deserve? After all, everyone who joins that server does so specifically to strip away a privacy protection that the challenge’s participants chose to use.
TikTok’s algorithm prioritizes viral content, and that applies to the malicious videos too. Invisible Body videos are shown to viewers already primed by the trend – a group of users who scan the comments for tips on how to see the bodies behind the filter. Promoting videos on those same TikTok accounts that promise “unfiltering” software is exactly what the cybercriminals behind the attack are doing.
The “victims” of the TikTok Invisible Challenge hack want to see what is behind the filter – to undress creators who never consented to it (even as those creators’ posts pique their interest).
It also looks darker than a simple hormonal urge to see naked bodies. There is no need to comment on how easy it is to access NSFW content online – nude images of girls and women are among the internet’s most plentiful commodities. The malware’s victims must therefore have been motivated, at least in part, by the power to defeat a filter these specific creators chose to use – to revoke, in effect, the content creator’s consent.
Moreover, a quick Google search shows that creators themselves cannot remove a filter from a video once it has been published, let alone third parties. So the scam’s victims are not as tech-savvy as they think. Yet they were ready to believe that some very clever person had built a tool for undressing women who had explicitly chosen not to be seen – and that belief led them straight to the server, and to infection.
While the TikTok Invisible Challenge hack shows how TikTok can be weaponized in cyber attacks, it is above all an example of human exploitation. Sure, we would feel more sympathy if the victims had been infected while downloading software to rescue abandoned puppies, but this malware and its victims are a prime illustration of the old truism: humans are always the weakest link in any security scheme.