A deepfake website that generates “nude” images of women using artificial intelligence is spreading its murky tentacles across the web, spawning look-alike services through partner agreements and recruiting new users through a referral system. The website, which WIRED is not naming to limit its amplification, has existed since last year. It digitally “removes” clothing from non-nude photos to create nonconsensual pornographic deepfakes. Researchers say its output is “hyper-realistic,” and unlike similar abusive platforms, it can generate pornographic images even when the person in the original photo is fully clothed; previously, similar technologies have worked only with partially clothed photographs.

In recent months the website has expanded its services, earning its creator potentially thousands of dollars. It has made its algorithms available to “partners” through access to its APIs, and two spin-off websites have been created by other people. The original website has been previously reported on, but the extent of its partner programs has not. These expansion efforts have allowed the service to proliferate despite bans placed on its payment infrastructure. The website’s startup-like growth tactics signal a maturity in abusive “nudifying” deepfake technologies, which overwhelmingly target and harm women. Since the first AI-generated fake porn was created by a Redditor at the end of 2017, these systems have become more sophisticated. The technology was turned into its first app, dubbed DeepNude, in 2019; although its creator took the app down, its code still circulates.

With generative AI, these images can look far more genuine and cause great anguish to victims. Such tools have been used extensively in recent times by fraudulent loan apps that gain access to a person’s gallery and then use morphed nude images to extort money. “A lot of small apps that give loans to people have access to their Aadhaar card or PAN card and use that image and superimpose it with a nude image to blackmail and recover the money. It is creativity at its worst as it can be grossly misused. I don’t see any specific reason or legal use of such an app,” said Gaurav Sahay, partner, SNG & Partners.

People in the public eye are particularly at risk, experts said. Politicians, for instance, are especially susceptible, as troll armies are known to dig up dirt and try to tarnish the image of certain people.

Upon starting a conversation with one such chatbot, it warns: “The Clothes Remover AI should only be used responsibly within legal and ethical boundaries. Misuse, such as creating explicit content without consent or violating someone’s privacy, is strictly prohibited. Users are solely responsible for their actions and consequences. Please respect others and seek appropriate permissions when using this tool.” However, it is important to note that none of this can be verified, nor are there any checks and balances to ensure that someone’s privacy is not being violated or that their consent has been sought before these tools are used.

Jaspreet Bindra, founder of Tech Whisperer, said technology would have to evolve to the point where, just as users download an anti-virus, they also run a ‘classifier’ that distinguishes the genuine from the fake. “The solution has to be two-pronged - technology and regulation,” he said. “We need to have classifiers to identify what is real and what is not. Similarly, the government needs to mandate that anything AI-generated should be clearly labelled as such. The government should look to have this clause in the Digital Personal Data Protection Bill.”