Chinese Students Invent AI ‘Invisibility Cloak’

Chinese graduate students have created a plain-looking garment that prevents artificial intelligence systems that monitor surveillance cameras from recognising the wearer as human.

The InvisDefense coat relies on printed patterns for daytime use and includes a series of heat-generating elements to throw off infrared cameras at night.

Professor Wang Zheng of the school of computer science at Wuhan University oversaw the project, which won first prize in a contest sponsored by Huawei Technologies on 27 November as part of the China Postgraduate Innovation and Practice Competitions.

The researchers’ paper is due to be presented at the AAAI 2023 AI conference in Washington, DC, in February.

AI cameras fail to identify people wearing the InvisDefense coat as humans. Image credit: Wei Hui

AI confusion

Wang said the research could be used to create stealth military uniforms or by AI researchers to improve recognition models.

“Our InvisDefense allows the camera to capture you, but it cannot tell if you are human,” he said.

PhD student Wei Hui, who created the core anti-AI algorithm, said a major difficulty was creating a pattern that would confuse AI surveillance cameras without appearing conspicuous to humans.

“Traditionally, researchers used bright images to interfere with machine vision and it did work,” he said. “But it stands out to human eyes, making the user even more conspicuous.

“We use algorithms to design the least conspicuous patterns that can disable computer vision.”

Printed patterns confuse AI cameras. Image credit: Wei Hui
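
The team’s code and paper are not yet public, but the trade-off Wei describes, driving a detector’s confidence down while keeping the pattern inconspicuous to people, can be illustrated with a minimal, hypothetical sketch. In the PyTorch example below, the tiny convolutional “detector”, the fixed patch placement, and the conspicuousness penalty (total variation plus drift from a base fabric colour) are all stand-in assumptions for illustration, not the Wuhan researchers’ actual models or loss functions.

```python
# Illustrative sketch only: a toy stand-in detector and a hypothetical trade-off,
# not the Wuhan team's actual model or training code.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in differentiable "person detector": 3x128x128 image -> confidence in [0, 1].
detector = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=5, stride=4), nn.ReLU(),
    nn.Conv2d(8, 16, kernel_size=5, stride=4), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(16, 1), nn.Sigmoid(),
)
for p in detector.parameters():
    p.requires_grad_(False)  # only the printed pattern is optimised, never the detector

# The patch texture to be printed on the coat, initialised near a plain fabric colour.
base_colour = torch.tensor([0.30, 0.30, 0.35]).view(3, 1, 1)
patch = (base_colour + 0.05 * torch.randn(3, 64, 64)).clamp(0, 1).requires_grad_(True)

def apply_patch(image, patch):
    # Paste the patch onto a fixed torso region; a real pipeline would warp it with the pose.
    patched = image.clone()
    patched[:, :, 32:96, 32:96] = patch
    return patched

def conspicuousness(patch, base_colour):
    # Hypothetical penalty: keep the pattern smooth and close to the base fabric colour,
    # so it does not stand out to human eyes.
    tv = (patch[:, 1:, :] - patch[:, :-1, :]).abs().mean() + \
         (patch[:, :, 1:] - patch[:, :, :-1]).abs().mean()
    colour_drift = (patch - base_colour).abs().mean()
    return tv + colour_drift

optimiser = torch.optim.Adam([patch], lr=0.01)
wearer = torch.rand(1, 3, 128, 128)  # placeholder for photos of a person wearing the coat

for step in range(200):
    optimiser.zero_grad()
    confidence = detector(apply_patch(wearer, patch)).mean()       # detector's "is human" score
    loss = confidence + 0.5 * conspicuousness(patch, base_colour)  # evade AND stay plain-looking
    loss.backward()
    optimiser.step()
    with torch.no_grad():
        patch.clamp_(0.0, 1.0)  # keep the texture printable

print(f"final detector confidence: {detector(apply_patch(wearer, patch)).item():.3f}")
```

A real system would render the pattern onto many images of the wearer under different poses, lighting and camera angles, and attack an actual pedestrian-detection model rather than a toy network; the sketch is only meant to convey the structure of the trade-off between fooling the camera and looking ordinary.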

Low-cost solution

The team tested nearly 700 patterns within three months to find the right balance.

The researchers also sought an economical design, with Wang saying the finished coat costs less than 500 yuan (£58).

He said the coat reduced the cameras’ accuracy at detecting pedestrians by some 57 percent, and that the reduction could be greater with further experimentation.

“Our results prove that there are still loopholes in current artificial intelligence technology and computer recognition technology. Researchers could use our algorithms to improve current models,” Wang said.

“InvisDefense might also be used in anti-drone combat or human-machine confrontation on the battlefield.”

Matthew Broersma

Matt Broersma is a long-standing tech freelancer who has worked for Ziff-Davis, ZDNet and other leading publications.
