Alarm bells are sounding around Artificial Intelligence (AI), and the people behind the technology, once again. Images of a young woman sitting on a toilet seat, taken by a cleaning robot and leaked to closed social media groups on Facebook and Discord, are at the epicentre of the newest controversy around a technology that sci-fi films have long prodded at. The leaked images, first reported by MIT Technology Review, were captured by a test version of the Roomba. The woman pictured was not a customer, but either a volunteer or an employee of iRobot, the manufacturer of the Roomba robot vacuum cleaner.

The roots of the incident go back to 2020, when iRobot asked employees and paid volunteers to help it gather data to improve a new model of the machine. All they had to do was use the seemingly harmless piece of tech in their homes. iRobot claims that participants were made aware of how the data would be used, and says the models even came with "recording in process" labels.

So what exactly was iRobot doing with the recordings from its unreleased product? It was sending the data to another party: San Francisco-based Scale AI. From there, the data went to Scale's contracted data workers in Venezuela. These contractors, or data labellers, were working on a project for iRobot: tagging the photos the Roombas took so the machines could learn to recognize objects in their surroundings better (a simplified illustration of what such labelled data can look like appears at the end of this article).

Scale AI says the contractors posted the pictures in violation of their non-disclosure agreement. iRobot has since cut ties with its outside partner and is investigating how private photos ended up on social media. The company, which Amazon is set to acquire in a $1.7 billion deal, was quick to clarify that the test machines are not the same as production models.

However, the leaked images reveal a trend bigger than any individual company. They are evidence of the widespread practice of sharing potentially sensitive data to train algorithms. Beyond that, they show how many hands a single piece of information, in this case an image, can pass through. The volunteers' pictures, for instance, came from homes in North America, Europe, and Asia. They travelled to the servers of Massachusetts-based iRobot, then to Scale AI in San Francisco, and from there all the way to gig workers in Venezuela. Some of those pictures crossed three continents just to train a robot to move around easily while cleaning a house.

There is a whole data supply chain, and with it, multiple points where personal information can leak out. This fits neatly with the growing demand for stricter laws regarding AI. Or not. Time will tell how the world's policymakers react.
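For readers curious what "tagging photos" involves, here is a minimal, hypothetical sketch in Python of the kind of annotation record a data labeller might produce for object recognition. The file names, field names, and label values are illustrative assumptions, not iRobot's or Scale AI's actual format, which is not public.

# Hypothetical annotation record for one photo captured by a robot vacuum.
# Every name and value here is an illustrative assumption.
annotation = {
    "image": "frame_000123.jpg",  # the photo taken by the test device
    "objects": [
        # Each entry: a human-chosen label plus a bounding box,
        # given as [x, y, width, height] in pixels.
        {"label": "sofa", "bbox": [40, 80, 310, 260]},
        {"label": "power cable", "bbox": [120, 300, 60, 15]},
    ],
}

# Thousands of records like this form the training set: a vision model
# looks at each image, guesses the objects, and is corrected against
# the human-supplied labels until its guesses improve.

The privacy risk the article describes lives in that first field: every record points back to a real photo of someone's home, and every party that handles the dataset along the supply chain can see it.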
First published: January 19, 2023, 17:38 IST
Last updated: January 19, 2023, 18:13 IST