You’ve probably already seen the video on social media. It’s an accomplished “parody” of clips published by engineering company Boston Dynamics, showing a CGI replica of the firm’s Atlas robot getting kicked, hit, and shot at, before turning the tables on its captors.
Maybe you saw the video and initially thought it was real. Maybe you even felt bad for the robot and angry at its tormentors. “Why are they hurting that poor machine?” asked many. “Sure, it can’t feel anything, but that doesn’t mean they can treat it like that.”
It’s a totally understandable reaction! But it’s also one that shows how much trouble we’re going to be in when robots like Atlas become a common sight on our streets.
Are machines really deserving of empathy? Do we need to worry about people fighting for robot rights? These are big questions that are only going to become more relevant.
First, though, a little side-bar on why so many people were taken in by this clip. Praise here goes to the creators, an LA production company named Corridor Digital, who did a slick job. The CGI is solid, the set dressing is on-point, and the target is well chosen. Boston Dynamics really does stress-test its robots by kicking and poking at them with sticks, and this has long made for slightly uncomfortable viewing. Helping the footage go viral is the fact that many accounts shared low-res versions of the video (which disguised the CGI) or trimmed the fantastical ending, where the robot is ordering humans about at gun-point.
In short: if you thought the video was real, don’t kick yourself. Because that would be actual cruelty, as opposed to the fake, robot-kind.
But that brings us to the important question here: is it okay to hurt robots? The obvious answer is: yes, of course. Robots aren’t conscious and can’t feel pain, so you’re never hurting them; you’re just breaking them. You may as well feel sorry for the next plate you drop on the floor, or advocate for the rights of cars being torn apart for scrap.
But despite this obvious reading, humans do feel sorry for robots — all the time. Numerous studies show that it’s laughably easy to make humans treat robots like humans. We feel bad turning them off if they ask us not to; we obey their orders if they’re presented to us as authority figures; and we get uncomfortable touching their ‘private parts.’
This isn’t really a surprise. Humans will feel empathy for just about anything if you put a face on it. As MIT researcher and robot ethicist Kate Darling puts it: “We’re biologically hardwired to project intent and life onto any movement in our physical space that seems autonomous to us. So people will treat all sorts of robots like they’re alive.”
The tricky thing is, how do we use this power? There are going to be benefits, for sure. Think of robots like Paro, the baby harp seal that can help the elderly stop feeling lonely. But what about corporations that take advantage of our empathy, designing cheery AI assistants that win the hearts of children while teasing out some valuable marketing data, for example? And that’s before you start thinking about the mobile robots that are being deployed in supermarkets, on our streets, and that may soon be coming to our houses.
In other words: the future of robot empathy is going to be a mess. Be glad we’re just dealing with the CGI parodies for now.
Source: James Vincent (The Verge).
Simone Giertz got tired of waiting for Elon Musk to unveil Tesla’s first pickup truck, so she decided to make one herself. The popular YouTuber and self-described “queen of shitty robots” transformed a Model 3 into an honest-to-god pickup truck, which she dubs “Truckla” — and naturally you can watch all the cutting and welding (and cursing) on her YouTube channel. There’s even a fake truck commercial to go along with it. Giertz spent over a year planning and designing before launching into the arduous task of turning her Model 3 into a pickup truck. And she recruited a ragtag team of mechanics and DIY car modifiers to tackle the project: Marcos Ramirez, a Bay Area maker, mechanic, and artist; Boston-based Richard Benoit, whose YouTube channel Rich Rebuilds is largely dedicated to the modification of pre-owned Tesla models; and German des...
It is at least the fourth fatal crash involving Autopilot. (Illustration by Alex Castro / The Verge.) Tesla’s advanced driver assist system, Autopilot, was active when a Model 3 driven by a 50-year-old Florida man crashed into the side of a tractor-trailer truck on March 1st, the National Transportation Safety Board (NTSB) states in a report released on Thursday. Investigators reviewed video and preliminary data from the vehicle and found that neither the driver nor Autopilot “executed evasive maneuvers” before striking the truck. In a tweet posted on May 16, 2019, @NTSB_Newsroom announced the preliminary report for its ongoing investigation of the fatal March 1st, 2019, highway crash near Delray Beach, Florida, available at https://go.usa.gov/xmpBm. The driver, Jeremy Beren Banner, was killed in th...
The resulting fakes could be used to shame, harass, and intimidate their targets. The DeepNude app creates AI fakes at the click of a button. A new AI-powered software tool makes it easy for anyone to generate realistic nude images of women simply by feeding the program a picture of the intended target wearing clothes. The app is called DeepNude, and it’s the latest example of AI-generated deepfakes being used to create compromising images of unsuspecting women. The software was first spotted by Motherboard’s Samantha Cole, and is available to download free for Windows, with a premium version that offers better-resolution output images available for $99. The fake nudes aren’t perfect, but they could easily be mistaken for the real thing. Both the free and premium versions of the app add watermarks to the AI-generated nudes that clearly identify them as “fake.” But in the images created by Motherboard, this watermark is easy to remove. (We were unable to test t...