Used Nest cameras had bug that let previous owners peer into homes
Google says a fix has been found, and the update is being applied automatically.
Google says it has fixed an issue that allowed previous owners of Nest security cameras to continue viewing a feed from the device, even after de-registering it from their account. The bug meant that someone who sold their camera could potentially keep watching through it, and the new owner would have had no indication that a stranger was able to look inside their home.
“We were recently made aware of an issue affecting some Nest cameras connected to third-party partner services via Works with Nest,” a Google spokesperson told The Verge. “We’ve since rolled out a fix for this issue that will update automatically, so if you own a Nest camera, there’s no need to take any action.”
The issue was related to Nest’s integration with Wink, a third-party smart home hub that connects to the cameras through the Works with Nest program. Although de-registering a Nest cam from your account stops you from viewing it in Nest’s own app, a user in a Wink Facebook group discovered that they could still see a feed through Wink’s third-party app.
Wirecutter was later able to verify the problem, which allowed it to view still images from the camera. Because the camera had been de-registered from its old Nest account, a new owner could sign up for a new Nest account with no indication that the device was still associated with its previous owner. Wirecutter verified that the problem affected the Nest Cam Indoor, but it’s unclear whether the company’s other connected cameras were also impacted.
It’s telling that the bug appeared through Google’s Works with Nest program, which the company announced it was discontinuing last month. At the time, Google said it was ending the program in the name of privacy, to stop third-party devices from having as much access to data captured by Nest products. Now that we’ve seen the extent of this data sharing, it’s hard to blame the company. Works with Nest was originally due to shut down on August 31st, but Google later clarified that customers will be able to keep using existing services and connections until they’re replicated in the new Works with Google Assistant program.
This is the second major privacy scandal suffered by Google’s Nest division this year. Back in February it emerged that the Nest Secure home security system included an on-device microphone, which the company had failed to disclose when it was originally released.
Although Google claims the issue has now been resolved, the process of buying a pre-owned Nest camera can still be complicated. If a previous owner hasn’t de-registered the camera from their account, then the only advice Google’s support page has is to email the previous owner directly to ask them to remove the device. Still, at least you’d be aware that there’s a problem in that case, unlike this more recent oversight.
Source: Jon Porter (@JonPorty), The Verge.