Reminds me of the time I bought lunch at work, and a colleague told me exactly what I bought and how much I paid for it. I called him out and said it was a lucky guess, and then he proceeded to tell me my entire payment history for the past 2 days.
Turns out when I was buying lunch, he was on the phone with a friend who worked at Paytm and that guy gave away my transaction history for shits and giggles.
My trust in private companies has been at its lowest since then, and I absolutely do not trust startups to keep my data safe.
What you must understand is that this is human nature, not "private companies."
When I was in high school I had a friend who worked at one of those 1-hr photo processing places. People would bring their film in to have prints made. And there was no small number of "intimate" photos on those rolls of film. Yes, even in the days of film cameras, people took photos of themselves in sexual situations.
Of course my friend thought it was hilarious and the shop would make extra prints of these photos to pass around among the staff. They had separate categories similar to what you'd see on any porn site. Of course it was in violation of policy but people do this stuff. If you're building something that handles photo/video images you must expect it and build in privacy from the ground up. You cannot rely on your staff to always be on their best behavior.
When I was working for [blank] cellphone carrier we had a competing carrier fire their entire phone repair/support staff in the area because they were keeping a USB hard drive stash of nudes. Tech support staff would pass it from store to store and dump whatever nudes they'd collected from customer phone repairs that week. I don't remember how they got caught.
I had to come down on multiple tech staff at our own store for digging around in photos anytime a hot woman came in with a phone.
On rare occasions we had this one older woman who would ask us to transfer photos every year to her new phone and to "verify personally that every photo had been moved." Of course the majority of the photos would be her naked selfies or what seemed to be swinger parties. I've got a 65-year-old woman wearing only a cowboy hat seared into my brain, because I was the first tech to deal with her kink of having people look through the photos.
Right, so the problem is still private companies not putting in the effort to build in privacy to prevent this.
For physical film it’s hard, but for software you should at the very least record access to personal data and audit it to make sure people actually need it and aren’t abusing whatever permissions they are using to get the data.
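A minimal sketch of what "record access and require a justification" could look like in practice. All names here (the decorator, the log structure, the ticket format) are illustrative, not any real system's API:

```python
import datetime
import functools

ACCESS_LOG = []  # in practice: an append-only audit store, not an in-memory list

def audited(resource):
    """Decorator: any access to personal data must carry a stated reason,
    and every call is recorded for later review."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(employee_id, subject_id, *, reason):
            if not reason:
                raise PermissionError("access requires a justification")
            ACCESS_LOG.append({
                "when": datetime.datetime.utcnow().isoformat(),
                "who": employee_id,
                "what": resource,
                "subject": subject_id,
                "reason": reason,
            })
            return fn(employee_id, subject_id, reason=reason)
        return inner
    return wrap

@audited("transaction_history")
def get_transactions(employee_id, subject_id, *, reason):
    return []  # stand-in for the real lookup

get_transactions("emp42", "user17", reason="support ticket #1234")
```

The point isn't the mechanism itself but that every read leaves a trail someone can later check against actual work items.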
The employees in this case aren't going looking where they aren't supposed to. This isn't an auditing problem like the Google/Facebook/Amazon cases of employees snooping where they weren't supposed to.
In this case employees are tasked with watching the video clips that car owners intentionally shared with them in order to label them (draw boxes around specific objects).
After spending 8 hours a day drawing boxes around fire hydrants, they would joke about the funny things they saw with their coworkers.
This is literally how all human labeled AI training works. You can’t teach an AI to identify crucial features on the road without building the labeled data set to train it with.
Amazon does the same thing with Alexa. Apple does the same thing with Siri. Security camera companies all do the same thing with their motion and object detection software.
They were sharing recordings and screenshots on internal message boards, even making memes of them. It wasn't just funny stuff either; graphic stuff too. That's far from joking around with your coworkers, and even that would be totally inappropriate and worthy of termination imo.
I assume they don’t mean that they are baffled by human nature. The thing that’s sort of unexpected is that, knowing human nature, companies don’t build in safeguards for this kind of shit.
Exactly so. It is repugnant that that level of access was even available to an employee based on nothing but curiosity. Unfortunately, another part of human nature is that most people profiting off you don't give a shit about your privacy. This is why legislation such as HIPAA has to exist.
> Do you really think the people in government care about your privacy?
They definitely don't, but at least there is electoral pressure applied to them from constituents who they represent. In the case of private orgs it is gobble up anything you can and there's no way for regular users to hold them accountable.
If the government gave a shit about privacy in general, Equifax would have been fined into oblivion and its owners jailed for a bit. HIPAA is not evidence that they give a shit; it is legislation without which our healthcare data would be traded as just another commodity.
> If you're building something that handles photo/video images you must expect it and build in privacy from the ground up.
I agree with this as part of a bigger solution. There should also be privacy regulations with serious consequences for negligence or abuse. If private customer data were a liability due to the risk of huge fines from misbehaving employees, companies would collect a heck of a lot less of it.
Right now data collection is almost all upside for the company; there are many ways to use or sell it to make more money. But the costs fall on users, many of whom don't realize just how much they are being spied on.
To go further, it should also go down on someone's permanent record so that they aren't allowed to work in the same field for someone new. This shouldn't be like being a priest where you just get rotated to a new area but get to continue doing what you got in trouble for in the first place.
> What you must understand is that this is human nature
Seems no one is speaking up in defense of humanity at large. What you describe is possibly even "common," but it is not ingrained in all of "human nature." There are many people who are simply incapable of certain transgressions; for some, even the thought doesn't occur. These are possibly rare, but they do exist. What you are describing is the fundamental problem of humanity: we are not a smooth and uniform distribution, and practically every political thought ultimately boils down to this foundational problem of our collective but morally and ethically disjoint coexistence.
Human nature? Sure, but also very much a lack of regulation and of good corporate privacy policies. First, customer data should live in a special, highly logged environment: anyone logging in needs a justification. Then, periodically or whenever something goes wrong, those logs are audited and people are confronted about why they accessed random person X's data for no work-related reason.
Won't make those incidents disappear completely, but it will sure kill off fetching data from a friend of a friend for shits n giggles.
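The audit pass described above can be sketched in a few lines: compare each logged access against the set of open work items and flag anything without a matching justification. The log entries and ticket IDs here are made-up examples:

```python
# Periodic audit sketch: flag any access whose stated reason doesn't
# correspond to an open work item. Field names are illustrative.
access_log = [
    {"who": "emp42", "subject": "user17", "reason": "ticket-1234"},
    {"who": "emp99", "subject": "user17", "reason": ""},
]
open_tickets = {"ticket-1234"}

flagged = [entry for entry in access_log if entry["reason"] not in open_tickets]
for entry in flagged:
    print(f"confront {entry['who']}: accessed {entry['subject']} "
          f"with no work-related justification")
```

Even this naive version would catch the "friend of a friend" lookup, since there is no ticket to hide behind.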
I was at a hackathon once, and they had a startup fair, where various companies had their booths.
One startup had created a student-management system for schools. And the rep was demoing the system. Except with live data. Showing pages of real students with their pictures, home addresses, etc!
So, principles aside, the actions of companies are such that there can be no trust.
I once had a classified ad software hustle, and I needed customer data to debug things. But one of my customers was Canada's largest gay newspaper, and its classifieds were highly sensitive.
So I wrote a routine that obfuscated the database, changing all phone numbers to 555-xxxx, changing all names to random names of fruit (So a customer might become Banana Grapes, 416-555-1234), and a few other changes to hide other possibly identifiable information.
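The original routine ran in a 4GL environment, but the idea translates to a few lines of any language. A rough sketch in Python, with the fruit list and record shape invented for illustration:

```python
import random
import re

FRUITS = ["Banana", "Grape", "Mango", "Kiwi", "Papaya", "Lychee"]

def mixed_fruit(record):
    """Return an obfuscated copy of one customer record:
    names become random fruit, phone exchanges become 555."""
    rng = random.Random()
    # e.g. "Banana Grapes" -- first fruit plus a pluralized second fruit
    fake_name = f"{rng.choice(FRUITS)} {rng.choice(FRUITS)}s"
    # keep the area code and last four digits, swap the exchange for 555
    fake_phone = re.sub(r"(\d{3})-(\d{3})-(\d{4})",
                        lambda m: f"{m.group(1)}-555-{m.group(3)}",
                        record["phone"])
    return {**record, "name": fake_name, "phone": fake_phone}

print(mixed_fruit({"name": "Jane Doe", "phone": "416-123-4567", "ad_id": 7}))
```

Run over every row, this leaves the data shaped realistically enough to debug against while removing anything that points back at a real person.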
I had a menu item to do that to the database, it was under a "developer" menu that only appeared when I was personally signed in as the only superuser. I am embarrassed to say the menu item was called "mixed fruit."
One day, I was signed in at the client's office, and the manager came by and wanted to do something or other. I gave him the mouse and keyboard without logging out and having him log back in as himself. He did his job, then noticed the developer menu. "What's this," he murmured, and selected "mixed fruit."
No confirmation dialog, no warning: it began munging the live, production database as I watched in horror. I managed to get everything sorted and the production data restored, but I learned a few lessons that day about building super-features for myself that were extremely sharp and difficult to undo.
Personal ads in those days did not usually have real names or addresses in them, regardless of sexual orientation. Some had personal phone numbers, but in those days you could also pay a few bucks more and rent a voicemail number just to put in the ad, or a "mailbox" for written correspondence. Classified ad software in those days had all kinds of features for supporting pseudo-anonymity and handling the pricing correctly.
So yes of course the ads were public, you could pick up a free copy of the newspaper at any gay bar on Toronto's Church Street. But the details about the customers placing ads were extremely sensitive.
———
Oh man, the stories I heard... Many extremely sad to me, when I think about why people felt the need to be closeted and so on.
Someone might give out their phone number and a fake name, but on the back end the software might need to know their real name and address for billing purposes.
Why on earth did you create that in the first place, and put it in the menu?
How often do you need to trash production (which should be never -- copying and scrambling is plenty bad enough) that you need a menu shortcut for it?!?
These were the days before you could just ssh and/or vpn into a client's production system, so I would copy their db onto my Mac, run the mongler on it, and then I would take the database home with me, without worrying whether the theft of my Mac could lead to lives being ruined.
If you want to point out that I could have been even smarter thirty-plus years ago, the line forms to the left... Stretches down the block... Goes around the corner... &c.
But while my memory is imperfect, I believe that I hardcoded that feature to only work when running on my personal Mac SE/30, and also changed the feature name to use a tossed-salad metaphor.
Let's just say that while my library of best practices has grown over the ensuing decades, my collection of anecdotes about worst practices has grown even faster :-D
Sorry, that code is strictly worse. Because while the website on the laptop looks fine, there is a database on the laptop which has all the sensitive information. Anyone with access to the laptop just makes a backup of that database and the damage is done.
It's hard for me to say whether you are correct or not; I was under the impression that the data was updated on the DB instance, and in that case I stand by what I said (the prod data would end up on the laptop, but then as soon as the command was run, it would get permanently obfuscated). If the data was just obfuscated on screen, then yes, perhaps you are correct.
Something I have seen done before that I thought was good was that every so often a production database was synced, but during the sync process, things that could identify a customer were redacted or obfuscated. Then almost anyone who had need to work with production data would use the munged version.
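That sync-and-scrub pattern can be sketched compactly. This uses sqlite3 purely for illustration, and the table schema and column choices are made up; the key move is that redaction happens during the copy, so the scrubbed copy is the only one most people ever touch:

```python
import sqlite3

def scrub_into(src, dst):
    """Copy the customers table from src into dst, redacting
    identifying columns on the way (schema is illustrative)."""
    dst.execute("CREATE TABLE IF NOT EXISTS customers "
                "(id INTEGER, name TEXT, phone TEXT, plan TEXT)")
    dst.execute("DELETE FROM customers")
    for cid, _name, _phone, plan in src.execute(
            "SELECT id, name, phone, plan FROM customers"):
        # keep what debugging needs (id, plan); drop what identifies people
        dst.execute("INSERT INTO customers VALUES (?, ?, ?, ?)",
                    (cid, f"customer-{cid}", "555-0000", plan))
    dst.commit()
```

A nightly job like this keeps the munged copy fresh enough for day-to-day work without anyone needing routine production access.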
Not that it matters greatly, but FYI this app ran on a product called 4th Dimension, a 4GL app-building environment running a proprietary database on Macintosh desktop systems. They also had a client-server version that ran on local networks. There was a runtime engine, or for big money you could buy a compiler that would build standalone executables.
They eventually pivoted to supporting the web (and PHP!), but this story predates all of that.
4th Dimension was originally called "Silver Surfer," and it is the centerpiece of a story Guy Kawasaki used to tell about how big companies work. Apple was trying to get everyone to write software for the new Macintosh, so they gave free hardware and engineering support to Ashton-Tate to port its popular PC database to Mac, &c.
Meanwhile, indie companies like Aldus (who would go on to literally rescue Apple when they released PageMaker) and ACIUS (A company Guy formed in partnership with the original developer to distribute 4th Dimension) shipped software that people actually bought and actually used.
Apple would have done better if they'd told people like Ashton-Tate to "Ship or GTFO," but the history of technology companies is one of people skating to where the puck has been...
> ...I would copy their db onto my Mac, run the mongler on it, and then I would take the database home with me, without worrying whether the theft of my Mac could lead to lives being ruined.
From that, it was definitely run against a copy of the database.
The code that you showed looks like PHP display code. So it will change what is shown to the user, but unless showMixedFruit is horribly misnamed, will not change what is in the database. And therefore does not address the problem that the mongler was trying to solve.
That was implicitly intended to be part of menu template code, so it would show the “Mixed Fruit” option only if it wasn’t prod, and as I tried to suggest in my comment, such code would not execute on prod (the endpoint/backend code would abort if it found itself running in prod). (And the mixed fruit code would, in fact, affect data in the DB). Tbh I think this discussion is less than pointless.
I had a background check company that a previous employer used email me to say:
"I just wanted to verify that this is the correct email address for you, and that the following info is correct:" and proceeded to list my full name including middle initial, phone number, date of birth, and full SSN.
Mind blown on a couple of levels. Was tempted to reply back "I have no idea who this is - this isn't the correct email".
They also found my girlfriend on social media, got her phone number and called her to "verify my identity".
To be clear, this was a run of the mill SRE position, not any type of background check for a clearance.
Instead, I told my new employer, and they were as livid as I was, profusely apologetic to me, and fired that background check company that day.
If they are Dutch that would be illegal under Dutch law. We have very strict laws around private investigation and anybody doing that without the proper license would be in a lot of trouble.
The United States has similar laws, where licensed Private Investigators have to at least supervise investigations (it's a bit like having a licensed pharmacist on duty while less-trained people actually hand out the meds, or having a medical doctor supervise the work of a group of physician's assistants).
There are some exceptions, but most of the time you need a license to go snooping around. These PI licenses are also a major perk of law enforcement careers, as it's sometimes difficult to get the required work experience without time in law enforcement (unless you don't mind working for nearly-nothing under a PI taking photos of insurance & workers compensation cheats, unfaithful spouses, etc. for years).
That background check company was almost certainly operating within the bounds of the law. They were just doing it Really Poorly.
I've also seen the pendulum swing in the opposite direction. There was a demo for hospital software that was obviously using fake data. It was something like "Patient Name: Mickey Mouse SSN:123456789"
An overzealous IT security manager literally covered the demo with his body until it was taken down. All under the pretense that "You never know, that could be someone's real information."
Someone should have walked up with their phone camera and shouted "ooh hey, nice addresses!" And just started snapping away (or at least pretending to)
Yeah. I would seriously have considered asking "did these students consent to the public dissemination of their personal data?" This kind of behavior will only go away when it becomes a shameful act in the eyes of the public.
I would not be surprised if their resolution to someone taking pictures in order to show them the error in their ways would be for them to call to police and have you arrested.
My awakening was the multiple times I've been talking with a startup and said, "I was surprised there's no self-guided demo or video tour available on your website; can you show me how the product works?" and had them reply something like, "Oh, sure. The reason we don't have a demo yet is that we haven't gotten around to making fake data, but let me pull up one of our customers' accounts and give you a tour using their data. Try not to read anything."
If you build robust privacy mechanisms into the fabric of your startup from the beginning, your risk management for these kinds of cases can be resourced to scale with customers' expectations of the system you build.
Unfortunately, if you do that, you are going to be outcompeted by the teams that are working to get their first 10,000 paying customers by any means necessary, because privacy planning is less capital efficient.
The companies that do get big enough to overcome their immediate survival constraints often have a harder time identifying and resourcing privacy assurance, because it's less on the minds of the people making resource decisions, who have other operational scaling issues front of mind.
Your engineers and support staff doing dumb things with your data is a risk you can have resources allocated to. But it's not on the critical path to market dominance so it shouldn't be expected to be a priority.
Sounds like it should be illegal to share data by default, and that individuals shouldn't be able to sign away their expectation of privacy as part of a EULA.
I'm so glad I've never heard of that service and have no idea what it is.
Thanks very much for calling them out by name, BTW. Presumably someone from that company is reading this as we speak - and soon enough, will be reporting back to us that that employee has been identified, and of course, duly fired.
Indian peer-to-peer payments app. Generally considered one of the higher-quality made-in-India applications, and very, very widely used. Nobody will be IDed or fired until the same thing happens to a celebrity or government official.
It's because you don't live in India. Not sure if this is still the case, but Paytm was the Venmo of India, and actually had more penetration than credit cards when I visited ~5-6 years ago.
I don't think it's just private companies. It's really anything that holds data, I think.
I used to work at the largest telco in the country on a software project (as a consultant) that involved some integration with existing services. With some playing around it soon became clear that all services were wide open as long as you were on the internal network; you just had to know what they were and how to call them. No authentication required, and no audit logs as far as I could determine.
I didn't poke at it too much, but I was able to at least read an arbitrary cell phone's text messages and call logs.
> My trust in private companies has been at it's lowest since then and I absolutely do not trust startups to keep my data safe.
Hate to be the bearer of bad news, but governments aren’t any better. Local government in particular is usually an IT security nightmare.
There was a local government in a state I used to reside in that required folks to have an “alarm license” for their home and fined people for false alarm police callouts. The form to apply required you to give an alarm code for the police, and of course your name, address, and phone number.
Predictably, the database of information was ineffectively secured and basically public on the Internet for years before it was fixed. I don’t recall any burglaries or home invasions happening due to it, but still rather asinine.
At this point in my life I have basically no faith in any institution in society and treat all information I give out as effectively compromised immediately.
You are right, I’ve worked at a bunch of companies with millions of users and access to data was almost out of control within the company. Management doesn’t really care, including the data protection officers and the likes. I only hope Google has better policies, with all the data we store in emails and in docs.
Frankly, everyone is a risk to some extent, which is why it’s handy to just not give someone things that can be a problem if exposed. What they don’t know/don’t have, can’t be used against you.
But someone with minimal direct criminal/financial risks of exposing something is definitely higher risk than others, and that is most startups.
That said Amazon reps have clearly been bought out before, and individuals within most large corps have always been viable targets of blackmail, bribery, coercion, etc.
It’s why some societies are so resistant to phasing out Cash. Anything else gives leverage to folks that historically it’s been a bad idea to give leverage to.
At a fertility clinic they had paperwork that asked my employer, social security number etc. I asked if this was necessary, as I was paying out of pocket. They firmly said yes and asked me to fill it out.
Guess who works for $State Fertility Group, with a social 111-11-1111 and makes 100M/year.
I believe that your assessment of startups is valid. I also think your views are true of big, multinationals as well. It seems that by the time a company gets large enough, consumers start receiving some protections, but workers, not so much.
> Tesla is neither a private company, nor a startup.
The parent implies 'private' in the sense of non-governmental entity (the Indian terminology), and by that metric both Tesla and Paytm are 'private' (and publicly traded)
That's horrific. I work for a fairly large payment processor and it takes multiple levels of approval and oversight from several levels of management to get time limited access to a production database or client interface - all access being logged and ticketed along the way. The idea that someone could just start looking up transactions for shits and giggles would be unheard of.
> The idea that someone could just start looking up transactions for shits and giggles would be unheard of.
What about the DBA maintaining the database? Do they not have query access to the data? How about the devs who are responsible for reporting; do they develop reports using generated test data? It’s naive to believe that data is entirely secure and private. There’s always a level of trust required from employees to not share private data that they may see on the job.
I fully expect this in India. There is no privacy in India, and no education or awareness of it either. There are no laws, and so no expectations either. I don't know if it's still prevalent, but I know friends who had folks from some company knock on their doors saying they would take a sample, do free blood tests, and add the results to their website to track as a service. This was pre-COVID.
This is really disappointing. HDFC wouldn't ever do this to you if you used your debit card for transactions (I realize this isn't feasible for every vendor, I just mean conceptually). Now I have to wonder why Amex was banned for so long for not localizing data. The other payment apps with localized data aren't really doing that much to protect it!
Well, that's India, though. Whether it's a private or a public company there's no real notion of privacy of data. If you want it and you have a friend with access, he'll give it to you. Your job is to then not get caught.
When it comes to security, you should always assume the worst intentions: if you can think of it as possible, somebody is probably doing it. This is why nobody trusted the NSA, and they were proven right by the Snowden revelations.
The fact that PayTM doesn't guard sensitive personal data even when it's a local tech behemoth makes me not want to use digital payments and switch to cash.
Why are you advocating sensitivity to the giant corporation here? If there's a defense, it's "we don't allow random employees to access these records, and anytime an access is performed we audit it, including interviewing the responsible employee and reviewing call recordings. If the access was for anything other than an approved reason, we fire the employee."
I don’t recall the source, it may have just been anecdotes online that aren’t easily verifiable, but even after Facebook went legit as a company I’m pretty sure access to data (who’s looking at whose profile, stuff like that) was marketed as an employee perk.
Not a perk: if you look at the personal info of anyone in your social circle, you are fired, no matter who you are, for whatever reason. If you need to fix a bug and need to look at personal info, it is logged and reviewed; go snooping, you are fired. Use your friend as the test case for that bug, you are fired. In fact, when I worked there, I was so paranoid I'd accidentally look at personal info and get fired that I rigorously used test data.
For additional sense of timescale - I’ve been there for coming up to a decade, and day 1 of employee onboarding was “you will be fired immediately if you even try to do <list of activities>” (even looking up your own data at the database level is forbidden, because eg foreign keys could imply a list of who has blocked you, which is not visible in the web frontend)
I’ve never dared test the auditing of the user-data-logs, but I have tested the auditing of the network-logs — when I tried ssh’ing from my work laptop to my personal web server (so that I could run an IRC client there), it took seconds for the security team to react ^^;
I think Zuckerberg is a sociopath but this chat log often gets dredged up and it’s never been verified. It could be real but it could be an urban legend or an internet echo or the reincarnation of tubgirl.
Short of Zuck confessing to the authenticity of those messages, what sort of verification would you even expect? If Business Insider was libeling him, why didn't he sue them for it?
The quote is consistent with his behavior and attitudes since then, and with other accounts of his behavior and attitudes at the time. I don't see any reason to believe it's a fake quote. People on HN get upset when the quote is posted because it's posted so frequently, and the reason it's posted so frequently is because it continues to be relevant to and consistent with Zuckerberg's behavior and attitudes to this day.
> Short of Zuck confessing to the authenticity of those messages, what sort of verification would you even expect? If Business Insider was libeling him, why didn't he sue them for it?
Or because it wasn't libel. There is no evidence that it was libel, the man quoted has never said it was libel, so why are you laboring to defend this asshole billionaire who won't even defend himself? Cease your insipid simping.
I don't like Zuckerberg and I think the quote is probably real.
Dismissing someone as a simp is worse than ad hominem because, while an ad hominem criticizes a trait that someone actually has, anyone who disagrees with you on this issue can be labeled a simp. Get over yourself; people can disagree with you for reasons other than being blinded by irrationality.
I think it’s more disturbing that Zuckerberg named his company Meta in reference to a fictional dystopia where an antagonistic billionaire monopolist who runs a V.R. universe tries to literally control the minds of the world population both on and offline to become more powerful than world governments. Hubris is arrogance before the gods but what do you call the same before the devils?
TL;DR: there is factual stuff Zuck does right in the open and is worse than a sketchy internet quote.
> named his company Meta in reference to a fictional dystopia
Meta is a normal developer word - like Uber was. I personally never even knew there was a movie called Meta, although it is entirely obvious there should be at least one. Not saying Zuck wasn’t referencing the movie, but I am saying it is reasonably likely that wasn’t the reason.
Uber was a normal developer word?? I don't remember hearing it used in developer circles.
As to meta, yes, it's a common dev word. But Facebook is using it as short for "metaverse", which Zuck has admitted to lifting from Snow Crash's dystopian world.
Uber is a standard German word, which was often used in English as internet/gaming slang where you’d today use something like “hyper”, eg “uber awesome” or sometimes in video games “I am uber” (= “I am the best”).
> TL;DR: there is factual stuff Zuck does right in the open and is worse than a sketchy internet quote.
All that shit lends credence to the veracity of the quote (which comes from Business Insider, not the amorphous "internet"). A quote which Zuckerberg has never even bothered to deny.
> it could be an urban legend or an internet echo or the reincarnation of tubgirl.
I feel conflicted whenever I see a comment like this.
On the one hand, let's assume it's true: a Paytm employee acted negligently.
But on the other hand, what if it's not true? What if you happen to have a friend or family member who works for a Paytm competitor, or you have some grudge against Paytm for whatever reason, and are instead spreading low-key FUD about the company to make it seem like they have lax data controls and staff disregard for sensitive data?
The issue is that there doesn't really seem to be a way to substantiate your anecdote.
Let's assume it's true: a Paytm employee acted negligently.
Not negligently - maliciously.
The employee knew exactly what they were doing, that it was "wrong" in any conventional sense, and that it was most likely a huge liability to their career and reputation if it got found out.
Most people can't handle it as a career, and it has low barriers to entry, so many people do it as an early job. I have met several 10x call center people, and it can be an incredibly lucrative career. It's effectively low-level social engineering. It requires extraordinary levels of grit.
From personal experience, people will do anything they are physically capable of doing and think they can get away with. Almost nobody I know has the slightest amount of respect for any private data to which they have access. This extends from people in healthcare breaking HIPAA to tell me about how Jane Doe is an idiot who got a mayo jar stuck in her vagina to IT workers showing me John Doe's cringey nude selfies. Trust absolutely no one. If it's possible, it's happening. The goal should be able to make it not possible to the best extent and when it is, create accountability.
> Almost nobody I know has the slightest amount of respect for any private data to which they have access.
Really? You need to run with a better set of people. It's true that there are plenty of corrupt, terrible people out there -- but it's also true that there are plenty who aren't.
This is what makes the lie more potent. It’s based on a kernel of truth, and because it reinforces beliefs, you can easily believe a Paytm employee acted negligently, with no more evidence than an anecdote.
Oh please. The comment was less about PayTM and more about tech companies being blasé about data privacy in general.
If I had a friend or family member who was an employee of such a publicly facing tech company, I’d be grilling them about their data security and privacy practices. I’ve been burned enough times by Indian companies so ridiculously free with their data sharing that I’ve stopped giving out my contact info to everything but the most essential of services.
Most Indians will lean towards believing the GP because they know how aggressively their personal data is being abused, unless Paytm comes out with concrete details of how they protect privacy inside and outside the firm.
I didn’t even realize Paytm was a real company when reading OP. It sounded like a generic name made up for purpose, like “Jane Doe” for payment companies.
This is why rule of law is important. India has weak rule of law... there's no confidence from anyone that wrongdoing will be punished and there's no confidence that making up stuff to hurt a competitor will be punished.
Given what I've seen I have absolutely no problem believing this. If you don't then that's fine but that simply means you've been living a sheltered life. Have a look at the GDPR enforcement tables for some choice violations.
So you trust a for-profit more than an anecdote by a customer of theirs? I am sure you'd also forcefully vaccinate your loved ones if $authority told you to do so, right?
In my experience, everything bad you can imagine, a for-profit has already done.
Wanted to get Model Y in the past. Nice comfy car, huge backseat and… a camera filming inside the car. Thanks no. Apparently for drowsiness detection, but I am not buying it. Time of flight camera staring at the driver’s face is kinda normal in car industry. Not a normal color camera filming everything and streaming somewhere like Tesla does.
> Time of flight camera staring at the driver’s face is kinda normal in car industry.
Is this really becoming standard? If so, that really bums me out.
I considered buying a Subaru for my latest car but one of the big negatives was an internal camera pointed at the driver that couldn't be disabled. It was, at least in part, for driver attention monitoring but performed horribly. It would ding constantly, even when I was staring straight ahead.
I have no faith that these video feeds will be kept private and really, really don't want to have to worry that any awkward or embarrassing thing I've ever done in a car could be released to the world.
The most recent FSD beta update from a few days ago finally started using the interior camera to track driver attentiveness, according to the release notes.
They're likely referring to a camera that can see depth, like an Xbox Kinect sensor, but no car I know of has such tech. Some manufacturers like GM use IR cameras looking at the driver because they can see through sunglasses.
From 7 July 2024 these rules will apply in the EU:
For all road vehicles (i.e. cars, vans, trucks and buses): intelligent speed assistance, reversing detection with camera or sensors, attention warning in case of driver drowsiness or distraction, event data recorders as well as an emergency stop signal;
Apparently drowsy drivers account for about 100,000 crashes, 71,000 injuries and 1,550 fatalities each year in the US alone, and around 11,600 people died due to drunk drivers in the US in 2020. So to say that nobody was asking for it isn't really true; I think a substantial share of the families of those killed and injured would prefer that cars contained some kind of obstacle for drunk or tired drivers.
Cameras can detect both; being drunk, drowsy, or distracted shares a lot of symptoms regarding eye movement, etc. The important thing isn't why a driver is impaired, but detecting when they are.
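For what it's worth, the eye-movement signal these monitoring systems commonly key on can be summarized with a metric like PERCLOS (percentage of eyelid closure over a time window). A toy sketch of the idea, with made-up threshold and window values:

```python
from collections import deque

def perclos(eye_openness, threshold=0.2, window=90):
    """PERCLOS: fraction of recent frames in which the eye is mostly closed.

    eye_openness: iterable of per-frame openness values in [0, 1]
                  (0 = fully closed, 1 = fully open).
    Returns the rolling PERCLOS value over the last `window` frames.
    """
    recent = deque(maxlen=window)  # only keeps the most recent frames
    for openness in eye_openness:
        recent.append(1 if openness < threshold else 0)
    return sum(recent) / len(recent)

# A driver whose eyes are closed in 30 of the last 90 frames:
frames = [0.05] * 30 + [0.9] * 60
score = perclos(frames)  # 30/90, i.e. one third of frames "closed"
```

A real system would alert when this score stays above some calibrated cutoff; the point is that the metric needs only eyelid geometry per frame, not a stored video feed.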
Is there any concern of additional eye fatigue w/ time-of-flight sensors? It's purely anecdotal, but I was on a project that involved ToF sensors and had a couple on my desk pointed at me. After some time, I noticed my eyes hurt by the time I got home. The eye pain went away after I stopped having the ToF sensors pointed at my face all day.
News to me. I thought that ToF was well outside the human visual spectrum, like 850-950nm or so, and at really low power ratings. I've never heard of anyone fatiguing from it.
> I thought that ToF was well outside of human spectrum
While I cannot give an exhaustive answer, I think it is safe to at least mention that you don't need to be able to perceive something for it to have an impact. You don't perceive even a fraction of what impacts your body, from radiation to chemicals. The eye's molecules still react to lots of things that are not part of what the eye is supposed to sense. It's not like everything outside the human visual spectrum magically has zero impact, or that you, or the eye's components, are immune.
Example: UV light.
There also seems to be evidence of flickering lights, causing fatigue, even if you cannot see the flickering.
> I’ve never heard of anyone fatiguing from it.
What did you do to find out? If you didn't research it, there would only be random chance for you to find out, accidentally reading a report or being told, plus paying attention and not filtering it out like what we do with the vast majority of what reaches us.
Why not buy laptops that don't have cameras? They exist, and doing that would at least help to add economic incentive for the continued production of them.
That's almost certainly true. I wasn't asserting that purchasing laptops without cameras would make Apple (or anyone else) suddenly get off the camera train. I was asserting that if you (for instance) buy a Clevo because it lacks a built-in camera, that will encourage Clevo to keep making them.
I don't know the world of Apple machines, but at least outside of Appleworld there are several cameraless options out there from major manufacturers. I assume they're sold mostly to businesses, because I've worked at a few companies who only allow machines that don't have cameras and microphones built in.
From a purely academic perspective, self-driving cars could probably benefit from eye-tracking data as humans drive. Although I agree with you on the privacy aspect: I would need proof that the data is only being used for certain things to feel comfortable with it.
The camera feeds are processed entirely locally in the car and not streamed anywhere. You can optionally opt in to share certain data with Tesla if you want to, but it's not required.
Get a German car. Privacy is taken extremely seriously here in Germany. In all walks of life. Companies, schools, doctor‘s offices, etc.
Edit:
I wouldn’t want to touch a Tesla even if it was gifted to me. Seeing the founder behave the way he does (latest incident: Changing Twitter’s logo to a shitcoins image) makes him extremely unsympathetic to me.
Probably nothing? Because they're pretty good at cheating the rules so they don't get caught. Occasionally they get caught, but you know how it is: if you find one cockroach in your food, that probably wasn't the only cockroach in the restaurant.
As far as I can tell, German companies are very good at keeping their shenanigans under wraps because they know that being caught has terrible consequences. Occasionally they slip up and then you see what a complex conspiracy they've been operating.
German regulators are notoriously understaffed, see 'Wirecard'. I would not be surprised if there were no consequences at all until it hit the press or someone in a position of power was embarrassed.
Missing Wirecard seems to be a product of arrogance and incompetence. Under-staffing could have caused the regulator to overlook the fraud, but the continued obstinacy of the regulators in the face of overwhelming evidence was incompetence. Attempting to prosecute the journalists revealing the fraud was incompetent arrogance.
> I would not be surprised if there were no consequences at all until it hit the press or someone in a position of power was embarrassed.
At which point the Data Protection Authorities (GDPR enforcement) would get involved. They would almost certainly levy a hefty fine against the auto manufacturer.
Given similar circumstances to what? Having a camera that is off by default and where it is up to the user whether to turn it on or not? What exactly is the problem?
Mercedes was my car of choice for decades. Then a late model one decided to phantom brake twice and put me in danger. Never again. That's the kind of thing I would expect from brands with a different reputation. That doesn't say they are playing fast and loose with privacy, and if they do not then that's a point in their favor. But their software absolutely sucks and if that passed QA then who knows what else is going on.
Also: working in R&D doesn't necessarily give you insight into what operations people are up to and what kind of access they have to sensitive data. I've looked at plenty of companies in Germany and not all of them are lily white in this respect. MB may well be one of the better ones, I haven't looked at them in particular. But they use an awful lot of components from companies that have a less good reputation.
Eberhard sued Musk challenging his right to call himself a founder and the case resolved such that Musk gets to keep calling himself a founder. Musk was a major source of funding early on, putting in about 6 million out of a 7 million dollar raise, served as chairman of the board, headed product design for the Roadster, and took over as CEO before they launched the Roadster. Tesla publicly acknowledges him as a founder.
The founder title is one that a company gives to individuals crucial in the early formation of the company. Musk clearly qualifies.
I'm saying all this because I think it's very lame that Musk critics try to say he's not a founder of Tesla. Hate Musk all you want, just use real criticism and not this sophomoric "he's not a Tesla founder" stuff.
You're allowing your personal feelings for Musk to interfere with facts. Musk provided money, design, and leadership at critical early stages of the company. He's obviously a founder and you are simply wrong for disputing that.
It sounds like he was an influential early investor, but the situation you've described isn't what I would typically think of as "founding" a company. I'm surely not alone in this, thus the present debate. This isn't to diminish Musk's huge influence either; but folks tend not to love when the meanings of words are changed for PR.
Attacking other users like this on HN is completely unacceptable. If you do it again we will ban you.
I'm not going to ban you right now because when I looked through your recent history I didn't see a lot of other comments breaking the site guidelines. That is the difference between your account and the other account, which I did ban.*
Please avoid flamewar comments in the future generally, as well. It's not what this site is for, and destroys what it is for. We've had to warn you about this at least once before.
* Lest anyone is worried, that difference has nothing to do with agreeing more with one user than the other—we couldn't care less about this flamewar topic. It has only to do with the difference in comment histories.
Musk was not a founder. He might have been an early investor but he didn't found the company. Stop lying with your fucking propaganda. I hope Musk bought you a pony for your bullshit. You are a musk shill. Stop trying to rewrite history you shill.
You can't attack fellow users like this on HN. I've banned the account.
No one is saying you owe $billionaire better but you certainly owe this community much better if you're participating here.
If you don't want to be banned, you're welcome to email hn@ycombinator.com and give us reason to believe that you'll follow the rules in the future. They're here: https://news.ycombinator.com/newsguidelines.html.
Musk is a founder in literally every sense of the word. He joined the company early, was a key investor, designed early products, and had a major leadership role (chairman of the board and then CEO) - from the start. In addition, the matter has been adjudicated in a court of law - Musk is a founder. The company Tesla recognizes him as a founder.
Against the overwhelming and undisputed evidence and common sense you have only childish insults. You don't even try to make your case. Sad, but at least you know you're wrong.
Is that true for cars made for markets that are not Germany? It's not that German car makers are inherently more ethical so I'd assume they do whatever they can get away with in whatever jurisdictions they sell their cars.
The issue with Tesla is _presumably_ not that they decided to make this available to employees out of sheer evil, but that they made it available through negligence or incompetence. If they went into it thinking “this has to comply with (the rather strict German interpretation of) the GDPR”, this would be less likely to happen.
The coin was created as a joke. However, Dogecoin doesn't have any characteristics of a fake cryptocurrency. The technology is just as sound as Bitcoin's, as it's basically a fork of a fork of a fork with no crazy major changes. Logically, from a technical perspective, you can't think Dogecoin is a joke without also thinking Bitcoin is a joke, unless you disagree with the way Dogecoin changed the block parameters. Bashing it as a "meme" is orthogonal to any technical argument you could make about it.
The value of crypto coins doesn't come from technology. It comes from faith in the technology. Inherently value is a social thing, and hence some of what underlies it is social too.
Big part of that is the stated intent of a cryptocurrency. People can't rally around a joke about dogs as the best way to do money. People might rally around the first, most serious, and rather conservatively developed cryptocurrency though. Or perhaps the one that best allows computation on the blockchain, or the one that has the best track record of preserving privacy.
Or buy an old car. Of course there are tradeoffs with efficiency and safety, but I like having a car that's easy to work on and features no telemetry or smart features.
> Yes, to the point nothing can get done anymore and e. g. having an outdated, non-digitalized healthcare system.
The system may be slow and not very digital, but it is very much not outdated. As someone who has seen a friend go through a terminal illness and has researched quite a lot about cancer treatments, the care you can get in Germany is top notch. And all that for free (except for a hospital bed charge, about €11 if I remember correctly, that you sometimes have to pay if you stay longer than a set period).
Yes, things can be improved and privacy can be a hurdle, but it is not quite as bleak as HN commenters make it out to be. I would take it over having everything be top-notch digital without any regard for privacy, like in the US (which often has very poor or non-existent non-digital solutions).
There was only one "Dieselgate company", Volkswagen Group. BMW and Mercedes-Benz had nothing to do with it. And yes, I would rather buy a car that cheats on emissions tests than a car which spies on me. That's an easy choice (or would be, if I had any real confidence that VW cars truly don't spy on their users too. Which I don't.)
Can you elaborate why German products would be more privacy focused?
As a counterpoint, BMW started putting features behind premium subscriptions a couple of years ago, such as the heated steering wheel. Such microtransactions are really hard to combine with privacy.
Except you had to register in an app every place you went to, during covid. Nice social profiling that data could provide… say, if another type of party ended up as the ruling one.
The correlation is integrity and honesty. Do you really think these German car companies that lied, cheated and defrauded their customers care about the privacy of your data? That's like asking a criminal if they care about your health.
Please don't take HN threads further into flamewar. This thread was bad enough already but a comment like this is an obvious degradation. We want the opposite here.
The things being discussed in these threads are valid criticism. The cars have deep flaws and the corporation has an expectation that exuberant owners will continue to do free marketing and PR for them.
Ban away if you need to but I'll continue to level valid critiques at a tech company selling an $80,000 luxury sedan without a spare tire.
"Snark or flamebait" as determined by tech bros that own the car?
On any given day there are 30 fresh topics getting lambasted by the terminally online nerds that populate this site but the second someone brings up Tesla-- Mods show up to enforce "civility" and pass out bans to anybody with 'wrong' opinions.
Hacker News being pointlessly negative is such a meme that people blackhole links coming from HN just to avoid the "free condescending advice", seems fitting these same reply guys can't handle the lightest of criticisms against their god-king or his ewaste cars.
We get called "the orange site" because name dropping Hacker News on social media causes a swarm of scavengers to attack you like a dead fish hitting the ocean floor.
I understand you have a difficult job making sense of the above + trying to prevent it from getting worse but genuinely this interaction leaves me feeling like getting my main account banned from HN would be a badge of honor.
If you want to make this kind of argument, you need a stronger basis than "So much worse than that, because they go online and simp for the company after doing it". I don't think it's a close call to refer to that as snark or flamebait.
> the second someone brings up Tesla-- Mods show up to enforce "civility" and pass out bans to anybody with 'wrong' opinions
This is entirely illusory. It feels that way only because you have strong feelings about $topic and are more likely to notice the moderation cases that feel wrong to you (see https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...). In reality, the distribution is more or less random. We have zero interest in moderating Tesla topics or any $topic more than any other, and we don't care what your or anyone else's $opinion is.
Literally the only thing we care about is people posting in the intended spirit of the site: thoughtful, substantive, curious conversation, as outlined in https://news.ycombinator.com/newsguidelines.html.
If you had expressed your opinion in that way, I'd never have replied; and if you had expressed the opposite opinion in the same way, I'd have replied the same way. As a matter of fact I have no clue what your opinion even is.
As for "civility" - we haven't used that word in many years and it doesn't reflect how we think about moderation.
"Stop making fun of my favorite billion dollar company. I just gave them 80k of my hard earned cash, therefore it must be good, and you should also buy it."
Edit: We've had to ask you this many times before. I don't want to ban you but if you keep breaking HN's rules we are going to have to. We've cut you a ton of slack already; it's not infinite, so please fix this.
I don't own a Tesla, and I'm not an apologist. But there are multiple reasons to own the car. And not all of them have to do with saving the planet, cameras, or the glovebox.
Won't someone think of the children? A whole generation of high schoolers deprived of the ability to steal their daddy's luxury sedan and bang their girlfriend in the front seat without the car narcing on them.
Yikes. So turning private images into memes and posting them internally isn't a fireable offence but union talk is? Such impeccable standards for conduct.
The kind of nonsense described in this story is exactly what I'd expect from an early stage software startup back in the early '00s, when I was first getting started (not in SV, mind). It was common to mock people and create memes for any reason (the Basecamp "foreign sounding name" kerfuffle very much reminds me of this).
This kind of culture was unacceptable back then, of course, but the founders and owners were little more than children themselves. It was at least understandable if not excusable.
What completely boggles my mind is that, some 20 years later, this kind of culture is still happening. And it's not happening at some small tech startup founded and run by young men barely out of school - it's happening at a wildly successful car company which has been around for over a decade run by the richest man in the world.
The culture at Tesla must be rotten to the very core for this to persist so long.
Amazon audits the use of the order database. As do banks for deposit information. As does Google for looking up the contents of what's in someone's Drive, etc, etc.
We've come to expect privacy as a first class right. That's why turning a customer's potentially embarrassing action into an internally shared meme feels like an extreme violation.
I’ve worked at three large tech companies and before that did medical research for a bit. Every large tech company treated actual content with respect and tried to prevent unnecessary access, not just through policy, but with tooling. The nature of medical research is that you don’t have the means to build tooling for every little thing, so they compensated by constantly reminding us of the constraints and cultivating a culture of respecting the data (which was anonymized) - nobody would even think to joke about this type of thing.
There’s no excuse for a big company like Tesla to not protect data. And there’s a big difference between turning a blind eye to a culture of abusing access/not working to prevent unnecessary access, and what will likely happen here where a handful of people are sacrificed after the fact because it generated bad publicity.
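The tooling being described here doesn't have to be elaborate. The minimal shape of it is that every read of customer data goes through a wrapper that refuses to run without a recorded actor and a stated reason, so that "just browsing" is impossible by construction. A hypothetical sketch (the function names, fields, and resource labels are all made up for illustration):

```python
import functools
import json
import logging
import time

audit_log = logging.getLogger("audit")

def audited(resource):
    """Decorator: record who accessed which record, when, and why,
    before any customer data is returned."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(employee_id, record_id, reason, *args, **kwargs):
            # A stated reason is a required argument, not an afterthought;
            # calls without one fail loudly instead of going unrecorded.
            audit_log.info(json.dumps({
                "ts": time.time(),
                "employee": employee_id,
                "resource": resource,
                "record": record_id,
                "reason": reason,
            }))
            return fn(employee_id, record_id, reason, *args, **kwargs)
        return inner
    return wrap

@audited("customer_orders")
def get_order(employee_id, record_id, reason):
    # Stand-in for the real lookup.
    return {"order": record_id}

record = get_order("emp-1", "A123", "support ticket #4521")
```

The audit trail only deters anything if someone actually reviews it, of course, which is where the "create accountability" half of the argument comes in.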
> Tesla managers sometimes "would crack down on inappropriate sharing of images on public Mattermost channels since they claimed the practice violated company policy," Reuters wrote.
somewhat implies that this has been a known issue for a while, already. (This shouldn't be triggered exclusively by a public scandal.)
Indeed; policies without consequences for violating them might as well not be policies.
My sister and I used to complain that our parents were more strict than everyone else's. My mom talked to my friends' moms and said "they all have about the same rules we do." My reply was "but they get yelled at for five minutes tops when they break the rules, and we get grounded!"
It's not like the memes were posted internally just today; firing people now is just a reactive response. The problem is that they let it go on the entire time before now.
I work in the industry. It's not a "vehicle" any more - it's a "platform" to "engage" with customers, i.e. sell your data and lock features behind ridiculous monthly fees. That is now the mantra of the automotive industry. It really kills any enthusiasm I have for new cars.
> It really kills any enthusiasm I have for new cars.
... and televisions, and phone apps, and solar inverters (with wifi!) and air purifiers (with wifi!) and drones (with wifi to activate) and scooters (to activate), and on and on...
Consumer tech takes a lot of cues from domestic abusers. “I want to know everything about you and who you talk to.” “I will do what I want to you with little notice or recourse.” “You can’t leave me that easily.”
It seems to me that we are long past the road-building stage of the US economy and now mostly in the toll-booth stage of things. I wonder if we will ever go back. Maybe AI will change that, but I'm not very hopeful.
Wishing I still had my '89 Caprice with physical buttons and switches. Parts were insanely cheap too (except the transmission, which is part of why I got rid of it).
This is why I'm keeping my '09 Toyota. I upgraded the radio to a CarPlay-capable head unit, but otherwise it's the perfect mix of modern safety features and a couple of nice-to-haves like a backup camera, with none of that always-connected OnStar-style crap.
I'll drive it till it rusts out, and I'm praying the market will have corrected by then.
The theory would be to rebuild it with a roll cage, harness, etc. for mixed use. In theory, these safety systems are superior to airbags. The only impediments to helmets/HANS with a real harness are people who don't use the existing seat belt (an airbag can prevent ejection) and that they are too inconvenient for the general public.
"Roll cages are completely unsafe in a street car unless you’re wearing a helmet."
No shit, that's why I mention a helmet.
Roll cages, if properly done, do not affect crumple zones. The passenger compartment should not be crumpling, and that's the only part a roll cage should be in. Keep in mind that actual race cars have a rigid occupant compartment, crash at higher speeds, and have great safety records due in part to the protection offered by a harness, helmet, and HANS. On the other hand, airbags are more prone to failure and can injure or even kill people.
The only reason I mention a roll cage is because it would be for dual use. You could use a helmet/HANS in place of an airbag in a regular street car and would likely be safer. Again, the only reasons we have for not doing that are that some people don't use their seat belts and most people would find it inconvenient.
There are some non-EVs that you can get with limited tech (until the newer laws kick in). You can disable the connection in some vehicles. For example, there are guides on how to disable OnStar to various degrees (removing the antenna, removing the bridge to the network chip, etc). EVs tend to be tech heavy because the costs associated with them require targeting the premium market to make a profit.
Because from the companies point of view, what's the real downside? If their goal was to simply make a good product, then none of this stuff would really be used, but they have longterm growth to think about too!
This is the most depressing thing I’ve read all day. I have two trucks. A 2010, and 1998. I was going to sell the 2010 for redundancy, but I don’t think I can. The 2010 is in pristine condition, and should last another decade if I’m careful with it. I’ll have to acquiesce to the modern trend with cars eventually, but I hope to hold out as long as I can.
Work in the industry -- the reason you have to pay is that the companies do not contract with Verizon/AT&T/T-Mobile for an always-on connection unless you eat the cost. This is changing going forward due to the value of always-on system monitoring (DTCs, oil change intervals, etc.) to say you didn't do your part and the warranty is void, but that's not on stuff from the last 10 years and won't ever be. Also, current systems don't benefit past the warranty period, which is why you have to pay beyond that point if you want e.g. remote start. This will change in the next couple of years with OTA updates and whatnot, but that's only now on Mercedes/BMW/etc. and only just getting onto Ford/GM/Stellantis/etc.
My understanding of OnStar is that you have to disconnect some things to -truly- disable it. (Probably to let people 're-subscribe' if they try to use the thing when they aren't current customers.)
Other systems... I can say my Maverick lets you select whether to share vehicle location data or not, but TBH I don't trust opt-outs. [0] I've also run into the issue on said car where my opt-outs got de-opted when the dealer ran updates.
[0] - Also as a side note, I've had at least one loan note in the past (from a larger bank) where they stated a tracking device may be placed in the vehicle. Not sure whether they sell that data.
Current owner of a '14 i3. Sorry, but the i3 has a telematics module and a cell modem. It talks to BMW. The first gen ones only have a 3g modem though, so at least they are muzzled in areas that have dropped that network.
I think we'll start to see ICE-to-EV conversion kits become more popular over the next decade or so. So you may get your wish, just not with a brand-new car.
I think that's a pipe dream. Changing the powerplant on a car is not cheap at all, and that's when you are doing a 1-to-1 swap with something the car is built to handle. The battery array needs to go someplace too, and it is quite large and heavy.
Thanks for saying it, I didn’t have the energy to.
ICE-to-EV conversion has been around since well before EVs were mainstream, and ultimately it was much more expensive than buying a new Tesla, with significantly lower construction quality and performance results.
> Wake me up when there's an EV that is just a car and not an always-connected surveillance device on wheels.
This is an easy way to excuse Tesla's behavior. You're implying everyone does it. <shrug> What can ya do, right?
There's 0 evidence other car companies are allowing PII like this to be spread internally, and employees are so brazen as to use them as the basis for memes.
I'm saying exactly the opposite. I will never consider getting an EV as long as it's full-time tethered to the mothership and my car can be deactivated/bricked/broken remotely either on purpose or by accident. I want my car to be MY property, not a device obeying inputs from someone somewhere else.
And absolutely no surveillance built in. Until then I'll stick to gasoline cars.
All that said, though, I totally agree. I will never drive a car like this, or one that tells me I need to update before I can drive.
And if I do get a computer car (at least one like a Tesla that runs something resembling a Linux or BSD kernel), then I would expect to be able to access that system just like the engine, etc. It's my property.
If you buy a new model of car, approved for manufacture after 31 March 2018, it must have the 112-based eCall system installed. This rule applies both to cars with no more than 8 seats and light commercial vehicles. If you have a car which is already registered, you are not obliged to retrofit an eCall device but you can have it installed if your car meets the technical requirements.
> Your eCall system is only activated if your vehicle is involved in a serious accident. The rest of the time the system remains inactive. This means that when you are simply driving your vehicle, no tracking (registering your car's position or monitoring your driving) or transmission of data takes place.
I'm not inclined to see this as anything remotely similar to what a Tesla has, and would hesitate to call it a "connected car" off such a feature by itself. The claim of "connected at all times" appears to also be a falsehood.
Is this FUD, or you got any links? AFAIK the EU regulations require only an on-board system that can call emergency services after the car detects a collision, yes it means requiring a modem that can transfer GPS coordinates (to make it easier to locate the accident), it doesn't mean (again, AFAIK) "has to be connected at all times".
Again, I'd google it myself to see if what you're claiming is true, but I have a hunch it'd just be wasting my time because someone makes a wrong claim online. But hey, maybe I should start baking the humble pie while waiting for the citations from trustworthy sources?
Edit: you've replied to other comments saying "lookup eCall". That's what I described in the first paragraph. Not "connected all the time". Throwaway commenter writes bullshit, news at 11!
Wake up. I'm told Tesla has an undocumented factory option to buy a car without radios, for heads of state and their security and suchlike.
I'm not sure how difficult or possible it is to actually get a car so configured, but you can always rip out the GSM radio yourself if you wish (which voids your warranty, natch).
When I was working at Uber, we had a case of a guy stalking his ex-girlfriend's every movement through the backend. They eventually found out, fired him, and stepped up auditing.
Didn't this happen at Google a few years ago? I recall something about a guy that was accessing his ex's Google data, like email, chats, contacts, and so on.
These days pretty sure things are much more regulated and locked down for the giants. It would take a conspiracy with multiple employees in the right groups to access that type of data and there'd be a trail that gets detected fast.
Wow! Never heard about that. The narrative is usually: Some rando, low-level employee went rogue. As soon as BigCorp was made aware of the situation, the rando was terminated and law enforcement was notified. BigCorp conducted an internal audit and found no other data was accessed. Nothing to see here.
That may not be a good example because it is not clear if it involved those eBay execs actually improperly accessing information that eBay had.
It seems to be a campaign by eBay execs who were physically harassing a couple that published a newsletter the execs did not like. That would only require finding out where they live.
I just found that out in a few minutes from information on their website and a guess that they probably live not too far away from where their company is, followed by a public records search based on that guess which verified that the guess was right and gave me their address.
It may seem paranoid, but it is sane to expect that anything that is closed, and consequently not auditable for security and privacy, and that goes online over a network we have no control over (even if not a public one), will sooner or later be used for spying.
It's not paranoid, it's simple observation. Anything that can be abused by someone, will be abused by someone. The question is whether the benefits overshadow the potential for abuse. In this case, they don't.
The more places where networks capable of surveillance exist, the less privacy you have, and the less autonomy you have. This has been obvious for decades. It is not surprising. No one on HN should be surprised by either the surveillance capabilities of a smart car, or the assdouchery capabilities of people who work for... ahem. I, for one, took both as a given, and I don't even work in tech.
> Tesla staffed its San Mateo office with mostly young workers, in their 20s and early 30s, who brought with them a culture that prized entertaining memes and viral online content. Former staffers described a free-wheeling atmosphere in chat rooms with workers exchanging jokes about images they viewed while labeling.
> According to several ex-employees, some labelers shared screenshots, sometimes marked up using Adobe Photoshop, in private group chats on Mattermost, Tesla’s internal messaging system. There they would attract responses from other workers and managers. Participants would also add their own marked-up images, jokes or emojis to keep the conversation going. Some of the emojis were custom-created to reference office inside jokes, several ex-employees said.
Work environments should ideally be fun and collegial. But I just would've assumed that it's incredibly obvious that managers should not encourage a culture of memeing and shitposting when the job is so centered on private user images.
This is absolutely disgusting to hear. Can you imagine if Apple or Google employees were caught doing something like this with photos/videos taken with their products?
It's not linked in there, but the Wired writeup adds that (0) there were at least two SRE's fired for something like this and (1) Google chose not to report this to law enforcement, seemingly prioritizing its public image over its users' safety.
I mean, the Google one seems much worse because this happened:
> Google engineer who allegedly used his internal clearances to access private Gmail and GTalk accounts so that he could spy on and harass people, including four minors.
In this article, it says some people in Tesla posted a few pics of funny signs and pets on an internal message channel read by some tens of people, and at least one intimate, but not nude pic. Which is not what I thought when I first read the headline here.
I guess there were some more concerning things, like someone approaching the car naked, but there's no context given; for all I know that could be something being reported as a crime. The article doesn't say one way or another and leaves us to assume that all incidents are the same as the people captioning pet pictures, but there's no clear relation.
> Google's was a rogue employee that was caught. Tesla seems to be a cultural problem where employees think it's OK.
While having it be ok as corporate culture is a bit worse, fundamentally the problem is that private personal data is available to these corporations so at that level it is the same problem.
The only systems which are safe are those which either never send private data outside of local (local to the user) devices, or, where data sent out is encrypted client-side with keys which are generated, stored and only ever available to the client.
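To make that second option concrete, here's a toy sketch in Python of client-side encryption where the key is generated and stored only on the client. This uses a one-time pad purely for illustration (a real system would use authenticated encryption; the names here are made up, not any vendor's actual API):

```python
import secrets

def xor_bytes(data: bytes, pad: bytes) -> bytes:
    # One-time pad: XOR each byte of the data with the pad.
    # Correct only if the pad is random, as long as the data,
    # and never reused -- a toy, not production crypto.
    return bytes(a ^ b for a, b in zip(data, pad))

clip = b"raw camera footage"
key = secrets.token_bytes(len(clip))  # generated client-side, never uploaded
ciphertext = xor_bytes(clip, key)     # only this ever leaves the device

# Only the client, which holds `key`, can recover the plaintext.
assert xor_bytes(ciphertext, key) == clip
```

The point is architectural, not cryptographic: if the server only ever sees ciphertext and never the key, a curious employee has nothing to share.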
The average person actively wants big brother to look after their messages and keys because they lose them.
They still expect the same sort of treatment that they get from their traditional health care providers and banks etc, with strong regulations enforcing the correct behavior.
Well, in my case, I fail to see how these stories are incomparable; they're not identical but they seem, at least, comparable. What makes them so fundamentally different that someone is wrong to share that story as an example of "something like this" story?
Tesla's example is most alarming to me since people who seemingly have no business purpose to access highly sensitive data have access to it. Culturally people feel safe sharing this data on internal chats which means they don't think coworkers will report the data access violations and since the content is spreading "like wildfire" there's a significant number of people at the company who are abusing data access as opposed to an individual abusing their elevated access.
Absolutely. Just like Amazon, Facebook, Uber, NSA and other employees have done so as well. The only reason Apple or Google employees might not do that is encryption and decent access controls, not because they are better people. It would not shock me in the least if a Google or Apple employee did something similar.
It might shock me a little just because of the risk/reward if they were just snooping on random people. But not if it was exes/crushes/friends/etc.
Facebook has had access auditing tools for at least 15 years at this point, and it only got stricter with time. It was already a fireable offence back then, too, to improperly access someone's data without a business justification.
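For anyone unfamiliar with how such access-auditing tools work, the gist (all names here are hypothetical, not Facebook's actual system) is that every read of user data requires a stated business justification and lands in an append-only audit log that security reviews:

```python
import datetime

audit_log = []  # in practice an append-only store reviewed by a security team

def access_user_data(employee: str, user: str, justification: str) -> str:
    # Refuse the read outright if no business justification is supplied.
    if not justification.strip():
        raise PermissionError("access denied: business justification required")
    # Record who accessed whose data, when, and why, before serving it.
    audit_log.append({
        "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "employee": employee,
        "user": user,
        "justification": justification,
    })
    return f"<data for {user}>"  # placeholder for the actual record fetch

access_user_data("alice", "bob", "support ticket #1234")
```

The deterrent isn't the gate itself (justifications can be bogus) but the trail: improper access with a fabricated justification is discoverable after the fact.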
From what I heard they did indeed have some LOVEINT problems before that, e.g. an employee couple with a nasty breakup - but they were also orders of magnitude smaller back then.
> It was already a fireable offence back then, too, to improperly access someone's data without a business justification.
Is this supposed to be comforting? "without a business justification" means that if there is business justification (e.g. more revenue) they will violate privacy.
> Have you tried the Mail search? I can’t find things that are definitely there.
I hate the Mail search, because I know it misses all kinds of things. I have to go open up my gmail account in the web browser to find things that I know exist.
That's kinda like saying that Best Buy shared photos and videos from a customer's iPhone. Different from data ingested during normal operation and being processed by the corporation.
Have you heard of Ring? Alexa? There are countless stories of the data from these devices being harvested by Amazon, and even sent to law enforcement to use against their owners.
We've come to expect tech companies to surveil us constantly, so the idea that some Tesla employees could access this footage is unsurprising. I think, however, when a customer buys something this expensive they must have some kind of expectation that the data would be used in a responsible way.
The fact that these videos are posted as memes and jokes about customers' private lives, is... well, terrible. Tech startup culture has certainly been known to foster this kind of immaturity (I have very well seen it first hand...), but coming from a car company - as recently as 2022, if not even later - this is horrifying.
I'd like to think that the major auto makers are not infected with the kind of culture that would condone this sort of thing, but maybe I'm just naive...
This has been the number one thing stopping me from getting a Tesla. I can't believe how little privacy concerns have come up with them. Until I can have fine-grained control of what data gets sent to the mothership, or opt out entirely, I can't see myself buying one.
Yeah, you can definitely trust Tesla to keep their word and safeguard your data. I mean, otherwise there would be stories of employees passing around video footage stolen from customer’s cars, right?
I personally see a big difference between the reliability of the software enforcing the opt-in/opt-out, and the reliability of young humans hired cheaply to label video clips that were opted-in for sharing, who then see something salacious in the clip and can’t resist copying it. I assume this happens universally with all these cloud connected microphones and camera products.
My general approach is I can trust that the data is not being exfiltrated from my device/car/network if I have configured it not to be uploaded to the cloud.
But if agree to share the clips and have them viewed by humans, then I certainly expect those humans to look at them, and yes potentially even laugh at them.
The "Data Sharing" toggle you can turn off, but that labeled setting is different from data sharing as a concept. The telematics appear to get sent regardless of what you do. It seems they intentionally chose the "Data Sharing" label to deceive people into thinking that all data sharing would be turned off.
Can you clarify on this and give some kind of citation or security audit of opt-out testing? Which third parties were these? Working with Tesla or auditing as a separate entity?
The biggest part of my question, and what holds me up the most: if the entity was able to verify the metadata from a certain application/process on the infotainment system, what if the manufacturer enforces the opt-out at the server level? What if it's a server that Tesla doesn't own, belonging to a company they are selling their data to? I could go on and on with these questions. The truth is that nothing should satisfy you if you simply opt out from a button on the infotainment system.
My last position involved creating and enforcing security specs at one of the biggest auto manufacturers, on what were then fairly cutting-edge infotainment systems. I also did MITM testing to verify encryption of third-party applications from suppliers. This was something I did on the side, not something I was instructed to do, because protecting customer data was very important to me. One of my least favorite pastimes was arguing with managers over customer data and opt-out/opt-in. You may or may not be surprised by the rhetoric of the auto manufacturers - they don't look at it as the customer's data. It's their data, and you are a product they can keep making money from long after you bought the car. I lost these battles, and that was one of the bigger reasons I left the company.
The only solution is to remove the infotainment system from the car entirely. Removing the antenna won't work on its own; they have auto-connecting Wi-Fi, Bluetooth, etc.
> I expect the company will face a mass lawsuit from Tesla EV owners
Doubt it. Tesla still hasn't really faced any consequences for selling FSD capabilities that don't yet exist, and for a lot of money. They haven't faced any real consequences for removing capabilities from their customers' cars with no remuneration. Tesla owners are a loyal bunch and forgive a lot.
Single-purpose accounts are not allowed on HN, so can you please stop using an account to comment only about one thing? We want curious conversation here. Pre-existing agendas are not compatible with that.
Sure, not everyone is impacted. Some people had significant battery capacity removed without any compensation. Others have had hardware disabled (the most obvious recent example being radar).
Edit: oh, you were probably talking about FSD.
It can't drive itself yet, in case you didn't notice. Not across the neighborhood, much less across the country. At the risk of being uncharitable, I'm going to assume you're going to disagree about whether sufficient fine print exists to absolve Tesla of responsibility, legally or morally, for what they've advertised FSD to be over the years. We don't need to rehash that, you won't convince me to change my position either. If I had bought FSD for either of my Model 3s, I'd be quite irritated by now.
Privacy is important, but safety is too. There's a big difference between a Tesla Model Y and a '95 Volvo in case of an accident. I don't mean you should drive a Tesla, but personally I value my life more than my privacy.
The article mentions disrespectful locker room joking behavior around the images.
I didn't see the article touch on that this alleged mockery of "your privacy is important to us" might also include employees/contractors/partners/etc. able to use this private data Tesla collected for targeted purposes.
Such as stalking a particular person (because, e.g., attracted to, obsessed with, hated, in dispute with), or selling/furnishing information about a particular person for similar purposes.
The NSA uses private data to spy on potential dates, of course Tesla uses private photos they collect to mock customers or for prurient purposes.
It makes me laugh when Congress talks about banning TikTok because it can be used by China to spy on US citizens while completely ignoring all the US companies that can and do spy on US citizens. If they really wanted to stop this behavior they'd pass stringent regulations.
My friend works in enforcement at the CRA (Canada's IRS) and he often told me about how employees would look up the tax returns of their exes, neighbors, etc. We need to change the culture to zero tolerance and just remove people who don't have that ethical/moral character in them, especially in the public sector.
> Meanwhile, one "former employee saw nothing wrong with sharing images, but described a function that allowed data labelers to view the location of recordings on Google Maps as a 'massive invasion of privacy.'"
I wouldn't trust Tesla's logging of privacy-sensitive data access by employees, but I suspect that Google could figure out which Google Maps accesses those were.
Maybe enough violated people could be found that way, such that the costs to Tesla will smack some sense of responsibility into them and any other tech companies that need it.
Sounds like another Reuters beat-up. Start with dislike of Tesla/Musk, then write a "special report" on that basis.
> "Reuters contacted more than 300 former Tesla employees "
300! They tried extra hard to dig up dirt.
> "all speaking on condition of anonymity"... Of course. They don't work there any more, but happy to sling mud anonymously for Reuters clickbait.
> "Reuters wasn’t able to obtain any of the shared videos or images"... Oh no! You'll have to write extra-descriptive words about the "shocking" images.
> "The news agency wasn’t able to determine how widespread it was"
So what exactly was Reuters able to determine? Not much. By the sounds of it, users elected to share video, and some 20-something meme junkies at Tesla shared a few images internally... which is slap-on-wrist level stuff. Nobody kept any of the images, nothing leaked except anonymous stories from ex-employees!
> "Some former employees said the only sharing they observed was for legitimate work purposes"..
So... contradictory reports about images that don't exist, that might have been shared internally, to a degree we can't determine. Got it!
Seems like a lot of commenters came away from this article thinking that Tesla employees are seeing and sharing images from inside the car?
Because that’s not what was reported, and it’s not what happened. But they certainly didn’t go out of their way to make that clear.
If the car drives past someone who is naked, the cameras record it, and the driver has opted in to data sharing, then someone on Tesla’s labeling team might see it.
I think it’s bad the videos were being pulled out of the controlled labeling system / screen-shotted and shared.
This is video that drivers opt-in to sharing with a very clear description of what is being sent back to Tesla in the UI. It obviously doesn’t say they’ll make memes out of it, but it’s no surprise at all Tesla has snippets of the external camera video feeds that are being labeled by humans.
> thinking that Tesla employees are seeing and sharing images from inside the car?
It's worse! The cars recorded videos (while off) of private residences, garages etc. The lead quote is "we could see them doing laundry and really intimate things. We could see their kids"
If that doesn't cause you concern, I don't know what would.
The car doesn’t have an “on” and “off”. There is no switch. The car specifically has Sentry mode for recording continuously while parked.
There’s a very detailed Data Sharing settings right in the UI where you turn on or off sharing any clips with Tesla, and you have to click Yes for the car to share these clips.
Let's allow for a minute that the existence of the privacy-intruding footage is, as you insinuate, the user's fault. Is it also their fault that it was shared for entertainment and turned into memes?
I’m saying the car owner opted-in to send the footage to Tesla. It’s not a matter of fault, they chose to share their video clips.
The fact that the humans doing the labeling are “entertained” by certain clips… how could they not be? Imagine what that job is like for a minute.
Take this ProPublica report about Facebook hiring thousands of contractors to read through private WhatsApp messages for instance. [1]
Did users in that case opt-in for their messages to be read? No, in fact Zuck went on the record saying they never see them. But turns out when another user on the chat flags messages as spam or abuse, or an algorithm scans it and decides it could be illegal, they do look, and even report it to authorities.
Going back to Tesla, the article says video clips were allegedly shared on an internal chat between the labelers. How is this in any way surprising? The job is to draw bounding boxes around objects in an endless list of videos. Short of implementing some sort of mind control a la the show “Severance”, how could we possibly expect that the employees who do the labeling all day not talk and joke about the strange or scandalous things that they came across?
> If the car drives past someone who is naked, the cameras record it, and the driver has opted in to data sharing, then someone on Tesla’s labeling team might see it.
I'm not sure how the opt-in procedure works for Tesla. Is it truly opt-in? That is, does a Tesla driver positively select to share data, or are they "opted-in" by default? If the latter, are they presented with the option to opt out, or do they have to dig around in the menu to find where to opt out?
Regardless, it's not informed consent if Tesla misrepresents how it uses that footage, as it seems they do.
They've also got a really different definition of anonymous data than I've ever seen...
You’re familiar with the Anker/Eufy cameras storing all video unencrypted and unprotected in the cloud?
Or Amazon sharing all their Ring footage and Alexa recordings with the police?
Almost all these devices share their recordings to the cloud and have employees watch/listen to it. Tesla at least asked first if you want to share the clips.
I know of at least one company that was looking at using Teslas for their executive drivers, but as soon as the security team looked at the always-on wireless connections and interior microphone, the plan got axed. If that's the conclusion a small security team comes to at a relatively small manufacturing company, I can't imagine the headaches larger teams have at bigger, more sensitive companies.
> The car crashing into a kid on a bike I can understand
I don’t think they were looking because it related to their work. I got the impression that it was quite impressive, and some kind of disaster voyeurism.
Honestly, every time Tesla or their overlord is in the news, I get closer to wanting to sell ours.
The only three things keeping me here are:
1. It’s a waste to sell this and buy a new car
2. Tesla still have the best charging network by far
3. Tesla is one of the only car companies outfitting their vehicles with decent processors, which is extra important now that everyone is moving their stuff to touch screens running Linux or Android Automotive.
The second point will be moot soon with regulations requiring sharing network access.
The third is getting less important as more brands support CarPlay, and according to the last WWDC, CarPlay is getting support to control more of your car (though GM went the other direction).
Which leaves the wasteful part… which maybe is worth it to have a car that isn't spying on me and changing its UI around every few months in new and awful ways. Seriously, the driver can't see the music info anymore because it's been moved to their blind spot AND it's further for the passenger to control? Do Tesla designers drive the cars before shipping their UI?
And now this flagrant invasion of privacy, with seemingly no controls for access or opt-in? Ridiculous.
There is a Privacy section with a big button “Data Sharing”
It pops up a large dialog box which says the following:
“We are working hard to improve autonomous safety features and make self-driving a reality for you. You can help Tesla in this effort by sharing diagnostic and usage vehicle data. This data includes short video clips, using the vehicle’s external cameras to learn how to recognize things like lane lines, street signs and traffic light positions. The more fleet learning of road conditions we are able to do, the better your Tesla’s self driving ability will become.
We want to be super clear that the diagnostic and usage data such as short video clips are not linked to your vehicle identification number. In order to protect your privacy, personal information is either not logged at all, is subject to privacy preserving techniques, or is removed from any reports before they’re sent to Tesla. You may enable or disable the collection of this data at any time
Do you agree to allow us to collect this data?
Yes. No. Ask each drive.
Then there are additional sections concerning the interior cabin camera, and concerning navigation and traffic.
I believe all these sections are OPT-IN, I recall getting prompted about sharing interior cabin data after the software update that enabled it, and having the selection be blank to start, but I may be misremembering.
Yet another reason to stay with old autos. If I go looking for a new car, I want one with no internet or Bluetooth anything. And if I do buy new, I want it in the sales agreement that if any internet connection or Bluetooth is found, I get a full refund plus interest.
Yes, I know soon all autos will have internet and I think most already have bluetooth.
>"Tesla says there are "two types of camera recordings that are eligible to be transmitted from your vehicle to Tesla: Safety Event and Fleet Learning camera recordings." The Safety Event recordings last up to 30 seconds and are "captured only if a serious safety event occurs such as a vehicle collision or airbag deployment," Tesla says."
The article states that they could see people doing their laundry, presumably because the washer/dryer was in the garage, or else the car was parked in front of a laundromat. Neither of these sounds like a Safety Event or a Fleet Learning recording, so why would a video from a car parked in a garage or in front of a laundromat ever be transmitted?
It sounds like the privacy guarantees of “camera data isn’t linked to your ID” and “the interior camera data stays in the car” aren’t really useful if all the data is geotagged, given that these cars spend 70%+ of their time at the owner’s home.
I’ll save my outrage for more facts, but if employees can look at my camera feeds without a valid business purpose (and post memes made from them without getting fired immediately) that’s a real problem. If the geotag data is freely available with the video data (and not controlled for privacy) that’s also a huge problem. A company the size and age of Tesla has no excuse for lapses like that.
while unfortunate and unethical this is not surprising.
power asymmetry between provider and subscriber of any product or service will always be skewed towards provider. act accordingly to whatever degree is needed to ensure your safety and mental health. obviously there will be significant variance between persons in that regard.
i drive non-internet ice cars with comma openpilot. i’ve used every generation of their device. it also has a driver facing camera with full interior view.
i originally ran openpilot without internet except when updating, and would delete all camera data from the device before updating.
these days i leave toggled off upload driver camera, it defaults to off. i don’t use cellular, but i do let it connect to my home wifi for the brief moment before it gets unplugged and put in the glovebox.
my thinking has relaxed a bit because of the ubiquity of cameras around me. passengers in the car have 4 cameras, some very wide angle, typically being held at face level all the time. so do people in adjacent cars or on the sidewalk.
while you can cover your own cameras with easily removable opaque stickers, you can’t do that to the other cameras which inevitably surround you. audio recording is equally challenging to mitigate.
while this incident is serious and hopefully results in legal action and changed user behavior, this type of thing will only become more common.
the right approach is something like what snowden did in his early videos. be very aware that you are constantly surveilled, and go to extreme lengths to mitigate surveillance when necessary.
these days, when typing in our passwords or sharing intimacy, we all need blankets over our heads.
that said, public spaces are not a new concept. for better or worse the distinction between public and private space is changing. we will adapt and it will be fine.
While some of the behaviour here is less professional than I'd expect for user data, I'm wondering what a good architecture for building a system to leverage visual telemetry without harming user privacy would really look like.
The problem is that:
1) It's very difficult to manually or automatically scrub identifying information from video, and doing so probably greatly reduces the value of the data for AI training purposes
2) If you allow users to delete videos that have been sent, then you no longer have a reproducible corpus of data
3) If you keep the recordings for say, a week, and allow the user to delete them before they're sent, either users don't have the time to review it (reviewing hundreds of Sentry clips comes to mind), or are going to just forget about it and we'll have the same problem.
I suspect they do not want to talk about the data collection too much in general, because just like how Spotify built a major value add (we have lots of playlists) off of tricking users into making user playlists public by default (it's actually very non obvious when playlists are created), having a huge fleet of cars recording video is hugely valuable for development of self driving.
I think where I'm conflicted is that at the same time as I personally consider this kind of thing an invasion of privacy and overall a nuisance, I recognize that there is societal good/value to be gleaned from mass collection of datasets. If we asked users whether they wanted to share their playlists or car video recordings, you would end up with almost no data, and the data you got would invariably be biased (see Apple Music and the paucity of good user playlists).
Clearly there's a missing middle solution, but I'm not sure there is a good one.
> While some of the behaviour here is less professional than I'd expect for user data
It's like showing up to a school shooting and saying while some of the behavior here is less professional than I'd expect for an interaction with children...
If I have already paid 10s of thousands of dollars / pounds for your car why am I also generating data for your AI garbage as well?
Opt in if you like, with a coherent discussion and understanding of the privacy risks. Have a "tesla club" where you opt in and get sent a bottle of wine every month or something, but the attitude that you should get this information by default, for free, is fucked up.
> having a huge fleet of cars recording video is hugely valuable for development of self driving.
Ok. But so what? Doing medical experiments on large, unknowing populations of humans might be hugely valuable for development of medicine, but we don't do it. Claiming there's some potential future benefit to violating someone's autonomy today is weak sauce at best and highly unethical at worst.
> I'm wondering what a good architecture for building a system to leverage visual telemetry without harming user privacy would really look like
For starters, employees should have no way to save images from customers. No way to export them, load them on a USB drive, FTP them, or take a picture with their smartphone. If you need people to do tagging, fine, do it [WITH PERMISSION] on special workstations in restricted areas.
This is the solution and something that most companies refuse to do. At best there's some kind of opt-out, but companies love hoovering up everything they can get their hands on whether you want them to or not. It's bad enough when it happens in a corporate-owned space like a grocery store, office, or mall, but when it's happening with items you ostensibly own on your own property it feels fucking criminal. I don't understand why consumers and legislators continue to tolerate it.
Yeah, clearly we should be sympathetic to the difficulties AI researchers have in response to an article about how regular employees are sharing pictures of people being intimate in cars.
The first thing I did after getting my Model Y was add a nice sticker over the in-cabin camera. Intentionally or not, these incidents will happen. There is no place for a camera inside a personal car, but a sticker is an easy fix.
It's not clear from the article that this footage was from the cabin camera. Filming garages should be off limits, but footage from public roads and parking lots doesn't seem as outrageous.
Although this article seems to imply a lot of things, the only real examples it gives are cases of accidents being uploaded to Tesla and then employees sharing these videos. I don't think you have to worry about normal driving, this seems to be more about edge cases which are reviewed by humans for their FSD.
That said I understand the concern about having a camera in your car which will occasionally upload its content to Tesla
Whether I'm right or wrong, my perception is that this is just the kind of nonsense I could imagine Elon Musk having a great laugh about. That must feed through to the employees.
I won't touch a Tesla while that foolish child is in charge. And even then, the brand is so damaged (in my eyes) that it'll take a decade for me to even think twice about them again.
So glad I thought better about buying one a few years ago when I was a huge fan of Tesla. I delayed the purchase for some practical reason and have seen their brand decline ever since.
> About three years ago, some employees stumbled upon and shared a video of a unique submersible vehicle parked inside a garage, according to two people who viewed it. Nicknamed “Wet Nellie,” the white Lotus Esprit sub had been featured in the 1977 James Bond film, “The Spy Who Loved Me.”
> The vehicle’s owner: Tesla Chief Executive Elon Musk, who had bought it for about $968,000 at an auction in 2013. It is not clear whether Musk was aware of the video or that it had been shared.
Does that mean that up until now you thought Tesla was a company full of ethical people? And when I say "ethical people" I mean both that the low-level individuals wouldn't go violate their customers' privacy, AND that the higher-level individuals would put in place security measures making it impossible for the low-level employees to violate their customers' privacy?
I had considered buying a Tesla, but didn't because the infrastructure isn't good enough yet where I live. But now, no, my next car will be an EV from any other manufacturer. I had no idea that Teslas had the ability to send video back to the company.
When they first launched the Model S (I wasn't that thrilled by the Roadster), Tesla was my dream.
However, as time has passed, the shine has worn off, more and more. I'm not thrilled with the antics of the Tesla CEO, but I haven't really let that influence my opinions of the car.
It took some time, but all the other carmakers are starting to come out with some damn nice e-cars. Tesla had a great head start, but the tortoises are catching up...
I’d expect that they don’t let employees have “easy access to the cameras' output” and for them to be able to share them “freely with other employees”.
There's room for both acknowledging that someone is a victim, and that this was fairly predictable. Just as a mugging victim doesn't deserve a mugging, they probably shouldn't walk down dark alleyways at night. And if they didn't know better ... well, that's a problem too.
Which says to me that the situation should have been predicted by the legal and HR teams before this division got fired up, leading to a robust code of conduct and compliance that would prevent this from happening in the first place, with heavy penalties for abuses and breaches.
It's not victim blaming. They opted in to this when they bought a Tesla; the privacy policy clearly allows Tesla staff to access all data the car generates.
It's available to anyone to review before purchase. If you don't want it, don't buy a Tesla.
It is victim blaming. No reasonable person can keep up with all the privacy notices in their life. Did you read your car's privacy notices? How about your email? Your phone? Your kids' school website? What about the terms and conditions for all of them?
By the way, the Tesla privacy notice specifically says "no one but you would have knowledge of your activities, location, or history" but this article proves that's a lie.
> Did you read your cars privacy notices? How about your email? Your phone? Your kids school website? What about the terms and conditions for all of them?
Yes. This is why I physically removed the GSM radio from my car, host my own email, use an outbound firewall on my computer, and don't have an OpenAI account.
The terms matter. Clicking past them as if they don't exist isn't a reasonable action.
Yeah, it is victim blaming. The expectation is that they're going to use that data to improve the vehicle, or something within that realm, not turn the videos/images into memes and then pass them around for the yucks.
Edit: I'm open minded; in addition to downvoting, I'm genuinely curious why folk might think that I am incorrect here. Happy to correct my thought process.
Having sex in front of any camera tends to end in exactly one way.
Their product line spells out "S3XY." Public technical failures are glossed over or ignored. Anyone expecting integrity from this company needs a reality check.
In this case, it wasn't ok, inasmuch as they experienced consequences that they wouldn't have if they had known better or been more skeptical.
We should hold the companies accountable. But maybe we shouldn't be so quick to reassure people that they couldn't have prevented things like this, so that when the next predatory company puts out a predatory product, there will be a sort of collective knowledge that relying on a corporation's good will is probably not a great idea.
Then we might actually have people reject products that do user hostile nonsense, and the market will have to respond.
In other words, this kind of thing is the teachable moment that can make everyone aware of the things that we already are.
Unnecessary victim blaming. The implications of technology like this are significantly less obvious than you think they are. It's not even very obvious that Teslas have in-cabin cameras. It's not mentioned anywhere during the order process, it's a tiny footnote in the manual, and the camera is very, very small. Backup cameras are standard on new cars nowadays, so every car has at least one camera on it. Almost every new car has a constant remote connection too, for features like remote start or roadside assistance. How do you know a Toyota Prius isn't uploading footage?
This is a horrifying view of the world even on its face, but also: even if we anticipate that people might do the wrong thing, it is still wrong and it is still appropriate to condemn that wrong and call for justice.
Yes, fine, but lacking all naiveté is a horrifying view of the world. We must have mechanisms for achieving justice after wrongdoing, and have faith that we can trust each other to do the right thing.
Cf. Shaw: “The reasonable man adapts himself to the world: the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.”
You can of course blame the victim while also blaming the perpetrator. As you point out, they're not mutually exclusive. But if your communication to your friend focused exclusively on her actions—like the comment does—then yes, I think that would be worthy of criticism.
> When a friend's car gets stolen, I reserve the right to condemn the thief and tut-tut at her for leaving the doors unlocked.
That's a jerk move. And a terrible analogy. Using the car as intended and having the manufacturer use the cameras for something other than their intended use is not the same thing at all as failing to use the locks for their intended purpose.
Might as well tut-tut at you for using the Internet the next time one of your devices gets compromised.
Piss off with this low grade comment. Obviously nobody expected that Tesla employees would be sharing photos of them fucking in their work chats for their own amusement.
There will always be such employees, which is why actually privacy-minded companies strictly limit access to personal information, and carefully monitor the access of it.
It would be even better if that data were not collected to begin with, or at least never left the local hardware. But no... everything has to be connected.
I'm legit "paranoid" (if you can even call it that at this point lol), but c'mon. Why blame the people who were promised privacy (like, I don't know, were they? Like isn't it a given that they were at this point? If I'm wrong, please do correct), and that promise was breached by the promisers?
It's one thing to say, "It's expected at this point but they should have upheld their agreement to maintain privacy and respect for their customers," vs. blaming those whose privacies were betrayed.
What they expect and what they are (currently) legally entitled to can be separate things. Same goes for the rights you were taught that you have in school and how they actually work in the legal system.