Siri records fights, doctor’s appointments, and sex, and contractors hear it (arstechnica.com)
267 points by vatueil on July 27, 2019 | 122 comments



> “These recordings are accompanied by user data showing location, contact details, and app data.”

This should be in the title.

Media was focusing on contractors listening to Google Assistant snippets not long ago — and those were anonymized. Private and identifying information was in the snippets, not metadata (the article mentioned this).


I think the story may be conflating two things here. When a recording is made and transmitted to Apple, the location, Apple ID, etc. of the recording device are known. The same (or similar) would be true for any voice assistant.

When the contractors are reviewing snippets for training purposes, this data is anonymized. In the case of Apple:

> Apple has said that it takes steps to protect users from being connected with the recordings sent to contractors. The audio is not linked to an Apple ID and less than 1% of daily Siri activations are reviewed.


It's a direct quote from the whistleblower:

> The whistleblower said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”

> That accompanying information may be used to verify whether a request was successfully dealt with. In its privacy documents, Apple says the Siri data “is not linked to other data that Apple may have from your use of other Apple services”. There is no specific name or identifier attached to a record and no individual recording can be easily linked to other recordings.

> Apple told the Guardian: “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

Strangely, the whistleblower's quote directly contradicts Apple's privacy documents. The whistleblower says the recordings are linked to some other information, but the privacy documents say the recordings are not linked to other information.

But the whistleblower's quote does not contradict Apple's quote to the Guardian. The whistleblower never says the recordings are linked to the user's Apple ID.

https://www.theguardian.com/technology/2019/jul/26/apple-con...


1% is an unexpectedly high number.


It's a huge number. I am surprised they have the manpower.


In fairness, GP says less than 1%.


I would expect 'less than 1%' to be 0.1% to 0.99%. 0.1% is still a high figure.


I would bet it’s closer to 0.99% otherwise wouldn’t they have phrased it something like “slightly more than 0.1%” to make it sound smaller?


If I was talking to a reporter and the real number was .01% I would still say "less than 1%" to make sure I was properly understood. The whole reason we use percentages and say "one percent" instead of "point zero one" is that some people tend to get confused with decimals but not with integers.

EDIT: And part of this is just how we say things. "Twenty" is just easier to parse in a conversation than "two oh", even though the latter is perfectly comprehensible when written out.


1% vs .01% would be overstating the truth by 100x. If you were a spokesperson for Apple, you would want to make that number seem as small as possible.


"One in ten thousand" is clear enough for everyone though.


I wish the article was more clear on this.

It takes a lot of work to properly anonymize the data. The question is: what information does the operator see?


As I understand it, the data was not anonymized. They simply asked the contractors to sign a non-disclosure contract.


> When recorded, and transmitted to Apple, the location, Apple ID, etc. of the recording device is known.

Actually, according to Apple's statements Siri does not link recordings to your Apple ID.

That's a bit of a double-edged sword, though, since it means users cannot easily check what Apple has recorded either. Judging from public reactions, it seems many users think Apple doesn't keep Siri records at all simply because they can't access the audio history.

Reportedly, the only way to avoid letting Apple manually review audio was to not use Siri at all, unlike other voice assistants. If so, I wonder if that was also due to the lack of Apple ID, since if audio isn't linked to a user then it follows that it isn't linked to user preferences either.

I suppose you could still do something like send data from users that allow QA testing to one system and from those that opt-out to another, but apparently that wasn't done.


Note that while the Google recording data was arguably "anonymized", the reviewers of the leaked recordings were able to in many cases find enough personal information to contact the people recorded. Which means they were not, in fact, anonymized.

"We let ordinary Flemish people hear some of their own recordings. ‘This is undeniably my own voice’, says one man, clearly surprised."

"A couple from Waasmunster immediately recognise the voice of their son and their grandchild."

https://www.vrt.be/vrtnws/en/2019/07/10/google-employees-are...


It's largely impossible to anonymize data you don't understand, isn't it? They have somebody transcribe the recording because they don't know what it said, so they can't know whether PII is in there. I suppose they might try to split it into multiple parts and have different people work on each part, but that may lead to problems, as you need the full sentence as context in some cases.


This. There's a nice paradox. Until the AI is good enough to understand the voice data, it can't know that it isn't privacy-violating -- up to and including extremes like somebody telling their most intimate secrets. Exposing this data to any other human, whether a subcontractor or internal employee, is a potential privacy violation, full stop.


I would rather the voice processing happened locally on your device. We must be getting to the stage where that's becoming viable? If not entirely in software then via a custom chip.

Even if the transcript then still needs to be sent to the cloud for processing (e.g. translating human sentences into logical actions might still be out of reach of home devices), developers could at least send the data tokenised (which wouldn't be possible with a compressed audio stream) and then get a server response referencing those tokenised placeholders.

A simplistic example of my point might be

  Message:
  I need to book a doctors appointment for erectile dysfunction.
  
  Sent:
  I need to book a $ARG[0] appointment for $ARG[1].
  
  Received:
  {
    "action": "ring",
    "subject": "$ARG[0]",
    "message": "appointment for $ARG[1]"
  }
I appreciate that something like this wouldn't be infallible, but it would still go a long way toward redacting sensitive or identifiable information, and thus be a huge step forward for privacy.
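The idea above can be sketched in a few lines of Python. This is a toy illustration, not anything Apple or anyone else actually ships: the `SENSITIVE_TERMS` list stands in for an on-device recognizer (a real system would need an on-device NER model rather than a hard-coded vocabulary), and the `$ARG[n]` placeholder scheme is borrowed from the example above.

```python
# Toy sketch: redact sensitive spans on-device before sending a transcript
# to the cloud, and re-substitute them into the server's response locally.

# Hypothetical on-device vocabulary of sensitive phrases. A real system
# would detect these with an on-device model, not a hard-coded list.
SENSITIVE_TERMS = ["doctors", "erectile dysfunction"]

def tokenise(message: str):
    """Replace sensitive phrases with $ARG[n] placeholders.

    Returns the redacted message plus a local-only mapping from
    placeholders back to the original phrases (never sent anywhere).
    """
    mapping = {}
    redacted = message
    for i, term in enumerate(SENSITIVE_TERMS):
        placeholder = f"$ARG[{i}]"
        if term in redacted:
            redacted = redacted.replace(term, placeholder)
            mapping[placeholder] = term
    return redacted, mapping

def detokenise(text: str, mapping: dict) -> str:
    """Re-substitute the original phrases into a server response."""
    for placeholder, term in mapping.items():
        text = text.replace(placeholder, term)
    return text

msg = "I need to book a doctors appointment for erectile dysfunction."
redacted, mapping = tokenise(msg)
print(redacted)  # I need to book a $ARG[0] appointment for $ARG[1].
print(detokenise("appointment for $ARG[1]", mapping))
```

The cloud only ever sees the redacted string; the mapping that restores "doctors" and "erectile dysfunction" stays on the device.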


Another factor here: there aren't so many Flemish speakers, so the probability of a recording being reviewed by someone who can identify the speaker is much higher.


In this case the user had read their address out, the reporters went there and spoke to them.


> and those were anonymized

Hard to believe in Google's case. They're always trying to get the maximum possible data.


When they give the recordings to reviewers, they anonymize it, but they still have all the data associated with it.


It's a very relative concept. You can always omit some part of the record pretending you're "anonymizing" it, while the rest is easily correlated with a particular person.
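That relativity is easy to demonstrate with a toy linkage attack (all data here is made up for illustration): even with names stripped, quasi-identifiers like location and app metadata can be joined against an outside dataset to re-identify a speaker.

```python
# Toy illustration with hypothetical data: "anonymized" records can often
# be re-identified by joining quasi-identifiers against an outside dataset.

# Recordings with the name stripped, but location/app metadata retained.
anonymized_recordings = [
    {"clip_id": 1, "city": "Waasmunster", "app": "calendar"},
    {"clip_id": 2, "city": "Ghent", "app": "maps"},
]

# A separate dataset an attacker (or curious reviewer) might already have.
public_profiles = [
    {"name": "Alice", "city": "Waasmunster", "app": "calendar"},
    {"name": "Bob", "city": "Brussels", "app": "maps"},
]

def reidentify(recordings, profiles):
    """Join on the quasi-identifiers (city, app) to recover likely names."""
    matches = {}
    for rec in recordings:
        for prof in profiles:
            if (rec["city"], rec["app"]) == (prof["city"], prof["app"]):
                matches[rec["clip_id"]] = prof["name"]
    return matches

print(reidentify(anonymized_recordings, public_profiles))  # {1: 'Alice'}
```

Omitting the name field did nothing for clip 1; the remaining fields were unique enough to pin down the person.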


Many HNers have a weird belief that Apple is somehow a privacy focused company, despite it actually being no more private than Facebook or Google.

The main concern people have (voice assistants aside) is data gathering for ads, which Apple does for their app search, among other places. Also, their data centers for Chinese users are in China, which gives Chinese authorities full access to those iCloud accounts.

If you want privacy, don't rely on corporations. Host everything yourself, and encrypt your data at rest and in flight.


> Many HNers have a weird belief that Apple is somehow a privacy focused company

Note how the top comments here branch into criticizing Google?

It takes a lot to change people's opinions in such matters. It's just easier to repeat the same old arguments and continue with our lives...


Google is the worst at privacy. Even when Apple or any other company fucks up it's always worth noting that, maybe it dents their undeserved reputation even more.


Arguing over which tech behemoth is worse at privacy is pointless. The fact is that they are incentivized against privacy. Your data is valuable. The cost of anonymizing your data is expensive, and offers little measurable ROI.

Some companies work on providing the illusion that they care about privacy. But at the end of the day, unless it's convenient for them, and does not interfere with their goal, you should assume the worst.


No, because privacy in today's world is not black and white. Apple is definitely much better than MS and Google, so people that e.g. buy Apple smartphones have better privacy than Android users.

This is not a theoretical concern: I can buy an iPad for my parents, turn off iCloud & Siri, install an ad-blocker, and they will be reasonably safe from cybercriminals and spyware companies like Google.


Their datacenters for Chinese customers are in China.

I expect them to take reasonable precautions; mistakes do get made. I'm not a user of voice assistants; I'd rather use the device directly, it's faster anyhow.

To quote from the article "Voice assistants appear to be yet another instance where a technology has been developed and adopted faster than its consequences have been fully thought-out."


There are at least a few instances where using a voice assistant is far faster than using the device directly. An example that I use frequently is something like “hey Siri remind me to change my furnace filter in 3 months”. In many cases using a voice command/shortcut is quicker than navigating to an app and locating the ui to do it.


I'm particularly fond of "hey Siri, when I get home remind me to take out the trash" as a shortcut.


> Their datacenters for Chinese customers are in China.

Why is it okay for an American company to allow foreign, human right violator governments to spy on their citizens?


Okay? It's not really. But it's the price of cheap labor in China and whatever other incentives they have for doing business there. Everything done in China comes with a no-privacy guarantee; you can't take a shit in China without them knowing the color and consistency.


I agree with your general recommendations not to rely on corporations and to use crypto, but it's not a weird belief, it's a fact. Google is, bar Facebook, the worst company in the world at privacy, and their whole business model is focused on subverting privacy.

There are many studies done by non-profits and government organizations which prove this. They also repeatedly got fined for their rotten practices.

That doesn't mean Apple are saints. Any US corporation is suspect and a feature like Siri is broken by design and can't be implemented with privacy in mind.


Google makes business off of selling ads. Apple makes their business off selling hardware and software.


Increasingly 'services', for which they want to lock you in for rent.


The two models aren’t that different. One is to show you better ads and make you buy stuff; the other is to make the iPhone better and make you buy it.

In both cases they need your data to make money.


Knowing your users and how they use your software is too big an advantage, and it seems users don't care. Or is there a market of users willing to pay for privacy?


This is a "blue ocean" situation. Plenty of people would care if they understood the hazards of the status quo and a viable alternative existed. People can already make their own privacy-respecting phone, but most people aren't up for it. Someone will have to get close enough to the UX polish of Apple and Google to pull people away.

https://en.wikipedia.org/wiki/Blue_Ocean_Strategy


I think the most important piece of info isn't even in this article. The Verge article I submitted[0] had this quote:

"Additionally, as The Guardian notes, while Amazon and Google allow customers to opt out of some uses of their recordings, Apple doesn’t offer a similar privacy protecting option, outside of disabling Siri entirely. That’s a particularly bad look, given that Apple has built so much of its reputation on selling itself as the privacy company that defends your data in ways that Google and Amazon don’t."

I was shocked by this because I opt out of sending analysis data to Apple at setup of my devices. I figured that would also prevent sending Siri recordings. Apparently not, though.

[0]: https://www.theverge.com/2019/7/26/8932064/apple-siri-privat...


Does disabling Siri severely affect accessibility and usability for physically impaired people?

Also how is this on other platforms, are disabled people forced to unfairly compromise their privacy to properly use IT today?


The thing is, I don’t even want siri to do anything smart.

It needs to: dial a number, and keep a list of frequently dialed people so it doesn’t think I meant someone else.

Set the alarm / timer

Understand the street I want to navigate to.

The whole smart thing, like looking up things on the internet or giving contextualized answers, doesn’t work, and it’s frustrating as hell.

Also, I want Siri to work offline. So basically I want voice recognition with a short set of instructions I’m happy to memorize.


Sadly, most phones used to have this exact technology before the 'ok, google' stuff became ubiquitous. My older android did exactly these things and it was great. I could even send text messages that weren't terrible.


It would be great if people could review what voice recordings were sent to the companies in the past week. After that, people could decide whether they want to continue using the product.


This would be a security issue. What if your SO's private conversations are reported to you?


I agree that would be an awkward situation for myself and SO, but I disagree that it would represent a security issue. If the data is being sent to the vendor, I believe we should be able to audit that data. IMHO, deciding otherwise is giving far too much power to the vendor.


You seem to be implying it is safer to send a recording from my phone to a huge corporation where it will be heard by any number of people I don't know than sending the recording to myself, the owner of that phone.

Surely not. Perhaps that's not what you meant?


I suppose, if given the choice, most people would prefer their most intimate secrets to be leaked to a total stranger than to the people closest to them. Of course, both situations are bad, but the latter is arguably worse.


The original Guardian article makes it clearer that the metadata isn't accompanying the audio; this article does not. That said, this article closes with a poignant and telling quote.

"Voice assistants appear to be yet another instance where a technology has been developed and adopted faster than its consequences have been fully thought-out."


One thing that happens to me is sometimes I’ll accidentally send people random texts from my watch while I sleep. I’ve started putting it in “water mode” and also airplane mode just before I go to bed to help keep that from happening. All I need is to accidentally text my boss an audio from having sex with my wife.


This may be tangential but I’m regularly amazed at how bad Siri is at basic things still.

I’ll try to use it for things in front of guests on the HomePod and Siri is embarrassing half of the time


I've never quite gotten the allure of voice assistants myself, it just seems like a really frustrating way to use a computer.


I use it sometimes. For household use I pretty much agree: my partner has Alexa set up to turn lights on and off, and it only works about 70% of the time. Apple doesn’t, I think, say it’s Siri which is running this keyboard which I am using right now to “write” this content, but I assume the voice recognition engine is the same one, and it’s significantly more useful than Alexa. For example the only edits I have made while dictating this have been where I change my mind about what I wanted to write, not any transcription errors. It is certainly much more useful, when I wish to take notes, than the on-screen keyboard.


I agree, and then that technology is put to use on the pretty well put together accessibility tech showing up in iOS 13:

https://youtu.be/aqoXFCCTfm4


"my partner has Alexa set up to turn lights on and off"

Do you really need the help of a megacorp to turn lights on and off?


Smart light systems are really good. You can be in bed and say “goodnight” and all of your lights turn off. It is convenient. Schedules are also helpful. I use them on grow bulbs for plants.


> You can be in bed and say “goodnight” and all of your lights turn off

Well that is cute, but the flip side is that FAANG and gods know who else may be listening in.

> Schedules are also helpful. I use them on grow bulbs for plants.

Timers for turning appliances on and off have been around for decades.


I'd sooner use a car for grocery shopping (10 minute walk) than retrofit my bedroom so I don't have to take another step.

However! It is exceptionally useful for people who are injured, sick, or disabled in some other way.


No, but it does feel very Star Trek.


Yea, with star fleet analysts keeping tabs on crews.


Lots of episodes would’ve been much shorter and more boring if they had 2014 technology. How many times did someone disappear from the ship without anyone noticing? Or that time one of the Maquis on Voyager literally murdered another crew member?

Don’t get me wrong, I appreciate that fictional scenarios are a terrible justification for anything in real life, but it can still be a lot of fun to use the technology described in those scenarios.


As someone who doesn’t use any digital assistants, voice activated or not, I’m fascinated by the marketing and media hype. It’s still not clear to me why anyone would want them.


(Safely) Creating a reminder while driving is very handy.


Voice memos were a thing before we had voice assistants. Hell, in the 15-20 years before that we even had plenty of automatic voice-memo systems that could be activated with a simple command.


I don’t think you should be doing that while driving. The one argument that I do see is while cooking, when your hands are covered in flour or something. But that’s a really niche use case.


People shouldn't talk while driving? Yikes. I mean, I know it affects concentration and I suppose that means it probably increases accidents at some level, but speaking a thought out loud when you remember you need to do something seems pretty innocuous to me.


It's great with my sometimes inadequate working memory. "Hey Siri, ..." is faster and more reliable than trying to get into an app and do a thing before I forget what I was trying to do.


I know several people who use voice assistants because of visual impairments. Touching, swiping and tapping are several times harder with a visual impairment.


In my case, I noticed that having Bluetooth equipment paired caused micro-interruptions that made the voice impossible to understand. I noticed it when I called some friends and they told me.

Since unpairing the Bluetooth device, Siri has performed well.


Apple is marginally better than the rest at privacy, but their marketing centers around it. If you're naive enough to fall for their marketing, this should be a wake up call.

This, and the zero resistance Apple offers China when it demands the iCloud data of Chinese citizens.


> If you're naive enough to fall for their marketing, this should be a wake up call.

I admit that I was naive enough.


Apple is playing a fairly dangerous game, positioning themselves as some kind of saint, or Tron, fighting for the user.

Firing shots at all tech companies for collecting data can backfire on them, like it does in this case. It shows that they’re likely overselling what they can do without your data, and that to deliver they need your data anyway; they just try to be quiet about it.

Of course there are different levels of data collection, and I’m not making a call on whether what Apple is doing here is bad. But they have spent the last few years building a narrative that any data collection is bad, as it allowed them to differentiate themselves.

They built their marketing around it in the last few years, but it’s likely that they won’t be able to deliver on it. With AI being hot and constantly needing more data to improve, Apple needs to get this data somehow, and their narrative makes it hard, unless they’re willing to mislead their customers.


I have a theory that Apple chose the privacy route because it was their only serious way to compete. Apple simply doesn't have the AI chops to compete with Google.

They were already years behind in ML when they started their privacy marketing, and possibly even further behind now. Since they can't compete on ML features, their only other option was to compete on privacy.

Apple is taking the calculated risk that people will be willing to sacrifice features in exchange for better privacy. If they're right, it will pay off massively. If they're wrong, it could be very bad for them.


The Librem cannot come fast enough.


I'm overdue for a phone, and I am at this moment reaching religious-faith levels of hope for this Librem phone.

Please release it!


Why don't Apple et al. let users know what they're sending, and at least provide a way to view and delete sensitive data before it's reviewed? Maybe store it online for a month before review and pop up a notification that you have Siri data queued for review. Most people would listen to it; something simple like directions they would leave in, but anything private they could remove.

Or have an audio cue when audio is being recorded, some heartbeat-like click every few seconds, so while recording you hear a click, go "oh shit", and can tell Siri not to send that recording.


Let's do this again next month for Alexa!


Ordered a nice far-field microphone and a Raspberry Pi 4. Gonna try to build my own voice assistant with Mycroft.


Does Mycroft need a network connection for voice recognition to work? I wasn't able to conclusively tell from a quick scan of their website but their Get Started document[1] says "Mycroft devices need to be paired with your Mycroft Home account to work", which sounds like it does.

I had been intending to do something similar to you, but using Snips[2] since it supposedly has on-device voice recognition (and use of Rust, which I like).

1. https://mycroft.ai/get-started/ 2. https://docs.snips.ai/getting-started


The Mycroft Home stuff covers Google Speech to Text, so I'm betting that overall speech processing is done in the cloud, though I'm sure wake-word detection is done locally.

https://mycroft.ai/documentation/home-mycroft-ai-pairing/


Yeah, but not Google; they switched to Mozilla DeepSpeech. You can run DeepSpeech locally, but it needs something reasonably powerful.


Yeah, it uses a Mozilla cloud speech API. I gather one can run it locally though (DeepSpeech).

So not too worried about this aspect.


Wait, isn't it possible to research this by setting up a hotspot that shows the traffic on your computer?

Or would it be an issue because it's encrypted? Couldn't you then still at least correlate traffic with sound/voice?


Audio has to be sent to do the recognition. The only question is whether the company then turns around and saves it and sends it for further analysis. You can't track that step.


Every recording has to be sent to the server; there is zero local processing for voice assistants except recognizing "Hey Siri".



Someone needs to invent a soundproof phone case. When it's closed, no light or sound is let in.


Quelle surprise; Tim Cook has acted in a hypocritical and sanctimonious fashion to further the capitalist aims of his company.

I am disappointed in Apple. Very.


> Voice assistants appear to be yet another instance where a technology has been developed and adopted faster than its consequences have been fully thought-out.

What rubbish. This is totally expected and thought out. It's even worse: this is probably the #1 reason voice assistants exist at all, to provide these companies with even more valuable data. I cannot imagine the existence of Siri (or any other voice assistant) without a business model behind it.


Um, could the business model be to provide value to users in order to make the systems they are loaded on more attractive products?

Sure, a huge part of the Siri/Alexa/Assistant business model is probably data collection. But does that mean all voice assistants ever inherently have no real end-user value? I don't regularly use them, but I would be very happy to find a reliable way to change music and send messages with my voice while driving.

Before the era of mass data harvesting, arguably useful utilities like the MS Office suite were loaded onto OSs to make users more likely to stay on that platform.


If Siri reviewers encounter evidence of certain crimes such as murder, sex trafficking, pedophilia, or domestic abuse, it is obvious they have a basic moral duty to report the crime and turn over the evidence to investigators.


As an aside, it’s highly ignorant to refer to pedophilia as a “crime”. Please be aware that pedophilia is not in fact a crime at all; it refers to the attraction one has toward pre-pubescent children. Be aware that this is entirely different from being a child rapist. I assume you meant to say “child rapists” in your statement.

In the same way it’s generally accepted that homosexuals and heterosexuals don’t choose their sexual identity, neither do pedophiles. One could say they’re “cursed” with the sexual preference they were born with. Since most are not both pedophiles and rapists, most are doomed to live a life of unfulfilled attraction. The majority of pedophiles recognize that rape is wrong and would never consider it. Equating pedophilia with rape is as flawed as assuming all neck-beards are obviously rapists (since presumably they’re so sexually repulsive they could never get a woman to agree to consensual sex). Obviously the majority of sexually repulsive males don’t just feel entitled to rape at will, and neither do pedophiles.


> One could say they’re “cursed” to be given that sexual preference they were born with.

Were they, though? I did some light research for a university course many years ago and from what I understand/remember, many pedophiles were themselves abused in their childhood, which can really mess up a developing brain. Still cursed, just in a slightly different even more tragic way.


> Were they though?

I suspect the propensity to rape is what is learned. I’d question whether the pure attraction is formed by the abuse you describe.

We should recall those who, a decade ago, suggested that homosexuality was a “disorder” needing to be cured, a disorder implying that it was caused by some incident. Clearly that line of thinking is outdated and offensive to many.

Either way, the source of the attraction is moot. The fact is we live amongst pedophiles and there’s no reason they should be reviled.


CSA survivor here. Can you fucking not?

Maybe you're even right, and there's no reason they should be reviled. But there's no reason they should be normalized, either. Not to themselves, not to anyone else.

Your comparison with homosexuality is false and invidious, because homosexuality exists among consenting adults and can therefore be expressed without harm. This is not true of pedophilia. Nor will it ever be. So your efforts at normalization cannot lead to anything other than an increased likelihood of harm. Having experienced that harm firsthand on multiple occasions, I have no problem whatsoever in saying that it is heinous to act in ways which make it more likely that others will suffer the same.


> homosexuality exists among consenting adults

No it doesn't. I'm gay, and I could be on a desert island with not a soul in sight, or I could be the last man alive on planet Earth and I will still be gay. I do not need anyone else to have my identity. This is because homosexuality is a trait of one person and his/her attraction. I am attracted to men whether or not there is a consenting man present on the planet.

Just like you may be heterosexual but never have a date in your life. Not being able to attract a partner does not cause you to change who you're attracted to. One doesn't have to stop calling oneself heterosexual just because they're ugly and no one will go out with them. You see? There is no consent, because there isn't anyone else.

You've missed the entire point of my post. It's this: One's attraction needs to be de-coupled from the sex that a person may or may not partake in. You've conflated attraction and the act of sex.

Pedophilia is about the former not the latter.


Even granting everything you've just said, which I do only for the sake of argument, I see no good end to normalizing the idea that even pedophilia qua attraction, as you have it, is in any way okay.

Sure, I can see where that'd be a cross to bear. Bearing it in public silence, with the support of others who share your affliction and are willing to offer understanding, seems like a strongly preferable option. I don't understand what you hope to gain by this crusade, and I am unconcerned with whatever judgment you may make of me for saying that I doubt the rectitude of your motives in pursuing it. Even the people who molested me never argued to anyone that the urge which helped move them to do so wasn't a problem. They'd be as horrified and disgusted as I am to hear you argue that it's not. They were ashamed not only of what they'd done, but also of why they'd done it. They should have been, too. But even after what they did to me, I have more respect for them than for you.

(I'm gay, too. Don't conflate that with this. We had enough problems being associated with NAMBLA through big-tent naïveté back decades ago, and no one will thank you or think well of you for arguing we should make the same damnfool mistake another time.)


> I see no good end to normalizing the idea that even pedophilia qua attraction, as you have it, is in any way okay.

Just as with drug and alcohol addiction, destigmatizing the condition as a moral failing removes a barrier to acknowledgement and seeking (and continuing) treatment, which reduces the likelihood and severity of the condition producing socially harmful behaviors.

In the case of pedophilia, I think that's a pretty big win, but obviously that's a matter of the relative priority of expressing moral revulsion compared to preventing harmful action.

> They were ashamed not only of what they'd done, but also of why they'd done it.

Clearly the shame associated with the orientation didn't do any good in preventing the harmful action, so maybe it's worth considering whether the stigma around the orientation is useful or counterproductive. “People feel shame about having a pedophilic orientation but still go out and molest children” doesn't seem to me to be as compelling an argument as you seem to think it is for the social utility of the stigma it references.


Even I don't believe that having pedophilia makes someone a bad person, irredeemable. It's not a moral failing; it's a dangerous paraphilia. And so I don't think you are wrong in saying that reducing barriers to treatment is worthwhile, and that a carefully titrated degree of destigmatization is likely to serve that end.

The person with whom I've been arguing in this thread has shown no sign yet that his thesis is like your own. His argument hasn't been about treatment. It has instead been that no more doubt, concern, or askance should by default attach to someone with pedophilia than to, for example, someone who is gay. That's not mere destigmatization. That's outright normalization, and I won't apologize for my reaction to it any more than I would apologize for demanding the same justification from someone who came begging similar consideration at par for those "cursed" with the urge to rape grown adults.


I think you raise a relevant point here, one I wish more people would think about, though what I have read indicates many of them acquired it by trauma of their own.

A society should be judged not by how it treats its outstanding citizens, but by how it treats its unwanted, outcasts, and its criminals.


"Child rapist" is pretty specific. Consumption/distribution of CP is also heinous and harmful.


Why would consumption, production or distribution of synthesized CP be heinous? Who exactly is the victim?


Well, I was talking about CP in general, which you omitted entirely in your original post.

Specifically with synthesized CP, though, I could imagine that it's harming the consumer by feeding into their mental illness.


> The majority of pedophiles recognize that rape is wrong and would never consider it.

Thank you for your thoughtful comment, matt-attack. No one mentioned non-consensual sexual contact with a child. Do you believe that consensual sexual contact with a child is acceptable? I ask only because the phrasing of your argument possibly implies this. Thank you very much for your response/clarification.


When you're on the side NOT defending pedophiles, you know you're on the right side of history.

Democrats: We'll do anything it takes to get Trump! No one is above the law!

Also Democrats: Watch as we provide "sanctuary cities" for illegal immigrants to be above the law, and we normalize pedophilia!

https://i.imgur.com/zs2VlOn.jpg


I wish more people had this opinion--"cursed" is a good way to put it.


I disagree. There's a reason that evidence obtained illegally by the police can't be admitted in court, even if it proves some horrible crime. IMO the fair use of the law for the whole population is far more important than stopping one criminal.


Obligatory not a lawyer, but third-party doctrine likely applies in the US, meaning that it would not be illegal for police to use such evidence if supplied by Apple. Not sure where the line is drawn if Apple itself violated its own EULA, or violated local wiretapping laws, or if a service like Siri falls under such laws. I know California and Illinois are two-party consent states. Not positive if it's only for phone calls or all voice recordings.


"by the police"

In this case it is not the police who obtained it.

I think in some countries, not reporting crimes you know about could count as complicity.


unless you’re a lawyer and have a client-attorney privilege. same thing might apply here.


That's the burn-a-forest-to-kill-a-few-rabid-dogs argument.


Apple has extensively annotated iOS with understandable information about how data is used. The user has the ability to review this info when they enable Hey Siri, or any time in the Siri & Search settings. It lists the data that is sent to Apple when you activate Siri.

Personally, I’ve only ever seen Hey Siri falsely activate once, from something a character on TV said. (I could get Ok, Google to trigger on a friend’s phone by talking in a ridiculous falsetto that sounded nothing like the friend’s voice.) Per the law of large numbers I’m sure it does trigger in doctors’ offices and so on but I’m skeptical that it’s a widespread problem.

I do use Siri to dictate text messages all the time while I’m driving, and I’m sure I’d be embarrassed by one or two that human reviewers have heard.


Siri activates almost any time I say “are you serious?!”, which is a phrase I say with some regularity.

I’ve definitely had it activated a number of times by the TV or when watching YouTube or Twitch.


I had it trigger while watching an episode of Archer. I even went back and played it a few more times to double check it wasn’t a fluke.


I no longer see a way to turn Siri "off", only to not respond to "hey Siri" but rather the home button. Does this mean she's not listening unless I've activated, or is she listening anyway? (I can't seem to help anthropomorphizing Siri it seems...).


If you turn off all the options, you'll get a dialog that asks if you want to turn Siri off altogether. (As I just did.)


Excellent! Thank you. (They also promised to remove the info they use to analyze my requests from their servers).


Siri-usly?


I once asked Siri "What city am I in?", to which she responded "Calling Ian." This was around 5am. I promptly disabled Siri and never looked back.


This was my experience, and took the same course of action.


I've seen Siri activate in error on at least two different phones this week. Not uncommon at all.


When Google introduced some new gadget (I think it was the Pixel 3), people in the front row had their Google Assistant activating in sync with the presenter. It was hilarious.

I think passive listening is dangerous. If I ever went back to using such tools I would prefer something like the (universally hated) Bixby button.



