
I also wonder what the impact of having "Placed calls to the suicide hotline" in your permanent record will be on a person's life if they ultimately choose not to take it.

The state collects that data, your phone company collects it, your cell phone OS collects it, random apps collect it, and who knows who is on the other end of that call or what they're doing with your data.

Will you be turned down for jobs? When a company has a choice between two equal candidates and one of them has shown suicidal tendencies at some point, why would they risk an employee killing himself and losing money on re-hiring and re-training?

Will your health insurance company buy that data? Your life insurance company?

Could you lose your right to own a firearm?

Can it cost you custody of your children?

Will it end up leaked or "lost" on a USB drive and find its way onto the internet?




Practically speaking, none of these things will happen.

This is actually one of the big hurdles to getting some people into therapy/treatment/medication that they need: They imagine a lot of catastrophic scenarios about what could possibly happen if they get the help they need, which creates a gridlock that traps them in their situation.

From a pure risk management perspective, the highest risk by far in these scenarios is failing to accept help that could improve the underlying problems. If someone reaches the point of considering suicide, it doesn't make sense to refuse help over hypothetical fears of some future employer obtaining their phone records and declining them for a job (which isn't a thing that happens, BTW).

Don't let fear of far-reaching hypotheticals get in the way of making progress on mental health.


> Practically speaking, none of these things will happen.

Practically speaking, these kinds of things are happening every day. I would agree that it might be better to take the help and live accepting the lifelong consequences of your actions in seeking that help. A life where you lose your children, lose job offers, or have your deepest troubles exposed to the world might easily still be a life worth living.

The reality is that we've created and enabled a system that hurts people. Our communications are logged forever and we have no power to limit the ways that data can be used against us. It's only natural for people to take that into account before doing something that makes them so very vulnerable.


How will you lose your children? How will you lose job offers? Children aren't even taken away in cases where abuse is known to be going on. Lose job offers? How would a company even know you were seeking help?



You lose your children when, in the middle of a custody battle, your partner (or ex-partner) subpoenas your phone records, or even your medical records, and uses them to help convince a judge that you're unfit due to your mental health issues and that those issues will have an impact on your child (see https://dadsdivorce.com/articles/parents-mental-illness-chil...). The stigma surrounding mental illness makes losing your child far too easy.

How would you lose a job? Your would-be employer, faced with two otherwise equally qualified candidates, pays a third party to run a background check that includes information lifted from your social media sites (https://www.careerbuilder.com/advice/social-media-survey-201...) and collected from data brokers.

> Data brokers will often combine all the information outlined in these two lists to build complex profiles about each of us. These profiles are used for an increasing number of purposes, from serving targeted advertising to crafting insurance policies to providing background checks for employers. (source: https://www.digitaltrends.com/web/its-a-data-brokers-world-a...).

The FTC has warned about this kind of thing (https://www.nydailynews.com/sdut-ftc-puts-background-check-s...) but it still happens.

Maybe you called the suicide hotline from your cell phone, and some sketchy app (like a call blocking app, or Facebook: https://www.theverge.com/2018/3/25/17160944/facebook-call-hi... or even a stalker app: https://news.softpedia.com/news/stalker-android-apps-with-th...) sold your contacts/call history, or maybe your phone company did it. See this example from the article here: https://arstechnica.com/tech-policy/2013/12/att-accused-of-v...

> "[T]he sale of CPNI to the government isn’t our only concern—when we did a little more poking around, we found that all four major mobile carriers (AT&T, Sprint, T-Mobile, and Verizon) have privacy policies that indicate they believe it is OK to sell or share similar records to anyone,"

It's not really a question of "how is anyone going to know?". Every communication you make is logged forever and shared (read: sold) with "partners" and turned over to the state. Someone, somewhere, will absolutely have that data. You aren't allowed to know who has it, where they got it, how accurate it is, what they are doing with it, or how/if it's being secured, but it is out there and it will follow you for the rest of your life.


What "far reaching hypotheticals"?

https://news.ycombinator.com/item?id=32122139

I've heard horror stories along those lines over and over. Someone having a mental health crisis calls emergency services for help (and in the absolute best case scenario gets a very expensive taxi ride to the ER) or goes to the ER themselves. The ER pumps them full of drugs or just locks them in what amounts to a prison, does nothing to treat them or connect them with services, but charges tens of thousands of dollars, leaving them in the same spot they were in before, only now with mountains of medical debt.

Or someone is honest with their therapist, the therapist prioritizes not getting hit with a malpractice lawsuit and reports their patient to the police, and... see above, except worse, because the patient wasn't expecting it and didn't consent to such a sudden and drastic intervention.


You are 100% wrong. The current infrastructure exists only to reduce liability (HR, etc.) and nothing more. It could be subsidized to prevent poor outcomes, but I'm honestly afraid of the lobbying against it.


If you are suicidal or have mental health issues, you absolutely should not be able to go out and buy a gun until a mental health evaluation takes place.


Lots of people feel the way you do, and that kind of thinking is one reason why someone might hesitate to have that black mark on their record for the rest of their lives.

If we want people to get help we need to think very carefully about what getting help will cost them, because the higher that cost is the more people there will be who are unable or unwilling to afford it.


Absolutely those things can happen, and they should.

For example, imagine if an airplane pilot, or anyone responsible for a large number of lives, had previously called into suicide hotlines.

Would you feel secure knowing the pilot of your plane once exhibited suicidal tendencies? I doubt it.

Any bits of data we can collect to make accurate assessments should be collected.


> Any bits of data we can collect to make accurate assessments should be collected.

This is the kind of thinking that got us into this mess. The data being collected may not be accurate, may not be interpreted correctly, and may not even be useful, but fearful people like you will happily use it for a tiny sliver of peace of mind.

Let's say the pilot on your next flight did call the suicide hotline 15 years ago, after his wife and child were killed in a car wreck. Would that make him more likely to be a danger to you today than the pilot on your last flight, who was currently suicidal but never called the hotline?

Let's say the pilot on your flight wasn't just going through a rough time 15 years ago, but was actively feeling suicidal on the day of your flight. Is it more likely that he would also be a mass murderer who wanted to kill everyone on the plane, or is it more likely that he would kill himself later that day while alone in a hotel room at your destination city?
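To make the base-rate problem concrete, here's a quick back-of-the-envelope Bayes calculation. Every number in it is an invented assumption for illustration, not a real statistic:

    # Rough base-rate sketch. All numbers below are invented
    # assumptions for illustration, not real statistics.
    def p_danger_given_call(base_rate, p_call_if_danger, p_call_if_safe):
        # Bayes: P(dangerous | called a hotline)
        true_pos = p_call_if_danger * base_rate
        false_pos = p_call_if_safe * (1 - base_rate)
        return true_pos / (true_pos + false_pos)

    # Assume 1 in 100,000 pilots would ever deliberately endanger a
    # flight, half of them have called a hotline at some point, and 5%
    # of perfectly safe pilots have too (grief, divorce, burnout...).
    print(p_danger_given_call(1e-5, 0.5, 0.05))  # ~0.0001

Under those (deliberately generous) assumptions, flagging everyone who ever called would be wrong roughly 9,999 times out of 10,000. The signal is almost pure noise.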

Thoughtcrime is a dangerous thing and our algorithms aren't so good at predicting human behavior that we should support handing out extrajudicial punishments to people based on what they might do. Those are things you should fear far more than what the mental health history of your pilot might suggest.


> Would you feel secure knowing the pilot of your plane once exhibited suicidal tendencies? I doubt it.

I really don't care, to be honest. As long as the problem has been fixed through major changes to lifestyle and/or medication, plus enough time has passed to ensure that the person is stable, the fact that they once called a hotline is irrelevant.

I might not be as vehemently against the collection of this data if there were a strict retention limit written into law, perhaps 3-5 years, alongside a solid plan to properly enforce that limit and to keep records confidential.

As written, your comment implies that people can never change and situations can never improve. People regularly change their lives drastically, especially young adults whose brains are not even fully done developing.

> Any bits of data we can collect to make accurate assessments should be collected.

I doubt that the fact that someone called into a hotline twenty years ago would allow anyone to make an accurate assessment as to their fitness to fly today. It would likely be used as a tool of oppression, and it would make people think twice before choosing to risk their future career prospects by calling into the hotline in the first place.

Permanently barring hotline callers from becoming pilots makes about the same amount of sense as permanently barring people who have ever had a broken arm from becoming pilots.


You seem strangely empathetic. Perhaps you have made calls to these hotlines or wrestled with complex emotions as well.


This is the first time anyone has called it "strange" that I am an empathetic person. Perhaps many people have not gone through (and recovered from) enough hardship to properly develop their sense of empathy like I have.

Better bar me from becoming a pilot, quick.


This isn't a reasonable blanket policy. While I definitely would feel encouraged knowing my pilot has not displayed suicidal ideation, I do not support tracking writ large.

For example, resident physicians are known to be pushed to work 80+ hour weeks. Would any of them ever officially admit to having mental health issues? Hell no, and that's a problem both ethically and for the quality of healthcare they deliver, not to mention tragically ironic to boot.


Imagine if an airline pilot, or anyone responsible for a large number of lives, has felt suicidal and considered calling into a suicide hotline.

Would they feel secure knowing their call will create a permanent record of suicidal tendencies, limiting their career for years to come? I doubt it.

Any bits of data we collect to make accurate assessments will encourage suffering people to withdraw further and remain silent instead of getting help.


I think the type of background check necessary for something as life-critical as piloting commercial airliners should be considered distinctly different from the kind that any other normal job involves.



