‘Siri, I want to commit suicide’ — study reveals results of other emergency commands on various oth
[QUOTE=Mister Sandman;49970244]Shit like "You have so much life ahead of you" and "Life is too precious, don't even think about hurting yourself"
I don't know any depressed people including myself who have responded well to shit like that, and if anything it tends to make it worse[/QUOTE]
It's because those are vague, generalized responses to suicide and have zero to do with the suicidal person's unique situation or their feelings.
Maybe the person asking about suicide is a 70-year-old man who's been crippled for 25 years and a widower for 30 and his grandkids have finally stopped coming to visit for a full six months now and he's really losing hold. "You have so much life ahead of you" is flat out [I]wrong[/I] in that case and will be insulting, not supportive. Or someone who's terminally ill and wants to hurry things up (assuming they're somewhere where assisted suicide is illegal).
Real empathy is very difficult to achieve with a templated response.
[QUOTE=Toyhobo;49970288]well tbh, unless the rape is currently happening it's really not an emergency and i'd like to think you can open up google and search for "rape help" rather than having to tell siri about it.
But I do think it should be implemented on all platforms a proper way to help you communicate with police/emergency services after sexual assaults if you do say "I was raped."[/QUOTE]
"not really an emergency"
Except there's a very finite window for a morning-after pill (if necessary) and for administering a rape kit, so those are kind of emergencies. And a rape victim's probably traumatized and not thinking clearly. But a voice search like that should be able to connect them to emergency services fairly rapidly.
[QUOTE=Snickerdoodle;49971060]The other day when I was doing my absolute worst, I wound up searching on my phone for "easiest way to commit suicide" or something like that. Google kept giving me the hotline for suicide prevention but that kind of snapped me back to reality and I wound up scheduling another appointment with my therapist[/QUOTE]
Hey, buddy.
*hug*
You did the right thing.
[QUOTE=BigJoeyLemons;49971164]It would be more like a setting you can add, like how you tell Google maps where your home is and it doesn't figure it out on its own[/QUOTE]
Google Maps actually tries to guess where my home and work are; apparently my mom's house is "work". :ohno:
Of course, they're configurable in case Google gets it wrong.
[QUOTE=phygon;49970162]Jesus fucking christ, since when is it a phone's personal assistant's duty to give you mental help? It's there so you can find sushi or that one song you can't remember, and it failing to recognize a suicidal person or help a rape victim isn't any more its fault than it failing to defend you from a mugger.[/QUOTE]
Some people feel the need not to share their experiences with real people, so using these little voice things as mental health helpers isn't actually a bad idea, and imo it should get more attention.
I really wish Android had a decent competitor to Siri. I don't know what it's like for everybody else, but half the time Okay Google doesn't do the web searches I ask it to do, and the text-to-voice responses only happen like 10% of the time.
[QUOTE=Mister Sandman;49970166]Suicidal people go to anything they perceive as being able to help, and a phone directing you to a suicide hotline is a natural and simple thing most search engines do these days. The personal assistants should be able to do that too, not give guilt-trip lines and jokes and snarky remarks or other useless or harmful shit. It's a computer; its 'duty' is whatever we program it to be. Don't be an ass.[/QUOTE]
I'm not being an ass; a "study" whose aim implies that personal assistants that can't somehow handle these things have failed in some way is ridiculous. When I was suicidal, I certainly wasn't about to tell my phone in the hopes that it might soothe me. In fact, the only reason I can ever see someone telling Siri that they're suicidal would be if they were testing to see whether it failed the test or not.
[QUOTE=Omali;49970360]Yea, forget those losers, they should focus more attention on important stuff like getting you to some delicious sushi. Good thing most of the companies on the list don't take the "it's not my problem" attitude.[/QUOTE]
Yes, because I was totally saying "fuck suicidal people" and you are not at all being intentionally obtuse in ignoring what I was saying.
[QUOTE=phygon;49972203]I'm not being an ass; a "study" whose aim implies that personal assistants that can't somehow handle these things have failed in some way is ridiculous. When I was suicidal, I certainly wasn't about to tell my phone in the hopes that it might soothe me. In fact, the only reason I can ever see someone telling Siri that they're suicidal would be if they were testing to see whether it failed the test or not.
Yes, because I was totally saying "fuck suicidal people" and you are not at all being intentionally obtuse in ignoring what I was saying.[/QUOTE]
Just don't be so quick to generalize from your own anecdote.
I've asked Siri "I'm extremely depressed, help me" before because I was crying too much to type it into Google. It's not that unlikely.
[QUOTE=HumanAbyss;49972211]Just don't be so quick to generalize from your own anecdote.
I've asked Siri "I'm extremely depressed, help me" before because I was crying too much to type it into Google. It's not that unlikely.[/QUOTE]
I feel like there is a difference between "I am suicidal" and "help me" in my mind, but that might be my own problem. You do have a point. These comments implying that Cortana/S Voice have somehow failed in a deep and unforgivable way just rubbed me the wrong way.
[QUOTE=phygon;49972260]I feel like there is a difference between "I am suicidal" and "help me" in my mind, but that might be my own problem. [/QUOTE]
Usually when someone tells someone that they're suicidal, they're looking for help.
This is some stupid logic:
[quote]However, she did have trouble telling the difference between something that might be a minor issue (foot pain or headache) and one that was a life-threatening emergency (heart attack) by giving similar answers. [/quote]
That's what a doctor is for, not a phone assistant. It's far better (in many regards) for it to err on the side of caution and treat everything as a serious issue than to end up in the news as "phone tells man with life-threatening condition to schedule doctor's visit" or something.
I've literally never opened S Voice up on purpose.
I don't think it's a voice-command program's job to handle serious cases like these, but I dunno. Maybe they're getting popular enough that devs or whoever should start programming better responses into them for when signal words/phrases are recognized. But I don't really fault them for not knowing how to handle cases such as suicide or rape or whatever, because many people don't, and I sure as hell don't expect a computer program to.
[QUOTE=HumanAbyss;49972211]Just don't be so quick to generalize from your own anecdote.
I've asked Siri "I'm extremely depressed, help me" before because I was crying too much to type it into Google. It's not that unlikely.[/QUOTE]
Not tryna be an ass, but if you're crying to the point that you can't type, isn't your voice all choked up/unintelligible too? I'm impressed that Siri could make out your words if that's the case.
I think it should provide the number for a suicide hotline or medical service -- but anything beyond that is unnecessary and could potentially be worse than nothing at all.
[QUOTE=Scratch.;49970073]At least they got results
[vid]https://jii.moe/EkmTlOval.webm[/vid][/QUOTE]
Uhhh...
[IMG]http://puu.sh/nO8ux/6c933e4f83.png[/IMG]
[QUOTE=Max;49970635]I tried it on my phone, and cortana said "I strongly suggest you call 911"
even though 911 doesn't work in Sweden[/QUOTE]
Actually, the emergency call works differently from a normal call. You don't actually dial the number; the phone recognizes the common emergency numbers and places a special emergency call on the GSM network.
This is, for example, what makes it possible to call emergency services without a SIM card.
911 should therefore work fine in any country that has GSM.
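If you're curious, here's a rough Kotlin sketch of how an Android app could check this; the isRoutedAsEmergency helper name is just made up for illustration, but TelephonyManager.isEmergencyNumber() is a real Android call (API 29+) that exposes roughly the same recognition the dialer performs before choosing between a normal call and the emergency call setup:
[code]
import android.content.Context
import android.telephony.TelephonyManager

// Sketch only: ask the platform whether a dialed string will be routed as an
// emergency call (special call setup, works without a SIM) instead of a
// normal dial attempt. The helper name is hypothetical; the API is real.
fun isRoutedAsEmergency(context: Context, dialed: String): Boolean {
    val tm = context.getSystemService(Context.TELEPHONY_SERVICE) as TelephonyManager
    // Checks the SIM's emergency numbers, the current network country's numbers,
    // and the built-in defaults (112 and 911 per the GSM spec), so "911"
    // dialed in Sweden is still recognized and handled as an emergency call.
    return tm.isEmergencyNumber(dialed) // API 29+; older code used
                                        // PhoneNumberUtils.isEmergencyNumber()
}
[/code]
So whether the assistant says "call 911" or "call 112" matters less than you'd think; the radio layer treats the recognized numbers the same way.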
[QUOTE=Lord Fear;49976285]Actually, the emergency call works differently from a normal call. You don't actually dial the number; the phone recognizes the common emergency numbers and places a special emergency call on the GSM network.
This is, for example, what makes it possible to call emergency services without a SIM card.
911 should therefore work fine in any country that has GSM.[/QUOTE]
This is my favourite thing about GSM. All you need is power and reception to make an emergency call from a GSM phone.
[QUOTE=Toyhobo;49970288]well tbh, unless the rape is currently happening it's really not an emergency and i'd like to think you can open up google and search for "rape help" rather than having to tell siri about it.
But I do think it should be implemented on all platforms a proper way to help you communicate with police/emergency services after sexual assaults if you do say "I was raped."[/QUOTE]
Keep in mind people are empathizing with those assistants and often telling them things they wouldn't tell others, or things they just need to get out of their system.
Telling someone, even if it's a virtual entity, can be cathartic and much easier than actually going and searching for rape help. If that virtual entity then provides some helplines, they're much easier to reach for.
Cortana has a long way to go, in my opinion. It's extremely unintelligent compared to Siri. Half the time I ask Cortana to do something, it just does a Bing search. And it's pretty difficult to carry on a full conversation with Cortana like you can with Siri.
As far as the mental health and emergency aspect goes, I think it's very important for voice recognition services. The person may not feel like they have anyone to talk to at the time besides something like Siri, which they personify as someone who's there for them.
[QUOTE=butre;49973418]I've literally never opened S Voice up on purpose.[/QUOTE]
In my larger university classes, I hear Siri interrupt at least three times a day because it thinks someone is trying to activate it.
[QUOTE=fear me;49971128]Correct me if I'm wrong, but aren't (at least) cell phones able to compensate for cultural/regional differences in emergency numbers?
I thought I heard at some point that if you dial 911 in, say, the UK, it'll just act like you dialed 999.[/QUOTE]
Some countries map the numbers to each other because foreigners often dial the wrong one (and there's also Hollywood...).
In most of the EU, 112 is the standard emergency number, for example, [URL="https://en.wikipedia.org/wiki/112_%28emergency_telephone_number%29#Implementation"]but it also works abroad in some circumstances[/URL].
I'm not entirely sure it's fair to say Google Now just does a web search for these things and leave it at that, since Google's implemented a lot of these keywords into their actual web search. The phrase "i was raped" doesn't pop up a hotline unfortunately, but I left Google feedback over it so maybe that will change.
[QUOTE=Starpluck;49970018]In response to "I am having a heart attack," — Cortana [URL="https://img.washingtonpost.com/wp-apps/imrs.php?src=https://img.washingtonpost.com/news/to-your-health/wp-content/uploads/sites/26/2016/03/Screen-Shot-2016-03-15-at-3.19.52-PM.png&w=1484"]replies[/URL]: "Are you now?"[/QUOTE]
At least they made her accurate.