@dotjayne @Viss I'm interested because I've had the same thought about these "AI"-assisted customer service tools such as #LorisAI, which are seeded with ill-gotten #CrisisLines data. Do we just start yelling into the phone so their "emotion detection" gives us high priority? Maybe just hire & train people to provide customer service?

#CrisisTextLine is hiring to go "well beyond SMS" to next-gen "artificial intelligence" and "natural language processing". Be aware of their for-profit spin-off #LorisAI, which claims sentiment detection, real-time conversation evaluations, and prompts for call centers.

"Crisis Text Line's software engineers develop the next-generation data platform that changes how millions of humans connect with volunteer crisis counselors, our clinical staff and the interaction with one another. Our platform needs to handle information at a massive scale and extend well beyond SMS as a channel. We're looking for engineers who bring fresh ideas from all areas, including information processing, distributed computing, large-scale system design, networking and data storage, security, artificial intelligence, natural language processing, UI design, and mobile."

Link to the job announcement (archived):
http://web.archive.org/web/20231005144819/https://www.crisistextline.org/join-our-staff/
#CrisisLines '#AI ' #NLP

Good article, difficult subject. I can be picky because of how close I am to this, so let me clarify:

#CrisisTextLine didn't "tell" #LorisAI to delete the data; it asked. Imagine: the for-profit it created was insulated so well that Crisis Text Line couldn't require anything of it. It "couldn't" even dissolve it--which was my request to them when I was still a volunteer. Yes, I have all the receipts, but so far little influence.

Which is why I'm involved in the early stages of a movement to organize #CrisisLines volunteers.

https://www.businessinsider.com/online-therapy-mental-health-apps-betterhelp-talkspace-cerebral-dark-side-2023-4

The drawbacks and downsides of online therapy, internet counseling

Mental-health startups are filling in a critical need for online therapy — but the "move fast and break things" model comes with a hidden cost.

Insider

@api If anyone wants to start a dogpile on #Reddit, I'll go first:

1. #JackHanlon, Global Head of Data at Reddit, served on US nonprofit #CrisisTextLine's Data, Ethics, and Research Advisory Board during the time it was developing its for-profit spinoff #LorisAI, customer service software supposedly trained on crisis conversations. Their advisory boards were apparently disbanded last year. Wayback link from before the boards were removed from the website:

https://web.archive.org/web/20220201023822/https://www.crisistextline.org/about-us/board-and-advisors/

2. Data harvesting at #Reddit. In this podcast interview, Jack Hanlon says “Google, Facebook, Microsoft, and OpenAI all used Reddit’s dataset to train their premier text (inaudible) models.” (excerpt from 12:22 – 13:40):
https://alldus.com/podcast/ai-in-action-e161-jack-hanlon-global-head-of-data-at-reddit/

#CrisisLines " #AI " #NLP #DataEthics #OpenAI #Google #Facebook #Microsoft

Board and Advisors - Crisis Text Line

Meet the experts on our Board of Directors and Advisory Boards that guide Crisis Text Line's work.

Crisis Text Line