#SISDA #SavingLivesOneWordAtATime
Background and Terms of Use. Last updated: 8th October 2021
Why is SISDA Needed
Safeguarding, mental health and wellbeing leads, commissioners and regulators need to collect evidence of safe practice within services, products and innovations that aim to improve mental health and wellbeing outcomes in clinical, social care, office and educational settings.
The need for this has increased. Inability to gain access to effective online support is regularly recorded in Coroners' 'Prevention of Future Deaths' reports. What if one could reduce that risk simply by listening better? What if that could be achieved, and one could also:
- reduce the need for the health and care system to change what it does, by simply improving the way it listens
- identify early signs of very poor mental health in any setting and enable timely support
- enable the better management of risk in case loads
- signpost effectively to specialist support in the third sector
- reduce the pressure on the mental health system by more effectively signposting to appropriate resources
- enhance equity of access to all mental health support services
- be acceptable to young people, disadvantaged groups, and BAME and low-income communities
- be applied to both physical and digital mental health support services
- reduce the risk associated with anonymous and personalised support, text support, apps, and digitally enabled pathways
- be of value to the public, third sector and private mental health and wellbeing services
- be free at the point of use for the NHS
SISDA enables this.
_______________________
What is SISDA
SISDA is:
- A list of words and phrases associated with the presentation of self-harm/suicidal ideation in language. It is statistically and clinically relevant, and made freely available to online resources used in clinical, social care, office and educational settings.
- Not a medical device as defined by the MHRA and FDA. That said, it is being developed using NLP in line with the forthcoming BSI standard for the validation of AI in healthcare, and CarefulAI's own safety-driven product development process, PRIDAR.
- A resource maintained by private, public and third sector partners who seek to support multicultural and multigenerational communities.
The SISDA Partnership is built on the following values:
- People in crisis should be signposted as quickly as possible to resources that can reduce the symptoms of crisis
- SISDA should help system and service providers identify language and signals in a way that protects people's privacy
- SISDA should be supported by robust and ethical data science, and be reviewed and validated by the public and care service providers
The SISDA Partnership's mission is to:
- Have people and the care community lead the development and validation of SISDA
- Enable users to manage the clinical and technical risk of using the library in line with the laws of England and Wales, and appropriate standards
- Reduce the risk associated with digital tools that engage people using free text or language
- Maintain a library of crisis language & signals that is fit for purpose for all care providers in all settings
- Make SISDA freely available to members of the UK health and social care community who specify how they will act on the library in a safe way
Membership of the SISDA Partnership
As a partnership we are keen to work with people who can evidence action and commitment in line with our values and mission.
Partnership Meetings are held online on the first Friday of the month in April, July, October and January.
At Partnership Meetings, the past three and next three months' activities in support of SISDA are discussed.
The SISDA Development Board's proposals for the next 12 months are openly discussed and published as the SISDA Development Pipeline.
To register your interest in the Partnership, please contact joseph.c(at)carefulai.com.
SISDA's Development Board
- Geraldine: Clinical Lead
- Tom: Clinical Lead
- Sam: English Child Language
- Tony: Adult English Language
- David: Adult Welsh Language
- Joe: NLP and Data Protection
_______________________
The SISDA Model Facts
SISDA Fitness for Purpose Statement
CarefulAI and the SISDA development group will always recommend the timely presence of a person to review free text/voice submissions gathered using online tools. Combined with such a person, SISDA offers the potential to speed up the triaging of submissions.
SISDA Model Summary
Model Name: SISDA
Version number: 3.0
Published: 1st of October 2021
Outcome: Identification of suicidal ideation and cognitive distortion in online text
Output: % likelihood that a person needs to be quickly referred to a clinician for triage (depending on pattern matching)
Target Population: Systems that seek to triage people seeking mental health and wellbeing support
Time of Prediction: Less than 1 second (depending on pattern matching)
Input data source: Online forms, Speech to Text translations and social media
Input data type: Free text containing more than 4 words
Training details: Social media, Suicide Researchers and NLP
Model type: Term Frequency / Inverse Document Frequency (TF-IDF)
Conditions of use: https://www.carefulai.com/sisda.html
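For illustration only, the sketch below shows one way a TF-IDF model of this kind might score incoming free text against library phrases. It is a minimal Python example assuming scikit-learn; the phrases are stand-ins drawn from the illustrative examples in the Background section, and the scoring and output format are assumptions, not CarefulAI's implementation.

```python
# A minimal sketch only: the real SISDA library, matching method and
# thresholds are defined by each user under the Conditions of Use.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical stand-ins for SISDA entries, taken from the illustrative
# examples quoted in the Background section.
SISDA_PHRASES = [
    "asleep and never wake",
    "don't want to exist",
    "kill myself",
]

# 1- to 5-word n-grams, mirroring the n-gram lengths described in the Background
vectoriser = TfidfVectorizer(ngram_range=(1, 5))
phrase_matrix = vectoriser.fit_transform(SISDA_PHRASES)

def referral_likelihood(text: str) -> float:
    """Score free text (more than 4 words) against the phrase library.

    Uses the best cosine similarity as a crude stand-in for the
    '% likelihood of needing rapid referral' output.
    """
    scores = cosine_similarity(vectoriser.transform([text]), phrase_matrix)
    return float(scores.max()) * 100.0

if __name__ == "__main__":
    submission = "some days i just don't want to exist any more"
    print(f"Referral likelihood: {referral_likelihood(submission):.1f}%")
```

Any production use would still need the human in the loop described in the Fitness for Purpose Statement above.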
Background
In 2017, Burnap et al. built on data from Tumblr and Twitter that had been self-annotated with the tag 'Suicide'. They created n-grams of 1 to 5 words in length. The top 500 terms were collated and ranked by frequency. These terms were further analysed by two experienced suicide researchers to remove terms not specifically related to suicide, as well as duplicate keywords. This resulted in a final list of 62 keywords and phrases that suggested possible suicidal intent. Illustrative examples are 'asleep and never wake', 'don't want to exist' and 'kill myself'. In 2021, Connor et al. used the 62 keywords and phrases and the same approach to identify the associated cognitive distortion phrases. Combining the suicide and cognitive distortion keywords and phrases, and using ensemble methods, the new model was tested on text gathered by systems that manage interactions between people and care providers. It was found to correctly identify similar language with a precision of 69.31%, recall of 72.9% and accuracy of 72.8%.
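In outline, the n-gram collation and ranking step described above can be reconstructed as follows. This is a hedged sketch: the toy corpus, tokenisation and cleaning choices are assumptions for illustration, not Burnap et al.'s code, and the reduction from 500 terms to 62 keywords was a manual review by suicide researchers, not an automated step.

```python
# Illustrative sketch of the ranking step only; the clinical review that
# reduced the top 500 terms to 62 keywords was a manual, expert process.
from collections import Counter
from itertools import islice

def ngrams(tokens, n):
    """Yield n-grams (as tuples) of length n from a token list."""
    return zip(*(islice(tokens, i, None) for i in range(n)))

def rank_terms(posts, max_n=5, top_k=500):
    """Collate 1- to max_n-word terms from posts and rank by frequency."""
    counts = Counter()
    for post in posts:
        tokens = post.lower().split()  # tokenisation choice is an assumption
        for n in range(1, max_n + 1):
            counts.update(" ".join(gram) for gram in ngrams(tokens, n))
    return counts.most_common(top_k)

# Toy corpus standing in for the self-annotated Tumblr/Twitter posts
posts = [
    "i want to be asleep and never wake up",
    "i don't want to exist any more",
]
for term, freq in rank_terms(posts, top_k=10):
    print(freq, term)
```

The output of such a ranking is only a candidate list; the final 62-keyword library was the product of expert review.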
Governance
Each new word or phrase proposed for the SISDA library is judged by a Phrase Team. These teams consist of at least two clinicians and one member of the community whose language is going to be matched against the SISDA library. The team classifies language against 7 criteria. Only when every member of the team agrees that a new word or phrase potentially evidences self-harm or suicidal intent does it enter the SISDA library.
_______________________
SISDA's Development Pipeline
2021
Oct-Dec:
- Make SISDA available to government digital agencies and front line services supported by the English and Welsh NHS.
- Develop Phrase Team leads in the Irish and Scottish NHS.
- Support technology partners to set up their own Phrase Teams in support of young people, disadvantaged groups, and BAME and low-income communities.
2022
Jan-March:
- Engage front line systems suppliers to develop API documentation and partnerships. Begin publishing SISDA usage statistics.
April-June:
- Expand SISDA's language set to reflect needs of people who live and work in Greater London
July- Sept:
- Work with partners to understand whether suicidal ideation and cognitive distortion language, coupled with indices of social deprivation, can be used to better predict: the support needed for people deemed ineligible for mental health assistance; focused mental wellbeing support; the stratification of high-risk patients; and the need for specialised support.
_______________________
SISDA Conditions of Use
CarefulAI shall:
- Enable SISDA to be provided free at the point of use in the UK NHS under an AGPLv3 licence
- Provide registered users with access to an API with the most up-to-date SISDA Library and Model Summary in a 'ReadMe' file.
- Manage efforts to maintain SISDA's fitness for purpose across all age groups and cultural and engagement settings
- Indemnify the SISDA Partnership and Development Board against financial loss associated with their support of SISDA's use.
- Chair the SISDA Partnership and Development Group
SISDA Library Users shall:
Propose:
- A Clinical / Safeguarding lead for potential membership of the SISDA partnership group
- Methods to ensure SISDA correctly represents the audiences that they wish to engage
- How they are going to match the SISDA library against the data they acquire, and their procedure for acting upon matches, e.g. online submissions that match SISDA phrases being prioritised for direct contact with a person trained in suicidal ideation management (a minimal sketch follows the Conditions of Use list below)
- Separate licence agreements with the SISDA Development Board if it is expected that they or their partners will use the library in the private sector or outside the UK
Accept:
- They will maintain a CarefulAI register detailing the version of SISDA they are using
- They need to mitigate the risk that, in at least 30.71% of cases, SISDA cannot be expected to identify suicidal ideation language in online text
- SISDA's usefulness will be dependent on the method of pattern matching SISDA words and phrases to target data
- That they take full responsibility for the pattern matching technique they use with SISDA
- They will evidence the parts of their risk or DCB0129/0160 documentation detailing the risks and mitigations they accept by using SISDA
- They will inform CarefulAI of the number of occasions each calendar month on which the SISDA library has been used to trigger support
- They will use CarefulAI to gather and agree the evidence needed to respond to Coroner's Court enquiries associated with SISDA's use
- SISDA is licensed in the UK NHS under an AGPLv3 licence; as such, derivative developments need to be embodied in the SISDA library
- This web page is referenced in any Terms of Use associated with SISDA's application
- They will stop using SISDA in the UK if the support they reference does not focus primarily on https://www.nhs.uk/mental-health/feelings-symptoms-behaviours/behaviours/help-for-suicidal-thoughts/
- They will cease using SISDA if they receive an email from CarefulAI evidencing that they are acting contrary to SISDA's values, mission, developments or fitness for purpose, or are not complying with the library's Conditions of Use.
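As referenced in the 'Propose' list above, the following minimal sketch shows the shape of one possible user-side procedure: matching acquired text against library phrases, escalating matches to a trained person, and counting monthly triggers for reporting to CarefulAI. Exact substring matching is used only because it is the simplest technique to show; the phrases are illustrative stand-ins, and the matching technique, thresholds and escalation route remain each user's responsibility under these Conditions.

```python
# A hedged sketch, not a recommended implementation: exact substring matching
# is the weakest pattern matching technique, and users take full responsibility
# for the technique they choose.
from collections import Counter
from datetime import date

SISDA_PHRASES = ["kill myself", "don't want to exist"]  # illustrative stand-ins
SISDA_VERSION = "3.0"  # users must register the version they deploy
monthly_triggers: Counter = Counter()  # (year, month) -> count, for reporting

def triage(submission: str) -> bool:
    """Return True if the submission should be prioritised for direct contact
    with a person trained in suicidal ideation management."""
    matched = any(phrase in submission.lower() for phrase in SISDA_PHRASES)
    if matched:
        today = date.today()
        monthly_triggers[(today.year, today.month)] += 1
    return matched

if triage("some days i feel like i want to kill myself"):
    print("Escalate: route to a trained responder now.")
print("Triggers to report this month:", monthly_triggers)
```

A production procedure would normally replace the substring test with the user's documented pattern matching technique and feed the monthly count into their CarefulAI register.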
_______________________
SISDA's Story: 'Standing On the Shoulders of Giants'
The Giants:
- Paul: Risk stratification
- Geraldine: Clinical Lead
- Sam: Child Language Lead
- Tony: Adult English Language Lead
- David: Adult Welsh Language Lead
- Atia: General Practice
- Tom: Child and Young People's Mental Health
- Anne: Child and Young People's Mental Health
- Ruth: Adult Mental Health
- James: Mental health innovation
- Jimmy: NHS care text service design
- Victoria: Advocate of user centred design
- Ross: Mental health supplier engagement
- Fola: Digital transformation
- Deborah: Commissioning policy
- Andrea: Safeguarding and social care
- Phil: Innovation in health care
- Tom: Innovation in health care
- Marcus: Open source publishing
- Dean: Academic digital innovation
- Pete: Academic NLP research
- John: Academic NLP research
- Jon: Academic social science research
The Story
In 2015, Joe, the Chairman of CarefulAI, met Paul, a psychologist who was looking at high-risk language. They shared the understanding each had acquired about predicting human behaviour from the signals generated when humans communicate online. CarefulAI was born out of this meeting and the realisation that technology, ML and AI have a role in augmenting human interaction within the care and risk reduction domain.

In 2017, Joe met Geraldine. At that time he had just joined NHSD as an adviser in Machine Learning & AI; she was the NHSE lead on Mental Health and inspired him to focus on mental health. Deborah, the then SRO for 111/999, used his skills to open innovation in the area of mental health triage. In this work, the end-to-end pathway associated with mental health support became transparent, and with it the discovery that the proprietary nature of online service providers can be a barrier to effectively assessing risk in language, sharing risk insights and triaging effectively. Contact with clinicians, digital leads and end users, e.g. Ruth, Anne, Atia, Victoria, Fola, Ross, Andrea, Tony, Sam and David, has highlighted the value of user-led design. Contact with Dean, Phil, Tom, Pete, Jon, John and Paul highlighted that developing technology in the open reduces the risk to the public and support services.

In 2018, Joe became the ML & AI innovation lead with NHSD's Code for Health. At that time James, Tom and Jimmy highlighted the main barrier to worthwhile developments in the mental health triage domain: people presenting with language associated with self-harm. Via research, and contact with Pete and John, Joe was able to surface language that should be of concern to those involved in digital triage. The 'online English' language version of this was made available to HLP in 2019, and the suicidal ideation language list was embodied into the DCB0129 design of a mental health triage service.

In 2020, Anne, during her development of an Anxiety and Commitment Therapy Bot for Minecraft, used SISDA to trigger interventions. The project involved Sam (then aged 11) leading the design thinking of a team of two UCL professors, three computer scientists and IBM's Research and Innovation Group. Sam was commended by IBM for his work to normalise SISDA for use with young people, and joined the SISDA Development Board as young person lead in 2021.
The main challenge for SISDA from 2021 onwards is maintaining its fitness for purpose across all age groups and cultural and engagement settings.