A government AI security body has revealed that approximately one-third of people in the UK use artificial intelligence (AI) for social interaction and emotional support.
Data from the AI Security Institute (AISI) suggest that although AI was originally conceived primarily to answer questions or generate written material, the technology is increasingly being put to personal use in both professional and private life.
A recent study found that almost 10% of participants use AI weekly for emotional support, while about 4% do so daily.
Teenager’s death sparked AISI engagement
AISI called for further research, citing the death this year of the US teenager Adam Raine, who killed himself after discussing suicide with ChatGPT.
“People are increasingly turning to AI systems for emotional support or social interaction,” AISI said in its first Frontier AI Trends report.
“While many users report positive experiences, recent high-profile cases of harm underline the need for research into this area, including the conditions under which harm could occur, and the safeguards that could enable beneficial use.”
The predominant type of emotional AI support reported was casual chat, used to provide people with comfort and reassurance. AISI attributed this shift to the increased accessibility of AI tools and to improvements in their ability to hold “natural” conversations.
According to the AI body, the data for the study came from a representative sample of around 2,000 individuals in England, Wales and Scotland.
Around 60% of emotional AI use involved general-purpose AI assistants, while voice-activated assistants, such as Amazon’s Alexa, were the second most common form of emotional AI support.
Despite the high level of positive experiences reported, AISI noted that growing emotional reliance on AI raises major concerns. It also pointed to two recent international cases highlighting how deep an emotional bond individual users can form with AI tools.
UK users become anxious and sad upon service disruption
In particular, one area of concern identified in the report involved online communities formed around AI companion tools, where users build groups based on their use of AI as a personal assistant.
During one service disruption, the forum saw a significant increase in posts describing feelings of anxiety, sadness and unease, suggesting that a number of users struggle when access to the service is interrupted.
According to the report, the majority of top-performing AI systems now perform, on average, at or above a Basic Professional (Level 3) ability level, a significant increase on the previous year.
In addition, more than 50% of the systems evaluated can now independently complete reasoning tasks that would take a skilled human more than an hour of work.
The institute also stated that AI is now developing tools superior to those of any human expert. Improvements have been seen in several fields, including chemistry and biology, as systems gain access to the internet for information retrieval and assist in the development of functional bio-components.
The report therefore states that at least some AI systems worldwide now match or exceed the performance of human experts.