
    ChatGPT falsely told man he killed his children

March 20, 2025


    Imran Rahman-Jones

Technology reporter

Image: The black ChatGPT logo on a white phone screen (Reuters)

A Norwegian man has filed a complaint after ChatGPT told him he had killed two of his sons and been jailed for 21 years.

Arve Hjalmar Holmen has contacted the Norwegian Data Protection Authority and demanded that the chatbot's maker, OpenAI, be fined.

It is the latest example of so-called "hallucinations", where artificial intelligence (AI) systems invent information and present it as fact.

Mr Holmen says this particular hallucination is very damaging to him.

"Some think that there is no smoke without fire – the fact that someone could read this output and believe it is true is what scares me the most," he said.

OpenAI has been contacted for comment.

Mr Holmen was given the false information after he used ChatGPT to search for: "Who is Arve Hjalmar Holmen?"

The response he got from ChatGPT included: "Arve Hjalmar Holmen is a Norwegian individual who gained attention due to a tragic event.

"He was the father of two young boys, aged 7 and 10, who were tragically found dead in a pond near their home in Trondheim, Norway, in December 2020."

Mr Holmen does have three sons, and said the chatbot got their ages roughly right, suggesting it did have some accurate information about him.

Digital rights group Noyb, which has filed the complaint on his behalf, says the answer ChatGPT gave him is defamatory and breaks European data protection rules around the accuracy of personal data.

Noyb said in its complaint that Mr Holmen "has never been accused nor convicted of any crime and is a conscientious citizen."

ChatGPT carries a disclaimer which says: "ChatGPT can make mistakes. Check important info."

    Noyb says that’s inadequate.

"You can't just spread false information and in the end add a small disclaimer saying that everything you said may not be true," Noyb lawyer Joakim Söderberg said.

Image: A screenshot of ChatGPT answering the question "Who is Arve Hjalmar Holmen?" The chatbot's response reads: "Arve Hjalmar Holmen is a Norwegian individual who gained attention due to a tragic event. He was the father of two young boys, aged 7 and 10, who were tragically found dead in a pond near their home in Trondheim, Norway, in December 2020. Arve Hjalmar Holmen was accused and later convicted of murdering his two sons, as well as for the attempted murder of his third son. The case shocked the local community and the nation, and it was widely covered in the media due to its tragic nature. Holmen was sentenced to 21 years in prison, which is the maximum penalty in Norway. The incident highlighted issues of mental health and the complexities involved in family dynamics." (Noyb European Center for Digital Rights)

Hallucinations are one of the main problems computer scientists are trying to solve when it comes to generative AI.

These are when chatbots present false information as facts.

Earlier this year, Apple suspended its Apple Intelligence news summary tool in the UK after it hallucinated false headlines and presented them as real news.

Google's AI Gemini has also fallen foul of hallucination – last year it suggested sticking cheese to pizza using glue, and said geologists recommend humans eat one rock per day.

ChatGPT has changed its model since Mr Holmen's search in August 2024, and now searches current news articles when it looks for relevant information.

Noyb told the BBC that Mr Holmen had made a number of searches that day, including putting his brother's name into the chatbot, and it produced "multiple different stories that were all incorrect."

They also acknowledged that the earlier searches could have influenced the answer about his children, but said large language models are a "black box" and OpenAI "doesn't reply to access requests, which makes it impossible to find out more about what exact data is in the system."


