Terry Newell

Terry Newell is currently director of his own firm, Leadership for a Responsible Society.  His work focuses on values-based leadership, ethics, and decision making.  A former Air Force officer, Terry also previously served as Director of the Horace Mann Learning Center, the training arm of the U.S. Department of Education, and as Dean of Faculty at the Federal Executive Institute.  Terry is co-editor and author of The Trusted Leader: Building the Relationships That Make Government Work (CQ Press, 2011).  He also wrote Statesmanship, Character and Leadership in America (Palgrave Macmillan, 2013) and To Serve with Honor: Doing the Right Thing in Government (Loftlands Press, 2015).

Think Anew

Recent Blog Posts

Want to Talk with Someone Who Died?  AI Has a Plan

If you lost a brother to road-rage murder and could have him speak to the judge before the shooter was sentenced, would you?  If you grieved over the death of your young child but could speak to her one more time, would you want to?  Two people answered “yes” and AI helped them do so.  AI cannot raise people from beyond the grave but can now make such “conversations” possible.

Stacey Wales lost her brother Chris Pelkey to a road-rage shooting.  At the sentencing of his killer, an AI re-creation of Chris delivered a victim impact statement to the judge.  "Hello. Just to be clear for everyone seeing this, I'm a version of Chris Pelkey recreated through AI that uses my picture and my voice profile," his AI avatar said.  The message was produced after Stacey interviewed 48 people who knew Chris about his life and what he would want to say, and it included her own thoughts. In the video, Chris addressed the shooter: "It is a shame we encountered each other that day in those circumstances. In another life, we probably could have been friends. I believe in forgiveness and in God who forgives.”

Using virtual reality, South Korean mother Jang Ji-sung met her deceased daughter Na-yeon one last time.  “Mom missed you so much, Na-yeon,” she said, as the digitized version of her daughter ran toward her calling “Mom.”  The child’s avatar was created by 3-D scanning a child model, then shaping the result with photos and videos of Na-yeon and approximating her voice from recordings of child actors.

These are two of a growing range of AI options that enable those who have passed on, in a sense, to stick around.  Using videos and a re-created voice, the BBC offers a creative writing course taught by Agatha Christie.  Some platforms (e.g., Google, Meta) allow a person to grant access to their texts, email and social media posts after their death so that AI can create “griefbots” and avatars, enabling conversations with those left behind.

That AI and associated technologies make such things possible does not answer whether they are desirable, beyond serving those who profit from their commercial applications. Legal questions come quickly to mind: who owns a person’s digital “life” after death?  Who’s accountable for what is done with the deceased’s texts, emails, videos, posts and artistic works? Do people need to revise their wills? Name a digital executor or ombudsman?

Ethical questions also are being raised.  In the case of AI applications such as the Chris Pelkey and Jang Ji-sung experiences, does a digital afterlife reflect the values, beliefs and preferences of the deceased or manipulate them?  Does creating “griefbots” and similar platforms honor or violate religious beliefs and laws?  Do such AI applications respect or demean human dignity?  Should there be ethical guidelines and, if so, who should develop and monitor them? Does an AI “conversation” with a noted historical figure, say Martin Luther King, Jr., distort or accurately reflect the person, the history and his impact? Where does moral responsibility lie for what is done, what is shared and its impact on users?

Some profound questions are psychological. Does an AI-constructed communication with a deceased loved one aid, delay or harm the process of grief and healing?  Does it deepen connection with survivors who may participate or does it increase the isolation of the bereaved who are already experiencing separation and loneliness? Should psychologists or grief counselors play a role?

The technology already used to create deepfakes of living people and current events raises epistemological questions that apply to these AI applications as well. Will such “conversations” implant false memories or distort the memory of a loved one?  Will they further blur the distinction between truth and falsehood that already plagues our culture and technologies?

If our past with digital technologies is any guide, these AI applications and their impact will likely spread faster than most of these difficult questions will find answers. There is no clear vehicle, whether professional association, government regulatory agency or legislative body, that will quickly come forward or gain support to help.

Peter Listro, 83, knew he had blood cancer and less than a year to live.  His son, Matt, asked him to work with StoryFile, which uses AI to create an avatar of his dad that the family could converse with after he passed. Peter was filmed responding to questions the family prepared, encouraging him to talk about and reflect on his life. Viewing the result, Matt commented on the experience. “It won’t change the reality that I’ve lost my father,” he said.  “But it lessens the blow ever so slightly, knowing that when he does die, it won’t be the last time I’ll ever have a conversation with him.” Yet, Matt added, “Seeing him cry on camera was really difficult. It was a reminder that this is a human I love that I want to console. But you can’t console a video clip.”

Photo Credit: tenchat.ru

Profiles in Character: FDR and George Marshall Put Country Over Popularity to Prepare America for War