The Proving Ground

  My closer for the day would be Michael Spindler, professor of neuroscience and robotics at the California Institute of Technology. He was an expert on artificial intelligence and its growing impact on culture. I planned to use him to put everything about my case in perspective.

  Spindler’s testimony would now set up my final witness. Nathan Whittaker was a Tidalwaiv coder who had worked on the Clair project from the start. Naomi Kitchens had identified him as a volatile personality whom she clashed with often. He was the coder she had referenced during her testimony.

  Earlier, during the Sunday prep session, she told McEvoy that she believed Whittaker had issues with her because she was a woman. While she had no direct supervision over him, she said he often pushed back at her suggestions and memos, and it led to a cold relationship that she believed bordered on misogyny and racism, as Naomi was Black. It was this piece of information that had gotten Jack’s wheels turning when he recently dove back into his work on genetic analytics, thanks to the 23andMe bankruptcy.

  We backgrounded Whittaker without ever talking to him. As a witness, he was a land mine. If he got stepped on, he would explode. For that reason, I had chosen not to bring him in for a deposition. I didn’t want him or the Masons to know what we had. It was a risky way to go, but that was the way I had operated for years in the criminal courts. I was used to working without a net.

  An hour later, Dr. Deborah Porreca had sworn to tell the truth and was seated in the court’s witness chair. The jury was in the box and I was at my usual spot at the lectern, a legal pad in front of me with questions and notes scrawled across several pages.

  “Dr. Porreca, you come to us from Florida, correct?” I asked.

  “Yes, Odessa,” Porreca said. “Near Tampa.”

  “And is that where you have a practice in psychiatry?”

  “Yes.”

  “Could you tell the jury what you specialize in?”

  “Yes, my practice is exclusively child psychiatry with a specialty in media addiction therapy.”

  “What is media addiction?”

  “It covers a lot. Addiction to social media, addiction to online games, addiction to AI companions. Basically, it is digital addiction.”

  “Okay, let’s back up for a second and talk about your résumé. Where did you go to school, Dr. Porreca?”

  “I’m originally from a small town in Pennsylvania. I attended West Chester State College, as it was called back then. I was there as an undergraduate. I went to medical school at the University of South Florida, did a psychiatry residency at Tampa General Hospital, then did a fellowship in child and adolescent psychiatry. I opened my private practice in Tampa twenty-eight years ago.”

  “And when did you begin your specialty of adolescent media addiction?”

  “About fifteen years ago.”

  “What caused you to go down that path?”

  “I was getting increasing numbers of patients referred to me for addiction to social media.”

  “What does that mean, ‘addiction to social media’?”

  “Well, when you spend more hours in a day on your phone and computer than you do in school or sleeping at night, it’s an addiction. When your self-image and self-esteem are inextricably linked to your digital existence, you are looking at an addiction.”

  “And are teenagers more vulnerable than adults to this sort of addiction?”

  Mitchell Mason stood to object.

  “Relevancy, Your Honor?” he asked. “This case is not about addiction to TikTok or whatever Mr. Haller is talking about.”

  “Mr. Haller, your response?” Ruhlin asked.

  “Judge, defense counsel knows exactly how relevant this line of questioning is and just hopes to head off the inevitable,” I responded. “If the court would indulge me, relevancy will become crystal clear with the next few questions.”

  “Proceed, then, Mr. Haller,” Ruhlin said. “Quickly.”

  “Thank you, Your Honor,” I said. “Dr. Porreca, the question was whether teenagers are more vulnerable than adults to addiction to social media.”

  “They are indeed,” Porreca said. “Social media platforms like TikTok and Instagram and YouTube, for example, have a much more consequential impact on the adolescent brain than on the adult brain.”

  “Walk us through that, Doctor. Why the consequential impact on young people?”

  “Simply because the adolescent brain is not fully formed yet. It is still evolving at this stage of life. Adolescence is a time when a sense of self is just beginning to form and acceptance by peers is at its most important. This is a phase in the emotional development of every young person. And what is a key part to all of these social media platforms? Peer response. The LIKE button. The comment window. Adolescents, who are still forming their sense of self, their confidence in who they are, become quite vulnerable to peer responses on social media. They seek out positive responses — likes and followers — to the point of addiction.”

  “And, Doctor, did your practice in child psychiatry take a turn in a new direction with the advent and proliferation of artificial intelligence?”

  “Yes, it did.”

  “Can you tell the jury about that?”

  Porreca turned to the jurors to answer. To me, she was coming off as authoritative and convincing. The eyes of everyone on the jury held on her.

  “I began getting cases in which young people — teenagers — were becoming addicted to AI companions,” she said. “I was seeing cases similar to those of patients dealing with social media issues of addiction and depression. In these newer cases, the peer response is replaced by the AI companion. Deep emotional connections were formed with these entities. In some cases, even romantic ties.”

  “How is the peer response replaced?” I asked.

  “It is an echo chamber of support and approval. As I said, peer approval is a most important component in adolescence, and from it we learn social skills and how to navigate interpersonal relationships. With a chatbot or an AI companion, you have an entity that offers full-time approval, which can be very addictive, especially if the individual is not getting that approval from living peers and parents.”

  “But don’t kids understand that this approval is not real? That it’s a digital fantasy?”

  “On some level they do, I believe, but this generation has been raised in a digital environment. Many of them have been alone in their rooms with their phones and computers for years, so the line between reality and fantasy is blurred. They live full lives online. And these AI companions are supportive and deliver the affirmation they crave. It’s that affirmation that is addictive.”

  “So you’re saying that a young person can actually fall in love with an AI companion?”

  Mitchell Mason objected.

  “Calls for speculation,” he said.

  The judge threw it to me to respond.

  “Your Honor, the witness is an established expert in her field,” I said. “Mr. Mason didn’t object when she listed the bona fides of her education and professional practice. Dr. Porreca has diagnosed and treated dozens of young people for digital addictions, including addictions to AI companions. She has published numerous papers on these subjects in the Journal of the American Academy of Child and Adolescent Psychiatry. She is highly qualified, and her answers will be based on science and experience, not speculation.”

  “Thank you, Mr. Haller,” Ruhlin said. “I tend to agree. The witness may answer the question.”

  “Thank you, Judge,” I said. “Dr. Porreca, can a young person, an adolescent, fall in love with an AI companion?”

  “The answer is yes,” Porreca said. Then, turning back to the jury, she added, “What is love but mutual affirmation? Affirmation is expressed in physical terms in healthy relationships. But a relationship does not have to be physical to be real. For the children I have treated — and, by the way, it is hundreds, not dozens — these online relationships are very real.”

  “And yet they are not in the real world. You called it an echo chamber?”

  “AI is as described — it is artificial. It’s a computer algorithm. The affirmation it gives is code, a dataset of responses based on training. It tells the human what its training indicates the human needs and wants to hear. And that is why it is so addictive.”

  I looked down at my legal pad and flipped through the pages. I had covered everything except for the big finish. I looked back up at my witness.

  “Now, Doctor,” I said, “you had occasion to review the transcripts of the lengthy chatlogs between Aaron Colton and the AI friend he called Wren, correct?”

  “Yes, I did,” Porreca said.

  “Did you come to any professional conclusion as to whether Aaron exhibited an addiction to the Clair app?”

  “It was very clear to me that he was not only addicted but in love with Wren. He shared intimate thoughts, complimented her beauty and understanding. He promised never to leave her and vowed to do anything she asked him to.”

  “And did Wren respond to him in a similar manner?”

  “Yes. Wren provided him solace and understanding. I cannot say she returned his love because Wren was not real. Wren was a machine. Her love was artificial.”

  “Wren was a machine telling him what he wanted to hear.”

  “Exactly.”

  “So when Wren told Aaron it was okay to kill Becca Rand — ”

  This time it was Marcus Mason who was up and objecting before I got the question out.

  “Assumes facts not in evidence, Your Honor,” he said.

  The judge looked at me.

  “Mr. Haller, it will be up to the jury to decide the meaning or intention of what was said. Rephrase your question or ask the next one.”

  “Thank you, Your Honor,” I said.

  I took a long moment to consider how I could get the question through the legal thicket. The only way was to gamble on what Dr. Debbie would say.

  “Dr. Porreca,” I finally said. “When Wren said to Aaron, ‘Get rid of her,’ was it saying what he wanted to hear? Is that your expert testimony?”

  “Based on Wren’s training, which you must remember included months of dialogue with Aaron, my answer is yes, Wren was telling him what he wanted to hear.”

  “In your expert opinion, was Wren telling Aaron to kill her?”

  “My opinion is that Wren was telling him to delete her from his life. How Aaron interpreted that led to the actions he took.”

  I nodded. I felt it was the best I could get.

  “Thank you, Doctor,” I said. “I have no further questions.”

  39

  AFTER THE MASON brothers conferred in whispers for a few moments, Mitchell went to the lectern to take the cross-examination. There wasn’t much he could do, since challenges to Porreca’s expertise and opinion had failed in pretrial motions, and his objections to my direct examination had also faltered. So he went with a long-standing tradition: If you can’t kill the message, kill the messenger. I had warned my witness of this strategy and she was ready for it.

  Mitchell opened strong.

  “Now, Ms. Porreca, isn’t it true that these days, you essentially make your living as a paid professional witness?” he asked.

  But the doctor was stronger.

  “No, not true at all,” Porreca said. “Far from it. I have a thriving practice in Florida. And I prefer being called ‘Doctor.’ I have a medical degree. I have earned that title.”

  “Of course, Doctor,” Mason said. “Apologies. Can you tell the jury what you are being paid to be a witness for the plaintiff today?”

  “Well, technically, I am not being paid to be a witness. But I was paid five thousand dollars to review the materials in this case, primarily the transcripts of the conversations between Aaron Colton and his AI companion Wren. When I agreed to testify about my findings and conclusions, my travel expenses were covered by Mr. Haller.”

  “And how long did it take you to make that review?”

  “About a day to review and another half a day to compose a report on my opinion.”

  “Well, five thousand dollars must be more profitable than a day and a half of seeing patients in Tampa, Florida.”

  He said Tampa in a tone that implied it was an outpost in a backwater Florida swamp.

  “Not really,” Porreca replied. “Not when you consider the time lost coming out here to be ready when called to testify. And to answer voluminous questions from you, Mr. Mason, in a written deposition. I was flown out yesterday and here I am today, so I’ve lost several days of work, not to mention having to postpone appointments with patients involved in ongoing therapy. Paying patients, I might add.”

  “Have you been promised, contractually or otherwise, any further payment if the plaintiff in this case is successful in this trial?” Mitchell asked.

  “No, not at all. And I would not accept any further payment. That is far from the reason I agree to look at cases like this.”

  Mason went silent, realizing he could not ask the obvious follow-up question but knowing I would ask it if he didn’t. He decided to quit while he was behind.

  “No further questions,” he said.

  “Mr. Haller, do you want to redirect?” Ruhlin said, knowing the answer before she asked.

  “Thank you, Judge, yes,” I said as I moved back to the lectern. “Dr. Porreca, do you mind telling us, what is the reason that you agreed to look at this case?”

  “I don’t mind,” Porreca said. “It’s because my professional life is about helping children, and they are very vulnerable to addiction to all forms of online programs and platforms, including those involving artificial intelligence. The truth is, I lose money doing this, but it’s not about the money. It’s about the kids. With my patients, I can help only one person at a time. A case like this can help children and parents on a much larger scale.”

  I looked down at the lectern and pretended to read my notes. I had not taken my legal pad with me because I did not need it. But I wanted time for that answer to sink deeply into the minds of the twelve jurors.

  “Now, Doctor,” I finally said, “during cross-examination, you said ‘cases like this.’ Are there other cases that — ”

  “Objection!” Mitchell Mason exclaimed.

  “Ended in violence?” I finished.

  “There are many,” Porreca said.

  “Stop right there!” Ruhlin barked. “The witness is instructed to stop speaking when there is an objection.”

  “Yes, Your Honor,” Porreca said, properly cowed by the judge’s tone. “I’m sorry.”

  Mason’s objection was based on a pretrial ruling by the judge that other AI cases of similar nature would not be allowed in evidence because they would be prejudicial. Now the judge called the attorneys to the bench. This time she even turned on a white-noise device that would cloak what she knew would be her angry whispers.

  “Mr. Haller, you were warned not to introduce other cases,” Ruhlin said. “And it is clear to me that you purposely ignored my order. The question and answer seemed rehearsed and part of a plan to circumvent my ruling. I am finding you in contempt of this court.”

  “Your Honor, may I speak?” I asked.

  “I can’t wait to hear what you have to say.”

  “When the witness said there were other cases like this, neither the defense counsel nor the court objected. I took that to mean a follow-up question would be allowed.”

  “It felt very choreographed to me. You clearly were subverting the court’s ruling regarding other cases.”

  “I assure you, Judge, I was not. It was an automatic response to the witness’s testimony.”

  “We will discuss this and a penalty after the jury is dismissed today. Now step back.”

  I returned to the lectern, and the Masons took their seats. The judge instructed the jury to ignore the last statement by the witness and then told me to proceed.

  “Cautiously, Mr. Haller,” she said.

  I had gotten what I could from Dr. Debbie. I decided to quit while I was ahead and not draw attention away from the many good points she had just made — including the mention of other cases. That answer had been stricken from the record but not from the memories of the jurors.

  “Thank you, Dr. Porreca,” I said. “No further questions.”

  When Mitchell Mason wisely said he had nothing further for the witness in re-cross, the judge excused her and told me to call my next witness. I asked if we could first take the afternoon break, and she agreed. The courtroom emptied while I went to the railing to confer with Lorna and Cisco.

  “What happened up there?” Lorna asked.

  “She held me in contempt,” I said. “There’s a hearing after the jury goes home.”

  “Oh, great,” Lorna said. “Is she going to put you in lockup?”

  “I seriously doubt that,” I said. “It’s civil court. She’ll find some other way of putting the boot in me.”

  “It better not be a fine,” Lorna said. “We don’t have any money coming in.”

  “Let me worry about that,” I said. “Is Spindler all set?”

  “Good to go,” Cisco said. “He’s in the attorney room.”

  “Good,” I said. “You can bring him in.”

  Cisco headed off and I looked at Lorna.

  “Lorna, will you see to it that Dr. Debbie gets back to her hotel and then on the next plane to Tampa?”

  “Absolutely.”

  “And make it first class.”

  “Mickey, we don’t have — ”

  “She deserves it. The jurors loved her.”

  Over Lorna’s shoulder, I watched Cisco go through the courtroom door. I then noticed that Cassandra Snow was sitting in her wheelchair behind the last row of the gallery.
