
Therapist found guilty of sexually assaulting Marlington student

Ed Balint, The Repository
Source: Canton Rep

John Sohar, 52, of Lexington Township, reacts to a guilty verdict late Friday afternoon in his sexual battery trial in Stark County Common Pleas Court. Jurors convicted Sohar of sexually assaulting a 14-year-old girl in his job as a school-based therapist at Marlington High in 2019.

CANTON – John Sohar bowed his head in visible anguish when he learned jurors had found him guilty of sexually assaulting a 14-year-old girl during his job as a school-based therapist at Marlington High last year.

A jury of seven women and five men took roughly 90 minutes to reach the verdict late Friday afternoon on a third-degree felony count of sexual battery.

The charge stemmed from sexual touching and conduct the student said occurred during multiple counseling sessions at the school office Sohar kept while employed as a counselor for an outside agency.

The girl, huddled in the back of the courtroom, and her parents shared hugs and expressed quiet emotion following the verdict in Stark County Common Pleas Court.

Judge Chryssa Hartnett scheduled Sohar’s sentencing for 11 a.m. Tuesday. The 52-year-old Lexington Township man faces up to five years in prison.

Testimony ended earlier Friday with Sohar repeatedly denying he sexually assaulted the girl or convinced her the sexual conduct was part of her therapy.

Defendant’s words

Sohar testified that his frequent, sometimes two- to three-hour counseling sessions and his repeated text messages and phone calls with the girl were an effort to help her cope with depression and mental health issues and not hurt herself.

Sohar’s testimony, coming the day after his accuser took the witness stand, preceded closing arguments.

“My goal was always the same,” Sohar said. “To keep my clients alive.”

And under intense questioning from Stark County Assistant Prosecutor Daniel Petricini, Sohar continued his denials.

Petricini had told jurors Thursday that Sohar manipulated a “lonely teenage girl” who had become infatuated with him. 

The girl had pre-existing mental health issues and a strained relationship with her mother prior to enrolling in therapy, he said during closing arguments Friday.

Petricini asked Sohar if it was proper for a therapist to exchange more than 300 text messages over the course of three days with a student client outside of their regular therapy sessions.

The defendant admitted he communicated with the girl “above and beyond” what he did with other patients.

Closing arguments

Following Sohar’s denials, the prosecution and defense made impassioned arguments to jurors.

Citing the earlier testimony of Carrie Schnirring, a mental health professional with Lighthouse Family Center, Petricini said the girl’s testimony was convincing because of details unique to her account of the sexual abuse by Sohar.

Petricini had called Schnirring, who specializes in psychological assessments of children who make allegations of sex abuse, as a witness.

He said the girl recounted the details and sequence of events consistently across multiple tellings, and that they were not “the things you would expect from someone making up a story.”

Schnirring testified Friday that following multiple sessions with the girl, she found her account to be credible.

Petricini said that during her testimony on Thursday the girl sometimes took deep breaths, closed her eyes and paused to recall details of the sex abuse as if she was reliving it in her mind.

Petricini said the girl’s testimony, phone call and text records and Schnirring’s testimony combined to give jurors ample evidence to convict.

Defense attorney George Urban, however, told jurors Sohar was a professional, dedicated and caring therapist who didn’t stop trying to help the girl when regular therapy sessions were over.

Urban emphasized the student had twice become upset when Sohar stopped being her therapist, referring to it as “detachment.”

And although the prosecution cited records of more than 300 text messages between the student and Sohar over the course of a few days, Urban said that only about 10 or 12 texts were produced at trial through cellphone screen photos the mother had taken.

“He’s no groomer,” Urban said of Sohar. “He’s trying to help this young girl. As repayment for that — here we are.”

Prosecution questioning

Sohar was not an employee of the Marlington district; at the time of the allegations in the fall of 2019, he was an employee of Child & Adolescent Behavioral Health, also referred to during testimony as Child & Adolescent Services.

Asked by Urban about the amount of time he spent texting and talking on the phone with the student outside of scheduled counseling sessions, Sohar responded: “It’s difficult to put a timeline on trying to save someone’s life.”

During testimony, he usually spoke in a firm, direct voice but displayed visible emotion when telling his attorney that three of his clients over the years had committed suicide.

In October 2019, the girl wrote a 12-page letter in which she described Sohar’s sexual misconduct, prompting an investigation by the Stark County Sheriff’s Office.

The girl had given the letter to Sohar at school in front of another counselor, according to testimony. Petricini asserted that Sohar had turned over the letter only because the school employee inquired about it.

Sohar denied that was the case.

Urban said in the letter the girl sought revenge because she didn’t want Sohar to stop being her counselor permanently. “She wanted to zing Mr. Sohar,” he said in closing arguments. “This was her way.”

Petricini said that Schnirring found the letter was not written by someone seeking revenge.

“She blames herself,” the assistant prosecutor said, describing the pages of the girl’s school notebook as a love letter from a teenager infatuated with her adult counselor. “This is a cry for help,” he said.

Petricini told jurors Sohar clearly groomed the girl for his own sexual gratification, playing on her vulnerabilities, gaining her trust and pitting the teenager and mother against one another.

He cited the girl’s testimony of how Sohar began by rubbing her shoulders during a therapy session before fondling and sexually assaulting her at later appointments.

More testimony

Under direct questioning, Sohar described the girl’s letter as “the ramblings of someone with some serious mental health issues.”

He also said he still wanted the girl to receive the mental health help she needed.

The student also testified on Thursday that she had seen two tattoos on Sohar’s body during therapy, a cross on his chest and song lyrics on his stomach area.

The defendant said on Friday that his tattoos would have been known to some of his clients, including high school students.

Petricini countered that the defendant’s explanation was not believable, calling it “totally inappropriate to talk about your body tattoos to relate to your teenage clients.”

Also during his testimony Friday, when explaining his educational background and employment history in counseling, Sohar noted that in addition to having a degree in pastoral counseling, he’s a former councilman and mayor of the village of Marshallville in Wayne County.

Reach Ed at 330-580-8315 and ebalint@gannett.com

On Twitter: @ebalintREP

Would you recognize healthcare abuse and fraud when you see it? What does it look like? What should you do?

Share your comments with the community by posting them below. Share the wealth of health with your friends and family by sharing this article with 3 people today. As always you are the best part of what we do. Keep sharing!

If these articles have been helpful to you and yours, give a donation to Shidonna Raven Garden and Cook Ezine today.


HIPAA, Google, and Article III Standing, With a Nod to Kim Kardashian


Saad Gul and Michael Slipsky, Poyner Spruill LLP
Source: JD Supra

In a ruling that could have broad ramifications for health data sharing, a federal judge has ruled that a patient complaining about a hospital sharing his health data without permission lacked standing because he suffered no loss.

The case arose out of University of Chicago Medical Center patient Matt Dinerstein’s concerns about the hospital’s arrangement with Google. The hospital and Google partnered to share thousands of de-identified patient records. At the heart of the initiative was a machine learning project applying Google’s technology to the hospital’s electronic medical records. The objective was to improve healthcare outcomes, for instance by reducing care complications.

In a suit filed last June, Dinerstein argued the arrangement violated HIPAA. The partners had not obtained consent to share data. Nor had they informed patients that they would be sharing their data.

A federal judge dismissed the suit last week. The court rejected Dinerstein’s arguments that his medical records had commercial value and that their appropriation was theft. Both the University of Chicago and Google argued that their data sharing practices were HIPAA compliant. And they contended that Dinerstein’s allegations of fraud and deceptive business practices were meritless since he had voluntarily shared his medical data.

The gist of the defendants’ argument was that Dinerstein offered no contractual or common-law authority to support his contention that he had a legal interest in his protected health information (PHI). But even if he had, he could not show that their actions had diminished the value of any property interest. And finally, he had shown no pecuniary damages stemming from the alleged contractual breach.

Critics complained that the partnership enabled Google to access mammoth amounts of PHI without patient consent. The partners argued that the material was de-identified data. Critics countered that the ostensibly de-identified data contained physician notes and dates, thereby nullifying any de-identification. The issue implicated more than the University of Chicago arrangement; Google has similar partnerships with other providers.
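For context, HIPAA’s Safe Harbor de-identification standard requires removing 18 categories of identifiers, including all date elements more specific than the year. The sketch below is a deliberately minimal illustration of that one rule; the note text and the single MM/DD/YYYY format it handles are hypothetical, and a real de-identification pipeline must cover far more identifiers and formats.

```python
# Minimal sketch of one HIPAA Safe Harbor rule: date elements more
# specific than the year must be removed. The note text and the single
# MM/DD/YYYY format handled here are hypothetical simplifications.
import re

def strip_dates_to_year(note: str) -> str:
    """Replace MM/DD/YYYY dates with the year alone."""
    return re.sub(r"\b\d{1,2}/\d{1,2}/(\d{4})\b", r"\1", note)

note = "Patient admitted 03/14/2015; discharged 03/20/2015."
print(strip_dates_to_year(note))
# -> "Patient admitted 2015; discharged 2015."
```

Even a rule this crude makes the critics’ point concrete: records shared with full service dates intact fail the Safe Harbor test, unless an expert separately determines the re-identification risk is very small.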

Google has consistently maintained that its partnerships adhere to HIPAA mandates and that their sole objective is to improve healthcare. Even so, unease with the practice has prompted Congress to ask whether it is time to update HIPAA in an age of Big Data and coronavirus.

The court ultimately determined that the defendants had the better argument on procedural grounds. Without monetary harm, breach of contract would not confer standing.

“The alleged invasion of Plaintiff’s privacy is an injury in fact that can support his claim of intrusion upon seclusion,” the court suggested. “Dinerstein seems to suggest that the statutes at issue here—HIPAA and the MPRA—also create a legal interest in his health information… [but] has cited no authority supporting the proposition that HIPAA or the MPRA creates a property interest in health data.”

The court stressed that Congress had not created a private right of action for HIPAA. Dinerstein could not sidestep this by pursuing it as a breach of contract claim.

The decision raises three interesting implications for the future:

First, it ignores that personal data is bought and sold; a marketplace reflects value. And that holds even for the PHI of ordinary citizens. Celebrities from Kim Kardashian to Prince have long dealt with insiders selling their PHI. UCLA paid $856,000 to resolve allegations that personnel sold Kardashian data. Other high-profile individuals such as Britney Spears, George Clooney, Farrah Fawcett, Drew Barrymore, Arnold Schwarzenegger, Tom Hanks, and Leonardo DiCaprio have also had their PHI sold.

Second, the court’s reasoning that PHI’s lack of economic value forecloses Article III standing means that HIPAA violators are accountable only to regulators.

Third, the decision went against a state court trend we have previously analyzed: the principle that HIPAA sets the standard of care for privacy. As with any other tort claim, a deviation from this standard of care that results in a loss of privacy is a cognizable injury giving rise to a claim.

Only time will tell if the decision is an outlier or a harbinger of future HIPAA or privacy holdings. If federal and state courts adhere to their current courses, the outcomes of privacy lawsuits will hinge on the forum rather than the facts or legal theories presented.

How are your medical records being shared? Do you know? How would you prefer your medical records to be shared? Do your doctors know? How can this impact the care they give you? How can a breach or sharing of your medical records impact your health outcomes?

Share your comments with the community by posting them below. Share the wealth of health with your friends and family by sharing this article with 3 people today. As always you are the best part of what we do. Keep sharing!

If these articles have been helpful to you and yours, give a donation to Shidonna Raven Garden and Cook Ezine today.


HHS: More than 2M patients affected by breaches reported in October

The data comes amidst new reports that cybercriminals are using industry-standard encryption methods to enact attacks that bypass detection.
By Kat Jercich
November 16, 2020 09:58 AM
Source: Healthcare IT News


The U.S. Department of Health and Human Services released a snapshot this past week detailing breaches reported to the Office for Civil Rights in October.

In total, more than 2 million individuals had their records exposed by the 58 reported breaches, though it is possible that the same patients were affected by multiple incidents.  

It’s worth noting that the Secretary must, by law, post breaches of unsecured protected health information affecting 500 or more individuals – meaning breaches affecting fewer than that were not listed.   

WHY IT MATTERS  

According to HHS, slightly more than a third of the breaches reported in October took place over email, and about 40% took place over a network server.  
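Figures like these can be checked against the raw data: OCR publishes its breach reports in a downloadable spreadsheet. As a rough illustration, the following sketch tallies a hypothetical CSV export with Python’s standard library; the file name and column headers are assumptions about the export format, not a documented schema.

```python
# Rough sketch: tally breach reports from a CSV export of the HHS OCR
# breach portal. The file name and column headers are assumptions
# about the export format, not a documented schema.
import csv
from collections import Counter

breaches = 0
individuals = 0
by_location = Counter()

with open("hhs_breach_report.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # Keep only breaches submitted in October 2020 (assumed MM/DD/YYYY dates).
        month, _, year = row["Breach Submission Date"].split("/")
        if (month, year) != ("10", "2020"):
            continue
        breaches += 1
        individuals += int((row["Individuals Affected"] or "0").replace(",", ""))
        by_location[row["Location of Breached Information"]] += 1

print(f"{breaches} breaches affecting {individuals:,} individuals")
for location, count in by_location.most_common():
    print(f"  {location}: {count} ({count / breaches:.0%})")
```

The Counter keeps the breakdown sorted by frequency, which is all that percentages like “about 40% took place over a network server” require.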

Three of the breaches occurred within an electronic medical record – including an incident at the Mayo Clinic involving a now-fired employee inappropriately accessing reportedly sensitive photographs.   

Although the breaches were all reported in October, they did not all take place last month. The largest breach – affecting more than 800,000 patients of Luxottica of America Inc., which operates vision care facilities – appears to have occurred in August, according to suits filed against the company.

More details about each breach were not included in the HHS list. However, a 2020 State of Encrypted Attacks report published by the Zscaler ThreatLabZ research team this past week found that cybercriminals are using industry-standard encryption methods, paired with malware, to enact attacks that bypass detection.

“Cybercriminals have created sophisticated attack chains that start with an innocent-looking phishing email containing an exploit or hidden malware. If an unsuspecting user clicks, then the attack moves into the malware installation phase, and ultimately to the exfiltration of valuable corporate data,” wrote report authors.  

The team found a whopping 260% increase in SSL-based threats in the last nine months, with 1.6 billion identified and blocked threats specifically targeting the healthcare industry.

More than 30% of SSL-based attacks hide in collaboration services such as Google Drive or Dropbox. And ransomware is on the rise: the Zscaler team reports a 500% increase in ransomware attacks over encrypted channels since March 2020.  

“A notable change in many of these ransomware family variants during the past year has been the addition of a data exfiltration feature. This new feature allows ransomware gangs to exfiltrate sensitive data from victims before encrypting the data. This exfiltrated data is like an insurance policy for attackers: even if the victim organization has good backups, they’ll pay the ransom to avoid having their data exposed,” wrote the report authors.

THE LARGER TREND  

Cybercrime has taken on a renewed danger in the COVID-19 era, with already-strained hospital employees vulnerable to making mistakes such as clicking on phishing links in emails. 

Meanwhile, HHS, along with the Federal Bureau of Investigation and the Cybersecurity and Infrastructure Security Agency, issued a bulletin late last month warning of “increased and imminent” cyber threats to hospitals.

“Ransomware attacks on our healthcare system may be the most dangerous cybersecurity threat we’ve ever seen in the United States,” said Charles Carmakal, chief technology officer of cybersecurity firm Mandiant, in a press statement.

ON THE RECORD  

“The consequences [of a cyberattack] can be grave. If an attack happens in the middle of a surgery, whatever machines are being used could go down, forcing medical staff to fall back on manual methods,” said Juta Gurinaviciute, chief technology officer at NordVPN Teams, in a statement.

“MRI machines, ventilators, and some types of microscopes are computers too. Just like our laptops, those computers come with software that the developers have to support,” said Gurinaviciute. “When the machines become old and outdated, the people who made them might stop supporting them. That means that old software can become vulnerable to attacks.”

Kat Jercich is senior editor of Healthcare IT News.
Twitter: @kjercich
Email: kjercich@himss.org
Healthcare IT News is a HIMSS Media publication.

What are your thoughts on this article? How has it changed the way you manage your healthcare records? Why? Why not?

Share your comments with the community by posting them below. Share the wealth of health with your friends and family by sharing this article with 3 people today. As always you are the best part of what we do. Keep sharing!

If these articles have been helpful to you and yours, give a donation to Shidonna Raven Garden and Cook Ezine today.


The Pentagon’s Push to Program Soldiers’ Brains

The military wants future super-soldiers to control robots with their thoughts.

Eddie Guy
Source: The Atlantic

I. Who Could Object?

“Tonight I would like to share with you an idea that I am extremely passionate about,” the young man said. His long black hair was swept back like a rock star’s, or a gangster’s. “Think about this,” he continued. “Throughout all human history, the way that we have expressed our intent, the way we have expressed our goals, the way we have expressed our desires, has been limited by our bodies.” When he inhaled, his rib cage expanded and filled out the fabric of his shirt. Gesturing toward his body, he said, “We are born into this world with this. Whatever nature or luck has given us.”

His speech then took a turn: “Now, we’ve had a lot of interesting tools over the years, but fundamentally the way that we work with those tools is through our bodies.” Then a further turn: “Here’s a situation that I know all of you know very well—your frustration with your smartphones, right? This is another tool, right? And we are still communicating with these tools through our bodies.”

And then it made a leap: “I would claim to you that these tools are not so smart. And maybe one of the reasons why they’re not so smart is because they’re not connected to our brains. Maybe if we could hook those devices into our brains, they could have some idea of what our goals are, what our intent is, and what our frustration is.”

So began “Beyond Bionics,” a talk by Justin C. Sanchez, then an associate professor of biomedical engineering and neuroscience at the University of Miami, and a faculty member of the Miami Project to Cure Paralysis. He was speaking at a TEDx conference in Florida in 2012. What lies beyond bionics? Sanchez described his work as trying to “understand the neural code,” which would involve putting “very fine microwire electrodes”—the diameter of a human hair—“into the brain.” When we do that, he said, we would be able to “listen in to the music of the brain” and “listen in to what somebody’s motor intent might be” and get a glimpse of “your goals and your rewards” and then “start to understand how the brain encodes behavior.”

He explained, “With all of this knowledge, what we’re trying to do is build new medical devices, new implantable chips for the body that can be encoded or programmed with all of these different aspects. Now, you may be wondering, what are we going to do with those chips? Well, the first recipients of these kinds of technologies will be the paralyzed. It would make me so happy by the end of my career if I could help get somebody out of their wheelchair.”

Sanchez went on, “The people that we are trying to help should never be imprisoned by their bodies. And today we can design technologies that can help liberate them from that. I’m truly inspired by that. It drives me every day when I wake up and get out of bed. Thank you so much.” He blew a kiss to the audience.

A year later, Justin Sanchez went to work for the Defense Advanced Research Projects Agency, the Pentagon’s R&D department. At DARPA, he now oversees all research on the healing and enhancement of the human mind and body. And his ambition involves more than helping get disabled people out of their wheelchairs—much more.

DARPA has dreamed for decades of merging human beings and machines. Some years ago, when the prospect of mind-controlled weapons became a public-relations liability for the agency, officials resorted to characteristic ingenuity. They recast the stated purpose of their neurotechnology research to focus ostensibly on the narrow goal of healing injury and curing illness. The work wasn’t about weaponry or warfare, agency officials claimed. It was about therapy and health care. Who could object? But even if this claim were true, such changes would have extensive ethical, social, and metaphysical implications. Within decades, neurotechnology could cause social disruption on a scale that would make smartphones and the internet look like gentle ripples on the pond of history.

Most unsettling, neurotechnology confounds age-old answers to this question: What is a human being?

II. High Risk, High Reward

In his 1958 State of the Union address, President Dwight Eisenhower declared that the United States of America “must be forward-looking in our research and development to anticipate the unimagined weapons of the future.” A few weeks later, his administration created the Advanced Research Projects Agency, a bureaucratically independent body that reported to the secretary of defense. This move had been prompted by the Soviet launch of the Sputnik satellite. The agency’s original remit was to hasten America’s entry into space.

During the next few years, ARPA’s mission grew to encompass research into “man-computer symbiosis” and a classified program of experiments in mind control that was code-named Project Pandora. There were bizarre efforts that involved trying to move objects at a distance by means of thought alone. In 1972, with an increment of candor, the word Defense was added to the name, and the agency became DARPA. Pursuing its mission, DARPA funded researchers who helped invent technologies that changed the nature of battle (stealth aircraft, drones) and shaped daily life for billions (voice-recognition technology, GPS devices). Its best-known creation is the internet.

The agency’s penchant for what it calls “high-risk, high-reward” research ensured that it would also fund a cavalcade of folly. Project Seesaw, a quintessential Cold War boondoggle, envisioned a “particle-beam weapon” that could be deployed in the event of a Soviet attack. The idea was to set off a series of nuclear explosions beneath the Great Lakes, creating a giant underground chamber. Then the lakes would be drained, in a period of 15 minutes, to generate the electricity needed to set off a particle beam. The beam would accelerate through tunnels hundreds of miles long (also carved out by underground nuclear explosions) in order to muster enough force to shoot up into the atmosphere and knock incoming Soviet missiles out of the sky. During the Vietnam War, DARPA tried to build a Cybernetic Anthropomorphous Machine, a jungle vehicle that officials called a “mechanical elephant.”

The diverse and sometimes even opposing goals of DARPA scientists and their Defense Department overlords merged into a murky, symbiotic research culture—“unencumbered by the typical bureaucratic oversight and uninhibited by the restraints of scientific peer review,” Sharon Weinberger wrote in a recent book, The Imagineers of War. In Weinberger’s account, DARPA’s institutional history involves many episodes of introducing a new technology in the context of one appealing application, while hiding other genuine but more troubling motives. At DARPA, the left hand knows, and doesn’t know, what the right hand is doing.

The agency is deceptively compact. A mere 220 employees, supported by about 1,000 contractors, report for work each day at DARPA’s headquarters, a nondescript glass-and-steel building in Arlington, Virginia, across the street from the practice rink for the Washington Capitals. About 100 of these employees are program managers—scientists and engineers, part of whose job is to oversee about 2,000 outsourcing arrangements with corporations, universities, and government labs. The effective workforce of DARPA actually runs into the range of tens of thousands. The budget is officially said to be about $3 billion, and has stood at roughly that level for an implausibly long time—the past 14 years.

The Biological Technologies Office, created in 2014, is the newest of DARPA’s six main divisions. This is the office headed by Justin Sanchez. One purpose of the office is to “restore and maintain warfighter abilities” by various means, including many that emphasize neurotechnology—applying engineering principles to the biology of the nervous system. For instance, the Restoring Active Memory program develops neuroprosthetics—tiny electronic components implanted in brain tissue—that aim to alter memory formation so as to counteract traumatic brain injury. Does DARPA also run secret biological programs? In the past, the Department of Defense has done such things. It has conducted tests on human subjects that were questionable, unethical, or, many have argued, illegal. The Big Boy protocol, for example, compared radiation exposure of sailors who worked above and below deck on a battleship, never informing the sailors that they were part of an experiment.

Last year I asked Sanchez directly whether any of DARPA’s neurotechnology work, specifically, was classified. He broke eye contact and said, “I can’t—We’ll have to get off that topic, because I can’t answer one way or another.” When I framed the question personally—“Are you involved with any classified neuroscience project?”—he looked me in the eye and said, “I’m not doing any classified work on the neurotechnology end.”

If his speech is careful, it is not spare. Sanchez has appeared at public events with some frequency (videos are posted on DARPA’s YouTube channel), to articulate joyful streams of good news about DARPA’s proven applications—for instance, brain-controlled prosthetic arms for soldiers who have lost limbs. Occasionally he also mentions some of his more distant aspirations. One of them is the ability, via computer, to transfer knowledge and thoughts from one person’s mind to another’s.

III. “We Try to Find Ways to Say Yes”

Medicine and biology were of minor interest to DARPA until the 1990s, when biological weapons became a threat to U.S. national security. The agency made a significant investment in biology in 1997, when DARPA created the Controlled Biological Systems program. The zoologist Alan S. Rudolph managed this sprawling effort to integrate the built world with the natural world. As he explained it to me, the aim was “to increase, if you will, the baud rate, or the cross-communication, between living and nonliving systems.” He spent his days working through questions such as “Could we unlock the signals in the brain associated with movement in order to allow you to control something outside your body, like a prosthetic leg or an arm, a robot, a smart home—or to send the signal to somebody else and have them receive it?”

Human enhancement became an agency priority. “Soldiers having no physical, physiological, or cognitive limitation will be key to survival and operational dominance in the future,” predicted Michael Goldblatt, who had been the science and technology officer at McDonald’s before joining DARPA in 1999. To enlarge humanity’s capacity to “control evolution,” he assembled a portfolio of programs with names that sounded like they’d been taken from video games or sci-fi movies: Metabolic Dominance, Persistence in Combat, Continuous Assisted Performance, Augmented Cognition, Peak Soldier Performance, Brain-Machine Interface.

The programs of this era, as described by Annie Jacobsen in her 2015 book, The Pentagon’s Brain, often shaded into mad-scientist territory. The Continuous Assisted Performance project attempted to create a “24/7 soldier” who could go without sleep for up to a week. (“My measure of success,” one DARPA official said of these programs, “is that the International Olympic Committee bans everything we do.”)

Dick Cheney relished this kind of research. In the summer of 2001, an array of “super-soldier” programs was presented to the vice president. His enthusiasm contributed to the latitude that President George W. Bush’s administration gave DARPA—at a time when the agency’s foundation was shifting. Academic science gave way to tech-industry “innovation.” Tony Tether, who had spent his career working alternately for Big Tech, defense contractors, and the Pentagon, became DARPA’s director. After the 9/11 attacks, the agency announced plans for a surveillance program called Total Information Awareness, whose logo included an all-seeing eye emitting rays of light that scanned the globe. The pushback was intense, and Congress took DARPA to task for Orwellian overreach. The head of the program—Admiral John Poindexter, who had been tainted by scandal back in the Reagan years—later resigned, in 2003. The controversy also drew unwanted attention to DARPA’s research on super-soldiers and the melding of mind and machine. That research made people nervous, and Alan Rudolph, too, found himself on the way out.

In this time of crisis, DARPA invited Geoff Ling, a neurology-ICU physician and, at the time, an active-duty Army officer, to join the Defense Sciences Office. (Ling went on to work in the Biological Technologies Office when it spun out from Defense Sciences, in 2014.) When Ling was interviewed for his first job at DARPA, in 2002, he was preparing for deployment to Afghanistan and thinking about very specific combat needs. One was a “pharmacy on demand” that would eliminate the bulk of powdery fillers from drugs in pill or capsule form and instead would formulate active ingredients for ingestion via a lighter, more compact, dissolving substance—like Listerine breath strips. This eventually became a DARPA program. The agency’s brazen sense of possibility buoyed Ling, who recalls with pleasure how colleagues told him, “We try to find ways to say yes, not ways to say no.” With Rudolph gone, Ling picked up the torch.

Ling talks fast. He has a tough-guy voice. The faster he talks, the tougher he sounds, and when I met him, his voice hit top speed as he described a first principle of Defense Sciences. He said he had learned this “particularly” from Alan Rudolph: “Your brain tells your hands what to do. Your hands basically are its tools, okay? And that was a revelation to me.” He continued, “We are tool users—that’s what humans are. A human wants to fly, he builds an airplane and flies. A human wants to have recorded history, and he creates a pen. Everything we do is because we use tools, right? And the ultimate tools are our hands and feet. Our hands allow us to work with the environment to do stuff, and our feet take us where our brain wants to go. The brain is the most important thing.”

Ling connected this idea of the brain’s primacy with his own clinical experience of the battlefield. He asked himself, “How can I liberate mankind from the limitations of the body?” The program for which Ling became best known is called Revolutionizing Prosthetics. Since the Civil War, as Ling has said, the prosthetic arm given to most amputees has been barely more sophisticated than “a hook,” and not without risks: “Try taking care of your morning ablutions with that bad boy, and you’re going to need a proctologist every goddamn day.” With help from DARPA colleagues and academic and corporate researchers, Ling and his team built something that was once all but unimaginable: a brain-controlled prosthetic arm.

No invention since the internet has been such a reliable source of good publicity for DARPA. Milestones in its development were hailed with wonder. In 2012, 60 Minutes showed a paralyzed woman named Jan Scheuermann feeding herself a bar of chocolate using a robotic arm that she manipulated by means of a brain implant.

Yet DARPA’s work to repair damaged bodies was merely a marker on a road to somewhere else. The agency has always had a larger mission, and in a 2015 presentation, one program manager—a Silicon Valley recruit—described that mission: to “free the mind from the limitations of even healthy bodies.” What the agency learns from healing makes way for enhancement. The mission is to make human beings something other than what we are, with powers beyond the ones we’re born with and beyond the ones we can organically attain.

The internal workings of DARPA are complicated. The goals and values of its research shift and evolve in the manner of a strange, half-conscious shell game. The line between healing and enhancement blurs. And no one should lose sight of the fact that D is the first letter in DARPA’s name. A year and a half after the video of Jan Scheuermann feeding herself chocolate was shown on television, DARPA made another video of her, in which her brain-computer interface was connected to an F-35 flight simulator, and she was flying the airplane. DARPA later disclosed this at a conference called Future of War.

Geoff Ling’s efforts have been carried on by Justin Sanchez. In 2016, Sanchez appeared at DARPA’s “Demo Day” with a man named Johnny Matheny, whom agency officials describe as the first “osseointegrated” upper-limb amputee—the first man with a prosthetic arm attached directly to bone. Matheny demonstrated what was, at the time, DARPA’s most advanced prosthetic arm. He told the attendees, “I can sit here and curl a 45-pound dumbbell all day long, till the battery runs dead.” The next day, Gizmodo ran this headline above its report from the event: “DARPA’s Mind-Controlled Arm Will Make You Wish You Were a Cyborg.”

Since then, DARPA’s work in neurotechnology has avowedly widened in scope, to embrace “the broader aspects of life,” Sanchez told me, “beyond the person in the hospital who is using it to heal.” The logical progression of all this research is the creation of human beings who are ever more perfect, by certain technological standards. New and improved soldiers are necessary and desirable for DARPA, but they are just the window-display version of the life that lies ahead.

IV. “Over the Horizon”

Consider memory, Sanchez told me: “Everybody thinks about what it would be like to give memory a boost by 20, 30, 40 percent—pick your favorite number—and how that would be transformative.” He spoke of memory enhancement through neural interface as an alternative form of education. “School in its most fundamental form is a technology that we have developed as a society to help our brains to do more,” he said. “In a different way, neurotechnology uses other tools and techniques to help our brains be the best that they can be.” One technique was described in a 2013 paper, a study involving researchers at Wake Forest University, the University of Southern California, and the University of Kentucky. Researchers performed surgery on 11 rats. Into each rat’s brain, an electronic array—featuring 16 stainless-steel wires—was implanted. After the rats recovered from surgery, they were separated into two groups, and they spent a period of weeks getting educated, though one group was educated more than the other.

The less educated group learned a simple task, involving how to procure a droplet of water. The more educated group learned a complex version of that same task—to procure the water, these rats had to persistently poke levers with their nose despite confounding delays in the delivery of the water droplet. When the more educated group of rats attained mastery of this task, the researchers exported the neural-firing patterns recorded in the rats’ brains—the memory of how to perform the complex task—to a computer.

“What we did then was we took those signals and we gave it to an animal that was stupid,” Geoff Ling said at a DARPA event in 2015—meaning that researchers took the neural-firing patterns encoding the memory of how to perform the more complex task, recorded from the brains of the more educated rats, and transferred those patterns into the brains of the less educated rats—“and that stupid animal got it. They were able to execute that full thing.” Ling summarized: “For this rat, we reduced the learning period from eight weeks down to seconds.”

“They could inject memory using the precise neural codes for certain skills,” Sanchez told me. He believes that the Wake Forest experiment amounts to a foundational step toward “memory prosthesis.” This is the stuff of The Matrix. Though many researchers question the findings—cautioning that, really, it can’t be this simple—Sanchez is confident: “If I know the neural codes in one individual, could I give that neural code to another person? I think you could.” Under Sanchez, DARPA has funded human experiments at Wake Forest, the University of Southern California, and the University of Pennsylvania, using similar mechanisms in analogous parts of the brain. These experiments did not transfer memory from one person to another, but instead gave individuals a memory “boost.” Implanted electrodes recorded neuronal activity associated with recognizing patterns (at Wake Forest and USC) and memorizing word lists (at Penn) in certain brain circuits. Then electrodes fed back those recordings of neuronal activity into the same circuits as a form of reinforcement. The result, in both cases, was significantly improved memory recall.
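To make the “record, export, inject” idea concrete, here is a loose software analogy in Python. It is emphatically not the researchers’ method (the studies used implanted electrode arrays recording neural firing patterns); it only illustrates the logic of copying a learned pattern from a trained system into a naive one.

```python
# Loose software analogy only (not the researchers' method): "transfer"
# a learned skill by copying trained parameters from one model into an
# untrained copy of the same architecture.
import numpy as np

def train_linear(x, y, steps=500, lr=0.1):
    """Fit y ~ w*x + b by gradient descent; (w, b) is the learned 'pattern'."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        pred = w * x + b
        w -= lr * 2 * np.mean((pred - y) * x)
        b -= lr * 2 * np.mean(pred - y)
    return w, b

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 100)
y = 3.0 * x + 0.5                 # the "task": respond to x with 3x + 0.5

educated = train_linear(x, y)     # the long "education": weeks, in the analogy
naive = (0.0, 0.0)                # an untrained model fails the task
naive = educated                  # "injecting" the recorded pattern takes seconds

w, b = naive
print(f"transferred pattern: w={w:.2f}, b={b:.2f}")  # ~3.00, ~0.50
```

The analogy also hints at why skeptics balk: unlike this toy model, the brain does not expose a clean, modular set of parameters to copy, which is essentially Weber’s objection below.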

Doug Weber, a neural engineer at the University of Pittsburgh who recently finished a four-year term as a DARPA program manager, working with Sanchez, is a memory-transfer skeptic. Born in Wisconsin, he has the demeanor of a sitcom dad: not too polished, not too rumpled. “I don’t believe in the infinite limits of technology evolution,” he told me. “I do believe there are going to be some technical challenges which are impossible to achieve.” For instance, when scientists put electrodes in the brain, those devices eventually fail—after a few months or a few years. The most intractable problem is blood leakage. When foreign material is put into the brain, Weber said, “you undergo this process of wounding, bleeding, healing, wounding, bleeding, healing, and whenever blood leaks into the brain compartment, the activity in the cells goes way down, so they become sick, essentially.” More effectively than any fortress, the brain rejects invasion.

Even if the interface problems that limit us now didn’t exist, Weber went on to say, he still would not believe that neuroscientists could enable the memory-prosthesis scenario. Some people like to think about the brain as if it were a computer, Weber explained, “where information goes from A to B to C, like everything is very modular. And certainly there is clear modular organization in the brain. But it’s not nearly as sharp as it is in a computer. All information is everywhere all the time, right? It’s so widely distributed that achieving that level of integration with the brain is far out of reach right now.”

Peripheral nerves, by contrast, conduct signals in a more modular fashion. The biggest, longest peripheral nerve is the vagus. It connects the brain with the heart, the lungs, the digestive tract, and more. Neuroscientists understand the brain’s relationship with the vagus nerve more clearly than they understand the intricacies of memory formation and recall among neurons within the brain. Weber believes that it may be possible to stimulate the vagus nerve in ways that enhance the process of learning—not by transferring experiential memories, but by sharpening the facility for certain skills.

To test this hypothesis, Weber directed the creation of a new program in the Biological Technologies Office, called Targeted Neuroplasticity Training (TNT). Teams of researchers at seven universities are investigating whether vagal-nerve stimulation can enhance learning in three areas: marksmanship, surveillance and reconnaissance, and language. The team at Arizona State has an ethicist on staff whose job, according to Weber, “is to be looking over the horizon to anticipate potential challenges and conflicts that may arise” regarding the ethical dimensions of the program’s technology, “before we let the genie out of the bottle.” At a TNT kickoff meeting, the research teams spent 90 minutes discussing the ethical questions involved in their work—the start of a fraught conversation that will broaden to include many others, and last for a very long time.

DARPA officials refer to the potential consequences of neurotechnology by invoking the acronym ELSI, a term of art devised for the Human Genome Project. It stands for “ethical, legal, social implications.” The man who led the discussion on ethics among the research teams was Steven Hyman, a neuroscientist and neuroethicist at MIT and Harvard’s Broad Institute. Hyman is also a former head of the National Institute of Mental Health. When I spoke with him about his work on DARPA programs, he noted that one issue needing attention is “cross talk.” A man-machine interface that does not just “read” someone’s brain but also “writes into” someone’s brain would almost certainly create “cross talk between those circuits which we are targeting and the circuits which are engaged in what we might call social and moral emotions,” he said. It is impossible to predict the effects of such cross talk on “the conduct of war” (the example he gave), much less, of course, on ordinary life.

Weber and a DARPA spokesperson related some of the questions the researchers asked in their ethics discussion: Who will decide how this technology gets used? Would a superior be able to force subordinates to use it? Will genetic tests be able to determine how responsive someone would be to targeted neuroplasticity training? Would such tests be voluntary or mandatory? Could the results of such tests lead to discrimination in school admissions or employment? What if the technology affects moral or emotional cognition—our ability to tell right from wrong or to control our own behavior?

Recalling the ethics discussion, Weber told me, “The main thing I remember is that we ran out of time.”

V. “You Can Weaponize Anything”

In The Pentagon’s Brain, Annie Jacobsen suggested that DARPA’s neurotechnology research, including upper-limb prosthetics and the brain-machine interface, is not what it seems: “It is likely that DARPA’s primary goal in advancing prosthetics is to give robots, not men, better arms and hands.” Geoff Ling rejected the gist of her conclusion when I summarized it for him (he hadn’t read the book). He told me, “When we talk about stuff like this, and people are looking for nefarious things, I always say to them, ‘Do you honestly believe that the military that your grandfather served in, your uncle served in, has changed into being Nazis or the Russian army?’ Everything we did in the Revolutionizing Prosthetics program—everything we did—is published. If we were really building an autonomous-weapons system, why would we publish it in the open literature for our adversaries to read? We hid nothing. We hid not a thing. And you know what? That meant that we didn’t just do it for America. We did it for the world.”

I started to say that publishing this research would not prevent its being misused. But the terms use and misuse overlook a bigger issue at the core of any meaningful neurotechnology-ethics discussion. Will an enhanced human being—a human being possessing a neural interface with a computer—still be human, as people have experienced humanity through all of time? Or will such a person be a different sort of creature?

The U.S. government has put limits on DARPA’s power to experiment with enhancing human capabilities. Ling says colleagues told him of a “directive”: “Congress was very specific,” he said. “They don’t want us to build a superperson.” This can’t be the announced goal, Congress seems to be saying, but if we get there by accident—well, that’s another story. Ling’s imagination remains at large. He told me, “If I gave you a third eye, and the eye can see in the ultraviolet, that would be incorporated into everything that you do. If I gave you a third ear that could hear at a very high frequency, like a bat or like a snake, then you would incorporate all those senses into your experience and you would use that to your advantage. If you can see at night, you’re better than the person who can’t see at night.”

Enhancing the senses to gain superior advantage—this language suggests weaponry. Such capacities could certainly have military applications, Ling acknowledged—“You can weaponize anything, right?”—before he dismissed the idea and returned to the party line: “No, actually, this has to do with increasing a human’s capability” in a way that he compared to military training and civilian education, and justified in economic terms.

“Let’s say I gave you a third arm,” and then a fourth arm—so, two additional hands, he said. “You would be more capable; you would do more things, right?” And if you could control four hands as seamlessly as you’re controlling your current two hands, he continued, “you would actually be doing double the amount of work that you would normally do. It’s as simple as that. You’re increasing your productivity to do whatever you want to do.” I started to picture his vision—working with four arms, four hands—and asked, “Where does it end?”

“It won’t ever end,” Ling said. “I mean, it will constantly get better and better—” His cellphone rang. He took the call, then resumed where he had left off: “What DARPA does is we provide a fundamental tool so that other people can take those tools and do great things with them that we’re not even thinking about.”

Judging by what he said next, however, the number of things that DARPA is thinking about far exceeds what it typically talks about in public. “If a brain can control a robot that looks like a hand,” Ling said, “why can’t it control a robot that looks like a snake? Why can’t that brain control a robot that looks like a big mass of Jell-O, able to get around corners and up and down and through things? I mean, somebody will find an application for that. They couldn’t do it now, because they can’t become that glob, right? But in my world, with their brain now having a direct interface with that glob, that glob is the embodiment of them. So now they’re basically the glob, and they can go do everything a glob can do.”

VI. Gold Rush

DARPA’s developing capabilities still hover at or near a proof-of-concept stage. But that’s close enough to have drawn investment from some of the world’s richest corporations. In 1990, during the administration of President George H. W. Bush, DARPA Director Craig I. Fields lost his job because, according to contemporary news accounts, he intentionally fostered business development with some Silicon Valley companies, and White House officials deemed that inappropriate. Since the administration of the second President Bush, however, such sensitivities have faded.

Over time, DARPA has become something of a farm team for Silicon Valley. Regina Dugan, who was appointed DARPA director by President Barack Obama, went on to head Google’s Advanced Technology and Projects group, and other former DARPA officials went to work for her there. She then led R&D for the analogous group at Facebook, called Building 8. (She has since left Facebook.)

DARPA’s neurotechnology research has been affected in recent years by corporate poaching. Doug Weber told me that some DARPA researchers have been “scooped up” by companies including Verily, the life-sciences division of Alphabet (the parent company of Google), which, in partnership with the British pharmaceutical conglomerate GlaxoSmithKline, created a company called Galvani Bioelectronics, to bring neuromodulation devices to market. Galvani calls its business “bioelectric medicine,” which conveys an aura of warmth and trustworthiness. Ted Berger, a University of Southern California biomedical engineer who collaborated with the Wake Forest researchers on their studies of memory transfer in rats, worked as the chief science officer at the neurotechnology company Kernel, which plans to build “advanced neural interfaces to treat disease and dysfunction, illuminate the mechanisms of intelligence, and extend cognition.” Elon Musk has courted DARPA researchers to join his company Neuralink, which is said to be developing an interface known as “neural lace.” Facebook’s Building 8 is working on a neural interface too. In 2017, Regina Dugan said that 60 engineers were at work on a system with the goal of allowing users to type 100 words a minute “directly from your brain.” Geoff Ling is on Building 8’s advisory board.

Talking with Justin Sanchez, I speculated that if he realizes his ambitions, he could change daily life in even more fundamental and lasting ways than Facebook’s Mark Zuckerberg and Twitter’s Jack Dorsey have. Sanchez blushes easily, and he breaks eye contact when he is uncomfortable, but he did not look away when he heard his name mentioned in such company. Remembering a remark that he had once made about his hope for neurotechnology’s wide adoption, but with “appropriate checks to make sure that it’s done in the right way,” I asked him to talk about what the right way might look like. Did any member of Congress strike him as having good ideas about legal or regulatory structures that might shape an emerging neural-interface industry? He demurred (“DARPA’s mission isn’t to define or even direct those things”) and suggested that, in reality, market forces would do more to shape the evolution of neurotechnology than laws or regulations or deliberate policy choices. What will happen, he said, is that scientists at universities will sell their discoveries or create start-ups. The marketplace will take it from there: “As they develop their companies, and as they develop their products, they’re going to be subject to convincing people that whatever they’re developing makes sense, that it helps people to be a better version of themselves. And that process—that day-to-day development—will ultimately guide where these technologies go. I mean, I think that’s the frank reality of how it ultimately will unfold.”

He seemed entirely untroubled by what may be the most troubling aspect of DARPA’s work: not that it discovers what it discovers, but that the world has, so far, always been ready to buy it.


This article appears in the November 2018 print edition with the headline “The Pentagon Wants to Weaponize the Brain. What Could Go Wrong?”

MICHAEL JOSEPH GROSS, a contributing editor at Vanity Fair, is writing a book about strength.

As we have seen in the past, inventions are often sold to the public at large as very altruistic. But to echo the statements in this article, these biotechnology and neuroscience inventions look altruistic on the surface while heading somewhere else that their creators do not care to disclose to the public. Where do you think this technology is going? Would you want your son or daughter to enlist in the service? Why or why not?

Share your comments with the community by posting them below. Share the wealth of health with your friends and family by sharing this article with 3 people today. As always you are the best part of what we do. Keep sharing!

If these articles have been helpful to you and yours, give a donation to Shidonna Raven Garden and Cook Ezine today.