Quick note here: the Ars Technica article referenced below originally appeared on Wired.com. Since Wired has a paywall and Ars does not, I'm linking to and referring to the Ars Technica publication.
Recently, I read about a surveillance app called Covenant Eyes in an article on Ars Technica. The article describes a situation in Monroe County, Indiana, where a man was charged with possession of child sexual abuse materials. Both the man and his wife maintain his innocence. Even so, the man was released on bond under an unusual requirement - every electronic device in his home - computers, his wife's phone, his kids' phones, his mother-in-law's phone - had to have the app Covenant Eyes installed on it.
The man was re-imprisoned because his wife's phone pinged Pornhub, which was a violation of his bond. But his wife says he never had her phone - that it never left her possession. She also says she never navigated to Pornhub, knowing the terms of the bond.
This story piqued my interest for a lot of reasons, not least of which: is it ethical to use this technology for law enforcement purposes? Without knowing anything else about the app, I was able to come up with a list of issues, challenges, and harms that using this technology was likely to cause. While the Ars piece touched on a few of these, I thought they warranted more airtime.
Today, I want to take a look at the ethical problems inherent in a tool like Covenant Eyes, the use of artificial intelligence in the app, and the use of the tool by law enforcement.
The Surveillance App
Covenant Eyes is an application targeted primarily at churches and parents to help people break porn addiction, and it bills itself as an "accountability" app. The app functions like this: a "sensor app" is installed on all of the surveilled person's devices and monitors everything that happens on them. It takes at least one screenshot per minute (in which it claims to blur sensitive information), monitors all network traffic, and tracks a whole slew of other things. It then uses 'artificial intelligence' to identify 'Flagged' or 'Explicit' content, and it automatically blocks all 'known bad' websites from a list Covenant Eyes maintains of "adult, pornographic and explicit" content. Finally, it sends all of this information to an 'Ally' - an accountability partner - through the companion 'Victory App'. In addition to receiving all of the data on the surveilled person, the Ally can also block specific websites, which the sensor app then enforces alongside the "known bad" list.
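To make those moving parts concrete, here's a minimal sketch of what a sensor loop like the one described above could look like. To be clear: this is not Covenant Eyes' implementation. The function names, the simple sleep-based one-minute cadence, the event format, and the example blocklist entries are all my own assumptions, based only on the behavior the company describes.

```python
import time

SCREENSHOT_INTERVAL = 60  # the app claims at least one screenshot per minute

# Hypothetical stand-in for the vendor-maintained "known bad" list.
BLOCKLIST = {"some-adult-site.example"}

def capture_screenshot() -> bytes:
    """Stand-in for a platform screenshot API."""
    return b"<raw image bytes>"

def blur_sensitive_text(image: bytes) -> bytes:
    """The app claims to blur sensitive information; how is undocumented."""
    return image

def classify(image: bytes) -> str:
    """Stand-in for the undisclosed 'AI'; returns a content label."""
    return "Explicit"  # could be 'Flagged', 'Explicit', or something benign

def report_to_ally(event: dict) -> None:
    """Deliver an event to the Ally's companion 'Victory App'."""
    print("reporting to Ally:", event)

def sensor_tick(visited_domains: list[str]) -> None:
    """One monitoring cycle: enforce the blocklist, then screenshot and flag."""
    for domain in visited_domains:
        if domain in BLOCKLIST:
            report_to_ally({"type": "blocked", "domain": domain})
    screenshot = blur_sensitive_text(capture_screenshot())
    label = classify(screenshot)
    if label in ("Flagged", "Explicit"):
        report_to_ally({"type": label, "evidence": screenshot})

if __name__ == "__main__":
    while True:  # runs for the life of the device session
        sensor_tick(visited_domains=["news.example"])
        time.sleep(SCREENSHOT_INTERVAL)
```

Even at this level of abstraction, one design fact jumps out: everything flows one way, from the surveilled device to the Ally, and the surveilled person has no visibility into what was reported.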
In this particular case, all of the information collected from the family's devices went to the accused man's parole officers.
Perhaps more concerning: reading through Covenant Eyes' End User License Agreement, despite the huge amount of personal information that Covenant Eyes collects, stores, and processes, the company accepts no accountability for any breaches of privacy resulting from the use of its services. In Section 9, they state:
"You, as the User, agree that all users of your computing device and User ID will be informed that monitoring/recording is being made, and that in so being informed, other users will not have their privacy invaded" (emphasis added)
This says that because you, as the User, have informed others that the device is being monitored, their privacy cannot be invaded by incidental monitoring. This protects the company, but it does nothing to address the actual violation of privacy that can occur if a user misunderstands the scope of the surveillance.
Covenant Eyes is in an impossible position here: the chance of surveilling a non-consenting individual is all but assured, and this paragraph exempts the company from any wrongdoing.
Technical Limitations
The app functions by monitoring which applications you open and what network connections you make, and by taking pictures of everything on your screen to send to your Ally. This approach is comprehensive in showing what's happening on the phone, but it's not great at tying that usage back to a particular person.
When looking at telemetry like this, you are limited to knowing what the system knows. The system might know which user was logged in when some activity took place, but it can't say that user actually took the action. It's common to hand your phone to a friend to do a search or queue up the next Spotify song; in that case, according to what your phone 'knows', it was still you doing the search or playing the song. The same goes for misclicks. Who hasn't accidentally tapped a link on a mobile device while scrolling? It happens all the time. But a quick conversation with your Ally clears everything up. No harm, no foul.
Covenant Eyes doesn't account for automation or the background functioning of mobile devices at all, going so far as to publish a help article about the situation. Your phone is constantly pinging sites, generating network traffic, and spawning and spinning down processes in its normal day-to-day functioning - all while you aren't actually doing anything. The app doesn't know the difference between a background process that's automatically updating and the manual, intentional actions of a user.
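Here's a sketch of what a single record of that telemetry can realistically contain. The field names are hypothetical - I have no visibility into Covenant Eyes' data model - but the gap is structural, not cosmetic: nothing a device can log identifies the human holding it.

```python
from dataclasses import dataclass

@dataclass
class NetworkEvent:
    """What device-level telemetry can actually capture about a connection."""
    timestamp: str       # when the connection happened
    device_id: str       # which device made it
    logged_in_user: str  # the account signed in at the time
    domain: str          # where the traffic went
    process: str         # which app opened the connection

# These two events look equally "owned" by the same user:
friend_borrowed_phone = NetworkEvent(
    "2024-03-01T20:14:02", "phone-01", "owner", "example.com", "browser")
background_update = NetworkEvent(
    "2024-03-01T03:14:02", "phone-01", "owner", "cdn.example.com", "updater")

# In the first, a friend was holding the phone; in the second, no human
# acted at all. The telemetry attributes both to "owner" just the same.
```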
Again, when the app is used for its original purpose, these limitations aren't harmful; they are just considerations that both the surveilled person and the Ally need to be aware of.
But when the app is used by law enforcement to ensure compliance with the terms of a bond, the repercussions are massive. There is doubt about who was using the device, and about whether an action was taken by a person or by an automated process. Treating this telemetry as certain ignores the limitations of the technology, and that lack of certainty causes real harm in a law enforcement setting.
The harm caused by using the app for its original intent can be mitigated through conversation and awareness. When the app is used in a law enforcement setting, however, it opens the door to misunderstandings and challenges - like the technical nuances I cover above - that law enforcement personnel aren't equipped to understand, account for, or deal with. It is unreasonable to expect them to be.
Artificial Intelligence
As is the way these days, Covenant Eyes claims to use artificial intelligence to help its app function. Their website says the "artificial intelligence" identifies concerning behavior and content, and helps programmatically blur text out of images taken on the device.
But Covenant Eyes isn't very forthcoming on their website about how their AI actually functions. We have no insight into what the AI is doing or how it's doing it. Are they using machine learning? Generative AI? How does the model actually identify concerning behavior and content? Does it analyze images and the content of websites, or does it just parse webpage metadata to flag concerning and explicit content?
AI is such a huge term, and it is used so cavalierly, that without additional context it's impossible to know what's actually happening. As a technologist and an ethicist, even I can't glean from the support documentation what their AI is actually doing.
Moreover, what actually constitutes explicit content? The website calls out "all adult, pornographic, and explicit websites." Based on that wording, we can assume that explicit content is something ontologically different from either adult or pornographic material, but it's not actually defined anywhere.
We already know that the AI can't distinguish between an action taken in the background by an application and an action taken explicitly by a user. We have to wonder what other limitations the AI has. For example, is a picture of a shirtless man considered adult or pornographic? How about a picture of a shirtless woman? Can their AI even make the distinction between a shirtless man and a shirtless woman?
Let's expand - assuming the AI processes images, how does it do with people of color? Does it have issues identifying explicit or non-explicit content featuring them? What about transgender folks? Is a man wearing a dress considered explicit or adult content according to the Covenant Eyes app?
These are common, well-documented pitfalls in AI detection, and even tech giants like Google, Amazon, and Microsoft - with some of the best minds in the world and functionally limitless resources - struggle with false positives and false negatives. It seems unlikely that a small company has solved wide-ranging issues like these.
Covenant Eyes isn't volunteering information about what their AI does or how it does it. Gaps and harms like the ones I've asked about above can easily be mitigated in the original use case through conversation. But in a law enforcement setting, the answers to these questions are critical and could literally change the course of someone's life.
Technical Misuse
One of the other features of the Covenant Eyes app is the ability to block any site of the Ally's choosing. In the case from the Ars article, the app blocked access to the website of The Appeal, a non-profit that focuses on injustice in the criminal-justice system. It's unclear whether The Appeal was on one of Covenant Eyes' known-bad lists or whether it was explicitly blocked by the parole officers. Judging by the content of the website, it seems unlikely to have ended up on a Covenant Eyes "known bad" list.
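Mechanically, this kind of blocking is trivial; the hard questions are all about who controls the list. A sketch of the power dynamic (again, hypothetical - Covenant Eyes doesn't document how its blocking works):

```python
# The vendor's "known bad" list and the Ally's personal list are enforced
# the same way; the surveilled person can't tell which one blocked them.
VENDOR_BLOCKLIST = {"some-adult-site.example"}   # maintained by the vendor
ALLY_BLOCKLIST = {"theappeal.org"}               # whatever the Ally chooses

def is_blocked(domain: str) -> bool:
    return domain in VENDOR_BLOCKLIST or domain in ALLY_BLOCKLIST

print(is_blocked("theappeal.org"))  # True - a journalism site, silently blocked
```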
The potential for misuse here is concerning. It allows law enforcement to unilaterally censor what content people have access to, without any kind of redress. It's chilling that the app blocked a site about injustice in the criminal system - through functionality that could easily lend itself to injustice.
The accused man's wife reported being afraid to communicate with her lawyer, fearing the parole officers would abuse their access to break attorney-client confidentiality. She was afraid to talk to her tele-therapist for the same reason - that the parole officers would violate doctor-patient confidentiality.
The Monroe County law enforcement officials are working extra-judicially, in a legal gray space that hasn't been explicitly litigated; this isn't a situation the law has had to account for before. As such, they face no repercussions for any of their actions against the family. And because the family consented to the surveillance, they've arguably also consented to sharing their privileged information whenever they access it on the surveilled devices.
The Provision Against Use by Law Enforcement
I want to highlight one more paragraph in the EULA. As part of Section 13:
Our Reports are intended for use in personal recovery from pornography struggles. As such, Covenant Eyes does not endorse or support the use of Reports in a premeditated fashion for legal purposes, e.g. to build a lawsuit or as specific evidence of misuse of the internet by parolees.
Despite this provision in the EULA, Monroe County, Indiana parole officers still used Covenant Eyes for a purpose explicitly contrary to the terms of use. Indiana isn't the only one, either - according to the source article, five other states and at least 60 other contracts have also used Covenant Eyes in a similar fashion. It is disingenuous for Covenant Eyes to say that their software shouldn't be used this way and then pursue engagements where they know it will be used in exactly the way they say they don't support. Moreover, it shows that they are aware of the limitations of the technology but still choose to do business with law enforcement.
The terminology in the provision is specific. They don't explicitly prohibit law enforcement from using their software; instead, they say they don't "endorse or support" it, which is much more ambiguous. Regardless, they are still entering into business agreements with law enforcement agencies all over the country to use their surveillance software.
Closing Thoughts
My article focuses primarily on the technology as a vector for harm, and I intentionally stayed away from the actual harm caused to the family. But it's important to acknowledge the human cost of using surveillance apps like Covenant Eyes - whether in a private or a law enforcement setting. It is very likely that, in using the tool the way they did, the Monroe County parole officers violated the family's First Amendment rights and their Fourth Amendment protections against unreasonable search and seizure. The wife was afraid to talk to her therapist and her lawyer for fear of police overreach. Her children were terrified to use their phones because of the constant surveillance. And they had done absolutely nothing wrong. Consent under duress - which is the only way this arrangement makes sense - isn't real consent. It's a form of legal bribery and extortion.
The gaps in how this software works, and in the types of information it can provide as it surveils, are substantial. While we can make a reasonable assumption that the person using Marko's phone was Marko, we cannot make a reasonable assumption that the person using Marko's phone was actually Clara without supporting evidence. (Marko and Clara aren't real people - just random names I picked to illustrate the example.)
The ability for law enforcement to prevent access to any website, without any oversight, is also concerning. Whether Monroe County's parole officers deliberately blocked a publication about injustice in the criminal-justice system is impossible to know, but it's an uncomfortable thought. In the same vein, it's unsettling to think law enforcement could pry into attorney-client or doctor-patient communications without oversight.
It’s unclear how exactly the courts would treat any evidence gathered that way - on one hand, they could say folks granted consent. On the other, how real is consent when the alternative is breaking up your family?
Personally, I find the use of surveillance applications like Covenant Eyes … distasteful, even between consenting adults … or families. Mostly, that's because of the underlying, never-quite-stated implication of sex shaming and immorality that inevitably comes bundled in. There are folks with diagnosed pornography addictions, and where a mental health professional recommends the application, I can see the value. But that's not who this is targeted at. It's targeted at churches, pastors, spouses, and friends.
Usage by law enforcement on folks who aren't accused of any crime is on a whole new level. How exactly law enforcement doesn't see this as overreach - or as a First Amendment lawsuit waiting to happen, or a Fourth Amendment one - is beyond me. I understand the desire to protect children from sexual exploitation, but this seems like taking a sledgehammer to a chicken egg.
It's hard to see how the pros here outweigh all the harm caused - hard to say the harm the surveillance inflicted was worth whatever the parole officers got out of it … and I'm not even sure what that was.
But at the end of the day, the folks in Monroe County, Indiana who signed the contract knew it was unethical to use the software this way. The folks at Covenant Eyes knew it was unethical to provide software they knew had gaps that would let law enforcement harm innocents. And they did it anyway.
Like what you’ve read? Subscribe for more!
Also: Check out my piece on the impact of AI on the Gaming Industry over at MassivelyOP.com