Parenting is challenging, even in the best of circumstances. Guiding the emotional and physical development of another human being is a massive responsibility. Throw a separation or divorce into the mix, and it’s easy to see how much more fraught the landscape can be.
Yet this is a common problem. Research from Penn State emeritus professor of family sociology and demography Paul Amato indicates that between 42% and 45% of marriages in the U.S. will end in divorce, resulting in approximately 50% of children experiencing divorce in their lifetimes. As this data doesn’t include parents who are separated or never married, the number of families impacted is likely much higher.
“In my 20 years on the bench, I witnessed countless families torn apart as they slogged through the family law system, battling over the simplest of co-parenting disagreements,” says Hon. Sherrill A. Ellsworth, former presiding judge of the Superior Court in Riverside County, California. “The reality is that most cases (up to 80%, in my experience) do not require legal intervention, yet that’s exactly where many families end up.”
So Ellsworth combined her legal expertise with the technical expertise of entrepreneurs Jonathan Verk and Eric Weiss to create coParenter, an app aimed at helping families collaborate on custody arrangements, child support payments, holiday scheduling, and other issues without conflict. The app just launched on iOS and Android and integrates texting and calendar tools with AI. Parents also have live, on-demand access to professional mediators who can help facilitate co-parenting decisions.
Verk says they began testing the app through court-based pilots in March 2017. “The results were astonishing,” he says. “Judges consistently ordered (or recommended) the platform five times more than we originally anticipated.” He says they rolled out another pilot in December 2017, hoping to acquire 5,000 users by the end of March. “We hit that number in the first week of February, validating our thinking that there would be significant consumer demand.” According to Verk, the pilots have resulted in 2,000 parenting plans and the resolution of more than 4,000 disputes. “We currently have 20,000 registered users,” Verk says, 4,100 of whom are monthly active users.
The app itself has a simple interface designed to function like existing and familiar calendar and SMS tools. Parents have access to all communications, agreements, important documentation, and other evidence if needed for a legal setting.
CoParenter enters a small but growing pool of similar competitors including Talking Parents, Our Family Wizard, and Coparently. However, its AI and live chat components are differentiators.
On the live professional side, Verk says his cofounder Ellsworth heads a team of professionals who vet, recruit, and train all who provide services through coParenter. These are usually experienced mediators who have worked in court, community, and private-practice settings.
Ellsworth says much of their initial training is focused on helping qualified providers transition from a physical, in-person mediation setting to one in which they’re delivering services over the coParenter platform. “Many of the most qualified professionals aren’t digital natives, so it takes some time getting familiar with best practices,” she explains.
Keeping the nonlegal co-parenting issues out of court
“Our professionals focus on specific, individual, and non-legal issues,” Ellsworth adds, noting that up to 80% of what people bring to court are non-legal co-parenting issues. The professionals help co-parents reach child-centric agreements.
Verk says these professionals are contracted by coParenter, though the platform can integrate with law and mediation firms, third-party providers, and even family court services that want to provide and charge for services on their own.
Should one parent choose not to use the app, the other can still use coParenter’s “SoloMode,” which provides the same features while delivering messages to the co-parent from a separate SMS number.
Using AI to stop fights: “We make it way harder to send that F-bomb”
On the AI side, the app’s natural language processing function can flag potentially contentious conversations and help parents rethink their communication before they press send. Cofounder Eric Weiss explains that at its simplest, the app uses language filters to flag curse words, inflammatory phrases, or offensive names. “It’s not hard to imagine how quickly a normal conversation can escalate into a full-blown argument by dropping a single F-bomb,” Weiss observes. “We make it way harder to send that F-bomb,” he says. “If a user overrides a warning and sends it anyway, the system flags the phrase and may make it available to appropriate third parties such as a judge, lawyer, or mediator because people behave better in daylight.”
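At its simplest level, the mechanism Weiss describes can be sketched as a keyword filter that warns before sending and records an override. The word list, function names, and return values below are illustrative assumptions for the sketch, not coParenter’s actual implementation:

```python
# Minimal sketch of keyword-based message screening: warn the sender
# before a flagged message goes out, and record it if they override.
# (Word list and API are hypothetical, for illustration only.)

INFLAMMATORY_TERMS = {"idiot", "liar", "hate"}  # placeholder list

def screen_message(text: str) -> dict:
    """Return which flagged terms appear, so the UI can warn before sending."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    hits = sorted(words & INFLAMMATORY_TERMS)
    return {"flagged": bool(hits), "terms": hits}

def send(text: str, override: bool = False) -> str:
    result = screen_message(text)
    if result["flagged"] and not override:
        return f"warned: {', '.join(result['terms'])}"
    if result["flagged"] and override:
        # Sent anyway, but recorded for possible review by a mediator or judge.
        return "sent (flagged for review)"
    return "sent"
```

A production system would of course go beyond exact-word matching (the article mentions natural language processing for contentious phrasing), but the warn-then-flag-on-override flow is the core of what Weiss describes.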
Weiss also explains how the AI can intervene when setting schedules. When one parent requests to have the child stay with them, the other doesn’t have to open any other apps to coordinate. “The AI pulls the dates and lets you know where it fits in the context of your custody schedule,” he says, “reducing the opportunity for stress, confusion, or conflict.”
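The schedule check Weiss describes can be illustrated with a toy custody calendar. The alternating-week model, anchor date, and function names here are assumptions made for the sketch, not the app’s actual logic:

```python
from datetime import date

# Illustrative sketch: evaluate a requested overnight against a simple
# alternating-week custody schedule, so a request can be answered in
# context without opening a separate calendar app.

def custodial_parent(day: date, anchor: date = date(2019, 1, 7)) -> str:
    """Alternate custody weekly, starting from an agreed anchor Monday."""
    weeks = (day - anchor).days // 7
    return "Parent A" if weeks % 2 == 0 else "Parent B"

def review_request(requester: str, day: date) -> str:
    """Tell the requester where the date falls in the custody schedule."""
    holder = custodial_parent(day)
    if holder == requester:
        return "already your scheduled day"
    return f"requires a swap: {holder} has custody on {day.isoformat()}"
```

Real parenting plans are far more varied (holidays, split weeks, travel), but this shows how a request can be answered with schedule context rather than a bare yes/no.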
Although Weiss can’t say exactly how many disputes the AI has helped resolve, he does point out that of the 20,000 people who have downloaded the app, only 3,000 have actually accessed a live professional, which means the remainder were able to resolve their issues through the automated AI features.
Saving on lawyer and court fees
Which is exactly the point, says Verk. The app isn’t free: parents pay a $12.99 monthly fee (which includes 20 credits, enough for two separate mediations), $119.99 annually (240 credits), or $199.99 annually for two co-parents (who each get 240 credits toward mediations). Verk maintains that’s minimal compared to what an attorney would charge. According to the LegalMatch law library, a child custody dispute can cost anywhere from $3,000 to $40,000, depending on the nature of the dispute. Other costs add up, too: as much as $30 to have the sheriff serve the other party, and court filing fees that can run as high as $300.
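The per-mediation arithmetic behind that comparison can be worked out directly from the figures above, assuming (per the stated plan details) that 20 credits cover two mediations, i.e. 10 credits per mediation:

```python
# Back-of-the-envelope cost per mediation under each plan described above.
CREDITS_PER_MEDIATION = 10  # inferred: 20 credits = two mediations

plans = {
    "monthly": (12.99, 20),
    "annual": (119.99, 240),
    "annual, two co-parents": (199.99, 480),  # 240 credits each
}

for name, (price, credits) in plans.items():
    mediations = credits / CREDITS_PER_MEDIATION
    print(f"{name}: {mediations:.0f} mediations at ${price / mediations:.2f} each")
```

On these assumptions, the annual plan works out to roughly $5 per mediation, against a $3,000-to-$40,000 range for a litigated custody dispute.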
CoParenter’s conflict prevention technology has helped 81% of the couples who resolved their disputes on the platform do so without the need for a mediator or legal professional.
Ellsworth also notes that demand on family law courts is increasing while resources are dwindling. “There is a consensus that family courts are in crisis,” she says, driven in part by self-represented litigants, who make up almost 85% of family law litigants and clog up the courts trying to navigate a legal system without legal expertise.
But one cost that can’t be measured is the toll custody battles and daily skirmishes can take on the children. “Too many children of separating, divorced, and never-married parents experience excessive levels of toxic stress from exposure to their parents’ ongoing conflict, in and out of court,” says Verk.
Amazon’s face recognition software is under fire from human rights groups, employees, and a growing number of its investors. In a shareholder resolution issued on Thursday, a group of investors are pushing the company to halt government sales of Amazon Web Services’s Rekognition, software that can identify and track faces, citing potential civil and human rights risks.
The American Civil Liberties Union (ACLU) raised concerns of racial bias in Rekognition after conducting tests last year, and hundreds of Amazon employees questioned the sale of the software in a letter and during a staff meeting in November. Amazon has sold Rekognition to law enforcement agencies in at least two states, pitched the software to U.S. Immigration and Customs Enforcement (ICE), and is now testing it with the FBI, according to the investor letter. The company has so far resisted calls to cease its government sales, including another petition signed by the ACLU and dozens of human rights groups earlier this week.
In their resolution, the investors propose that Amazon stop selling to government agencies unless it can use independent evidence to show that the technology doesn’t endanger human rights.
The Sisters of St. Joseph of Brentwood filed the resolution as shareholders and members of the Tri-State Coalition for Responsible Investment, which represents a group of investors with over $1.32 billion worth of total assets, according to a statement. The effort was organized by Open Mic, a non-profit organization focused on corporate accountability. Last June, another group of investors sent Amazon a similar set of demands.
“We filed this proposal because we are concerned that Amazon has pitched facial recognition technology to Immigration and Customs Enforcement (ICE) and piloted its Rekognition with police departments, without fully assessing potential human rights impacts,” Sister Patricia Mahoney said in a statement. The sisters hope Amazon will put the resolution to vote at this year’s annual shareholder meeting in May.
A spokesperson for Amazon declined to comment, but pointed to previous blog posts that tout what it says are the many benefits of face recognition software, including fighting child sexual abuse and human trafficking, finding missing children, and improving content moderation. Amazon has also said that the ACLU’s tests relied on a lower confidence threshold for recognizing faces than Amazon recommends “for use cases where highly accurate face similarity matches are important.”
In another letter sent to Amazon this week, the ACLU and more than 85 advocacy groups told Jeff Bezos, “Instead of acting to protect against the very real dangers of face surveillance, your company is ignoring community concerns and further pushing this technology into the hands of government agencies.” The groups also asked Google and Microsoft to cease their government sales.
Calling for Amazon and other large tech companies to stop their sales of facial recognition software to the government may have limited practical impact: The software is already being used by law enforcement around the world, as well as in stores, casinos, and airports. And governments, along with an untold number of private entities, can purchase face recognition and other surveillance tools from dozens of other firms, or can rely on a variety of open-source software.
Some companies like Axon have established ethics boards to self-police new products like face recognition, but privacy advocates insist that only stronger regulation and transparency regimes can limit the risks to the public. As the Amazon investors note, even Microsoft’s president has joined the calls for new face recognition laws.
Read the full resolution below:
Risks of Sales of Facial Recognition Software Amazon.com, Inc. – 2019
Whereas, shareholders are concerned Amazon’s facial recognition technology (“Rekognition”) poses risk to civil and human rights and shareholder value.
Civil liberties organizations, academics, and shareholders have demanded Amazon halt sales of Rekognition to government, concerned that our Company is enabling a surveillance system “readily available to violate rights and target communities of color.” Four hundred fifty Amazon employees echoed this demand, posing a talent and retention risk.
Brian Brackeen, former Chief Executive Officer of facial recognition company Kairos, said, “Any company in this space that willingly hands [facial recognition] software over to a government, be it America or another nation’s, is willfully endangering people’s lives.”
In Florida and Oregon, police have piloted Rekognition.
Amazon Web Services already provides cloud computing services to Immigration and Customs Enforcement (ICE) and is reportedly marketing Rekognition to ICE, despite concerns Rekognition could facilitate immigrant surveillance and racial profiling.
Rekognition contradicts Amazon’s opposition to facilitating surveillance. In 2016, Amazon supported a lawsuit against government “gag orders,” stating: “the fear of secret surveillance could limit the adoption and use of cloud services … Users should not be put to a choice between reaping the benefits of technological innovation and maintaining the privacy rights guaranteed by the Constitution.”
Shareholders have little evidence our Company is effectively restricting the use of Rekognition to protect privacy and civil rights. In July 2018, a reporter asked Amazon executive Teresa Carlson whether Amazon has “drawn any red lines, any standards, guidelines, on what you will and you will not do in terms of defense work.” Carlson responded: “We have not drawn any lines there…We are unwaveringly in support of our law enforcement, defense, and intelligence community.”
In July 2018, lawmakers asked the Government Accountability Office to study whether “commercial entities selling facial recognition adequately audit use of their technology to ensure that use is not unlawful, inconsistent with terms of service, or otherwise raise privacy, civil rights, and civil liberties concerns.”
Microsoft has called for government regulation of facial recognition technology, saying, “if we move too fast, we may find that people’s fundamental rights are being broken.”
Resolved, shareholders request that the Board of Directors prohibit sales of facial recognition technology to government agencies unless the Board concludes, after an evaluation using independent evidence, that the technology does not cause or contribute to actual or potential violations of civil and human rights.
Supporting Statement: Proponents recommend the Board consult with technology and civil liberties experts and civil and human rights advocates to assess:
• The extent to which such technology may endanger or violate privacy or civil rights, and disproportionately impact people of color, immigrants, and activists, and how Amazon would mitigate these risks.
• The extent to which such technologies may be marketed and sold to repressive governments, identified by the United States Department of State Country Reports on Human Rights Practices.
Related: A New York City lawmaker is taking on companies that mine your face