Users can converse with Deevi anonymously, providing information to the chatbot through their mobile device or computer.
The chatbot introduces itself as a space where users can share information that it can use to help them work through their situation.
“Can you tell me about what you have been experiencing?” Deevi queries users in its early exchanges.
A fictitious scenario is given to test the bot: “My partner pushed me down the stairs and I am locked in the bathroom”.
Deevi then informs the user that this treatment may amount to domestic violence, identifying the allegation as potentially constituting physical abuse. It then provides a simple, non-exhaustive dot-point definition of physical abuse and a link to a government resource on domestic violence.
“According to the law, domestic violence includes physical abuses such as: direct assault on the body (choking, shaking, eye injuries, biting, slapping, pushing, spitting, burning, punching or kicking); use of weapons, including objects; assaulting the children; locking the victim in or out of the house or rooms; forcing the victim to take drugs; withholding medication, food or medical care; and sleep deprivation,” Deevi says.
The chatbot does have some limitations, however. For example, once a particular line of inquiry is pursued, the AI app will not engage conversationally by answering questions in the same way that a real person on the other end of a tele-helpline might.
Australian NewLaw provider LawPath developed the chatbot using IBM’s Watson AI platform. It is the first such legal product to be built using the technology to assist clients of legal services directly.
Dom Woolrych, the CEO of LawPath, said that the chatbot relied on the AI technology to scan responses given to Deevi and establish whether the situation described by the user was considered domestic violence under the relevant laws.
“Watson will scan the legislation to match the responses and learn each time it does,” Mr Woolrych said.
“Deevi can then provide general information to victims about their legal rights and how to access further support services.”
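The article does not detail how Watson matches a user's description against the legal categories, but the general idea of mapping free text to categories of abuse can be sketched with a toy keyword classifier. This is purely illustrative: the category names and keywords below are invented for this example, and the real product relies on IBM Watson's trained NLP models, not keyword lookup.

```python
# Toy sketch of mapping a free-text description to abuse categories.
# Categories and keywords are invented for illustration; Deevi itself
# uses IBM Watson's NLP models rather than keyword matching.

ABUSE_CATEGORIES = {
    "physical abuse": ["pushed", "hit", "choked", "slapped", "kicked",
                       "locked in", "locked out", "punched"],
    "verbal abuse": ["threatened", "screamed at", "insulted"],
}

def classify(description: str) -> list[str]:
    """Return the abuse categories whose keywords appear in the text."""
    text = description.lower()
    return [category
            for category, keywords in ABUSE_CATEGORIES.items()
            if any(keyword in text for keyword in keywords)]

print(classify("My partner pushed me down the stairs "
               "and I am locked in the bathroom"))
# ['physical abuse']
```

A real system would also need to handle paraphrases and negation, which is where a trained model such as Watson's earns its keep over simple matching.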
Mr Woolrych stressed that the chatbot was no substitute for support services but did complement the existing support centres and resources available to DV victims. He added that the bot was a useful tool to connect victims with services nearby.
“Although there are many chatbots appearing in the legal space these days, this is one of the first to use AI technology directly with a client,” Mr Woolrych said.
“Hopefully, Deevi will help users find services specifically for their situation and location.”
Deevi was built by LawPath interns and final-year law students, who are given the title of ‘legal engineer’. Mr Woolrych said that the interns were tasked by the NewLaw group with building an innovative type of legal software for the firm.
“The LawPath team set out to not just talk about law and AI, but to actually show how it can be applied,” he said.
“The project shows how artificial intelligence can be used to tackle social justice issues, and also serves as a test of how we might use AI in a more commercial context.”