Innovation to Research – Why Court Teams Shouldn’t Be Afraid to Try New Solutions

‘Hey Matt, what does the research say about [X]?’

I relish this question, because I love helping practitioners turn research data into something they can apply to their courts. It’s my favorite part of my job.

I am a researcher by training, a training and technical assistance (TTA) provider by trade, and I love when these two fields merge seamlessly. But sometimes they don’t. For example, a judge recently asked me whether it is better to hold closed or open hearings, because they’d heard different perspectives from different organizations. The truth is, there isn’t a clear evidence base to answer this question: you can make a pretty convincing case either way, and it may be up to the individual court to see what works best for them.

I want to preface this by saying unequivocally that certain evidence-based best practices are absolutely essential, and deviating from them will cause harm to young people. For example, we know that detention doesn’t reduce criminogenic behaviors. We also know that juvenile drug treatment courts (JDTCs) are far better at serving high-risk youth, and that, because of the nature of the intervention, low-risk youth can be harmed by JDTCs. But despite the Office of Juvenile Justice and Delinquency Prevention’s publication of research-informed best practices for JDTCs in its 2016 Guidelines, there are still areas of policy and practice that, much like the open court hearing question, either haven’t been studied or don’t have a clear answer based on the evidence we have.

In a recent conversation about making JDTC court hearings more trauma-informed, a therapist said, “We wondered about having the court hearings around a conference table instead of the judge on the bench. What is the research around this?” To be clear, I am not certain whether there is research on this, but I am pretty sure that its impact hasn’t been examined in JDTCs trying to reduce trauma.

My response? “I believe there are ten or so research groups working full time on studying JDTCs, and five of them are at this conference, so you can ask them to look into it, but it will take them a while!”

“I think there’s a better way. You do the research. It doesn’t need a bulletproof methodology or complicated statistical analysis. You can test your idea; you can develop your own local evidence base. Start by conducting the court hearing around a conference table for, say, two months. Before the change, ask parents and participants how they feel about the court hearings (level of anxiety about court, comfort speaking up, etc.). Then ask the same questions after. In addition, look at short-term outcomes: are people attending court more, speaking up more, responding better? Then decide whether it works or not.”

Poor guy. I’m sure he was hoping I’d just answer his question. But I stand by what I said, and here’s why.

Firstly, drug courts were innovative when they started in 1989. We knew that the court-and-prison cycle wasn’t working for those with a substance use disorder (SUD). We knew SUD treatment worked, but no one had put the two together like this, and the concept of therapeutic jurisprudence was an idea we were still developing. Over the years we added formal screening and assessment instruments to the eligibility determination, developed integrated case management plans, and incorporated motivational interviewing into the courtroom. We also realized we needed specialized courts for other struggling populations, such as veterans, juveniles, and alleged perpetrators of domestic violence, and the list continues.

Research can struggle to keep up. I believe that JDTCs are full of smart, compassionate people, who much like the early drug court innovators, can spot issues and develop thoughtful responses without waiting for research to validate the idea. If your court team identifies a need and thinks they have a possible solution, try it!

Secondly, the foundation of the TTA we provide at the Justice Programs Office (JPO) is that every court is different, every population unique, and every jurisdiction nuanced. What works well in one place may not in another. For instance, one court may find that holding separate hearings for boys and girls reduces trauma and promotes camaraderie among participants, while another may find that mixing the genders reduces stressors for LGBTQ participants who don’t feel at home in a single-sex courtroom. As I mentioned earlier, we have excellent evidence supporting certain practices, which we shouldn’t deviate from, but other policies and practices may be more situational.

Thirdly, courts may find that they need to change their practices over time as participant, parent, or court team demographics change. If all the participants at one point in time attend one or two of the local schools, holding court or probation officer check-ins in the school may make sense. However, if the population shifts such that twenty schools are now represented in the JDTC, that may no longer be practical, even if it was deemed successful.

Finally, it is important to define success. While this isn’t a rigorous evaluation process, it is important to state your goal and objectives. This is your data: you should determine what data points will be collected and how you will measure success. Courts can use surveys to gather insights on short- and long-term change. It can be tough to determine the exact impact of a change in policy or practice, and I’ve seen too many courts make hard-to-prove claims about a practice change. To avoid this, it’s important to be specific and careful in determining what caused the success. For example, a court told me their new judge drastically improved their graduation rate, but the new judge arrived alongside a new probation officer and new sources of funding. Be specific about what the judge did differently and what other circumstances changed.

This is a call to empower you. If there’s something your team thinks may work for your population, first check that it’s not contrary to current best practices, and then try it. After a short while, ask questions of your program and the data you’ve collected. Did those changes achieve your goals? Are your participants still on track to be crime- and drug-free? If the answer to these questions is yes, then keep doing it, and contact a friendly researcher to help you formally evaluate the practice, so others can learn from it too.

Ultimately, JDTC teams are far closer to the issues and solutions than researchers, so innovation in JDTCs should drive research – not the other way around!

Matt Collinson is a Senior Programs Associate at the Justice Programs Office working on the Juvenile Drug Treatment Court Initiative.