Amy Solomon is the Senior Advisor to the Assistant Attorney General in the Office of Justice Programs at the U.S. Department of Justice. Prior to that, she was a Senior Research Associate at the Urban Institute’s Justice Policy Center.
When you started your research, did you expect to get the findings that you did?
Not at all. With co-authors Avi Bhati and Vera Kachnowski, I looked at who was coming out of prison with or without supervision in various states and what was happening to them. I thought we would find that high-risk offenders would be more likely to serve out their entire sentence and get released without supervision. And I thought this group would be re-arrested more often than those released to parole. In other words, I thought we’d find that the people who most needed supervision would be least likely to get it – and as a result they would re-offend more. I expected the punch line to be that states need to ensure there’s a period of post-release supervision for the high-risk group.
What did you find instead?
It turns out that the demographic profile and criminal risk factors of unsupervised offenders were very similar to those of mandatory parolees, who are released by state mandate rather than by parole boards, which have the discretion to let people out early. The fact that mandatory parolees and unsupervised offenders are so similar makes for a great natural experiment: what difference does parole supervision make? Our findings showed that there was virtually no difference in re-arrest between these two groups. The parolees released by the parole board did do slightly better than the other two groups, but a lot of that may have been because they were a more motivated group for successful reentry – which is what made them attractive to the parole board in the first place.
How did the writing process go? Were you concerned about how to handle these findings?
Our first draft looked very different from the final version. We initially focused on the fact that parolees released by the decision of the parole board did slightly better than mandatory parolees and unsupervised offenders. We submitted a 20-page draft to reviewers and ended up with 11 pages of single-spaced comments from one reviewer suggesting that we were framing our results the wrong way. What was interesting to our reviewers was that parole supervision didn’t seem to make much difference in our study. We ended up rewriting the report – not the findings, but the interpretation.
How was the report received?
It was a bitter pill. I was warned by a few people that there was going to be a reaction, but I wasn’t ready for how strong – and personal – it would be. Many people in the parole field were upset and offended. I got yelled at in meetings and even got some hate mail. Some of the criticisms were legitimate. For example, we were characterizing parole in a general way, but we all know there are 50 very different state systems in place – some might be operating effectively, others not. We were reporting on the net findings from 15 large states that made up about two-thirds of all prison releases in 1994. We said all this in a discussion section, but the critics are absolutely right that people care more about the headlines than about what is in the discussion section.
Did people also react to the title of the report?
Yes. We called it “Does Parole Work?” There were some people who didn’t like the title because it reminded them of a famous article written 30 years ago by a social scientist named Robert Martinson that was widely – and incorrectly – characterized as saying that “nothing works” in corrections programming. To me, the title seemed like the right question to ask, but in retrospect I understand how it hit a sore spot. In reality, our conclusion was not that parole can’t work, but that parole as it exists now in many states is not very effective at reducing crime among the parole caseload. But it’s easy for that message to get lost.
What’s the counter-argument to this critique?
We could easily have buried the findings by giving the report a more academic title, which would have been a shame. We all have evaluations and research with rich findings that fly completely under the radar. I’ve done a lot of things, but this is what got attention. The report got about 20,000 hits on our website in the first month, which is a lot. I think the title made a big difference in getting public attention for an important topic. What it did was pose a clear-cut question and answer, which appeals to people.
What’s the aftermath of the report been?
I tried not to talk about parole for a while! But before long, I was able to start speaking to people in the field to find out: where do we go from here? I think one positive is that many parole leaders are now saying, more than they would have a few years ago, that supervision in many states doesn’t look anything like what we would consider “best practice.” We have to make the practice look more like the ideal.
Three years later, are you optimistic about parole’s ability to impact public safety positively?
Absolutely. I think we still have a long way to go, but I’m optimistic. There is an emerging consensus about what good supervision looks like. We have worked with a number of partners in the field to identify strategies that will enhance parole practice – and I’m optimistic that if these new strategies are implemented, we will see more successful outcomes. We will publish this paper in collaboration with other organizations in the fall. We are also about to survey more than 1,600 parole field offices to find out the extent to which they are using evidence-based practices and innovative strategies, in order to better understand what drives innovation in the parole arena. We plan to hold a reentry roundtable with national experts and practitioners to discuss the findings, and then to create some kind of national policy academy to test these ideas out in the field.