Court: Judges Can Consider Predictive Algorithms in Sentencing


  • Court: Judges Can Consider Predictive Algorithms in Sentencing

    This seems like it could be very scary and, in my opinion, ripe for abuse.

    It seems very "Minority Report"-ish.

    The Wisconsin Supreme Court has ruled that judges can use "predictive" algorithms to see if someone will offend again, and sentence accordingly...

    http://www.msn.com/en-us/news/crime/...BuiWWf#image=1

  • #2
    I'd be fine with guidelines and some kind of recommendations on how to sentence based on a well-studied risk assessment. But what I don't like about this is that there is no transparency: you don't know what they're assessing or what biases they're including. I'm all for helping judges do their job with an objective viewpoint, but they shouldn't trust a report from a source that won't even tell them how it came to its conclusions.



    • #3
      This stuff gives me the heebie-jeebies; sentencing someone for what they might do in the future is wrong. It won't be long before this is up before SCOTUS, and I predict it will be shot down.
      I can understand that a lot of criminals return to a life of crime, but there are those who don't. I think someone in that whole mess is watching too much TV. This further surprises me coming from WI; I've always thought of it as a moderate to left-leaning state.
      Cry Havoc and let slip the marsupials of war!!!



      • #4
        While the algorithms in question are frequently flawed, and there's more context than almost any system seems able to take into account, I'm wary of presenting this as sentencing someone for a crime they have yet to commit.

        Judges taking the question of "how likely is someone to commit another crime?" into account is something I have no issue with. After all, the purpose of prison is to help criminals rehabilitate, and to prevent them from hurting anyone else in the process. If you're more likely to commit a crime after getting out, then it'd be better if you stayed in.

        I do still disagree with the ruling, though, especially on the grounds that COMPAS does not make clear how its questions are weighted. It's great to have a tool other than a judge's personal judgment, which, as in the case of Brock Turner, can frequently hinge on personal views. Sentencing letters can be helpful, and character testimony from someone who knows the person is good, but SOMEONE is going to say a nice thing about anyone. Ideally, our sentences should be guided by some principle that can itself be judged, not just by personal intuition.

        That said, the fact that the COMPAS test, which so many sentences now rest on, is opaque about how it works is a serious issue. It should be something criminologists and psychologists can give feedback on, and that can be challenged. If a judge is going to hand down a light or heavy sentence based on the test, then we ought to be able to learn whether the test WORKS.
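To make that concrete, here's a minimal sketch of the kind of audit you'd want to be able to run against any risk tool: compare its scores against outcomes actually observed later. All the numbers below are invented for illustration; this is not real COMPAS output. The AUC statistic is the probability that the tool ranks a randomly chosen reoffender above a randomly chosen non-reoffender.

```python
# Toy audit of a risk score against follow-up outcomes (invented data).
# AUC: probability the score ranks a random reoffender above a random
# non-reoffender. 0.5 is a coin flip; 1.0 is perfect ranking.

def auc(scores, outcomes):
    pos = [s for s, y in zip(scores, outcomes) if y == 1]  # reoffended
    neg = [s for s, y in zip(scores, outcomes) if y == 0]  # did not
    # Count pairs the score ranks correctly; ties count half.
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical risk scores (1-10) and outcomes (1 = reoffended later).
scores   = [9, 8, 7, 7, 6, 5, 4, 3, 2, 1]
outcomes = [1, 1, 0, 1, 0, 0, 1, 0, 0, 0]
print(round(auc(scores, outcomes), 2))  # 0.85 on this toy data
```

An opaque tool blocks exactly this kind of check: without the scores and the weighting, outside researchers can't measure whether the ranking beats a coin flip.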
        "Nam castum esse decet pium poetam
        ipsum, versiculos nihil necessest"



        • #5
          I'd think the parole system also helps reduce the burden on the judge to sentence based on something as murky and unknowable as the chance of reoffending.

          Give them a long sentence with an earlier parole eligibility date. Have qualified experts and professionals monitor their rehabilitation, and if they believe someone is eligible for parole, let them out with a probationary period.

          I don't have numbers on how well this works, but I can't imagine it's a terrible system, unless the method for determining eligibility is flawed.



          • #6
            The sentencing details this test affects surely themselves affect how likely someone is to reoffend. So if the test is both widespread and occasionally recalibrated, it could create a self-fulfilling feedback loop.
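That loop is easy to sketch. This is a toy model with invented numbers, not a claim about COMPAS: suppose score-driven sentencing severity itself adds some real reoffense risk, and the tool is periodically "recalibrated" to match the rates it helped create.

```python
# Toy model of a score/recalibration feedback loop (all numbers invented).
BASE_RATE = 0.30      # the group's true baseline reoffense rate
CRIMINOGENIC = 0.10   # assumed extra risk added per unit of score-driven severity

predicted_risk = BASE_RATE  # the tool starts out accurate
for generation in range(5):
    # Harsher treatment driven by the score raises the observed rate...
    observed_rate = BASE_RATE + CRIMINOGENIC * predicted_risk
    # ...and recalibrating to observed data locks the increase in.
    predicted_risk = observed_rate
    print(f"generation {generation}: predicted risk = {predicted_risk:.4f}")
```

Under these assumptions the score settles near 0.3333, the fixed point BASE_RATE / (1 - CRIMINOGENIC), above the true baseline even though nothing about the people changed.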
            "My in-laws are country people and at night you can hear their distinctive howl."



            • #7
              "This stuff gives me the heebie-jeebies; sentencing someone for what they might do in the future is wrong."
              Here's the thing, though: the public idealizes the way they imagine sentencing works now versus the way it actually does.

              When sentencing, prosecutors play on jurors' biases to obtain the maximum sentences possible, and vice versa. If you're asked to consider the death penalty in Texas, they'll tell you it's because of the safety of "other inmates." I've been in jury selection for one of those cases, and that was actually said. You are asked to predict the future to some degree. So rather than statistical facts, you rely on a very sophisticated system of bias and manipulation now. That's how those decisions are being made.

              And from that we know: if you are ugly, you are sentenced more harshly. If you are male, you are sentenced more harshly. If you are a person of color, you are sentenced more harshly. If you are poor, you are sentenced more harshly. So that's your "fair" baseline. When you're dealing with judges, it's the same thing in terms of decisions.

              Data-driven judgment is occurring largely because we know, and can quantify, the gaps in the current system. And the current system already makes these same snap judgments in everything from bail setting to parole granting. So if you're worried about self-reinforcement, don't worry: we're already doing that. With numbers, we can easily adjust weighting based on performance. With people, all we can do is suggest that maybe they have some bias. I'm not being pro or con here; I'm just saying that's really the crux of the argument for its consideration.
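For what it's worth, "adjust weighting based on performance" can be as simple as this sketch, with all numbers invented: compare each group's predicted rate to the rate observed in follow-up data, and nudge the model toward it, damped so one noisy study doesn't swing it.

```python
# Hypothetical predicted vs. observed reoffense rates for two groups.
predicted = {"group_a": 0.45, "group_b": 0.30}
observed  = {"group_a": 0.32, "group_b": 0.31}

LR = 0.5  # damping: move only halfway toward the observed rate each cycle
recalibrated = {g: predicted[g] + LR * (observed[g] - predicted[g])
                for g in predicted}
print(recalibrated)  # group_a pulled down toward 0.32, group_b barely moves
```

There's no equivalent mechanism for nudging an individual judge's internal weights, which is the point being made above.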
              Last edited by D_Yeti_Esquire; 07-16-2016, 06:08 PM.

