Many of us went to bed thinking about the Dodgers’ trade for Manny Machado. Many of us then woke up and turned our attention to the Indians’ sudden trade for Brad Hand and Adam Cimber. Travis Sawchik just wrote about the trade at length. Read that, if you’re looking for specifics. Read that, if you’re looking for an explanation of why the Indians gave up a consensus highly rated prospect. I don’t know what’s actually going to be left for the trade deadline itself, but this has all made for a delightful All-Star week.
From the Indians’ side, this isn’t just about 2018. It’s about 2018 and beyond, because, this coming fall, Andrew Miller and Cody Allen will become free agents. Hand is under contract through 2020, and there’s a club option for 2021. Cimber only just made his debut on March 29. The Indians are thinking both short- and longer-term, and they believe they now have a couple bullpen stalwarts. This is a huge boost for this coming October, but this also reduces the team’s urgency to build out the pen over the winter. The most important pieces might already be in place.
Thinking about the Indians’ side has made me wonder something. Is there actually such a thing as a long-term good reliever? My instinct for a while has been that teams out of the race should try to cash in their good relievers, because the position is just so volatile. I’ve been thinking about nearly every reliever as a short-term value. I wanted to see what the numbers actually say. So here are the results of a quick little study. It didn’t go exactly how I thought.
I isolated the previous decade, looking at the window from 2007–2017. Since numbers for relievers alone are no good without context, I looked at position players, starting pitchers, and relievers. I decided to see how often players at different positions would repeat their performances, by which I mean how often they’d exceed certain performance thresholds. Over the window, for player-seasons with at least 250 plate appearances, about 24% of position players reached 3 WAR. For player-seasons with at least 50 innings in the rotation, about 23% of starting pitchers reached 3 WAR. For player-seasons with at least 30 innings in the bullpen, about 21% of relief pitchers reached 1 WAR. My WAR thresholds, then, were 3, 3, and 1. I think they work well enough, and everyone appreciates integers.
I looked one year out, and two years out. So, for position players, I looked at everyone who reached at least 3 WAR in a year between 2007 and 2016. Of those players, 48% reached at least 3 WAR the following year. Meanwhile, of everyone who reached at least 3 WAR in a year between 2007 and 2015, 40% reached at least 3 WAR two years later. I repeated this kind of analysis for the pitchers.
For starting pitchers, using WAR, I found a 52% repeat rate for the following year, and a 42% repeat rate for two years later. I also looked at the RA9 version of WAR — the version that’s based on actual runs allowed. Using RA9-WAR, I found a 44% repeat rate for the following year, and a 37% repeat rate for two years later.
For relief pitchers, using WAR, I found a 45% repeat rate for the following year, and a 32% repeat rate for two years later. Using RA9-WAR, I found a 43% repeat rate for the following year, and a 33% repeat rate for two years later.
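For anyone curious about the mechanics, the repeat-rate calculation described above can be sketched in a few lines of Python. The player names and WAR values below are invented purely for illustration, and the function is my own reconstruction of the method as described, not anything from the actual study (which used real 2007–2017 player-season data):

```python
# Sketch of the repeat-rate calculation: of the player-seasons that
# cleared a WAR threshold, what share cleared it again N years later?
# All data below is made up for illustration.

def repeat_rate(seasons, threshold, years_out, last_base_year):
    """Share of qualifying player-seasons (WAR >= threshold, in or
    before last_base_year) whose player also cleared the threshold
    years_out seasons later."""
    # index WAR by (player, year) for quick lookup
    war = {(p, y): w for p, y, w in seasons}
    qualifiers = [(p, y) for p, y, w in seasons
                  if w >= threshold and y <= last_base_year]
    repeats = [(p, y) for p, y in qualifiers
               if war.get((p, y + years_out), 0.0) >= threshold]
    return len(repeats) / len(qualifiers) if qualifiers else 0.0

# toy data: three hypothetical relievers across three seasons
data = [
    ("A", 2015, 1.4), ("A", 2016, 1.1), ("A", 2017, 0.4),
    ("B", 2015, 1.2), ("B", 2016, 0.6),
    ("C", 2015, 0.8), ("C", 2016, 1.5), ("C", 2017, 1.3),
]

# 1+ WAR repeat rate one year out, with 2016 as the last base year
print(repeat_rate(data, 1.0, 1, 2016))  # → 0.5
```

The `last_base_year` cutoff matters: a 2017 qualifier can’t have a 2018 follow-up in an 2007–2017 dataset, so it has to be excluded from the base rather than counted as a failure.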
It might be annoying to read all that text, so here’s the same information in easily consumable table form:
Performance Repeats, By Position
| Position | Threshold | Year 2 | Year 3 |
| --- | --- | --- | --- |
| Position Player | 3+ WAR | 48% | 40% |
| Starting Pitcher | 3+ WAR | 52% | 42% |
| Starting Pitcher | 3+ RA9-WAR | 44% | 37% |
| Relief Pitcher | 1+ WAR | 45% | 32% |
| Relief Pitcher | 1+ RA9-WAR | 43% | 33% |
The numbers for relievers are better than I thought they would be. My instinct has been that relievers just break down or get worse, and that the sustainable ones are the exceptions. Relievers do look the least reliable in the table, but not by very much. Relievers get worse, but so do starters and hitters. Two years out, only a third of relievers have stayed above the WAR threshold, but the difference between a third and two-fifths isn’t as dramatic as I figured the research would turn up.
It should be clear this isn’t conclusive. You could put together a much better study, if you had more time to work with, and maybe relievers really break down three years out. Also, this kind of analysis treats every performance above the given threshold the same, even if someone’s WAR were to go down by, say, 50%. I don’t think that has a major effect on the outcome, but I could be wrong. Additionally, it’s been suggested to me that there might just be a core group of elite and reliable relievers. Outside of them, perhaps the rest of the player pool is incredibly volatile. It’s possible! But one might suggest the same thing about elite starters and elite position players. Every position has a highest tier of players. Those are the players we should believe in the most, with the other players seeming more interchangeable.
To get back to the main question: Good relievers don’t seem to hold up as well as other players, but the differences in the rates are pretty small. Maybe the real takeaway is that most good players, regardless of position, look worse a couple years later, but as far as the Indians are concerned, Hand and Cimber are real long-term players, much as Francisco Lindor and Jose Ramirez are real long-term players. Of course, Hand and Cimber are worse than Lindor and Ramirez, but the fact that they’re relievers isn’t as damning as I had suspected. Hand might get injured throwing any given pitch. Cimber’s funky delivery might get figured out in any given inning. There is legitimate risk, if only because there’s risk with every pitcher. Talent, though, has staying power. The Indians aren’t wrong to look beyond 2018, because Hand and Cimber are good relievers *now.*