League presidents were first required by rule to appoint official scorers in 1957. Now the job is in the hands of the commissioner’s office since there are no more league presidents, even though the rule book continues to make frequent reference to them. Not until 1950 was it even stated in the rule book that an official scorer was an accredited representative of the league, although by then he had been long recognized as such. The pay for MLB official scorers in 1950 is unknown, but nowadays the position pays $180 per game. Even though the job on a daily basis can be so time consuming that it sometimes pays less than the minimum wage per hour spent, some official scorers love their work so much they would do it for free.
For many years, teams customarily awarded official scorers’ jobs to favored sportswriters who could use the extra money the assignment paid. In the nineteenth century, a club was not required to divulge the identity of its official scorer. It was felt that shrouding the position in secrecy would protect the scorer from players and fans who might otherwise attack or subvert his decisions. From 1882 to 1891, the Chicago White Stockings kept the name of their official scorer a complete mystery. Later it emerged that Elisa Green Williams (the mother of the club’s future treasurer C. G. Green), who signed her scoresheets as E. G. Williams, had been awarding hits and errors in all the team’s home games. Williams would sit primly between two players’ wives, seemingly no more than a spectator, although secretly she was keeping careful score of every play. After each game her son would mail her scoresheets to the National League office, unaware of what the envelope contained.
Given that their job was a sinecure and subject to the whims of the club that paid them, official scorers—particularly in the nineteenth century—were frequently accused of favoring home team players in their rulings. Here is what one reporter had to say about Baltimore Orioles outfielder Willie Keeler’s march to the National League batting crown in 1897:
John Heydler, who is one of the best known baseball scribes in the business, says exception should be taken to this over generous scoring and that Keeler’s figure of .432 will not agree with any private accounts. [Second baseman] Frank Houseman of St. Louis also has objections to Baltimore scoring methods. He says: “Down in Baltimore, one day, Keeler sent two flies to Lally [left fielder Dan Lally of St. Louis], who muffed both of them. Then he hit to Hartman [St. Louis third baseman Fred Hartman] and the latter fumbled and then threw wild. Then Keeler made a good single. The next morning four hits appeared to Keeler’s credit in the Baltimore papers. Talk about Cleveland stuffing Burkett’s average, why, they are not in it with the oyster scribes of Baltimore.”
Jimmy Ryan, considered by this author to be the true 1888 National League batting leader. Had he received his due award he would have been its first winner to bat right and throw left.
Keeler’s batting average was later reduced to .424 when discrepancies were discovered in his hit and at bat totals. Yet to be revised are the suspect final averages of several other purported batting title winners of Keeler’s time. A particularly murky season is 1888, when Chicago first baseman Cap Anson was crowned the National League’s leading hitter with a .344 average, even though this author’s calculations indicate that he hit only .317 and the batting title properly belonged to Anson’s teammate, outfielder Jimmy Ryan, with a .328 mark. Ryan appears never to have called for an investigation, but a good deal of enmity built up between him and his player-manager in the years following 1888, until Anson finally left Chicago after the 1897 season.
The most famous person born in Marshalltown, Iowa, Cap Anson, was still playing major league baseball at age 45. Actress Jean Seberg, arguably the second most famous person born in Marshalltown, was dead at age 40.
Protested games have been part of major-league baseball since its inception, but the plethora of suspended games is a relatively recent development and requires an official scorer to practice undue diligence, since the rosters of many teams change almost daily. It is not uncommon for a player on one team in a suspended game to be in the uniform of the opposition when the game is resumed, and in some instances the official scorer may not be the same. This makes it extremely important that the notes of the original official scorer be thoroughly digested before a suspended game is resumed. Otherwise it is entirely possible that a player on Team A who left the game prior to its suspension might illegally participate in its resumption for Team B. But if that player on Team A is still in the game at the time it is suspended, he is available to play for Team B if he has joined it in the interim.
There are 18 different batters’ and runners’ statistics that need to be in the report, ranging from (1) number of times batted, except for the four instances when no time at bat is charged—sacrifice hits, walks, hit by pitches, and awards of first base via interference or obstruction—to (18) number of times caught stealing. In addition, at various junctures in history, the official scorer has also been saddled with the following duties:
During the 1980s, an official scorer was also required to furnish his league’s office with the name of the player who collected the game-winning RBI in each contest. The experimental category lasted just nine seasons before it was given a quiet burial in 1989. At that, it endured much longer and was much better received by the baseball public than several other experimental categories over the years. The most interesting one may have been an official scorer’s nightmare that was labeled “Total Bases Run.” This invention survived all of one season: 1880. That year, National League official scorers were ordered to input the number of bases each player touched safely in their game reports. The totals were then computed to determine the Total Bases Run champion for the season. Since the category had only a one-year life span, there was only one champion. The winner with 501 bases safely touched was Abner Dalrymple of the Chicago White Stockings, the owner of many other obscure records.
During the National League’s inaugural season of 1876, a batter was socked with a turn at bat every time he walked. Punished most by the rule was free pass leader and batting champion Ross Barnes, who collected 20 walks and thus 20 extra times at bat. The rule was rescinded in 1877, but then resurrected ten years later. The upside in 1887, however, was that each walk was also scored as a hit. This bonus swelled batting averages to astronomical proportions. Tip O’Neill topped the American Association with a .492 mark and Cap Anson’s .421 figure paced the National League.
In 1968, the Special Baseball Records Committee voted to treat a base on balls as neither a hit nor a time at bat, and made its judgment retroactive to 1876. This ruling meant statisticians had to recalculate batting averages for both the 1876 and 1887 seasons after bases on balls were deducted from the at-bat totals for each player (and, for 1887, from the hit totals as well). Averages jumped in the former year, though not a lot because there were very few walks issued in the 1870s; Barnes, who stood to gain the most, went from .404 to .429. But in the 1887 season, averages were shaved in some cases as much as 60 or 70 points. Tip O’Neill’s .492 mark (actually .485 when correctly calculated), for one, dropped to .435.
Many baseball historians are still upset by the committee’s 1968 ruling, believing that it affects the historical integrity of the game to act as if today’s rules are better than those of the past. Dennis Bingham convincingly argues that what we want from the past, above all, is an accurate account of what happened, and that by allowing a “special committee” to change a scoring rule of the past, we have in a very real sense changed what occurred. A reasonable counter to Bingham’s argument is that walks should not be accepted as hits (as per 1887) and as times at bat without being hits (as per 1876) because the rules in each case were in existence for only a single aberrant season and consequently were not a significant part of the evolutionary process of the game. A second counter is that the purist argument can itself mislead as to what actually occurred. Statistical achievements were credited in many situations where they would not be today, and taking pre-1898 stats at face value, the astronomical stolen base totals being a prime example, induces modern-day fans to believe something extraordinary happened that really did not. In 1887, for one, six players stole more than 100 bases—an extraordinary number. But if today’s stolen base rule had governed that year, it is doubtful that most of them would have swiped even half as many.
A further point of interest: Had the 1887 walks rule been in effect in 2004, Barry Bonds would have batted a record .607.
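For readers who want to check the arithmetic, here is a minimal sketch in Python of how the two aberrant walk rules move a batting average. The Barnes hit and at-bat figures used below are this sketch’s assumption, chosen to be consistent with the .404 and .429 averages quoted above; Bonds’s 2004 line of 135 hits, 373 at-bats, and 232 walks is as commonly listed.

```python
def avg_modern(hits, at_bats):
    """Modern rule: walks are neither hits nor times at bat."""
    return hits / at_bats

def avg_1876(hits, at_bats, walks):
    """1876 rule: each walk is an extra time at bat but not a hit."""
    return hits / (at_bats + walks)

def avg_1887(hits, at_bats, walks):
    """1887 rule: each walk counts as both a hit and a time at bat."""
    return (hits + walks) / (at_bats + walks)

# Ross Barnes, 1876: 20 walks, per the text. The 138-for-322 line is an
# assumption consistent with the .429 and .404 figures quoted above.
print(round(avg_modern(138, 322), 3))      # 0.429
print(round(avg_1876(138, 322, 20), 3))    # 0.404

# Barry Bonds, 2004: 135 hits in 373 at-bats with 232 walks.
print(round(avg_1887(135, 373, 232), 3))   # 0.607, the figure cited above
```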
There are 15 pitcher records, ranging from number of innings pitched to number of balks committed. In addition, the official scorer is responsible for entering the names of the winning and losing pitchers, the names of the pitchers on both teams starting and finishing the game, the name of the pitcher to be credited with a save (if any), the number of passed balls for all catchers in the game, and the names of players participating in each double and triple play. Time was—and not that long ago—when MLB came up with the splendid idea of rounding off innings pitched so that 200⅓ innings became 200 and 200⅔ became 201. It soon went the way of the GWRBI (game-winning RBI), but at least the GWRBI, though eventually regarded as a meaningless stat because the run-scoring play too often was something like a sacrifice fly in the first inning of a 12–0 game, did not demand retroactive record alterations since it was only in existence a short time. Rounding off innings pitched required changing the career stats for many pitchers in the game during the time it was in existence and, worse yet, created ersatz ERA leaders. A case in point is the 1981 strike season. To qualify for the ERA crown in 1981, a pitcher had to pitch one inning for each of his team’s games played. Normally this would mean 162 innings, but the strike in 1981 had reduced each team’s game totals to just over 100. The winner today would have been Sammy Stewart of Baltimore, with an ERA of 2.323 in 112⅓ innings pitched. Finishing second would have been Steve McCatty of the Oakland A’s, whose ERA was 2.327 in 185⅔ innings pitched. But the rounding of innings pitched rule then in effect declared McCatty the official leader. Stewart’s 112⅓ innings total was rounded down to 112, while McCatty’s 185⅔ total was rounded up to 186. McCatty got credit for an extra third of an inning without allowing a run, and Stewart lost a third of a scoreless inning he had actually pitched. That made McCatty’s final ERA 2.32 and Stewart’s 2.33. The findings were appealed, but the rules committee of the day upheld the result because it conformed to established practice. The rule was changed the next year, probably due principally to this injustice, and fractions of innings were no longer rounded up or down. Hence McCatty is still considered the AL ERA champ in 1981, but at least one source, Baseball-Reference.com, awards the crown to yet a third pitcher: Dave Righetti of the Yankees, because Righetti’s 2.05 ERA was significantly lower than Stewart’s or McCatty’s, even though Righetti worked just 105⅓ innings, short of the number required by Rule 9.22 (b) since the Yankees played 107 games. So go figure.
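A short sketch, using the earned-run and innings totals cited here and again in the later discussion of rounded innings (29 earned runs in 112⅓ innings for Stewart, 48 in 185⅔ for McCatty), shows how the rounding rule flipped the 1981 race.

```python
def era(earned_runs, innings):
    """Standard earned run average: earned runs per nine innings."""
    return 9 * earned_runs / innings

stewart = (29, 112 + 1/3)   # Sammy Stewart, Baltimore: 29 ER in 112 1/3 IP
mccatty = (48, 185 + 2/3)   # Steve McCatty, Oakland: 48 ER in 185 2/3 IP

for name, (er, ip) in (("Stewart", stewart), ("McCatty", mccatty)):
    exact = era(er, ip)
    rounded = era(er, round(ip))   # the 1970-81 practice: whole innings only
    print(name, round(exact, 3), round(rounded, 3))

# Stewart 2.323 2.33   <- exact leader, but rounding pushes him to 2.33
# McCatty 2.327 2.323  <- rounding hands him the official title
```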
Even then, the official scorer’s task is not complete. He or she still has to account for the names of batters who hit home runs with the bases full, along with six other stats. The final two are the length of the game, “with delays deducted for weather, light failure, or technological failure not related to game action” but not for the time spent attending to an injured player, umpire, manager, or coach. And lastly, “Official attendance, as provided by the home club.” But his or her job is still not done, especially if there are extenuating circumstances like the game at hand being either forfeited or officially protested or suspended—or, in rare instances, he or she needs to exercise an official scorer’s authority to rule on any point not specifically covered in the scorer’s rules.
The time required to play a game has always been included in major-league box scores, but a century ago no one paid much attention to how long a game took to play, and official scorers were not compelled to be precise. Most rounded off the time of a game, usually to the nearest five minutes. A game that took an hour and eight minutes would thus be recorded as having lasted an hour and ten minutes. In any event, games that took longer than an hour and a half generally had frequent interruptions for arguments with umpires.
During the 1887 season, when bases on balls were counted as hits and many players went up to bat looking for walks, causing some games to drag on for over two hours, Sporting Life found the development so revolting that it predicted “the public will call a halt (to the new rule) by refusing to attend games.” When club owners also began noticing that fans were leaving their parks in the seventh or eighth inning, they quickly took heed. The new rule was scrapped after only a one-season trial, and the average time of a game again fell to well under two hours.
9.03 Official Score Report (Additional Rules)
And there is yet more to the report for each game—a daunting amount in all. One can only pity an official scorer new to the job who has signed up for it on the assumption it demands little more input than what appears in each game’s box score. About the only task of substance that is left for someone other than the official scorer to provide for a major league game is a play-by-play account of it. The reader is encouraged to compare the job description for today’s official scorers with what it must have been as late as the early 1900s, when RBIs did not yet exist, strikeouts and sacrifice hits were often not kept track of, the rules for ERA and assigning pitchers’ wins and losses were still ill defined, and no particular effort was devoted to Rule 9.03 (c) How to Prove a Box Score, which states: “A box score shall balance (or is proven) when the total of the team’s times at bat, bases on balls received, hit batters, sacrifice bunts, sacrifice flies and batters awarded first base because of interference or obstruction equals the total of that team’s runs, players left on base and the opposing team’s putouts.”
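Rule 9.03 (c) is, in effect, a checksum. A minimal sketch of the balance test, with purely hypothetical team totals, might look like this:

```python
def box_score_balances(at_bats, walks, hit_batters, sac_bunts, sac_flies,
                       interference_awards, runs, left_on_base, opp_putouts):
    """Rule 9.03 (c): a box score 'proves' when the two sides of the ledger match."""
    plate_appearances = (at_bats + walks + hit_batters + sac_bunts
                         + sac_flies + interference_awards)
    accounted_for = runs + left_on_base + opp_putouts
    return plate_appearances == accounted_for

# Hypothetical line for the road team in a nine-inning loss: 34 AB, 3 BB,
# 1 HBP, 1 sacrifice bunt, 1 sacrifice fly, no interference awards, 4 runs,
# 9 men left on base, and 27 putouts recorded by the opposition.
print(box_score_balances(34, 3, 1, 1, 1, 0, 4, 9, 27))  # True: 40 == 40
```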
To many modern fans, these last examples may seem more like something that would occur in a softball game than in a baseball game, but they were chosen for good reason. Against Ernie Lombardi, a catcher who was agonizingly slow and hit murderous top-spin shots that could fell an elephant, teams often played their middle infielders back on the outfield grass, technically creating four- and sometimes even five-man outfields. It was felt that even at that distance a ball Lombardi hit would get to a fielder so fast that there would still be time to retire the slow-footed slugger.
The shift, made famous in 1946 by Cleveland player-manager Lou Boudreau to combat Ted Williams of the Boston Red Sox, packed three infielders on the right side of the diamond and left only third baseman Ken Keltner to the left of second base. However, Boudreau was not the inventor of the “Williams Shift.” A similar alignment had been used by managers as far back as 1922 against lefty pull hitters Cy Williams of the Philadelphia Phillies and Ken Williams of the St. Louis Browns, both of whom thrived on the short right-field porches in their home parks.
Ernie Lombardi’s .306 career batting average is the highest among National Leaguers who appeared in a minimum of 1,000 games as a catcher and played no other positions. Making his achievement all the more remarkable is how few of his 1,792 hits were “leg” hits.
Official scorers at games where the Lombardi alignment or the various Williams shifts were deployed took no cognizance of the defensive alterations in compiling their score reports. The same attitude continues today and is even more pervasive what with the enormous increase in exaggerated shifts for pull hitters and wannabe pull hitters.
Before 1907, American League official scorers did not always credit players with a game played if they only appeared as pinch-runners or defensive replacements. The National League did not demand that these types of substitute appearances be recorded until 1912. As a result, many players who got into games only fleetingly never saw their names in major-league box scores, and some even failed to be included in the early editions of the Macmillan Baseball Encyclopedia, before researchers confirmed their existence.
Mistakes and omissions in early day box scores are still being unearthed. In the early 1990s, historian Dick Thompson established that Ivan “Pete” Bigler, for many years believed to be a “phantom” player whose name appeared in a 1917 St. Louis Browns box score through a typographical error, actually participated on May 6, 1917, in a game at St. Louis that resulted in an 8–4 win for the Browns’ Alan Sothoron over Chicago. During the game in question, Bigler pinch-ran for pinch-hitter Bill Rumler, who had walked. Ironically, it was later discovered that Bigler and Sothoron were not only born in the same town—Bradford, Ohio—but had been teammates at Juniata College in Pennsylvania.
As for the nineteenth century, researchers are continually discovering new or misidentified players. Scarcely a month goes by without a name on the players’ all-time register being changed or a statistic attributed for more than a century to a certain player found actually to belong to a different player.
Today, a player is officially credited with a game played once he legally enters it—even if not a single pitch is thrown after his entry. When none are thrown before the game is stopped for rain or for any other reason, he is considered to be in the game as a batter. If a pitch is thrown before a stoppage occurs, the record reflects that he is in the game on defense.
Although RBIs did not become an official statistic until 1920, many sportswriters kept track of them on an informal basis prior to then. As far back as 1879, a Buffalo paper recorded RBIs in box scores of the Buffalo Bisons’ National League games. In the mid-1880s, Henry Chadwick, the father of baseball writers, urged the inclusion of the RBI feature in all box scores. Finally, by the early 1890s, Chadwick carried his point and National League official scorers grudgingly obeyed instructions to catalog RBIs. But most found it a burden, and the practice was soon abandoned. In 1907, the New York Press revived the RBI, but it did not become an official statistic again until the Baseball Writers’ Association of America (BBWAA) championed its adoption in 1920.
RBI figures for most of the pre-1920 seasons have since been reconstructed from box scores and game accounts, but several American Association seasons are still incomplete, as is the entire 1884 Union Association campaign. In addition, many of the reconstructed figures are guesswork at best. A particularly significant one is the 0 RBI credited to Worcester first baseman Chub Sullivan in 1880 in 166 at-bats, which currently stands as the major league record for the most at bats in a season without an RBI. In October 2005, this author’s research established conclusively that Sullivan had at least one RBI in 1880. On May 14, at Boston, he singled home Art Whitney from third base in the ninth inning after Whitney hit an apparent home run over the left-field fence but missed third base and was allowed to return there safely when umpire Billy McLean contended that Boston pitcher Tommy Bond failed to do “what should be done” to have Whitney ruled out.
Recent research has yielded even more significant findings. In 1961, Baltimore first baseman Jim Gentile had an RBI that went unrecorded at the time, hiking his total for the season to 141. That additional RBI enabled Gentile to tie Roger Maris for the American League RBI crown that season. Lou Gehrig’s AL season record RBI total of 184 in 1931 is also now open to question.
There are six scenarios under which an official scorer is required to award the batter with a base hit, the value of which is determined by seven contingencies. There are also provisions under Rule 9.05 (4) when an official scorer on occasion is obliged to award a hit even though the defensive team does not make an ordinary effort to retire a batter. The most common situation occurs in a tie game with the home team at bat in the bottom of the ninth or in extra innings with a runner on third and less than two out. Sometimes, when a batter hits a long fly ball that will inevitably permit the winning run to tag up and score, the defensive team will make no attempt to catch it. The official scorer customarily awards the batter a single in these cases of defensive indifference.
Probably the most infamous hit that an official scorer was forced to award owing to defensive indifference was on October 3, 1976, when two Kansas City Royals teammates, third baseman George Brett and designated hitter Hal McRae, went down to their final at-bats of the season in a game against the Minnesota Twins, neck and neck in their battle for the American League batting crown.
In the top of the ninth, the Royals trailed the Twins’ Jim Hughes, 5–2, with one out when Brett hit a routine fly ball that dropped in front of Twins left fielder Steve Brye and then bounced over Brye’s head and rolled to the wall. Before Brye could chase down the ball and relay it home, Brett tallied an inside-the-park home run, enabling him to finish the season with a .333 batting average.
McRae then stepped to the plate, also at .333, albeit a fraction of a point lower than Brett’s mark after his inside-the-park home run. When he hit a groundball to shortstop Luis Gomez that was obviously going to result in his being thrown out, McRae angrily shouted at Twins manager Gene Mauch as he ran toward first base. McRae’s wrath swiftly brought Mauch out of the dugout. A fight nearly ensued when McRae (who was African American and Brett Caucasian) accused Mauch of ordering Brye to let Brett’s fly ball drop safely and implied a racial motivation.
Mauch denied that he had coaxed his players to steer the batting crown Brett’s way if the chance arose, but Brett joined in McRae’s grievance when he went on record with a statement that his fly ball definitely should have been caught. Brye later waffled, saying at first that he had misjudged the ball, but eventually he acknowledged that most American League players preferred to see Brett rather than McRae win the batting crown because Brett was a full-time player whereas McRae served as no more than a designated hitter. Whether Jim Hughes was among them is not a matter of record; Brett’s faux homer socked him with an earned run, and later in the inning the Royals brought the tying run to the plate before being retired. In any event, it was Hughes’s final big-league start.
This special comment raises the question of whether it applies with equal force when a no-hitter is in the making, especially in light of the commonly held belief that an official scorer is obliged to follow a rule that the first hit of a game should always be a clean one. Even though there has never been any such rule, there have been several occasions when an official scorer has gone out of his way to label a hit an error, sometimes even after the fact, in order to preserve a no-hitter.
One of the most renowned instances came in St. Louis’s Sportsman’s Park on May 5, 1917, when Browns southpaw Ernie Koob won a 1–0 no-hitter over the Chicago White Sox. The official scorer, John Sheridan, was late that day getting to the park and so missed seeing Buck Weaver’s sharp grounder in the first inning that Browns second sacker Ernie Johnson, filling in for injured Del Pratt, fielded with his chest. Many newspapers the next day called the sizzler a hit as per the wire service report. But when Sheridan had time to digest what was at stake, after the game ended he took a poll of his fellow writers and players on both teams as well before electing to charge Johnson with an ex post facto error on the play. As proof that Sheridan’s decision occurred after the game was in the books, the Chicago Sunday Tribune headline the following day was KOOB TAMES SOX IN ONE HIT GAME, 1–0, evidence that the play was still considered a hit at the time the Tribune went to bed. Even St. Louis Post-Dispatch sportswriter W. J. O’Connor acknowledged the no-hitter was “hardly immaculate . . . it was slightly tainted, stained with doubt at its very incipiency.”
Sheridan’s handling of the situation resulted in a pledge by both Chicago and St. Louis baseball writers to safeguard the game against any more such controversial scorer’s decisions in the future. The pledge was largely forgotten by the following afternoon when Bob Groom of the Browns, after a hitless two-inning relief appearance in the first game of a twin bill, shut out the pennant-bound White Sox 3–0 in the second game and duplicated Koob’s no-hit feat in the process to mark the only time in history that two pitchers on the same team have registered no-nos on successive days.
This introduction explains why, when Rickey Henderson stole 130 bases in 1982 to break the modern record, the “modern” qualifier was needed. In 1886, official scorers were instructed to credit a runner with a stolen base for every extra base he advanced of his own volition. Beginning that year, any time a runner went from first to third on a single or advanced a base on a fly ball he earned a theft that was worth as much as a steal he made on the pitcher. A runner was also credited then with a stolen base even if he ran beyond or overslid the bag he was trying for and was subsequently tagged out.
In 1887, two players in the National League and four in the American Association swiped more than 100 bases, led by Hugh Nicol of the Cincinnati Red Stockings with 138, still the all-time record. There is no way now of determining how many of Nicol’s steals fit the current definition, but it is probably significant that he averaged more than one a game and had more thefts than hits. In 125 contests, Nicol collected just 102 hits and posted a .215 batting average, lending considerable weight to the theory that he garnered a lot of steals via the old standard since he had relatively few baserunning opportunities.
In 1892, a proviso was added to the stolen base rule, spelling out that a theft would only be credited to a runner if there was either a possible chance or a palpable effort made to retire him. Eliminated were instances where a runner moved up a base on a fly ball too deep for an outfielder even to make a throw, or where a runner went from first to third on a hit into the gap while the batter loafed to a single. But there was still a lack of uniformity among official scorers. Some continued to bestow a stolen base whenever a runner hustled, while others went by the letter of the rule. As a result, the 1892 proviso was dropped before the 1897 season.
Finally, in 1898, the modern stolen base rule was adopted, removing credit for any extra bases advanced on a batted ball. Where in 1887, a runner scoring from first base on a single would have been credited with two stolen bases—third and home—now he is credited with none.
Before 1909, a runner earned a theft even if a teammate at the front or the back end of an attempted double or triple steal was nabbed.
Likewise, when a runner becomes entangled with a fielder after successfully stealing a base but fails to maintain contact with the base in disentangling himself and is tagged while off it, the stolen base is eradicated. If, however, the runner occupies the base and then steps off it of his own accord and is tagged by a fielder essaying to pull a hidden ball trick, he receives credit for the steal.
Until 1920, a runner could be awarded a stolen base when a defensive team was indifferent to his advance. An important point here: Making no palpable effort to nail a runner attempting to steal is not automatically characterized as indifference. Former Orioles catcher Gus Triandos, the only documented player with a perfect 1.000 steal percentage in over 1,000 games, profited from this rule. Triandos’s only career steal attempt came in the ninth inning of the second game of 1958’s season-ending doubleheader at New York on September 28 between Baltimore and the Yankees. Triandos, a right-handed hitter, batted for lefty swinger Joe Ginsberg, and singled off reliever Zach Monroe. His steal attempt was successful when the Yankees were so surprised that no throw was made. But the official scorer ruled correctly that a theft should be awarded because the situation warranted a throw. The Yankees were only ahead, 6–3, at the time and had yet to record an out in the inning.
Having to judge whether a batter is trying to sacrifice or bunting for a base hit has been a problem for official scorers ever since 1894, when sacrifice bunts first became an official statistic and a rule was adopted not to charge a player with a time at bat on a successful sacrifice. Because the note to Rule 9.08 (a) says the batter must be given the benefit of the doubt, many times an official scorer has no option but to award a sacrifice, even though the game situation plainly indicates a batter ought to have been hitting away.
In Game 38 of Joe DiMaggio’s 56-game hitting streak in 1941, on June 26 at New York, with the Yankees ahead of the visiting Browns by a narrow 3–1 margin, Tommy Henrich was reluctantly credited with a sacrifice when he bunted with one out in the bottom of the eighth and Red Rolfe occupying first to stay out of a potential double play so that DiMaggio, hitless on the day, would be assured of getting one more at bat. It paid off when DiMaggio laced a two-out double off Browns starter Elden Auker to keep his streak alive. Clearly, with New York ahead, 3–1, and one out already recorded, Henrich sensibly would not be sacrificing himself but trying to ignite a rally in a game still so close, and one might question whether the Brownies’ scorer would have been so generous to Henrich had the game been played in St. Louis.
Tommy Henrich, the perpetrator of a dubious sacrifice hit that assured Yankees teammate Joe DiMaggio of a final opportunity in Game 38 to extend what eventually became his record 56-game hitting streak in 1941.
Rule 9.08 (d) has a mercurial history. The 1908 season was the first in which a player was not assessed with a time at bat if he advanced a teammate at least one base with a fly ball. No distinction was made, however, between sacrifice flies and sacrifice bunts until 1920, when the RBI was made an official statistic and it became important to determine how many RBIs were the result of fly ball outs. In 1931, a decision was made to eliminate the sacrifice fly rule and charge a player with a time at bat. Batting averages dropped accordingly, but stayed so high during the 1930s that the rule change was little noticed.
The sacrifice fly was revived on an experimental basis in 1939, but only in cases where a fly out scored a runner. For reasons that are impossible to fathom now, the experiment lasted only one season. In 1940, the sacrifice fly was again abolished and the rule remained dormant until 1954, when it was once more hauled out of mothballs and given another trial. Since then, the rule has endured with only one significant change. In 1974, an addendum made it clear that a sacrifice fly should be credited if a batter hit a fly ball that brought home a run after it was caught by an infielder running into the outfield to chase a pop fly. Many official scorers had already been giving sacrifice flies in such cases, as well as on foul flies caught by infielders that enabled a runner to score. Among the leading victims while sacrifice flies were nonexistent between 1931 and 1953 were Stan Musial, Joe DiMaggio, Luke Appling, and especially Ted Williams. All lost points off their batting averages during many of their prime years, and in Williams’s case even a two-point increase in his career batting average would advance him from a tie for seventh to fourth place on the all-time list. But perhaps the greatest casualties were Johnny Mize and Hank Greenberg, two legendary sluggers who were qualifiers in only one season that included sacrifice flies (1939).
9.09 Putouts and 9.10 Assists
For the most part, the rules for crediting fielders with putouts and assists have always been very similar to what they are now, but there have been some notable differences. The 1878 season was the first in which an official scorer was authorized to credit an assist to a fielder if a batted or thrown ball bounced off him to another fielder, who then made an assist or a putout. Until the late 1880s, pitchers were often erroneously given an assist every time they struck out a batter. Before 1931, a pitcher was also given an assist if a catcher snared a pitched ball in time to nail a runner trying to steal home. That same year the rule was first drafted to credit a putout to the fielder closest to the play when a runner is hit by a batted ball.
The first year the sacrifice fly was permanently resurrected as a stat, the Dodgers’ Gil Hodges set a record that still stands for the most in a season, with 19 in 1954. Largely owing to the sac fly rebirth, Hodges batted .304 that year, a career high. Had the sac fly been recognized as a stat 13 years earlier, Ted Williams would have hit .413 in 1941 rather than a mere .406.
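The effect of the rule on a batting average is simple arithmetic. A brief sketch, using Williams’s commonly listed 1941 line of 185 hits in 456 official at-bats and treating the number of sacrifice flies as an illustrative assumption, shows how the .406 figure climbs:

```python
def batting_average(hits, at_bats):
    return hits / at_bats

# Ted Williams, 1941: 185 hits in 456 official at-bats under the no-sac-fly rule.
print(round(batting_average(185, 456), 3))              # 0.406

# Fly balls rescored as sacrifice flies come off the denominator. The .413
# figure cited above implies roughly eight such flies (an inference, not an
# official count).
sac_flies = 8
print(round(batting_average(185, 456 - sac_flies), 3))  # 0.413
```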
One of the new scoring rules for 1977 mandated giving an assist to any fielder who made a play in time to retire a batter, even if a subsequent error by another fielder prevented the out from being recorded. This rule was an ancestor of the current rule to the same purpose, and a similar logic had long before allowed official scorers in the nineteenth century to credit assists to pitchers even on strikeouts that went awry. Interestingly, however, while the pitcher might get an assist on such a play, he would not always receive credit for a strikeout. The most remarkable example of this quixotic rule in action occurred on July 7, 1884, in a Union Association fray between Boston and Chicago. In the box for Chicago that day, with everything working for him, was Hugh “One Arm” Daily. Daily not only shut down the Boston Unions, 5–0, on just one hit, a three-bagger by Beantown catcher Ed Crane, but he fanned 19 batters to tie Charlie Sweeney’s then-existing major-league record, set only a month to the day earlier. In actuality, Daily should have had 20 strikeouts. One was lost to a third-strike passed ball by Chicago receiver Bill Krieg. The Union Association, in its lone year as a major league, refused to credit a pitcher with a whiff unless the batter was retired. On another missed third strike in Daily’s dream game, Krieg managed to toss out the batter at first in the nick of time. Krieg thus had 18 putouts on the day and one assist. Meanwhile, Daily notched 22 assists—all but two of them on strikeouts or, in one instance, a miscarried strikeout.
Since 1889, a third strike has been scored as a strikeout but not an assist in every league even if it results in a wild pitch or a passed ball that permits a batter to get to first base.
Even though it has never happened in a major-league game, the reader can infer from this rule that it is possible for a fielder, particularly a middle infielder, to be credited with all three of his team’s putouts in an inning without ever touching the ball. Here is but one way that can occur and hold a team scoreless despite it getting five hits in the inning: Batter A singles; Batter B singles Batter A to second; Batter C grounds a ball that hits Batter A, with the shortstop getting the putout, and Batter B moves to second while Batter C is credited with a single; Batters D and E follow with almost identical ground balls toward short that each hit the lead baserunner. The shortstop thereby notches three putouts while his pitcher logs a 5.00 WHIP for his work in the scoreless inning. By the way, the maximum number of hits a team can collect in an inning without scoring is six.
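A small sketch of the bookkeeping for that hypothetical inning (the batter names and simplified event list are this sketch’s invention) confirms the oddity: three shortstop putouts without a fielding chance, five hits, no runs, and a 5.00 WHIP for the frame.

```python
# Events in the hypothetical inning described above. Each runner struck by a
# batted ball is out, the putout goes to the nearest fielder (here, the
# shortstop), and the batter is credited with a single.
events = [
    {"batter": "A", "result": "single"},
    {"batter": "B", "result": "single"},
    {"batter": "C", "result": "single", "runner_hit": "A"},  # A out; shortstop putout
    {"batter": "D", "result": "single", "runner_hit": "B"},  # B out; shortstop putout
    {"batter": "E", "result": "single", "runner_hit": "C"},  # C out; shortstop putout
]

hits = sum(1 for e in events if e["result"] == "single")
shortstop_putouts = sum(1 for e in events if "runner_hit" in e)
walks, runs = 0, 0
innings_pitched = shortstop_putouts / 3          # three outs make one inning
whip = (hits + walks) / innings_pitched          # walks plus hits per inning pitched

print(hits, shortstop_putouts, runs)  # 5 3 0
print(whip)                           # 5.0
```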
Although it has yet to occur on the major-league level, a pitcher could conceivably toss a perfect game in which his team made an unlimited number of errors. Rule 9.12 (a) (2) explains how it can be done: every muffed foul fly that extends a batter’s turn at bat is an error regardless of whether or not the batter subsequently reaches base. Meanwhile, a number of major-league perfectos have been destroyed solely by errors. The ultimate in absurdity occurred to Jim Galvin, ironically the author of the first documented professional perfecto in history in an August 17, 1876, contest between his St. Louis Reds and the independent Cass club of Detroit at a tournament between pro and semipro teams in Ionia, Michigan.
In a National League game at Buffalo on August 20, 1880, Galvin topped Worcester’s Fred Corey, 1–0, on a muddy field despite having six errors made behind him—the most ever by a team behind a pitcher who was otherwise perfect. Galvin had to overcome two boots each by second baseman Davy Force and third baseman Dan Stearns, plus a dropped throw by first baseman Dude Esterbrook and a fumbled grounder by shortstop Mike Moynahan. Esterbrook atoned for his miscue by tripling home outfielder Joe Hornung for the game’s lone run. Because of the swarm of enemy baserunners, few at the game even noted that Galvin had hurled a no-hitter, the first ever by a pitcher on a visiting team.
The 1904 season was the first in which an official scorer was licensed to charge a fielder with an error for failing to cover a base. Previously, the error had always been given to the fielder who threw the ball—i.e., the catcher on a steal attempt—even when the throw would have been on target if a teammate had been where he was supposed to be.
Many followers of the game believe there should also be a rule that charges a team with an error rather than an individual player—or in some cases no one at all—in situations where an error of omission or an error in judgment occurs, such as when two or more fielders allow a pop fly to drop untouched between them. For one year—and one year only—there was such a rule, instigated by representatives of the AA’s St. Louis Browns. In 1888, all batted balls that allowed a player to reach base safely but were neither hits nor errors that could be justifiably assigned to a fielder were deemed “unaccepted chances.” In part because the new category could not be conveniently fit into box scores, it was dropped after the 1888 season at the same winter meeting where four balls and three strikes were permanently set (but there is no reason it could not be resuscitated). Every pitcher who has been charged with an earned run owing to a play that should have been made but could not be labeled an error will agree that the notion of an unaccepted chance is perfectly logical.
As for assists, Rule 9.10 (b) says:
Early on, box scores often showed pitchers with an inordinate number of assists. The natural assumption was that opposing hitters were bunting on them a lot, typically because they were poor fielders (a la Tom Ramsey, who posted a glamorous .761 fielding average in his six-year pitching career). But, as has previously been discussed, the high assist totals generally came from official scorers who improperly awarded pitchers an assist any time they recorded a strikeout, and from some who credited an assist only if the batter who fanned had to be thrown out at first by the catcher. Contingency (1) under Rule 9.10 (b) has never been witnessed by this author except in a Little League game.
Just twice has this author witnessed a fielder deliberately let a foul fly fall uncaught to prevent a runner on third with less than two out from tagging up and scoring; one occasion was in a high school game and the other in a men’s senior league game. But it does happen now and again in major-league games, and then often because an alert teammate has shouted to the fielder within range of the ball, “Let it drop!” or something to that effect. One outfielder who had the presence of mind on his own to let a foul fly drop was Cards left fielder Matt Holliday in the top of the 12th inning of a game against the Brewers on April 28, 2014, at Busch Stadium III. With the score tied, 3–3, and Milwaukee’s Jonathan Lucroy on third base with one out, Holliday purposely let Khris Davis’s deep fly to left drop untouched in foul territory. His heads-up play went for naught when Davis then tripled to right to score Lucroy and Milwaukee ultimately won, 5–3.
Under Rule 9.12 (a) (1) Comment there is this paragraph:
“The official scorer shall not score mental mistakes or misjudgments as errors unless a specific rule prescribes otherwise. A fielder’s mental mistake that leads to a physical misplay—such as throwing the ball into the stands or rolling the ball to the pitcher’s mound, mistakenly believing there to be three outs, and thereby allowing a runner or runners to advance—shall not be considered a mental mistake for purposes of this rule and the official scorer shall charge a fielder committing such a mistake with an error. The official scorer shall not charge an error if the pitcher fails to cover first base on a play, thereby allowing a batter-runner to reach first base safely. The official scorer shall not charge an error to a fielder who incorrectly throws to the wrong base on a play.”
Yes, players—even big leaguers—lose track of outs and throw balls into the stands, intending to give a lucky fan a souvenir. If the bases are empty when it happens, there is no problem. With runners on, however, the ball is immediately dead and they each advance two bases. On June 16, 2016, in a night game at Philadelphia’s Citizens Bank Park between Toronto and the Phillies, Phillies center fielder Odubel Herrera, mistakenly thinking there were three outs after catching a fly ball off the bat of the Blue Jays’ Michael Saunders, trotted out to the stands and threw the ball into them, helping to turn a one-run inning into a four-run inning. It mattered little since the Jays won, 13–2, but a similar humiliating gaffe by Montreal’s Larry Walker on April 24, 1994, was instrumental in an Expos loss to the Dodgers in a Sunday night game at LA.
In the second inning with one out and Jose Offerman on first, Walker caught Mike Piazza’s fly ball to deep right and casually flipped it into the stands. Offerman ended up on third base and the Dodgers posted two runs in the frame when Tim Wallach, the next batter, homered off rattled Expos pitcher Pedro Martinez. The Dodgers went on to win the game, 7–1. There are numerous other examples of this type of brain lock that we could offer, but practically all of them, curiously (or perhaps not), are from seasons since the 1994 strike. We all would give plenty to know the first time such a howler occurred in a major-league game, but there is no doubt it rarely, if ever, happened in the days when fans were all but arrested if they did not return balls hit into the stands to the playing field. Today’s players are actually encouraged to provide spectators with souvenir balls at every opportunity. Their efforts, in particular those by first and third basemen recording an inning-ending out, have been thwarted by the new major-league park rules requiring a solid stretch of netting to protect spectators along the first- and third-base lines from foul shots into the stands. Instead of simply tossing a ball to someone who catches their fancy, they now have to lob it over the netting and put it up for grabs.
In the game’s early years, scorers would often charge pitchers and catchers with errors in addition to wild pitches and passed balls, respectively. Now, the official scorer needs only to account for each extra base a runner takes. A passed ball or a wild pitch is explanation enough.
Earned run averages for National League pitchers were calculated as early as 1876; that season, Louisville’s Jim Devlin—yes, the same Jim Devlin who was banned after the 1877 season for taking bribes to throw games—allowed 309 runs but only about a third of them (109) were earned. After that season, there was a lapse of six years before the National League began calculating them again in 1883. Meanwhile, the upstart American Association kept track of pitcher ERAs for its entire 10-year span as a major league (1882–91). But it was not until 1912 that the National League made calculating pitchers’ ERAs a permanent fixture. The following season the American League began doing likewise annually. At that time, an earned run was assessed to a pitcher every time a player scored by the aid of base hits, sacrifice hits, walks, hit batters, wild pitches, and balks before enough fielding chances had been offered a defensive team to record three outs. In 1917, stolen bases were added to the list of permissible aids to the scoring of an earned run.
Before the 1931 season, runners who reached base on catcher’s interference were added to passed balls and the other types of miscues that exempted a pitcher from being charged with an earned run. The 1931 rule reemphasized, however, that a run emanating from a batter who reached first on a wild pitch third strike was earned—even though the pitcher was charged with an error on the play—as the wild pitch was solely the pitcher’s fault. All other errors that allowed a batter to reach base or prolonged his turn at bat excused the pitcher from being saddled with an earned run, including errors committed by him.
In most cases, pitchers’ ERAs prior to the 1912 season have been reconstructed by dedicated researchers after laboriously poring over old box scores and game accounts. In some cases, a statistic that was computed well over a century ago was taken as fact for the lack of any method for verification. No one can really be sure now that Tim Keefe, by present scoring rules, really posted an all-time record-low 0.86 ERA by a qualifier in 1880 with Troy. It seems hard to imagine that Keefe could have allowed only 10 earned runs in 105 innings and yet lose six of his twelve decisions, whereas Troy’s other Hall of Fame pitcher, Mickey Welch, won more than half his games despite a 2.54 ERA that was 0.17 runs above the National League average that season.
Even Dutch Leonard’s twentieth-century record-low 0.96 ERA in 1914 has been scaled downward from 1.01 relatively recently as new information has come to light revealing that Leonard hurled 224⅔ innings that season rather than 222⅔, the total with which he was credited for well over half a century. Indeed, the career and single-season ERAs for almost every pitcher active prior to 1920 have undergone some adjustments since the first Macmillan baseball encyclopedia appeared in 1969.
Incidentally, the career and single-season ERAs for even some contemporary pitchers differ from one record book to another. Tommy John is an example. The final edition of the Macmillan encyclopedia in 1996 lists John with 4708⅓ career innings pitched and ERAs of 2.97 in 1979 and 2.64 in 1981. However, Baseball-Reference.com presently credits John with 4710⅓ career innings and ERAs of 2.96 in 1979 and 2.63 in 1981.
A mistake on either’s part? Not at all. To simplify the math work, for the 1970 through 1981 seasons major-league statisticians were unwisely directed to round off the innings a pitcher worked to the nearest whole inning, only to revert in 1982 to the original rule that counted each third of an inning. Baseball-Reference.com incorporates all the thirds of an inning John had lost to the new ruling in his career and single-season stats, whereas the final Macmillan edition continued to deduct a third of an inning from John’s stats in six different seasons. This sort of discrepancy has now been resolved for all pitchers from John’s era. But as has already been discussed, another quirk that developed as a result of the rule to round off innings pitched at the end of the season cost Baltimore’s Sammy Stewart the AL ERA crown in the strike-shortened 1981 campaign. Stewart allowed 29 earned runs in 112⅓ innings for a 2.323 ERA, while Oakland’s Steve McCatty was touched for 48 earned runs in 185⅔ innings for a 2.327 ERA. When the innings were rounded off to the nearest whole number, however, McCatty won, 2.32 to 2.33.
One oddity of the formula in Rule 9.12 (g) for determining when to charge a relief pitcher with an earned run is that it makes it possible for a reliever to be assessed an earned run for a tally that is unearned in his team’s totals. If, say, a reliever enters in an inning when all the runs his predecessor allowed are unearned because of errors, and then promptly gives up a home run to the first batter he faces, the dinger is an earned run charged to his account, but an unearned run to the team’s. Consequently, in many instances a team will have a season ERA that is lower than the aggregate season ERA totals of its pitchers.
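A toy example, with all numbers hypothetical, shows the two ledgers diverging: the run is earned in the reliever’s personal account but unearned in the team’s, and over a season such runs leave the team ERA below the staff’s aggregate.

```python
def era(earned_runs, innings):
    """Earned run average: earned runs per nine innings."""
    return 9 * earned_runs / innings

# Hypothetical inning: two errors should already have ended it when the
# reliever enters, so when the inning is reconstructed without the misplays,
# the home run he serves up scores a run that is unearned for the TEAM.
# The reliever, charged only with the batters he himself faces, is still
# tagged with an earned run.
reliever_earned_runs, team_earned_runs_from_inning = 1, 0
print(reliever_earned_runs, team_earned_runs_from_inning)   # 1 0

# Over a season, runs like this one make the pitchers' combined earned-run
# count exceed the team's, so the team ERA lands below the staff aggregate.
team_er, staff_er, innings = 600, 612, 1450.0   # illustrative season totals
print(round(era(team_er, innings), 2))    # 3.72  (team ERA)
print(round(era(staff_er, innings), 2))   # 3.80  (aggregate of the individual pitchers)
```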
Rule 9.17 and its five contingencies have recently replaced Rule 10.19, which some authorities feel was more straightforward and simpler to grasp. In any case, the 1950 rule book was the first to formalize what previously had only been a custom not to award a starting pitcher a victory unless he worked at least five innings in all games that went six or more innings. Before then, exceptions had occasionally been made, especially when a pitcher had to be removed after being injured. One of the most flagrant exceptions occurred in the 1924 World Series, when Giants starter Hugh McQuillan was awarded the win in Game Three even though he worked only 3⅔ innings because his club led Washington, 3–2, when he departed in favor of the more deserving Rosy Ryan, who would have garnered the Giants’ 6–4 win as per the 1950 rule change. McQuillan pitched again in relief twice later in the Series, negating the possibility that he suffered a disabling injury in his lone start.
In the nineteenth century, when teams often had only one standout hurler, pitching aces were frequently removed whenever they held a seemingly insurmountable lead. But removing a pitcher prior to 1889 meant having him swap positions with another player, who would then finish the game in the box. Sometimes a pitcher would be lifted as early as the second inning and sent to first base or right field on the premise that he could always be brought back into the box if the complexion of the game changed. No thought was given then to denying a hurler a win in such cases. In reviewing old records, however, some researchers have taken it upon themselves to deduct victories whenever a starting pitcher made only a token appearance.
One example where an effort has been made to rewrite history resulted in Providence’s Charley Radbourn now being listed in record books with only 59 wins in 1884 rather than 60, an all-time mark that for over a century went pretty much unchallenged. But closer examination reveals that the win in question, in a game on July 28 at Philadelphia, was, via the rule as of 1950, properly assigned to the starting pitcher Cyclone Miller, who trailed, 4–3, when he left the box after the fifth inning but led, 7–4, when Providence scored four runs in the top of the sixth, after which Radbourn came in to pitch four scoreless frames to produce an 11–4 Providence win. The issue remains, though, that in many other similar games throughout pre-1950 baseball history, the wins were given to the winning team’s most deserving pitcher, not necessarily the starter—even when he left the game with his club ahead. An even larger issue is that pitching wins and losses were only an informally kept statistic, as was a pitcher’s ERA, in the game’s embryonic years. When Cy Young finally took off his pitching toe plate for the last time in 1911, there were no headlines that proclaimed, “Young Departs with 511 Wins.” His total, in fact, has been revised several times over the years.
Formerly Rule 10.17 (c), Rule 9.17 (c) can lead to some bizarre scoring decisions and, in at least one instance, to a decision that affected a Hall of Fame reliever’s career record saves total.
In a night game at Camden Yards on September 12, 2013, between Baltimore and the Yankees, with his club ahead, 5–2, Yankees manager Joe Girardi called on setup man David Robertson to pitch the bottom of the eighth inning. Robertson retired the first two Orioles hitters he faced, but then yielded two successive singles followed by a three-run homer by Danny Valencia to tie the game at 5–5. Despite next serving up a double to J. J. Hardy, Robertson escaped the frame with the score still knotted. After the Yankees eked out a run in the top of the ninth, Yankees relief kingpin Mariano Rivera came on in the bottom of the ninth to seal a 6–5 win for his club by setting the Orioles down 1-2-3 for what virtually everyone expected would be his 653rd career save, embellishing his already record total.
But instead of charging Robertson with a blown save and crediting him with the win, since he was still in the game when the Yankees went ahead, 6–5, in the ninth, the official scorer, Mark Jacobson, awarded the victory to Rivera, thus denying him what would have been his league-leading 44th save of the season. Jacobson’s decision was correct on all counts but is one that is nonetheless not often seen, as Rule 9.17 (c) is perhaps more inconsistently applied by official scorers than any other. It gave Rivera his 82nd and final career win, but froze his record save total at 652 since he retired after the 2013 season. When and if someone approaches Rivera’s all-time saves record, it will be interesting to see how vividly this game, his last win instead of his last save, is remembered.
Here is an illustration of a more commonly seen application of Rule 9.17 (c). On August 16, 2004, at Arizona’s Bank One Ballpark (now known as Chase Field), Pirates reliever Jose Mesa came on in the bottom of the ninth to protect a 7–3 lead that would have meant a win for starter Sean Burnett (had Mesa been successful). Instead, Mesa gave up five hits and four runs to send the game into extra innings. Pittsburgh tallied a run in the top of the 10th and Mike Gonzalez held the Diamondbacks scoreless in the bottom half. If Mark Jacobson had been the official scorer for this game, Gonzalez probably would have gotten the win, but Diamondbacks scorer Rodney Johnson awarded it to Mesa instead, even though he was colossally ineffective in the single inning he worked, and Gonzalez was given his only save of the season.
Rule 9.17 (d), at a glance, seems so clear that it requires no discussion. Yet until the twentieth century was well underway it was not unusual for an official scorer to saddle a starting pitcher with a defeat when his team lost, even if he left the game with his club leading. Likewise, a pitcher sometimes would collar a win in a game that he left while his team was trailing. In a 1912 meeting on April 20 at the Polo Grounds between the Giants and Dodgers, Giants rookie Jeff Tesreau got one such victory that seemed insignificant at the time but turned out to be of monumental importance. Tesreau was relieved by Rube Marquard in the top of the ninth inning of a game in which the Giants were trailing, 3–2, after he gave up three runs in the ninth. Marquard retired the only batter he faced to end the Brooklyn rally. When the Giants tallied two runs in the bottom of the frame off Nap Rucker in relief of Eddie Stack, the official scorer put a “W” beside Tesreau’s name. Nowadays, the win would go to Marquard, who was in the game when the winning run scored. Had Marquard garnered that extra victory, it would have enabled him to launch the 1912 season by winning his 20th straight decision. Instead, he had to settle for 19 straight wins, tying the then all-time record set by Tim Keefe in 1888 rather than establishing a new standard.
Had Rule 10.19 (a) been on the books in 1912, Rube Marquard would have begun the season with 20 straight wins. As it is, his 19–0 start is the best in history.
There is a strong temptation now among some baseball historians to correct these apparent injustices. As a result, the career won and lost totals of many pitchers, including not only Cy Young but several other Hall of Famers, have been revised in the past four decades. It becomes almost a matter of personal taste whether Tom Hughes had 16 wins for the Boston Braves in 1915 (as per the final 1996 Macmillan edition and Baseball-Reference.com) or 20 wins (as per the 1982 Macmillan edition). Hughes is by no means an extreme example. In any event, many decisions made by official scorers over a span of some fifty years have since been rescinded in an effort to bring a historical uniformity to all records.
This rule might almost have been tailored to account for Ernie Shore, the perpetrator of the greatest one-game relief stint in history. On June 23, 1917, in the first game of a doubleheader at Fenway Park, Shore relieved Red Sox starter Babe Ruth after Ruth was booted by home-plate umpire Brick Owens for arguing a ball four call to Washington Senators leadoff hitter Ray Morgan. Enraged, Ruth charged Owens and threw a punch at him before he could be hauled off the field by a policeman. Had Ruth’s attempted haymaker landed, the course of baseball history might have been permanently altered. Sox catcher Pinch Thomas was also tossed for objecting to the call. Thomas’s replacement, Sam Agnew, gunned down Morgan trying to steal second, and Shore then retired the next 26 batters in a row and for over half a century received credit for both a shutout and a perfect game.
Shore’s perfect-game honor is still the subject of controversy. Most historians now agree that no pitcher can earn a perfecto in a game where an opposing runner has reached base safely, or a complete game for that matter when he was not the starting pitcher. But Shore’s combined no-hitter and shutout remain firm according to the rule book.
On May 31, 1988, Yankees reliever Neil Allen collected a whitewash in a similar manner when he blanked the A's, 5–0, at Oakland. Al Leiter started for the Yanks and on his first pitch was struck on the left wrist by Carney Lansford's smash, which resulted in a double. After Leiter was removed from the game, Allen was given all the time he needed to warm up, as per the injury rule. He then proceeded to toss nine innings of three-hit scoreless ball to receive credit for a shutout but not a complete game. Allen's 1988 stats reflect that he had no complete games but one shutout. It appears to be a misprint until Allen's game log for the 1988 season is examined.
Until 1969, the term save was not even an official part of the game's lexicon. That season, major league rulemakers—at the urging of sportswriter Jerome Holtzman—first gave formal acknowledgment to a facet of relief pitching that many publications, The Sporting News among them, had long since championed. But what The Sporting News deemed a save and what is now considered a save are not at all the same. In 1967, for example, The Sporting News named right-hander Minnie Rojas of the California Angels the American League “Fireman of the Year” for his 22 saves, whereas Ted Abernathy of the Chicago Cubs got the National League trophy for netting 26 saves.
Most record books now list Rojas with 27 saves in 1967 and Abernathy with 28. The reason for the disparity is that The Sporting News granted a save only when a reliever faced the tying or lead run during his mound stint or began the final inning with no more than a two-run lead and then pitched a perfect inning. In contrast, both major leagues awarded a save in 1969, the year the term got its official baptism, if a reliever merely entered the game with his team in front and held the lead for the remainder of the game. What this meant is that, by The Sporting News's definition, a reliever in 1967 who came into a game with his team ahead 4–1 and worked three perfect innings would not get a save, whereas the major-league rule in 1969 gave a save to a reliever who worked just the final inning of a 10–0 blowout.
Over the years, these inequities have been eliminated. In 1973, the rule was amended to give a reliever a save if he either found the potential tying or winning run on base or at the plate during his stint or else worked at least three effective innings. Two years later, the current save rule was adopted.
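For readers who want the contrast spelled out, here is a minimal sketch in Python of the 1967 Sporting News standard versus the original 1969 rule, applied to the two hypothetical relievers described above. The function names and inputs are simplifications of my own, not official scoring definitions, and the later amendments are noted only in a comment.

```python
def tsn_1967_save(faced_tying_or_lead_run, began_final_inning, lead_at_that_point, perfect_inning):
    """Sketch of the 1967 Sporting News standard described above: the reliever
    must have faced the tying or lead run at some point in his stint, or have
    begun the final inning with a lead of no more than two runs and retired
    the side in order."""
    return faced_tying_or_lead_run or (
        began_final_inning and lead_at_that_point <= 2 and perfect_inning
    )

def mlb_1969_save(entered_with_lead, held_lead_rest_of_game):
    """Sketch of the original 1969 major-league rule: merely enter with the
    lead and hold it for the remainder of the game. (The 1973 and 1975
    amendments added the tougher conditions discussed above.)"""
    return entered_with_lead and held_lead_rest_of_game

# Reliever enters up 4-1 and works three perfect innings to end the game:
print(tsn_1967_save(False, True, 3, True), mlb_1969_save(True, True))   # False True

# Reliever works only the ninth inning of a 10-0 blowout:
print(tsn_1967_save(False, True, 10, True), mlb_1969_save(True, True))  # False True
```

Both hypothetical relievers come up empty under the 1967 standard and earn saves under the 1969 rule, which is the whole of the disparity described above.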
In determining retrospective saves for pitchers active before the concept was born, researchers applied the 1969 rule. Hence many early day pitchers are now credited with specious saves. An unfortunate example is Ted Conover, a one-game major leaguer with a career 13.50 ERA, who earned a retrospective save some eighty years later for his effort on May 7, 1889, when he replaced Cincinnati rookie starter Jesse Duryea in the eighth inning of a 16–4 Cincinnati blowout win over last-place Louisville and surrendered all four Louisville runs in his two-inning stint. Why was Conover called on at all by Reds manager Gus Schmelz? Probably to avoid overtaxing Duryea's arm, but to little avail. As it was, Duryea not only lost his chance for a shutout but worked 401 innings in his frosh season and won 32 games. He won only 27 more in his brief five-season career.
When center fielder Stevie Wilkerson took the hill at Anaheim for the pitching-depleted Orioles on July 25, 2019, in the bottom of the 16th inning and notched a save in Baltimore's 10–8 win over the Angels, the media made much of his being the first position player ever to record a save. That may be true of the period since the save became an official stat in 1969, but it is otherwise untrue. In 1883, Cap Anson finished two games in the box for Chicago and collected a retrospective save in one of them. But unlike Ted Conover's save six years later, Anson's emanated from a solid performance. In two relief appearances that year, he logged a 0.67 WHIP. Two years later, Anson's teammate, second baseman Fred Pfeffer, bagged two saves and two wins in five appearances. Prior to 1887, when the pitching rules tightened, making it problematic for a non-pitcher to enter the box and throw with whatever delivery he pleased, Anson frequently used his position players in relief or even as spot starters. Wilkerson's save may be the last of its kind if MLB's roster-size increase to 26 in 2020 spares teams from having to use position players as pitchers in crucial situations.
Before each major league had an official statistician, tabulation errors were often made that resulted in the wrong players being awarded batting titles, stolen base crowns, etc. The most blatant error was perpetrated by an unidentified statistician, probably from Philadelphia, in 1884. From the data furnished by American Association officials at the close of that season, the loop batting crown belonged to Philadelphia A’s first baseman Harry Stovey with a towering .404 average. Stovey’s heady figure stood uncontested for a century until researchers in the 1980s carefully scrutinized the 1884 AA season and discovered that his true mark was .326 and the real winner of that season’s batting crown was Dave Orr with a .354 average. A blunder so gargantuan seems as if it must have been perpetrated deliberately to steer the honor to Stovey, one of the AA’s most popular and highly esteemed players, rather than Orr, a nonentity at the time.
In 1901, at the close of the American League's inaugural season as a major league, Nap Lajoie was awarded the loop's fledgling batting title with a .422 average on 220 hits in 543 at-bats. A statistician noticed in 1918 that 220 hits in 543 at-bats produced only a .405 mark, and all the record books then reduced Lajoie's 1901 average to the lower figure. Following a story on Lajoie in The Sporting News in 1953, attention was again drawn to his 1901 season. The official American League records for that year had long since been destroyed, but baseball historian John Tattersall's examination of the 1901 box scores unearthed 229 hits for Lajoie in 543 at-bats. Tattersall's research restored Lajoie's average to .422, where it remained until another search through the 1901 box scores in the 1980s confirmed that Lajoie had actually collected 232 hits in 544 at-bats that year, for a twentieth century–record .426 average. Among the three newly discovered hits were a triple and a home run that also raised Lajoie's slugging percentage to .643, a 13-point hike.
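The arithmetic behind each revision is easy to verify; the quick Python check below simply divides the hit totals by the at-bat totals cited above and rounds to the customary three places.

```python
# Batting average = hits / at-bats, conventionally rounded to three places.
for hits, at_bats in [(220, 543), (229, 543), (232, 544)]:
    print(f"{hits} for {at_bats}: {hits / at_bats:.3f}")
# 220 for 543: 0.405  (the figure the record books adopted in 1918)
# 229 for 543: 0.422  (Tattersall's restoration)
# 232 for 544: 0.426  (the total unearthed in the 1980s)
```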
Is it now safe to assume that at least all the serious discrepancies in batting, pitching, and fielding records in the so-called post-1900 “Modern Era” have been eliminated? Far from it. In fact, Nap Lajoie's .426 batting average in 1901 has yet to gain universal acceptance, and several other significant discrepancies that have been revealed remain intact in record books sanctioned by the major leagues, including Baseball-Reference.com.
One is the 1910 American League batting race, for years fodder for violent controversy, which was seemingly resolved some time ago when it was incontrovertibly established that Ty Cobb hit .383 rather than .385, giving the crown to Nap Lajoie with a .384 mark.
The problem stemmed from a Detroit box score that had inadvertently been included twice in the season-end calculations, the duplication resulting in Cobb being credited with three extra at-bats and two extra hits. However, major-league officials continue to recognize Cobb as the 1910 American League batting leader, believing that history should not be rewritten. Many baseball analysts concur, albeit for a different reason. In a doubleheader on the last day of the 1910 season, to help Lajoie overtake the unpopular Cobb, St. Louis Browns manager Jack O'Connor ordered rookie third baseman Red Corriden to play deep on Lajoie, enabling the batter to bunt down the third-base line at will and collect six “baby” hits in the twin bill. Cobb still won the bat title by a single point—or so it was then thought—but O'Connor and Browns coach Harry Howell were later banned from the majors for their role in the plot to deprive Cobb of his honor. In any case, Baseball-Reference.com now lists both Cobb's and Lajoie's 1910 averages in bold, designating league leadership, even though the averages differ.
National League statisticians also have a longstanding cross to bear. Chicago Cubs third baseman Heinie Zimmerman was retroactively awarded the Triple Crown in 1912 when he seemingly paced the senior loop with a .372 batting average, 14 home runs, and 103 RBIs. It has since been determined that Zimmerman had only 99 RBIs that year, leaving him three behind Honus Wagner, the true leader, with 102. But the final Macmillan encyclopedia in 1996 continued to assign Zimmerman 103 RBIs in 1912, even though other major reference books by then were crediting him with only two legs of the Triple Crown (batting average and home runs).
Meanwhile, recent developments have given statistics sticklers good reason to doubt that even post-expansion statistics are 100 percent free of errors. The most significant error uncovered since the turn of this century, mentioned earlier, is that Orioles first baseman Jim Gentile was deprived of an RBI in 1961 that would have put him in a tie with Roger Maris for the American League lead that year with 141.
Never in major-league history has a team with the highest winning percentage in its league not won the pennant. But prior to 1882, the rules made such an event possible—and it happened on one occasion in the top minor league of its time.
At the finish of its 1878 season, the International Association standings listed the Syracuse Stars with 29 wins and 11 losses for a .725 winning percentage, while Buffalo ended with 32 wins and 12 losses for a .727 winning percentage. Early day record books credited the pennant to Syracuse, while more modern reference works give the crown to Buffalo. Giving Buffalo the crown would, in effect, run counter to the current philosophy of concurring whenever possible with the statistical rules of the time rather than those of our day, were it not for modern researchers' consensus that Buffalo's true record in league play was 27–10 (.730) while Syracuse finished half a game behind at 26–10 (.722).
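A quick check of the percentages cited above bears out both the original standings and the researchers' corrections; the Python sketch below uses the standard formulas (winning percentage as wins over decisions, games behind as half the sum of the differences in wins and losses).

```python
def win_pct(wins, losses):
    return wins / (wins + losses)

for team, w, l in [("Syracuse, as listed", 29, 11), ("Buffalo, as listed", 32, 12),
                   ("Buffalo, corrected", 27, 10), ("Syracuse, corrected", 26, 10)]:
    print(f"{team}: {win_pct(w, l):.3f}")
# Syracuse, as listed: 0.725
# Buffalo, as listed: 0.727
# Buffalo, corrected: 0.730
# Syracuse, corrected: 0.722

# Games behind = half of (difference in wins plus difference in losses):
print(((27 - 26) + (10 - 10)) / 2)   # 0.5 -- the half game separating the two clubs
```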
Only once in major-league history has a player been denied a batting crown even though he met the minimum performance standards then in existence. In 1938, rookie Washington Senators outfielder Taft Wright hit .350 in exactly 100 games. At that time, a player customarily had to appear in at least 100 games to be eligible for a batting title. But because Wright collected just 263 at-bats, an exception was made and the crown instead went to the far more deserving Boston Red Sox first baseman Jimmie Foxx, who finished a point behind Wright at .349.
Because there was no at-bat minimum in 1938, Wright theoretically could have won the crown with just one at-bat as long as he somehow got into 100 games. The 100-game minimum became the unofficial standard in 1920 and remained in effect until 1945, when a 400 at-bat minimum was formally introduced. For the next dozen seasons, the rule for determining batting and slugging leaders fluctuated wildly, with a new twist added almost yearly. In 1957, the major leagues at last adopted the current standard: a player must have 3.1 plate appearances for every game his team plays to qualify as a batting or slugging leader.
Taft Wright was denied the American League batting title in 1938 in part because Jimmie Foxx finished right on his tail. If no other hitter had been within 30 points of Wright’s .350 mark, quite possibly he would have been the first rookie in American League history to wear a batting crown.
Supporting this is the fact that several other batting leaders prior to 1957 who would not have qualified for their crowns under the current rules were allowed to continue to reign largely because their averages stood alone at the head of the pack. Since 1901, the two major leagues have crowned five batting champs—some say six—with inadequate credentials by current standards. The two American League winners who would not qualify now are Ty Cobb in 1914 and Dale Alexander in 1932. In addition, some current reference works—Baseball-Reference.com among them—credit Nap Lajoie with the 1902 crown even though he had only 352 at-bats and played just 87 games of a 140-game schedule. The three National League champs with far fewer than 3.1 plate appearances per game their teams played are Bubbles Hargrave in 1926, Debs Garms in 1940, and Ernie Lombardi in 1942. Lombardi won with just 309 at-bats, and Hargrave had only 326. Both were catchers, accounting somewhat for the willingness to overlook their skimpy plate totals. Because of the position’s harsh demands, catchers have generally been given special dispensation with regard to appearance requirements in determining league leaders. In fact, for many years they were only required to catch half their team’s games to qualify for the fielding crown.
Since Lombardi’s triumph in 1942, no National League catcher has won a batting title, and only Joe Mauer, with three titles, has won in the American League. But had the same 100-game-minimum rule that governed in 1942 still applied a dozen years later, the 1954 NL batting crown would have gone to Smoky Burgess, a backstopper with the Philadelphia Phillies. In 108 games and 345 at-bats, Burgess swatted .368—23 points better than Willie Mays, the recognized leader in 1954.
Bubbles Hargrave reached base via a hit, walk or hit by pitch only 144 times in 1926 but won the National League batting title.
The 1954 season saw another batting-title first when a player who would have won his league’s title under the current rule failed to qualify under the standard then in existence. On the surface, this seems an impossibility. How could a player accumulate 3.1 plate appearances for every game his team played and yet fail to have enough at bats to qualify as a leader? And yet, incredibly, it happened.
In 1952, a rule was enacted that a player had to have 2.6 official at-bats for every game his team played to win a batting title. The rule was still extant in 1954, when Ted Williams posted a .345 batting average, slugged at a .635 pace, and had a .516 on-base percentage after he collected 136 walks in just 117 games. But because Williams got so many free passes, he had only 386 at-bats. His total was 20 short of the 406 he needed to give him 2.6 at-bats for each of the 156 games the Red Sox played, and the crown instead went to Cleveland's Bobby Avila, who finished the season with a .341 average. Williams was awarded the slugging title, however, largely because his .635 mark was 100 points higher than that of runner-up Minnie Minoso.
When Williams’s walks, sacrifice hits, and hit by pitches in 1954 are combined with his at-bats, his total number of plate appearances is far in excess of the number needed today, let alone in 1954 when the schedule was eight games shorter.
But although Williams surely felt an injustice had been done to him, few members of the baseball public were aware of it at the time. By 1954, most fans were thoroughly befuddled as to what credentials a player needed to win a batting title. The confusion persisted until expansion lengthened the schedule to 162 games and made it imperative that a player whose team played a full slate accumulate at least 502 plate appearances to qualify as a leader. In 1959, many Clevelanders were baffled when Tito Francona of the Indians entered the last day of the season with the highest average in the American League and yet was said by the media to have no chance to win the batting title even though he was just a couple of at-bats shy of 400. Only then did Tribe fans discover that at some point (back in 1957, to be exact) the rule had been changed from 400 at-bats to 3.1 plate appearances for every scheduled game. Francona finished with 399 at-bats and a .363 batting average—10 points higher than winner Harvey Kuenn. But since he needed some 30 more plate appearances to meet the minimum standard, his average would have fallen below Kuenn's had the requisite number of hitless at-bats been added to his total.
One final note: In 1996, Tony Gwynn fell four plate appearances short of the required 502 when injuries held him to just 116 games, but when he was charged with four additional hitless at-bats to bring him to the requirement, his average still finished well ahead of Colorado's Ellis Burks at .344, and Gwynn kept the crown with his actual .353 mark.
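The adjustment applied in the Francona and Gwynn cases works along the lines of the sketch below. The hit and at-bat figures are invented for illustration; only the 502 plate-appearance requirement and the mechanism of charging the shortfall as hitless at-bats come from the discussion above.

```python
def adjusted_average(hits, at_bats, plate_appearances, required_pa=502):
    """Charge the plate-appearance shortfall as hitless at-bats, then see
    whether the recomputed average still leads the league."""
    shortfall = max(0, required_pa - plate_appearances)
    return hits / (at_bats + shortfall)

# Invented line resembling a Gwynn-type case: a .353 hitter four plate appearances short.
raw = 160 / 453
adjusted = adjusted_average(hits=160, at_bats=453, plate_appearances=498)
print(f"{raw:.3f} -> {adjusted:.3f}")   # 0.353 -> 0.350, still ahead of a .344 runner-up
```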
Before 1951, qualifications for ERA leaders were fuzzy. Generally, any hurler who either hurled 10 complete games or else worked at least 154 innings—the number of games scheduled prior to expansion in 1961—was considered a qualifier, but sometimes the ERA champ would be a real eye-opener.
In 1940, after being called up from the minors with less than eight weeks to go in the season, Ernie Bonham of the New York Yankees tossed 10 complete games in 12 starts and compiled a 1.90 ERA, easily good enough to win the crown . . . until someone pointed out that he had pitched only 99⅓ innings. A number of record books recognized Bonham anyway (and a few still do), whereas others gave the honor to Bob Feller, who finished the season with a 2.61 ERA. The unofficial complete-game minimum of 10 was otherwise firm prior to 1951, however, including in 1943, when Howie Pollet was recognized as the NL ERA leader with a 1.75 mark despite having toiled only 118 innings. Incidentally, Baseball-Reference.com continues to recognize Pollet as the 1943 NL ERA king rather than his far more deserving Cardinals teammate Max Lanier, who finished with a 1.90 ERA in 213⅓ innings, while refusing to recognize Bonham as the 1940 AL ERA leader.
Only once before 1951—when the requirement of one inning pitched for each game played by a pitcher's team went into effect—was there an exception to the unofficial complete-game minimum: in 1927, when New York Yankees rookie Wilcy Moore, working as a combination starter-reliever, posted a 2.28 ERA in 213 innings but had only six complete games in his 12 starts. Ironically, exactly ten years earlier, Fred Anderson of the Giants—who had eight complete games—was not recognized as the 1917 NL ERA champ (though he now is by Baseball-Reference.com) even though he hurled 162 innings and had an ERA 0.39 lower than that of the leader, Pete Alexander. The Moore and Anderson cases pointed up the most serious flaw in the then-existing qualification standards: unless an exception was arbitrarily made, as in the case of Moore, a pitcher frequently used in relief had no chance to win the award, regardless of how many innings he hurled, because he could never collect a sufficient number of complete games. Significantly, in 1952, only the second season the new rule was in place, the National League ERA crown went to Hoyt Wilhelm with a 2.43 ERA in 159⅓ innings and 71 games, all in relief. Whether Wilhelm would have been awarded the ERA title had his performance occurred prior to 1951, we will never know.
Along with embracing the increasing importance of relief pitching, the new rule in 1951 displayed a certain prescience in another way when it made a minimum number of innings pitched the only criterion for eligibility. Were 10 complete games still a criterion, no pitcher would have qualified for an ERA crown since 2011, when Tampa Bay's James Shields became the last pitcher to date to log a double-digit complete-game total with 11.
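The post-1951 standard reduces to a one-line comparison. In the Python sketch below, the team-game totals are approximated by the 154-game schedule of the era (actual counts could differ slightly because of ties), the innings are those cited above, and the rule is applied retroactively to the pre-1951 seasons purely for illustration.

```python
def qualifies_since_1951(innings_pitched, team_games_played):
    # At least one inning pitched for every game the pitcher's team played.
    return innings_pitched >= team_games_played

for pitcher, innings in [("Bonham 1940", 99 + 1/3), ("Pollet 1943", 118),
                         ("Lanier 1943", 213 + 1/3), ("Wilhelm 1952", 159 + 1/3)]:
    print(pitcher, qualifies_since_1951(innings, 154))
# Bonham and Pollet would fall well short; Lanier and Wilhelm qualify comfortably.
```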
Rule 9.23 (b) answers whether Joe DiMaggio's 56-game hitting streak in 1941 would have been terminated if he had been unable to play in a game during his skein. It would simply have been put on hold and then resumed when DiMaggio returned to action, regardless of how many games he missed. Some other significant streaks have nearly gone unrecognized, though, when their perpetrators sat out games. In 1922, first baseman Ray Grimes of the Chicago Cubs set a then major-league record when he collected at least one RBI in 17 consecutive games. No one was aware of it at the time, not even in Chicago, because the streak did not come in a continuous 17-game stretch—Grimes was idled by a back ailment for nine days in the middle of his skein. But even if his feat had been accomplished in a single 17-game burst, it still might have gone unremarked until long after the fact. In 1922, RBIs had been an official statistic for only two seasons, and as yet few cared all that much about records anyway. When Pete Rose established a new record for career base hits in 1985, virtually the entire sporting world was aware of it, but Ty Cobb's landmark 4,000th hit in 1927 received so little attention that even Cobb failed to realize what he had done until he read about it in the newspaper the following day.
Unrecognized altogether for over half a century was Bill Joyce's record of 69 consecutive games reaching base safely, set in 1891 as a member of the American Association champion Boston Reds (in what proved to be the AA's final season as a separate major league). Joyce's record, begun in 1890 while he was with Brooklyn of the Players' League, was tied in 1941 by Ted Williams (69) and broken by Joe DiMaggio (74), but at that time no one even knew he had held it. Joyce's streak was frozen at 69 when he broke an ankle sliding on July 2, 1891, at Boston in a 12–4 win over Washington. His skein ended on October 3, 1891, in his first game back in the lineup after the injury, when he went 0-for-3 in a 6–2 win at Boston over Washington's Kid Carsey.
The current mark is 84 consecutive games, set in 1949 by Ted Williams, the only player in history to compile two such streaks lasting 60 or more games.
This could almost be called the “Lou Gehrig Rule.” In Gehrig's day, there was no formal rule regarding the minimum amount of time a player had to appear in a game to extend a consecutive-game playing streak beyond the requirement that his name appear in the box score. The present rule was instituted before the 1974 season in conjunction with legislation on what terminates a hitting streak.
One must think that the rulemakers had Gehrig's shadow on their minds when they decreed that a single plate appearance—even in a pinch-hitting role—would not terminate a consecutive-games-played streak, but a pinch-running appearance would. Obviously a pinch-runner is often in a game longer than a pinch-hitter, who may be around for only a single pitch. Gehrig, however, had a day on July 14, 1934, at Detroit when his back was suffering so bothersome a bout of lumbago that it seemed his streak was at an end. Fortunately, New York was on the road, and thus batted first, allowing Yankees manager Joe McCarthy to find an ingenious way around Gehrig's temporary disability. On his lineup card that afternoon McCarthy penciled in Gehrig as the Yankees' shortstop and leadoff hitter in place of Red Rolfe.
The record books thus show Gehrig as having played one game at shortstop in 1934, making him one of the rare lefthanders ever listed at a middle-infield position, even though he never actually served as a short fielder. After opening the game with a single, Gehrig was removed for Rolfe as soon as he touched first base. His streak was thereby preserved, never to be seriously jeopardized again until early in the 1939 season, when he began showing symptoms of the incurable neuromuscular disease that would soon claim his life.
Little remembered, however, is that this game happened to be the turning point of the season for both clubs. Pinch-running for Gehrig, Rolfe scored the first run of the game as the Yankees took a 4–0 lead after the opening frame. The contest quickly evolved into a free-swinging affair. Heading into the bottom of the ninth, the Yankees led, 11–8, but fell prey to a last-ditch four-run rally that sent the Tigers home with a 12–11 victory, catapulting them into first place ahead of the Yankees by two percentage points. After Detroit won again the following day over the Yanks to take a one-game lead, Mickey Cochrane's club eventually romped home by a seven-game margin.
Gehrig, of course, also held the American League record for consecutive games played with 2,130 until Cal Ripken Jr. broke both the major-league and league marks on September 6, 1995. The National League record prior to expansion was much more modest, but it too took a bizarre twist along the way. From April 15, 1952, through August 22, 1957, Stan Musial set a new NL record for consecutive games played with 895, breaking Gus Suhr's old mark of 822. (Musial's record was subsequently broken by Billy Williams.) But if not for a suspended game, Musial's streak would have ended at 862 games. Musial intended to sit out the second game of the July 21, 1957, doubleheader at Pittsburgh. The St. Louis Post-Dispatch reported he had not played because “the combination of the doubleheader and the hot humid weather was too formidable.” With one out in the top of the ninth and the Cards ahead, 11–2, Ken Boyer singled and the game was suspended to comply with a Pittsburgh curfew.
When the game was resumed on August 27, Musial immediately pinch-ran for Boyer and then played first base in the bottom of the ninth. Because the resumed contest counted as part of the July 21 game, the full half inning on defense extended his streak, even though by then it had officially ended after the August 22 game against the Phillies at Connie Mack Stadium, in which he tore a muscle and chipped a bone swinging at a pitch from Jack Sanford. The following day, Joe Cunningham replaced Musial at first base.