U.S. manufacturing experienced a precipitous and historically unprecedented decline in employment in the 2000s. Many economists and other analysts—pointing to decades of statistics showing that manufacturing real (inflation-adjusted) output growth has largely kept pace with private sector real output growth, that productivity growth has been much higher, and that the sector’s share of aggregate employment has been declining—argue that manufacturing’s job losses are largely the result of productivity growth (assumed to reflect automation) and are part of a long-term trend. Since the 1980s, however, the apparently robust growth in manufacturing real output and productivity has been driven by a relatively small industry—computer and electronic products—whose extraordinary performance reflects the way statistical agencies account for rapid product improvements in that industry. Without the computer industry, there is no prima facie evidence that productivity caused manufacturing’s relative and absolute employment decline. This paper discusses the interpretation of labor productivity statistics, which capture many factors besides automation, and cautions against using descriptive evidence to draw causal inferences. It also reviews the research literature to date, which finds that trade significantly contributed to the collapse of manufacturing employment in the 2000s but finds little evidence of a causal link to automation.