To recap: yes, early cars were still susceptible to rust, but they were so thickly built that most had been traded in for something new before their original owners noticed any major problems.
That all started to change in the early 1970s. Automakers were slowly switching over to unibody-type construction as well as more ornate and elaborate vehicle designs in almost every segment of the market. At the same time, the Big Three felt increasing pressure to produce lighter, more fuel-efficient automobiles while also cutting costs, which meant moving towards increasingly thinner gauges of steel.
It was a recipe for disaster that would produce nearly 15 years of rust-vulnerable automobiles, vehicles with expiration dates looming a mere two or three years after they were built due to their extreme susceptibility to corrosion in oxidation-friendly regions. Sound dramatic? Not if you lived through it. In fact, Chrysler’s own data from the era suggested that one in five cars in winter climates showed rust holes after a mere two years on the road, a figure that jumped to more than half after another two years.
All those folds and seams in the vehicle designs Detroit was building were perfect for collecting moisture, dirt, grime, and salt, which, combined with the thinner steel across the board, dramatically accelerated corrosion. Plastic and rubberized undercoatings meant to deaden sound or even guard against salt spray often ended up trapping water and calcium chloride against vulnerable components instead.
Laissez-faire attitudes towards quality control certainly didn’t help, either. Sub-par vehicles regularly showed up on dealer lots, often with trim or paint missing, which left metal exposed to the elements and ripe for a visit from the rust fairy. It was not uncommon for brand-new vehicles to require some type of rust repair right out of the box.