Looking at giant old computers is fun: the complexity and power they had, not to mention the energy they consumed. These ancient beasts processed the land so that we could have processors in our pockets, and now our mobile processors are more powerful than those room-sized machines ever were.
But there are so many lessons in the past informing us about today. For example, looking at the specialized hard drives, you have to wonder why they had so much extra circuitry. Did they need that many sensors? And while controllers have always been separate from hard drives, both the controllers and the hardware surrounding the physical disks were much more substantial back then. Today a hard drive's circuitry is minimal compared to its controller, but some older drives appear to be larger than their controllers; it's only through the lessons of building and iterating that we learn what was needed and what was not. The march of technological progress, and the demand for it, helped fuel the miniaturization of all these components.
When I look at an IBM rack whose only job is to connect a card punch and printers to the processing unit, I develop an appreciation for the standards and buses we have today. You have to guess they built the processor first and the peripherals were an afterthought, or at least a lower priority. Yet they did so much in hardware that is now done in software.
Doing things in hardware was a bigger liability then than it is now because the hardware was much less reliable. Some of that Big Iron had more redundancy than we'll ever see in our consumer-level computing devices. Less goes wrong when you're working with lower voltages and smaller transistors.
All of this looks really inefficient, but software people were, in some cases, harder to come by than hardware people, and building correct software took much longer because they didn't have the tools we have today. Hardware products would be finished and then sit for years waiting to be fully utilized because the software lagged behind.