Computers, though, are just smaller and cheaper.
They also overestimated progress in what class of tasks/problems computers can tackle. There has been roughly zero progress in real AI since 1946. The inherent kinds of things that Turing, Shannon, von Neumann and others 100% knew COULD be done with enough storage haven't changed. The power (i.e. speed) of a computer doesn't allow more complex things, just faster completion.
They only failed to foresee how far the cost and size would fall. The IC is what made that possible:
Newly employed by Texas Instruments, Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958.
Commercially viable ICs followed from about 1960.
ANY current computer design could be built with 1930s mechanical relays or electronic valves (tubes). It would just be VERY large, expensive and slow. Tommy Flowers proved a valve computer could be reliable during WWII. Konrad Zuse had a reliable relay-based computer by 1939; the German military were not interested. The ARM-based CPU core in phones and tablets is lower power than Intel / AMD x86-64 because it uses far fewer transistors.
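To make the "relays are enough" point concrete, here is a minimal sketch of my own (not period code): treat each relay arrangement as nothing more than a NAND, and every other gate, and hence an adder and eventually a whole CPU, follows by wiring enough of them together. The only cost is size and speed.

```python
# Minimal sketch: a relay arrangement gives you NAND, and NAND alone is
# functionally complete, so in principle an entire CPU can be built from it -
# just very large and very slow.

def nand(a: int, b: int) -> int:
    """The only primitive we allow ourselves (one relay arrangement)."""
    return 0 if (a and b) else 1

# Everything else derived purely from NAND:
def inv(a):      return nand(a, a)
def and_(a, b):  return inv(nand(a, b))
def or_(a, b):   return nand(inv(a), inv(b))
def xor(a, b):   return and_(or_(a, b), nand(a, b))

def full_adder(a, b, carry_in):
    """One bit of an adder: the heart of any ALU, relay or silicon."""
    partial = xor(a, b)
    total = xor(partial, carry_in)
    carry_out = or_(and_(a, b), and_(partial, carry_in))
    return total, carry_out

# Quick check: 1 + 1 + carry 1 = binary 11, i.e. sum 1 with carry 1.
print(full_adder(1, 1, 1))   # -> (1, 1)
```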
A discrete transistor is about 1 mm x 1 mm x 0.1 mm, packaged in a roughly 4 mm diameter can. IC transistors are about 25 nm across; a nanometre is a million times smaller than a millimetre.
The Acorn Archimedes managed about 4.5 million instructions per second in 1987. A 1945 computer was about 500,000 times slower overall, though at code breaking only about 1,000 times slower, since that is the job it was built to help with.
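The back-of-envelope numbers above work out roughly like this (my own arithmetic using the figures as quoted, not measured values):

```python
# Rough scale arithmetic using the figures quoted above (approximate).

# Size: a ~1 mm discrete transistor vs a ~25 nm IC feature.
discrete_mm = 1.0
ic_mm = 25.0 / 1_000_000                 # 1 mm = 1,000,000 nm
print(f"linear shrink: ~{discrete_mm / ic_mm:,.0f}x")        # ~40,000x per side

# Speed: ~4.5 MIPS for a 1987 Archimedes vs a 1945 machine taken
# (per the claim above) as ~500,000 times slower in general use.
archimedes_ips = 4_500_000
print(f"1945 general-purpose rate: ~{archimedes_ips / 500_000:.0f} ops/s")   # ~9 ops/s
print(f"1945 code-breaking rate:   ~{archimedes_ips / 1_000:,.0f} ops/s")    # ~4,500 ops/s
```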
Turing proved what class of problems a computer could solve. That hasn't changed. The so-called Turing test was just an unproven idea, almost a parlour game, not a proof of AI or any suggestion that AI was possible. There is still none. "AI research" today uses its own narrow meanings for language to create the illusion of progress.
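The "class of problems" point is literally mechanical: Turing's 1936 model is just a finite rule table plus an unbounded tape, and faster hardware only runs more of its steps per second. A toy illustration of my own (a binary increment machine, not anything from Turing's paper):

```python
# A toy Turing machine: a finite state table plus an unbounded tape.
# Faster hardware runs more steps per second; it does not change what a
# table-plus-tape can or cannot compute.

def run_tm(rules, tape, state="start", head=0, max_steps=1000):
    """rules: (state, symbol) -> (new_symbol, move, new_state); move is -1, 0 or +1."""
    cells = dict(enumerate(tape))                # sparse tape, blank = '_'
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, "_")
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Example rule table: add one to a binary number written on the tape.
increment = {
    ("start", "0"): ("0", +1, "start"),   # scan right to the end of the number
    ("start", "1"): ("1", +1, "start"),
    ("start", "_"): ("_", -1, "carry"),   # step back onto the last digit
    ("carry", "1"): ("0", -1, "carry"),   # 1 plus carry -> 0, keep carrying left
    ("carry", "0"): ("1",  0, "halt"),    # 0 plus carry -> 1, done
    ("carry", "_"): ("1",  0, "halt"),    # ran off the left edge: new leading 1
}

print(run_tm(increment, "1011"))   # -> 1100  (11 + 1 = 12)
```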
No one in the 1940s or 1950s imagined that in 1969 we would have men on the Moon and then, before the end of the 1970s, simply lose interest.
Golden Age SF writers massively overestimated progress in spaceflight and massively underestimated progress in computers.
So I would argue the reverse was true up to about 1973 (the first Intel 4004 microprocessor shipped in 1971, and affordable "pre-IBM" PCs followed by 1976).
Today's graphical user interface, mouse and networking / Internet concepts were all working by 1976 on minicomputers.
Computers have just got cheap and small. A decent display now costs more than the rest of the laptop!
An entire computer including storage, RAM, graphics and I/O can be one chip, though typically it is three in a phone / tablet and many more in an Intel-type laptop. It can't do any kind of thing that the 1939 or 1944 computers couldn't, in theory, have done very slowly.