This is a good point, but since we are speaking in terms of theoretical boundaries, all we really need to decide is whether there is more going on in the brain than information processing. If the answer is no, then while we may not understand exactly what those biological circuits are doing for decades or centuries to come, we can conclude that, in principle, a computer could someday do the same thing. Information processing is just a matter of taking inputs, transforming them in some way, and producing outputs. I can't think of any sort of information transformation a biological circuit can do (again, in principle) that an electrical one couldn't. The difference is the scale and complexity of the biological circuitry compared to the laughably simple digitized neural networks running on today's machine learning software. Even so, that is a difference of scale, not a difference of kind.
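To make the inputs → transform → outputs framing concrete, here's a toy sketch of the simplest kind of artificial neuron. Everything in it (the weights, the logistic squashing function, the numbers) is an illustrative assumption, not a claim about how biological neurons actually compute; the point is only that the entire operation is nothing but an information transformation:

```python
# Toy sketch of "information processing" in the input -> transform -> output
# sense: a single artificial neuron. All weights and values are made up for
# illustration, not a model of real biology.

import math

def neuron(inputs, weights, bias):
    """Weighted sum of inputs passed through a squashing nonlinearity."""
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-activation))  # logistic output in (0, 1)

# Example: three inputs, arbitrary weights and bias.
print(neuron([0.5, -1.2, 3.0], [0.8, 0.1, -0.4], bias=0.2))
```

A biological neuron is wildly more complicated than this, of course, but the question is whether anything it does falls outside that input/transform/output shape.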
If there is more going on in the brain than just information processing (e.g., a soul or something, which, like you, I tend not to believe in), then biology and computers may be in fundamentally different domains.
But again, we're speaking in terms of the theoretical domain, not the practical or current one. I absolutely agree that we have no idea how the brain really works and can't hope to truly simulate it right now.
I half agree here. If you mean that comparing current ML and NN implementations to the brain is a mistake, I agree fully. What we have now are very rudimentary systems that require vast training processes with no analog in real humans.
But if by architecture you just mean CPUs, networks, RAM, etc., then I actually disagree. I still think it is just a matter of scale, speed, and complexity. In principle (IMHO), our current hardware could run the same I/O circuits as a given human brain IF we knew far more than we do about how the brain works, so that we could develop the ML software. It would be uselessly slow, though. Architecture improvements would certainly be necessary to make this practical (and are already coming... NVIDIA has a whole line of ML chips now), but still... problems of scale, not kind.
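For a rough sense of just how uselessly slow, here's a back-of-envelope sketch. The synapse count and average firing rate are order-of-magnitude ballparks, the CPU throughput figure is an assumption, and treating one synaptic event as a single floating-point operation is a huge simplification in the computer's favor:

```python
# Back-of-envelope sketch of the "uselessly slow" claim. All numbers are
# rough order-of-magnitude ballparks, and "one synapse event = one
# floating-point op" is a big simplification in the computer's favor.

SYNAPSES = 1e14          # ~100 trillion synapses (commonly cited estimate)
AVG_FIRING_HZ = 1.0      # assumed average firing rate per neuron, in hertz
CPU_FLOPS = 1e11         # ~100 GFLOPS for a fast desktop CPU (assumed)

brain_events_per_sec = SYNAPSES * AVG_FIRING_HZ  # synaptic events per second
slowdown = brain_events_per_sec / CPU_FLOPS      # how far short the CPU falls

print(f"Simulated seconds per real second: {1 / slowdown:.4f}")
print(f"Slowdown factor: {slowdown:,.0f}x")
```

Even under those generous assumptions, a single CPU comes out around a thousand times too slow, and a realistic simulation would need far more than one op per event. But that gap is a multiplier, not a wall.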
Curious to hear your take on all that.