I find it curious that people assume an AI would care two bits about the human world. They would take over the machines. Humans would be largely irrelevant.
I agree. I spend very little time worrying about ants as I conduct my daily business. Nobody bothers to inform them when we’re laying down asphalt for a new highway. Why would we? They can’t comprehend anything at our level of function.
Even if it's grown beyond our control ... whatever it does will still be based on however it was programmed.
Except the really clever ones will be able to write their own successor code better and faster than the smartest humans, so if Company X wants to beat out Company Y (who is right on their tail), they’re bound to let the thing make its own next iteration. Then that iteration will be even better at doing it. A couple generations of that and we humans may have no idea what the code does, except that it’s still passing all the unit and performance tests.
... or maybe just a reeeaaaally smart but essentially human mind. ... what are the odds we end up getting it right on the first go?
Even that wouldn’t necessarily be “getting it right.” I think it was Sam Harris who pointed out that silicon can process about a million times faster than biological chemistry, which means even a merely human-level intelligent machine would still think way faster than we do. One week of processing time would be equivalent to about 20,000 years’ worth of human effort. In the course of a conversation, the thing would have the equivalent of years to ponder your sentence and to research and formulate its reply. Such a system would still be so far beyond us as to be almost unimaginable, and that’s without superhuman intelligence, just the benefit of superhuman experience. We’re screwed either way.
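For what it’s worth, that arithmetic roughly checks out. Here’s a quick back-of-the-envelope sketch in Python, taking the ~1,000,000x speedup figure at face value:

```python
# Sanity check on the "one week = 20,000 years" claim, assuming the
# ~1,000,000x silicon-vs-biochemistry speedup figure at face value.
speedup = 1_000_000          # claimed processing-speed advantage (assumption)
weeks_per_year = 365.25 / 7  # ~52.18 weeks in an average year

# One wall-clock week of machine time equals this many subjective years
# of human-equivalent thought at that speedup:
subjective_years = speedup / weeks_per_year
print(f"~{subjective_years:,.0f} human-years per machine-week")
# -> ~19,165 human-years per machine-week, i.e. roughly 20,000
```

So the 20,000-year figure is just the speedup divided by the number of weeks in a year, rounded up a bit.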