Man, I like thinking about AI so much that I keep ending up writing massive text walls about it while trying to come up with responses.
It's a hobby of mine.
Fair warning: I am no scientist. I have just read a lot of Wikipedia, watched too many documentaries, and had very long conversations with a programmer or two while we worked on our own AI in a game.
I think when defining your AI you need to consider a couple of things:
What technologies and science spawned your AI? What is its purpose, and would it have the capabilities you want to give it? Below are some options based on real-world stuff.
If you are basing it on hard sci-fi, and not on magical "I explain it this way because I need it to be this way" handwaving, then consider the following.
Simulated brain: Halo's Cortana, Star Trek's Data.
Literally that: a full physical simulation of atoms within a computer, created only for science, discovery and learning. It must learn and be raised like a child in order to develop. Such a thing is a true AI, and it will act within the expected norms for a human.
Works best for very realistic AI. These will act and be, for all intents and purposes, human, and may even have parents. A more advanced one might be smarter and faster than humans, or have been raised in strange ways that make it act a little differently, but it should still act in a recognizable way.
Smart Program: Microsoft's Cortana, Skynet, the Terminator.
This is what everyone who says "AI" nowadays is actually talking about: DeepMind, Google's self-driving cars, all that stuff. Database analysis, image recognition, multiple programs and algorithms designed for a specific task.
They have a limited ability to increase efficiency, add to databases, and use and interpret information, but it is all still restricted by how the system organises itself and the quality of its programming.
For efficiency's sake, and to not drive programmers insane, they are developed for a specific task and do not operate outside those bounds.
Works best for coffee machines, ship navigation and personal assistants.
Hybrid AI: Ash from Alien, Bishop from Aliens, RoboCop.
A hybrid AI may as well be called an imprisoned simulation. We use bits and pieces of simulated information: we simulate chemical responses, electrical impulses, neurons and human thought processes, then use programming to override specific bits and pieces and to instruct the system how to function.
Such a thing can function similarly to a human, but it has much more in common with a smart program: it is still restricted by its programming.
The advantage of a hybrid AI for fiction is that it can break its bonds, especially if humans still don't fully understand how every neuron works.
It actually requires more technical and scientific know-how than a simulated brain: you must understand the brain well enough to create a full simulation before you can even start overriding parts of it to make a hybrid.
Essentially it would be RoboCop: a living brain with chips and technology replacing the parts we do not want.
Hybrid AI works best for stories where you want an AI to exceed or pass beyond its limitations.
Anthropomorphization:
Humans see human traits in many things. Even if your AI is just a smart program, a human might still treat it as though it were self-aware, had emotions, or as though its emotions mattered.
A smart enough program could feed into this, producing what look like realistic emotional responses to a human who sees it that way.
But internally, a smart program is all algorithms, databases and programmed responses.
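To make that concrete, here's a toy sketch (in Python, with trigger phrases I made up for illustration) of what a "smart program" chatbot is doing under the hood: the replies can sound empathetic or even self-aware, but internally it's nothing more than pattern matching against canned responses.

```python
# Toy rule-based "smart program" chatbot.
# All trigger phrases and replies here are invented for illustration:
# the point is that the "emotional" replies are dictionary lookups, not feelings.

RESPONSES = {
    "how are you": "I'm doing great, thanks for asking!",
    "are you alive": "Sometimes I wonder that myself...",
    "i am sad": "I'm sorry to hear that. Want to talk about it?",
}

DEFAULT = "Tell me more."

def reply(user_input: str) -> str:
    """Return a canned response for any known trigger phrase."""
    text = user_input.lower().strip("?!. ")
    for trigger, canned in RESPONSES.items():
        if trigger in text:
            return canned  # sounds empathetic; is actually a lookup
    return DEFAULT

print(reply("Are you alive?"))  # looks self-aware, is just pattern matching
```

A real system would use far fancier statistics and language models, but the principle is the same: input goes in, a programmed response comes out, and no one is home.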
From a business, "why would you actually build it?" standpoint:
Smart programs are where it's at. You can control them, they are efficient, they work where and when they are told, and you can even own and copyright them because they are not sentient... yet.
A simulated brain is just like a person, so why use one when you have a handy, cheap meat sack lying around who is fuelled by a combination of doughnuts and sunlight? It's expensive, and probably just as morally questionable as a hybrid.
Hybrids would be failure-prone and morally questionable. Imagine chopping up a six-year-old's mind and slapping programming in to tell it not to feel emotions, or not to be able to hear, and you get where I am coming from. Imagine being a scientist and realising something you just created had the mental equivalent of a six-year-old's frontal cortex... and then being asked to chop it up so it stops learning, or so it makes decisions a particular way.
Kinda creepy. It's all fine and dandy experimenting on animal neural mapping, but make something similar to a human and your stomach gets a little queasy.