Tuesday, October 23, 2007

"What is Intelligence?": Food for the AI Folks...

Wikipedia describes intelligence as: a property of mind that encompasses many related abilities, such as the capacities to reason, to plan, to solve problems, to think abstractly, to comprehend ideas, to use language, and to learn.
And this is not just what Wikipedia thinks; some of the greatest talks by renowned computer scientists I have ever attended (including Turing Award winners like Fran Allen) say much the same. Nevertheless, the question is: is it really sufficient to describe intelligence by these skills? Hasn't most of the traditional work in AI, from propositional planning to combinatorial logic, always striven to inculcate one or more of these traits in computers? If yes, then why are we still so far today from a computer that is even 10% as intelligent as a human?
I am not sure if there is a precise answer, or whether we will ever have the ability to answer this and make it happen. But there are two things, I believe, that are typically very characteristic of humans and that the AI community has probably not thought of instilling in its endeavor towards a super-smart computer!
These are: intuition and adaptation.
Intuition: It is the ability to make decisions or do things without being guided by a standard reasoning process. I guess it is very typical of human beings and acts as a sophisticated ability to make judgments where reasoning cannot be applied. Unfortunately, while a lot of work has been done on how to make a computer reason about things, little has been said about making decisions (under certain circumstances) when no reasoning can be applied. So here is a new direction, although the problem is difficult!
Adaptation: It is typically a positive characteristic of an organism that has been favored by natural selection. Is it not interesting to think of building systems that can actually evolve over time? Agreed, there has been some work in this regard. But the problem is more profound than it is judged to be. Adaptation should enable a system to evolve in the sense that it can shed some characteristics, generate some, inherit others, and mould and modify them to its own needs. The second food for thought!
Let's see what the next 40 years of AI research have in store for us! I will come back to this blog then to compare my perceptions...

4 comments:

waldwick said...

Great post!

I am no expert... but can't you model intuition with randomness? You may argue that computers cannot generate truly random numbers, and so cannot generate truly random events, but then again, intuition is not completely random either.

There are computer programs which simulate adaptation by cumulative selection. Cumulative selection is the key phrase, since which result is desirable and which is not has to be determined by an external environment. In natural selection too, nature acts as the selector from the whole random pool of species. Therefore, I guess, who has adapted successfully is totally independent of adaptation per se, but is dependent on nature...
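The cumulative-selection idea can be sketched in a few lines. This is a minimal toy, not any particular program: the target string, alphabet, mutation rate, and population size are all made-up illustrative choices. Random variation supplies the candidates; an external "environment" (here, closeness to a fixed target) does the selecting, generation after generation.

```python
import random

TARGET = "METHINKS"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def fitness(candidate):
    # The environment's judgment: how many characters match the target.
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(candidate, rate=0.1):
    # Random variation: each character may be replaced at random.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in candidate)

def evolve(generations=1000, pop_size=50):
    # Start from a completely random string.
    parent = "".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
    for _ in range(generations):
        # Variation is random; selection (keeping the fittest) is not.
        offspring = [mutate(parent) for _ in range(pop_size)]
        parent = max(offspring + [parent], key=fitness)
        if parent == TARGET:
            break
    return parent
```

The point of the sketch is that no single step is directed, yet the result looks directed, because each generation's selection accumulates the random improvements of all the previous ones.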

I might not be making a whole lot of sense... but your post is great... applause... :)

Munmun said...

@ waldwick
I think I understand somewhat what you mean. The catch is, intuition is not even like pseudo-randomness (which all computers today are capable of generating). It is more like a 'biased randomness'. Take the example of a biased die. When you roll it, some values turn up with higher probability than others. I am also not very clear about it; but intuition is probably something like this.
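For what it's worth, the biased-die version of 'biased randomness' is easy to sketch. The faces and weights below are made-up numbers for illustration only:

```python
import random

FACES = [1, 2, 3, 4, 5, 6]
WEIGHTS = [1, 1, 1, 1, 1, 5]   # face 6 is five times as likely as any other

def roll_biased_die():
    # random.choices draws according to the given relative weights,
    # so the outcome is random but not uniform.
    return random.choices(FACES, weights=WEIGHTS, k=1)[0]

rolls = [roll_biased_die() for _ in range(10_000)]
print(rolls.count(6) / len(rolls))   # roughly 0.5, versus 1/6 for a fair die
```

So the draw is still random, but the distribution is skewed towards certain outcomes, which is perhaps closer to what intuition feels like than a fair coin flip.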

waldwick said...

That is an interesting point you have made.
If a decision is biased, then is it not a weighted decision?
And when I think about myself, I mean when I do something intuitively, I usually give a hundred percent weight to the intuition I have just had. But I guess randomness comes into the picture while selecting the objective itself.
Of all the things that I want to do in a day or in my life as a whole, I randomly pick up a task and do it.
I think that only a completely random selection from among all the tasks could be defined as intuition.
Wouldn't a logical intuition be = decision? =)
That brings up an interesting point again. Let us say that there is a bug in the algorithm that generates weights, and every once in a while it generates erroneous weights here or there. Could these erroneous weights be called logical intuitions? He He… The world of AI is amazing….

dipthought said...

There are bright people and there are really bright ones. Both are mere mortals.

Geniuses are a different ball game altogether. In this era of parity, they are becoming very rare to come across.