A strong AI would need a theory of incompleteness to avoid sinking unbounded time into problems it can't solve, and that's not as simple as saying "this program is taking more than a second to finish, I quit!". There must be some intuition behind the decision to give up.
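One toy way to put that into code (purely a sketch; the `stall_limit` parameter and the progress measure are invented for illustration): rather than a raw wall-clock cutoff, give up when the search has stopped making progress for a while.

```python
def solve_with_giveup(candidates, score, stall_limit=1000):
    """Search candidates for the highest score, but quit heuristically:
    not on a fixed timeout, but when there has been no improvement
    for `stall_limit` consecutive candidates (a crude 'intuition'
    that the search isn't going anywhere)."""
    best, best_score = None, float("-inf")
    stalled = 0
    for c in candidates:
        s = score(c)
        if s > best_score:
            best, best_score = c, s
            stalled = 0          # progress made: reset the stall counter
        else:
            stalled += 1
            if stalled >= stall_limit:
                break            # give up: no progress in a long while
    return best, best_score

# Example: maximize -(x - 7)**2 over a million candidates.
# The search finds x = 7 early, stalls for 100 steps, and quits
# instead of scanning all 10**6 values.
best, best_score = solve_with_giveup(range(10**6),
                                     lambda x: -(x - 7) ** 2,
                                     stall_limit=100)
```

Of course, a stall counter is still a mechanical rule; the interesting (and hard) part is the pattern-matching judgment that decides what counts as "progress" in the first place.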
I'm puzzled as to why we'd expect so much more of artificial intelligence than we expect of natural intelligence, or why we wouldn't expect to be able to develop artificial intuition systems.
My own intuition ;-) is that intuition seems to derive more from pattern matching than from logic.