AI is swiftly becoming a reality. Machines now "learn" in much the same way that we humans learn: by observing examples, making guesses, and being corrected when those guesses are wrong. Only they do it at blinding speed, while we humans are far more limited in how quickly we can acquire new knowledge. (The old approach of hand-programming "intelligence" into computers as explicit rules has been abandoned, because, really, we always knew it wouldn't work.)
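To make that "guess, get corrected, adjust" loop concrete, here is a minimal, purely illustrative sketch in Python, not any particular real system: a one-parameter toy model repeatedly guesses, is told the right answer, and nudges its weight toward it.

```python
# Toy illustration of "guess, get corrected, adjust" learning.
# A single-weight model learns to map x -> 2*x from example pairs.

examples = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)]  # (input, correct answer)

weight = 0.0            # the model's single adjustable parameter
learning_rate = 0.05    # how strongly each correction nudges the weight

for epoch in range(200):                      # a machine can repeat this loop millions of times
    for x, correct in examples:
        guess = weight * x                    # the machine's guess
        error = guess - correct               # the correction it receives
        weight -= learning_rate * error * x   # adjust toward the right answer

print(f"learned weight is roughly {weight:.3f}")  # converges near 2.0
```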
The frightening thing, again, is that "speed" I mentioned. There is a "singularity" approaching, according to futurist Ray Kurzweil ("The Singularity Is Near"). Kurzweil, in describing his law of accelerating returns, predicts exponential improvement across a range of relevant technologies, and he makes a compelling case that once this "singularity" is reached, machine intelligence will be vastly more powerful than all human intelligence combined.
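The force of that prediction comes from compounding. The sketch below uses made-up numbers (a hypothetical capability that doubles every year), not Kurzweil's actual figures, just to show how quickly exponential growth outruns intuition.

```python
# Illustration of compounding ("accelerating returns") with made-up numbers:
# a capability that doubles every year grows about a billion-fold in 30 years.

capability = 1.0            # arbitrary starting level
doubling_time_years = 1.0   # hypothetical doubling period

for year in range(1, 31):
    capability *= 2.0 ** (1.0 / doubling_time_years)
    if year in (10, 20, 30):
        print(f"year {year:2d}: capability = {capability:,.0f}x the starting level")

# year 10: ~1,024x    year 20: ~1,048,576x    year 30: ~1,073,741,824x
```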
What will that mean? Nobody really knows, and yet we continue to move forward, without knowing. 'Tis a bit unnerving to me.