On Dec 11, 2018, at 11:34 AM, Chris Gray <[log in to unmask]> wrote:

> 1. I would add that what is called "AI" nowadays is not what was meant when the term was invented.  "AI" now is just another name for statistics and statistics (especially in disguise) is very dangerous given that untrained human beings are very bad at statistics.  I would rather that people spent less time promoting "AI" and more time becoming aware of cognitive biases and how they infect everything we think.  I love the book title "Don't believe everything you think".

I concur. Not only has the topic of artificial intelligence waxed & waned over time, but so have its definitions and implementations. Artificial intelligence is most definitely a loaded term.

Nowadays, artificial intelligence is associated with chatbots, aspects of natural language processing, and machine learning. Previously, it was about expert systems and neural networks. Chatbots are a reincarnation of the venerable ELIZA application. Natural language processing takes all sorts of forms. And yes, machine learning is a whole lot of linear algebra applied against a matrix of vectors. In all cases, the systems take some sort of input, and the output is deemed "intelligent". One has to define intelligence, and in most cases it is associated with memory and logical thinking. That definition ignores emotional intelligence, physical intelligence, musical intelligence, etc.
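To make the "matrix of vectors" point concrete, here is a minimal sketch in Python. It is only an illustration, not any particular system; the toy documents are made up, and scikit-learn is assumed to be available. Documents become rows of numbers, and "similarity" is nothing more than linear algebra on that matrix.

  # a minimal sketch: documents become term vectors, and "learning"
  # is largely linear algebra applied against the resulting matrix;
  # the documents below are illustrative only
  from sklearn.feature_extraction.text import TfidfVectorizer
  from sklearn.metrics.pairwise import cosine_similarity

  documents = [
      "artificial intelligence and machine learning in libraries",
      "expert systems and neural networks, an earlier wave of AI",
      "cataloging, metadata, and serious human thought",
  ]

  # each document becomes a weighted term vector; together they form a matrix
  matrix = TfidfVectorizer().fit_transform(documents)

  # "similarity" is just the cosine of the angle between vectors
  print(cosine_similarity(matrix))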


> 2. We are not even close and most of that technology should be classified as torturing the data until it confesses.

I will spin a nuance here. Given the volume of materials at hand, AI often produces output which is "good enough", especially compared to the time spent doing the same work by hand. The problem is measuring "good enough". For example, I have written systems enabling a person to "read" hundreds of journal articles and accurately glean their "aboutness" in a fraction of the time it would take to read them in the traditional manner. Moreover, I can enable a person to easily & thoroughly navigate through the corpus. Mind you, there are a whole lot of things the person can do in the traditional manner that my systems are not able to do.
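For readers who want a feel for how "aboutness" might be gleaned, here is a hedged sketch of one common approach (top TF-IDF terms per document). It is not a description of my actual systems; the "articles" directory, the plain-text files, and the number of keywords are all assumptions for the sake of the example.

  # a sketch of one way to summarize the "aboutness" of a pile of articles;
  # not any particular production system, just top TF-IDF terms per document;
  # the ./articles directory of *.txt files is an assumption
  from pathlib import Path
  from sklearn.feature_extraction.text import TfidfVectorizer

  articles = {p.name: p.read_text() for p in Path("articles").glob("*.txt")}

  vectorizer = TfidfVectorizer(stop_words="english", max_features=5000)
  matrix = vectorizer.fit_transform(articles.values())
  terms = vectorizer.get_feature_names_out()

  # for each article, list the handful of terms that best characterize it
  for name, row in zip(articles, matrix):
      weights = row.toarray().ravel()
      top = weights.argsort()[::-1][:5]
      print(name, "->", ", ".join(terms[i] for i in top))

A human still has to judge whether those keywords are "good enough", which is exactly the measurement problem described above.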

AI and human efforts ought to complement each other. It is not an either/or sort of thing. 


> 3. Technology is no substitute for serious human thought.  Metadata will not save us.  On this Cory Doctorow's Metacrap is my touchstone.  Also Fred Brooks's "No Silver Bullet" was right in 1986 and it is still right.
> 
> I'm not saying that techniques that come out of AI research aren't worth using, but that we should use them instead of being used by them and those that hype them.


Which leads me to a take-away from the recent AI conference. "Make sure a human is in the loop."

ai4lib++

--
Eric Morgan
University of Notre Dame