How Pat works

Pat maps words to meaning by storing only certainties, never probabilities. The 885+ competitors in NLP that rely on advanced statistics can only approximate meaning. You can see this in today's digital assistants, which sometimes get our requests right and sometimes wrong: they don't really understand us unless we use recognised commands, and that is exactly where machines need NLU. Counting words, tracking word order, or even parsing by syntax yields probabilities, which is guesswork, not meaning.

Pat, the meaning matcher, parses straight to semantics by combining the linguistic Role and Reference Grammar (RRG) model with a proprietary neural network. Pat matches every word to its correct meaning based on the meanings of the other words in the sentence or story, just as a three-year-old does, without guesswork. As a result, language is broken down by meaning, and only certainties are stored, as a human stores them, not probabilities.
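To make the idea concrete, here is a minimal sketch of certainty-only meaning matching. Everything in it (the sense inventory, the compatibility table, the `resolve` function) is a hypothetical illustration, not Pat's implementation: each word keeps only the readings the other words permit, and a reading is stored only when it is the sole survivor.

```python
# A toy illustration of certainty-only meaning matching (not Pat's code).
# Each ambiguous word carries a set of candidate senses; a sense is stored
# only when the other words in the sentence leave exactly one possibility.

SENSES = {
    "bass":   {"fish", "low_pitched_instrument"},
    "guitar": {"stringed_instrument"},
}

# Hypothetical compatibility facts standing in for meaning-to-meaning matching.
COMPATIBLE = {("low_pitched_instrument", "stringed_instrument")}

def resolve(word_a: str, word_b: str) -> dict:
    """Keep only sense pairs that fit together; accept nothing but certainty."""
    survivors = [
        (a, b)
        for a in SENSES[word_a]
        for b in SENSES[word_b]
        if (a, b) in COMPATIBLE or (b, a) in COMPATIBLE
    ]
    if len(survivors) != 1:
        raise ValueError("still ambiguous: store nothing, not a probability")
    a, b = survivors[0]
    return {word_a: a, word_b: b}

print(resolve("bass", "guitar"))
# {'bass': 'low_pitched_instrument', 'guitar': 'stringed_instrument'}
```

The design point is the `len(survivors) != 1` check: a statistical system would rank the alternatives and pick the likeliest, whereas a certainty-only system stores a meaning or nothing at all.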

Examples:

Hey Siri: "Call Beth… No, John."
Of today's A.I. agents, 50% will call Beth and the other 50% will say: "Sorry, I didn't catch that."
Pat will “Call John” 100% of the time, because it understands meaning.
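One way to see why meaning beats guesswork here: if "No, <name>" is understood as a correction of the callee, rather than as a fresh, noisy command, the right answer follows with certainty. The sketch below is a deliberately crude, hypothetical illustration of that behaviour, not how Pat processes the utterance.

```python
# A toy sketch of treating "No, <name>" as a correction (not Pat's code).
def interpret(utterance: str) -> str:
    """Return the final callee after applying any 'No, <name>' correction."""
    callee = None
    for clause in utterance.replace("...", ".").split("."):
        clause = clause.strip()
        if clause.lower().startswith("call "):
            callee = clause[5:].strip()   # initial command sets the callee
        elif clause.lower().startswith("no, "):
            callee = clause[4:].strip()   # correction replaces it outright
    return callee

print(interpret("Call Beth... No, John."))  # John, every time
```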

“The city councilmen refused the demonstrators a permit because they advocated violence. Who advocated violence?”
Answer: The demonstrators.
This is the famous Winograd Schema Challenge.
Today there is no reliable solution, because answering correctly requires understanding what the words mean in context. Pat has solved it with simple meaning associations.
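As a rough illustration of what "meaning associations" can do for this schema, consider the hypothetical table below: background knowledge associates demonstrators, not councilmen, with advocating violence, so "they" has exactly one certain referent. This is a toy sketch, not Pat's solver.

```python
# Toy pronoun resolution via meaning associations (not Pat's solver).
# Hypothetical background knowledge about what each referent is known to do.
ASSOCIATIONS = {
    "councilmen":    {"grant_permit", "refuse_permit", "fear_violence"},
    "demonstrators": {"hold_protest", "advocate_violence"},
}

def resolve_pronoun(predicate: str) -> str:
    """Bind 'they' to the only candidate whose meaning fits the predicate."""
    matches = [who for who, acts in ASSOCIATIONS.items() if predicate in acts]
    if len(matches) != 1:
        raise ValueError("no certain referent: store nothing")
    return matches[0]

print(resolve_pronoun("advocate_violence"))  # demonstrators
```

Swapping the predicate to fear_violence, as in the schema's twin sentence ("because they feared violence"), flips the answer to the councilmen, which is exactly what makes the challenge a test of meaning rather than of word statistics.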

Documents: