Fuzzy Semantic Relations in the ASTRID system

Published: 2019-01-20 in Science

A recent breakthrough in the ASTRID project makes it possible for the system to recognize fuzzy and esoteric semantic relations in training data.

 

A new discovery in the semantic model that powers the ASTRID system has made it possible for the system to recognize fuzzily defined predicates. These predicates describe the (logical) relations between the concepts that make up ASTRID's internal world-model, and they make it possible for the system to reason about the world in structural, causal and temporal contexts.
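ASTRID's internal representation is not described in the article. Purely as an illustration, assuming a predicate can be modelled as a typed relation between two concepts, the idea could be sketched like this (the concept and relation names below are invented):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Predicate:
    """Illustrative relation between two concepts (not ASTRID's actual format)."""
    subject: str    # a concept in the world-model
    relation: str   # the predicate name
    obj: str        # another concept

# Hypothetical examples of structural, causal and temporal relations.
structural = Predicate("wheel", "is_part_of", "car")
causal     = Predicate("rain", "causes", "wet_street")
temporal   = Predicate("breakfast", "happens_before", "lunch")

for p in (structural, causal, temporal):
    print(f"{p.subject} --{p.relation}--> {p.obj}")
```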

The original basis for this kind of reasoning is called 'predicate logic'. The predicates are traditionally logical (hence the name), and therefore always true or false. This is the only way to do 'reasoning' in a symbolic rule-based system that lacks common-sense knowledge.
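As a concrete illustration of that all-or-nothing behaviour, here is a minimal sketch (not code from ASTRID) of a symbolic rule-based system: every fact either holds or it does not, and every query evaluates to exactly True or False.

```python
# A strictly boolean knowledge base: a fact either holds or it does not.
facts = {("socrates", "is_a", "human")}
rules = [
    # If ?x is_a human, then ?x is mortal -- always, with no exceptions.
    (("?x", "is_a", "human"), ("?x", "is", "mortal")),
]

def derive(facts, rules):
    """Apply each rule to every matching fact; answers are strictly True/False."""
    derived = set(facts)
    for (s, r, o), (cs, cr, co) in rules:
        for (fs, fr, fo) in facts:
            if r == fr and o == fo:    # the rule's condition matches the fact
                bindings = {s: fs}     # bind the variable '?x'
                derived.add((bindings.get(cs, cs), cr, co))
    return derived

kb = derive(facts, rules)
print(("socrates", "is", "mortal") in kb)   # True -- no notion of 'mostly true'
```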

Traditionally, predicates are also pre-defined in systems that use predicate logic. ASTRID was already capable of finding predicates in its training data without pre-defined lists, and now those predicates no longer even have to be 'logically' true or false.

 

Humans understand that reality is not inherently true or false. Obviously, there are things that are either true or false, but many things are not black and white like that. Some things are mostly true, but sometimes false. Other things are 'somewhat' true in a certain context, but also somewhat false. Traditional predicate logic cannot deal with concepts like 'sometimes', 'mostly', 'somewhat', 'seldom', and other fuzzy qualifiers like that.
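One way to see the gap is to imagine attaching a degree of truth to such qualifiers: classical two-valued logic only offers the endpoints 0 and 1. The mapping below is a hypothetical illustration, not part of ASTRID.

```python
# Hypothetical mapping of fuzzy qualifiers to truth degrees in [0, 1].
# Classical predicate logic only permits the two endpoints 0.0 and 1.0.
DEGREE = {
    "always":   1.0,
    "mostly":   0.8,
    "somewhat": 0.5,
    "seldom":   0.2,
    "never":    0.0,
}

statement = ("birds", "can_fly")
truth = DEGREE["mostly"]   # most birds fly, but penguins and ostriches do not
print(f"{statement}: true to degree {truth}")
```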

In the past, the solution for this was called 'reasoning with uncertainty'. Systems and models like 'fuzzy logic' and 'Bayesian logic' were invented to handle it. The problem with these approaches is that the 'fuzzy' part doesn't have any real meaning: it is just a calculated value that doesn't relate to anything in the world. The ASTRID system, on the other hand, can now actually learn, for example, that 'seldom' means 'once a month' in one context, and 'once every century' in another.
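The article does not reveal how ASTRID represents this internally. Assuming, purely for illustration, that the system grounds a fuzzy quantifier such as 'seldom' in a concrete frequency that depends on the concept it is attached to, the idea could be sketched like this (the context names and frequencies are invented, except for the two readings quoted above):

```python
# Hypothetical grounding of the fuzzy quantifier 'seldom' per context.
SELDOM_FREQUENCY = {
    "doctor_visit":      "about once a month",
    "volcanic_eruption": "about once every century",
}

def interpret(quantifier: str, context: str) -> str:
    """Return a context-dependent reading of a fuzzy quantifier."""
    if quantifier == "seldom":
        return SELDOM_FREQUENCY.get(context, "rarely (frequency unknown)")
    return "unknown quantifier"

print(interpret("seldom", "doctor_visit"))       # about once a month
print(interpret("seldom", "volcanic_eruption"))  # about once every century
```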

 

NOTE: This news article is presented here for historical perspective only.

This article is more than two years old. Information in it may therefore have changed, become incomplete, or even been invalidated since its publication date. Included weblinks (if present in this article) may point to pages that no longer exist, have been moved, or now contain unrelated or insufficient information. No expectations or conclusions should be derived from this article or any forward-looking statements therein.
