Posts tagged ‘AI’

Cognitive Context

Eliza brings a couple of things to the table that other systems don’t – mostly that it provides a way to quickly load some structure into a system, which then allows running test data against those structures. It’s often a way to short-circuit starting from zero knowledge (a newborn infant) and to bootstrap yourself up to a three-year-old. A simple example is extracting sentences from a paragraph. It can be used as a pre-parser, a post-parser or as a way of rephrasing data. Rephrasing is a handy tool for testing validity.
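
To make that concrete, here’s a minimal sketch (in Python) of an Eliza-style rule engine acting as a pre-parser and rephraser. The rules and helper names are illustrative assumptions on my part, not Weizenbaum’s actual script format:

```python
import re

# Illustrative Eliza-style rules: each pattern pairs with a rephrasing
# template. These rules are examples, not an actual Eliza script.
RULES = [
    (re.compile(r"(.+) is (.+)", re.I), r"would you say \2 describes \1?"),
    (re.compile(r"I need (.+)", re.I),  r"why do you need \1?"),
]

def extract_sentences(paragraph):
    """Naive pre-parser: split a paragraph into sentences."""
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", paragraph) if s.strip()]

def rephrase(sentence):
    """Apply the first matching rule. Feeding the rephrased statement
    back into a later stage lets it test whether the transformed
    statement still holds (validity)."""
    for pattern, template in RULES:
        match = pattern.fullmatch(sentence.rstrip("."))
        if match:
            return match.expand(template)
    return None

for s in extract_sentences("The sky is blue. I need more data."):
    print(s, "->", rephrase(s))
```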

It provides a vehicle for asking questions, but also an approach to determining the relevance of information within the available context. The term "available context" is used because it’s often interesting to limit the information available to cognitive processes.

You often ask questions about statements you encounter: Who, What, Where, When, Why
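
One way to carry those question slots around is as simple data – the function below, and the upstream parser it assumes, are purely hypothetical:

```python
QUESTION_SLOTS = ("who", "what", "where", "when", "why")

def open_questions(statement, known=None):
    """Return the question slots not yet answered for a statement.
    'known' maps a slot to an answer assumed to come from some
    upstream parser (not shown here)."""
    known = known or {}
    return [q for q in QUESTION_SLOTS if q not in known]

# e.g. an upstream stage has already found who and where:
print(open_questions("Ada met Babbage in London",
                     known={"who": "Ada, Babbage", "where": "London"}))
# -> ['what', 'when', 'why']
```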

You’ll also have an operational mode that you’ll switch between: operational modes help to define how a cognitive process should approach the problem.

In the human model – think along the lines of how your state of mind changes based on the situational aspects of the encounter.  The context of the situation can be external, reflective or constructed.

External contexts are where we are expected to respond – maybe not to all input – but to some.  Often these situations are where an action or consensus is required.
Reflective contexts are where information is absorbed and processed – generally to bring out understanding or knowledge but also when a pattern is reverse fit – not proving a fact but re-assimilating input so that it correlates.
Constructed contexts are the "what if" situations and problem solving. Similar to the reflective context, but more about adjusting previous input to test its fitness to something new while attempting to maintain its validity against other knowledge.

You’ll often start in a reflective context as you assimilate information and then move into a constructed context to maximise knowledge domains.  Then you’ll often edge into the external context – while running reflective contexts in the background.  Periodically you’ll create constructed contexts to boot-strap knowledge domains and to learn from how knowledge domains are created (which in turn will tune how the reflective domains obtain information).
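
One plausible reading of that cycle is a small state machine – the event names that trigger each switch are my own illustrative assumptions:

```python
from enum import Enum, auto

class Context(Enum):
    REFLECTIVE = auto()   # absorbing and correlating input
    CONSTRUCTED = auto()  # 'what if' / problem solving
    EXTERNAL = auto()     # a response may be expected

# Assumed triggers for the transitions described above.
TRANSITIONS = {
    (Context.REFLECTIVE, "assimilated"):   Context.CONSTRUCTED,
    (Context.CONSTRUCTED, "domain_built"): Context.EXTERNAL,
    (Context.EXTERNAL, "idle"):            Context.REFLECTIVE,
}

def step(current, event):
    """Switch context on a recognised event, otherwise stay put –
    staying put is the 'no output required' case."""
    return TRANSITIONS.get((current, event), current)

ctx = Context.REFLECTIVE
for event in ["assimilated", "domain_built", "noise", "idle"]:
    ctx = step(ctx, event)
    print(event, "->", ctx.name)
```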

Essentially this is a lot of talk for saying that you don’t always need to provide an output.  🙂

Now, I mentioned at the beginning that it’s often interesting to limit the information available to an available context – often it’s not only interesting but also important. The available context is the set of prior knowledge, plus the rules (or the approach) for applying relationships between that information and its surrounding knowledge.

If all knowledge is available to an available context, and the same approach is used for processing that information, then it’s hard for a system to determine the relevance or importance of the facts to extract from data. In essence the system can’t see the wood for the trees.
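
A toy illustration of that limiting – the tag-based filter here is just one assumed mechanism for carving out an available context:

```python
def available_context(knowledge, tags):
    """Restrict the full knowledge set to facts sharing a tag with
    the current situation; relevance is then judged only inside
    this subset rather than against everything known."""
    return {fact for fact, fact_tags in knowledge.items()
            if fact_tags & tags}

knowledge = {
    "water boils at 100C":        {"physics", "cooking"},
    "a fugue interleaves voices": {"music"},
    "salt lowers freezing point": {"physics", "cooking"},
}

# With everything available, the 'music' fact is noise when cooking;
# limiting the context keeps relevance tractable.
print(available_context(knowledge, {"cooking"}))
```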

Think about how you tackle a problem you encounter – you start with one approach based on your experience (so you’re selecting and limiting the tools you’re going to apply to the situation), and based on how the interaction with the situation goes, you’ll adjust. Sometimes you’ll find that you adjust to something very basic (keep it simple, stupid; one step at a time) – at other times you’ll employ more complex toolsets.

The Eliza approach can be used not just as a processing engine, but also as a way of allowing cognitive systems to switch or activate the contexts I mentioned earlier. It’s also a handy pre-parser for input into SOAR.

One of the reasons for these recent posts is a visit to zbr’s site and reading about his interest in NLP and cognition. I stumbled over his site when looking to understand more about POHMELFS, Elliptics and his DST implementation. I’ve been looking for a while for a parallel, distributed storage mechanism that is fast and supports a decent approach to versioning, for an NLP & MT approach. Distribution and parallelism are required because I implement a virtualised agent approach, which allows me to run modified instances of knowledge domains and/or rules to create dynamic contexts. Versioning is important because it allows working with information from earlier time periods and replaying the formation of rules and assumptions, and it greatly helps to roll back processing should the current decision tree appear fruitless. In human cognitive terms these act as subconscious processing domains.
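
The versioning requirement can be sketched independently of the storage layer. This toy says nothing about POHMELFS or Elliptics themselves – it only shows the snapshot and roll-back behaviour I’m after:

```python
import copy

class VersionedDomain:
    """Toy versioned knowledge domain: snapshot before exploring a
    decision branch, roll back if the branch proves fruitless."""

    def __init__(self):
        self.facts = {}
        self.history = []   # list of snapshots; index = version id

    def snapshot(self):
        """Record the current state and return its version id."""
        self.history.append(copy.deepcopy(self.facts))
        return len(self.history) - 1

    def rollback(self, version):
        """Restore an earlier state, discarding later versions."""
        self.facts = copy.deepcopy(self.history[version])
        del self.history[version + 1:]

domain = VersionedDomain()
domain.facts["rule"] = "A implies B"
v0 = domain.snapshot()
domain.facts["rule"] = "A implies C"   # speculative branch
domain.rollback(v0)                    # the branch proved fruitless
print(domain.facts)                    # {'rule': 'A implies B'}
```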

Saturday 20th June, 2009 at 3:08 pm

Cognition (expanded)

There are several underlying problems with cognition which are different from what most expect.

The primary issue is one of perception: too much emphasis is attributed to the human senses (primarily sight and sound), which – as I’ve mentioned before – are just inputs. As you’ll know from physics, simple patterns are often repeated across many different fields – it’s unlikely that cognitive processes are any different when dealing with sound, sight and thought.

The next issue is that many fall foul of attempting to describe the system in terms they can understand – a natural approach, but essentially it boils down to pushing grammar parsers and hand-built lexers with too much forward weighting to identify external grammar (essentially pre-weighting the lexers with formal grammar). It’s an approach that can produce interesting results, but it isn’t cognition and fails as an end game for achieving it. Essentially this is the approach used in current machine translation processes in its various forms.

The key fundamental issue is much simpler and relates to pattern, reduction & relationship. It’s an area that saw some activity a while ago in various forms (cellular networks, etc.) but fell by the wayside, generally due to poor conceptual reference frameworks and an over-emphasis on modelling approaches borrowed from nature (neural networks).

Now comes the time of definitions – a vehicle to ensure we’re on the same page 🙂

Pattern:
Cognitive processes thrive on them – patterns are one of the main drivers behind how a cognitive process perceives, processes and responds to information. There’s a constant search for similarities between what is perceived and what is known. It’s a fuzzy matching system that is rewarded – in the sense that it promotes change or adaptation – as much by differences as by similarities. When thinking about similarities, a handy shorthand is to think of something as being true or false. Don’t confuse true/false with the general definitions of those terms – it’s more about a sense of confidence. If something has a high confidence of being valid then it is true. The threshold of confidence is something that evolves and adapts within the cognition over time (essentially as a result of experience).
The development of patterns is both external (due to an external perception or input) and internal.  To avoid turning this comment into something massive (and boring you 🙂 ) – think along the lines of the human cognitive process and the subconscious or dreams.
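
As a rough sketch of Pattern as fuzzy matching with an adaptive confidence threshold – the similarity measure and the adaptation rule are stand-ins, not a claim about how cognition actually tunes itself:

```python
from difflib import SequenceMatcher

class PatternMatcher:
    """Fuzzy matcher: anything scoring above the threshold counts as
    'true' (a confident match); feedback from experience moves the
    threshold. Both choices here are illustrative assumptions."""

    def __init__(self, threshold=0.7):
        self.threshold = threshold

    def confidence(self, perceived, known):
        return SequenceMatcher(None, perceived, known).ratio()

    def matches(self, perceived, known, feedback=None):
        score = self.confidence(perceived, known)
        result = score >= self.threshold
        if feedback is True and not result:
            self.threshold = score          # accept scores this low in future
        elif feedback is False and result:
            self.threshold = score + 0.01   # demand more than this in future
        return result

m = PatternMatcher()
print(m.matches("colour", "color"))                     # fuzzy match: True
print(m.matches("trumpet", "crumpet", feedback=False))  # raises the bar
```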

Reduction:
Reduction happens at several key stages –  essentially it’s when a domain of experience breaches a threshold.  It’s a way of reducing the processing required to a more automatic response.  Think along the lines of short-circuit expressions.  It’s a fundamental part of the cognitive process.  From a human cognitive perspective you have probably seen it in your climbing and in your learning of the trumpet.  We often express it as “having the knack” or “getting the hang” of something.
It’s important for two reasons: a) it means the process has gained knowledge about a domain; b) it allows the cognitive process to explore that domain further. While Reduction is a desirable end game, it is not The End from a cognitive-process perspective. The meta-information for this node of Reduction combines again and again with Pattern and Relationship, allowing the process to reuse not just the knowledge itself but, more importantly, the lessons learned while achieving reduction.
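
A short-circuit sketch of Reduction – the repetition count and the caching mechanism are illustrative assumptions:

```python
from collections import Counter

class Skill:
    """Once a situation has been handled deliberately often enough,
    the response is cached and becomes automatic – 'getting the
    hang' of it. The threshold value is arbitrary."""

    REDUCTION_THRESHOLD = 5

    def __init__(self, deliberate_fn):
        self.deliberate_fn = deliberate_fn
        self.reps = Counter()
        self.automatic = {}

    def perform(self, situation):
        if situation in self.automatic:         # reduced: short-circuit
            return self.automatic[situation]
        result = self.deliberate_fn(situation)  # full deliberate processing
        self.reps[situation] += 1
        if self.reps[situation] >= self.REDUCTION_THRESHOLD:
            self.automatic[situation] = result
        return result

climb = Skill(lambda hold: f"plan grip for {hold}")
for _ in range(5):
    climb.perform("crimp")
print("crimp" in climb.automatic)   # True: the response is now automatic
```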

Relationship:
Relationship is really a meta-process for drawing together apparently unrelated information into something cohesive that is likely either to help with identifying patterns or to bring about Reduction. Relationship at first looks very similar to Pattern, but differs in its ability to ask itself "what if" and in being able to adjust things (facts, perception, knowledge, Pattern, Reduction and versions of these [versions are actually quite important]) to suit the avenue that is being explored. In human cognitive terms, think of Relationship as the subconscious, dreams, or the unfolding of events in thought. The unfolding of events is an example of versions. Essentially Relationship is a simulation that allows the testing of something.
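
And a sketch of Relationship as a "what if" simulation over a throwaway version of the knowledge – the consistency check is an assumed placeholder for a real validity test:

```python
import copy

def what_if(knowledge, adjustment, consistent):
    """Clone the knowledge (a disposable version), apply a
    speculative adjustment, and keep the result only if it still
    coheres with everything else."""
    trial = copy.deepcopy(knowledge)
    adjustment(trial)
    return trial if consistent(trial) else None

facts = {"birds fly": True, "penguins are birds": True}
result = what_if(
    facts,
    lambda k: k.update({"penguins fly": False}),          # speculative fact
    lambda k: not (k.get("penguins fly") and not k["birds fly"]),
)
print("kept" if result else "discarded")   # kept: the version is coherent
```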

Saturday 20th June, 2009 at 3:02 pm

