AXTut Programs

← previous InputData | AX Tutorial | → next OutputData

NOTE: Within emergent, you can return to this document by selecting docs/Programs in the left browser panel.

Programs for Controlling the Simulation

Return to the LeabraWizard_0 wizard panel, and in the same Standard Everything section, click the Standard Programs link (or press: Standard Programs). This pops up a confirmation dialog explaining that it will create a new set of standard programs based on the project type (this project is a LeabraProject). Hit OK.

This created the LeabraAll_Std group of standard Leabra programs used to train the network. These programs organize the presentation of input patterns to the network into a hierarchy of time scales (a conceptual sketch of this loop structure follows below):

  • LeabraBatch -- iterates over multiple training runs (think: simulated network "subjects" participating in an experiment), each starting from its own random weight initialization (we won't use this initially).
  • LeabraTrain -- a complete training run of the network from random initial weights to final correct performance, by iterating over multiple "epochs"
  • LeabraEpoch -- one full pass through all of the different task input patterns, by iterating over multiple "trials"
  • LeabraTrial -- processes one input pattern over four "quarters": the first three make up the minus phase, in which the input stimulus is presented and the network settles into its own best guess at the correct response, and the fourth is the plus phase, in which the correct answer is presented so that the network can learn to perform the task correctly.
  • LeabraQuarter -- multiple updates of neural unit activations to process a given input/output pattern, iterated over multiple "cycles"
  • LeabraCycle -- a single cycle of updating the neural unit activation states (roughly 5-10 msec of simulated real time)

There are also some other supporting programs that we'll discuss later.
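
To make the nesting of these time scales concrete, here is a minimal conceptual sketch in Python. This is not emergent code, and none of these function names exist in the simulator; it only illustrates how each program's loop wraps the one below it, from the batch level down to a single cycle.

  # Conceptual sketch only -- function names here are illustrative, not part of emergent.
  def leabra_cycle(net):
      """One update of unit activation states (~5-10 msec of simulated time)."""
      net["cycles"] += 1

  def leabra_quarter(net, n_cycles=25):
      """Many cycles of settling on the current input/output pattern."""
      for _ in range(n_cycles):
          leabra_cycle(net)

  def leabra_trial(net, pattern):
      """One input pattern: three minus-phase quarters, then one plus-phase quarter."""
      net["input"] = pattern
      for _ in range(3):       # minus phase: the network's own best guess
          leabra_quarter(net)
      leabra_quarter(net)      # plus phase: the correct answer is presented
      net["trials"] += 1

  def leabra_epoch(net, patterns):
      """One full pass through all of the task input patterns."""
      for pattern in patterns:
          leabra_trial(net, pattern)
      net["epochs"] += 1

  def leabra_train(net, patterns, max_epochs=100):
      """A complete training run from initialized weights toward correct performance."""
      for _ in range(max_epochs):
          leabra_epoch(net, patterns)

  def leabra_batch(patterns, n_runs=10):
      """Multiple training runs -- simulated "subjects", each with fresh random weights."""
      for _ in range(n_runs):
          net = {"cycles": 0, "trials": 0, "epochs": 0, "input": None}
          leabra_train(net, patterns)

The actual programs do much more bookkeeping than this, but the nesting of loops is the key idea.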

Running the Simulation

First, make sure you're viewing the Network_1 network in the right side view panel. Then click on the LeabraTrain program, which can be selected in the Navigator panel under programs → LeabraAll_Std → LeabraTrain. At the bottom of the LeabraTrain program window that appears in the middle Editor panel, press the Init button, followed by the Run button (these links will initialize and run the program for you in emergent!).
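
In terms of the conceptual sketch above (again, hypothetical names, not emergent code), Init roughly corresponds to re-initializing the network state, and Run to entering the training loop:

  # Continuing the hypothetical sketch from above (not emergent code):
  net = {"cycles": 0, "trials": 0, "epochs": 0, "input": None}   # "Init": fresh state
  leabra_train(net, patterns=["pattern_0", "pattern_1"])          # "Run": start training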

You should then see the network processing each of the input patterns for the task multiple times, as the program iterates over epochs of trials of quarters of cycles of processing. Depending on your hardware, this may whiz by in quite a blur. With the default initial settings in the network, it may take a long time to learn (we'll adjust these soon) -- hit the Stop button if you need to.

You can see more clearly what it is doing by watching as the network performs one quarter of settling at a time. To do this, hit the Stop button, and then Init again. Then hit the button named Quarter:1 in the collection of gray buttons that starts with Step:1. As you step through the training program, you should observe the network activate an output unit in the minus phase (look for MINUS_PHASE in the text displayed in the Network_1 view), followed by the right answer coming on in the PLUS_PHASE.
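
The phase labels you see in the view follow directly from the quarter structure described above: within each trial, the first three quarters are the minus phase and the fourth is the plus phase. As a tiny, purely illustrative mapping (the label strings just mirror what the view displays; this is not emergent code):

  def phase_of_quarter(quarter):
      """Map a 1-based quarter index within a trial to the displayed phase label."""
      return "MINUS_PHASE" if quarter <= 3 else "PLUS_PHASE"

  for q in (1, 2, 3, 4):
      print(q, phase_of_quarter(q))   # 1-3 -> MINUS_PHASE, 4 -> PLUS_PHASE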

We'll learn a lot more about how programs work when we write one from scratch to generate our input data for training the network. If you're adventurous, you can click on the programs and hit the Edit Program button near the top of the program editor panel to see the underlying "guts" that make the programs do what they do. Everything that happens in running the simulation is explicitly listed out, and can be modified in any way that you might want -- this is very powerful, and probably a bit dangerous too. :) We recommend that you don't modify the programs at this point.

The next step is to more clearly monitor the performance of the network as it learns, by recording OutputData from the network.

→ next OutputData or OutputData in emergent