AXTut CPTAX Program

← previous TaskProgram AX Tutorial → next PfcBg
  • PRELIMINARY INSTRUCTIONS: This tutorial is a continuation of the basic AX Tutorial and assumes the user has completed that and has a basic familiarity with editing in Emergent, creating simple projects, and so on. Accordingly, these instructions do not go into the nitty-gritty of finding certain elements and doing basic operations, details covered in the basic tutorial. Users are referred there for basic click-by-click-level instructions.
  • Also, the safest place to start for this tutorial is to open the ax_tutorial_final.proj and find this AX_Tut_CPTAX page from there. The ax_tutorial_final.proj reflects the state of the ax_tutorial.proj after having completed the AX Tutorial correctly and is a reliable basis for extending it further here. Alternatively, users continuing directly from the basic AX Tutorial can continue working in their local, working version of the ax_tutorial.proj. In this case, it is advisable to save that version of the project with a new name (e.g., ax_tutorial_my_final.proj) and then work in their customized version.
  • NOTE: The original ax_tutorial.proj itself, as downloaded from the wiki or opened from the …/demo/leabra folder of Emergent, is essentially an empty project with no specific content; conversely, the ax_tutorial_cptax.proj reflects the completed state of this advanced tutorial and is therefore not suitable as a working copy either.

Overall Plan for the Program

In this segment of the tutorial our goal is to write a program that will generate the CPT-AX task (CPT stands for continuous performance task), which is the logical extension of our simple AX task to the sequential domain. Instead of A and X each being targets, the target is now an A followed by an X in sequence. Other sequences, such as A followed by Y or B followed by X, are non-target sequences. In our simplified version of this task (and in several of the actual experiments on people), we restrict the sequences to cue-probe pairs, where cues are A,B,C and probes are X,Y,Z.

  • See e.g., Braver, T.S., Barch, D.M. & Cohen, J.D. (1999). Cognition and control in schizophrenia: A computational model of dopamine and prefrontal function. Biological Psychiatry, 46, 312-328, for an application of this task and further discussion and references.

One task detail is key for generating interesting behavioral and neural data: the frequency of the A-X target sequence is set to be relatively high (typically 70%), so that it becomes a "prepotent" expectation. As a result, non-target sequences become much more interesting. In particular, for A-Y, the occurrence of an A should produce a strong expectation of getting an X on the subsequent (probe) trial, which critically will be influenced by the extent to which the A cue is well remembered. Errors on this trial type, where people might press "Target" at the Y, suggest strong maintenance of the A cue. A complementary argument applies to B-X sequences, where the occurrence of an X, typically a target, generates a habit-like predisposition towards producing a Target response, unless it is overridden. The C, Z items serve as controls, as does the B-Y sequence.

With the above in mind, here is the logic of our overall program:

  • Flip a weighted coin to determine whether we want to generate a target sequence or not. If a target comes up (~70% of the time) we just need to produce A followed by X, which is very easy to implement.
  • Dealing with Non-Target sequences is harder. We need to randomly select from the cues (A,B,C) and then the probes (X,Y,Z), while ensuring that we don't randomly pick A-X. We'll discuss a couple of different strategies for this.
  • Using a simple for loop, we will do the above cue-probe generation process multiple times to generate a relatively large set of trials that we will then run as an epoch's worth of training.
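As a preview, the plan above can be sketched in ordinary code. This is an illustrative Python stand-in for the Emergent program we are about to build in the GUI, not part of the tutorial project itself; the enum-like strings mirror the variables introduced below:

```python
import random

CUES = ["I_A", "I_B", "I_C"]    # mirrors the Input DynEnum cue values
PROBES = ["I_X", "I_Y", "I_Z"]  # mirrors the Input DynEnum probe values

def gen_trial_pair(pct_target=0.7):
    """Generate one cue-probe pair for the CPT-AX task."""
    if random.random() < pct_target:       # flip the weighted coin
        return "I_A", "I_X", "O_T"         # target: A followed by X
    # brute force: redraw until the pair is not A-X
    while True:
        cue = random.choice(CUES)
        probe = random.choice(PROBES)
        if not (cue == "I_A" and probe == "I_X"):
            return cue, probe, "O_N"

# an epoch's worth of trials via a simple for loop
trials = [gen_trial_pair() for _ in range(50)]
```

Roughly 70% of the generated pairs come out as A-X targets, and the remainder are uniformly spread over the eight non-target pairs.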

List of Variables

To start, we'll create some new variables that we'll need -- in addition to some left over from the basic AX Tutorial. This is a key first step and illustrates a basic heuristic for the typical flow of programming in Emergent -- create variables and then operate on them.

  • pct_target -- how frequent should the target sequence be? (default = 0.7; although actually a proportion, percent (pct) is a more intuitive concept)
  • rnd_number -- a random number between 0 and 1 (floating point or Real) that we'll generate to simulate the flipping of a weighted coin.
  • cue -- the identity of the cue input (A,B, or C) represented as a DynEnum of type Input, taking on values I_A, I_B, or I_C.
  • probe -- the identity of the probe input (X,Y, or Z), represented by an Input DynEnum as well. Together cue and probe will take the place of the input_unit variable from the basic AXTutorial.
  • probe_out_unit -- correct answer for the output layer on probe trials, replacing output_unit (DynEnum of type Output). Since the appropriate answer is Non-Target for all the cue-time trials we don't really need (or want) to use a variable for that.

We'll need some additional variables later, but since they will be more about internal housekeeping than basic programming logic, we'll wait to deal with those when the need arises.

Getting Started: Copy and Modify

NOTE: See Preliminary Instructions above to make sure you're starting from the appropriate state of the project reflecting this stage of the overall tutorial.

The easiest way to get started is to duplicate and modify the existing AXTaskGen program we made in the basic AX Tutorial. (Again, this is a good heuristic -- if there is a program that has several elements that you want, just copy and modify instead of starting from scratch.) To do this, select the AXTaskGen program in the left Navigator tree, and use the context menu to select Duplicate (or Ctrl+m). Rename the copy "CPTAXGen", and update the description to reflect what we're doing.

Now go to the Edit Program tab (middle Editor panel), click on the for object in the prog_code, and use the context menu to delete the whole thing (or Ctrl+d); this removes almost all of the guts of the old program. All that should remain is the ResetDataRows at the start.

We can now set up our variables as indicated above. In the LocalVars rename input_unit to cue, then duplicate it and call the new one probe. Then, duplicate that guy and rename it "rnd_number", and change the type from DynEnum to Real. Then rename output_unit to probe_out_unit. Finally, leave Name as it is. Then, in the (global) vars section above, create a new var called "pct_target", set the type to Real, and enter a value of 0.7. Enter informative descriptions for each variable in their desc fields (you can copy and paste from the descriptions above if you wish).

Flipping a Weighted Coin For the Target

The first step in our actual prog code is to implement the logical equivalent of flipping a weighted coin to decide if it is a Target sequence or not. We do this by generating a random number (rnd_number) which is uniformly distributed between 0 and 1. We then perform a check to see if this number is less than our test percent value, which will occur 70% of the time for a value of 0.7.

  • In the Tools/Functions tab (far left) there is a random() element -- drag-and-drop that to the end of the prog code section (drop on prog_code or after the ResetDataRows). In the selection window that pops up for random() scroll down and click on the ZeroOne method. In the pale blue edit window that appears at the top of the Editor select rnd_number as the result_var. You should ignore (and leave blank) the thr_no arg that appears under the Random::ZeroOne statement in prog_code - this is a low-level parameter for specifying thread numbers to optimize parallel processing and isn't relevant here. Each time this method is called as the program is run it will generate a random real value between 0 and 1 and assign it to rnd_number.
  • Next, drag the if element (Tools/Control) to the end of your program. In the cond field, enter: rnd_number < pct_target.

The code we will put under this if statement will define the Target case, and we'll add an else block for the Non-Target case in a moment. To set the target values, we just need to assign cue and probe to A and X respectively. Drag the Tools/Assign/variable= element so it ends up under the if conditional, set the result_var to cue, and enter I_A in the expr field. Repeat to make a variable= assignment for probe = I_X. Finally, repeat a third time and set probe_out_unit = O_T. There should now be three variable assignment statements under the if conditional statement.

Generating Non-Target Cases

First, add an else conditional after the if, where we will deal with the NonTarget cases.

Two simple strategies for generating Non-Target sequences (that exclude A-X) are:

  • Brute force: randomly generate a cue and a probe and check that they aren't A-X -- if they are, then repeat the process until they aren't. This is not particularly efficient, but it is effective and very easy to code, so we'll use it here.
  • Choose from a list: generate a list of all possible cue-probe combinations, remove A-X from this list, and then randomly select an item from this list. This is more efficient computationally, but a little trickier to code. Motivated users are encouraged to explore this approach as an exercise, as it demonstrates some important programming techniques.
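For motivated users, the choose-from-a-list strategy might be sketched like this (an illustrative Python stand-in, not part of the tutorial project):

```python
import itertools
import random

CUES = ["I_A", "I_B", "I_C"]
PROBES = ["I_X", "I_Y", "I_Z"]

# build all 9 cue-probe combinations, then drop the A-X target pair
non_target_pairs = [p for p in itertools.product(CUES, PROBES)
                    if p != ("I_A", "I_X")]   # 8 pairs remain

cue, probe = random.choice(non_target_pairs)  # one draw, no re-rolling
```

Each call makes exactly one random draw, whereas the brute-force method occasionally has to re-roll.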

To implement the brute-force method, we can enclose the random generation code in a "do...while" loop, which does some things (in this case, generating random cue/probe pairs) and then tests whether it should loop again (if it just generated an A-X), or not.

Drag the do while element from Tools/Control tab into your else block. Enter (exactly) cue == I_A && probe == I_X in the test field for continuing to loop ( == is the equality operator, to be distinguished from the = assignment operator; and && is the logical AND operator in the C/C++ programming language syntax).

Inside the do...while loop_code, we want to randomly generate a cue, then a probe. Drag-and-drop that Tools/Functions/random() element again, this time to the end of the do...while loop and scroll down in the window that pops up to select the IntMinMax method (TIP: clicking in the category field at the top and selecting "Int" can greatly speed up your search). Notice that new lines appear for min and max arguments under the main statement in the prog code -- these are the values that will be passed to the IntMinMax method. Click on min, and enter I_A. For the max, enter I_C + 1, because values are generated exclusive of the max arg (but inclusive of min; i.e., C/C++ syntax, where values go between 0 and n-1 instead of 1 to n). As explained earlier just ignore the thr_no argument. Finally, select cue as the result_var in the editing section at the top. Again, notice the nifty use of enum variables, which here is taking advantage of their "integer" nature to assign either 'I_A', 'I_B', or 'I_C' to cue.

Next, duplicate this method (e.g., Ctrl+m), change min = I_X, max = I_Z+1, and result_var to probe. Finally, drag Tools/Assign/variable= to be the last line in the block and set probe_out_unit = O_N as all of these trials are Non-Target cases. (TIP: Alternatively, you could also have copied the probe_out_unit = O_T assign statement from the if block and then change O_T to O_N).

To test the program at this point, drag a Tools/Functions/print variable element to the end of the prog code, and select cue, probe, and probe_out_unit for the vars to print. Then, do Init and then Run several times. You should observe that it tends to produce a predominance of A-X and O_T trial types (~70%). Run it enough times to satisfy yourself that the probe_out_unit value always corresponds to the cue-probe pair. To test it further, you could also temporarily set pct_target = 0, and Run several times more. Now you should never see any A-X sequences, but only O_N (Non-Target) trial types. When done, don't forget to change pct_target back to 0.7.

Here is what your program should look like at this point:

LocalVars (5 vars)
  cue = I_A (Input) // cue input (first of sequence of two inputs)
  probe = I_A (Input) // probe input (second of two)
  probe_out_unit = O_N (Output) // output unit (target or not)
  rnd_number = 0 (real)
  Name = (String)
ResetDataRows:  table = input_data 
rnd_number=Random::ZeroOne()
if (rnd_number < pct_target)
  cue = I_A
  probe = I_X
  probe_out_unit = O_T
else
  do ... while (cue == I_A && probe == I_X)
    cue=Random::IntMinMax(I_A, I_C+1,)
    probe=Random::IntMinMax(I_X, I_Z+1,)
  probe_out_unit = O_N
Print: cue probe probe_out_unit

Generating Descriptive Names

In the basic AX Tutorial we named our trial types based simply on the input_unit being presented to the network. Now things are more complicated, since a logically-defined trial is actually a cue-probe sequence. A trial in which an 'X' is presented is different according to whether the prior trial had been an 'A' or not. We therefore want a naming scheme that reflects this sequential dependency and fully characterizes each trial, as illustrated by the following examples:

  • AX_A_N - "AX" defines the cue-probe sequence; "A" denotes that this trial presents the "A"; and "N" means that, like all cue trials, it is a Non-Target case.
  • AX_X_T - Here, the "X" denotes that this trial presents the "X"; and "T" that it is a Target trial.

To generate names of this form, we will take advantage of some string manipulation methods available to us for instances of the String class. TIP: For accomplished programmers these methods will probably be highly familiar, since they are true workhorses of computer programming. However, for novices they may take some getting used to.

  • First, let's create three more local variables. Context-click on local vars and select Add Var. Rename the new variable cue_str and assign data_type = String.
  • Duplicate (e.g., Ctrl+m) cue_str and rename it probe_str. Duplicate again and rename the third guy probe_out_str.
  • Next, drag-and-drop Tools/Assign/variable= to the end of prog_code and select cue_str as the result var. In the expr field enter cue. Note again how we are taking advantage of the dual-personality of enum variables in that we can assign the "string" nature of enum variables directly to String variables.
  • Duplicate the assign statement you just made, set result_var to probe_str, and enter probe in the expr field. Duplicate again to assign probe_out_unit to the probe_out_str String variable.
  • Next, copy the cue_str = cue statement to the end of prog code and edit the expr field to read: cue_str.after("_"). This clips the 'I_' from the string just leaving the defining letter (e.g., A,B,C). Now do the same for probe_str (probe_str = probe_str.after("_")) and probe_out_str (probe_out_str = probe_out_str.after("_")).
  • TIP: The above string manipulation is a great chance to take advantage of the Ctrl-L "lookup" shortcut we introduced you to in the basic AX Tutorial. E.g., in the expr field for the cue_str case, just enter cue_str. and then, with the cursor immediately after the "period" click Ctrl-L. This will bring up a list of many, many string manipulation methods available to you that are worth getting familiar with.
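For reference, the same enum-to-string-and-clip maneuver looks like this in ordinary code (a Python sketch, where split("_", 1)[1] plays the role of Emergent's .after("_") method):

```python
cue, probe, probe_out_unit = "I_A", "I_X", "O_T"  # enums render as strings

# clip the "I_" / "O_" prefix, leaving just the defining letter
cue_str = cue.split("_", 1)[1]                    # "A"
probe_str = probe.split("_", 1)[1]                # "X"
probe_out_str = probe_out_unit.split("_", 1)[1]   # "T"

# assemble names in the scheme described above
cue_name = cue_str + probe_str + "_" + cue_str + "_N"                     # "AX_A_N"
probe_name = cue_str + probe_str + "_" + probe_str + "_" + probe_out_str  # "AX_X_T"
```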

We now have the pieces we'll need to generate fully descriptive names for all of our trials. However, we'll have to wait to actually generate the names because we will be writing two rows to StdInputData each time through and this means we'll have to change the value assigned to Name between the two of them.

Generating the Input Data Patterns

The next step is to use the values we have generated to actually write to the StdInputData datatable. This will be accomplished by twice adding a new row element from the Tools/Data R/W toolbox and then using two different ways of writing data each time to accommodate our different kinds of data.

  • Add a new row element from the Tools/Data R/W tab on the toolbar to the end of the program (e.g., drop on prog_code) and set data_var to input_data in the green editing area at the top of the Editor. This row will be used to define our cue-time trials.

Now it's time to actually generate the names for our cue-time trials:

  • Next, drag-and-drop Tools/Assign/variable= to the end of our program and set result_var = Name. Then, in the expr field enter: cue_str + probe_str + "_" + cue_str + "_N". Note that since all cue-time trials are Non-Target we can just enter "N" directly and don't need a variable.
  • Next, add a data=vars element (still in the Tools/Data R/W tab) and also set data_var to input_data. Then, click the set_data flag to true to write data to the data table (recall from the basic AX Tutorial that if this flag is left false, data will be read from the datatable into variables sharing the same names as the selected columns). Leave row_spec = CUR_ROW and all matches false (unchecked) as before. Finally, select Name in the var 1 field. This will write the current value of the Name variable (which should be the name of the current trial if everything is working properly) to the Name column of our StdInputData table.

Now we need to write the actual patterns for the Input and Output columns (corresponding to the same-named layers in the network), as we did earlier in the basic AX Tutorial. Recall that these columns are composed of matrix-type cells that store arrays of data. We'll proceed exactly as we did in the AX Tutorial:

  • Since there is no special widget for that in the Data R/W section of the toolbar we'll use the Tools/Functions/method() element again. Drop it to the end of our code again and in the obj field that appears in the blue editing region at the top select input_data.
  • Then, in the method field that appears after clicking it, scroll down and select SetMatrixFlatVal. As noted in the basic AX Tutorial, this method writes data to matrix-type columns using a "flat" scalar-valued indexing scheme, which is perfect for using enums as indices!
  • Edit the four argument fields opened up under SetMatrixFlatVal(,,,) after selecting each in turn as follows: Variant& val = 1; Variant& col = "Input"; int row = -1; and, int cell = cue. TIP: See the basic AX Tutorial for a key to understanding what each of these arguments means. As pointed out when we did this in the AX Tutorial, we actually created the enums in the beginning with this indexing scheme in mind so that the integer value corresponds to the appropriate index.
  • Next, we'll do the same thing for the Output column. Duplicate (e.g., Ctrl+m) the SetMatrixFlatVal(,,,) line; then, change col = "Output" and cell = O_N. Note again that since all cue-time trials are Non-Target, we can enter the literal value directly without going through a variable.
  • Now, since we're done writing for the cue-time trial, go back to the Tools/Data R/W tab and drag-and-drop the row done element to the end of our code. Again, this lets the system know that you're done writing to the current row of data, and that it can update any relevant displays.
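Conceptually, SetMatrixFlatVal() is just indexed assignment into a flattened array, with the enum's integer value serving as the index. A rough Python analogue (the unit counts and enum integer values here are illustrative assumptions, not taken from the project):

```python
# assumed enum integer values, following the indexing scheme from the basic tutorial
I_A, I_B, I_C, I_X, I_Y, I_Z = range(6)
O_N, O_T = 0, 1

input_pattern = [0.0] * 6        # flat view of one Input matrix cell
output_pattern = [0.0] * 2       # flat view of one Output matrix cell

cue = I_A
input_pattern[cue] = 1.0         # ~ SetMatrixFlatVal(1, "Input", -1, cue)
output_pattern[O_N] = 1.0        # cue-time trials are always Non-Target
```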
Figure 1: Near-final CPTAXGen program code before adding a for wrapper.

Next, we need to repeat the above process for the probe-time trial, which is nearly identical except for three crucial differences. The easiest way to proceed is to copy-and-paste the block of statements we just created, starting with the new row statement and ending with the second SetMatrixFlatVal() line. So do that. Then, we need to make three edits:

  • In the line in which we assign the value to the Name variable, change the expr field to read: cue_str + probe_str + "_" + probe_str + "_" + probe_out_str. Since probe-time trials can be either Target or Non-Target, we need to go through a variable for that component of our name this time.
  • In the line in which we are writing to the Input column (i.e., the first SetMatrixFlatVal() instance), change the fourth argument assignment to probe (i.e., cell = probe).
  • In the line in which we are writing to the Output column (the second SetMatrixFlatVal()), change the fourth argument assignment to probe_out_unit (i.e., cell = probe_out_unit).
  • Finally, add another row done element and that's it!

Your prog code section should now look something like Figure 1 at the right.


Init and Run the program while looking at the StdInputData view tab. You should see it generate sequential cue-probe trial pairs with appropriate outputs. Keep repeating to see several cue-probe trial pairs, including both Target and Non-Target outputs. You should also watch how the naming scheme corresponds with whether each trial is a cue-time (first row) or a probe-time trial, and how these correspond to the Input and Output patterns that appear in the grid view.

Generating Multiple Cue-Probe Trials

The last thing we need to do to complete our program is to loop over the existing set of code multiple times to create several cue-probe sequences per epoch for the network to train on. We'll do this by simply wrapping a for around the block of code we just made. To do this, drag a Tools/Control/for element on top of the second statement line of the prog code (rnd_number=...). Then, multi-select the rest of the program code below that, drag the whole thing onto the for statement, and select Move Into. (TIP: To multi-select, click on the rnd_number... line, then hold down the Shift key and click on the last line of the code (row done).)

If we Run our program now, we'll get 10 trials by default. But we want to make the number of trials a variable that can be changed easily by the user as desired. Context-click on the (global) vars section, select New, and name the new variable n_trials. Set its var type = Int and its int val = 50. Then, click on the for statement and replace 10 in the test expression with n_trials.

Let's turn off the console printout now since we don't really need it anymore and it doesn't actually contribute to the logic of the code. Click on the Print statement and click the OFF box ("flag") that appears in the brown editing section at the top -- this keeps it around in case you ever want to use for debugging, etc. later.

Updating the Control Panel

Select the Program Ctrl tab at the top of the CPTAXGen program Editor panel. (We've been in Edit Program.) You'll see that the (global) vars we've created are listed there. However, because we've put all the "internal" variables into the LocalVars section, they are not listed. The idea is that the variables in a control panel are things that the user might want to modify. The mouse-over tooltip displays whatever was entered in the desc field for each variable or argument. This provides a quick, easily accessible interface for users to modify these values. However, things you might not want a casual user to have to deal with can be removed by clicking OFF the CTRL_PANEL flag in the Editor panel, on a variable-by-variable basis. You can also use the CTRL_READ_ONLY flag to keep them visible, but not editable, in the control panel.

Calling from the Epoch Program

The last thing we need to do in order to actually run our simulation is to call the CPTAXGen program every epoch, so that we get a new random selection of trials each time (keeps the network from simply memorizing the particular sample we happen to have generated). To do this, go to the LeabraEpoch program in the LeabraAll_Std subgroup of programs, and click the Edit Program tab. Drag the Tools/Functions/call program() element to between the third (epoch_timer ...) and fourth (trial_mon_data ...) lines of the prog_code section. Select CPTAXGen for the target of call program.

Finally, there is one very critical step involving the LeabraEpoch program. Go to the Program Ctrl tab, and observe the set of program vars available for you to set. The first one, called data_loop_order, is set to PERMUTED by default -- this means that the trials (rows of the input data table) are presented in a shuffled random order (without replacement, so each trial only appears once). Clearly this would wreak havoc on our cue-probe sequential pairs, so we need to change it. Select SEQUENTIAL instead, which will present the trials in the exact order the rows occur in the StdInputData table.

  • TIP: There is actually a better way of dealing with this issue that involves creating grouped trials (i.e., cue-probe sequences), where you can randomize the order of the groups, but present the trials within the group in sequential order. That can be done right in the NetDataLoop, but we aren't going to explore that here.
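The difference between the two orderings can be sketched in ordinary code (illustrative Python, shuffling at the level of whole cue-probe pairs rather than individual rows):

```python
import random

# four table rows = two cue-probe sequences (names follow the scheme above)
rows = ["AX_A_N", "AX_X_T", "BY_B_N", "BY_Y_N"]

# PERMUTED shuffles individual rows, tearing cue-probe pairs apart
permuted = rows[:]
random.shuffle(permuted)

# the grouped alternative: shuffle the pairs, but keep each pair in order
pairs = [rows[i:i + 2] for i in range(0, len(rows), 2)]
random.shuffle(pairs)
grouped = [row for pair in pairs for row in pair]
```

In the grouped version every cue row is still immediately followed by its own probe row, while the order of the sequences themselves is randomized.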

Running the Network

Now we're finally ready to run the network with our CPTAXGen program. Go back to the LeabraTrain program, click Init (initializes the weights) and then Run, and see what happens!

You should see that it will run and run and never fully learn the task (it will only stop training if the error goes to zero). There is some chance that it might get there just by virtue of a lucky set of trials; if it does, try hitting Run again -- it should not stay at zero, and will keep running.

To make things go faster, create a New Graph View of the EpochOutputData datatable in its own separate frame -- you can then switch to that tab to monitor training performance, and then switch back to the Network_1 tab to see details of what is going on as it runs.

Alternatively, you could turn off the Net View and Trial Output Data grid views by clicking off their display flags (Disp 2/3D; Disp) on their respective edit panels under the Network_1 view tab. But in this case, adding the new EpochOutputData graph view is just easier. The reason these changes speed things up is that updating the network displays while it runs is very time consuming.

Since increasing the size of the Hidden layer can often help, increase the un_geom to 5 X 5 and try running the network several more times. You will see that it doesn't really help in this case. In any event, leave the Hidden layer size at 5 X 5 because it will work a little better in the next (PfcBg) segment.

It is not a surprise that the network doesn't fully learn the task, since answering correctly on 'X' probe-time trials requires a memory of the cue-time trial, and our network as currently configured really doesn't have a good way to do that. In fact, it is actually more surprising that our network is able to learn as well as it does! It turns out that under some circumstances neural networks can learn to use weight changes as a kind of primitive, highly imperfect form of working memory. Plus, since the default target of AX is so frequent, our network can get to ~90% correct based on the current trial input (cue or probe) alone. (Motivated users may want to do that calculation for themselves.)
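That back-of-the-envelope calculation can be sketched as follows (a Python sketch under one reading of the task statistics: the 8 non-target pairs are equally likely, and only probe trials are scored; treat the exact number as approximate):

```python
pct_target = 0.7
p_pair = (1 - pct_target) / 8   # each of the 8 equally likely non-target pairs

# Input-only strategy: press Target iff the current probe is an X.
p_correct = pct_target          # A-X sequences: X -> Target, correct
p_correct += 6 * p_pair         # Y/Z probes (6 pairs): not X -> Non-Target, correct
# The remaining 2 pairs (B-X, C-X) wrongly draw a Target response.
print(p_correct)                # ~0.925, on the order of the ~90% figure
```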

In the final (PfcBg) stage of this AX Tutorial, we will give our network some working memory so it can get to perfect performance!

→ next PfcBg: Adding a Prefrontal Cortex, Basal Ganglia Working Memory System