Stepping Carefully

In this post I will continue with my so-called hieroglyphics project.  This project uses a set of image data that describes handwritten characters.  The dataset is frequently used to evaluate machine-learning algorithms.  I’m using the dataset to explore a variety of modelling techniques within JMP.

In my last post I used a script to incrementally add terms to my model so that I could explore the performance of the model with increasing complexity.  But the order in which I added the terms was based on a heuristic and it wasn’t necessarily optimal.  So in this post I want to explore using stepwise regression.

Continuing the Journey

In my last post I discovered that I would probably benefit from using more data, so I have bumped the size of the data table up to 5,000 rows with a 60/20/20 split between training, validation and test.  This will increase processing time, particularly given that I have up to 784 terms to accommodate.  Each term corresponds to an image pixel.  But I know that some pixels remain turned off across all images, so I’ve written a script to remove them.  In fact it is a useful script for pre-processing any data prior to modelling, since it removes any columns that contain only a single value across all the rows:
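The script itself isn’t reproduced here, but a minimal JSL sketch of the idea would be: flag any column whose minimum equals its maximum (i.e. it never varies) and delete the lot.  This assumes numeric columns; note that Col Min/Col Max ignore missing values, so an all-missing column would need separate handling.

```jsl
// Sketch: delete any column that contains only a single value.
dt = Current Data Table();
constCols = {};
For( i = 1, i <= N Cols( dt ), i++,
	col = Column( dt, i );
	// min == max means the column never varies (numeric columns assumed)
	If( Col Min( col ) == Col Max( col ),
		Insert Into( constCols, col << Get Name )
	);
);
dt << Delete Columns( constCols );
Show( N Items( constCols ) );    // number of columns removed
```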

This reduces the number of pixel-columns from 784 to “only” 622!

Running Stepwise

So now I can use these 622 columns as model terms in the Fit Model platform, configured to use the stepwise personality.  Once configured, all I need to do is click the Go button.  But not so fast!  Stepwise is a computationally intensive procedure; I have no idea how long it will take and once it starts it won’t want to stop.
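Launching the platform from JSL might look something like this sketch.  The names pixelCols (the list of 622 surviving pixel columns), :Label and :Validation are assumptions about the data table, and the Eval() idiom for substituting a prebuilt effects list may need adjusting:

```jsl
// Sketch: launch Fit Model with the Stepwise personality.
dt = Current Data Table();
step = dt << Fit Model(
	Y( :Label ),
	Effects( Eval( pixelCols ) ),   // prebuilt list of the 622 pixel columns
	Validation( :Validation ),      // column holding the 60/20/20 split
	Personality( "Stepwise" )
);
// Deliberately not sending step << Go here; stepping will be
// driven one term at a time so that it can be monitored and cancelled.
```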

Stepping Carefully

I prefer to use a code wrapper to give me additional control over the stepwise procedure.  I want it to do a number of things:

  • Give me an indication of progress with an estimated completion time
  • Create some “space” during the execution that will allow me to request a “cancel”
  • Place a time-limit or a size-limit on the modelling

This is what the code looks like:
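The original script isn’t reproduced here, but a minimal JSL sketch with those three behaviours might look like the following.  It assumes “step” is the stepwise Fit Model object, and the names maxSteps, maxSeconds and the control window are illustrative:

```jsl
// Sketch of a stepwise wrapper with progress, cancel and limits.
::doAbort = 0;
maxSteps = 100;
maxSeconds = 8 * 60 * 60;          // optional time limit
t0 = Tick Seconds();
// A small window gives the user somewhere to click to request a cancel
New Window( "Stepwise control",
	Button Box( "Cancel", ::doAbort = 1 )
);
For( i = 1, i <= maxSteps, i++,
	step << Step;                   // add the single best term
	Wait( 0 );                      // yield so the Cancel click can be processed
	elapsed = Tick Seconds() - t0;
	Write( "Step ", i, " of ", maxSteps,
		"; estimated time remaining ",
		Round( elapsed / i * (maxSteps - i) ), " s\!N" );
	If( ::doAbort | elapsed > maxSeconds, Break() );
);
```

Because the loop steps one term at a time, aborting simply stops adding terms; everything accumulated in the step history so far is retained.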

The script is set up to add up to 100 terms to the model (rather than just clicking Go).  I ran the code overnight but it still hadn’t completed – I aborted the execution and captured the results for 63 steps (the code uses a global variable ::doAbort that allows me to quit the stepwise regression prematurely whilst retaining all of the results that have been collected).

The Results

One of the great things about JMP is that whenever an output report contains tabulated results, you can right-click and save them as a data table.  Here is part of my step history:


The plot below shows how the AICc and BIC statistics evolve with the number of steps:


Qualitatively the graphs give an indication of when the statistics plateau, and hence the point at which there is no benefit from adding additional terms.  The graphs tell me that it was reasonable for me to abort my script when I did.  Phew!

If I apply a rule-of-thumb that a consistent step-change in the statistic of less than 2 is not significant then I can quantify the plateau point: for AICc it is at 44 steps and for BIC it is at 29 steps.
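That rule can be automated against the saved step-history table.  The sketch below assumes the statistic lives in a column called :AICc (swap in :BIC for the other statistic); it walks back from the final step to find the last change of 2 or more, so that every step beyond the reported point changes “consistently” by less than 2:

```jsl
// Sketch: locate the plateau step in the saved step history.
dt = Current Data Table();
threshold = 2;                       // rule of thumb: changes under 2 are noise
plateau = 1;
For( i = N Rows( dt ), i >= 2, i--,
	If( Abs( :AICc[i] - :AICc[i - 1] ) >= threshold,
		plateau = i;                 // last step with a meaningful change
		Break();
	)
);
Show( plateau );
```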

Next I want to take a look at the performance of the Validation R-Square.  Since I also have the training R-Square it will be useful to overlay the graphs:


As you would expect the training R-Square just wants to increase as model complexity increases whereas the validation R-Square plateaus and then declines.

Misclassification Rates

Next I would like to look at the performance graphs for misclassification rates.  This information is not contained in the step history, so I need to use a script to retrieve the information.
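The retrieval script isn’t shown, but the shape of it might be: step through the models again, fit each one with Run Model, and pull the misclassification rates out of the resulting nominal logistic report.  The “Fit Details” path and row ordering below are assumptions and will likely need adjusting to the actual report tree in your JMP version:

```jsl
// Sketch: harvest misclassification rates for each model size.
// "step" is the stepwise object; << Run Model fits the currently
// selected terms as a nominal logistic model.
For( k = 1, k <= 63, k++,
	step << Step;                          // advance to a k-term model
	fit = step << Run Model;               // fit the current model
	rpt = fit << Report;
	vals = rpt["Fit Details"][Number Col Box( 1 )] << Get;
	// Misclassification rate assumed to be the last entry in the column
	Write( k, ", ", vals[N Items( vals )], "\!N" );
	fit << Close Window;
);
```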

In about the time it takes to boil a kettle, this script gives me a data table of results.  Graphing training and validation misclassification rates versus number of terms helps me assess the level of performance I can expect from the model and the number of terms that I need:


Using a model of 20 terms I can expect to achieve a misclassification rate of just 3%.
