Sample final year project plan

Here's the project plan from the final phases of Steve's final year project. The idea isn't so much to illustrate a "good" plan as to give one example of the kind of thing that may be useful. It certainly shows that Steve was thinking quite hard, very early in the project, about what needed doing and in what order.

The project involved connecting up a Mattel Power Glove, a data glove produced by Mattel for games. It was known that certain sequences, when sent to the glove, could switch it into a higher-resolution mode that would be useful for doing real VR experiments with it. Steve's project involved finding out how to switch the glove into the high-resolution mode, writing a driver for it, building a gesture recognition library based on artificial neural networks that would run on our parallel Transputer racks, and producing some demonstrations of it all working.


Steve's plan

The plan for the year. This was written early in the project.

1 Disclaimer!

I'm finding it very difficult to produce a coherent diagram to show my plans over the next 5 weeks, since many of the jobs that need doing are small, and need not occur in any given order.  

This text attempts to describe what I intend to do.  

2 Gesture recognition

I intend to have gesture recognition of some kind working within two weeks. 

My initial attempt at this problem will be to use the quasi-recursive network idea, that is, using one (large) network with a certain amount of feedback to recognise gestures as a function of previously recognised postures and time. If this attempt does not look promising within 4 days, it will be abandoned, and a method using a single large network and a relative-position buffer will be investigated for a maximum of two days. If this seems feasible (I think it might be too slow), I will implement this; if not, I will provide some heuristics to recognise gestures. 
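The quasi-recursive idea above — one network whose previous output is fed back in as extra input, so a gesture depends on earlier recognised postures as well as the current frame — can be sketched roughly as follows. This is only an illustrative sketch, not Steve's actual code: the network sizes, the sigmoid activation, and all the names here are assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RecurrentGestureNet:
    """One network with feedback: the previous gesture activations are
    concatenated onto the current posture features, so recognition is a
    function of earlier postures and time (illustrative sketch only)."""

    def __init__(self, n_inputs, n_hidden, n_gestures, seed=0):
        rng = np.random.default_rng(seed)
        # Input layer sees current posture features plus fed-back output.
        self.w_in = rng.normal(0.0, 0.1, (n_hidden, n_inputs + n_gestures))
        self.w_out = rng.normal(0.0, 0.1, (n_gestures, n_hidden))
        self.prev = np.zeros(n_gestures)  # feedback buffer

    def step(self, posture):
        x = np.concatenate([posture, self.prev])
        hidden = sigmoid(self.w_in @ x)
        out = sigmoid(self.w_out @ hidden)
        self.prev = out  # feed this frame's output back on the next step
        return out

# Run a few dummy posture frames through the net.
net = RecurrentGestureNet(n_inputs=5, n_hidden=8, n_gestures=2)
for t in range(3):
    scores = net.step(np.ones(5))
print(scores.shape)  # prints (2,)
```

On the Transputer racks such a network would presumably be distributed across processors, but that mapping is outside the scope of this sketch.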

The system will recognise the gestures 'grasp' and 'drop'. 

3 Tool Development

This section is to take 1 week and will be broken down as follows: 
  • Enhance gbd compiler to use command line options
  • Make gmp (merge pattern) more robust (will involve changes to the driver also)
  • Make glp (learn pattern) more robust (ditto)
  • Integrate all three tools with unix make
It will be possible to issue the 'make' command, and have a controlled generation of network, postures, gestures and cortex. 

4 Demonstration

This section will take 1 week, and will consist of: 
  • Getting the graphical hand to work with X-view
  • Getting the graphical hand to respond to glove movement
  • Providing a posture driven version of 'scissors paper stone (dynamite)'
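The game logic for the demo above is simple enough to sketch as a rule table. Note that the plan doesn't say how dynamite fits in; the rules below are an assumption based on a common playground variant (dynamite blows up stone and paper, scissors cut the fuse), not necessarily what Steve implemented.

```python
# Hypothetical rule table for scissors/paper/stone with dynamite.
# The dynamite rules are an assumption (a common playground variant),
# not taken from the plan itself.
BEATS = {
    "scissors": {"paper", "dynamite"},  # scissors cut paper and the fuse
    "paper":    {"stone"},              # paper wraps stone
    "stone":    {"scissors"},           # stone blunts scissors
    "dynamite": {"stone", "paper"},     # dynamite blows up stone and paper
}

def judge(a, b):
    """Return 'a', 'b', or 'draw' for one round between postures a and b."""
    if a == b:
        return "draw"
    return "a" if b in BEATS[a] else "b"

print(judge("dynamite", "stone"))     # prints a
print(judge("scissors", "dynamite"))  # prints a
```

In the posture-driven version, each player's recognised posture would simply be mapped to one of the four keys before calling a judge function like this.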

5 Integration

Vague and fluffy: the remaining time will be spent integrating with KN's project and Aviary.