Since the last update, we have mostly been back to researching new encoders. In particular, we are interested in encoders that perform well with recurrent versions of AOgmaNeo. Recurrent branches seem to achieve better compression on many tasks (though not all), so recurrence will likely be an optional feature.
Two new encoders are looking promising: the more general one is a topology-preserving encoder, and the narrower one is a "fast weights" encoder.
The topology-preserving encoder is once again based on self-organizing maps, much like our previous experiments in this area. However, this time it minimizes reconstruction error in a single pass rather than over several iterations. This both improves encoding quality and runs a fair bit faster. This encoder is now a suitable candidate to replace the old ESR encoder still used in AOgmaNeo: it seems to perform better than or on par with ESR on just about every task, and has the added benefit of a topology-preserving encoding (which can be used for certain invariances).
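To make the idea concrete, here is a minimal sketch of a self-organizing-map-style encoder that updates in a single pass: it picks the unit with the lowest reconstruction error and nudges that unit and its topological neighbors toward the input, all in one step. This is purely illustrative (the class name, 1D topology, and Gaussian neighborhood are our assumptions), not the actual AOgmaNeo implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

class OnePassSOMEncoder:
    """Illustrative one-pass SOM-style encoder (hypothetical sketch,
    not AOgmaNeo's code). Encoding and learning happen in one step."""

    def __init__(self, num_units, input_size, lr=0.3, sigma=1.0):
        # Random initial codebook; one weight vector per unit.
        self.weights = rng.random((num_units, input_size))
        self.lr = lr
        self.sigma = sigma  # neighborhood width on a 1D unit topology

    def encode(self, x):
        # Best-matching unit = lowest reconstruction (L2) error.
        dists = np.linalg.norm(self.weights - x, axis=1)
        bmu = int(np.argmin(dists))
        # Single-pass update: pull the BMU and its topological
        # neighbors toward the input, weighted by a Gaussian kernel.
        idx = np.arange(len(self.weights))
        kernel = np.exp(-((idx - bmu) ** 2) / (2 * self.sigma ** 2))
        self.weights += self.lr * kernel[:, None] * (x - self.weights)
        return bmu

    def decode(self, bmu):
        # Reconstruction is simply the winning unit's weight vector.
        return self.weights[bmu]
```

Because neighboring units are pulled together, nearby inputs tend to map to nearby unit indices over time, which is what gives the encoding its topology-preserving character.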
The second encoder we experimented with is based on the concept of "fast weights" (paper). Our use differs a bit from that paper: instead of using fast weights primarily as a memory mechanism, we wanted to see whether we could make an encoder that "forgets on purpose", but in a predictable way. The idea is that if an encoder always forgets in the same way for a given stream of inputs, the forgetting itself is a form of permanent memory. So far this works well on only a few tasks (though on those it outperforms all of our other encoders). Until we can figure out what is harming its generality, this encoder is not likely to replace ESR.
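A small sketch of the "predictable forgetting" idea, assuming a standard Hebbian fast-weight update with a fixed decay factor (the class name, update rule, and readout are our assumptions, not the actual encoder): because the decay is deterministic, feeding the same input stream twice produces the same fast-weight trajectory, and therefore the same pattern of forgetting.

```python
import numpy as np

class FastWeightsEncoder:
    """Hypothetical fast-weights sketch (not AOgmaNeo's encoder).
    The fast-weight matrix A is updated with a Hebbian outer product
    and decays by a fixed factor each step, so forgetting is a
    deterministic function of the input stream."""

    def __init__(self, size, decay=0.9, lr=0.5):
        self.A = np.zeros((size, size))
        self.decay = decay  # fixed decay: older associations fade predictably
        self.lr = lr

    def step(self, x):
        # Decay old associations, then bind the current input to itself.
        self.A = self.decay * self.A + self.lr * np.outer(x, x)
        # Encoding: read the input through the fast-weight matrix.
        return np.tanh(self.A @ x)
```

Two encoders fed the same stream stay in lockstep forever, which is the sense in which the forgetting itself carries reproducible, permanent information about the stream.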
Demo-wise, this last month has been slow, but we performed a couple of experiments on the Lorcan Mini robot. There are still some issues with out-of-distribution inputs, but we are getting close to a new video.
We have pushed a new update to AOgmaNeo and PyAOgmaNeo, but it doesn't contain the new encoders just yet (just some fixes and a couple of other things). Soon!
Until next time!