Very interesting approach, congrats on your release!
Have you noticed any practical performance overhead from the use of Proxy objects? Granted, that might not be the area of focus you have in mind for Aberdeen. I'm just asking because I mused on a similar idea some time ago and always felt held back by this concern.
My second question is related to composition and reactivity (computed fields and other transformations happening _outside_ the components). Do you see any particular pattern working well with Aberdeen?
I had similar concerns when I made the jump from a custom data class (with methods like `set`, `get`, `increment`, `push`, etc) to a transparent `Proxy` around native JavaScript objects. I did some quick benchmarks back then, and concluded that it was actually not meaningfully slower. Modern JavaScript engines are awesome! :-)
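For readers unfamiliar with the technique: a transparent `Proxy` can intercept property reads to record which observer depends on which property, and intercept writes to re-run those observers. A minimal sketch of that idea (illustrative only, not Aberdeen's actual implementation; `reactive` and `observe` are hypothetical names):

```javascript
// The observer function currently being run, so `get` traps can record
// which properties it reads.
let activeObserver = null;

// Wrap a plain object in a Proxy that tracks reads and reacts to writes.
function reactive(target) {
  const subscribers = new Map(); // property -> Set of observer functions
  return new Proxy(target, {
    get(obj, prop, receiver) {
      // Record that the running observer depends on this property.
      if (activeObserver) {
        if (!subscribers.has(prop)) subscribers.set(prop, new Set());
        subscribers.get(prop).add(activeObserver);
      }
      return Reflect.get(obj, prop, receiver);
    },
    set(obj, prop, value, receiver) {
      const ok = Reflect.set(obj, prop, value, receiver);
      // Re-run every observer that read this property.
      (subscribers.get(prop) || []).forEach(fn => observe(fn));
      return ok;
    },
  });
}

// Run a function while tracking its property reads.
function observe(fn) {
  activeObserver = fn;
  try { fn(); } finally { activeObserver = null; }
}

// Usage: the callback re-runs automatically whenever `count` changes.
const state = reactive({ count: 0 });
const seen = [];
observe(() => seen.push(state.count));
state.count = 1;
state.count = 2;
// seen is now [0, 1, 2]
```

The appeal is that application code just reads and writes plain-looking objects; all the bookkeeping lives in the traps.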
Aberdeen is not in js-framework-benchmark yet, but I've submitted a pull request: https://github.com/krausest/js-framework-benchmark/pull/1877 -- In my own testing, performance is similar to React's (while of course destroying React on time-to-first-paint, bytes transferred, etc). However, this benchmark is not a particularly good fit for Aberdeen, as lists in Aberdeen are always sorted by a given key (it maintains an ordered skiplist for the data), while the benchmark only requires the easy case: sorting by creation time. So Aberdeen is doing some extra work here.
With regard to reactive data transforms: Aberdeen provides some (reactively 'streaming') helper functions for that (`map`, `multiMap`, `partition`, `count`). But these are mostly based on the `onEach` primitive, for reactive iteration. Take a look at the `map` implementation, for instance: https://github.com/vanviegen/aberdeen/blob/a390ce952686da875...
In past projects, we've been using things like this a lot. Pretty easy and fast!
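To give a flavor of the pattern: a derived collection can be built by reactively iterating the source, so the output stays in sync as items arrive. A hypothetical miniature (the `Collection`, `onEach`, and `map` here are stand-ins I made up for illustration; see the linked Aberdeen source for the real implementation):

```javascript
// A tiny observable keyed collection with an onEach-style primitive.
class Collection {
  constructor() {
    this.items = new Map();     // key -> value
    this.listeners = new Set(); // callbacks invoked per (value, key)
  }
  // Run `fn` for every existing item, and for every item added later.
  onEach(fn) {
    for (const [key, value] of this.items) fn(value, key);
    this.listeners.add(fn);
  }
  set(key, value) {
    this.items.set(key, value);
    for (const fn of this.listeners) fn(value, key);
  }
}

// A reactive map transform, built on onEach: the output collection
// keeps tracking the input as it grows.
function map(input, transform) {
  const output = new Collection();
  input.onEach((value, key) => output.set(key, transform(value)));
  return output;
}

// Usage:
const prices = new Collection();
prices.set('apple', 2);
const doubled = map(prices, x => x * 2);
prices.set('pear', 3); // added after the map was created
// doubled.items: 'apple' -> 4, 'pear' -> 6
```

The nice property is that `map` itself contains no update logic; it falls out of the iteration primitive handling both existing and future items.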