This is cool. I have this idea that adaptive correction is a bilateral symmetry to resistance against equilibrium. I used this in vector math when coding a simple thrust-correction mechanism: the thrust is a bilateral mirror of the velocity direction, reflected relative to the direction of the desired position. When there are intermediate goals, the shift in the control algorithm is proportional to proximity to the obstacle (which is another bilateral symmetry). It's like a nesting of equal-and-opposite obstacle-to-correction vector comparisons, weighted by the goal of proximal importance; see the sketch below.
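A minimal sketch of what that mirror-and-blend control could look like, assuming a 2D point craft; the names (`unit`, `mirror`, `correct_thrust`, `blended_thrust`) and the linear proximity weighting are my own illustrative choices, not the commenter's actual code:

```python
import numpy as np

def unit(v):
    """Normalize a vector, returning zero for near-zero input."""
    n = np.linalg.norm(v)
    return v / n if n > 1e-9 else np.zeros_like(v)

def mirror(v, axis):
    """Reflect v across the line spanned by the unit vector `axis`
    (the 'bilateral mirror' of thrust relative to a reference direction)."""
    return 2.0 * np.dot(v, axis) * axis - v

def correct_thrust(pos, vel, goal, thrust_mag=1.0):
    """Thrust direction is the velocity direction mirrored across the
    goal direction, so the correction is equal-and-opposite to the
    off-axis drift away from the goal line."""
    to_goal = unit(goal - pos)
    vel_dir = unit(vel)
    if np.linalg.norm(vel) < 1e-9:
        return thrust_mag * to_goal
    return thrust_mag * unit(mirror(vel_dir, to_goal))

def blended_thrust(pos, vel, goal, obstacle, influence_radius=5.0):
    """Blend goal-seeking thrust with an equal-and-opposite avoidance
    vector, weighted by proximity to the obstacle (the nested symmetry)."""
    seek = correct_thrust(pos, vel, goal)
    away = unit(pos - obstacle)                # opposite of obstacle direction
    d = np.linalg.norm(pos - obstacle)
    w = max(0.0, 1.0 - d / influence_radius)   # proximity-proportional weight
    return unit((1.0 - w) * seek + w * away)

if __name__ == "__main__":
    pos = np.array([0.0, 0.0])
    vel = np.array([1.0, 0.5])
    goal = np.array([10.0, 0.0])
    obstacle = np.array([3.0, 0.2])
    print(blended_thrust(pos, vel, goal, obstacle))
```

As the craft approaches the obstacle, `w` grows and the avoidance term dominates; once past it, `w` decays to zero and the mirrored goal-seeking correction takes over again.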
@NicholasWilliams-uk9xu · 6 months ago
There is a problem with data loss in that neural network example (if scale is the focus) for category narrowing. In small neural networks that perform a high degree of information processing in order to narrow to a single outcome, a lot of information is lost (graded evidence collapses to a [0, 1] cat or [1, 0] dog outcome), and that is fine for the narrow goal. However, if the information processing has to navigate a larger set of goals, and multiple networks are each discarding information that would otherwise be integral to the larger ensemble, the network must scale to be much larger to compensate for that lack of leverage (backpropagation losing nuance in favor of exaggerated certainty) and for the loss of data that would have been useful in an ensemble context where the goals are more interrelated, connected, and numerous. Backpropagation becomes a less viable adaptive mechanism for larger systems, and a more expensive strategy as the parameters scale and the goals become interrelated and numerous. A toy example of the collapse is sketched below.
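A minimal sketch of the information-loss point, assuming a toy two-class "cat vs dog" output; the `soft_out`/`hard_out` names and the Shannon-entropy measure are my own illustrative choices, not from the original comment:

```python
import numpy as np

def softmax(z):
    """Convert logits to a probability distribution."""
    e = np.exp(z - z.max())
    return e / e.sum()

def entropy(p):
    """Shannon entropy in bits: how much nuance the output still carries."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# A near-ambiguous logit pair: the network is barely sure.
logits = np.array([0.6, 0.4])

soft_out = softmax(logits)               # graded belief: ~[0.55, 0.45]
hard_out = np.eye(2)[soft_out.argmax()]  # narrowed outcome: [1, 0]

print("soft:", soft_out, "entropy:", entropy(soft_out))  # ~0.99 bits retained
print("hard:", hard_out, "entropy:", entropy(hard_out))  # 0.0 bits retained
```

The hard decision keeps the narrow goal intact but hands downstream networks zero bits of the original uncertainty; an ensemble fed `hard_out` instead of `soft_out` has to grow to recover signal that was already computed and then thrown away.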