There was a paper a few years ago about a similar effect in artificial neural networks [0]. The gist was that a large network contains many subnetworks, and the number of possible subnetworks grows combinatorially, far faster than the size of the network itself. The authors were able to find a subnetwork inside a randomly weighted, untrained network whose performance matched that of a much smaller trained network.
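The core idea can be sketched in a few lines: the random weights are frozen, and a binary mask picks out the subnetwork. This is only an illustrative toy (the paper actually learns the mask by scoring edges, which is not shown here); the mask below is chosen at random.

```python
import numpy as np

rng = np.random.default_rng(0)

# A "large" random layer: weights are fixed at their random init, never trained.
W = rng.standard_normal((256, 64))

# A binary mask selects a sparse subnetwork inside the random weights.
# (The paper learns such a mask; here it is random, ~50% sparsity, for illustration.)
mask = rng.random(W.shape) < 0.5

def subnetwork_forward(x, W, mask):
    """Forward pass through the masked subnetwork: only unmasked random
    weights participate; the weight values themselves never change."""
    return np.maximum(x @ (W * mask), 0.0)  # ReLU activation

x = rng.standard_normal((1, 256))
y = subnetwork_forward(x, W, mask)
print(y.shape)  # (1, 64)
```

With E edges there are 2^E possible masks, which is why the space of subnetworks dwarfs the network itself.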
[0] https://arxiv.org/abs/1911.13299