It was an informal discussion in our research group.
But the need for topology came up because of a debate over the flexibility of real-number tensors versus what programs can do in practice. So I boiled both of them down to sets and am looking at their behavior.
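To make the gap concrete, here is a minimal NumPy sketch (a toy of my own, not anything from the discussion): real tensors live in R^n, which is connected, while float32 is a finite set of isolated points, so as a subspace it is totally disconnected.

```python
import numpy as np

# R^n with the usual topology is connected: there are no gaps.
# float32 is a finite set: between any value and the next
# representable one there is an interval containing no floats
# at all, so the subspace topology on float32 is discrete
# (hence totally disconnected).
x = np.float32(1.0)
next_up = np.nextafter(x, np.float32(2.0))
print(next_up - x)  # ~1.19e-07: the empty gap just above 1.0
```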
The modularity issue in deep learning is just the connectedness property. But it is pretty easy to introduce a discontinuous function. In fact, this is common.
https://knowledge.uchicago.edu/record/2215
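And to show how easy a discontinuity is to introduce (again a toy sketch of my own, not the thesis's example): hard routing between two modules, the sort of branch that is everywhere in practice, jumps at the decision boundary.

```python
import math

# Hard routing between two "modules" via a threshold: the kind
# of branch that shows up constantly in real systems (argmax
# decoding, quantization, plain if/else control flow).
def gated(x: float) -> float:
    if x >= 0.0:
        return math.tanh(x)      # module A
    return -1.0 + 0.1 * x        # module B

# An arbitrarily small step across 0 jumps the output by about 1,
# so gated is discontinuous there.
print(gated(-1e-12), gated(0.0))  # ≈ -1.0 vs 0.0
```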
The issue is that connectedness is a valuable property to use, most of the time; it is what gradient-based training exploits. So deeper networks are better, and more data is better.
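The precise fact I am leaning on is standard point-set topology, writing f_i for the layers and Θ for the parameter set (my notation, not the thesis's):

```latex
% Compositions of continuous maps are continuous, and the
% continuous image of a connected set is connected. So a network
% built only from continuous layers sends a connected parameter
% set to a connected set of outputs.
\[
F(x;\theta) = (f_L \circ \cdots \circ f_1)(x;\theta),
\quad \Theta \text{ connected},\ F(x;\cdot) \text{ continuous}
\;\Longrightarrow\; \{\, F(x;\theta) : \theta \in \Theta \,\}
\text{ connected.}
\]
```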
Program synthesis is also aided by the connectedness property, but relies on it less. So it is not useful for modularity per se, but it can probably bridge larger dead spots, or rough patches, in the data.
@jmw150 Thank you!
@extrn
^The topology section of the thesis I linked to.