Offhand thought:
I don't know if neural nets are the royal road to AGI, but I think objections of the form "but it just [simple thing]s!" are risky. Here's my thought: given that evolution produced all the complex stuff we can do, doesn't it seem far easier for evolution to build a capable system by duplicating some simple functional unit many times over than by discovering some wildly complex ruleset?
Put another way, should we expect the magic to emerge from something LIKE neural nets?