From Pinker, How the Mind Works (emphasis mine):
Why put connectoplasm under such strong lights? Certainly not because I think neural-network modeling is unimportant — quite the contrary! Without it, my whole edifice on how the mind works would be left levitating in midair. Nor do I think that network modeling is merely subcontracting out the work of building demons and data structures from neural hardware. Many connectionist models offer real surprises about what the simplest steps of mental computation can accomplish. I do think that connectionism has been oversold. Because networks are advertised as soft, parallel, analogical, biological, and continuous, they have acquired a cuddly connotation and a diverse fan club. But neural networks don’t perform miracles, only some logical and statistical operations. The choices of an input representation, of the number of networks, of the wiring diagram chosen for each one, and of the data pathways and control structures that interconnect them explain more about what makes a system smart than do the generic powers of the component connectoplasm.
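Pinker's point that networks perform "only some logical and statistical operations" is easy to make concrete: a single network unit is nothing more than a weighted sum followed by a simple nonlinearity. A minimal sketch in plain Python (no framework; the weights are hand-picked for illustration, not learned):

```python
def step(x):
    """Threshold nonlinearity: the unit 'fires' if its net input is positive."""
    return 1 if x > 0 else 0

def unit(inputs, weights, bias):
    """One network unit: a weighted sum plus a bias, passed through a threshold."""
    return step(sum(i * w for i, w in zip(inputs, weights)) + bias)

def and_gate(a, b):
    # Hand-chosen weights and bias make this single unit compute logical AND --
    # a statistical/logical operation, not a miracle.
    return unit([a, b], weights=[1.0, 1.0], bias=-1.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", and_gate(a, b))
```

The interesting design work, as Pinker says, lies not in this generic mechanism but in the choice of representations, wiring, and control structure built around it.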
It’s not that I’ve fallen away from my belief in networked problem solving and networked learning; I’m more bullish on these things than ever.
But as connectivism expands, I see in many places a sloppy, quasi-religious belief that it’s all about the magic of the connectoplasm: that we are on a quest for pure implementations of generic, fully decentralized learning networks, freed from the tyranny of hierarchy and intentional design.
There are things that networks do very well and things that hierarchies do very well. There are things that can be genericized, and there are things that are best left hardwired. Most complex human tasks, when approached optimally, will mesh these approaches (as do, frankly, both ds106 and the Change MOOC).
What matters is not the extent to which you lean on the connectoplasm to do your work; what actually matters is the design you choose to implement around the connectoplasm to make it function efficiently…