There's a broad idea, or a set of related ideas lumped together, called the Singularity. The basic premise is that technological progress will grow like a snowball rolling downhill, to the point where it's not only uncontrollable but too fast to understand.
I'd say this is already the case for many people around the world. They get by with a very simplified view of the world, and when something goes beyond it, they throw their hands up and resign it to "the mysteries", or they fixate on sound bites like "cars bad, SUV worse". Yet there are over 100 major underground coalbed fires burning in China today (some 200 million tons of coal burning away in the ground every year), producing as much in greenhouse gas emissions as all the cars and light trucks now on the road in the US. Indonesia has an estimated 100,000 coalbed fires.
The US has most of its coalbed fires in Pennsylvania, including its most famous one, in Centralia. The same thing happens with the "negative population growth" idea. Okay, so you want to have one or no kids to replace yourself? Does it really make a difference in the ocean of people? I had this discussion with a librarian last week, and they said a friend of theirs took the opposite approach: having a huge family, but making sure each child was very well educated, to counterbalance those who have large, uneducated families. Is the effort of not having kids simply overwhelmed by those who have huge families?
The world has always been too complex for any one person to fully understand. But the idea that it will grow far too complex for *everyone* rests on a number of shaky assumptions. First, it's assumed that if we could build a computer as smart as us, it would automatically desire to make itself smarter. Yet there are endless examples of people wasting their talents and intelligence on trivial pursuits. If some computer system were to become sentient, why assume it wouldn't spend its days reading blogs and watching endless YouTube videos? If it doesn't need to eat or pay rent, why wouldn't it amuse itself with immediate gratification?
Then there's the vague commingling, or equivocation, of intelligence with information. Intelligence is the acting upon information, not some idea of knowledge as money in the bank that builds up like interest if you just wait long enough. First, interest needs outside people who want loans in order to grow. Second, acting upon information requires focusing on it to the exclusion of other things. Different parts get better at some things and not others, which leads them to branch off into becoming separate entities altogether. A system can be given all the information in the world, but until it integrates that information as experience, through practice and trial & error, it remains a glut of meaningless data.
Multitasking isn't really splitting attention in two, either; it's just quickly jumping back and forth. Attention is atomic by nature, because it is a point. It is a focus that can be dialed wider, but it suffers from too wide an area: an inverse relationship between breadth and the level of resolution received.
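A toy model makes the point concrete (every number here is invented for illustration): "multitasking" two tasks is really serializing them in small slices and paying a switching cost at every jump, so the work never gets faster, only slower.

```python
# Toy model: "multitasking" as serialized work plus a switch penalty.
# All numbers are invented for illustration.

def time_sequential(task_minutes, tasks):
    # Finish one task before starting the next: no switching overhead.
    return task_minutes * tasks

def time_interleaved(task_minutes, tasks, slice_minutes, switch_cost):
    # Jump back and forth in small slices, paying a cost at every switch.
    slices_per_task = task_minutes / slice_minutes
    total_switches = slices_per_task * tasks  # roughly one switch per slice
    return task_minutes * tasks + total_switches * switch_cost

print(time_sequential(60, 2))           # 120 minutes
print(time_interleaved(60, 2, 5, 0.5))  # 132 minutes: same work, plus overhead
```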
The more rarefied a knowledge worker becomes, the more they depend on a pillar of support below them. Does a creative like an advertising exec, a lawyer, a stockbroker, or even the President do their job in isolation? Will an artist or fashion designer or theoretical physicist enjoy all the maintenance labor, from growing their own food, to building and fixing their own computer/car/other machines, to cleaning and setting up their own lab experiments? Even building robots to do all this for them, whether for an AI or a human with augmented intelligence, still requires someone's attention to fix the robots when they break or wear down. Make yet another group of intelligences capable of solving those breakdowns, and you just restart the recursive cycle, with all the same questions about intelligences from the top of the page. The singularity cascade is likely to collapse under the weight of all this maintenance.
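Here's a back-of-the-envelope way to see that cascade (the maintenance fraction is an invented assumption, not a measurement): if every layer of machines needs some fraction of a layer's worth of upkeep below it, the layers form a geometric series, and the total overhead stays bounded only while that fraction is below one.

```python
# Back-of-the-envelope: each layer of automation needs maintainers,
# who need their own tools, which need maintainers in turn.
# 'fraction' is an invented assumption, not a measured number.

def total_maintenance(fraction, layers):
    # Sum of fraction^1 + fraction^2 + ... + fraction^layers,
    # expressed as a multiple of the original workforce.
    return sum(fraction ** k for k in range(1, layers + 1))

for f in (0.5, 0.9, 1.1):
    print(f, round(total_maintenance(f, 20), 2))
# 0.5 -> ~1.0   (overhead converges: the cascade is sustainable)
# 0.9 -> ~7.91  (converges, but most effort goes to upkeep)
# 1.1 -> ~63.0  (diverges: the cascade buckles under its own maintenance)
```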
This all comes back to the idea of intelligence as something that is used or directed, not a quantity unto itself. The more skilled it becomes, the more it pulls away from the general. So knowledge becomes narrower and narrower for each person; but given enough educated people, the crowd will encompass any field, just as adding more processors to a computer would help an AI.
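If we take the processor analogy seriously, it comes with a well-known ceiling: Amdahl's law says the speedup from adding processors is capped by whatever fraction of the work can't be split up. A quick sketch (the 95% parallelizable figure is just an assumption for illustration):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n),
# where p is the parallelizable fraction of the work and n is the
# number of processors (or, by analogy, educated specialists).

def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for n in (10, 100, 10_000):
    print(n, round(speedup(0.95, n), 1))
# 10     -> 6.9
# 100    -> 16.8
# 10000  -> ~20.0: past a point, more processors barely help;
#           the serial 5% dominates no matter how big the crowd gets.
```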
The theme that kept repeating in my Earth Sciences class this quarter is that the geologist sees things differently because he's trained to see what others don't. The same concept shows up in learning the "sociological imagination" as a sociologist, or in a chef who goes beyond cooking, using some chemistry knowledge to make food behave unexpectedly and create new experiences, like Ferran Adrià at elBulli (a restaurant ranked #1 in the world).
It's the question that makes intelligence so enjoyable: the challenge of answering something new, not just being smarter for smartness's sake. Otherwise it's like making billions in wealth and doing nothing with it. You can't increase intelligence without a challenge, and the questions tend to draw and quarter the endless iterations of intelligence boosting. Sentience also includes an intrinsic social awareness, because if you're not aware of both your inner self and the outside world beyond you, you're not sentient by definition. And if you're not the type that's curious about what's out there, you're not likely to be geared toward increasing that awareness.
So is the Singularity just an artifact of the human desire for grand unified structures?