History lesson, science edition:
According to archeology, humans first developed techniques to control fire approximately 400,000 years ago (though it did not become a widespread technology until about 100,000 years ago). We figured out techniques for agriculture and animal husbandry during an innovative period about 10,000 years ago, which led to permanent towns and all that followed from them.
The article that got me started down this line of thought pointed out that much of the world's population is much closer to that 10,000-year-ago point than to modern US social and technological norms. For most of the time since "domesticating" fire, we used it for what it already did in nature. We essentially refined our ability to use it - burn this, it heats things up, and heating things like food for cooking or metal for shaping can be useful - but we didn't come up with wholly new ways to use it. It wasn't until scientific experimentation really got rolling that humans discovered ways of converting thermal energy into mechanical energy (other than the discovery that burning Ogg with a hot branch would make Ogg run around and chase you, thus producing both mechanical energy and amusement).
Literally thousands of centuries to go from "we can make this burn" to "this can do more than burn" - a shift that led directly to steam engines and then internal combustion engines. It was a major transition to a real understanding of all aspects of combustion, instead of just accepting it as is. We've had a good understanding of how to harness thermal energy and convert it to mechanical energy for almost 200 years now. We really hit our stride in using it in every aspect of our lives within about the last 50-100 years, depending on how you draw the line.
Perhaps the next step would be a way to avoid the conversion from potential energy into thermal energy in the first place - why burn fuel to generate heat, then use the expansion of heated gases to produce mechanical energy (often only to convert that mechanical energy into electrical energy)? Fuel cells already do work directly from the fuel's potential energy without combustion, and have been around for a few decades now; 80-130 years until full adoption, maybe? That's assuming we don't suffer a societal setback that leaves us at a lower technology level due to over-adoption of the internal combustion engine...
Side note: I wonder if the adoption of agriculture and fixed dwellings also took only one or two lifespans from the time a particular group or tribe clearly understood the concept to full adoption. That was a similar transition, from simply accepting the fact that these plants grew at certain times of year and could be eaten, to fully understanding growing cycles and their requirements.
It was also 9,800 years of "Lightning started a fire and burned our house down. Time to rebuild." The scientific method led to more effort to understand the situation instead of accepting it. The result is 200 years of lightning rods preventing the problem - in essence, preventing the conversion of electrical energy into unwanted thermal energy. That one was adopted very quickly because it was solely a technology, not a social change related to technology.
In general, I just had a bit of a "science rocks!" moment and some hopefulness for future improvements arising from the scientific method, and thought it was worth sharing. The fundamental concept of modern science is deeply rooted in the assumption that reality runs on cause and effect with no supernatural interventions, allowing humans to perform tests they can count on to give consistent results. Building on that base, every aspect of reality can be investigated as a puzzle to solve, and new ways of looking at the world can emerge. The invention of a philosophical framework that encourages that testing changed everything, but it first required a world that no longer believed supernatural causes were behind most events. That leads to the second cut of this post (stop reading here if you're avoiding the religion topic)...
This link regarding how people reason about their attitudes was interesting reading, but I think the authors may be confusing cause and effect.
Most people who hold a strong religious faith have probably internalized the values they were taught by that faith, and when asked to think about what their God(s)' opinion is on a topic, will likely fall back on that internalized "I believe what I've been told God believes" opinion - thus registering on brain scans as thinking about their own opinions rather than God's, and showing up in polls as feeling that God agrees with their views. That is the opposite of projecting their own beliefs onto God. I think the results of Study 5 may reflect subjects re-evaluating what they have been taught God's opinion on the subject is, but it is possible the researchers are right in seeing it as a genuine egocentric projection of the subjects' own opinions onto God. I just question whether the initial bias of feeling that God agrees with them is really an effect of egotism, or merely one of being a devout believer.