Assessing the relative cognitive capacities of long-gone species such as Neanderthals, Denisovans and early modern humans is tricky.
So much of what we take as evidence of intelligence and inferred cognitive capacity is the product of skills and learning accrued over many thousands of years in the case of modern humans, and over one to two million years in the case of more ancient hominins.
But such cultural and technical measures can be very misleading, because they depend on opportunity.
A good example is the many tribes living deep in the interior of Papua New Guinea who, until the early 20th century, were out of touch with the rest of the world and lived what some explorers characterized as Stone Age existences.
Yet, with the opening up of the country following the Second World War, those same people soon showed themselves to be as adept as you and I at mastering new skills and languages. The problem was not the brain, which proved to be as capable as ours, but absent or very limited opportunities to learn what we take for granted.
Among paleoanthropologists, there has long been an assumption that the finesse with which stone artifacts were fashioned could be taken as a measure of the cognitive capacity of the brains that made them: the cruder the tools, the cruder and less developed the brain that created them.
Or to turn it around, it's all too easy to look at the extraordinary power of technology and science these days and attribute those achievements to a single company, a single scientist or engineer, or even a few of them, when the truth is that most of the stunning advances of the late 20th and early 21st centuries depended on collaborative, incremental contributions from many scientists and engineers, all building on solid, dependable foundational studies from those who preceded them.
Single scientists working largely alone, such as Albert Einstein, were uncommon in his time and are even rarer now. It takes villages, towns and cities of scientists working together, whether directly or simply by sharing insights and data, to make most science work in our time.
True, large projects such as the James Webb Space Telescope, or any other major astronomical project that I'm aware of, need key leaders, but those leaders usually depend on a much broader base of expertise.
The current exponential growth in science and engineering is the product of many more excellent institutions and scientists entering more and more fields worldwide, in biology, physics, chemistry and the computational sciences and engineering (artificial intelligence and related fields).
What had been largely the province of Europe and the U.K. in the first half of the 20th century, and was dominated by the United States in the latter half of that century and the first quarter of this one, is once more changing, heralded by the development of many new and excellent universities outside the Western world, especially in China but broadly, too, throughout South Asia.
It's that rapid expansion of excellence in science and engineering, and the many millions of new contributors throughout the world, including high-tech companies and institutes, that is fuelling the current escalating developments in basic and applied sciences.
The point is that achievements at this pace and quality are very much a product of the numbers and breadth of expertise of those involved, and probably not of any evolutionary improvement in the brains of individual contributors.
Although that's theoretically possible, some esteemed scientists, such as Steven Weinberg, a Nobel laureate in physics, believed that the human brain has cognitive limits that may constrain what problems can be solved.
Maybe, but the collective cognitive limits of groups of scientists working together are surely much higher than those of any individual.
Finally, to return to the conundrum: inferring the cognitive abilities of early humans and other hominins is tricky.
But given their small numbers (an estimated total world population of roughly 100,000 pre-humans about a million years ago), scattered as they were over vast territories in small groups vulnerable to extinction, it's not surprising that it took them so long to develop whatever refinements in their tools they managed.
It's possible that later hominins possessed brains not so different from our own. But how would we know, before the explosion of cave art 40,000 to 50,000 years ago?
We'll never know, but it's possible, even if not likely, that they were as bright as those Stone Age residents of Papua New Guinea in the middle of the last century.
With an eye to the near future, AI will almost certainly soon acquire broad intelligence surpassing that of any human, and go well beyond that point.
That will profoundly change science and much of human life over the next quarter century, as witnessed by the current flood of applications of AI to some of the most intractable questions in science and engineering.
Just look at some of the best articles in the best journals these days and you will soon get my point. I can't keep up, even though I intentionally look.
Dr. William Brown is a professor of neurology at McMaster University and co-founder of the InfoHealth series at the Niagara-on-the-Lake Public Library.