Thursday, December 5, 2024
Dr. Brown: Computers — early days to modern times
"most computers in the 1980s and 1990s were strictly rules-based," writes Dr. William Brown. "We knew precisely what we wanted the computer to do and there was no expectation that the computer would do anything other than obey the embedded carefully scripted programs." MIDJOURNEY

In the 1970s, one of my children in public school won a computer with 2K of memory.

It was useless but, then again, it was cheap.

However, as my career as a neurophysiologist progressed, my team became increasingly dependent on computer devices to extract tiny signals from background noise by averaging hundreds of responses, which made event-related signals stand out while unrelated signals cancelled one another out.
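
For readers who like to see the idea in action, here is a small illustrative sketch in Python. It is nothing like our original laboratory hardware; the signal, the noise and the numbers are all invented simply to show how averaging hundreds of trials lets a tiny, repeatable response emerge from much larger random noise.

```python
# Illustrative only: a made-up evoked response buried in much larger random noise.
import numpy as np

rng = np.random.default_rng(0)

n_trials, n_samples = 500, 200
t = np.linspace(0, 0.2, n_samples)                       # a 200 ms recording sweep
evoked = np.exp(-((t - 0.05) ** 2) / (2 * 0.005 ** 2))   # small, repeatable response

# Each trial contains the same tiny response plus large, trial-to-trial random noise.
trials = evoked + rng.normal(scale=5.0, size=(n_trials, n_samples))

# Averaging across trials: the event-locked response survives, the noise cancels.
average = trials.mean(axis=0)

print("noise in a single trial (SD):", round(float(trials[0].std()), 2))
print("residual noise after averaging (SD):", round(float((average - evoked).std()), 2))
```

Because the noise varies randomly from trial to trial, it shrinks roughly with the square root of the number of trials averaged, while the response locked to the event does not.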

Later still, a closely related offshoot of our group in London, Ont., led by Mark Davis and his new company, developed the first computer-based system in the world for analyzing electrophysiological data in the clinic and operating room. It was a great success in laboratories like mine for the better part of a decade.

Even so, most computers in the 1980s and 1990s were strictly rules-based.

We knew precisely what we wanted the computer to do, and there was no expectation that it would do anything other than obey the carefully scripted programs embedded in it (the algorithms of the day) for this or that mandated task.

However, change was in the wind with the introduction of what was called machine learning and neural networks by this year’s physics laureates, John Hopfield and Geoffrey Hinton, who modelled their computing devices after the brain.

Single cells in the nervous system integrate signals from a variety of sources, but only when the aggregate signal exceeds a certain level does the neuron respond by generating a signal that it sends to other neurons.

Neurons are also arranged in functionally related groups: nuclei, layers and/or columns.

Repeated similar signals are strengthened and infrequent signals weakened, a process that underlies memory and signal recognition.

That simple neural model was adopted by Hopfield and Hinton in the 1980s to illustrate how machine learning might work to analyze data.
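
A toy version of that model fits in a few lines of Python. The sketch below uses a simple threshold unit and a Hebbian-style "strengthen what is used, weaken what is not" rule; it is illustrative only and is not Hopfield’s or Hinton’s actual formulation.

```python
# Illustrative only: a single threshold "neuron" with a Hebbian-style learning rule.
import numpy as np

def fires(inputs, weights, threshold=0.5):
    # The unit responds only when the weighted sum of its inputs exceeds the threshold.
    return np.dot(inputs, weights) > threshold

def hebbian_update(weights, inputs, rate=0.1, decay=0.01):
    # Strengthen the weights of active inputs; let unused weights slowly decay.
    return weights + rate * inputs - decay * weights

weights = np.zeros(3)
familiar = np.array([1.0, 1.0, 0.0])   # a frequently repeated input pattern
unrelated = np.array([0.0, 0.0, 1.0])  # a pattern the unit has never seen

# Repeated exposure to the familiar pattern gradually strengthens its connections.
for _ in range(50):
    weights = hebbian_update(weights, familiar)

print("learned weights:", np.round(weights, 2))
print("fires on the familiar pattern:", bool(fires(familiar, weights)))
print("fires on the unrelated pattern:", bool(fires(unrelated, weights)))
```

After 50 repetitions the connections carrying the familiar pattern are strong enough to push the unit over its threshold, while the unrelated pattern leaves it silent: repeated signals strengthened, infrequent ones not.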

Still, the fields of machine learning, neural networks and deep learning faltered until 2011, when Google stepped in.

That year, Google’s Brain project extracted images from YouTube videos and fed them into a network of 1,000 computers with a combined total of only one million neurons (nodes).

Crude and underpowered as it was, the project worked: with each pass through successive layers of nodes, more and more defining features began to stand out until, finally, in this case, recognizable faces appeared.

This was the first concrete demonstration of deep learning at work.
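
What "successive layers" means can be sketched in a few lines of Python. The example below is purely illustrative: the weights are random rather than learned, the "image" is random numbers, and it bears no relation to Google’s actual code. It only shows how data flows through a stack of layers, each one combining the features computed by the layer before it.

```python
# Illustrative only: data flowing through a stack of layers with random (unlearned) weights.
import numpy as np

rng = np.random.default_rng(1)

def layer(x, n_out):
    # One layer: a weighted combination of its inputs followed by a simple
    # threshold-like nonlinearity (units below zero stay silent).
    w = rng.normal(scale=1.0 / np.sqrt(x.size), size=(n_out, x.size))
    return np.maximum(w @ x, 0.0)

pixels = rng.random(64)        # stand-in for a tiny 8x8 image patch
h1 = layer(pixels, 32)         # first layer: simple local features
h2 = layer(h1, 16)             # second layer: combinations of those features
h3 = layer(h2, 8)              # third layer: still more abstract combinations

for name, h in [("layer 1", h1), ("layer 2", h2), ("layer 3", h3)]:
    print(name, "active units:", int((h > 0).sum()), "of", h.size)
```

In the real system the weights were learned from millions of YouTube frames, so that units in the deepest layers came to respond to whole faces; here the random weights merely show the layered structure.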

Even then, what held the field back was the need for far more computational power, much higher processing speeds and much larger databases on which to learn before challenges such as near-instantaneous translations of language or facial recognition would be possible.

Last year, a series on artificial intelligence was hosted at the Niagara-on-the-Lake Public Library, triggered by the enormous public interest in ChatGPT and its various versions.

Whatever the pros and cons of ChatGPT and lookalikes by other companies, machine learning, neural networks and deep learning have been a godsend to scientists.

One example is the challenge of forecasting rapidly evolving weather events such as hurricanes, floods and tornadoes; at the other end of the time scale is the challenge of identifying trends and causative factors in long-term climate change.

Both generate enormous amounts of data that must be analyzed to make sense of the numbers.

That’s why modern high-powered machine learning-based computing systems are so essential.

They have the power to crunch the numbers and identify patterns in the data far beyond human computational limitations.

Then there were the spectacular triumphs of the last few years by Demis Hassabis and John Jumper, with their development of AlphaFold2, and by David Baker, with his similar RoseTTAFold, both designed to unravel the mystery of how linear strings of amino acids fold into 3D molecules to do whatever their particular jobs are in biology.

The three men shared the Nobel Prize in chemistry this year in what was a triumph of integrating basic physics, chemistry and computer science.

Finally, some of my readers may have followed in this newspaper the struggle of patients with amyotrophic lateral sclerosis to express themselves.

One of the triumphs of machine learning is its ability to extract relevant signals, in this case signals in the brain’s neocortex related to choosing words and articulating them when the systems that normally do so have been affected by the disease.

The signals are recorded from the appropriate regions of the brain but are mixed with a myriad of other signals, making it all but impossible to make sense of what’s going on without much-updated versions of machine learning.

Only a few years ago, the best that could be hoped for was 10 words a minute, with an error rate of 30 per cent and a vocabulary of about 100 words.

The latest version has a vocabulary of thousands of words, delivers 30 to 50 words a minute and has an error rate of less than five per cent.

That’s real progress, and not because of improvements in the electrode array but because of the latest programs, which learn to cull the brain’s electrical signals for those essential to choosing the right words quickly and precisely, in so doing restoring working speech to someone who has lost it.
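
In spirit, the decoding step resembles the toy Python sketch below. The data are made up and the "nearest template" rule is deliberately simple; the real systems rely on far more sophisticated neural networks trained on a patient’s own recordings.

```python
# Illustrative only: decode an intended word from a made-up pattern of neural activity.
import numpy as np

rng = np.random.default_rng(2)
words = ["yes", "no", "water", "help"]
n_features = 50   # e.g., firing rates measured across an electrode array

# Pretend each word evokes a characteristic activity pattern, seen only through noise.
prototypes = {w: rng.normal(size=n_features) for w in words}

def record_trial(word):
    return prototypes[word] + rng.normal(scale=0.5, size=n_features)

# "Training": average many trials per word to build a template for each word.
templates = {w: np.mean([record_trial(w) for _ in range(100)], axis=0) for w in words}

# Decoding a new recording: choose the word whose template it most closely matches.
def decode(trial):
    return min(words, key=lambda w: float(np.linalg.norm(trial - templates[w])))

print(decode(record_trial("water")))   # usually prints 'water'
```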

That’s only one example of the transformative power of AI and machine learning.

The physics and chemistry Nobel Prizes this year are a tribute to the five scientists who helped develop the tools that underlie AI, and they are the subject of the annual series on the prizes, which begins with physics on Nov. 6 at 2 p.m. at the NOTL library. Please sign up with Debbie Krause.

Dr. William Brown is a professor of neurology at McMaster University and co-founder of the InfoHealth series at the Niagara-on-the-Lake Public Library. 
