“Those who can imagine anything, can create the impossible.”

Alan Turing
image of a shiny robotic torso
source: Photo by Xu Haiwei on Unsplash

Origins of AI

Artificial intelligence (AI) has a long history, with roots dating back to the 1950s. It was at this time that Alan Turing introduced the idea of a thinking machine: the idea that it is possible to create machines that mimic human intelligence and perform tasks that normally require human cognition, such as learning, problem solving, and decision making.

A key technological development was the invention of the digital computer in the 1940s and 1950s. Back then, computers filled entire rooms because the vacuum tubes they were built from were so bulky. These machines could perform complex calculations quickly and accurately, and they laid the foundation for more advanced AI systems. Memory, another key component of a computer, was just as bulky (check out the history of computer memory). The invention of the transistor in 1947 paved the way for the smaller, more compact computers that arrived decades later.

image of 3 vacuum tubes
vacuum tubes (source: Photo by Ries Bosch on Unsplash)

In the 1950s, researchers began to consider the possibility of using computers to simulate human intelligence. One of the first to propose this idea was Alan Turing, who is considered the father of modern computer science. Turing’s work laid the foundation for the first AI programs, which were designed to perform specific tasks such as playing chess or solving mathematical problems.

In the 1960s and 1970s, AI research focused on developing programs that could understand and process natural language, as well as on creating expert systems that could make decisions based on rules and data. In the 1980s and 1990s, AI research expanded to include machine learning, which involves developing algorithms that allow computers to learn from data without being explicitly programmed.
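To make the phrase “learn from data without being explicitly programmed” concrete, here is a minimal sketch of what that looks like in practice. It is my own illustration, not something from this history; it assumes Python with the scikit-learn library installed, and the numbers are made up.

    # A tiny "learning from data" example: instead of hand-coding a rule,
    # we let the model infer the relationship between inputs and outputs.
    from sklearn.linear_model import LinearRegression

    # Made-up training data: hours studied -> exam score
    hours = [[1], [2], [3], [4], [5]]
    scores = [52, 58, 65, 71, 77]

    model = LinearRegression()
    model.fit(hours, scores)       # the "learning" step: fit parameters to the data

    print(model.predict([[6]]))    # predict a score for 6 hours of study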

AI Today

Today, it is common to have Siri, Cortana, Alexa, and Google Assistant on our phones and home appliances. Through speech recognition, you can ask them anything, and they can help answer the kinds of questions you would normally search for on the web. But we know they aren’t quite the thinking machines Alan Turing described.

Nevertheless, there is technology out there now, still in its infancy, where the conditions of Turing’s test have been met or even exceeded: people on the receiving end of a chat session can no longer tell whether the other side is actually an artificial intelligence. The system sounds and reasons like a human being. Check out openai.com and you will see what I mean. Below you will see how the AI responded to my question about “why people celebrate Christmas.”

An AI-generated response to the prompt “Why do people celebrate Christmas?”
source: chat.openai.com
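If you would rather script a question like this than type it into the chat page, OpenAI also offers a programming interface. The sketch below is my own assumption about how that would look with the OpenAI Python client (version 1.0 or later) and an API key in the OPENAI_API_KEY environment variable; the screenshot above came from the web chat, not from this code.

    # A minimal sketch: send the same prompt through the OpenAI API.
    from openai import OpenAI

    client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable

    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumed model name; the web chat may use a different one
        messages=[{"role": "user", "content": "Why do people celebrate Christmas?"}],
    )

    print(response.choices[0].message.content)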

There are other AI technologies out there that aid humans in various ways. Check these out:

  • NovelAI – this technology is built on the same kind of large language model technology that powers chat.openai.com. However, it has been monetized to help content creators build stories in particular genres, with assistance coming in the form of both text and images.
  • DALL·E 2 – this technology is the image-generation side of openai.com. You describe the image you have in mind in text, and the AI creates the image you describe.

Below is an image I had the AI create using this description: “an oil painting by Dally of a Japanese garden during the spring just at the first light of the morning”

an oil painting by Dally of a Japanese garden during the spring just at the first light of the morning
AI-created painting (source: DALL·E at openai.com)
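The painting above came from the DALL·E web interface, but the same kind of request can be scripted. The sketch below is my own illustration, assuming the OpenAI Python client (version 1.0 or later) with an API key in OPENAI_API_KEY; the exact model name here is an assumption.

    # A minimal sketch: ask the image API for the same description used above.
    from openai import OpenAI

    client = OpenAI()

    result = client.images.generate(
        model="dall-e-2",  # assumed model name
        prompt=("an oil painting by Dally of a Japanese garden during the spring "
                "just at the first light of the morning"),
        n=1,
        size="1024x1024",
    )

    print(result.data[0].url)  # URL of the generated image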

What Changes Made AI Possible?

In my opinion, three things converged to make today’s AI possible:

  • Cheap memory: In the 1970s, computers ran with as little as 8 KB of RAM. Today, 8 KB is several orders of magnitude smaller than even the built-in memory cache of a common CPU (central processing unit): modern CPUs come with level 1 through level 3 caches, ranging from around 384 KB (L1) up to 32 MB (L3)!
  • Fast CPUs/GPUs: Today’s CPUs are leaps and bounds better than the originals; they come with multiple cores (in effect, multiple CPUs on one chip) running at multi-GHz clock speeds. In the late 1970s, a CPU had only a single core and ran at around 4.77 MHz (a rough back-of-the-envelope comparison follows this list).
  • Massive access to cloud compute/storage: Thanks to Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform, companies can research and develop AI technology without having to buy their own hardware.
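Here is the rough arithmetic behind the memory and clock-speed points above. The figures are the ballpark numbers quoted in the list (8 KB of RAM, a 32 MB L3 cache, 4.77 MHz) plus an assumed 3 GHz modern clock, not measurements.

    # Back-of-the-envelope comparison of 1970s-era and modern hardware.
    ram_1970s = 8 * 1024              # ~8 KB of RAM in a 1970s machine
    l3_cache_today = 32 * 1024**2     # ~32 MB of L3 cache on a modern CPU

    clock_1970s = 4.77e6              # ~4.77 MHz
    clock_today = 3.0e9               # ~3 GHz per core (assumed ballpark)

    print(f"On-chip cache vs. old RAM: {l3_cache_today / ram_1970s:,.0f}x")
    print(f"Clock speed:               {clock_today / clock_1970s:,.0f}x")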

AI in the Future

Things are changing rapidly in the field of AI. The next major leap in computing technology is quantum computing, which can perform certain computations far faster than classical computers. With it, AI technology may continue to level up!