Sunday 19 March 2023

Artificial Intelligence makes it to the big time

Artificial Intelligence (AI) has been around for a long time, but, to be honest, it had really only captured the public’s imagination through science fiction films and TV. In the Terminator movies, Skynet became self-aware on 29 August 1997. And we know how that turned out!

It’s true that numerous people have been working on AI projects, and great work has been done, but as far as the ordinary man in the street was concerned, it didn’t have much impact on his life. OK, maybe his phone was able to predict the next word he was going to type in a text. Alexa would play his favourite music or show his photos. And Deep Blue had famously beaten the world chess champion, Garry Kasparov, back in 1997. It’s just that the general public were, until recently, largely unaware of what AI could do and how it was being used.

According to Wikipedia, Artificial Intelligence (AI) is intelligence – perceiving, synthesizing, and inferring information – demonstrated by machines, as opposed to intelligence displayed by humans or by other animals. Example tasks include speech recognition, computer vision, translation between (natural) languages, and other mappings of inputs to outputs.

Alan Turing proposed the Turing test in 1950, which measures a machine’s ability to simulate human conversation. Because humans can only observe the machine’s behaviour, it does not matter whether it is ‘actually’ thinking or has a ‘mind’; the important thing is its behaviour.

But what has brought AI into the public consciousness is ChatGPT. This AI chatbot was developed by OpenAI and launched in November 2022. It’s built on top of OpenAI’s GPT-3 family of large language models and has been fine-tuned (an approach to transfer learning) using both supervised and reinforcement learning techniques.

The GPT part of its name stands for Generative Pre-trained Transformer. And what it does so well is content creation, i.e. its deep machine learning can create human-like text in response to simple prompts. You might like to think of it as a souped-up version of Cortana on your laptop or Google Assistant on your phone.

Machine Learning (ML) allowed phones to recognize which words people most often used after a given word. This information was collected on each phone and then centralized in order to learn from as much data as possible. The most likely next words were then sent back to people’s phones. More data was collected to see whether those first guesses were correct, and the improved results were sent back to people’s phones again. So now, predictive text is fairly accurate, on the whole, although it can never be completely right for every conversation. This is how machine learning works.
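The idea can be sketched with a toy bigram model. This is a deliberate simplification – real phone keyboards use far richer models, and the corpus here is invented purely for illustration:

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus standing in for text collected from users' phones.
corpus = "i am going to the shop i am going home i am happy".split()

# For each word, count which words were typed immediately after it
# (a simple bigram model).
next_words = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    next_words[prev][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word`, or None if unseen."""
    counts = next_words.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("am"))  # → going ("going" follows "am" twice, "happy" once)
```

As more text is collected, the counts are updated and the guesses improve – which is exactly the feedback loop described above, just at a much smaller scale.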

With ChatGPT, giving it some articles can result in it writing an essay for you – which is a worry for schools and universities, and also for people who read and write newspaper articles and even blogs. ChatGPT can even write poems and jokes. And, like predictive text did, it is learning from everything that it is being fed. So, it will only get better.

The interesting thing is that ChatGPT now has so many users, and many more people have heard about it – even if they are not using it. I saw that even the famous Dilbert cartoon had jokes about it. ChatGPT can be found at https://chat.openai.com/chat. Microsoft’s new Bing search engine includes ChatGPT, allowing users to ask real questions and get complete answers – they say. However, there is currently a waiting list to use the service.

I asked ChatGPT what the best alternatives to ChatGPT were. It told me that there is no single ‘best’ alternative to ChatGPT because it depends on what specific needs and requirements I have. It then went on to suggest:

  • GPT-3 is a language model developed by OpenAI, like ChatGPT. It is larger and more powerful than ChatGPT, with 175 billion parameters compared to ChatGPT's 1.5 billion parameters. However, it is also more expensive to use.
  • BERT (Bidirectional Encoder Representations from Transformers) is another popular language model, developed by Google. It is known for its ability to handle complex natural language processing tasks, such as question answering and sentiment analysis.
  • Transformer-XL is another language model developed by Google, with a focus on long-term dependencies. It is designed to handle sequences that are longer than what other models can typically handle, making it useful for tasks such as language modelling and machine translation.

It didn’t mention Bard, Google’s consumer-focused AI chatbot. Nor did it mention Copilot, Microsoft's new AI for Microsoft 365.

And there are quite a few other alternatives besides.

In terms of mainframes, the IBM z16 was marketed as being designed for AI. It has an AI accelerator built onto its core Telum processor. IBM said that the z16 is particularly suited to processing artificial intelligence apps. The AI accelerator on the Telum processor utilizes an AI inferencing model that analyses details from the massive transaction processes that go on within the mainframe to spot trends and make intelligent predictions. IBM explained that AI has broad applicability to a wide set of use cases across a variety of different industries, from banking and finance to insurance to healthcare, and many others. The AI accelerator can handle massive amounts of critical transactions and workloads in real time and at scale.

You can always tell when something has grabbed the imagination of the public when you hear people talking about it down the pub. It used to be Instagram, then TikTok, but now it’s ChatGPT. AI, in all its forms, has finally made the big time!
