A Brief AI Glossary

Against the Current No. 237, July/August 2025

Peter Solenberger

Artificial Intelligence (AI) — Computer systems designed to simulate the human processes of learning and applying what is learned. Humans observe the world, generalize from their observations, and apply the generalizations in their activity. AI systems examine digitized data, identify patterns in the data, and apply the patterns to new data.

Model — To model is to identify a pattern in data, for example, the association of demographic variables (race, gender, income, education, etc.) with life expectancy, or a pattern in an X-ray with a cancer diagnosis. The model can be applied to make predictions about new data.
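The idea can be sketched in a few lines of code. This is a minimal illustration, not a real demographic model: the numbers are invented, and the "pattern" is just a straight line fit to four data points.

```python
import numpy as np

# Invented data purely for illustration: income (in $1000s) and
# life expectancy (in years) for four hypothetical observations.
income = np.array([20.0, 40.0, 60.0, 80.0])
life_exp = np.array([72.0, 75.0, 78.0, 81.0])

# Identify the pattern: fit a line y = a*x + b by least squares.
a, b = np.polyfit(income, life_exp, 1)

# Apply the pattern to new data: predict for an income of $50,000.
predicted = a * 50.0 + b
print(round(predicted, 1))  # 76.5
```

Real models have many variables and far more complicated patterns, but the two steps are the same: identify a pattern in the data seen so far, then apply it to data not yet seen.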

Large language model (LLM) — An AI model produced by self-training on large quantities of text, simulating the process by which children learn language. Children hear and see language in use, and from that learn the meaning and use of words. LLM systems examine large quantities of text or other digitized data and identify patterns in the data.

Training — Identifying patterns in data by determining the likelihood that the characteristics of an observation are associated with an outcome, such as the likelihood that two facial images are of the same person.

Neural network — Layers of connected nodes that loosely model the neurons in the brain by passing signals to adjacent nodes, as neurons do across synapses in the brain. The signals are numbers representing the statistical contribution of the node to an outcome.

Deep learning — Machine learning that uses multilayered neural networks to perform tasks. The layering adds and limits complexity, since each node communicates only with nodes in the layers immediately above and below it, not with all the other nodes in the network.
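A forward pass through such a network can be sketched briefly. This toy example, with made-up random weights, shows the layering: the input signals reach the hidden layer, and only the hidden layer's signals reach the output.

```python
import numpy as np

rng = np.random.default_rng(0)

x = np.array([1.0, 0.5])          # input signals (2 nodes)
W1 = rng.normal(size=(3, 2))      # weights: input layer -> hidden layer
W2 = rng.normal(size=(1, 3))      # weights: hidden layer -> output node

# Each hidden node sums its weighted inputs and passes the signal on
# only if it is positive (a "ReLU" activation).
hidden = np.maximum(0.0, W1 @ x)
output = W2 @ hidden              # the output node sums the hidden signals
print(output.shape)               # (1,)
```

In a trained network the weights are not random but learned, and there may be dozens or hundreds of layers rather than two.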

Transformer — A deep learning architecture in which multiple attention heads evaluate each token (a word or other piece of data converted to numbers) against the other tokens in its context. The signal for key tokens is amplified, while the signal for less important tokens is dampened.
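The amplifying and dampening is done by the attention step, which can be sketched as follows. This is a toy self-attention computation over three invented tokens; a real transformer learns separate query, key and value projections rather than using the raw token vectors.

```python
import numpy as np

rng = np.random.default_rng(1)
tokens = rng.normal(size=(3, 4))  # 3 tokens, each represented by 4 numbers

# Self-attention with queries = keys = values (a simplification:
# real transformers learn separate projections for each).
q, k, v = tokens, tokens, tokens
scores = q @ k.T / np.sqrt(4)     # how relevant each token is to each other

# Softmax turns scores into weights that sum to 1 per token:
# large scores are amplified, small ones dampened.
weights = np.exp(scores)
weights /= weights.sum(axis=1, keepdims=True)

out = weights @ v                 # each token's new signal: a weighted mix
print(out.shape)                  # (3, 4)
```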

Generative pre-trained transformer (GPT) — A transformer pre-trained on a large dataset and then used to generate output by repeatedly predicting the most likely next token, producing results such as a translation or the reading of an X-ray.
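"Predicting the most likely next token" can be illustrated without a transformer at all. The sketch below stands in a toy bigram count model, trained on one invented sentence, for the vastly larger machinery of a real GPT.

```python
from collections import Counter, defaultdict

# Toy "training" text, invented for illustration.
text = "the model reads the data and the model predicts".split()

# Count which word follows which: a crude stand-in for pre-training.
nexts = defaultdict(Counter)
for a, b in zip(text, text[1:]):
    nexts[a][b] += 1

# "Generate": after "the", emit the most likely next word.
print(nexts["the"].most_common(1)[0][0])  # "model"
```

A GPT does the same thing at scale: given everything generated so far, it predicts the most likely next token, appends it, and repeats.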
