Against the Current No. 237, July/August 2025
State of the Resistance
— The Editors
Deported? What's in a Name?
— Rachel Ida Buff
Unnecessary Deaths
— Against the Current Editorial Board
Viewpoint on Tariffs & the World-System
— Wes Vanderburgh
AI: Useful Tool Under Socialism, Menace Under Capitalism
— Peter Solenberger
A Brief AI Glossary
— Peter Solenberger
UAWD: A Necessary Ending
— Dianne Feeley
New (Old) Crisis in Turkey
— Daniel Johnson
India & Pakistan's Two Patterns
— Achin Vanaik
Not a Diplomatic Visit: Ramaphosa Grovels in Washington
— Zabalaza for Socialism
Nikki Giovanni, Loved and Remembered
— Kim D. Hunter

The Middle East Crisis

Toward an Axis of the Plutocrats
— Juan Cole

War on Education

Trump's War on Free Speech & Higher Ed
— Alan Wald
Reflections: The Political Moment in Higher Education
— Leila Kawar

Reviews

A Full Accounting of American History
— Brian Ward
The Early U.S. Socialist Movement
— Lyle Fulks
How De Facto Segregation Survives
— Malik Miah
Detroit Public Schools Today
— Dianne Feeley
To Tear Down the Empire
— Maahin Ahmed
Genocide in Perspective
— David Finkel
Shakespeare in the West Bank
— Norm Diamond
Questions on Revolution & Care in Contradictory Times
— Sean K. Isaacs
End-Times Comic Science Fiction
— Frann Michel

A Brief AI Glossary
Peter Solenberger
Artificial Intelligence (AI) — Computer systems designed to simulate the human processes of learning and applying what is learned. Humans observe the world, generalize from their observations, and apply the generalizations in their activity. AI systems examine digitized data, identify patterns in the data, and apply the patterns to new data.
Model — To model is to identify a pattern in data, for example, the association of demographic variables (race, gender, income, education, etc.) with life expectancy, or of a pattern in an X-ray with a cancer diagnosis. The model can then be applied to make predictions about new data.
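
To make the idea concrete, here is a minimal sketch in Python of fitting and applying a model. The income and life-expectancy numbers are invented for illustration only:

    # A minimal sketch of modeling: fit a pattern (here a straight line) to
    # observed data, then apply it to predict an outcome for new data.
    # The numbers are invented for illustration.
    import numpy as np

    income = np.array([20, 35, 50, 80, 120])          # household income, $1000s
    life_expectancy = np.array([72, 75, 77, 80, 82])  # years

    # Fitting: find the line that best matches the observations.
    slope, intercept = np.polyfit(income, life_expectancy, 1)

    # Prediction: apply the model to a new observation.
    new_income = 60
    predicted = slope * new_income + intercept
    print(f"Predicted life expectancy at ${new_income}k income: {predicted:.1f} years")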
Large language model (LLM) — An AI model produced by self-training on large quantities of text, simulating the process by which children learn language. Children hear and see language in use, and from that learn the meaning and use of words. LLM systems examine large quantities of text or other digitized data and identify patterns in the data.
Training — Identifying patterns in data by determining the likelihood that the characteristics of an observation are associated with an outcome, such as the likelihood that two facial images are of the same person.
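
A minimal sketch of training, using the facial-image example above. The similarity scores are invented, and the simple likelihood model (logistic regression fitted by gradient descent) is an illustrative assumption, not any particular system's method:

    # Training: repeatedly adjust a model's parameters so the predicted
    # likelihood of the observed outcomes increases. Toy data, invented for
    # illustration: one feature (a similarity score for two facial images),
    # outcome 1 if the images are of the same person, else 0.
    import numpy as np

    similarity = np.array([0.1, 0.3, 0.4, 0.7, 0.8, 0.95])
    same_person = np.array([0, 0, 0, 1, 1, 1])

    w, b = 0.0, 0.0                     # parameters, initially untrained
    for _ in range(5000):               # training loop (gradient descent)
        p = 1 / (1 + np.exp(-(w * similarity + b)))  # predicted likelihood
        # Nudge parameters in the direction that reduces prediction error.
        w -= 0.1 * np.mean((p - same_person) * similarity)
        b -= 0.1 * np.mean(p - same_person)

    print(f"Likelihood that images with similarity 0.9 match: "
          f"{1 / (1 + np.exp(-(w * 0.9 + b))):.2f}")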
Neural network — Layers of connected nodes that loosely model the neurons in the brain, passing signals to connected nodes as neurons do across synapses. The signals are numbers representing the statistical contribution of a node to an outcome.
Deep learning — Machine learning that uses multilayered neural networks to perform tasks. The layering adds and limits complexity, since nodes communicate only with nodes in the layers immediately above and below their own, not with all the other nodes.
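
The sketch below illustrates both of the last two entries: numeric signals passing upward through stacked layers of nodes, with each layer communicating only with the layer above it. The layer sizes are arbitrary, and the random weights are stand-ins for values a trained network would have learned:

    # A minimal sketch of a multilayered ("deep") neural network's forward
    # pass, assuming fully connected layers: each node receives numeric
    # signals only from the layer below, weights them, and passes the
    # result upward. Random weights stand in for learned values.
    import numpy as np

    rng = np.random.default_rng(0)
    layer_sizes = [4, 8, 8, 2]          # input, two hidden layers, output
    weights = [rng.normal(size=(m, n))
               for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

    def forward(signal):
        """Pass a signal up through the layers, one layer at a time."""
        for w in weights:
            signal = np.maximum(0, signal @ w)   # weighted sum, then activation
        return signal

    print(forward(np.array([1.0, 0.5, -0.3, 2.0])))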
Transformer — A deep learning architecture in which multiple attention heads evaluate tokens (words or other data converted to numbers) in the context around the token being evaluated. The signal from key tokens is amplified, while the signal from less important tokens is dampened.
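
A minimal sketch of the attention step that does this amplifying and dampening. Real transformers run many heads in parallel with learned projections; this toy version uses random vectors as stand-ins:

    # Attention, the heart of a transformer: each token is scored against
    # every other token in the context, the scores become weights, and the
    # signal from high-scoring ("key") tokens is amplified while the signal
    # from low-scoring tokens is dampened.
    import numpy as np

    rng = np.random.default_rng(1)
    tokens = rng.normal(size=(5, 16))   # 5 tokens, each a 16-number vector

    scores = tokens @ tokens.T / np.sqrt(16)   # token-vs-token relevance
    weights = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)  # softmax
    attended = weights @ tokens         # each token: weighted mix of the context

    print(weights.round(2))             # rows sum to 1: amplified vs. dampened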
Generative pre-trained transformer (GPT) — A transformer pre-trained on a large dataset and used to generate the most likely outcome, such as a translation or the reading of an X-ray.
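
A minimal sketch of generation: the toy table of next-word probabilities below stands in for a pre-trained model, which a GPT would realize as billions of learned weights rather than a hand-written table:

    # Generation: repeatedly emit the most likely continuation of the text
    # so far, as a GPT does token by token. The probability table is
    # invented for illustration.
    next_word = {
        "the": {"workers": 0.6, "owners": 0.4},
        "workers": {"organize": 0.7, "strike": 0.3},
        "organize": {"unions": 0.9, "parties": 0.1},
    }

    text = ["the"]
    while text[-1] in next_word:
        choices = next_word[text[-1]]
        text.append(max(choices, key=choices.get))  # pick most likely next word
    print(" ".join(text))   # -> "the workers organize unions"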
July-August 2025, ATC 237