“All we have left of liberty is an ad-man's illusion.” ― Jean Baudrillard, 1991, ‘The Illusion of the End’
The surveillance technologies used by the Advertising Industrial Complex follow the lineage of those devised during the Cold War and the state surveillance of members of the Civil Rights Movement in the US in the 1950s, 60s, and 70s. The Advertising Industrial Complex has since repaid this debt: its technological infrastructure and approaches for gathering, storing and processing Big Data are increasingly flowing back into, and inseparable from, those now adopted and relied upon by the Military Industrial Complex.1
Permanent informational, psychological, and ideological hybrid-warfare has become our socio-cultural ground reality. Add to that the rush to apply Big Compute to the statistical number crunching of Big Data as the universal hammer to crack all nuts, and it should come as no surprise that the corporate cloud infrastructure, and the suite of machine learning technologies that run upon it, used to identify human targets for ads is —and has been for some time— the very same corporate cloud infrastructure and suite of machine learning technologies used to identify human targets for bombs.2
“Today, my body was a TV’d massacre, that had to fit into sound-bites and word limits, filled enough with statistics to counter measured response.”
— Rafeef Ziadah, 2011, ‘We teach life, Sir’3
As Dan McQuillan has noted regarding the next token prediction machines, currently masquerading under the marketing term of ‘AI,’ “the business model is the threat model.”4 This self-reinforcing nature in their operation is again made evident, but in the most abominable manner, by Israel’s human target prediction machines.5
Frustrated by the ‘bottleneck’ in target acquisition created when targets are generated by days of painstaking human labour and careful judgement —because murder must be optimised too— Israel’s military now uses prediction machines to generate lists of thousands of human targets within seconds. The moment all of the targets on the current list have been exhausted (or obliterated), the prediction machines are simply made to produce another list of thousands of human targets.
“This is the end of the world that never ends” — Andreas Malm, 8 April 20246
The error rate of these predictions is apparently ten percent.7 A level of imprecision that would certainly not be tolerated in any automated shopping checkout system. There are, however, no unexpected ‘packages’ in the ‘bagging area’ here. All outcomes have been anticipated and authorised. Applied across tens of thousands of generated targets, a ten percent error rate means thousands of people misidentified. The acceptance of this known margin of error, this genocidal embrace with the function approximators that are prediction machines —an embrace that, from the perspective of those targeted, will appear as deliberate and indiscriminate devastation— can only be interpreted as a war crime, a campaign of psychological violence against a civilian population, an act of state-perpetrated terrorism.
Just in case you thought it impossible to imagine anything more barbaric, it is, of course, made yet more horrific by the revelation that the system intentionally locates its human targets in their family homes, and that the officially authorised collateral damage for each strike ranges from 10 to 300 civilian deaths, depending on the predicted military seniority of the target.
“According to sources, the [target prediction] machine gives almost every single person [2.3 million residents] in Gaza a rating from 1 to 100, expressing how likely it is that they are a [Hamas] militant.” — Yuval Abraham, April 3, 2024, reporting for 972 Magazine8
This is the Tyranny of the Recommendation Algorithm given kinetic and malevolent flesh. Israel’s prediction machine for identifying the next human target is a ‘customers who bought x also bought y’ system for the delivery of bombs in place of commodities. It is Facebook’s or LinkedIn’s ‘people who connected to x also connected to y’ system for the network propagation of death in place of influence. Dissatisfied with the slow delivery of climate destruction, this is capitalism selecting expedited delivery of family packs of the ultimate alienation of death, direct to the doorsteps of a people already among those suffering the brunt of its environmental impacts.910 This is not only (as others have noted) the first algorithmic genocide, it is also Surveillance Capitalism for a personalised execution, maximum ‘conversion impact’11 for minimum ad-spend; the first physical manifestation of the weapons-grade Advertising Industrial Complex. It is Ad Intel for targeted missile campaigns, with special care taken not to waste the precision of expensive bombs on the taking of ‘cheap’ lives.
“You don’t want to waste expensive bombs on unimportant people — it’s very expensive for the country and there’s a shortage [of those bombs],”12
There is a key distinction that needs to be made within the wide gamut of machine learning implementations commonly lumped together under the umbrella of ‘AI.’ It is critical to our understanding of the operation of Israel’s target prediction machines that we articulate the implementations at the two extremes of this spectrum, in order that we may situate these target prediction machines along it. By so doing, we can better understand that the nature and function of these machines, far from exonerating us, in fact only serve to further implicate us in their operation.
At one end of this continuum are systems like IBM’s Deep Blue and similar “classical” chess engines. Systems that consist of vast aggregations of centuries of human chess games, paired with human-crafted heuristics for the prediction of the optimal next move. Alongside such machines, we can situate LLMs, machines that are literally formed of human cognition, our subjective value judgements, reduced to a statistically quantified aggregate. As such, they constitute an averaging of us, of all of the good and all of the bad, of our strengths and our weaknesses, our nobilities and our bigotries, our desire and our rage, all mapped and graphed and quantified into cold statistical weightings. There can be no removal of our ideologies from these systems; they are the very material from which these machines are formed.
At the other end of this continuum sits DeepMind’s AlphaZero. Devoid of a single human game of chess, this neural network was simply given the rules and objective of the game, and an otherwise blank slate. Animated by modern compute (in 2017), after just twelve hours of games against itself, it was the strongest chess-playing entity ever created.13 The nature of its playing style, the next move it predicts as optimal, often appears starkly alien, illegible and at odds with legacy human chess strategies. The same can be said of its sibling AlphaGo. Such machines are mysterious and remote from human cognition. Like Deep Blue, they too operate through memorisation and retrieval; their difference stems from the way their searches through and mappings of the possibility spaces of these games have been machinic rather than guided by human intuition and judgement.
Israel’s target prediction machines sit right at the Deep Blue end of this continuum. They are summed subjectivity: an accumulation of the value judgements of Israeli intelligence operatives, the output of whose target identification they have been fed during their training, paired with the cognition of those involved in any fine-tuning. Like LLMs, such target prediction machines are made of us. Without us they are senseless; they can neither sense the world nor make sense of it. While they may uncover patterns we would not have discerned, while parsing volumes of data far beyond our capacity, they do so through our eyes, our minds and our values. These machines manifest a temporal displacement of human value judgements; baked into their training data, these judgements ossify into repeating loops that coerce human complexity into brutally quantised, wholly fabricated averages (like the average Hamas militant). Machines as fictions, tragedies, plays whose scripts generate lists of predicted antagonists. There can be no perfect and complete models; there is no un-coded gaze,14 yet actions based on these predictions are incarcerated within Weber’s iron cage:15 a looping, self-exhausting performance in the theatre of war and commerce; an infernal bombardment of the present with an unyielding rerun of static value judgements and ideologies derived from subjective experiences of the past.16 17
Eventually there were days when Israel’s air force had already reduced the previous list of targets to rubble, and the system was not generating new targets that qualified at the current threshold required for residents of Gaza to be predicted as ‘legitimate military targets,’ or ‘sufficiently connected to Hamas.’ Pressure from the chain of command to produce new targets, presumably from a desire to satisfy internal murder quotas, meant that the bar at which a Gaza resident would be identified as a legitimate Hamas target was simply lowered. At the lower threshold, the system promptly generated a new list of thousands of targets. At what threshold, from 100 down to 1, will the line be drawn, the decision made that the bar can be lowered no more, and the killing stop? Or will the target predictions simply continue while there remain Palestinians to target?
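The mechanics at work here can be sketched in the abstract: lowering a classification threshold over a fixed set of scores mechanically yields more ‘positives,’ with no new evidence involved. A minimal illustration follows; the uniform random scores, threshold values and population size are hypothetical stand-ins, since the real system’s features and scoring are unknown:

```python
import random

random.seed(0)

# Hypothetical stand-in: a population of scores on the reported 1-100 scale.
# Uniform random scores are an assumption for illustration only.
scores = [random.randint(1, 100) for _ in range(100_000)]

def targets_at(threshold):
    """Count how many scores meet or exceed a given threshold."""
    return sum(score >= threshold for score in scores)

# Lowering the bar from 90 to 80 produces a fresh batch of 'targets'
# from exactly the same data — only the line has moved.
assert targets_at(80) > targets_at(90)
```

The point is not the numbers but the mechanism: the threshold is a policy choice, and each lowering of it reclassifies the same people, on the same evidence, as legitimate to kill.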
Spotify’s next song prediction machine will always predict a next song, no matter how loosely the remaining songs match the target defined by your surveilled activity history. It will never apologise and declare: “Sorry, but there are no remaining songs you will enjoy.”
“…even if an attack is averted, you don’t care – you immediately move on to the next target. Because of the system, the targets never end. You have another 36,000 waiting.”18
In the operation of this target prediction machine we see the borders policed and enforced by Israeli forces of oppression and violence extending far beyond the geographical delimitation imposed by the security fence and the endless checkpoints that control and segment the territory of Gaza.19 Here we must discern the violence of algorithmically enforced epistemological borders: a brutal quantisation, categorisation and demographic segmentation laid bare by an abominable ordinality of annihilation; a monstrous doom scroll of genocidal recommendation.
In an essay published in 2023, Baldur Bjarnason argues that the psychology of interacting with next word prediction machines (LLMs),20 and their users’ propensity to accept their predictions, closely resembles the psychic con —and the six steps of cognitive bias and psychological manipulation within it. Unsurprising perhaps, given how deeply intertwined next token prediction machines, like LLMs, are with the Advertising Industrial Complex and its fundamental reliance upon surveillance, psychological profiling and manipulation.
Many of the tricks and biases Bjarnason describes can indeed be seen at play in the operation of these target prediction machines, but the first step in the psychic con has particular resonance here. The logic is straightforward. Most people are not interested in attending a seance, or in what a psychic may predict for them were they to attend. So the audiences that do choose to attend psychics, of course, self-select towards those most likely, or most willing, to be taken in by, and go along with, the predictions made by psychics. Likewise, most people are not frustrated by ‘bottlenecks’ of human cognition slowing down the process of generating lists of humans to kill. So those that seek to develop machines capable of generating, in mere seconds, a list of thousands of human targets to kill, of course, self-select towards those most likely to go along with the predictions made by such target prediction machines and most willing to authorise execution of all those listed.
When the Israeli chain of command is presented with this deluge of data, this list of thousands of human targets, it is not so difficult to anticipate what happens next. In fact, the least plausible detail of this horrific yet eminently important report is perhaps the assertion that each target is given as much as twenty seconds of human oversight before the strike is authorised (to check they are not female). After all, this is the same chain of command that chose, whether motivated by desperation or malice, to develop and employ this inhumane prediction machine in the first place, and to accept its margin of error while defining the abhorrent varying degrees of ‘acceptable’ collateral damage.
“Do you have enough bone-broken limbs to cover the sun?
Hand me over your dead, and give me the list of their names in one thousand two hundred word limits.” — Rafeef Ziadah, ‘We teach life, Sir’
When we opt for the generation of predictions to be performed by machines, especially when they operate —as they always do— at scales far beyond the capacity of human parsing, we have self-selected and are pre-committed to the execution of those predictions. These predictions are then executed without the degree of human judgement that would otherwise have been applied had each prediction been calculated by a human. Authority has been ceded to the machine; our lived experience subordinate to its perceived ‘higher’ statistical truth.
Are we to believe that the Big Tech corporations, on whose infrastructure these target prediction machines rely, could reasonably expect deployment of such systems to not result in genocide?21
Here we can see the familiar seizure of epistemic authority by prediction machines, as well as the nature of this repeat coup (and surrender). The characteristic first step is an overwhelming volumetric demand placed on human cognition:22 a buffer-overflow attack on human information processing capacity. This diagnosis of human shortfall in productivity is twofold (as is the common assessment delivered by capitalism’s twin henchmen: optimisation and efficiency), and pertains to both our speed of processing input and the rate of our production of output.23 Our response to this assault on human capacity is to cede curation (processing, sense-making and filtering)24 of this deluge to the machines. The importance of curation must not be underestimated. The subjective ordering and filtering that are the core aspects of curation form the fundamental root of all creative acts. Therefore this defeat, the full capture and occupation by the machines of the territories of curation, is beyond a mere breach of the city perimeter; it is the total dismantling of our defences and the instigation of a machinic siege that seeks to limit the flow of information to only that given to us by this Apparatus of Attention. The second step is the ceding of production and the elevation to supreme epistemic authority of the statistical ‘truths’ produced through machinic parsing of the deluge. This inhuman(e) production, this information, the predictions from these machines, from our new overlords, both informs and, of course, commands.
In order to make sense of the world for the machine —in a lethal application of the logics of the Advertising Industrial Complex, which itself took its methodologies from military intelligence— Israeli intelligence officers and their surveillance infrastructure reduce Palestinians and the details of their lives to a set of quantified measures. This reduction, this assertion that the Palestinian people can be read, reduced and measured, this alienation from their humanity, is amplified in and by the runtime operation of the target prediction machines, transforming the psychological turmoil of mass murder into routine number crunching.
Historically, the epistemic problematic within capitalism was that only what can be known can have value and, conversely, only what has value can be known. Within systems like Israel’s target prediction machines, what can be known and what is deemed to have value are imposed through the epistemic and ideological judgements made through human cognition when generating the data on which the system was trained —paired with those imposed during any ‘fine-tuning.’ The process of having to reduce the world down to machine-parsable data not only serves to train the machine to view the world in those terms; those engaged in gathering this data become habituated to viewing the world in these terms too. Yet retracing, in step with the machine, precisely the paths along which value judgements are applied at runtime, to reach the same output, to arrive at what the machine has deemed to be known, is beyond the capacity of human cognition, and so, mysterious. As Zhanpei Fang has noted, rejecting this ‘uninterpretability’ becomes politically imperative.25 However, the temporally displaced value judgements in the ‘dead’ human labour inside these machines, and the interpassivity that this stored cognition —and the ceding to a perceived statistical objectivity— brings to their runtime operation, result in a psychological distance within its operators that appears to afford some sense of impunity.
According to personnel working with these systems, this interpassivity brought to the process of target acquisition by statistical prediction machines —and to the execution of those targets— the cold presence of only the ‘dead’ human labour26 in-the-loop (rather than live human subjectivity), is seen as a feature rather than a bug. Overcoming the production limitations of human cognition is a familiar theme in the development of so-called “AI.” Yet here, it seems that, beyond the parsing of a vast deluge of data and the instant generation of thousands of target predictions, it is the impunity from responsibility garnered from this deceptive temporal displacement of human agency —and the clear consciences this affords— that is perhaps the key value proposition for the current users of the system. Why unnecessarily burden the psyche of a valuable soldier —already grieving their own losses— if blood on hands and atrocities on consciences can be avoided by having the machines automate away the inconvenient trauma of cold-blooded mass murder (or at least the perceived responsibility for it)?27
“The machine did it coldly. And that made it easier.”28
Is the new:
“I was just following orders.”
This constitutes a further psychic con: an anti-memetic that leaves the users of next target prediction machines in a state of selective amnesia as to where the responsibility lies for the acts they have committed during their working day. Reminiscent of the TV show Severance, the personnel appear to return home after a long day ‘at the office’ working with these machines with remarkably clear consciences, at least for those involved in the perpetration of a genocide. Operating inside the cybernetics of surveillance and control, once epistemic authority is ceded to these statistical machines, whatever they fail to see, any ‘territory’ that does not align with their predicted map, is deemed to have no measurable value; it can therefore be expunged with impunity, with no perceived loss.
“From Bangladesh to Guatemala, Sudan to Myanmar, genocides might have been perpetrated with varying degrees of complicity from the capitalist core: but here we are dealing with something qualitatively different. A useful comparison would be with the genocide against the Bosnian Muslims – an event that shaped my own political youth. With an arms embargo, the West denied that people the right to defend themselves; through their retreat from Srebrenica, the Dutch forces knowingly handed over that town to Ratko Mladić; in the four years of the war, the so-called international community stood by as Bosnian Muslims were decimated. But these were, primarily, acts of omission. The West did not arm Republika Srpska with the best bombs from its arsenals. Bill Clinton did not fly in to hug Slobodan Milošević. The slaughter was not accompanied by the constant refrain ‘the Serbian nationalists have the right to defend themselves’. What we are seeing now might be the first advanced late capitalist genocide.” — Andreas Malm, 8 April 202429
Time and again the presumption that these machines, the “slow AI” of corporations included, sufficiently encapsulate and apply human reasoning, and the subsequent dereliction and surrender of our responsibilities to them, leads to disaster for humanity and our planet. Sam Altman of OpenAI recently sought 7 trillion USD of investment to fund the vast infrastructure required to support the growing demands of running his prediction machines. It is estimated that the cumulative sum spent globally on nuclear weapons has reached over 10 trillion USD. For what do we need these prediction machines or these monstrous weapons, other than to compensate for our inability and reluctance to respect the rights of all other humans and life on this planet? Certainly not for the prediction of the priorities of those with power over these technologies of terror, or to predict the consequences of using them.
Whether a prediction machine predicts the next video or post in a stream of content, or instead predicts the next word-token, pixel in an image, spacetime patch in a video, or slice of audio to be included within any single piece of content being ‘generated,’ or instead predicts the next human target for extermination, the same ambivalent number-crunching machines apply the same senseless logic to generate, within an accepted margin of error, output deemed to be “good enough” to execute.
“AI is a tool like any technology, wielded by the powerful to serve their interests. If it obliterates entire sections of the employment economy, that is because the powerful are using it to reduce their labour costs. And if it facilitates the automation of death, it is because the powerful are using it to advance a project of colonial subjugation and extermination. It represents not a radical break with human history, in other words, but a radical intensification of business as usual: the rich getting richer, the poor getting poorer and the powerless getting crushed by a machinery of power that is increasingly sophisticated and enduringly barbaric.” — Mark O’Connell, April 14th, 2024, Irish Times
The dropping of bombs onto Gaza based on the output of human target prediction machines constitutes a machinic execution by numbers. Code executions in every sense. Code has always been executed and utterances have always killed. Death Sentences are, of course, nothing new, but this is human sentencing through code execution, seemingly given less deliberation than double clicking an ‘.exe’ file attached to an email.
So much about the deployment of these target prediction machines for the instrumentalisation of genocide, from their nomenclature (Israeli forces named the one that targets buildings: ‘The Gospel’), to the location of the seat of power from where this brutal retribution has been cast down, from the compute cloud of Judaeo-Christian Big-Tech infrastructure, from the boardrooms of the powerful unto the homes of the powerless, is a smiting of the Old Testament variety.
“The tongue has the power of life and death” — The Bible, (Proverbs 18:21)
We need not be rocket scientists (or Tony Stark), nor do we require the help of prediction machines, to see where this is all heading —if, indeed, it is not utterly redundant to say we are ‘heading’ anywhere at all, after the world has stood idly by, even aided and abetted30 a six-month (and counting) genocide, and it is not even the only one.31
Our destination is all too predictable. The humans suffering at the blunt end of prediction machines are increasingly the very same humans suffering at their sharp end. The workers whose labour is exploited to make sense of the world, painstakingly labelling training data for a poverty wage, are increasingly the very people bombed in their homes and repelled at our borders by the machines their labour has built.32 In other words, capital has closed the loop. The labour of the subjugated, discriminated and exterminated is the labour that builds the machines ‘responsible’ for their subjugation, discrimination and extermination.33
Steven Moffat’s script for a recent Doctor Who episode lays it all out quite succinctly:
“Life is cheap. ‘Patients’ are expensive. The Valenguard Algorithm. … Valenguard battle products are fitted with AI. The algorithm maintains a fighting force at just above the acceptable number of casualties. Keeps you fighting. Keeps you dying. Keeps you buying. Medical services ‘optimise’ the casualty rate for continued conflict. War is business, and business is booming.”
…
“There’s nobody else here. You declared war on an empty planet. There are no Castarians in the mud. They are not in the fog. There are no Castarians. Just the algorithm maintaining an acceptable casualty rate in the face of nothing at all. You’re fighting your own hardware and it’s killing you at just the right amount to keep you buying more.
Most armies would notice they are fighting smoke and shadows. But not this lot, Ruby. You know why? Because they have faith. Faith. The magic word that keeps you from having to think for yourself.”
— Doctor Who (Whoniverse), Episode 3, ‘Boom’, written by Steven Moffat
There is no one else here. We are fighting ourselves. A cycle of violence accompanied by empty “thoughts and prayers.” There is but one way to peace, as Moffat’s Doctor makes clear, telling us to: “Just surrender, and it’s all over.”
But surrender runs counter to the logics of capital; a script that quantises to a binary oblivion, then a self-annihilating singularity. The desire for the abdication of responsibility, for the continuation of conscience-free exploitation, extraction and accumulation of power and wealth, for climate destruction and genocide without guilt, threatens to hasten the transfer of control to machines with ever fewer humans in-the-loop. A repeating loop of the ideologies of the past, formed in the interests of those with power.
The “AI” currently on active manoeuvres is not the marketing-hyped super-intelligence whose claimed benevolence will solve all our climate problems and deliver capitalism the bottomless well of growth and profit it so desires, but neither is it the Terminator. Instead it more closely resembles Miyazaki’s boar god, Nago: a technologically instantiated pestilence, animated by our own greed, vanity and rage. While this beast finds its most direct manifestation in Israel’s target prediction machines (augmented with munitions supplied by the West), the same toxic pestilence must also be seen as manifest in the vast new compute facilities currently being erected to house the hardware and cooling needs of these resource-devouring prediction machines.3435
For nearly two decades, Boston Dynamics has been showing us exactly where this is descending. Their monstrous BigDog robot —or a reengineered copy thereof— can now be seen in operation, as a large-gauge automatic-rifle-carrying (and firing) member of Chinese battalions, during joint military exercises in Cambodia.36 Mmm, don’t you just love the smell of ‘progress’ in the morning?37
…
And I recount, I recount a hundred dead, a thousand dead.
Is anyone out there?
Will anyone listen?
…
*please listen to Rafeef Ziadah’s full recital: ‘We teach life, Sir’.
Update: Lee Mordechai, Assistant Professor in history at Israel’s ‘leading academic research institution,’ Hebrew University, confirms reports of the horrors taking place in Gaza, viewed from the other side of the security fence. https://twitter.com/LeeMordechai/status/1803052066652205400
Further Reading …
Yuval Abraham’s report for 972 Magazine is perhaps the most important account of the deployment of machine learning based prediction technology in recent years. https://www.972mag.com/lavender-ai-israeli-army-gaza/
Andreas Malm’s piece for Verso, ‘The Destruction of Palestine is the Destruction of the Earth,’ is incredible. Please do read it. https://www.versobooks.com/en-gb/blogs/news/the-destruction-of-palestine-is-the-destruction-of-the-earth
“According to the sources, the army knew that the minimal human supervision in place would not discover these faults. “There was no ‘zero-error’ policy. Mistakes were treated statistically,” said a source who used Lavender. “Because of the scope and magnitude, the protocol was that even if you don’t know for sure that the machine is right, you know that statistically it’s fine. So you go for it.””
Neural net based chess engines have continued to improve. At the time of writing the top engine has an Elo rating over 3600, almost a thousand points above the top-rated human. The fact they continue to improve should underscore to us that their output is a prediction of the next best move, not absolute certainty. The moves predicted from each position now could well be overturned by the superior predictions of the engines of the next generation.
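To put that thousand-point gap in concrete terms: under the standard Elo model, the weaker player’s expected score across a rating gap is 1 / (1 + 10^(gap/400)). A quick sketch (the 1000-point gap is an approximation derived from the ratings mentioned above):

```python
def expected_score(rating_gap):
    """Expected score of the weaker player under the standard Elo model."""
    return 1 / (1 + 10 ** (rating_gap / 400))

# Across a ~1000-point gap, the human's expected score per game is
# roughly 0.3% — effectively certain defeat, yet still only a prediction.
assert expected_score(1000) < 0.005
```

In other words, even the Elo framework itself only ever predicts outcomes probabilistically; it never delivers certainty, any more than the engines it rates do.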
Dr. Joy Buolamwini coined the term ‘the coded gaze’ when discovering the hidden biases baked into machine vision systems and the algorithmic violence that such biases inflict on those disadvantaged through the operation and deployment of these systems.
There are more possible games of chess than there are atoms in the known universe, and yet a chess board consists of just sixty-four squares and thirty-two pieces; each can only move in limited ways and only a single piece can move at a time. The population of Palestine is nearly 5.5 million (2.3 million in the Gaza strip), each with their own aspirations, interrelations and patterns of movement. Chess has certainty: it is possible to state without doubt that a sequence of moves will force checkmate in a set number of moves. The real world is not reducible to such certainties. It is not possible to make a target prediction machine, of the kind constructed by Israel, that would sit anywhere near the AlphaZero end of the continuum defined above. The rationale and qualifying criteria for the execution of Gaza residents does not reduce to cold, ideologically free logic or rules; there is no blank slate that can be found to start from, or simple objective that can be defined. Without pollution from a vast aggregation of human rage and human value judgements, no amount of Big Compute or hours of ‘self-play’ will result in an AlphaZero-style machine that identifies Palestinian targets the way AlphaZero itself predicts optimal next moves in chess, absent of human cognition and illegible to it.
Of course, capitalism likes nothing more than infinite (surplus) value extraction, or ‘getting paid’ for an old bit of (someone else’s) labour over and over and...
As articulated in my previous post, these are technologies of erasure. Their template, that of the airbrush. Central to their function is the removal of that which those developing them and those employing them prefer not to see.
This echoes the demands on production that outstripped human labour capacity in the pre-industrial era, prior to the mill owners adopting automated looms. Brian Merchant’s book, ‘Blood in the Machine,’ provides an excellent account of this history.
“Musk's claim that brain-computer interfaces could overcome our information absorption speed, which he sees as a main limit to our ability to keep up with advances in machine intelligence, is telling: [He’s claiming] Humans should adapt to machines, not vice versa.” https://x.com/DorotheaBaur/status/1797982910542393825
It must be noted here that the ‘sense’ made from the data by the machine relies upon the sense-making labour of ghost workers, who are often those suffering both the blunt and sharp ends of these machines of prediction.
See: ‘Work Without Workers: Labour in the Age of Platform Capitalism’ by Phil Jones and this brilliant post ‘Data is Dead Labour: The Lie of Artificial Intelligence’ by Adam Jones
Rather than amplifying accumulation and exploitation, is it not time that these machines extended our capacity for sharing and respect? Rather than policing our borders, is it not time that these machines extended our capacity to welcome? Rather than weaponising our prejudices and practices of exclusion, is it not time that these machines extended our capacity for empathy and acceptance? Rather than automating our rage and our vengeance, is it not time that these machines extended our capacity for kindness and reconciliation?