Opinion | Alex Karp, Nicholas Zamiska: U.S. tech companies should help build AI


Alexander C. Karp is co-founder and CEO of Palantir Technologies. Nicholas W. Zamiska is the company’s head of corporate affairs and legal counsel to the office of the CEO. Their book, “The Technological Republic: Hard Power, Soft Belief, and the Future of the West,” will be published in February.

On July 16, 1945, not long after dawn, a group of scientists and government officials gathered at a desolate stretch of sand in the New Mexico desert to witness humanity’s first test of a nuclear weapon. The explosion was described by an onlooker as “brilliant purple.” The thunder from the bomb’s detonation seemed to ricochet and linger in the desert.

J. Robert Oppenheimer, who had led the project that culminated in the test, contemplated that morning the possibility that this destructive power might somehow contribute to an enduring peace. He recalled the hope of Alfred Nobel, the Swedish industrialist and philanthropist, that dynamite, which Nobel had invented, would end wars.

After seeing how dynamite had been used in making bombs, Nobel confided to a friend that more capable weapons, not less capable ones, would be the best guarantors of peace. He wrote, “The only thing that will ever prevent nations from beginning war is terror.”

Our temptation might be to recoil from this sort of grim calculus, to retreat into hope that a peaceable instinct in our species would prevail if only those with weapons would lay them down. It has been nearly 80 years since the first atomic test in New Mexico, however, and nuclear weapons have been used in war only twice, at Hiroshima and Nagasaki. For many, the bomb’s power and horror have grown distant and faint, almost abstract.

The record of humanity’s management of the weapon — imperfect and, indeed, dozens of times nearly catastrophic — has been remarkable. Nearly a century of some version of peace has prevailed in the world without a great-power military conflict. At least three generations — billions of people and their children and grandchildren — have never known a world war. John Lewis Gaddis, a professor of military and naval history at Yale, has described the lack of major conflict in the postwar era as the “long peace.”

The atomic age and the Cold War essentially cemented for decades a calculus among the great powers that made true escalation, not skirmishes and tests of strength at the margins of regional conflicts, exceedingly unattractive and potentially costly. The psychologist Steven Pinker has argued that a broader “decline of violence may be the most significant and least appreciated development in the history of our species.”

It would be unreasonable to assign all or even most of the credit for this to a single weapon. Any number of other developments since the end of World War II, including the proliferation of democratic forms of government across the planet and a level of interconnected economic activity that once was unthinkable, are part of the story.

The great-powers calculus that has helped prevent another world war might also change quickly. But the supremacy of U.S. military power has undoubtedly helped guard the peace, fragile as it might be. A commitment to maintaining such supremacy, however, has become increasingly unfashionable in the West. And deterrence, as a doctrine, is at risk of losing its moral appeal.

The atomic age could soon be coming to a close. This is the software century; wars of the future will be driven by artificial intelligence, whose development is proceeding far faster than that of conventional weapons. The F-35 fighter jet was conceived of in the mid-1990s, and the airplane — the flagship attack aircraft of American and allied forces — is scheduled to be in service for 64 more years. The U.S. government expects to spend more than $2 trillion on the program. But as retired Gen. Mark A. Milley, former chairman of the Joint Chiefs of Staff, recently asked, “Do we really think a manned aircraft is going to be winning the skies in 2088?”

In the 20th century, software was built to meet the needs of hardware, from flight controls to missile avionics. But with the rise of artificial intelligence and the use of large language models to make targeting recommendations on the battlefield, the relationship is shifting. Now software is at the helm, with hardware — the drones in Ukraine and elsewhere — increasingly serving as the means by which the recommendations of AI are carried out.

And for a nation that holds itself to a higher moral standard than its adversaries when it comes to the use of force, technical parity with an enemy is insufficient. A weapons system in the hands of an ethical society, and one rightly wary of its use, will only act as an effective deterrent if it is far more powerful than the capability of an opponent…
