There's not a week without the announcement of a new AI office, AI safety institute, or AI advisory body initiated by a government, usually by the democratic governments of this world. They're all wrestling with how to regulate AI, and they seem to settle, without much variation, on a focus on safety.
Last week we saw the Department of Homeland Security in the US join these efforts with its own advisory body, made up largely of industry representatives, with some from academia and civil society, to look at the safety of AI in its own context. And what's remarkable amid all this focus on safety is how little emphasis, or even attention, there is on restricting or putting guardrails around the use of AI by militaries.
And that is remarkable because we can already see the harms of overreliance on AI, even as industry pushes it as its latest opportunity. Just look at the venture capital poured into defense tech, or “DefTech” as it's popularly called. So I think we should push to widen the lens when we talk about AI safety to include binding rules on military uses of AI. The harms are real; these are life-and-death situations. Just imagine somebody being misidentified as a legitimate target for a drone strike, or consider the kinds of uses we see in Ukraine, where facial recognition tools and other data-crunching AI applications are used on the battlefield without many rules around them, because the fog of war also makes it possible for companies to jump into the void.
So it is important that AI safety at least includes a focus on, and discussion of, what constitutes proper use of AI in the context of war, combat, and conflict, of which we see too much in today's world, and that democratic countries put rules in place to make sure the rules-based order, international law, human rights law, and humanitarian law are upheld even in the context of the latest technologies like AI.
- Russia-Ukraine war: How we got here
- Robots are coming to a battlefield near you
- AI explosion, elections, and wars: What to expect in 2024
- Biden & Xi set to agree on regulating military use of AI
- Ukraine’s AI battlefield
- Will AI further divide us or help build meaningful connections?
- How neurotech could enhance our brains using AI