Fo4 crafting automatrons crashes

12/27/2022

Photos of the drone were quickly uploaded to social media, where weapons experts identified it as a KUB-BLA "loitering munition" made by Zala Aero, the dronemaking arm of Russian weapons maker Kalashnikov. Colloquially referred to as a "kamikaze drone," it can fly autonomously to a specific area and then circle for up to 30 minutes. The drone's operator, remotely monitoring a video feed from the craft, can wait for enemy soldiers or a tank to appear below. In some cases, the drones are equipped with A.I. software that lets them hunt for particular kinds of targets based on images that have been fed into their onboard systems. In either case, once the enemy has been spotted and the operator has chosen to attack it, the drone nose-dives into its quarry and explodes.

The war in Ukraine has become a critical proving ground for increasingly sophisticated loitering munitions. That's raised alarm bells among human rights campaigners and technologists who fear they represent the leading edge of a trend toward "killer robots" on the battlefield: weapons controlled by artificial intelligence that autonomously kill people without a human making the decision.

Militaries worldwide are keeping a close eye on the technology as it rapidly improves and its cost declines. The selling point is that small, semiautonomous drones are a fraction of the price of, say, a much larger Predator drone, which can cost tens of millions of dollars, and don't require an experienced pilot to fly them by remote control. Infantry soldiers can, with just a little bit of training, easily deploy these new autonomous weapons. "Predator drones are superexpensive, so countries are thinking, 'Can I accomplish 98% of what I need with a much smaller, much less expensive drone?'" says Brandon Tseng, a former Navy SEAL who is cofounder and chief growth officer of U.S.-based Shield AI, a maker of small reconnaissance drones that use A.I. for navigation and image analysis.

Dagan Lev Ari, the international sales and marketing director for UVision, an Israeli defense company that makes loitering munitions, says demand had been inching up until 2020, when war broke out between Armenia and Azerbaijan. In that conflict, Azerbaijan used advanced drones and loitering munitions to decimate Armenia's larger arsenal of tanks and artillery, helping it achieve a decisive victory. That got many countries interested, Lev Ari says. The U.S. has begun major purchases, including UVision's Hero family of kamikaze drones, as well as the Switchblade, made by a rival U.S. manufacturer. The Ukraine war has further accelerated demand, Lev Ari adds. "Suddenly, people see that a war in Europe is possible, and defense budgets are increasing," he says.

Although less expensive than certain weapons, loitering munitions are not cheap. For example, each Switchblade costs as much as $70,000, after the launch and control systems plus munitions are factored in, according to some reports. The U.S. is said to be sending 100 Switchblades to Ukraine. They would supplement that country's existing fleet of Turkish-made Bayraktar TB2 drones, which can take off, land, and cruise autonomously, but need a human operator to find targets and give the order to drop the missiles or bombs they carry. More primitive versions of such weapons have been around since the 1960s, starting with a winged missile designed to fly to a specific area and search for the radar signature of an enemy antiaircraft system. What's different today is that the technology is far more sophisticated and accurate.

In theory, A.I.-enabled weapons systems may be able to reduce civilian war casualties. Computers can process information faster than humans, and they are not affected by the physiological and emotional stress of combat. They might also be better at determining, in the heat of battle, whether the shape suddenly appearing from behind a house is an enemy soldier or a child.

But in practice, human rights campaigners and many A.I. researchers warn, today's machine-learning-based algorithms can't be trusted with the most consequential decision anyone will ever face: whether to take a human life. Image recognition software, while equaling human abilities in some tests, falls far short in many real-world situations, such as rainy or snowy conditions, or scenes with stark contrasts between light and shadow. It can often make strange mistakes that humans never would. In one experiment, researchers managed to trick an A.I. system into thinking that a turtle was actually a rifle by subtly altering the pattern of pixels in the image.

Even if target identification systems were completely accurate, an autonomous weapon would still pose a serious danger unless it were coupled with a nuanced understanding of the entire battlefield. An A.I. system may accurately identify an enemy tank, but not understand that it's parked next to a kindergarten, and so should not be attacked for fear of killing civilians.