br3d
They shoot down training drones all the time, so shooting things down isn't novel. "In anger" is a normal phrase to describe doing something in a conflict situation, in contrast to training
I think the problem is that you're not explaining which country you're thinking of, and you seem to be suggesting it's the same legal situation everywhere
Those are much, much higher up, which introduces a lot of signal latency. The Starlink types are low down, which makes the comms faster (and also means they keep burning up in the atmosphere)
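A rough back-of-the-envelope sketch of why altitude matters, assuming a satellite directly overhead, signals at roughly the speed of light, geostationary orbit at ~35,786 km and a Starlink-style shell at ~550 km, and ignoring ground-network and processing delays:

```python
# Rough one-way signal travel time to a satellite directly overhead.
# Assumes c ~ 300,000 km/s; altitudes are approximate.
C_KM_PER_S = 300_000

for name, altitude_km in [("Geostationary", 35_786), ("Starlink-style LEO", 550)]:
    one_way_ms = altitude_km / C_KM_PER_S * 1000
    # Ground-to-ground round trip via the satellite is roughly 4x one-way
    # (up + down for the request, up + down for the reply).
    print(f"{name}: ~{one_way_ms:.1f} ms one way, ~{4 * one_way_ms:.0f} ms round trip")
```

That works out to a round trip on the order of half a second for geostationary versus a few milliseconds for low orbit, before any other delays are added.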
Don't ask online strangers for medical advice. Go to a doctor if you're worried
The comparison with the current excitement about AI is interesting. Look at the language people use to describe the behaviour of LLMs
I can't remember if it was worded exactly like that, but something like what you're describing did happen, with everything suddenly becoming computerised. I distinctly remember a TV science presenter looking to camera and using the phrase "...feeds into the inevitable microprocessor"
When are we going to stop using the term "AI" to mean "uses a computer"?
And that's not to mention the massive ongoing subsidies needed to provide the infrastructure and deal with the externalities. If anyone is craving a state-interventionist communist lifestyle, it's our friend above
You seem to be suggesting that because some level of risk is inevitable, any level of risk is acceptable. There's a big difference between minimal practical risk and reckless levels of risk, but your construction doesn't capture that with its crude binary of "risk or no driving". We could drive with far less risk, eg by enforcing speed limits with technology
Politicians indulging populist nonsense for short-term electoral advantage? Let's hope not - I think we've seen enough of that recently...
Blaming individuals for systems failures is the oldest trick in the book for avoiding fixing things. Google "The nut that holds the wheel" if you want to learn more, eg this article