I have always wanted to talk about this.
The dangers of self-driving cars.
Well, over the years, technology has advanced from the time we used donkeys and horses to now, when we are even talking about self-driving cars. I'm an engineer (still in training), and I can boldly say that no technology is 100% reliable and free from errors. That said, these self-driving cars are surely being created to better people's lives, yes, but let's look at the side effects.
1. No human instinct
Being human is what makes us unique and different from everything else, be it machine or animal, simply because of our humanity. Machines are programmed, so they do as they are told. For instance, when there is a kid up ahead trying to cross the road, I doubt the machine would detect that. A second can save a person's life, and I don't think machines can reason for themselves.
2. Kidnapping
Kidnapping will be on the rise again. Look at it like this: every program has a database, and in that database all your activities would be stored. Now imagine if the wrong hands got hold of that. What would happen to the customers? Kidnappers would be able to know everyone's movements, allowing them to plan their moves without any rush.
3. Virus Attack.
Just as humans get viruses, so do computers. I'm sure the makers of these cars will install strong antivirus software or something, but some viruses cannot be cured, because you can't cure what you can't find. So I believe that virus attacks are also a danger.
4. Hacking.
Just as technology advances, so does the cyber world. New programs are being written and new ways to program are being devised. Hacking is very difficult, but for someone who knows what he or she is doing, it can be quite easy.
Imagine a rich man acquires such a car and is on his way back from work. He meets traffic up ahead. The car might have been programmed to find another route, but which route will that be? If a crime is to be carried out, it is always thought through and planned to avoid complications. The traffic up ahead in that illustration could have been staged by hackers to convince the man that everything is okay, and the alternative route might be their most convenient spot. Who knows.
A good hacker finds a back door into any system.
So the risk of being hacked is also a danger.
Well, I will stop here for now, but there are other dangers like weather, technical glitches, etc.
I don't see any danger in the AI itself, because it is going to drive better than most humans do.
Of course there will be some mistakes, but far fewer than if a human were driving.
The most dangerous thing, in my opinion, is hackers. It could be an easy option for killers to hack into a self-driving car and let it drive into a wall at full speed without getting their hands dirty.
I am so, so hesitant about the prospect of self-driving cars. In my opinion, it is a very cool research project that should remain just that: a research project. It is a very cool concept. But to be put into mass adoption? To fill our streets with them? I think not.
For one, you face the very real prospect of unexpected road conditions. I don't mean weather; I'm sure they will be able to figure that out. But if you've ever driven a long distance, e.g. cross-country, you know that electronic mapping systems can be very outdated and things can change very quickly. Again, I'm sure they will have plenty of ways to adapt to this, but it's just one of many areas in which I see room for error when it comes to these death machines.
For another, you risk the entire spectrum of malfunctions that could possibly occur. I'm not saying likely to occur, I'm saying possible. Sure, humans are awful drivers. But all it takes is a few small mishaps in electronics to kill dozens, even hundreds of people. Think about all the technology we already have, including the technology in vehicles. It malfunctions all the time. Ok, maybe "all the time" is a bit of an exaggeration on the higher end of things, but the point is: things happen.
I was reading something a while back about the ethical side of self-driving cars and how they would more or less be designed to sacrifice human life to prevent other outcomes if the chance of catastrophe were over a certain percentage. That sounds reasonable, however morbid, but really think about that. That means the car makes up its mind sometimes in hundredths of a second. I don't trust any computer that much. Even the developers of the car, if I remember correctly, said that although they believed it was the morally correct decision, they would never "drive" one. That's as good an indicator as any, in my opinion.
Also, there is the very real possibility of hackers. I've heard this discussed as well, and some people like to brush it off like it's some sort of impossibility. Like, "oh, no, the security would be far too good for that." Now, I'm sure they would have top-of-the-line security. But even the most (publicly available) top-of-the-line security these days doesn't entirely eliminate the possibility of hackers.
Aside from the possibility of your car being taken control of, there's another thing to consider about hackers. If the day comes when we all have self-driving cars, they will probably be interconnected with our personal lives: automatically knowing addresses (obviously), contacts, even things like wallet addresses (I'm assuming by that point crypto will be broadly accepted). So even if a hacker can't control the vehicle, they could potentially gain access to all of someone's other information. We've all heard the folk horror stories of people's lives being traced through the GPS in their phones by hackers. It's all too real.
The last thing I'd like to bring up quickly, as I don't know too much about it, is radiation. I've heard that's a large concern. I get concerned enough being around all these cell signals all the time.
I don't think self-driving cars are a good idea. Not anytime in the near future, anyway.
Artificial Intelligence and Human Stupidity.