
Technology is great, but in some cases things are not as rosy as they seem at first glance. Today we will talk about the most dangerous trends in technology and their negative impact on the modern world.
Technology is advancing by leaps and bounds and is not about to stop! Unfortunately, such progress is not always associated with something good.
In fact, even the most popular trends in technology could potentially have serious consequences for users' privacy and security.
Stopping this is almost impossible, but knowing the enemy by sight is half the battle.
Budget smart speakers
The Amazon Echo was the first consumer-grade smart speaker, released in 2014. Since then, many major brands have developed their own devices, and we have seen the arrival of popular products such as the Google Home and Apple HomePod.
In itself, a smart speaker acting as a home assistant was a revolution in artificial intelligence and machine learning.
However, the privacy implications of using such a device are quite controversial, and it has repeatedly become a subject of dispute.
One important thing is worth noting: the best-known smart speakers on the market are genuinely safe, since giants like Amazon, Google and Apple are unlikely to ship a product with weak security.
However, the same cannot be said of other brands that produce budget devices - they are popular too, because not everyone is ready to spend over $100 on a device from Google or Amazon.
Since not every user can afford a premium smart speaker at home, cheaper and lower-quality devices have found their place in the market. Unfortunately, like many IoT ("Internet of Things") products, these smart speakers ship with too few security features, which puts your home at risk.
Unreliable facial recognition software
Facial recognition software has come a long way since its inception about ten years ago. The technology has brought many benefits, such as unlocking a phone without a password and helping the authorities search for missing people.
Under ideal conditions, the technology can deliver near-perfect results - accuracy as high as 99.9%! But when facial recognition applications are tested, high-quality images with uniform lighting and straight-on angles are typically used, and in the real world photos rarely look like that.
In poor lighting, accuracy drops sharply. The same goes for makeup, facial hair, glasses, piercings and medical masks.
Facial recognition software can still operate in less-than-ideal conditions, but the belief that it is rarely wrong is worrying, especially given that even low-quality software is used to track people and report crimes.
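To see why conditions matter so much, it helps to know that these systems reduce each face to a numeric embedding and declare a match when two embeddings are close enough to a tuned threshold. Below is a toy sketch of that decision; the vectors and the 0.8 threshold are made up for illustration (real systems compute embeddings with deep neural networks):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two face embeddings (illustrative only).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_same_person(emb1, emb2, threshold=0.8):
    # A "match" is simply similarity above a tuned threshold.
    # Poor lighting, masks or glasses shift the embedding, lowering
    # similarity and producing false rejections (or false matches).
    return cosine_similarity(emb1, emb2) >= threshold

# Hypothetical embeddings: the same person under good vs. poor lighting.
reference  = [0.9, 0.1, 0.4, 0.2]
good_light = [0.88, 0.12, 0.41, 0.19]
poor_light = [0.5, 0.6, 0.1, 0.7]

print(is_same_person(reference, good_light))  # True
print(is_same_person(reference, poor_light))  # False: same face, rejected
```

The second comparison fails not because the person changed, but because the capture conditions did - which is exactly the fragility described above.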
Autonomous vehicles
Autonomous car cybersecurity is no joke. Unlike compact devices that fit in your pocket, a car system without the proper level of security can affect not only the user's personal information and data, but also their physical security.
Yes, self-driving cars are not yet the main means of transportation, but they are already in use in many cities around the world.
Vehicles of this type are almost always connected to the Internet. They constantly send various indicators and information from sensors located throughout the vehicle to a centralized cloud system.
Car manufacturers do their best to keep vehicles safe, but no online or offline system is 100% secure, as evidenced by countless corporate hacks around the world.
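To make that telemetry stream concrete, here is a minimal sketch of a signed sensor upload. The field names and the shared key are hypothetical (real manufacturers use proprietary protocols), but it illustrates the point: an unsigned or poorly keyed channel is exactly the kind of weakness attackers look for.

```python
import hashlib
import hmac
import json

# Hypothetical shared secret provisioned to the vehicle; a leaked or
# weak key here would let an attacker forge telemetry or commands.
SECRET_KEY = b"example-vehicle-key"

def sign_telemetry(payload: dict) -> dict:
    # Serialize the sensor readings deterministically, then attach an
    # HMAC-SHA256 tag so the cloud can verify integrity and origin.
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()
    return {"body": body.decode(), "signature": tag}

def verify_telemetry(message: dict) -> bool:
    # The cloud side recomputes the tag and compares in constant time.
    expected = hmac.new(SECRET_KEY, message["body"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["signature"])

reading = {"vehicle_id": "demo-123", "speed_kmh": 42.0,
           "lidar_ok": True, "gps": [52.52, 13.40]}
msg = sign_telemetry(reading)
print(verify_telemetry(msg))   # True: untampered message accepted

msg["body"] = msg["body"].replace("42.0", "142.0")
print(verify_telemetry(msg))   # False: tampered reading rejected
```

Even with such integrity checks in place, the weak link is usually key management or the software around it, which is why no connected vehicle can be called fully secure.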
Deepfakes are getting more popular
In its time, the deepfake was a real miracle of modern technology. Producing even a short deepfake video featuring one or more people required huge amounts of visual data and a powerful computer.
In the past, deepfakes were made only of famous people (politicians or celebrities), to spread misinformation and try to destroy a particular person's reputation.
But now everything is completely different.
Modern technology has advanced so far that almost any user can create a deepfake of anyone. You no longer need hundreds of photos and videos from different angles: a few profile pictures from social networks and a short video clip of the person are enough - and that's it, the deepfake is ready!
In addition, the parallel with facial recognition software is worth noting. A recent study by Sungkyunkwan University in South Korea found that even the most trusted facial recognition software can be fooled by a deepfake.
Lack of privacy
Privacy is considered one of the fundamental rights in a person's life, and for good reason. It is the cornerstone of free speech and of a person's ability to express themselves, live peacefully and preserve their dignity.
At the same time, privacy is one of the least protected human rights. Not to mention that many people simply don't care: surveys show that 13% of Internet users worldwide are willing to give up their personal information in exchange for free access to online content and services.
Over the past few years, there have been several attempts to legislate privacy protections, such as the GDPR in Europe and the California Consumer Privacy Act (CCPA).
However, commercial organizations are not actually prohibited from collecting personal information about users - the rules only require the user's (i.e., your) permission.
This is why you now see a pop-up on almost every website asking you to accept cookies: people are tired of the constant notifications and just blindly accept all of them, without even thinking about what kind of data the website is asking for...
Conclusion
Some trends in technology can have a negative impact on users around the world, and the main problem is that they are beyond anyone's control. Unfortunately, there is only one way to escape them - abandon technology and go offline - but, to be honest, that is quite difficult.
Although there is nothing you can do to stop what is happening, knowing the problem will help you prepare for the worst.