Madrid

The European Commission has proposed a package of new rules aimed at limiting the use of Artificial Intelligence and of facial recognition systems in public spaces. The objective of these measures, according to the institution, is “to guarantee the security and fundamental rights of people and companies” and to strengthen adoption, investment and innovation in AI.

“By setting the standards, we can pave the way for ethical technology around the world and ensure that the EU remains competitive along the way. Our future-proof and innovation-friendly rules will intervene where strictly necessary: when the security and fundamental rights of EU citizens are at stake,” said Margrethe Vestager, European Commissioner for Competition, on the new standards package.

Four levels of risk

The Commission has established four risk levels for the use of Artificial Intelligence systems that can be harmful to citizens. The most dangerous receives the category of “unacceptable risk”. Within this group are systems considered “a clear threat to the security, livelihoods and rights of people”, which will be prohibited under the new legislation. This is the case of those that seek to manipulate human behavior (for example, toys with voice assistants that encourage dangerous behavior in minors) and of those that allow governments to carry out “social scoring” in order to classify citizens and thereby create biases.

Second are the “high risk” systems. Within this group is the use of Artificial Intelligence in critical infrastructures, such as transport, in cases where it can put citizens’ health at risk. It also includes the use of systems to determine access to education and professional careers (for example, in exam grading), as well as their application in robot-assisted surgery, in the recruitment of workers, in essential public or private services (such as applying for a bank loan), in immigration management and in the administration of justice.

According to the Commission, all “high risk” systems will be subject to “strict obligations” before they can be placed on the market. These include risk analysis, the ability to trace results, detailed documentation and human oversight; in addition, the systems must be “robust, safe and accurate”.

In third place are systems representing “limited risk”, as is the case with chatbots. When they are used by a company or institution, “users must be aware that they are interacting with a machine so they can make an informed decision to continue or step back”. Fourth and last, the Commission highlights the “minimal risk” systems, which is where the majority of AI applications fall, such as video games or image apps.

For the moment, this package of rules is a proposal, so it has not entered into force. The European Parliament and the governments that make up the EU must approve it, so the process for its implementation could still take a little over a year. Once approved, it will be binding on all member states.

Facial recognition prohibited in public spaces

Facial recognition is one of the main ethical problems technology has faced in the last decade. Under the European Commission’s new rules, all biometric identification systems, which make it possible to recognize a citizen by their features, become part of the “high risk” group, and their use will be subject to strict requirements. Thus, their use in public spaces will be prohibited except in very specific circumstances. “For example, when it is strictly necessary to search for a missing child, to prevent a specific and imminent terrorist threat or to detect, locate, identify or prosecute a perpetrator or suspect of a serious crime,” the Commission emphasizes.

In addition, their use must first be authorized by a judicial or other independent body and be kept within appropriate limits in terms of duration, geographic scope and the databases searched.

www.abc.es
