
According to Science Friday, artificial intelligence is creeping into law enforcement applications, and it is time to set clear standards about what is fair and legal.

Facial recognition technology is all around us—at concerts, airports, and apartment buildings. But its use by law enforcement agencies and in courtrooms raises particular concerns about privacy, fairness, and bias, according to Jennifer Lynch, the Surveillance Litigation Director at the Electronic Frontier Foundation.

