Can Today's Machine Learning Pass Image-Based Turing Tests?

Apostolis Zarras*, Ilias Gerostathopoulos, Daniel Méndez Fernández

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceeding › Academic › peer-review

Abstract

Artificial Intelligence (AI) in general, and Machine Learning (ML) in particular, have received much attention in recent years, thanks also to advancements in computational infrastructures. One prominent application of ML is image recognition services that recognize characteristics in images and classify them accordingly. One question that arises, also in light of current debates fueled by emotions rather than evidence, is to what extent such ML services can already pass image-based Turing Tests. In other words, can ML services imitate human (cognitive and creative) tasks to an extent that their behavior remains indistinguishable from human behavior? If so, what does this mean from a security perspective? In this paper, we evaluate a number of publicly available ML services for the degree to which they can be used to pass image-based Turing Tests. We do so by applying selected ML services to 10,500 randomly collected captchas comprising approximately 100,000 images. We further investigate the degree to which captcha solving can become an automated procedure. Our results strengthen our confidence that today's available, ready-to-use ML services can indeed be used to pass image-based Turing Tests, raising new questions about the security of systems that rely on this image-based technology as a security measure.

Original language: English
Title of host publication: Information Security. ISC 2019
Editors: Z. Lin, C. Papamanthou, M. Polychronakis
Publisher: Springer, Cham
Pages: 129-148
Number of pages: 20
Volume: 11723
ISBN (Electronic): 978-3-030-30215-3
ISBN (Print): 978-3-030-30214-6
DOIs
Publication status: Published - 2019

Publication series

Series: Lecture Notes in Computer Science
Volume: 11723
ISSN: 0302-9743
