As humans, we have a knack for estimating another person’s age quite accurately just by glancing at their face. Although age estimation may seem relatively simple to us, computers have a much more difficult time performing the task. In one of the latest attempts to build a computer that can accurately estimate a person’s age, researchers have taken a bottom-up approach to the challenge, collecting hundreds of thousands of images and videos from the Internet to train the system. Their goal is to build a universal human age estimator that is applicable to all ethnic groups and various image qualities.
The researchers, Bingbing Ni of the Advanced Digital Sciences Center in Singapore, along with Zheng Song and Shuicheng Yan of the National University of Singapore, have published their study in a recent issue of IEEE Transactions on Multimedia.
To begin, the researchers developed an automatic web image and video mining scheme in which they used age-related search queries to collect nearly 400,000 images from popular image search engines such as Flickr and Google Images, as well as 10,000 YouTube video clips. The face images were tagged with ages and used to develop a novel learning algorithm for training the system. Although the faces in the video clips were not age-tagged, the researchers could still use them for training, since they showed the same face at different angles and under different lighting conditions.
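The key idea of the mining scheme is that each age-related search query implies an age label, so images retrieved for that query arrive pre-tagged without manual annotation. The study's exact query templates are not given in this article, so the templates in the following sketch are hypothetical stand-ins:

```python
# Illustrative sketch of age-related query generation for web mining.
# The actual query templates used in the study are not published here,
# so the templates below are hypothetical examples of the general idea.

def ordinal(n):
    """Return the English ordinal string for n (e.g. 21 -> '21st')."""
    if 10 <= n % 100 <= 20:
        suffix = "th"
    else:
        suffix = {1: "st", 2: "nd", 3: "rd"}.get(n % 10, "th")
    return f"{n}{suffix}"

def age_queries(min_age, max_age):
    """Yield (query, age) pairs: each search query is paired with the
    age label that any face images it returns would be tagged with."""
    for age in range(min_age, max_age + 1):
        yield f"{age} years old", age
        yield f"my {ordinal(age)} birthday", age

# Each image fetched for a query inherits that query's age tag,
# yielding weakly labeled training data at web scale.
pairs = list(age_queries(20, 22))
```

Because the labels come from search text rather than human annotators, they are noisy, which is why the researchers' pipeline also removes poor-quality images and false alarms before training.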
Although several other computerized human age estimators have been built, this system has by far the largest database of facial images: after poor-quality images and false alarms were removed, the database contained 77,000 images with 219,000 faces. In addition, unlike some of the previous, smaller databases, this one included faces of people from different racial groups, photographed under different illumination conditions and at different image qualities. This diversity is what gives the system its “universal” capability.