A Study of the N-D-K Scalability Problem in Large-Scale Image Classification

Image classification is an extensively studied problem that lies at the heart of computer vision. The challenge remains, however, to develop a system that can identify and classify thousands of objects as the human visual system does. The accumulation of massive image data sets has made it possible to study this problem at a big-data scale, yet current algorithms have been shown to fall short of being practical and accurate at that scale. To better understand how these algorithms scale, we developed a library of functions to explore the scalability of the support vector machine (SVM) linear classification algorithm when applied to image classification problems. Our study provides valuable insights not only into how the SVM algorithm scales up and where it falls short, but also into how to create smarter and more efficient image classifiers that are fine-tuned for the large-scale image classification challenge.
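
The sketch below is not the library described in the study; it is a minimal illustration, assuming scikit-learn's LinearSVC as a stand-in linear SVM and synthetic features in place of real image descriptors, of how training cost can be probed along the three axes implied by the title: N (training images), D (feature dimensionality), and K (number of object classes).

```python
# Hypothetical scalability probe (illustration only, not the authors' code).
import time
import numpy as np
from sklearn.svm import LinearSVC

def time_linear_svm(n_samples, n_features, n_classes, seed=0):
    """Train a one-vs-rest linear SVM on random features and return wall-clock training time."""
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((n_samples, n_features))   # stand-in for image feature vectors
    y = rng.integers(0, n_classes, size=n_samples)     # stand-in for object class labels
    clf = LinearSVC()                                  # one-vs-rest linear SVM by default
    start = time.perf_counter()
    clf.fit(X, y)
    return time.perf_counter() - start

# Sweep K (number of classes) while holding N and D fixed; the same pattern
# applies to sweeping N or D.
for k in (10, 50, 200):
    print(f"K={k}: {time_linear_svm(n_samples=5_000, n_features=256, n_classes=k):.2f}s")
```

Because the one-vs-rest scheme trains one binary classifier per class, each over all N samples and D features, this kind of sweep makes the growth in training cost with K visible directly, which is the flavor of question the N-D-K scalability study examines.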