My research mainly focuses on learning theory, approximation theory, and related subjects. Analyzing and processing data is an important task in many fields of science and technology, for which machine learning provides efficient algorithms. Learning theory studies the mathematical foundations of machine learning and helps design new learning algorithms. Starting from work on statistical learning theory and support vector machines, the field has developed rapidly and has raised many challenging tasks and theoretical issues concerning massive, high-dimensional data: clustering, ranking, feature selection, dimension reduction, sparsity, and deep learning. Besides the fundamental tools provided by probability, statistics, and optimization, methods and ideas from approximation theory play an important role in learning theory: capacity estimates for function spaces are crucial in probabilistic bounds for uniform convergence, while estimating the approximation error is a fundamental issue in understanding models and hypothesis spaces for the consistency of learning algorithms.
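The interplay between capacity estimates and approximation error mentioned above can be sketched by the standard excess-risk decomposition (an illustrative, generic formulation; the symbols below are not tied to any specific result):

```latex
% Excess-risk decomposition (illustrative; symbols are generic assumptions):
% f_z  : the output of the learning algorithm on sample z (e.g. an empirical
%        risk minimizer over the hypothesis space H),
% f_H  : a minimizer of the true risk E over H,
% f_rho: the target (regression) function minimizing E over all functions.
\[
  \mathcal{E}(f_{\mathbf{z}}) - \mathcal{E}(f_\rho)
  = \underbrace{\mathcal{E}(f_{\mathbf{z}}) - \mathcal{E}(f_{\mathcal{H}})}_{\text{sample (estimation) error}}
  \;+\;
  \underbrace{\mathcal{E}(f_{\mathcal{H}}) - \mathcal{E}(f_\rho)}_{\text{approximation error}}
\]
% Capacity estimates for H (e.g. covering numbers) control the first term
% through uniform-convergence bounds; approximation theory controls the second.
```

The first term is typically bounded by probabilistic uniform-convergence arguments that depend on the capacity of the hypothesis space, while the second is a purely approximation-theoretic quantity measuring how well the hypothesis space can represent the target function.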