
Vladimir Vapnik

Name
  
Vladimir Vapnik



Born
  
December 6, 1936, Soviet Union

Institutions
  
Facebook AI Research Group; NEC Laboratories America; Adaptive Systems Research Department, AT&T Bell Laboratories; Royal Holloway, University of London; Columbia University

Known for
  
Vapnik–Chervonenkis theory; Vapnik–Chervonenkis dimension; support vector machine; statistical learning theory; structural risk minimization

Notable awards
  
Kampé de Fériet Award (2014); C&C Prize (2013); Benjamin Franklin Medal (2012); IEEE Frank Rosenblatt Award (2012); IEEE Neural Networks Pioneer Award (2010); Paris Kanellakis Award (2008); Member of the U.S. National Academy of Engineering (2006); Gabor Award, International Neural Network Society (2005); Alexander von Humboldt Research Award (2003)

Books
  
The Nature of Statistical Learning Theory

Alma mater
  
Russian Academy of Sciences, National University of Uzbekistan


Doctoral advisor
  
Aleksandr Lerner



Vladimir Naumovich Vapnik (Russian: Владимир Наумович Вапник; born 6 December 1936) is one of the main developers of the Vapnik–Chervonenkis theory of statistical learning, and the co-inventor of the support vector machine method.


Early life and education


Vladimir Vapnik was born in the Soviet Union. He received his master's degree in mathematics from Uzbek State University, Samarkand, Uzbek SSR, in 1958 and his Ph.D. in statistics from the Institute of Control Sciences, Moscow, in 1964. He worked at that institute from 1961 to 1990, becoming Head of the Computer Science Research Department.

Academic career


At the end of 1990, Vladimir Vapnik moved to the USA and joined the Adaptive Systems Research Department at AT&T Bell Labs in Holmdel, New Jersey. While at AT&T, Vapnik and his colleagues developed the theory of the support vector machine and demonstrated its performance on a number of problems of interest to the machine learning community, including handwriting recognition. The group later became the Image Processing Research Department of AT&T Laboratories when AT&T spun off Lucent Technologies in 1996. Vapnik left AT&T in 2002 and joined NEC Laboratories in Princeton, New Jersey, where he worked in the Machine Learning group. He has also held a position as Professor of Computer Science and Statistics at Royal Holloway, University of London, since 1995, and as Professor of Computer Science at Columbia University, New York City, since 2003. As of February 2017, he had an h-index of 115, and his publications had been cited close to 180,000 times overall; his book Statistical Learning Theory alone had been cited close to 60,000 times.
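The support vector machine mentioned above learns a maximum-margin separating hyperplane by minimizing a regularized hinge loss. The sketch below is a minimal, self-contained illustration of that idea using a Pegasos-style subgradient solver on a toy two-class dataset; the data, hyperparameters, and training loop are illustrative assumptions, not Vapnik's original formulation (which solves a constrained quadratic program).

```python
import random

# Toy 2D data: two linearly separable classes, labels in {-1, +1}.
data = [((2.0, 2.0), 1), ((3.0, 3.5), 1), ((2.5, 3.0), 1),
        ((-2.0, -1.5), -1), ((-3.0, -2.5), -1), ((-1.5, -3.0), -1)]

def train_svm(samples, lam=0.01, epochs=200):
    """Linear SVM via subgradient descent on lam/2*||w||^2 + hinge loss."""
    w, b, t = [0.0, 0.0], 0.0, 0
    rng = random.Random(0)  # fixed seed for reproducibility
    for _ in range(epochs):
        rng.shuffle(samples)
        for x, y in samples:
            t += 1
            eta = 1.0 / (lam * t)  # decaying learning rate
            margin = y * (w[0] * x[0] + w[1] * x[1] + b)
            if margin < 1:
                # Point violates the margin: shrink w and step toward y*x.
                w = [(1 - eta * lam) * wi + eta * y * xi
                     for wi, xi in zip(w, x)]
                b += eta * y
            else:
                # Only the regularizer contributes to the subgradient.
                w = [(1 - eta * lam) * wi for wi in w]
    return w, b

def predict(w, b, x):
    """Sign of the decision function w·x + b."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else -1

w, b = train_svm(list(data))
correct = sum(predict(w, b, x) == y for x, y in data)
print(f"{correct} of {len(data)} training points classified correctly")
```

On cleanly separable data like this, the learned hyperplane separates both classes; in practice one would use a mature solver rather than this hand-rolled loop.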


On November 25, 2014, Vapnik joined Facebook AI Research, where he is working alongside his longtime collaborators Jason Weston, Ronan Collobert, and Yann LeCun. In 2016, he also joined Vencore Labs.

Honors and awards

Vladimir Vapnik was elected to the U.S. National Academy of Engineering in 2006. He received the 2005 Gabor Award, the 2008 Paris Kanellakis Award, the 2010 IEEE Neural Networks Pioneer Award, the 2012 IEEE Frank Rosenblatt Award, the 2012 Benjamin Franklin Medal in Computer and Cognitive Science from the Franklin Institute, the 2013 C&C Prize from the NEC C&C Foundation, the 2014 Kampé de Fériet Award, and the 2017 IEEE John von Neumann Medal.

Selected publications

  • On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities (with A. Y. Chervonenkis), 1971
  • Necessary and Sufficient Conditions for the Uniform Convergence of Means to Their Expectations (with A. Y. Chervonenkis), 1981
  • Estimation of Dependences Based on Empirical Data, 1982
  • The Nature of Statistical Learning Theory, 1995
  • Statistical Learning Theory, Wiley-Interscience, 1998, ISBN 0-471-03003-1
  • Estimation of Dependences Based on Empirical Data, reprint, Springer, 2006; also contains a philosophical essay on empirical inference science