
National Institutional Ranking Framework


National Institutional Ranking Framework (NIRF) is a methodology adopted by the Ministry of Human Resource Development (MHRD), Government of India, to rank institutions of higher education in India. The Framework was approved by the MHRD and launched by the Minister of Human Resource Development on 29 September 2015. There are separate rankings for different types of institutions depending on their areas of operation, such as universities and colleges, engineering institutions, management institutions, pharmacy institutions and architecture institutions. The Framework uses several parameters for ranking purposes, such as resources, research, and stakeholder perception. These parameters are grouped into five clusters, and each cluster is assigned a weightage that depends on the type of institution. About 3500 institutions voluntarily participated in the first round of rankings. The ranked lists were released by MHRD on 4 April 2016.


Formation of the NIRF

MHRD organized a one-day workshop on 21 August 2014 on evolving methodologies for the ranking of institutions of higher education in India. The meeting resolved to constitute a Committee for evolving a National Ranking Framework. It was later decided to also co-opt representatives of Central Universities and IIMs into the proposed Committee. Based on these decisions, a Core Committee consisting of 16 members was constituted on 29 October 2014 with Secretary (HE), MHRD, as Chairperson and Additional Secretary (TE), MHRD, as Member-Secretary. The other members were the Directors of the IITs at Kharagpur and Madras, the Vice-Chancellors of Delhi University, EFL University, Central University of Gujarat and JNU, the Directors of the IIMs at Ahmedabad and Bangalore, the Directors of the School of Planning and Architecture (Delhi), NIT (Warangal), ABV-Indian Institute of Information Technology & Management (Gwalior), IISER (Bhopal) and NAAC (Bangalore), and the Chairperson of NBA (New Delhi).

The terms of reference of the Committee were:

  • Suggest a National Framework for performance measurement and ranking of
    1. Institutions;
    2. Programmes;
  • Suggest the organizational structure, institutional mechanism and processes for implementation, along with time-lines, of the National Ranking Framework.
  • Suggest a mechanism for financing of the Scheme on National Ranking Framework.
  • Suggest linkages with National Assessment and Accreditation Council (NAAC) and National Board of Accreditation (NBA), if any.

The Core Committee identified a set of measurable parameters to be used as metrics for ranking the institutions. These parameters were grouped into five major headings. The committee suggested the weightages to be assigned to various groups of parameters in the case of institutions of engineering education and left the task of carrying out similar exercises for institutions of other disciplines to other competent agencies. The initial draft of the report was prepared by Surendra Prasad, Chairman, National Board of Accreditation and Member of the Core Committee.

The University Grants Commission constituted an Expert Committee on 9 October 2015 to develop a framework for the ranking of universities and colleges in India, and the framework developed by this Expert Committee has been incorporated into NIRF. The Core Committee also suggested a framework for ranking institutions offering management education. The All India Council for Technical Education developed parameters and metrics for ranking institutions offering pharmacy education and architecture education.

Recommendations of the Core Committee

The following are some of the recommendations of the Core Committee:

  • The metrics for ranking of engineering institutions should be based on the parameters agreed upon by the Core Committee.
  • The parameters have been organized into five broad heads or groups and each group has been divided into suitable sub-groups. Each broad head has an overall weight assigned to it. Within each head, the sub-heads should also have appropriate weight distributions.
  • A suitable metric has been proposed that computes a score under each sub-head. The sub-head scores are then added to obtain a score for each individual head. The overall score is computed based on the weights allotted to each head and can take a maximum value of 100. (A minimal sketch of this computation is given after this list.)
  • The Committee recommended the classification of institutions into two categories:
    1. Category A institutions: institutions of national importance set up by Acts of Parliament, State Universities, Deemed-to-be Universities, Private Universities and other autonomous institutions.
    2. Category B institutions: institutions affiliated to a University that do not enjoy full academic autonomy.
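
The weighted-sum computation recommended above can be illustrated with a short Python sketch. The group names and weightages used here are placeholders chosen for illustration only (the actual parameter groups and weightages are those given in the tables referred to in the sections below, and they differ by type of institution); only the arithmetic of summing sub-head scores and forming a weighted total out of 100 follows the Committee's recommendation.

    # Illustrative sketch of the recommended scoring scheme. The group names
    # and weightages below are placeholders, not the official NIRF values.
    GROUP_WEIGHTS = {
        "Teaching, Learning & Resources": 0.30,
        "Research & Professional Practice": 0.30,
        "Graduation Outcomes": 0.15,
        "Outreach & Inclusivity": 0.15,
        "Perception": 0.10,
    }

    def group_score(sub_head_scores):
        # Each head's score is the sum of its sub-head scores (on a 0-100 scale).
        return sum(sub_head_scores)

    def overall_score(sub_heads_by_group):
        # The overall score is the weighted sum of head scores; its maximum value is 100.
        return sum(GROUP_WEIGHTS[group] * group_score(scores)
                   for group, scores in sub_heads_by_group.items())

    # Example: head scores of 80, 60, 70, 75 and 50 give
    # 0.30*80 + 0.30*60 + 0.15*70 + 0.15*75 + 0.10*50 = 68.75 overall.
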
Engineering, management, pharmacy and architecture institutions

The approved set of parameter groups and the weightages assigned to them in respect of institutions offering programmes in engineering, management, pharmacy and architecture are given in the following table.

Universities and colleges

The approved set of parameter groups and the weightages assigned to them in respect of universities and colleges are given in the following table.

Rankings in 2016

The results of the first round of rankings were released on 4 April 2016. Since there were major inconsistencies in data relating to Category B institutions in all domains, no rankings were announced for Category B institutions in 2016. Also, since there was not enough participation in the domains of Architecture and General Degree Colleges, no rankings were announced for these.

Criticism of the Rankings in 2016

    1. "There was no cross-verification of data before announcing the ranking. The data used for evaluation were submitted by the institutions themselves and the responsibility for accuracy and authenticity of the data lies with the respective institutions."
    2. "The stated intent of the government was to prepare India-centric ranking parameters that were sensitive to metrics such as access to higher education and social inclusion. Interestingly, the weightage given to India-specific parameters is not pronounced."
    3. "The IITs have chosen to participate in the rankings under the “engineering” category. They should have competed under the category of “universities” ".
    4. Institutions devoted to specific disciplines like Institute of Chemical Technology is ranked along with multidisciplinary universities like JNU/BHU.
