
Risk Assessment Framework Consultation: Summary Report

24 January 2020

List of acronyms and abbreviations

AQF Australian Qualifications Framework
EFTSL Equivalent Full Time Student Load
ESOS Act Education Services for Overseas Students Act 2000
ESPSE Educational Services (Post-Secondary Education) Award 2010
FTE Full Time Equivalent
GOS Graduate Outcomes Survey
HDR Higher Degree Research
HEIMS  Higher Education Information Management System
HESF Higher Education Standards Framework (Threshold Standards) 2015
PIR Provider Information Request
QILT Quality Indicators for Learning and Teaching
SES Student Experience Survey
TAFE Technical and Further Education
TCSI  Transforming the Collection of Student Information
TEQSA Tertiary Education Quality and Standards Agency
TEQSA Act Tertiary Education Quality and Standards Agency Act 2011
USI Unique Student Identifier
VET Vocational Education and Training

 

Executive summary

This document provides an account of the consultation process, summarises stakeholder feedback on the Risk Assessment Framework, and communicates TEQSA’s focus areas and forthcoming plans to the sector.

In May 2019, the Tertiary Education Quality and Standards Agency (TEQSA) began consulting the higher education sector on its Risk Assessment Framework (‘the Framework’). The current Framework has been in place since 2014, and it is therefore timely for TEQSA to consider ways in which the Framework can be improved to effectively and comprehensively reflect the nature and operations of higher education providers.

The consultation process commenced in May 2019 with a critical friends consultation, followed by a consultation paper that was released in July 2019, and finally, a series of workshops held across five cities—Melbourne, Perth, Brisbane, Adelaide, and Sydney. The workshops were well attended, with 271 representatives from 141 higher education providers and key peak bodies. TEQSA also received 17 written submissions representing various parts of the sector—independent providers, faith-based providers, pathway providers, TAFEs, and universities.

Through the consultation process, TEQSA asked stakeholders to consider: the effectiveness of the Framework and the risk indicators; areas for improvement to the Framework; whether the Framework should be universal or differentiated; the use of provider context and regulatory history; the publication of provider performance data and the risk thresholds; and sector risks.

Broadly, stakeholders were supportive of the overall structure of the Framework and its key risk areas. However, recurring feedback included requests to consider more of the contextual factors that affect risk indicators such as Attrition, Graduate Destinations, and Student-Staff Ratio. The issue of data lag in the risk assessments also featured prominently throughout the workshops. Stakeholders also proposed potential new risk indicators, suggested revised methodologies for existing risk indicators, and gave their insight into the risks faced by the sector today. Moreover, stakeholders asked TEQSA for more guidance on the types of information and contextual factors taken into account in the risk assessments.

While some stakeholder comments may not feature in this summative document, TEQSA has reviewed the feedback it has received and will consider it in planning and implementing changes to the Framework. TEQSA will endeavour to do so in a fair and transparent manner that does not deviate from the design principles set out in the Consultation Paper. TEQSA aims to finalise the next version of the Framework and communicate changes to the existing version in the first half of 2020.

Risk Assessment Framework consultation process

TEQSA’s consultation process consisted of three key phases as outlined below:

Critical Friends Consultation (May 2019)

In May 2019, TEQSA engaged with a number of critical friends across the sector to help shape the format, content, and focus of the risk assessment workshops. These critical friends represented various parts of the sector, including independent providers, faith-based providers, pathway providers, TAFEs, and universities. Their input helped TEQSA determine the approach taken for the risk assessment workshops.

Consultation Paper (July 2019)

TEQSA released a Risk Assessment Consultation Paper on 15 July 2019 to consult the higher education sector on the approach to risk assessments. Seven questions were posed to the higher education sector. TEQSA encouraged stakeholders to respond to these questions, and/or present other relevant considerations for TEQSA to factor into its future planning. TEQSA received 17 written submissions during the consultation period. These stakeholders have presented relevant and important considerations for TEQSA as it implements the next iteration of the Framework.

Risk Assessment Consultation Workshops (July – September 2019)

TEQSA hosted a series of Risk Assessment Consultation Workshops across Melbourne, Perth, Brisbane, Adelaide and Sydney between July and September 2019. All registered higher education providers in Australia were invited to attend, and the workshops were attended by representatives from over 80% of Australian higher education providers. Each workshop had at least one Commissioner present to hear first-hand from, and engage with, participants. Each workshop began with an overview of the Framework, followed by interactive sessions in which participants provided individual feedback on the existing Framework and group feedback on emerging sector risks. This process allowed participants to deepen their understanding of TEQSA’s current risk assessment approach and enabled open-ended dialogue between providers and TEQSA on ways it can further enhance its monitoring and assessment of risk.

Response from the sector

TEQSA has consolidated the feedback it received from written submissions and the consultation workshops. A summary of the feedback has been included in the following sections of this report, followed by next steps.

The utility of the Framework

During the consultations, stakeholders were asked to comment on whether the existing risk indicators are fit for purpose. The feedback indicated broad support for TEQSA’s annual risk assessment process, which was seen as an indication of TEQSA’s maturity as a regulator. The current version of the Framework can be found on TEQSA’s website.

Stakeholders encouraged TEQSA to continue engaging and collaborating with providers, and appreciated how the support of case managers and TEQSA’s knowledge of providers contributed to the consideration of provider contexts in the risk assessment process. However, there were many suggestions that TEQSA could better map and communicate how each risk indicator relates to the Higher Education Standards Framework (Threshold Standards) 2015 (HES Framework) and to the National Code of Practice for Providers of Education and Training to Overseas Students 2018 (National Code).

Universal versus differentiated Frameworks

Stakeholders were asked whether TEQSA should maintain a universal risk assessment framework for all providers, or develop differentiated risk assessment frameworks. Generally, there were mixed views on whether the Framework should be universal or differentiated.

TEQSA received a considerable number of comments indicating a preference to see the Framework and risk thresholds differentiated by provider characteristics. The main differentiating characteristics included provider size and mission, fields of education, regional status, equity group profile, proportion of overseas students, and Australian Qualifications Framework (AQF) levels. Moreover, some stakeholders suggested that a ‘one size fits all’ approach to sector regulation ‘disadvantages groups of providers’, and that a more nuanced and contextualised approach to determining risks to students is needed.

  • “All [providers] are different and have their own nuances.”
     
  • “…no one size fits all. Risk[s] that apply to certain organisations are not relevant to others.”

Some stakeholders supported having a universal Framework for all higher education providers on the basis that the AQF, HES Framework, Education Services for Overseas Students (ESOS) Act, and National Code are universally applied across the sector. There were also some suggestions that it would be helpful to have a universal framework as a baseline, with allowance made for consideration of provider context and responses, and a transparent rationale for this.

  • “The [Framework] needs to be the same given that there is one AQF, HES Framework, ESOS Act, and National Code.”
     
  • “The Framework [needs to be] universal for consistency and comparability.”

Other suggestions included: taking an integrated approach that has a mixture of universal and differentiated elements; differentiating the risk indicator weightings for different provider groups; allowing for more flexibility in the interpretation of the HES Framework for small or niche providers; and reviewing calculations to ensure the validity of the risk indicators for all provider types and delivery models.

Publishing the risk thresholds

TEQSA received mixed feedback on whether it should publish the risk thresholds or maintain the confidentiality of this information.

The main rationale from stakeholders in support of publishing the thresholds centred on the need for transparency and communication with the sector. Comments also suggested it would facilitate benchmarking, provide guidance on TEQSA’s risk appetite, and assist providers with target-setting, self-assessment and monitoring of their own risks. There were also comments indicating it would help providers focus resources efficiently, better understand and proactively address their performance, and synchronise risk management practices.

  • “Communicating the risk thresholds… to providers would be useful and [help focus] strategies and discussions.”
     
  • “The thresholds need to be published to provide guidance to providers around TEQSA’s risk appetite.”

Proponents of maintaining the confidentiality of the risk thresholds argued that publication would draw attention to the quantitative component of the risk assessments and undermine the qualitative aspects such as provider context, regulatory history, and provider responses to the draft risk assessments. Moreover, there were concerns that this would inhibit an open and constructive dialogue between TEQSA and providers, and lead providers to focus solely on the thresholds rather than on the underlying processes that support continuous improvement. There were also concerns about gaming, and about the creation of league tables based on purely quantitative and outdated data without contextual moderation.

As an alternative, some stakeholders proposed publishing value ranges rather than thresholds, with the risk assessment indicating where a provider sits within the ‘risk band’. Other proposed solutions included more nuanced risk categories (moderate-low/moderate-high), and introducing best practice information as a way to facilitate benchmarking and to provide general guidance on how close or far a provider is from the thresholds.

  • “Publishing the TEQSA risk thresholds will lose ‘continuous improvement’ and lead to gaming, and would miss the key analysis done by TEQSA.”
     
  • “Publishing the thresholds may lead to changes in good risk management by focussing on ticking boxes rather than quality.”

Publishing provider-level data

In relation to the publication of provider-level risk data, there was overwhelming support for maintaining confidentiality. Many stakeholders stated that publishing provider-level data was ‘unnecessary’ and cited reasons similar to those given for retaining the confidentiality of the thresholds. For example, stakeholders highlighted the risk of oversimplified league tables and ‘pseudo rankings’ being misused by competitors, the resulting reputational risks faced by providers, and attention being directed solely towards quantitative data, which makes up only one component of the risk assessments.

Only a minority of the feedback received indicated a preference for publishing provider performance data. There were requests that, if TEQSA were to publish performance data, its accuracy be validated and confirmed by providers prior to publication.

The importance of provider context

Contextual factors as input

Overall, stakeholders supported TEQSA’s approach of incorporating contextual factors and provider responses prior to finalising the risk assessments. Most stakeholders affirmed the need for a more nuanced and contextualised approach to assessing risks.

To enhance transparency, stakeholders also requested that TEQSA release further guidance on how it takes contextual factors into account when conducting risk assessments.

Moreover, a considerable number of stakeholders provided interrelated comments on the Graduate Destinations indicator. Stakeholders noted how factors such as industry and labour market conditions, local employment rates, course type, field of education, pathway providers, graduate demographics, and regional status may affect a provider’s graduate destination rate. Some of these are external factors beyond a higher education provider’s control. Stakeholders also commented that the current definition is incompatible with the nature and reality of employment, as it does not take into account graduates working on a part-time, contract, or self-employed basis.

Suggestions for a revised definition included contextualisation for local employment rates, regional or metropolitan profiles, field of study employment rates, and graduate labour market location and economic conditions. Providers specialising in pathways to further study were also highlighted as requiring specific consideration in relation to the Graduate Destinations indicator.

In relation to student experience, some stakeholders suggested that TEQSA could better take into account context by factoring proportions of enrolments by study area into its risk indicator(s) for student experience.

Other comments on how TEQSA can consider contextual factors included: having pre-assessment discussions with providers or ‘ongoing’ risk assessments; ensuring TEQSA’s own internal consistency, including case manager continuity; considering provider change trajectory and improvements over time; allowing more time for providers to respond to risk assessments; and taking into account emerging trends and shifting expectations.

Risk indicators for student profile, performance and outcomes

Overall, there was broad support for the main areas of focus in relation to student performance and outcomes. However, there were common themes that emerged from the consultation process on areas for further improvement.

Attrition

TEQSA received feedback noting issues with the current use of raw attrition data and recommending a move to adjusted attrition. Some stakeholders saw adjusted attrition as a way to avoid counting, against a provider’s attrition statistics, withdrawals driven by factors beyond the provider’s control. Stakeholders acknowledged the circumstances that have so far prevented TEQSA from using adjusted attrition, and supported a move to this data following the expansion of the Unique Student Identifier (USI) to include all higher education students.

There was also a relatively large number of comments requesting that TEQSA track attrition across all years of coursework rather than only in the commencing year. This would ensure that the attrition calculation suits all types of academic periods and takes into account reasons for student attrition and cases where students are on a leave of absence.
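
To illustrate the distinction being drawn here, the sketch below compares raw and adjusted attrition for a single hypothetical commencing cohort. It assumes a simplified rule in which adjusted attrition excludes students who continue their studies at another provider (the kind of movement the expanded USI is expected to make visible); the field names, figures and adjustment rule are illustrative assumptions, not TEQSA’s published methodology.

```python
# Illustrative sketch only: simplified raw vs adjusted attrition for one
# commencing cohort. Field names and the adjustment rule are assumptions
# for illustration, not TEQSA's published methodology.

def attrition_rates(cohort):
    """Return (raw, adjusted) attrition rates for a list of student records."""
    total = len(cohort)
    raw_attrited = [s for s in cohort if not s["returned_or_completed"]]
    # Adjusted attrition does not count students who moved to another provider.
    adjusted_attrited = [s for s in raw_attrited
                         if not s["continued_at_other_provider"]]
    return len(raw_attrited) / total, len(adjusted_attrited) / total

cohort = (
    [{"returned_or_completed": True, "continued_at_other_provider": False}] * 80
    + [{"returned_or_completed": False, "continued_at_other_provider": True}] * 8
    + [{"returned_or_completed": False, "continued_at_other_provider": False}] * 12
)
raw, adjusted = attrition_rates(cohort)
print(f"raw attrition: {raw:.0%}, adjusted attrition: {adjusted:.0%}")
# raw attrition: 20%, adjusted attrition: 12%
```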

Student or graduate satisfaction

Stakeholders indicated broad support for adding student satisfaction as an indicator, either to complement or to replace the existing Graduate Satisfaction indicator. It was broadly accepted that student satisfaction would provide a more timely and accurate reflection of student experience, and enrich the existing data available.

Furthermore, a few responses noted that student and graduate perceptions of teaching are not the only measures of student experience. Other factors highlighted as influencing student experience included support for transition to tertiary education, pastoral care, literacy support, student support and representation services. There was a suggestion that TEQSA could consider having an indicator for student outcomes which combines several weighted factors.

Graduate destinations

Stakeholders highlighted issues with the current Graduate Destinations indicator, including the significant variation in graduate employment outcomes across different fields of education, the lack of control or influence providers have over survey response rates, and the perception that the current definition of the indicator does not capture emerging employment arrangements. Some stakeholders believed that this measure is ‘biased toward domestic students’. Apart from the contextual factors proposed earlier in this report, stakeholders suggested that TEQSA could consider using the overall employment rates from the Quality Indicators for Learning and Teaching (QILT) Graduate Outcomes Survey (GOS), or modify the existing definition to factor in graduates in forms of employment other than full-time employment or further study. One submission also suggested that TEQSA could consider incorporating employer satisfaction, regional employment outcomes and graduate starting salaries into the Framework.
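
To show what the suggested broader definition would change in practice, the sketch below compares a measure limited to full-time employment or further study with one that also counts part-time, contract and self-employed graduates. The response categories and counts are invented for illustration and do not reflect GOS reporting or TEQSA’s actual calculation.

```python
# Illustrative only: effect of broadening the Graduate Destinations measure to
# count employment other than full-time, as suggested by stakeholders.
# Categories and counts are invented for illustration.

responses = {
    "full_time_employed": 120,
    "part_time_or_casual": 45,
    "self_employed_or_contract": 15,
    "further_study": 30,
    "not_employed_or_studying": 40,
}
total = sum(responses.values())

narrow_rate = (responses["full_time_employed"] + responses["further_study"]) / total
broad_rate = (total - responses["not_employed_or_studying"]) / total

print(f"full-time employment or further study: {narrow_rate:.0%}")  # 60%
print(f"any employment or further study:       {broad_rate:.0%}")   # 84%
```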

Other indicators

Some stakeholders contended that treating growth in student load as a risk creates a dilemma for providers whose accreditation was approved on the basis of the business case they supplied and who are pushed towards growth by their own stakeholders.

A number of stakeholders also highlighted that the Completions indicator is not meaningful in and of itself, being an indirect outcome of other indicators such as Student Load and Progress. These stakeholders suggested that TEQSA could better reflect the relationship between these indicators.

Finally, some stakeholders pointed to the importance of engaging students and student unions, and suggested aligning TEQSA’s approach with the model proposed in the Final Report on the Government’s performance-based funding for universities.

Risk indicators for academic staffing profiles

During the consultation process, TEQSA posed the question of whether its risk indicators for academic staffing were sufficiently robust, and whether there are other measures that might be more suitable to monitor risk.

Senior Academic Leaders

A significant number of stakeholders indicated they were broadly comfortable with TEQSA’s existing definition of senior academic leaders. However, a small number of stakeholders expressed concerns about the indicator in the written submissions and workshops. One such view was that the indicator does not provide an appropriate reflection of academic governance.

Other stakeholders questioned the correlation between senior academic leaders and teaching quality, and suggested that TEQSA should consider only including senior academic leaders who dedicate a minimum proportion of their time to teaching activities as compared to research. There was a suggestion that the proportion of senior academics as a percentage of total academic staff, contextualised by the size of enrolments, would be a better measure of balance. Finally, some suggested that TEQSA could consider relating the definition of a ‘senior academic leader’ to the types of leadership roles within an institution (e.g. role function) in addition to, or instead of, academic staff level.

  • “It is difficult to identify a simple staffing metric that measures academic governance effectively and therefore the evidence relating to academic engagement, structures, committees, policies…”
     
  • “TEQSA [should] consider whether both job classification and role function information would allow for better benchmarking across the sector.”

Student-to-staff ratio

The current indicator measures the ratio of the total equivalent full-time student load (EFTSL) of coursework students to the total full-time equivalent (FTE) teaching staff.
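
As a point of reference, the sketch below shows this calculation with invented figures; how casual and fractional staff are converted into a teaching FTE estimate is an assumption made here for illustration only, not TEQSA’s method.

```python
# Minimal sketch of the student-to-staff ratio as defined above, using
# invented figures. The conversion of staff into teaching FTE is an
# assumption for illustration only.

coursework_eftsl = 1250.0  # total equivalent full-time student load (coursework)

# (headcount, assumed teaching FTE per person) for each staff group
teaching_staff = [
    (30, 1.0),   # full-time teaching staff
    (20, 0.5),   # fractional appointments
    (40, 0.2),   # casual staff, expressed as an FTE estimate
]
teaching_fte = sum(headcount * fte for headcount, fte in teaching_staff)

student_staff_ratio = coursework_eftsl / teaching_fte
print(f"teaching FTE: {teaching_fte:.1f}, student-staff ratio: {student_staff_ratio:.1f}")
# teaching FTE: 48.0, student-staff ratio: 26.0
```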

In response, stakeholders indicated that student-staff ratios vary depending on field of education, AQF level and mode of teaching delivery. Stakeholders therefore suggested that TEQSA could better take contextual factors into account in its staffing indicators, and noted that it is timely for the methodology of this indicator to be reviewed to reflect the current nature of teaching and delivery by staff.

Some stakeholders suggested that TEQSA consider: a ratio of supervisory staff to higher degree by research and/or honours students, as this is currently excluded from the metric; indicators for risk ‘culture’ and staff engagement; and a student-to-staff ratio that includes support staff FTE.

  • “[There are] complaints from Higher Degree by Research (HDR) students that their supervisors have too high a workload and are thus unable to supervise adequately.”
     
  • “Student-staff ratio needs to be considered in the context of rapidly changing technological advances in teaching and learning, where online tools can in fact provide much of the student support and feedback…”
     
  • “The ratio is in need of review together with the Higher Education Staff Collection as a whole which has not changed over time since its inception several decades ago.”

Casual academic staffing

A substantial number of stakeholders suggested that TEQSA’s casual academic staffing indicator is based on a traditional model of teaching, and that it should take into account the trend towards casualisation of academic staff over recent years. In particular, a key point for some stakeholders was that engaging teaching staff on a casual basis is necessary to ensure relevant industry experience, and that in some circumstances it is the academic’s choice to work on a non-ongoing basis. Stakeholders in both the written submissions and workshops expressed the view that this indicator is problematic for the employment of industry professionals as teaching staff and does not reflect the reality of the higher education sector.

Some stakeholders offered the view that casualisation should be seen as a strategy rather than a weakness, and suggested that this indicator focus on the provider’s course offerings and the duration of casual employment (i.e. where a casual academic has been repeatedly employed over a prolonged period of time) rather than on the percentage of casual staff.

However, some stakeholders argued in favour of retaining the indicators for Student-Staff Ratio and Casual Academic Staffing, citing evidence that over-reliance on insecure labour can drive down the quantum and quality of teaching experienced by students. For instance, when casual academic staff are allocated an unrealistic timeframe to mark student assessments, they either have to ‘work for free’ to give each assessment the appropriate amount of attention, or reduce the time spent marking each assessment.

  • “[TEQSA] needs to… collect data on casual staff who are consistently employed on a long-term basis.”
     
  • “Consider the use of industry or clinical specialists as casual staff.”
     
  • “Over-reliance on insecure labour can drive down the quantum and quality of teaching experienced by each student.”

Risk indicators for financial performance and capability

Currently, TEQSA employs two financial risk indicators: financial viability and financial sustainability, which measure short-term and longer-term financial health respectively. TEQSA asked the sector how the current financial indicators could be enhanced, and whether there are other financial measures it should consider in its financial analysis without significantly increasing the reporting burden on providers. A large number of stakeholders indicated that TEQSA’s current financial indicators are fit for purpose.

However, TEQSA also received suggestions that the financial risk indicators need to better account for various business models, sizes and strategic financial goals. Some stakeholders suggested that TEQSA should consider revising its ‘revenue concentration’ sub-indicator to account for providers that have a high proportion of overseas students or which have a focus on teaching in courses where there is a stable demand from students, and that TEQSA re-assess its risk tolerance for providers that lease their premises. In addition, some stakeholders suggested that the financial risk indicators need to take into account third party arrangements, inflation, Consumer Price Index (CPI), and market diversification among students. Another stakeholder questioned the current formulation of the ‘change in employee benefits ratio’.

In order to better account for provider context, some stakeholders proposed that TEQSA could conduct benchmarking by assessing financial performance within bandings relevant to different types of providers based on size, and compare individual providers to the average of similar providers within an appropriate band.

In terms of transparency and enhanced communication, some stakeholders requested that TEQSA provide more details of its financial sub-indicator weightings. TEQSA also received feedback suggesting it could consider creating additional risk indicators that measure: the extent to which average return on net assets under provider control exceeds the rate of inflation; a gearing ratio (total debt less free cash, divided by equity); and an interest cover ratio (EBITDA divided by interest expense).
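
As a reading aid for the two suggested ratios, the sketch below applies the definitions quoted above to invented figures; the values are hypothetical and this is not a TEQSA calculation.

```python
# Minimal sketch of the suggested gearing and interest cover ratios,
# using invented figures. Not a TEQSA calculation.

total_debt = 12_000_000.0
free_cash = 3_000_000.0
equity = 30_000_000.0
ebitda = 4_500_000.0
interest_expense = 900_000.0

gearing_ratio = (total_debt - free_cash) / equity  # total debt less free cash, divided by equity
interest_cover = ebitda / interest_expense         # EBITDA divided by interest expense

print(f"gearing ratio: {gearing_ratio:.2f}")    # 0.30
print(f"interest cover: {interest_cover:.1f}x") # 5.0x
```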

  • “A limited source base of students can be an indication of risk factor to the financial viability of some institutions. Again, context is important…”

Other considerations

Data lag

The issue of data lag was consistently raised across both the workshops and the written submissions. A substantial number of stakeholders regarded the data lag as a shortcoming of the current risk assessments and requested that it be addressed in the future Framework. By and large, stakeholders also acknowledged the constraints within which TEQSA operates, given the time needed for data to be collected, verified, and consolidated. Nonetheless, TEQSA acknowledges the importance of timely data and is committed to improving the timeliness of the data within the risk assessments. This improvement is expected to flow from the streamlining of the Provider Information Request (PIR) collection into the Transforming the Collection of Student Information (TCSI) project, an expected outcome of which is the alignment and integration of TEQSA’s data collection with that of the Department.

Regulatory history

In response to the issue of how regulatory history should be weighed in the risk assessments, stakeholders encouraged TEQSA to maintain consistency between risk assessments and the regulatory history of providers; to consider the provider’s regulatory history in tandem with its context; and to avoid duplication and over-regulation by better utilising existing information from other regulatory agencies.

In other instances, stakeholders urged TEQSA not to base the risk assessments on complaints received, and requested further guidance and transparency on the kinds of intelligence that feed into the risk assessments. Some stakeholders also raised the issue of the currency of regulatory history and queried whether there is a point at which adverse regulatory history is no longer deemed relevant to the risk assessments; they requested that TEQSA only consider more recent regulatory decisions. Other suggestions included better coordination and use of information horizontally (across teams within TEQSA) and longitudinally (over time), the use of more up-to-date regulatory information, and disregarding conditions that have been revoked.

Proposed risk indicators

Apart from the aforementioned risk indicators, stakeholders also suggested that TEQSA develop risk indicators for, or otherwise consider, the following areas:

  • third party staff and third party arrangements
    • incorporating third party staff data would further enhance the consistency of data used in the risk assessments
  • academic research and development
  • further use of QILT data
    • exploring the use of data from other QILT focus areas, such as teaching quality and student support
  • academic governance
  • academic integrity
  • source country reliance and concentration
  • information and data security
  • provider’s own risk management strategies
  • articulation rates for pathway providers
    • use tracer data from partner institutions.

In addition, stakeholders also requested that TEQSA provide more guidance and recommendations to manage the risks identified in the provider risk assessments.

Sector risks

At the workshops, TEQSA asked stakeholders to consider ongoing or emerging risks to the sector. While this document provides a summative report of the feedback received in relation to the Framework, the following section gives a brief account of the main themes that emerged on this topic. TEQSA has ongoing initiatives in relation to some of the following risks; where it does not, it could consider these issues separately from the redevelopment of the Framework.

Figure 1 (below) provides an overview of the risks raised by workshop participants. These issues span both providers’ internal operations and the external regulatory landscape, and cover themes including organisational management and governance, operational challenges, internationalisation of education, students, teaching and learning activities, regulatory activities and public funding arrangements. The most commonly identified issues were academic integrity, non-genuine students, cost recovery, cyber/data security, reliance on international students, and contract cheating.

Figure 1. Sector risks nominated by workshop participants

[Image: chart of internal and external sector risks]

Given the interconnectedness and complexity of these issues, stakeholders acknowledged that there are no simple solutions and that addressing these challenges requires cohesive efforts and close collaboration between key stakeholders. As such, stakeholders proposed the following solutions or mitigation strategies to address the identified risks:

  • share good practice and provide guidance
  • TEQSA as sector advocate
  • revise risk assessment framework
  • intra-agency/professional bodies coordination
  • focussed monitoring
  • consider more contextual factors
  • thematic assessment
  • enhance TEQSA’s process
  • agent regulation
  • government policy change
  • initiatives and interactive events
  • change costing model
  • use more recent data
  • reporting and process alignment
  • regulation
  • framework for self-accrediting institutions
  • consider other intelligence
  • professional standards to measure quality
  • better manage material change.

Generally, stakeholders see TEQSA as well positioned to share good practice, provide guidance on actions that can be taken, and act as a sector advocate. Other suggestions supported TEQSA’s current revision of the Framework, and urged more frequent intra-agency/professional bodies coordination, greater consideration of the contextual factors affecting providers’ activities, thematic assessments across the sector, further enhancement of TEQSA’s processes, and a more significant role for TEQSA in the regulation of higher education agents.

Next steps

As part of the consultation process, TEQSA’s risk assessment team has also completed internal consultations on the Framework with the agency’s assessment teams.

In the coming months, TEQSA will consider ways to implement the changes in a fair and transparent manner that will be consistent with the design principles set out in TEQSA’s consultation paper. Consideration will also need to be given to the data aggregation processes required to implement any changes. In particular, it is expected that there will be a distinction between changes that can be made in the immediate to near future, and changes that can only be implemented in the longer term.

At this stage, TEQSA expects that some areas of key focus for the next version of the Framework (among others) will be:

  • refining the risk assessment process by considering whether providers should be engaged earlier in the process
  • refining and revising existing risk indicators, taking into account feedback received, where feasible from a technical perspective and where the fairness and integrity of the process is maintained
  • revising the current presentation of risk assessment reports, to ensure their comprehensiveness and utility to providers (including through the potential incorporation of provider benchmarking).

TEQSA plans to finalise the next version of the Framework and communicate changes from the existing version to the sector in the first half of 2020. Taking into account the time required for planning and implementation, it is expected that the next risk assessment cycle, which will be based on the revised Framework, will commence in the fourth quarter (Q4) of 2020.

TEQSA further notes that in the longer term, it will consider how it can respond to feedback received while taking into account data aggregation processes, including:

  • the streamlining and integrating of TEQSA’s PIR data collection into the broader higher education dataset as part of the TCSI project
  • the roll-out of the USI, currently only available to VET and domestic students, to include all higher education students.

Further updates to the Framework will be made available on this website.


Appendix 1: List of Submissions to Consultation Paper

Universities Australia
Independent Higher Education Australia
Melbourne Institute of Technology
William Angliss Institute
Council of Australian Postgraduate Associations Incorporated
Regional Universities Network
Torrens University and Think: Colleges
Australian Catholic University
Deakin University
Edith Cowan University
University of Adelaide
University of Queensland
Western Sydney University
The University of Notre Dame Australia
The University of Western Australia
C5C Group Pty Ltd
John Loxton [1]

Appendix 2: List of Consultation Workshop Attendees

Australian Institute of Higher Education
Academies Australasia Polytechnic
The Australian Council for Educational Research
Acknowledge Education
Adelaide Central School of Art
Adelaide College of Divinity
Adelaide Institute of Higher Education
Academy of Information Technology
Alphacrucis College
Asia Pacific International College
Australian Chiropractic College
Australian Campus of Physical Education
Australian Catholic University
Australian College of Nursing
Australian College of Physical Education
Australian College of Theology
Australian Film, Television and Radio School
Australian Guild of Music Education
Australian Institute of Business
Australian Institute of Management Education and Training
Australian Institute of Music
Australian Institute of Professional Counsellors
Australian Institute of Police Management
Batchelor Institute of Indigenous Tertiary Education
Bond University
Box Hill Institute
Campion College Australia
Canberra Institute of Technology
Carnegie Mellon University
Centre for Pavement Engineering Education
Charles Sturt University
Chisholm Institute
Christian Heritage College
CIC Higher Education
Collarts
Central Queensland University
Crown Institute of Higher Education
Curtin College
Curtin University
Deakin University
Eastern College Australia
Edith Cowan College
Edith Cowan University
Elite Education Institute
Endeavour College of Natural Health
Engineering Institute of Technology
EQUALS International
Eynesbury
Federation University
Flinders University
Gateway Business College
Governance Institute of Australia
Griffith College
Griffith College (Navitas)
Griffith University
Higher Education Leadership Institute
Holmes Institute
ICHM
International Institute of Business and Technology (Australia)
Independent Tertiary Education Council Australia
Insearch Limited
Institute of Health and Management
Institute of Internal Auditors-Australia
International College of Hotel Management
ISN Psychology Pty Ltd
James Cook University
JMC Academy
Kaplan Australia
Kent Institute Australia
King's Own Institute
La Trobe University
LCI Education
Le Cordon Bleu
Macleay College
Macquarie University
Marcus Oldham College
Melbourne Institute of Technology
Melbourne Polytechnic
Monash College
Monash University
Montessori World Educational Institute (Aust) Inc
Murdoch University
Nan Tien Institute
National Art School
National Institute of Dramatic Art
Navitas Bundoora Pty Ltd
Navitas Limited
Newcastle International College
Ozford Institute of Higher Education
Perth Bible College
Photography Studies College (Melbourne)
Polytechnic Institute Australia
Proteus Technologies Pty Ltd
Queensland University of Technology
RedHill Education
RMIT University
Russo Business School
South Australian Institute of Business and Technology
Sheridan College
Sydney Institute of Business and Technology
Southern Cross Education Institute-Higher Education
Southern Cross University
SP Jain School of Global Management
Study Group Australia Pty Limited
Swinburne University of Technology
Sydney College of Divinity
Sydney Institute of Traditional Chinese Medicine
Tabor
TAFE NSW
TAFE Queensland
TAFE SA
The Cairnmillar Institute
The College of Law Ltd
The Institute of International Studies (TIIS)
The MIECAT Institute
The Tax Institute
The University of Adelaide
The University of Newcastle
The University of Notre Dame Australia
The University of Sydney
Top Education Institute
Torrens University
Universal Business School Sydney
Universities Australia
University of Canberra
University of Divinity
University of New England
University of Queensland
University of South Australia
University of Sydney
University of Tasmania
University of Technology Sydney
University of the Sunshine Coast
University of Western Australia
University of Wollongong
University of New South Wales
UOW College Australia
Victoria University
Victorian Institute of Technology
Western Sydney University
Western Sydney University International College
William Angliss Institute
 

Appendix 3: Key Themes from Workshop Feedback

Figure 2. Critical reflection on existing risk indicators

[Image: word cloud]

Figure 3. Regulatory history and other intelligence

[Image: word cloud]

Figure 4. Contextual factors

[Image: word cloud]

Notes

  1. Submission was made in a personal capacity, and notes that the views expressed do not represent the views of organisations that the individual is associated with.