1. Introduction
1.1. Background
Digital provision has moved from the institutional periphery to the strategic core of higher education. That shift, accelerated by the disruptions of the COVID-19 period but not reducible to them, has exposed a persistent analytical problem. Universities frequently discuss online and blended learning as though the central question were whether individual lecturers can teach differently or whether a particular platform can host courses at scale. Yet the more difficult question is institutional: how a conventional university reorganizes governance, rules, infrastructure, academic work, quality assurance, and support systems so that digital provision is not merely available but credible. The difference is decisive. A university can upload content quickly and still fail to deliver a legitimate academic experience. It can purchase an enterprise learning management system and still lack the policy, staffing, assessment design, and quality controls necessary for trustworthy online education.
1.2. Problem Statement
That problem is especially salient in African higher education. Universities across the continent operate within fast-expanding enrolment systems, sharp inequalities in devices and connectivity, uneven public financing, strong demands for employability and access, and regulatory environments that are becoming more explicit about virtual learning, internal quality assurance, and the use of artificial intelligence (AI). Resource constraints do not negate the case for digital provision. On the contrary, they intensify the need for institutionally disciplined transition because poorly sequenced digitization can increase fragility, deepen exclusion, and erode confidence in academic standards. The relevant issue is therefore not whether African universities should remain conventional or become digital in some abstract sense. It is how they can design digital provision in ways that are educationally sound, regulatorily legitimate, financially viable, and socially inclusive.
The present article addresses a gap in the literature. Research on online and blended higher education is substantial, but much of it remains fragmented. One body of work examines student satisfaction, engagement, or self-regulation in online environments. Another focuses on faculty attitudes or professional development. A third emphasizes learning management systems, learning analytics, or digital transformation as broad organizational agendas. A fourth turns to quality assurance, accreditation, or policy. Each literature is useful, yet the prevailing tendency is analytical disaggregation. Technology is treated separately from governance, governance separately from pedagogy, pedagogy separately from regulation, and regulation separately from institutional capability. This fragmentation obscures the fact that online transition is a systems problem. Universities do not move online by changing only one layer of their operation. They do so by aligning authority structures, financial commitments, infrastructure, curricula, assessment regimes, staff capability, student support, and monitoring mechanisms across time
[1, 9, 14, 18].
This article therefore treats online transition not as a narrow modality question but as a problem of institutional architecture. The phrase institutional architecture is used here to denote the structured arrangement of governance, policy, people, processes, technologies, quality controls, and resource commitments that together make a mode of provision workable and legitimate. The argument is that credible online and blended provision emerges when these institutional components are deliberately coupled and sequenced. It does not emerge automatically from technological acquisition, from emergency teaching improvisation, or from isolated enthusiasm among academic innovators. The literature on digital transformation in higher education increasingly points in this direction, warning against models that equate transformation with digitization while neglecting organizational design, leadership, data governance, and long-term sustainability
[5, 8, 37].
Rwanda offers a useful policy-reference environment for examining these questions. It should not be treated as the whole of African higher education, but it is analytically instructive because recent Rwanda Higher Education Council guidance now addresses distance learning, virtual learning and artificial intelligence, internal quality assurance, institutional infrastructure, learning and assessment, and student support through an increasingly explicit regulatory architecture
[24-29]. This indicates that digital transition in African higher education is no longer merely an internal managerial preference. It is becoming part of the formal quality and legitimacy environment in which institutions operate.
Figure 1 summarizes the sequence through which Rwanda’s policy architecture for digital and virtual higher education became progressively more explicit between 2023 and 2026.
Comparable pressures are evident elsewhere on the continent. South African studies point to infrastructural inequality, funding constraints, and uneven institutional support as recurring barriers to credible digital transition
[21, 30], while evidence from a resource-constrained East African higher education setting shows that access, awareness, and lecturer capability shape student uptake of digital provision [33]. Rwanda is therefore treated here as a policy-reference environment within a broader African pattern rather than as a proxy for the continent.
Figure 1. Rwanda’s emerging policy architecture for digital and virtual higher education, 2023–2026. The timeline summarizes the sequence of Higher Education Council policy and guidance documents cited in the manuscript and shows the increasingly explicit regulatory environment surrounding infrastructure, teaching and assessment, student support, internal quality assurance, virtual learning, artificial intelligence, and distance learning.
1.3. Study Objective and Research Questions
Against that background, the objective of the article is to develop a university-wide framework for bringing conventional institutions online in African higher education. The central research question is: What institutional architecture is required to move a university from conventional face-to-face delivery to credible, quality-assured online or blended provision in African higher education? Four subsidiary questions guide the analysis: which institutional components recur most consistently across the literature; how those components interact; what sequencing patterns and bottlenecks characterize successful and unsuccessful transitions; and how these insights can be translated into a practical toolkit for institutional planning.
1.4. Study Contribution
The article makes three contributions. First, it synthesizes heterogeneous evidence that is usually discussed in separate scholarly and policy conversations. Second, it proposes an original framework, the Institutional Architecture for Credible Digital Transition, that explains digital transition as the interaction of contextual conditions, steering mechanisms, operational domains, implementation stages, and feedback loops. Third, it translates that framework into a companion University Online Readiness and Transition Toolkit containing a readiness rubric, phased roadmap, and policy checklist that can support institutional leaders, regulators, and digital-learning directors. By proceeding in this way, the article seeks to contribute both to scholarship on higher education transformation and to the practical governance of online and blended provision in African contexts.
2. Literature Review and Analytical Framework
2.1. Key Concepts and Working Definitions
Online transition is frequently described with terms that are analytically adjacent but not interchangeable. A conventional institution refers here to a university whose dominant delivery model, governance routines, quality controls, staffing assumptions, and student support systems were designed primarily for face-to-face provision. Online provision denotes programmes or courses in which teaching, learning, communication, assessment, and support are delivered predominantly through digital environments. Blended provision refers to structured combinations of face-to-face and online learning activities within an intentionally designed pedagogical and institutional model rather than an improvised mixture of modalities. Digital transformation is broader than either. It concerns the reconfiguration of processes, structures, competencies, and decision-making through digital means across the institution as a whole, not merely in the classroom
[5, 9, 18].
Two further concepts require precision. Online readiness is not the possession of hardware or a learning management system (LMS) in isolation. It is the extent to which a university has the strategic, regulatory, infrastructural, pedagogical, financial, and organizational capabilities needed to deliver and continuously improve digital provision. Credibility, in turn, is used in a thick institutional sense. It includes regulatory compliance, academic legitimacy, dependable operations, fair and secure assessment, inclusive learner support, and the ability to withstand scrutiny from students, regulators, employers, and peer institutions. This matters because some institutions achieve digital availability without digital credibility. They can host teaching online but cannot yet demonstrate that quality, standards, and support remain intact.
2.2. Technology-led and Institution-led Explanations of Digital Change
The literature reveals a recurring tension between technology-led and institution-led accounts of change. Technology-led narratives present digital transition as the diffusion of platforms, analytics, AI tools, or communication systems into existing university structures. Institution-led accounts argue that tools matter, but only as part of broader transformations in governance, workflows, capability, and quality culture. Systematic and multivocal reviews of digital transformation in higher education consistently conclude that higher education institutions have often pursued fragmented initiatives rather than holistic redesign, with strategy, human capability, and process alignment lagging behind technological adoption
[5, 8, 9]. Jisc's framework similarly treats digital transformation as a whole-organization agenda involving leadership, investment, infrastructure, people, and process alignment rather than isolated innovation projects [18]. UNESCO has made the same point at the systems level by arguing that digital transformation concerns content, pedagogy, governance, and management simultaneously [37].
2.3. Adoption Research and the Organizational Scaling of Blended Provision
The blended learning adoption literature offers a particularly useful bridge between classroom and institutional analysis. Prior work shows that institutional adoption follows identifiable stages and depends on the coordinated development of strategy, structure, and support
[1, 14, 15]. Those findings remain highly relevant beyond blended learning narrowly conceived, because they illuminate the organizational conditions under which institutions move from experimentation to scaled provision. A related systematic review demonstrates that adoption is shaped not only by student and lecturer attitudes but also by administrative arrangements, policy, and implementation practice [2]. What this literature suggests is that digital transition is cumulative and staged. Mature provision does not arise from a singular decision to go online. It emerges from the repeated alignment of institutional routines.
2.4. Institutional Legitimacy and Regulatory Conformity
At the same time, the literature is marked by unresolved tensions. One concerns legitimacy. Universities are organizations embedded in regulatory, professional, and reputational fields. They do not innovate in a vacuum. Institutional theory is therefore highly relevant because it helps explain why universities seek conformity with accrediting expectations, peer norms, and professional standards even while attempting local innovation
[6]. In online transition, coercive pressures come from regulators and quality-assurance bodies, normative pressures from academic and professional communities, and mimetic pressures from the imitation of prestigious institutions' digital models. This helps explain why many universities adopt the language of digital transformation, but it also warns against the uncritical importation of models developed in high-resource contexts. Isomorphic pressures can produce legitimacy, yet they can also encourage superficial mimicry when underlying capabilities are weak.
2.5. Sociotechnical Systems and the Joint Design of Work and Technology
A second tension concerns the relationship between technical and social systems. Sociotechnical perspectives are useful here because they reject the assumption that technology can be analyzed independently of work design, decision rights, support processes, and human capability. Digital environments change not only what tools are used but how academic and administrative work is organized, how information moves, how decisions are made, and how staff and students interact. Govers and van Amelsvoort argue that digital transformation requires the joint optimization of social and technical elements rather than the dominance of one over the other
[13]. This insight is especially salient for universities, where technology choices affect curriculum workflows, assessment practice, record systems, helpdesk design, library access, and data governance. An LMS therefore cannot be understood as a neutral platform. It is part of a sociotechnical configuration that either supports or destabilizes institutional coherence.
2.6. Capability, Change, and Professional Support
A third tension concerns capability and change. Organizational change scholarship suggests that transition capacity depends on leadership commitment, middle-management translation, incentives, resources, and routinized learning rather than declarative strategy alone. In higher education, this is visible in evidence on professional development and staff adoption. Gao et al. show that online faculty professional development has expanded, but the field remains uneven, and technical training alone is insufficient
[11]. Sanders and Mukhari similarly show that lecturers' willingness to sustain blended provision depends on managerial support, reliable technology, time, and professional development [30]. Resistance in this context is often not hostility to innovation but a rational response to unsupported workload transfer.
2.7. Student Experience, Support, and Equity
The literature on student experience adds further caution. A systematic review indicates that learner satisfaction depends on multiple factors, yet programme quality, assessment, and learner support remain underexamined
[22]. Recent evidence shows that the support students receive in online learning environments influences academic performance, reinforcing the point that digital transition cannot be judged by course content delivery alone [12]. In resource-constrained contexts, the importance of support is even more evident. A study in one such setting found that access, awareness, capacity, and lecturer characteristics shaped students' acceptance of digital technologies [33]. Such findings challenge celebratory narratives that equate access to platforms with equitable participation. Rwanda-specific evidence likewise suggests a mixed readiness profile: device access and basic digital familiarity may be substantial, yet trust, accreditation concerns, and willingness to participate remain uneven, reinforcing the need to treat learner support and legitimacy as institutional conditions rather than mere technical add-ons.
2.8. Quality Assurance, Learning Analytics, and AI Governance
Quality assurance and data governance introduce additional complexity. The European Association for Quality Assurance in Higher Education (ENQA) argues that e-learning should be assessed through existing quality standards, but with mode-specific attention to design, support, staffing, information provision, and monitoring. Learning analytics can strengthen monitoring and improvement, yet they also raise questions of privacy, autonomy, consent, and institutional power
[10, 19]. AI governance intensifies this dilemma. UNESCO's guidance on generative AI and more recent work on rights-based AI governance both emphasize human-centred regulation, transparency, data protection, and institutional preparedness rather than unrestrained deployment. The implication is that digital transition produces new governance objects: data flows, algorithmic tools, authorship disputes, and analytics dashboards that require formal oversight if academic legitimacy is to be preserved.
2.9. Composite Analytical Framework and Guiding Propositions
These debates support a composite analytical framework built around three mutually reinforcing lenses. Institutional theory explains why digital provision must secure legitimacy in regulatory and professional fields. Sociotechnical systems theory explains why technical tools must be aligned with work processes, support structures, and human roles. Capability- and change-oriented perspectives explain why sequencing, investment, and routinization determine whether institutional ambitions become sustainable practice. Taken together, these lenses suggest that credible digital transition depends on three propositions. First, legitimacy must be designed, not assumed; governance, policy, and quality assurance are constitutive rather than auxiliary. Second, digital provision is a joint social and technical accomplishment, not a platform feature. Third, transition is staged; preparation and consolidation are as important as launch. These propositions guide the analysis that follows and ground the Institutional Architecture for Credible Digital Transition framework developed in Section 4.9.
3. Methodology
3.1. Design
This study employed a systematized integrative review combined with comparative policy analysis. The design was chosen because the research question is explanatory and framework-building rather than effect-size oriented. The relevant evidence base is heterogeneous: empirical studies, systematic reviews, theoretical texts, organizational frameworks, and official quality-assurance and regulatory documents all bear on the question of how universities move credibly online. A narrow systematic review restricted to one study type would therefore have obscured essential dimensions of the problem. The integrative review logic made it possible to synthesize diverse forms of evidence, while the comparative policy component ensured that institutional architecture was examined not only as an organizational matter but also as a question of public regulation and legitimacy.
3.2. Search Scope, Sources, and Temporal Boundaries
The search and selection process was deliberately transparent but pragmatic. Searches were undertaken between January and March 2026, with the main publication window set from 2013 to February 2026 in order to capture the post-MOOC, post-pandemic, and AI-affected phases of digital higher education. Seminal earlier methodological and theoretical works were retained where necessary. Academic retrieval drew on Google Scholar and on publisher platforms accessible through web indexing, including ScienceDirect, SpringerLink, Nature Portfolio, Frontiers, MDPI, and PubMed/PMC. Policy and framework retrieval targeted UNESCO, Jisc, ENQA, and official Rwandan higher education sources, especially the Higher Education Council and the Ministry of Education. This approach was appropriate for a systematized review whose goal was analytical saturation across institutional domains rather than exhaustive enumeration of every course-level study.
3.3. Search Strings and Retrieval Logic
Search strings combined higher education, digital provision, and institutional architecture terms. Typical combinations included: "higher education" AND ("online learning" OR "blended learning" OR "distance learning" OR "digital transformation") AND (institution* OR governance OR strategy OR policy OR quality assurance OR assessment OR infrastructure OR faculty development OR student support OR learning analytics OR AI governance OR Africa). Additional targeted strings were used for Rwanda and regulation, such as "Rwanda higher education virtual learning guidelines", "Rwanda internal quality assurance higher education", and "distance learning accreditation Rwanda". Backward and forward citation checking was used selectively to strengthen conceptual coverage.
3.4. Eligibility Criteria
Eligibility criteria reflected the study objective. Included sources addressed higher education and spoke either directly to institution- or system-level digital transition or to a domain that becomes decisive at scale, such as assessment integrity, faculty development, learner support, or data governance. Peer-reviewed studies, systematic reviews, major organizational frameworks, and official policy or regulatory documents were included. Sources had to offer analytical, empirical, or regulatory relevance to online or blended provision. Excluded sources were K-12 focused studies, vendor marketing materials, opinion pieces without analytical substance, and narrowly course-specific papers whose findings did not travel meaningfully to institutional design. Emergency remote teaching accounts were included only when they yielded durable institutional lessons rather than descriptive crisis narratives.
3.5. Screening, Corpus Construction, and Appraisal Logic
Screening proceeded in three stages: title and abstract or webpage review, full-text or extended abstract review where available, and final relevance assessment against the study's institutional architecture focus. Duplicates and near-duplicates were removed. Analytical saturation was judged pragmatically. Screening continued until newly retrieved items repeatedly reproduced already-established institutional domains or added examples within those domains without materially changing the emerging framework or its interdependencies. Selection therefore aimed at analytical coverage across governance, infrastructure, curriculum, assessment, faculty capability, learner support, quality assurance, and data and AI governance rather than numerical exhaustiveness alone. The final corpus comprised 38 sources: peer-reviewed empirical studies and reviews, methodological and theoretical texts, international framework documents, and official policy and regulatory guidance. Rather than assigning spurious precision to heterogeneous evidence, the study used a reasoned appraisal strategy. Peer-reviewed studies were read with Mixed Methods Appraisal Tool (MMAT)-informed attention to design clarity, coherence between question and method, transparency of evidence, and relevance to institutional transition
[17]. Policy and framework documents were appraised for authority, currency, specificity, implementation relevance, and connection to recognized quality-assurance or governance mandates. Reporting logic was informed by methodological guidance on integrative and scoping reviews, particularly the importance of explicit eligibility criteria, transparent search logic, and clear synthesis procedures [3, 20, 32].
3.6. Data Extraction and Analytical Synthesis
Data extraction was guided by an analytical matrix built around six fields: source type; geographical or policy context; principal institutional domain addressed; level of analysis (macro, meso, or micro); identified dependencies or interactions with other domains; and implications for sequencing, credibility, or failure. Thematic coding was then followed by relational synthesis. In practice, this meant moving from identification of recurring institutional components to analysis of how those components interact and under what conditions they support or undermine credible digital provision. A final round of abductive synthesis was used to derive the proposed framework, testing whether the emerging model could account for both positive institutional conditions and recurrent failure points.
3.7. Limitations of the Review Design
This design has limitations. The corpus was restricted to English-language materials and to sources accessible through open web or official repositories. The study was systematized rather than fully exhaustive and did not attempt meta-analysis. It also did not generate primary data from African universities. Nevertheless, these limits are compatible with the study's objective. The purpose was not to rank interventions statistically but to construct a defensible institutional architecture from the best available conceptual, empirical, and regulatory evidence. That objective requires breadth across domains, transparent synthesis, and caution against overclaiming, all of which shaped the present analysis.
Table 1 summarizes the review’s search dimensions, typical search terms, and eligibility emphasis.
3.8. Ethical Considerations and Use of AI Tools
This study relied exclusively on secondary sources, including peer-reviewed literature, institutional frameworks, and official policy and regulatory documents available in the public domain. It did not involve human participants, interviews, surveys, experiments, or identifiable personal data; formal ethical approval and informed consent were therefore not required. During manuscript preparation, the authors used OpenAI’s GPT-5.4 Thinking model for limited research-support and editorial assistance, including support with source discovery, organizational structuring, phrasing refinement, and language polishing. All substantive decisions concerning argumentation, source selection, verification, interpretation, and final revision were made by the authors, who accept full responsibility for the content of the manuscript.
Table 1. Search and eligibility logic.
Search dimension | Focus | Typical terms | Eligibility emphasis |
Sector and modality | Higher education and forms of digital provision | higher education; online learning; blended learning; distance learning; digital transformation | University-level relevance |
Institutional architecture | Governance and operating model | governance; strategy; policy; regulation; accreditation; quality assurance | Institution or system-level explanatory value |
Operational domains | Capabilities required at scale | LMS; infrastructure; curriculum redesign; assessment integrity; faculty development; student support; data governance; AI governance | Transferable implications for scaled provision |
Context filter | African and policy-reference relevance | Africa; African higher education; Rwanda | Contextual transferability and regulatory significance |
4. Findings and Discussion
The synthesis shows broad convergence on the proposition that credible digital transition is institutional rather than merely pedagogical. The subsections that follow integrate findings and discussion across the principal domains that recur in the literature and policy corpus. Rather than treating these domains as separable checklists, the analysis emphasizes their interaction, their sequencing, and the recurrent implementation bottlenecks that arise when institutions strengthen one domain while neglecting others.
4.1. Institutional Governance and Leadership Architecture
The literature converges on a foundational point: credible digital transition begins as a governance question before it becomes a delivery question. Institutions that scale online and blended provision more successfully tend to place digital strategy under formal senior leadership authority, link it to academic planning and budget processes, and establish cross-functional structures that bridge academic affairs, information and communication technology, quality assurance, registry, finance, and student services [1, 14, 18]. Where that architecture is absent, digital provision is often relegated to an e-learning unit with limited authority, producing what may be termed pilotization without institutionalization. Courses appear online, but governance routines, accountability lines, and budget frameworks remain face-to-face by default.
This point is more than managerial. Institutional theory suggests that universities must demonstrate that online provision is governed through recognizable and legitimate structures if it is to be trusted by regulators, professional bodies, and employers [6]. In that sense, senate approval processes, board-level oversight, academic regulations, and formal risk ownership are not bureaucratic afterthoughts. They are part of the credibility infrastructure of digital provision. Rwanda's policy environment illustrates the growing explicitness of this expectation. The Rwanda Higher Education Council (HEC) now publishes separate guidance on distance learning, virtual learning and AI, internal quality assurance, infrastructure standards, learning and assessment, and student support, indicating that digital provision is regulated across multiple institutional functions rather than treated as a purely pedagogical matter [24-29].
Leadership architecture also shapes whether institutions adopt sustainable financing models. Many digital initiatives begin with project funds or donor-supported procurement, but fail when recurring costs become visible. A credible transition requires budget lines for platform licensing or maintenance, connectivity support, instructional design, staff development, accessibility, technical support, cybersecurity, and periodic review. The recurring theme in the literature is that underfunded transition externalizes costs to faculty through unpaid redesign labour and to students through device, data, or access burdens. Financial sustainability is therefore not ancillary to governance; it is one of its most concrete tests.
4.2. Digital Infrastructure and LMS Ecosystem Requirements
The literature strongly rejects any reduction of digital transition to LMS acquisition. An institution can possess a stable platform and still fail institutionally because the LMS is only one component in a wider ecosystem. Credible provision depends on interoperable infrastructure linking course environments, student information systems, digital identity and authentication, library access, content storage, communication tools, technical support, analytics, backup procedures, cybersecurity, and, in many contexts, power and connectivity resilience. Where these elements are weakly connected, students experience discontinuity, staff duplicate work across systems, and quality monitoring becomes unreliable.
Evidence from resource-constrained contexts makes these dependencies particularly visible. One study of a resource-constrained higher education setting shows that student acceptance and usability were shaped not only by technology availability but also by awareness, capacity, access, and lecturer characteristics [33]. A South African review likewise identifies poor infrastructure, inadequate funding, and persistent inequalities as structural barriers to digital transition [21]. These findings challenge universalized models of digital maturity derived from high-bandwidth environments. In many African institutions, infrastructure strategy must prioritize bandwidth sensitivity, mobile compatibility, asynchronous functionality, local caching or offline access where possible, and service models that assume intermittent connectivity rather than ideal continuous access.
By way of illustration, the constraints identified in the literature are not abstract. In South Africa, infrastructural fragility and funding pressure recur as system-level barriers to digital transition [21], while lecturer-level evidence from a South African institution shows that sustained blended delivery depends heavily on managerial support, time allocation, and reliable technology [30]. In a resource-constrained East African higher education setting, student uptake likewise depended on access, awareness, and lecturer capability [33]. Together, these examples underline that infrastructure strategy is inseparable from institutional support and affordability.
A further issue is technological dependency. Platform decisions can produce vendor lock-in, fragmented data ownership, or procurement obligations that exceed institutional bargaining power. For universities in low-resource and policy-evolving contexts, technological dependency is also a governance issue because it affects continuity, sovereignty over student data, and the total cost of ownership over time. Infrastructure strategy must therefore be linked to procurement policy, contract review, interoperability standards, exit planning, and data governance. Rwanda's infrastructure standards are useful in this regard because they anchor digital provision in broader institutional adequacy rather than treating technical procurement as self-validating.
4.3. Curriculum Redesign and Pedagogical Adaptation
The literature is clear that credible online or blended provision does not result from uploading lecture notes or recording face-to-face classes with minimal redesign. Curriculum transition is a process of academic re-specification. It requires programme-level reflection on learning outcomes, contact patterns, activity design, sequencing, media choice, interaction structures, workload, and assessment coherence [2, 15]. The significance of this point is often underestimated because technology debates can obscure the amount of academic labour required to redesign programmes rather than merely migrate content.
The strongest adoption studies suggest that mature institutions move from course-by-course improvisation to programme-level design routines. Those routines usually include instructional design support, templates or quality standards, review processes, and explicit decisions about the balance between synchronous and asynchronous learning [1, 14]. This transition matters because programme coherence is difficult to sustain when each course is independently digitized according to lecturer preference. A credible online programme must allow students to navigate a recognizable architecture of expectations, communication patterns, deadlines, learning activities, and support pathways across modules.
The literature also complicates simplistic narratives of flexibility. Flexible delivery can widen access, but only if curriculum design acknowledges diverse student circumstances and digital realities. In African contexts this often means designing for low-bandwidth participation, allowing asynchronous engagement where appropriate, ensuring mobile access, and curating learning resources in ways that do not presume constant access to large files or synchronous meetings. Flexibility without structure can easily become abandonment. Thus the relevant design principle is not flexibility in the abstract, but structured flexibility: pathways that widen participation while preserving academic challenge, feedback, and progression.
4.4. Assessment Integrity and Academic Standards
Assessment is one of the most contested domains in digital transition because it sits at the intersection of standards, trust, technology, and student rights. Research reviews show that integrity problems extend well beyond online examinations themselves and that sustainable responses require institutional rather than merely technological solutions [4, 16]. Authentic task design, staged submissions, oral components, assessment variety, secure item management, clear misconduct procedures, staff training, and student induction all matter. Heavy surveillance technologies may address one integrity risk while creating others related to privacy, equity, and mistrust.
This is where academic standards and rights-based governance meet. A university that moves provision online without revising assessment regulations risks two symmetrical failures. It can become permissive in ways that weaken the meaning of grades and awards, or excessively punitive in ways that compromise fairness and student dignity. Recent AI developments intensify this dilemma by introducing new questions about authorship, disclosure, permissible assistance, and evidence of learning. UNESCO's AI guidance argues that institutions need explicit, human-centred governance rather than ad hoc reactions to emerging tools. Rwanda's recent virtual learning and AI guidance and its national learning, teaching and assessment policy indicate the growing regulatory recognition that digital assessment requires updated rules, not merely new software [25, 28].
A critical implication follows. Assessment integrity should be treated as an institutional design problem, not a surveillance procurement problem. Where universities frame integrity primarily as invigilation, they risk conflating credibility with control. The literature reviewed here suggests a more balanced model: integrity through assessment redesign, identity assurance, policy clarity, due process, academic support, and proportionate use of technology.
4.5. Faculty Development and Change Management
No institutional architecture for digital transition can succeed without sustained faculty capability building. Yet the literature also shows that faculty development is frequently misconstrued as a short course in platform usage. Evidence demonstrates that professional development for online teaching has grown, but the field remains uneven and often under-conceptualized [11]. What universities require is not episodic training but a structured development system combining digital pedagogy, assessment design, accessibility, feedback practice, learner support, AI literacy, and data ethics.
The change-management dimension is equally important. Evidence from a South African institution found that lecturers associated successful blended learning with management support, time, improved professional development, and reliable technology [30]. This resonates strongly with the broader adoption literature. Faculty members do not experience digital transition only as pedagogical innovation. They also experience it as altered workload, changed communication expectations, new forms of visibility, and, at times, managerial intensification. Institutions that overlook this social reality often misdiagnose resistance. What appears as resistance to change may actually be resistance to unfunded redesign work, unstable infrastructure, or unrealistic implementation timelines.
Effective faculty architecture therefore includes more than training. It includes workload recognition, access to instructional design expertise, communities of practice, peer mentoring, revised promotion and recognition criteria, and responsive help channels during live teaching periods. In mature models, digital capability becomes part of academic professionalism rather than an optional specialization. This institutionalization matters because credible provision depends not on heroic innovators but on routinized, distributed competence across departments.
4.6. Student Support, Inclusion, and Digital Equity
Student support is often treated as a service adjunct to online learning, but the literature suggests it is one of the clearest markers of institutional maturity. A systematic review found that learner support, programme quality, and assessment were comparatively less examined in satisfaction research even though they are central to student success [22]. Recent evidence reinforces the point by showing that the support students receive in online environments influences academic performance [12]. The implication is that digital transition cannot be evaluated on teaching design alone. It must also be judged on whether students can successfully navigate admission, orientation, advising, communication, technical problems, library access, wellbeing needs, and academic difficulties in a digital environment.
In African higher education, the equity dimension is especially acute. Device access, data costs, disability support, home study conditions, language, and geographical location can all shape participation. Evidence from a resource-constrained setting shows that usability and acceptance depend partly on access and student capacity [33], while UNESCO's broader digital and AI work warns repeatedly that technology can deepen inequality when connectivity and rights protections are uneven. Consequently, support architecture must move beyond generic helpdesks. It should include onboarding into online study, digital study skills, academic advising, counselling or wellbeing referral routes, accessibility compliance, multi-channel communication, and targeted support for students facing connectivity or device constraints.
Rwanda's National Student Support and Guidance Policy is instructive because it locates support within formal institutional responsibilities rather than discretionary student services [26]. That principle is broadly transferable. Online inclusion should be treated as a core quality condition of digital provision, not as a charitable supplement. Institutions become credible when they can show that students are not merely admitted into digital environments but are supported to progress within them.
4.7. Quality Assurance, Data Governance, and Continuous Improvement
Quality assurance is the domain that most clearly distinguishes emergency digitization from credible institutional transition. ENQA argues that e-learning does not require a separate concept of quality so much as mode-sensitive application of existing standards regarding design, delivery, staffing, information, and review. That position is valuable because it resists the false choice between exceptionalizing online learning and ignoring its specific demands. For universities, the practical implication is that internal quality assurance systems must adapt their approval, monitoring, and review mechanisms to digital provision. This includes evidence on course design standards, platform reliability, student participation, complaints, progression, feedback timeliness, assessment quality, and support responsiveness.
Data governance is now inseparable from this agenda. Learning analytics promises better monitoring and earlier intervention, yet it also expands the university's ability to observe and categorize student behaviour. Jones argues that informed consent and student autonomy should not be treated as peripheral concerns [19]. Gašević et al. similarly show that stakeholders, strategy, and scale are central challenges in learning analytics adoption [10]. A university-wide architecture for digital provision therefore requires clear rules on data collection, purpose limitation, retention, access, human oversight, and appeal. It must also clarify how analytics data feeds improvement rather than becoming a detached surveillance layer.
AI governance broadens the same concerns. UNESCO's 2023 guidance and 2025 survey of higher education institutions both suggest that universities are increasingly recognizing the need for institution-level AI frameworks, yet uneven confidence and ethical uncertainty persist. The challenge is not merely whether AI tools are allowed. It is how they are governed in relation to academic integrity, student rights, staff work, environmental cost, and epistemic quality. Rwanda's recent HEC guidance on virtual learning and AI shows that regulators are beginning to integrate AI into the quality architecture of higher education itself [28]. Taken together, these findings suggest that quality assurance, data governance, and AI governance form a trust infrastructure. Without them, digital provision may scale operationally while losing legitimacy.
4.8. Sequencing Models for Transition from Conventional to Blended and Online Provision
One of the clearest conclusions from the reviewed literature is that sequencing matters. Institutions do not move successfully from conventional delivery to credible online or blended provision by trying to optimize every domain simultaneously. Rather, they tend to progress through distinguishable phases, albeit unevenly. The stage models proposed in blended learning adoption studies remain useful here, especially when extended by more recent digital transformation frameworks [1, 14, 23].
The first stage may be described as preparation. Its defining tasks are strategic and diagnostic: clarifying institutional purpose, mapping regulatory conditions, auditing readiness, identifying programme priorities, and establishing minimum infrastructure and policy baselines. Institutions that skip this stage often conflate digital aspiration with capability. The second stage is transition, in which carefully selected pilots, course and programme redesign, staff development, support protocols, and revised assessment arrangements are implemented under heightened monitoring. The critical challenge here is to avoid treating pilots as self-sufficient successes.
The third stage is consolidation. At this point the institution formalizes policies, stabilizes support services, improves interoperability, routinizes quality review, and aligns budgeting with actual delivery demands. The shift is from innovation projects to institution-wide operating models. Only then does a fourth stage of optimization become meaningful. Optimization includes analytics-informed improvement, refined AI governance, differentiated student support, partnerships, micro-credentials, or more advanced blended and virtual mobility models. Attempting optimization without consolidation is a common error because it produces a sophisticated discourse on top of unstable foundations.
The literature also reveals recurrent failure points. First, institutions confuse emergency remote teaching with online strategy. Second, they underprice transition by ignoring staff redesign labour and student affordability. Third, they delay policy revision, leaving assessment, workload, or data practices governed by face-to-face assumptions. Fourth, they rely excessively on surveillance-based integrity mechanisms instead of redesigning assessment. Fifth, they overlook student support and digital equity. Sixth, they allow fragmented systems and vendor lock-in to undermine coherence. These failures are not random. They arise when institutions treat digital transition as an additive project rather than a reconfiguration of institutional architecture.
4.9. Toward a University-wide Framework for Bringing Conventional Institutions Online
The evidence synthesized above supports a five-layer framework termed the Institutional Architecture for Credible Digital Transition. The framework is intended to explain not simply what factors matter, but how they relate.
Figure 2 presents this five-layer framework schematically, showing the relationship among contextual boundary conditions, steering architecture, operational architecture, temporal sequencing, and outcomes with feedback loops.
Figure 2. Institutional Architecture for Credible Digital Transition. The framework models credible digital transition as the interaction of five layers: contextual boundary conditions, steering architecture, operational architecture, temporal sequencing, and outcomes with feedback loops. The operational layer comprises seven interdependent domains: infrastructure and LMS ecosystem, curriculum redesign, assessment integrity, faculty capability, student support and inclusion, quality assurance, and data and AI governance.
For practical use, the framework can be read as five diagnostic questions: What contextual constraints and enablers define the institution’s room for manoeuvre? What steering mechanisms authorize and resource transition? Are the seven operational domains sufficiently aligned to support credible delivery? What stage of transition has the institution actually reached? What evidence loops are in place to test quality, inclusion, resilience, and sustainability over time? This condensed reading offers institutional leaders a quick decision aid before fuller use of the readiness rubric, roadmap, and policy checklist.
The outer layer comprises contextual boundary conditions. These include the regulatory environment, national quality frameworks, public financing conditions, connectivity and device realities, institutional mission, labour-market demands, and the university's social contract. These conditions define the room for manoeuvre within which transition occurs. In African higher education they often include stronger constraints on bandwidth, affordability, and infrastructural reliability than are assumed in much Northern literature, which means that credible digital transition must be context-responsive rather than model-copying.
The second layer is the steering architecture. This includes governing bodies, senior leadership, digital strategy, finance and procurement authority, risk management, and the formal policy framework governing learning, assessment, data, student support, staff development, and quality assurance. This layer confers legitimacy and direction. It determines whether digital provision is authorized, resourced, and reviewable.
The third layer is the operational architecture, consisting of seven interdependent domains: infrastructure and LMS ecosystem; curriculum redesign; assessment integrity; faculty capability; student support and inclusion; quality assurance; and data and AI governance. These domains should not be imagined as parallel silos. They are coupled. Weakness in one can destabilize the others. For example, strong curriculum redesign without student support weakens retention; strong infrastructure without policy redesign weakens legitimacy; strong analytics without governance weakens trust.
The fourth layer is temporal sequencing. The framework proposes four stages: preparation, transition, consolidation, and optimization. This temporal layer matters because institutions rarely possess full readiness at the outset. Credibility emerges progressively as the steering and operational layers are aligned, reviewed, and stabilized over time. Sequencing therefore functions as a causal mechanism, not merely a project-management convenience.
The fifth layer is the outcomes layer. The desired outcomes are credible provision, academic quality, resilience, inclusion, and sustainability. These outcomes are mediated by feedback loops. Continuous improvement data, quality review findings, learner experience evidence, financial monitoring, and policy review all feed back into the steering and operational layers. Without these loops, institutions cannot learn from implementation or correct drift.
The framework's central claim is that digital transition becomes credible when legitimacy, capability, and sociotechnical integration are jointly produced. Legitimacy is secured through governance, policy, and quality assurance. Capability is secured through investment in infrastructure, staff, support, and finance. Sociotechnical integration is secured by designing technical systems and academic-administrative processes together. The framework also has boundary conditions. It does not assume that every institution should move immediately to fully online provision. For some universities, credible transition may culminate in robust blended models rather than predominantly online programmes. Nor does it assume that digital transformation is inherently progressive. Poorly governed transition can expand access numerically while degrading quality or intensifying exclusion. The framework is therefore descriptive, explanatory, and cautionary at once.
Table 2 provides a compact analytical summary of the framework’s layers, core functions, and typical institutional expressions.
Table 2. Institutional Architecture for Credible Digital Transition.
| Layer | Core function | Typical institutional expressions |
|---|---|---|
| Contextual boundary conditions | Define opportunities and constraints | Regulation, mission, financing environment, connectivity realities, qualifications frameworks, labour-market demands |
| Steering architecture | Confers legitimacy, direction, and resources | Council/senate oversight, digital strategy, policy architecture, budget model, procurement and risk governance |
| Operational architecture | Builds delivery capability | Infrastructure ecosystem, curriculum redesign, assessment integrity, faculty capability, learner support, QA, data and AI governance |
| Temporal sequencing | Orders implementation and learning | Preparation, transition, consolidation, optimization |
| Outcomes and feedback loops | Stabilize credibility and improvement | Quality monitoring, learner experience evidence, analytics with safeguards, financial review, policy revision |
4.10. Companion Output: University Online Readiness and Transition Toolkit
To translate the framework into operational guidance, this study proposes a companion University Online Readiness and Transition Toolkit. The toolkit is not a substitute for institutional judgment. Rather, it offers structured prompts that can support planning, self-audit, and regulatory dialogue. An extended version of the toolkit, together with a Preparatory Alignment Memo, is available as supplementary material in the study’s Open Science Framework project (DOI: 10.17605/OSF.IO/Z7KQP).
Figure 3 shows the toolkit’s overall structure and its three linked instruments: the readiness rubric, the phased implementation roadmap, and the policy and governance checklist.
Figure 3. Structure of the University Online Readiness and Transition Toolkit. The toolkit translates the proposed framework into three operational instruments: a readiness rubric, a phased implementation roadmap, and a policy and governance checklist. Together, these instruments support institutional planning, self-audit, regulatory dialogue, and staged implementation.
The first instrument is a readiness rubric. The rubric organizes readiness across leadership and governance, policy and regulation, infrastructure, curriculum, assessment, faculty capability, learner support, quality assurance and data governance, and financial sustainability. Each domain is assessed across four maturity positions: ad hoc, emerging, structured, and assured. The purpose is not to produce a simplistic score but to force institutions to confront uneven development across domains. A university may be technically advanced yet policy-poor, or governance-strong yet learner-support weak. The rubric is therefore diagnostic rather than celebratory.
Table 3 operationalizes this readiness rubric across the principal institutional domains and four maturity levels, from ad hoc to assured.
The second instrument is a phased implementation roadmap. The roadmap distinguishes preparation, transition, consolidation, and optimization. Each phase specifies its strategic purpose, essential actions, expected outputs, and main risk if bypassed. This responds directly to a recurring weakness in digital transition literature and practice: the treatment of scaling as an act of will rather than a staged institutional process.
Table 4 presents the phased institutional transition roadmap, including the strategic purpose, essential actions, and principal risk associated with bypassing each stage.
The third instrument is a policy and governance checklist for credible online or blended provision. The checklist identifies the minimum policy architecture required before scale can be claimed with confidence. These policies include digital learning strategy, academic regulations for online and AI-affected assessment, internal quality assurance procedures, data and analytics governance, accessibility and student support protocols, workload and professional development provisions, and technology procurement and continuity rules. The checklist is intentionally modest in form but significant in implication. It asks not whether an institution has online courses, but whether it has created the rule system capable of governing them credibly.
Table 5 translates this third component into a minimum policy and governance checklist for credible online or blended provision.
Table 3. University online-readiness rubric.
| Domain | Ad hoc | Emerging | Structured | Assured |
|---|---|---|---|---|
| Leadership and governance | Pilots lack formal authority | Committee exists but with limited mandate | Digital transition owned by senior leadership and academic governance | Board/senate oversight, clear accountability, recurring review |
| Policy and regulation | Rules absent or face-to-face only | Provisional guidance in place | Approved policies for delivery, assessment, quality assurance, and student support | Policies reviewed routinely and aligned with regulation |
| Infrastructure and LMS ecosystem | Standalone platform, unstable support | Basic LMS and communication tools | Interoperable systems, helpdesk, identity, library, backup | Resilient ecosystem with service standards and continuity planning |
| Curriculum redesign | Content upload dominates | Selected modules redesigned | Programme-level redesign with standards and templates | Continuous evidence-led redesign across programmes |
| Assessment integrity | Replication of invigilated exams | Mixed online methods with partial safeguards | Authentic design plus identity and misconduct procedures | AI-aware, privacy-conscious, standards-aligned assessment regime |
| Faculty capability | Voluntary technical training only | Structured workshops and basic support | Instructional design support, workload recognition, communities of practice | Capability embedded in continuing professional development, promotion, and quality assurance |
| Student support and inclusion | Reactive technical help only | Orientation and limited advising | Integrated academic, technical, library, and wellbeing support | Targeted equity measures and accessible multi-channel support |
| Quality assurance, data, and AI governance | Fragmented reporting and weak oversight | Basic indicators and local practice | Integrated QA review, analytics rules, data responsibilities | Transparent, rights-based governance with improvement loops |
| Finance and sustainability | Project-funded and uncertain | Partial budgeting for key systems | Recurring budget and cost model established | Sustainable financing and periodic value review |
Table 4. Phased institutional transition roadmap.
| Stage | Strategic purpose | Essential actions | Main risk if bypassed |
|---|---|---|---|
| Preparation | Establish mandate and minimum conditions | Readiness audit; regulatory mapping; governance assignment; baseline policies; infrastructure minimums; programme prioritization | Launch without capability or legitimacy |
| Transition | Pilot and learn under controlled conditions | Curriculum redesign; staff development; learner onboarding; assessment redesign; intensified quality assurance monitoring | Isolated pilots mistaken for scalable model |
| Consolidation | Move from projects to operating model | Policy formalization; interoperability improvements; support stabilization; budget alignment; routine quality review | Persistent fragmentation and quality drift |
| Optimization | Deepen improvement and innovation | Analytics with safeguards; AI governance refinement; differentiated support; external partnerships; advanced blended or online models | Sophisticated rhetoric built on unstable foundations |
Table 5. Policy and governance checklist for credible online or blended provision.
| Policy area | Minimum provision | Lead office(s) |
|---|---|---|
| Digital learning strategy | Institutional purpose, scope, target modes, resourcing principles, review cycle | Senior leadership; academic affairs; planning |
| Academic regulations | Rules for online attendance, participation, progression, records, appeals, and equivalence | Senate; registry; academic affairs |
| Assessment integrity and AI use | Permissible AI use, disclosure, authorship, misconduct, identity assurance, proportional safeguards | Academic affairs; quality assurance; legal/ethics |
| Internal quality assurance | Approval, monitoring, review, learner feedback, complaint handling, enhancement cycle | Quality assurance unit; faculties; senate committees |
| Data and learning analytics governance | Purpose limitation, access rights, retention, consent/notice, human oversight, security | Information and communication technology; data protection/legal; quality assurance |
| Student support and accessibility | Orientation, advising, disability support, library access, wellbeing referral, communication standards | Student affairs; library; information and communication technology; faculties |
| Staff workload and development | Workload recognition, training expectations, support roles, incentives, review | HR; academic affairs; faculties |
| Procurement, cybersecurity, and continuity | Vendor due diligence, interoperability, backup, incident response, business continuity | ICT; procurement; finance; legal/risk |