Fundamentals of Business

1. Is there language in this institution’s strategic plan that can be tied to Open Education (OE)?

🧭 Overview

🧠 One-sentence thesis

This framework is a structured 20-question diagnostic tool for assessing an institution's engagement with Open Education across values, knowledge, support, action, and policy dimensions.

📌 Key points (3–5)

  • Purpose of the tool: a checklist to evaluate how deeply Open Education is embedded in an institution's culture, operations, and governance.
  • Five assessment dimensions: Institutional Values, Institutional Knowledge, Institutional Support, Institutional Action, and Institutional Policy.
  • Action dimension is most detailed: 10 of the 20 questions focus on concrete actions like working groups, staffing, grants, and faculty adoption.
  • Common confusion: awareness (knowledge) vs. action—knowing about OE does not mean the institution has working groups, staff, or policies in place.
  • Scope of engagement: the questions cover strategic planning, faculty awareness, leadership support, student involvement, and formal policies.

🎯 Institutional Values

🎯 Strategic alignment

  • Question 1: "Is there language in this institution's strategic plan that can be tied to Open Education (OE)?"
    • This checks whether OE appears in high-level institutional documents.
    • Strategic plans signal institutional priorities; presence of OE language indicates formal commitment.

💡 Innovation perception

  • Question 2: "Is OE considered innovative at this institution?"
    • Assesses whether the institution views Open Education as a forward-thinking or novel approach.
    • Example: An institution might adopt OE but treat it as routine rather than innovative.

📚 Institutional Knowledge

📚 Faculty awareness

  • Question 3: "If a survey was taken at this institution, would half the faculty be aware of OE?"
    • Measures basic awareness threshold—50% of faculty knowing about OE.
    • Awareness is foundational but does not imply understanding or action.

🔍 Depth of understanding

  • Question 4: "If a survey was taken at this institution, can one quarter of faculty identify multiple types of OE?"
    • Goes beyond awareness to test whether 25% of faculty can distinguish different OE forms.
    • Example: recognizing open textbooks, open courseware, and open pedagogy as distinct types.

🎓 Professional development

  • Question 5: "Does this institution have at least one professional development opportunity per year in OE?"
    • Checks for structured learning opportunities to build faculty capacity.
    • Annual frequency is the minimum threshold.

🤝 Institutional Support

🤝 Leadership endorsement

  • Question 6: "Has a senior leader (Director & above) at this institution publicly spoken in support of OE?"
    • Public statements from senior leaders signal institutional legitimacy and priority.
    • "Director & above" sets a specific seniority threshold.

📣 Champion presence

  • Question 7: "Is there at least one vocal OE champion at this institution?"
    • A champion is someone who actively promotes OE, not just supports it quietly.
    • Don't confuse: a champion may exist at any level, whereas Question 6 requires senior leadership.

📖 Bookstore alignment

  • Question 8: "Is the bookstore at this institution supportive of OE?"
    • Bookstores can facilitate or hinder OE adoption through their policies and practices.
    • Example: A supportive bookstore might promote open textbooks or not penalize courses that use them.

🚀 Institutional Action

🚀 Working group structure

| Question | Focus | Threshold |
| --- | --- | --- |
| 9 | Does an OE Working Group exist? | Presence of a formal group |
| 10 | Does the group include a senior leader who can advocate at VP level and higher? | VP-level access |
| 11 | Does the group include a member who can advocate at the board of governors? | Governance-level access |
| 12 | Does the group include students? | Student representation |
| 13 | Does the group work closely with students? | Active student collaboration |

  • Questions 9–13 assess the composition and reach of an OE Working Group.
  • The progression moves from existence → senior advocacy → governance access → student involvement.
  • Don't confuse: including students (Q12) vs. working closely with them (Q13)—the latter implies ongoing collaboration, not just membership.

👥 Staffing and resources

  • Question 14: "Is there someone on staff (.5 or more) at this institution that can assist with OE?"
    • Checks for dedicated staff capacity—at least half-time (0.5 FTE).
    • Example: A part-time OE coordinator or instructional designer with OE responsibilities.

💰 Grant program

  • Question 15: "Does this institution have an OE grant program?"
    • Grants provide financial incentives for faculty to adopt, adapt, or create OE materials.
    • Presence of a grant program signals institutional investment.

🧑‍🏫 Faculty engagement levels

| Question | Activity | Level of engagement |
| --- | --- | --- |
| 16 | Have one or more faculty adopted OE? | Using existing OE materials |
| 17 | Have one or more faculty adapted or created or contributed to OE? | Modifying or producing OE |
| 18 | Have one or more faculty or staff conducted research in OE? | Studying OE as a scholarly topic |

  • Questions 16–18 measure increasing levels of faculty/staff involvement.
  • Adoption is the entry point; adaptation/creation requires more effort; research represents scholarly engagement.
  • Example: A faculty member might adopt an open textbook (Q16), then adapt it for their course (Q17), then publish a study on its impact (Q18).

📜 Institutional Policy

📜 Course approval integration

  • Question 19: "Is OE part of the instructional design / course approval process at this institution?"
    • Checks whether OE is embedded in formal course development and approval workflows.
    • Example: A course proposal form might ask instructors to consider or report on OE materials.

🏛️ Mandate letter inclusion

  • Question 20: "Is OE part of this institution's mandate letter?"
    • Mandate letters are formal directives from governing bodies or ministries.
    • Inclusion in a mandate letter represents the highest level of institutional commitment.
    • Don't confuse: strategic plan (Q1) vs. mandate letter (Q20)—the latter is typically external and binding.

📝 Tool usage

📝 Recording and completion

  • The document includes fields for "Recorded Response" and "Inventory completed by."
  • This is a self-assessment or audit tool, not a prescriptive standard.
  • The tool is provided by BCcampus.ca, which supports British Columbia's public post-secondary system, suggesting it is designed primarily for Canadian post-secondary institutions.
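
The five-dimension checklist lends itself to a simple data model. The sketch below is a hypothetical illustration (the BCcampus tool itself is a form and prescribes no scoring format): category names and question numbers come from the text, while recording answers as True/False per question is an assumption.

```python
# Hypothetical model of the 20-question inventory. Category names and
# question numbers follow the text; yes/no booleans are an assumption.
INVENTORY = {
    "Institutional Values": [1, 2],
    "Institutional Knowledge": [3, 4, 5],
    "Institutional Support": [6, 7, 8],
    "Institutional Action": list(range(9, 19)),  # questions 9-18
    "Institutional Policy": [19, 20],
}

def tally(responses):
    """Count "yes" answers per category from a {question: bool} mapping."""
    return {
        category: sum(responses.get(q, False) for q in questions)
        for category, questions in INVENTORY.items()
    }

# Example: an institution answering yes to questions 1, 3, 7, and 16
# scores one "yes" in every dimension except Institutional Policy.
print(tally({1: True, 3: True, 7: True, 16: True}))
```

A tally like this is only an informal summary; the inventory itself records free-form responses and leaves interpretation to whoever completes it.
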

2. Is OE considered innovative at this institution?

🧭 Overview

🧠 One-sentence thesis

This question assesses whether Open Education (OE) is perceived and positioned as an innovative practice within the institution's culture and strategic framework.

📌 Key points (3–5)

  • What the question targets: whether the institution positions OE as a forward-thinking approach rather than just another routine practice.
  • Where it sits in the framework: part of "Institutional Values," alongside strategic plan alignment, forming the foundation for OE adoption.
  • Common confusion: innovation perception vs. actual adoption—an institution may consider OE innovative without yet having widespread faculty awareness or implementation (covered in later questions).
  • Why it matters: perceiving OE as innovative signals institutional readiness to invest resources, champion change, and integrate OE into strategic priorities.

🏛️ Institutional Values context

🏛️ The Values category

  • The excerpt groups this question under "Institutional Values" (questions 1–2).
  • This category examines foundational attitudes and strategic alignment before measuring knowledge, support, or action.
  • The first question asks about strategic plan language; the second (this question) asks about innovation perception.

🔗 Relationship to strategic planning

  • Question 1 looks for explicit language in the strategic plan that can be tied to OE.
  • Question 2 (this question) probes whether OE is considered innovative, which may or may not appear in formal documents.
  • Don't confuse: strategic plan mentions (Q1) are about formal commitment; innovation perception (Q2) is about cultural attitude and positioning.

🧩 What "innovative" means in this context

🧩 Innovation as institutional perception

The question asks: "Is OE considered innovative at this institution?"

  • It does not define "innovative" explicitly, but the placement suggests it means:
    • OE is seen as a forward-thinking, progressive approach.
    • The institution values OE as a way to advance teaching, learning, or access.
  • Example: An institution might highlight OE in communications as a cutting-edge practice, signaling that it views OE as innovative.

🔍 Why perception matters

  • If OE is considered innovative, the institution is more likely to:
    • Allocate resources (staff, grants, professional development).
    • Encourage champions and senior leaders to advocate publicly.
    • Integrate OE into policies and processes.
  • If OE is not seen as innovative, it may be treated as optional or peripheral, even if some faculty adopt it.

📊 How this question fits the broader assessment

📊 The 20-question framework

The excerpt presents a diagnostic tool with five categories:

| Category | Questions | Focus |
| --- | --- | --- |
| Institutional Values | 1–2 | Strategic alignment and innovation perception |
| Institutional Knowledge | 3–5 | Faculty awareness and professional development |
| Institutional Support | 6–8 | Leadership advocacy, champions, bookstore support |
| Institutional Action | 9–18 | Working groups, staffing, grants, adoption, research |
| Institutional Policy | 19–20 | Integration into processes and mandate |

🔗 Sequential logic

  • Values (Q1–2) establish whether the institution cares about OE in principle.
  • Knowledge (Q3–5) checks if faculty know what OE is.
  • Support (Q6–8) looks for vocal backing from leaders and infrastructure.
  • Action (Q9–18) measures concrete steps: working groups, staff, grants, adoption, research.
  • Policy (Q19–20) assesses formal integration into institutional processes.

🎯 Why start with innovation perception

  • Positioning OE as innovative (Q2) signals that the institution is open to change and willing to prioritize OE.
  • Without this perception, later questions (e.g., "Does the institution have an OE grant program?") are less likely to yield positive answers.
  • Example: An institution that views OE as merely "free textbooks" may not invest in professional development or research; one that sees OE as innovative may build comprehensive support structures.

🧭 Interpreting responses

🧭 What a "yes" implies

  • The institution publicly or internally frames OE as a forward-thinking practice.
  • Leadership and communications may highlight OE in innovation narratives.
  • This creates a favorable environment for the actions measured in later questions (working groups, grants, policy integration).

🧭 What a "no" implies

  • OE may be present but not prioritized or celebrated.
  • The institution may view OE as a cost-saving measure rather than a pedagogical or access innovation.
  • Subsequent questions (knowledge, support, action) may reveal gaps in awareness, advocacy, and resources.

🧭 Recording the response

  • The excerpt includes a "Recorded Response:" field for each question.
  • Responses are likely yes/no or qualitative notes.
  • The tool is designed for self-assessment or external review, with space for "Other Notes/Comments" and "Inventory completed by:" at the end.

3. If a survey was taken at this institution, would half the faculty be aware of OE?

🧭 Overview

🧠 One-sentence thesis

This question assesses whether an institution has achieved a baseline level of Open Education awareness among at least half of its faculty, serving as one indicator of institutional knowledge about OE.

📌 Key points (3–5)

  • What this measures: faculty awareness of Open Education (OE) at the institutional level.
  • The threshold: the question asks whether half (50%) of faculty would be aware of OE if surveyed.
  • Where it fits: this is question 3 under the "Institutional Knowledge" category, one of five broader assessment areas.
  • Common confusion: awareness (knowing OE exists) vs. identification (recognizing multiple types of OE)—question 3 measures basic awareness, while question 4 measures deeper knowledge.
  • Why it matters: faculty awareness is a foundational step before adoption, adaptation, or creation of Open Education resources.

📋 The assessment framework

📋 Five assessment categories

The excerpt organizes 20 questions into five groups:

| Category | Focus | Question numbers |
| --- | --- | --- |
| Institutional Values | Strategic alignment and innovation perception | 1–2 |
| Institutional Knowledge | Awareness and professional development | 3–5 |
| Institutional Support | Leadership endorsement and champions | 6–8 |
| Institutional Action | Working groups, staffing, grants, and faculty engagement | 9–18 |
| Institutional Policy | Integration into processes and mandates | 19–20 |

🎯 Purpose of the tool

  • The document is titled "20 Questions To Ask About Open Education" and is designed for institutional self-assessment.
  • Each question has a "Recorded Response" field, indicating this is a diagnostic checklist.
  • The tool helps institutions evaluate their OE maturity across multiple dimensions.

🧠 Institutional Knowledge questions

🧠 Question 3: Faculty awareness threshold

"If a survey was taken at this institution, would half the faculty be aware of OE?"

  • This is a yes/no diagnostic question about baseline awareness.
  • The threshold is 50%—not universal awareness, but a significant portion of faculty.
  • The question is hypothetical ("if a survey was taken"), suggesting institutions may not have formal data but should estimate based on available evidence.

🔍 How this differs from question 4

Question 4 asks: "If a survey was taken at this institution, can one quarter of faculty identify multiple types of OE?"

  • Question 3 = awareness (knowing OE exists)
  • Question 4 = identification (recognizing different OE types)
  • The threshold drops from half (50%) to one quarter (25%) because deeper knowledge is harder to achieve.
  • Don't confuse: an institution might have broad awareness (Q3) but limited detailed knowledge (Q4), or vice versa.

📚 Question 5: Professional development

"Does this institution have at least one professional development opportunity per year in OE?"

  • This question measures whether the institution actively builds knowledge through training.
  • The threshold is minimal: at least one opportunity per year.
  • Example: An institution might offer an annual workshop, webinar, or training session on adopting open textbooks.

🔗 How knowledge connects to other dimensions

🔗 Knowledge as a foundation

  • Awareness (Institutional Knowledge) precedes action and policy.
  • Without faculty knowing what OE is (Q3), they cannot adopt it (Q16), adapt it (Q17), or conduct research on it (Q18).
  • Professional development (Q5) is a mechanism to move from low to high awareness.

🔗 Support and action build on knowledge

  • Institutional Support (Q6–8) includes leadership endorsement and champions who can raise awareness.
  • Institutional Action (Q9–18) includes working groups, staffing, and grants that require faculty to already understand OE.
  • Example: A vocal OE champion (Q7) can help increase the percentage of aware faculty, moving the institution closer to the 50% threshold in Q3.

📊 Interpreting the awareness question

📊 What a "yes" answer suggests

  • At least half the faculty have heard of Open Education and understand it exists.
  • The institution has likely invested in communication, professional development, or visible OE initiatives.
  • This does not guarantee adoption or deep engagement, only baseline familiarity.

📊 What a "no" answer suggests

  • Fewer than half the faculty are aware of OE.
  • The institution may need to prioritize awareness-building activities (e.g., Q5 professional development, Q7 champions).
  • Without awareness, later questions about adoption (Q16) or policy integration (Q19) are less likely to yield positive answers.

📊 Using the question for planning

  • If the answer is "no," the institution might focus on Institutional Knowledge and Institutional Support questions first.
  • If the answer is "yes," the institution can assess whether awareness translates into action (Q9–18) and policy (Q19–20).
  • Example: An institution with 60% awareness but no OE grant program (Q15) might prioritize funding mechanisms next.
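
When actual survey numbers exist, the two knowledge thresholds (50% for Q3, 25% for Q4) reduce to simple arithmetic. A minimal sketch with illustrative counts (all figures below are hypothetical):

```python
# Hypothetical threshold check for Q3 (50% awareness) and
# Q4 (25% can identify multiple OE types). Counts are illustrative.
def meets_threshold(count, total, threshold):
    """Return True if count/total reaches the given fraction."""
    return count / total >= threshold

total_faculty = 200   # illustrative headcount
aware = 120           # have heard of OE
identify = 45         # can name multiple OE types

q3 = meets_threshold(aware, total_faculty, 0.50)     # 120/200 = 60%  -> True
q4 = meets_threshold(identify, total_faculty, 0.25)  # 45/200 = 22.5% -> False
print(q3, q4)
```

This illustrates the pattern the framework anticipates: broad awareness can be in place while deeper knowledge still falls short of its lower 25% bar.
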

4. If a survey was taken at this institution, can one quarter of faculty identify multiple types of OE?

🧭 Overview

🧠 One-sentence thesis

This question assesses whether at least 25% of faculty possess detailed knowledge of Open Education by being able to identify multiple OE types, serving as a benchmark for institutional knowledge depth.

📌 Key points (3–5)

  • What the question measures: faculty's ability to distinguish and name multiple types of Open Education, not just awareness.
  • The threshold: one quarter (25%) of faculty—a specific benchmark for institutional knowledge assessment.
  • Where it fits: this is question 4 under "Institutional Knowledge," following a broader awareness question (question 3 asks if half the faculty are aware of OE).
  • Common confusion: awareness vs. identification—question 3 measures general awareness, while question 4 measures deeper knowledge of OE types.
  • Context: part of a 20-question diagnostic tool covering institutional values, knowledge, support, action, and policy related to Open Education.

📊 The diagnostic framework

📊 Five assessment categories

The 20 questions are organized into five domains:

| Category | Question Range | Focus |
| --- | --- | --- |
| Institutional Values | 1–2 | Strategic alignment and innovation perception |
| Institutional Knowledge | 3–5 | Faculty awareness, identification skills, and professional development |
| Institutional Support | 6–8 | Leadership endorsement, champions, and bookstore support |
| Institutional Action | 9–18 | Working groups, staffing, grants, adoption, adaptation, and research |
| Institutional Policy | 19–20 | Integration into processes and mandate |

🎯 Purpose of the tool

  • The excerpt presents a self-assessment inventory for institutions to evaluate their Open Education readiness and engagement.
  • Each question expects a "Recorded Response" to be documented.
  • The tool is provided by BCcampus.ca.

🧠 Institutional Knowledge section

🧠 Three knowledge benchmarks

The "Institutional Knowledge" category (questions 3–5) creates a progression:

  1. Question 3: Would half the faculty be aware of OE? (50% awareness threshold)
  2. Question 4: Can one quarter of faculty identify multiple types of OE? (25% detailed knowledge threshold)
  3. Question 5: Does the institution have at least one professional development opportunity per year in OE? (infrastructure for knowledge building)

🔍 What "identify multiple types" means

The question asks whether faculty can identify multiple types of OE, not just recognize the term.

  • This requires distinguishing between different forms of Open Education (e.g., open textbooks, open courseware, open pedagogy, open access).
  • It is a higher bar than simple awareness—faculty must know enough to categorize and name different OE approaches.
  • Example: A faculty member who can only say "I've heard of OE" would not meet this standard; one who can list "open textbooks, OER repositories, and openly licensed course materials" would.

⚖️ The 25% threshold

  • The question sets a specific benchmark: one quarter of faculty.
  • This is lower than the 50% awareness threshold in question 3, reflecting that detailed knowledge is harder to achieve than general awareness.
  • Don't confuse: this is not asking if 25% know something about OE—it's asking if 25% can identify multiple types.

🔄 Relationship to other questions

🔄 Knowledge progression

The excerpt shows a logical sequence:

  • Awareness first (Q3): Do half the faculty know OE exists?
  • Detailed knowledge second (Q4): Can a quarter identify multiple types?
  • Support infrastructure third (Q5): Is there professional development to build this knowledge?

🔄 Connection to action and policy

  • Institutional Knowledge (questions 3–5) precedes Institutional Action (questions 9–18).
  • The structure implies that knowledge is a foundation: faculty must understand OE types before they can adopt (Q16), adapt (Q17), or research (Q18) them.
  • Policy questions (19–20) come last, suggesting that formal integration follows after values, knowledge, support, and action are established.

📝 Using this question

📝 How to answer

  • The excerpt provides space for a "Recorded Response" after each question.
  • Institutions would likely need to conduct or reference a faculty survey to answer this question accurately.
  • The question is phrased as "If a survey was taken," suggesting the answer may be an informed estimate when no actual survey data exists.

📝 What a "yes" indicates

  • At least 25% of faculty have moved beyond basic awareness to functional knowledge.
  • The institution has achieved a meaningful level of OE literacy among its teaching staff.
  • There is likely a foundation for broader adoption and adaptation efforts.

📝 What a "no" suggests

  • Faculty knowledge may be shallow or concentrated in a small group.
  • The institution may need to invest in professional development (question 5) or champion visibility (question 7).
  • Don't confuse: a "no" here does not mean zero knowledge—it means the 25% threshold for detailed knowledge has not been reached.

5. Does this institution have at least one professional development opportunity per year in OE?

🧭 Overview

🧠 One-sentence thesis

This question assesses whether an institution provides at least one annual professional development opportunity focused on Open Education, serving as an indicator of institutional knowledge and commitment to OE.

📌 Key points (3–5)

  • What it measures: the presence of recurring (yearly) professional development opportunities specifically about Open Education.
  • Where it fits: this is question 5 in a 20-question framework, positioned under the "Institutional Knowledge" category.
  • Common confusion: this question is about professional development (training/learning opportunities), not about strategic plans (question 1) or faculty awareness levels (questions 3–4).
  • Related questions: it sits between questions about faculty awareness/identification of OE types (3–4) and questions about leadership support (6–7).
  • Why it matters: regular professional development signals that the institution actively builds capacity and knowledge around Open Education, not just passive awareness.

📚 The question in context

📚 Part of a broader assessment framework

  • The excerpt presents a 20-question inventory designed to evaluate an institution's relationship with Open Education.
  • Questions are grouped into five categories:
    • Institutional Values (questions 1–2)
    • Institutional Knowledge (questions 3–5)
    • Institutional Support (questions 6–8)
    • Institutional Action (questions 9–18)
    • Institutional Policy (questions 19–20)

🎯 Positioned in "Institutional Knowledge"

  • Question 5 is the third and final question in the Institutional Knowledge section.
  • It follows two questions about faculty awareness and identification of OE types.
  • This placement suggests professional development is seen as a mechanism for building institutional knowledge.

🔍 What the question asks

🔍 The specific criterion

"Does this institution have at least one professional development opportunity per year in OE?"

  • Frequency threshold: "at least one per year" sets a minimum bar for recurring commitment.
  • Type of activity: "professional development opportunity" refers to training, workshops, seminars, or similar learning events.
  • Subject focus: the opportunity must be specifically about Open Education (OE).

🎓 What counts as professional development

  • The excerpt does not define "professional development opportunity" in detail.
  • Context suggests it means structured learning activities for faculty or staff to build OE knowledge and skills.
  • Example: An institution might offer an annual workshop on creating open educational resources, or a seminar on OE licensing.

⚖️ The threshold: "at least one"

  • The question sets a low bar: even a single annual event qualifies as a "yes."
  • This suggests the question is screening for any regular professional development presence, not evaluating quality or depth.
  • Don't confuse: this is not asking whether most faculty participate, only whether the opportunity exists.

🧩 How this question relates to others

🧩 Building on awareness questions

| Question | Focus | What it measures |
| --- | --- | --- |
| 3 | Faculty awareness | Would half the faculty be aware of OE? |
| 4 | Faculty identification | Can one quarter identify multiple types of OE? |
| 5 | Professional development | Is there at least one PD opportunity per year? |

  • Questions 3 and 4 measure existing knowledge levels among faculty.
  • Question 5 measures whether the institution actively creates opportunities to build that knowledge.
  • The progression suggests that professional development is one pathway to achieving the awareness and identification levels asked about in questions 3–4.

🔗 Connecting knowledge to action

  • The Institutional Knowledge section (questions 3–5) sits between Values (1–2) and Support (6–8).
  • Professional development bridges passive awareness and active institutional support.
  • Example: An institution might value OE (question 1), offer PD opportunities (question 5), and then have leaders publicly support OE (question 6).

📋 Using this question in practice

📋 Recording the response

  • The excerpt includes a "Recorded Response:" field for each question.
  • Responses are likely yes/no or descriptive notes about what PD opportunities exist.
  • The inventory is designed to be completed by someone familiar with the institution's OE activities.

📋 What a "yes" indicates

  • A "yes" suggests the institution has made a recurring commitment to OE professional development.
  • It does not guarantee high participation, quality content, or impact on practice.
  • Don't confuse: a "yes" here does not mean the institution has an OE Working Group (question 9) or dedicated staff (question 14)—those are separate indicators of institutional action.

📋 What a "no" reveals

  • A "no" suggests the institution lacks regular, structured OE learning opportunities for faculty/staff.
  • This gap may explain lower awareness (question 3) or identification (question 4) levels.
  • Example: An institution might value OE in its strategic plan (question 1) but fail to provide PD opportunities (question 5), indicating a gap between values and knowledge-building infrastructure.

🌐 Broader inventory context

🌐 The full 20-question framework

  • The excerpt provides all 20 questions across five categories.
  • Other categories include:
    • Institutional Support: leadership endorsement, champions, bookstore support (6–8)
    • Institutional Action: working groups, staffing, grants, faculty adoption/adaptation/research (9–18)
    • Institutional Policy: integration into course approval and mandate letters (19–20)

🌐 Cumulative assessment

  • The 20 questions together create a profile of an institution's OE maturity.
  • No single question determines overall commitment; the inventory is designed to reveal strengths and gaps across multiple dimensions.
  • Example: An institution might score well on Institutional Knowledge (questions 3–5) but poorly on Institutional Policy (questions 19–20), suggesting awareness without formal integration.
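
One informal way to surface those strengths and gaps is to compare per-category "yes" rates. A sketch under an assumed yes/no recording format (the tool prescribes no scoring; the short category labels and example answers are hypothetical):

```python
# Hypothetical gap profile: share of "yes" answers per category and the
# weakest dimension. Question ranges follow the text; scoring is assumed.
CATEGORIES = {
    "Values": range(1, 3),      # questions 1-2
    "Knowledge": range(3, 6),   # questions 3-5
    "Support": range(6, 9),     # questions 6-8
    "Action": range(9, 19),     # questions 9-18
    "Policy": range(19, 21),    # questions 19-20
}

def profile(yes_questions):
    """Return (per-category yes rates, name of the weakest category)."""
    rates = {
        name: sum(q in yes_questions for q in qs) / len(qs)
        for name, qs in CATEGORIES.items()
    }
    return rates, min(rates, key=rates.get)

rates, weakest = profile({1, 2, 3, 5, 7, 16})
print(weakest)  # -> Policy (no policy question answered yes)
```

An institution with this profile scores well on Values and Knowledge but shows no formal policy integration, matching the "awareness without formal integration" pattern described in the example above.
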

6. Has a senior leader (Director & above) at this institution publicly spoken in support of OE?

🧭 Overview

🧠 One-sentence thesis

This question assesses whether an institution has visible senior leadership endorsement of Open Education through public statements by directors or higher-level administrators.

📌 Key points (3–5)

  • What the question measures: whether senior leaders (Director level and above) have publicly voiced support for Open Education.
  • Where it fits: this is question 6 under the "Institutional Support" category, one of several dimensions used to evaluate an institution's Open Education maturity.
  • Common confusion: this question is about public statements by senior leaders, not general faculty support or private institutional discussions.
  • Related questions: the framework also asks about vocal OE champions (question 7) and bookstore support (question 8) as part of the broader Institutional Support assessment.
  • Why it matters: public senior leadership support signals institutional commitment and can influence resource allocation, policy, and faculty adoption.

🏢 The role of senior leadership support

👔 Who counts as a senior leader

  • The question specifies Director level and above.
  • This includes directors, vice presidents, presidents, and other executive-level administrators.
  • Example: A dean or provost making a public statement would qualify; a department chair typically would not.

📢 What "publicly spoken" means

  • The question asks whether the leader has made a public statement.
  • This distinguishes visible, on-the-record endorsement from internal discussions or private support.
  • Public statements might include speeches, published articles, institutional announcements, or conference presentations.
  • Don't confuse: private encouragement or internal memos do not meet the "publicly spoken" criterion.

🗂️ Context within the assessment framework

🗂️ Institutional Support category

The excerpt places this question in the Institutional Support section, which includes:

| Question # | Focus |
| --- | --- |
| 6 | Senior leader public support |
| 7 | Presence of at least one vocal OE champion |
| 8 | Bookstore support for OE |

  • These three questions together assess whether the institution has visible backing from leadership, advocates, and operational units.

🔗 Relationship to other dimensions

The full framework evaluates institutions across multiple categories:

  • Institutional Values (questions 1–2): strategic alignment and perception of innovation
  • Institutional Knowledge (questions 3–5): faculty awareness and professional development
  • Institutional Support (questions 6–8): leadership endorsement and champions
  • Institutional Action (questions 9–18): working groups, staffing, grants, adoption, and research
  • Institutional Policy (questions 19–20): integration into processes and mandates

🎯 Why this question matters

  • Senior leadership support can:
    • Legitimize Open Education initiatives within the institution
    • Influence budget and resource decisions
    • Encourage faculty participation by signaling institutional priorities
    • Shape institutional culture around openness and innovation
  • Example: An institution where a vice president has publicly endorsed OE is more likely to allocate funding and staff time to OE initiatives than one where support remains informal.

📝 How to use this question

📝 Recording responses

  • The excerpt includes a "Recorded Response" field for documenting the answer.
  • Responses should be factual: identify specific instances of public statements if they exist.
  • Example response format: "Yes – the Vice President of Academic Affairs spoke about OE at the annual faculty meeting in [year]" or "No – no public statements identified."

🔍 What to look for

When answering this question, consider:

  • Has a director or higher-level leader made a public statement about Open Education?
  • Was the statement clearly supportive (not neutral or critical)?
  • Was it truly public (accessible to faculty, students, or the broader community)?

Don't confuse:

  • A senior leader attending an OE meeting is not the same as publicly speaking in support.
  • General statements about "innovation" or "affordability" may not explicitly mention Open Education.

7. Is there at least one vocal OE champion at this institution?

🧭 Overview

🧠 One-sentence thesis

The presence of at least one vocal Open Education (OE) champion is one indicator of institutional support for OE, positioned between senior leader endorsement and bookstore support in a broader assessment framework.

📌 Key points (3–5)

  • What this question assesses: whether the institution has a vocal advocate specifically for Open Education.
  • Where it fits: part of the "Institutional Support" category, alongside senior leader public support and bookstore support.
  • Common confusion: a vocal champion is distinct from a senior leader who speaks publicly—the champion role does not specify rank or formal authority.
  • Broader context: this is question 7 in a 20-question framework that evaluates institutional values, knowledge, support, action, and policy around OE.
  • Why it matters: vocal champions signal grassroots or mid-level advocacy, complementing top-down leadership support.

📋 The 20-question framework structure

📋 Five assessment categories

The excerpt presents a diagnostic tool organized into five areas:

| Category | Questions | Focus |
| --- | --- | --- |
| Institutional Values | 1–2 | Strategic alignment and innovation perception |
| Institutional Knowledge | 3–5 | Faculty awareness and professional development |
| Institutional Support | 6–8 | Leadership endorsement, champions, and bookstore support |
| Institutional Action | 9–18 | Working groups, staffing, grants, adoption, adaptation, and research |
| Institutional Policy | 19–20 | Integration into processes and mandate |

  • The framework moves from values and awareness through support and action to formal policy.
  • Each category builds on the previous: support follows knowledge; action follows support; policy formalizes action.

🎯 Purpose of the framework

  • The tool is designed for institutional self-assessment: "20 Questions To Ask About Open Education."
  • It includes space for "Recorded Response" and "Other Notes/Comments," indicating it is meant to be completed by someone at the institution.
  • The final line asks "Inventory completed by:", suggesting this is a structured inventory or audit tool.

🗣️ What "vocal OE champion" means

🗣️ The champion role

A vocal OE champion: at least one person at the institution who actively advocates for Open Education.

  • The term "vocal" implies public, visible, or outspoken advocacy—not silent or behind-the-scenes support.
  • The question asks for "at least one," setting a minimum threshold rather than expecting widespread champions.
  • Example: An organization might have a faculty member who regularly promotes OE in meetings, emails, or campus events, even if they hold no formal leadership position.

🔍 How this differs from senior leader support

  • Question 6 (preceding this one) asks: "Has a senior leader (Director & above) at this institution publicly spoken in support of OE?"
  • Question 7 (this question) asks about a "vocal OE champion" without specifying rank.
  • Don't confuse: a senior leader speaking publicly is about top-down endorsement; a vocal champion may be at any level and represents sustained advocacy rather than a single public statement.
  • The champion role emphasizes ongoing, active promotion; the senior leader question emphasizes formal authority and public positioning.

🏢 Institutional Support context

🏢 The three support indicators

The "Institutional Support" category includes exactly three questions:

  1. Senior leader public support (Question 6): formal authority speaking publicly.
  2. Vocal OE champion (Question 7): active advocate at any level.
  3. Bookstore support (Question 8): operational/logistical support from the bookstore.
  • These three elements represent different types of support: leadership endorsement, grassroots/mid-level advocacy, and infrastructure/operational backing.
  • Together they assess whether OE has visible champions, formal backing, and practical support systems.

🔗 Relationship to other categories

  • Before this question: the framework has already assessed whether OE aligns with institutional values (Questions 1–2) and whether faculty are aware of OE (Questions 3–5).
  • After this question: the framework moves to "Institutional Action," asking about working groups, staffing, grants, adoption, and research (Questions 9–18).
  • The logic: support (including champions) should follow from awareness and precede concrete action.
  • Example: An institution might have faculty who know about OE (Question 3) and a vocal champion (Question 7), but without a working group (Question 9) or staff support (Question 14), action may be limited.

🧩 Interpreting the question

🧩 What a "yes" answer suggests

  • At least one person is actively promoting OE within the institution.
  • There is visible, ongoing advocacy—not just passive awareness or one-time mentions.
  • The institution has someone who can raise OE visibility, educate peers, and push for adoption or policy changes.

🧩 What a "no" answer suggests

  • OE may lack a visible advocate, even if some faculty are aware or interested.
  • Without a vocal champion, OE initiatives may struggle to gain momentum or visibility.
  • The institution may need to identify or cultivate champions as part of building support.

⚠️ Limitations of this single question

  • The question does not ask how many champions exist, only whether there is "at least one."
  • It does not specify what "vocal" means in practice—frequency, audience, or impact of advocacy.
  • It does not ask whether the champion is effective or whether their advocacy leads to concrete outcomes.
  • Don't confuse: having a champion is an indicator of support, but it does not guarantee action or policy change (those are assessed in later questions).

🌐 Broader assessment implications

🌐 How this question fits the whole inventory

  • The 20-question framework is cumulative: each question adds a piece to the overall picture of institutional OE readiness.
  • Question 7 is positioned early in the "support" section, suggesting that identifying champions is a foundational step before expecting organized action (working groups, grants, etc.).
  • The framework does not prescribe a passing score; it is diagnostic, helping institutions identify strengths and gaps.

🌐 Who might complete this inventory

  • The excerpt indicates it is designed for someone within the institution: "Inventory completed by:" with space for a name.
  • Likely users: OE advocates, instructional designers, library staff, or administrators assessing readiness for OE initiatives.
  • The tool is attributed to "BCcampus.ca," suggesting it originates from a Canadian open education organization, but the excerpt does not provide background on BCcampus itself.

8. Is the bookstore at this institution supportive of OE?

🧭 Overview

🧠 One-sentence thesis

Bookstore support for Open Education is one indicator of institutional commitment, sitting alongside senior leadership endorsement and the presence of vocal champions within the broader category of institutional support.

📌 Key points (3–5)

  • Where bookstore support fits: it is one of three questions under "Institutional Support," alongside senior leader endorsement and the presence of an OE champion.
  • What the question asks: whether the bookstore at the institution is supportive of OE—a yes/no recorded response.
  • Common confusion: bookstore support is not the same as institutional action (e.g., working groups, grants, adoption by faculty); it is about whether a key campus unit backs OE.
  • Context: the question is part of a 20-question inventory covering institutional values, knowledge, support, action, and policy related to Open Education.

🏢 Institutional Support category

🏢 What "Institutional Support" measures

The excerpt groups three questions under "Institutional Support":

  • Has a senior leader (Director & above) publicly spoken in support of OE?
  • Is there at least one vocal OE champion at this institution?
  • Is the bookstore at this institution supportive of OE?

These questions assess whether key individuals and units within the institution back Open Education, not just whether policies or programs exist.

🛒 The bookstore question

Question 8: Is the bookstore at this institution supportive of OE?

  • The question is binary: a recorded response (yes/no or similar).
  • It does not specify how the bookstore shows support (e.g., stocking OE materials, promoting them, reducing traditional textbook emphasis).
  • Example: An institution might have a bookstore that actively helps faculty find and order open textbooks, or one that resists OE because it reduces textbook sales.

🎤 Senior leader and champion questions

The other two questions in this category focus on people:

  • Senior leader: Director level and above, publicly speaking in support.
  • Vocal champion: at least one person actively advocating for OE.

Don't confuse: a vocal champion is not necessarily a senior leader; the excerpt treats them as separate indicators.

🗂️ How this question fits the full inventory

🗂️ The five categories

The 20-question inventory is organized into five areas:

| Category | Question numbers | What it covers |
| --- | --- | --- |
| Institutional Values | 1–2 | Strategic plan language, whether OE is seen as innovative |
| Institutional Knowledge | 3–5 | Faculty awareness, ability to identify OE types, professional development |
| Institutional Support | 6–8 | Senior leader endorsement, vocal champion, bookstore support |
| Institutional Action | 9–18 | Working groups, staffing, grants, adoption, adaptation, research |
| Institutional Policy | 19–20 | OE in course approval, OE in mandate letter |

Question 8 (bookstore support) is the last question in the "Institutional Support" section, bridging the gap between endorsement by individuals and concrete actions by the institution.

📋 Purpose of the inventory

  • The excerpt presents this as a tool for assessing an institution's engagement with Open Education.
  • Each question has a "Recorded Response" field, suggesting it is meant to be completed by someone at the institution (identified at the end: "Inventory completed by:").
  • The inventory does not provide scoring or interpretation; it is a checklist for self-assessment or planning.

🔍 Distinguishing support from action

🔍 Support vs action

  • Support (questions 6–8): public statements, champions, and bookstore backing—these signal institutional willingness and culture.
  • Action (questions 9–18): working groups, staff allocation, grants, faculty adoption, research—these are tangible steps and outcomes.

Don't confuse: bookstore support is not the same as faculty adoption (question 16) or having an OE grant program (question 15). Support is about whether a campus unit is aligned with OE; action is about whether the institution has implemented programs or seen faculty engagement.

🧩 Why the bookstore matters

  • The bookstore is a key touchpoint for students acquiring course materials.
  • If the bookstore resists or ignores OE, it can create friction even when faculty want to adopt open resources.
  • Example: A faculty member chooses an open textbook, but the bookstore continues to order expensive traditional texts by default, confusing students and undermining the OE initiative.

📝 Completing the inventory

📝 Recorded response

  • Each question has a "Recorded Response:" field.
  • The excerpt does not specify the format (yes/no, scale, narrative), but the questions are phrased as yes/no or "does this institution have...?"
  • The inventory is attributed to BCcampus.ca, suggesting it is a resource from that organization.

📝 Other notes

  • The inventory includes an "Other Notes/Comments" section and a line for "Inventory completed by:" at the end.
  • This suggests the tool is meant for internal use, planning, or reporting, not for external benchmarking or ranking.

9. Does this institution have an OE Working Group?

🧭 Overview

🧠 One-sentence thesis

The presence and composition of an Open Education Working Group is a key indicator of institutional action toward Open Education, especially when it includes senior leaders, board advocates, and students.

📌 Key points (3–5)

  • What the question measures: whether the institution has established a formal OE Working Group as part of its institutional action.
  • Why composition matters: the excerpt emphasizes that effective OE Working Groups should include senior leaders (VP-level advocates), board-level advocates, and students.
  • How it fits into assessment: this question is part of the "Institutional Action" category, which also covers staffing, grants, faculty adoption, and research.
  • Common confusion: having a Working Group is not enough—questions 10–13 probe whether the group has the right membership and student engagement to drive real change.
  • Broader context: the question is one of 20 designed to assess an institution's Open Education maturity across values, knowledge, support, action, and policy.

🏗️ The OE Working Group as institutional action

🏗️ What the question asks

Does this institution have an OE Working Group?

  • This is question 9 in a 20-question assessment framework.
  • It appears under the "Institutional Action" heading, which focuses on concrete steps the institution has taken.
  • The question is binary: either the institution has established such a group or it has not.

🔗 Why it matters

  • A Working Group signals that the institution has moved beyond awareness or support to organizing formal structures.
  • The excerpt groups this question with others about staffing, grants, adoption, and research—all tangible actions.
  • Example: An institution might have champions (question 7) and senior leader support (question 6), but without a Working Group, coordination and sustained effort may be lacking.

👥 Who should be in the Working Group

👔 Senior leadership representation

  • Question 10 asks: "Does the OE Working Group at this institution include a senior leader who can advocate at the VP level and higher?"
  • Why this matters: VP-level advocates can influence budget, policy, and strategic priorities.
  • Don't confuse: a Working Group can exist without senior leaders, but the excerpt treats their presence as a separate, important criterion.

🏛️ Board-level advocacy

  • Question 11 asks: "Does the OE Working Group at this institution include a member who can advocate at the board of governors?"
  • Why this matters: board-level advocates can embed OE into governance and long-term institutional direction.
  • Example: A faculty-only Working Group may lack the authority to change institutional mandates or secure sustained funding.

🎓 Student involvement

  • Question 12 asks: "Does the OE Working Group at this institution include students?"
  • Question 13 asks: "Does the OE Working group at this institution work closely with students?"
  • Why this matters: students are the primary beneficiaries of Open Education (e.g., reduced textbook costs, open access to materials).
  • The excerpt distinguishes between formal membership (question 12) and close collaboration (question 13).

🔄 How the Working Group connects to other actions

🛠️ Staffing and resources

  • Question 14 asks: "Is there someone on staff (.5 or more) at this institution that can assist with OE?"
  • A Working Group without dedicated staff may struggle to execute initiatives.
  • Example: The Working Group sets goals, but a part-time or full-time staff member handles day-to-day coordination, grant administration, and faculty support.

💰 Grant programs and faculty engagement

  • Question 15: "Does this institution have an OE grant program?"
  • Question 16: "Have one or more faculty at this institution adopted OE?"
  • Question 17: "Have one or more faculty at this institution adapted or created or contributed to OE?"
  • Question 18: "Have one or more faculty or staff at this institution conducted research in OE?"
  • The Working Group often oversees or promotes these activities.
  • Don't confuse: faculty adoption (question 16) can happen without a Working Group, but a Working Group typically aims to increase adoption and creation.

📋 The broader assessment framework

📋 Four categories before "Institutional Action"

| Category | Sample questions from the excerpt |
| --- | --- |
| Institutional Values | Is OE in the strategic plan? Is OE considered innovative? |
| Institutional Knowledge | Would half the faculty be aware of OE? Can a quarter identify multiple types? |
| Institutional Support | Has a senior leader publicly supported OE? Is there a vocal champion? Is the bookstore supportive? |
| Institutional Action | Does the institution have an OE Working Group? (and related questions 10–18) |

  • The excerpt also includes an "Institutional Policy" category (questions 19–20), asking whether OE is part of course approval processes or the institution's mandate letter.
  • The Working Group question sits in the middle of the "Institutional Action" section, suggesting it is a foundational step that enables other actions (grants, adoption, research).

📝 How to use the framework

  • The excerpt provides a "Recorded Response" field for each question, indicating this is a self-assessment or audit tool.
  • At the end, there is space for "Other Notes/Comments" and "Inventory completed by," suggesting institutions document their current state.
  • Example: An institution might answer "Yes" to question 9 but "No" to questions 10–12, revealing that the Working Group exists but lacks key stakeholders.

10. Does the OE Working Group at this institution include a senior leader who can advocate at the VP level and higher?

🧭 Overview

🧠 One-sentence thesis

This question assesses whether an institution's Open Education Working Group has sufficient high-level representation to advocate for OE at the vice-president level and above.

📌 Key points (3–5)

  • What the question targets: presence of a senior leader within the OE Working Group who can influence decision-making at VP level and higher.
  • Where it fits: part of the "Institutional Action" category, which examines concrete steps an institution takes toward Open Education.
  • Related questions: this is one of four questions about OE Working Group composition and function (questions 9–13).
  • Common confusion: don't confuse "senior leader" in general (question 6, Director & above) with "senior leader in the Working Group who can advocate at VP level" (question 10)—this question requires both membership and seniority.
  • Why it matters: VP-level advocacy enables the Working Group to influence strategic decisions and resource allocation at the highest institutional tiers.

🏛️ Institutional Action framework

🏛️ What "Institutional Action" measures

Institutional Action: the category of questions (9–18) that examine whether an institution has taken concrete organizational steps to support Open Education.

  • This category goes beyond values, awareness, or verbal support.
  • It looks for structures (Working Groups, staff positions, grant programs) and behaviors (adoption, adaptation, research).
  • Question 10 sits within this framework as a measure of Working Group composition.

🔗 How question 10 relates to other Working Group questions

The excerpt presents four consecutive questions about the OE Working Group:

| Question | Focus |
| --- | --- |
| 9 | Does the institution have an OE Working Group at all? |
| 10 | Does the Working Group include a senior leader who can advocate at VP level and higher? |
| 11 | Does the Working Group include a member who can advocate at the board of governors? |
| 12 | Does the Working Group include students? |

  • Question 9 is the prerequisite: the Working Group must exist.
  • Questions 10–12 probe who is in the Working Group and what levels of influence they bring.
  • Don't confuse: question 10 focuses on VP-level advocacy; question 11 focuses on board-level advocacy—these are distinct governance tiers.

👤 Senior leader definition and scope

👤 What "senior leader" means in this context

  • The excerpt does not define "senior leader" explicitly in question 10, but question 6 provides a reference point: "senior leader (Director & above)."
  • Question 10 adds a functional requirement: the leader must be able to "advocate at the VP level and higher."
  • This implies the leader either is at VP level or has sufficient access and credibility to influence VPs and higher executives.

🎯 What "advocate at the VP level and higher" means

  • Advocate: speak in support of, make the case for, influence decisions about Open Education.
  • At the VP level and higher: vice-presidents, provosts, presidents, or equivalent senior executives who set institutional strategy and allocate major resources.
  • Example: An OE Working Group member who is a VP themselves, or a Director who regularly advises VPs, can fulfill this role; a faculty member without such access cannot.

🔍 Distinguishing related questions

🔍 Question 6 vs question 10

| Aspect | Question 6 | Question 10 |
| --- | --- | --- |
| What it asks | Has a senior leader publicly spoken in support of OE? | Does the OE Working Group include a senior leader who can advocate at VP level and higher? |
| Category | Institutional Support | Institutional Action |
| Key difference | Public statement (one-time or occasional) | Membership in the Working Group (ongoing structural role) |

  • Don't confuse: a senior leader can publicly support OE (question 6) without being part of the Working Group (question 10), and vice versa.

🔍 Question 10 vs question 11

  • Question 10: advocacy at the VP level and higher (executive leadership within the institution).
  • Question 11: advocacy at the board of governors (the governing body that oversees the institution, typically external to day-to-day operations).
  • These are different governance layers; an institution might have one but not the other.

📋 How to use this question

📋 What a "yes" answer indicates

  • The OE Working Group has direct access to high-level institutional decision-making.
  • OE initiatives can be championed in strategic planning, budget discussions, and policy development at the VP level.
  • Example: An institution's OE Working Group includes the VP Academic, who can bring OE priorities into senior leadership meetings and resource allocation decisions.

📋 What a "no" answer indicates

  • The Working Group may lack influence at the highest levels, limiting its ability to secure resources or policy changes.
  • OE efforts may remain grassroots or mid-level without strategic integration.
  • This does not mean OE is absent, but it may face structural barriers to scaling or institutionalization.

📋 Recording the response

  • The excerpt provides a "Recorded Response:" field for each question.
  • Responses should be factual: does the Working Group currently include such a leader, yes or no?
  • The "Other Notes/Comments" section at the end allows for context or clarification.

11. Does the OE Working Group at this institution include a member who can advocate at the board of governors?

🧭 Overview

🧠 One-sentence thesis

This question assesses whether the institution's Open Education Working Group has representation at the board-of-governors level, which is one indicator of institutional action toward Open Education.

📌 Key points (3–5)

  • What the question measures: whether the OE Working Group includes someone who can advocate at the board of governors.
  • Where it fits: this is question 11 of 20, part of the "Institutional Action" category (questions 9–18).
  • Related questions: the excerpt also asks about VP-level advocacy (question 10) and student inclusion (question 12) in the same Working Group.
  • Common confusion: board-level advocacy is distinct from VP-level advocacy—the board of governors is typically a higher governance tier than vice presidents.
  • Purpose: the question is part of a diagnostic tool to evaluate an institution's Open Education maturity across values, knowledge, support, action, and policy.

🏛️ Governance-level representation

🏛️ What board-of-governors advocacy means

The question asks: "Does the OE Working Group at this institution include a member who can advocate at the board of governors?"

  • The board of governors is a governance body at the institution.
  • "Advocate at the board" means the Working Group member has access to or influence with this body.
  • This is about representation and voice at the highest institutional decision-making level.

🔍 Why this level matters

  • The excerpt places this question under "Institutional Action," suggesting that having board-level advocacy is a concrete step an institution can take.
  • It signals that Open Education concerns can reach the institution's top governance layer.
  • Example: An institution with a Working Group member who sits on or regularly presents to the board of governors would answer "yes."

🧩 Context within the assessment framework

🧩 The "Institutional Action" category

The excerpt groups questions 9–18 under "Institutional Action." Within this category:

| Question | Focus |
| --- | --- |
| 9 | Does an OE Working Group exist? |
| 10 | Does it include a senior leader who can advocate at the VP level and higher? |
| 11 | Does it include a member who can advocate at the board of governors? |
| 12 | Does it include students? |
| 13 | Does it work closely with students? |
| 14–18 | Staffing, grants, adoption, adaptation, and research |

  • Questions 10 and 11 both ask about advocacy at different governance tiers.
  • Don't confuse: VP-level (question 10) and board-level (question 11) are separate; an institution might have one but not the other.

📋 The broader 20-question tool

The excerpt shows a diagnostic instrument with five categories:

  1. Institutional Values (questions 1–2): strategic alignment and perception of innovation.
  2. Institutional Knowledge (questions 3–5): faculty awareness and professional development.
  3. Institutional Support (questions 6–8): senior leader endorsement, champions, and bookstore support.
  4. Institutional Action (questions 9–18): Working Group composition, staffing, grants, adoption, adaptation, and research.
  5. Institutional Policy (questions 19–20): integration into course approval and mandate letters.
  • Question 11 is one of ten "action" indicators.
  • The tool is designed for self-assessment; the excerpt includes a "Recorded Response" field for each question.

🎯 How to use this question

🎯 What a "yes" answer indicates

  • The OE Working Group has secured a channel to the board of governors.
  • This suggests institutional commitment at the governance level, not just operational or middle-management support.
  • Example: A faculty member on the Working Group who also serves on the board, or a Working Group liaison invited to present OE updates to the board, would satisfy this criterion.

🎯 What a "no" answer indicates

  • The Working Group may still exist and function, but lacks direct board-level advocacy.
  • This does not mean the institution is unsupportive; it may indicate that OE has not yet reached the highest governance tier.
  • Don't confuse: a "no" here does not invalidate a "yes" to question 10 (VP-level advocacy)—the two levels are independent.

🎯 Completing the inventory

  • The excerpt notes "Inventory completed by:" with a response field, indicating this is a self-assessment tool.
  • The tool is attributed to BCcampus.ca.
  • Institutions can use the 20 questions to benchmark their Open Education maturity and identify gaps.

12. Does the OE Working Group at this institution include students?

🧭 Overview

🧠 One-sentence thesis

This question assesses whether students are formally included as members of an institution's Open Education Working Group, which is one indicator of institutional action toward Open Education.

📌 Key points (3–5)

  • What it asks: whether students participate directly in the OE Working Group at the institution.
  • Where it fits: part of the "Institutional Action" category, alongside questions about senior leadership, board advocacy, and staff support.
  • Related question: Question 13 asks if the Working Group "works closely with students," which is distinct from having students as members.
  • Common confusion: working with students (Q13) vs. including students as members (Q12)—the first is about collaboration, the second is about formal membership.
  • Why it matters: student membership signals that the institution values student voice in Open Education decision-making and governance.

📋 Context: the 20 Questions framework

📋 Purpose of the framework

  • The excerpt presents a structured assessment tool: "20 Questions To Ask About Open Education."
  • Questions are grouped into five categories:
    • Institutional Values (Q1–2)
    • Institutional Knowledge (Q3–5)
    • Institutional Support (Q6–8)
    • Institutional Action (Q9–18)
    • Institutional Policy (Q19–20)
  • Each question is designed to evaluate a specific dimension of an institution's engagement with Open Education.

🎯 What "Open Education" means in this context

  • The excerpt uses "OE" as shorthand for Open Education.
  • Open Education encompasses multiple types (Q4 mentions "multiple types of OE").
  • Activities include adoption, adaptation, creation, contribution, and research (Q16–18).
  • The framework does not define Open Education in detail; it assumes familiarity with the concept.

🏛️ Institutional Action category

🏛️ What this category measures

  • Questions 9–18 focus on concrete actions the institution has taken, not just values or awareness.
  • Key themes:
    • Existence of an OE Working Group (Q9)
    • Composition of the Working Group (Q10–12)
    • Working Group's relationship with students (Q13)
    • Staffing and resources (Q14–15)
    • Faculty and staff engagement (Q16–18)

🧑‍🤝‍🧑 Working Group composition questions

The excerpt asks three questions about who is in the Working Group:

| Question | Focus | Why it matters |
| --- | --- | --- |
| Q10 | Senior leader who can advocate at VP level and higher | Ensures high-level institutional support and decision-making power |
| Q11 | Member who can advocate at the board of governors | Connects OE to governance and strategic oversight |
| Q12 | Students | Brings student perspective and voice into OE planning |

  • These questions assess whether the Working Group has the right mix of influence and representation.
  • Example: A Working Group with only mid-level staff may lack the authority to implement policy changes; one without students may overlook learner needs.

👥 Question 12: student inclusion

👥 What the question asks

"Does the OE Working Group at this institution include students?"

  • The question is binary: yes or no.
  • It asks about formal membership, not informal consultation.
  • The excerpt provides a "Recorded Response" field, indicating this is part of a survey or self-assessment tool.

🔍 Why student membership matters

  • Students are the primary beneficiaries of Open Education (e.g., through reduced textbook costs, improved access to materials).
  • Including students in the Working Group ensures their needs and perspectives shape OE strategy.
  • It signals that the institution values participatory governance and student agency.
  • Example: A Working Group designing an OE grant program might prioritize different criteria if students are present to advocate for affordability and accessibility.

⚖️ Distinguishing Q12 from Q13

  • Q12: "Does the OE Working Group at this institution include students?"
    • Asks if students are members of the Working Group.
  • Q13: "Does the OE Working Group at this institution work closely with students?"
    • Asks if the Working Group collaborates with students, even if students are not formal members.
  • Don't confuse: an institution might answer "no" to Q12 but "yes" to Q13 if the Working Group consults students regularly without giving them membership.
  • Example: A Working Group might hold focus groups with students (Q13) but not seat a student representative on the committee (Q12).

🔗 Relationship to other questions

🔗 Upstream questions

  • Q9: "Does this institution have an OE Working Group?"
    • Q12 only applies if the answer to Q9 is "yes."
    • If there is no Working Group, Q12 is not applicable.

🔗 Downstream implications

  • If students are included (Q12 = yes) and the Working Group works closely with students (Q13 = yes), the institution demonstrates strong student engagement in OE.
  • If Q12 = no but Q13 = yes, the institution consults students but does not grant them decision-making power.
  • If both Q12 and Q13 = no, student voice is likely absent from OE planning.
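The dependency and interpretation logic above (Q12 and Q13 only apply when Q9 is "yes", and the two answers combine into four distinct engagement profiles) can be sketched as a small Python function. The function name and the summary strings are illustrative labels for this sketch, not part of the BCcampus tool.

```python
def interpret_student_engagement(q9: bool, q12: bool, q13: bool) -> str:
    """Summarize student engagement in the OE Working Group.

    q9  -- does an OE Working Group exist at the institution?
    q12 -- are students formal members of the Working Group?
    q13 -- does the Working Group work closely with students?
    """
    if not q9:
        # Q12 and Q13 are upstream-dependent: without a Working Group,
        # neither question applies.
        return "not applicable (no Working Group)"
    if q12 and q13:
        return "strong student engagement"
    if not q12 and q13:
        # Consultation without formal decision-making power,
        # e.g. focus groups but no seat on the committee.
        return "consultation without membership"
    if q12 and not q13:
        return "nominal membership only"
    return "student voice likely absent"
```

For example, an institution that holds student focus groups but seats no student representative would record Q9 = yes, Q12 = no, Q13 = yes, yielding the "consultation without membership" profile.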

📝 How to use this question

📝 Self-assessment

  • The excerpt is designed for institutions to assess their own OE maturity.
  • The "Recorded Response" field suggests institutions should document their answer and any supporting evidence.
  • Example: An institution might note, "Two undergraduate students and one graduate student serve on the OE Working Group, appointed annually by the student government."

📝 Benchmarking and improvement

  • Answering "no" to Q12 does not mean the institution has failed; it identifies an area for growth.
  • Institutions can use the 20 Questions to set goals: "We will add student members to the Working Group by next academic year."
  • The framework encourages holistic assessment: strong performance in one category (e.g., Institutional Support) may compensate for gaps in another (e.g., Institutional Action).
13

Does the OE Working Group at This Institution Work Closely with Students?

13. Does the OE Working Group at this institution work closely with students?

🧭 Overview

🧠 One-sentence thesis

This question assesses whether an institution's Open Education Working Group actively collaborates with students, building on the foundational question of whether students are formally included as members.

📌 Key points (3–5)

  • What the question targets: the depth of student involvement in the OE Working Group, beyond mere membership.
  • Where it fits: this is question 13 in a 20-question institutional assessment, part of the "Institutional Action" category.
  • Sequential logic: question 12 asks if students are included in the group; question 13 asks if the group works closely with them.
  • Common confusion: including students as members (Q12) vs. actively collaborating with students (Q13)—membership does not guarantee close working relationships.
  • Context: the assessment covers institutional values, knowledge, support, action, and policy related to Open Education.

🔍 The question's position and purpose

🔍 Part of a structured assessment

  • The question appears in a 20-question framework designed to evaluate an institution's Open Education readiness and activity.
  • It is organized into five categories: Institutional Values, Institutional Knowledge, Institutional Support, Institutional Action, and Institutional Policy.
  • Question 13 falls under "Institutional Action (continued)," indicating it measures concrete activities rather than abstract commitments.

🎯 What "working closely" probes

  • The question moves beyond formal membership (covered in question 12) to examine the quality and depth of student engagement.
  • It asks whether the OE Working Group has an active, collaborative relationship with students.
  • Example: An institution might have a student representative on the Working Group (satisfying Q12) but rarely consult them or involve them in decisions (failing Q13).

🧩 Relationship to surrounding questions

🧩 The OE Working Group cluster (Q9–Q13)

The excerpt presents a sequence of questions about the Working Group's composition and function:

| Question | Focus |
| --- | --- |
| Q9 | Does the institution have an OE Working Group at all? |
| Q10 | Does it include a senior leader who can advocate at VP level and higher? |
| Q11 | Does it include a member who can advocate at the board of governors? |
| Q12 | Does it include students? |
| Q13 | Does it work closely with students? |

  • Questions 10–11 assess high-level advocacy capacity.
  • Questions 12–13 assess student involvement, first structurally (membership), then functionally (collaboration).

🔄 Distinguishing inclusion from collaboration

  • Don't confuse: having students in the group (Q12) with working closely with students (Q13).
  • Membership is a structural feature; close collaboration is a behavioral and relational feature.
  • An institution could answer "yes" to Q12 but "no" to Q13 if students are nominally included but not actively engaged in planning, decision-making, or implementation.

📋 Broader assessment context

📋 Institutional Action category

Question 13 is part of a larger "Institutional Action" section (Q9–Q18) that examines:

  • Whether an OE Working Group exists and who participates.
  • Whether dedicated staff support OE (Q14).
  • Whether the institution has an OE grant program (Q15).
  • Whether faculty have adopted, adapted, created, or researched OE (Q16–Q18).

📋 The full assessment framework

The 20 questions span:

  • Institutional Values (Q1–Q2): strategic alignment and perception of innovation.
  • Institutional Knowledge (Q3–Q5): awareness and professional development.
  • Institutional Support (Q6–Q8): leadership endorsement and champions.
  • Institutional Action (Q9–Q18): working groups, staffing, grants, faculty activity, and research.
  • Institutional Policy (Q19–Q20): integration into processes and mandates.

🎓 Student-centered assessment

  • Question 13 reflects a student-centered approach to Open Education governance.
  • It recognizes that students are stakeholders in OE initiatives (e.g., beneficiaries of open textbooks and resources) and should be active participants, not passive recipients.
  • Example: A Working Group that works closely with students might co-design OE adoption strategies, gather student feedback on open resources, or involve students in awareness campaigns.

📝 Recording and completion

📝 Response format

  • The excerpt includes a "Recorded Response:" field for each question, indicating that institutions are expected to document their answers.
  • There is also an "Other Notes/Comments" section and a line for "Inventory completed by:" to track who conducted the assessment.

📝 Source attribution

  • The assessment is attributed to BCcampus.ca, suggesting it is a tool developed by BCcampus (an organization supporting open education in British Columbia, Canada).
  • The document title is "20 Questions To Ask About Open Education."
14

Is there someone on staff (.5 or more) at this institution that can assist with OE?

14. Is there someone on staff (.5 or more) at this institution that can assist with OE?

🧭 Overview

🧠 One-sentence thesis

Having at least half-time dedicated staff support for Open Education (OE) is one indicator of institutional action and capacity to implement OE initiatives.

📌 Key points (3–5)

  • What the question measures: whether the institution has allocated at least 0.5 FTE (half-time equivalent) staff to support OE work.
  • Where it fits: this is question 14 in a 20-question assessment tool, grouped under "Institutional Action."
  • Why staffing matters: dedicated staff capacity signals that the institution is moving beyond planning or awareness into operational support.
  • Common confusion: this question is about staff support (administrative/professional roles), not faculty adoption or champions—those are covered in separate questions.
  • Context: the question is part of a broader framework assessing institutional values, knowledge, support, action, and policy around OE.

🏢 What the question asks

🏢 Staffing threshold

The question asks: "Is there someone on staff (.5 or more) at this institution that can assist with OE?"

  • ".5 or more" means at least half-time equivalent (50% of a full-time position).
  • The role is to "assist with OE"—supporting faculty, coordinating initiatives, or managing programs.
  • This is a yes/no question with a recorded response field.

🔍 Staff vs other roles

  • The question specifies staff, not faculty or senior leaders.
  • Don't confuse with:
    • Question 7 (vocal OE champion—can be anyone).
    • Questions 16–18 (faculty who adopt, adapt, create, or research OE).
    • Questions 10–11 (senior leaders or board advocates in the Working Group).
  • Example: An institution might have a faculty champion (Q7) but no dedicated staff support (Q14), or vice versa.

🎯 Why this question matters

🎯 Institutional action category

  • This question is grouped under Institutional Action (questions 9–18).
  • Other action indicators include:
    • Having an OE Working Group (Q9–13).
    • Running an OE grant program (Q15).
    • Faculty adoption, adaptation, creation, and research (Q16–18).
  • Dedicated staff capacity is a concrete resource commitment, not just a statement of intent.

⚙️ Operational capacity

  • A half-time or full-time staff member can:
    • Coordinate professional development (related to Q5).
    • Support faculty who want to adopt or adapt OE (related to Q16–17).
    • Manage grant programs (related to Q15).
    • Work with the OE Working Group (related to Q9–13).
  • Without staff support, OE initiatives often rely on volunteer effort from faculty or champions, which may not be sustainable.

📋 Context within the 20-question framework

📋 Five assessment categories

The 20 questions are organized into five themes:

| Category | Questions | Focus |
| --- | --- | --- |
| Institutional Values | 1–2 | Strategic alignment and innovation perception |
| Institutional Knowledge | 3–5 | Awareness and professional development |
| Institutional Support | 6–8 | Leadership endorsement and champions |
| Institutional Action | 9–18 | Working groups, staffing, grants, faculty activity, research |
| Institutional Policy | 19–20 | Integration into processes and mandates |

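The category-to-question mapping above can be encoded as a small lookup, which makes it easy to tally "yes" answers per category once the inventory has been filled in. The category names and question ranges come from the tool itself; the dictionary layout and the tally helper are assumptions of this sketch, not features of the BCcampus inventory.

```python
# The five categories of the 20 Questions inventory, keyed to their
# question-number ranges (ranges are half-open, so range(1, 3) = Q1-Q2).
CATEGORIES = {
    "Institutional Values": range(1, 3),       # Q1-Q2
    "Institutional Knowledge": range(3, 6),    # Q3-Q5
    "Institutional Support": range(6, 9),      # Q6-Q8
    "Institutional Action": range(9, 19),      # Q9-Q18
    "Institutional Policy": range(19, 21),     # Q19-Q20
}

def tally_by_category(responses: dict[int, bool]) -> dict[str, int]:
    """Count 'yes' answers per category from a {question_number: bool} map."""
    return {
        name: sum(1 for q in questions if responses.get(q, False))
        for name, questions in CATEGORIES.items()
    }
```

A tally like this makes the weighting of the tool visible at a glance: an institution answering "yes" only to the Action questions scores 10 in that category and 0 everywhere else, reflecting that half of the inventory measures concrete action.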
  • Question 14 sits in the middle of the Institutional Action block.
  • It bridges between governance structures (Q9–13) and faculty-level activity (Q16–18).

🧩 How Q14 connects to other questions

  • Q5 (professional development opportunities): staff can organize or deliver these.
  • Q7 (vocal champion): a champion may advocate, but staff provide ongoing operational support.
  • Q9–13 (OE Working Group): staff can coordinate the group or serve as members.
  • Q15 (grant program): staff often administer grant programs.
  • Q16–18 (faculty adoption/adaptation/research): staff can provide technical or logistical assistance.
  • Example: An institution with a Working Group (Q9) and a grant program (Q15) but no dedicated staff (Q14) may struggle to sustain both.

🛠️ How to use this question

🛠️ Recording the response

  • The tool provides a "Recorded Response" field for each question.
  • Respondents should answer yes or no based on current staffing.
  • The tool is designed for institutional self-assessment or inventory, completed by a named individual.

📊 Interpreting the answer

  • Yes: the institution has allocated at least half-time staff capacity to OE, indicating operational commitment.
  • No: OE work may rely on volunteers, part-time efforts, or no dedicated support, which may limit scale and sustainability.
  • Don't confuse: a "no" does not mean the institution has no OE activity—it may still have champions, faculty adopters, or a Working Group (other questions capture those).

🔄 Relationship to institutional maturity

  • Institutions early in OE adoption may answer "no" to Q14 but "yes" to Q3–5 (awareness and professional development).
  • Institutions with mature OE programs are more likely to answer "yes" to Q14 along with Q9–18 (action indicators).
  • Example: An institution might first build awareness (Q3–4), then appoint a champion (Q7), then allocate staff time (Q14), then launch a grant program (Q15).
15

Does this institution have an OE grant program?

15. Does this institution have an OE grant program?

🧭 Overview

🧠 One-sentence thesis

This question assesses whether an institution provides financial support for Open Education through a dedicated grant program, which is one indicator of institutional action toward OE adoption.

📌 Key points (3–5)

  • What it asks: whether the institution has established a grant program specifically for Open Education.
  • Where it fits: this is question 15 out of 20, positioned within the "Institutional Action" category.
  • Context of assessment: the question is part of a broader institutional readiness inventory covering values, knowledge, support, action, and policy.
  • Common confusion: this question focuses on financial mechanisms (grants), not just general support or champions—distinguish from questions about vocal champions (Q7) or staff assistance (Q14).
  • Why it matters: grant programs signal tangible institutional investment and create pathways for faculty to engage with OE.

🏛️ The institutional readiness framework

🏛️ Five assessment categories

The excerpt presents a 20-question inventory organized into five themes:

| Category | Focus | Example questions |
| --- | --- | --- |
| Institutional Values | Strategic alignment and innovation perception | Q1: OE in strategic plan; Q2: OE considered innovative |
| Institutional Knowledge | Awareness and understanding among faculty | Q3: Half of faculty aware; Q4: Quarter can identify types |
| Institutional Support | Leadership endorsement and champions | Q6: Senior leader support; Q7: Vocal champion; Q8: Bookstore support |
| Institutional Action | Structures, resources, and activities | Q9–18: Working groups, staff, grants, adoption, research |
| Institutional Policy | Formal integration into processes | Q19: Part of course approval; Q20: In mandate letter |

🎯 Purpose of the inventory

  • The tool is designed to evaluate an institution's readiness and commitment to Open Education.
  • It covers both soft indicators (awareness, champions) and hard indicators (staff positions, grants, policies).
  • The inventory is provided by BCcampus.ca for institutional self-assessment.

💰 The grant program question (Q15)

💰 What the question measures

"Does this institution have an OE grant program?"

  • This is a yes/no question about the existence of a dedicated funding mechanism.
  • It does not specify grant size, frequency, or eligibility—only whether such a program exists.
  • The question appears in the "Institutional Action (continued)" section, indicating it measures concrete steps beyond planning or awareness.

🔗 Relationship to other action questions

The grant program question sits among other action-oriented questions:

  • Q9–Q13: Focus on governance structures (OE Working Group composition and student involvement).
  • Q14: Asks about staffing (at least 0.5 FTE to assist with OE).
  • Q15: Asks about financial support (grant program).
  • Q16–Q18: Ask about faculty/staff engagement (adoption, adaptation/creation, research).

Don't confuse:

  • Q14 (staff assistance) provides human resources for OE support.
  • Q15 (grant program) provides financial resources for OE projects.
  • Both are action indicators, but they measure different types of institutional investment.

🎓 Why grants matter for OE action

  • Grant programs create incentives for faculty to adopt, adapt, or create Open Education resources.
  • They signal that the institution is willing to allocate budget, not just express support.
  • Example: An institution might have a vocal champion (Q7) and a working group (Q9), but without grants (Q15) or staff (Q14), faculty may lack practical support to engage with OE.

🔄 Progression from support to action

🔄 The assessment logic

The inventory follows a logical progression:

  1. Values (Q1–Q2): Does OE align with institutional priorities?
  2. Knowledge (Q3–Q5): Do people know what OE is?
  3. Support (Q6–Q8): Do leaders and key units endorse it?
  4. Action (Q9–Q18): Are there structures, resources, and activities in place?
  5. Policy (Q19–Q20): Is OE embedded in formal processes?

📍 Where Q15 fits

  • Q15 appears in the middle of the "Action" section, after questions about working groups and before questions about faculty engagement.
  • This placement suggests that grant programs are a bridge: they translate institutional support (earlier questions) into faculty activity (later questions).
  • Example: An institution with a grant program (Q15) is more likely to answer "yes" to Q16 (faculty adoption) and Q17 (faculty adaptation/creation).

⚠️ What the question does not ask

  • It does not ask about the amount of funding.
  • It does not ask whether grants have been awarded or used.
  • It does not ask about other types of financial support (e.g., course release time, stipends).
  • The question is binary: the program either exists or it does not.

📋 Using the inventory

📋 Recording responses

  • The excerpt includes "Recorded Response:" fields for each question.
  • The inventory is designed to be completed by someone at the institution (see "Inventory completed by:" at the end).
  • Responses provide a snapshot of institutional readiness across all five categories.

📋 Interpreting results

  • A "yes" to Q15 indicates the institution has moved beyond planning and support into tangible resource allocation.
  • A "no" does not mean the institution is unsupportive—it may rely on other mechanisms (e.g., staff support in Q14, or direct budget allocations not structured as grants).
  • The full picture emerges from all 20 questions, not any single item.
16

Have one or more faculty at this institution adopted OE?

16. Have one or more faculty at this institution adopted OE?

🧭 Overview

🧠 One-sentence thesis

This question assesses whether an institution has moved beyond planning and support structures to actual faculty adoption of Open Education materials in their teaching.

📌 Key points (3–5)

  • What the question measures: whether at least one faculty member has begun using OE resources in practice.
  • Where it sits in the framework: part of "Institutional Action," indicating a shift from planning to implementation.
  • How it differs from related questions: adoption means using existing OE, while adaptation/creation (question 17) means modifying or producing OE.
  • Common confusion: adoption vs adaptation—adoption is simply using OE materials as they are; adaptation involves changing them or creating new ones.
  • Why it matters: faculty adoption is a concrete indicator that OE has moved from institutional discussion to classroom reality.

📋 The question in context

📋 Part of the institutional action category

  • This question appears as question 16 in a 20-question assessment framework.
  • It belongs to the "Institutional Action" section, which runs from question 9 through question 18.
  • The framework evaluates institutions across five dimensions: Institutional Values, Institutional Knowledge, Institutional Support, Institutional Action, and Institutional Policy.

🎯 What "adopted OE" means

Adoption: one or more faculty at the institution have begun using Open Education resources.

  • The question asks for a yes/no response: have any faculty (even just one) adopted OE?
  • It does not specify scale—one faculty member counts as a "yes."
  • Example: A single instructor switches from a traditional textbook to an openly licensed textbook → the institution can answer "yes."

🔄 How this question relates to others

🔄 Progression from support to action

The framework shows a logical sequence:

  1. Earlier questions (6–8) ask about institutional support and champions.
  2. Middle questions (9–15) ask about working groups, staffing, and grant programs.
  3. This question (16) asks whether faculty have actually adopted OE.
  4. Next questions (17–18) ask about deeper engagement: adaptation, creation, and research.

🆚 Adoption vs adaptation vs creation

| Question | What it measures | Level of engagement |
| --- | --- | --- |
| 16. Adopted | Using existing OE materials | Basic implementation |
| 17. Adapted or created | Modifying or producing OE | Active contribution |
| 18. Research | Studying OE | Scholarly investigation |

  • Don't confuse: adoption is the entry point—faculty use what already exists; adaptation and creation require more effort and expertise.

🧩 Why this question matters

🧩 Indicator of real implementation

  • Institutional support structures (working groups, grants, champions) are necessary but not sufficient.
  • Faculty adoption shows that OE has reached the classroom, not just administrative planning.
  • Example: An institution may have an OE working group and a grant program, but if no faculty have adopted OE, the infrastructure has not yet translated into practice.

🧩 Foundation for deeper engagement

  • Adoption is typically the first step before faculty move to adaptation or creation.
  • If the answer to question 16 is "no," questions 17 and 18 (about adaptation, creation, and research) are likely also "no."
  • The question helps institutions identify whether they need to focus on encouraging initial adoption before expecting more advanced OE activities.
17

Have one or more faculty at this institution adapted or created or contributed to OE?

17. Have one or more faculty at this institution adapted or created or contributed to OE?

🧭 Overview

🧠 One-sentence thesis

This question assesses whether faculty at an institution have moved beyond simply adopting existing Open Education resources to actively adapting, creating, or contributing to them.

📌 Key points (3–5)

  • What this question measures: faculty engagement at the creation/contribution level, not just adoption.
  • Where it sits in the framework: part of "Institutional Action" questions (questions 9–18), specifically following the adoption question.
  • How it differs from adoption: adoption means using existing OE; adaptation/creation/contribution means modifying or producing new OE materials.
  • Common confusion: this is question 17, distinct from question 16 (adoption) and question 18 (research); each measures a different level of faculty involvement.
  • Why it matters: faculty who adapt or create OE demonstrate deeper institutional engagement and capacity-building in open education.

📋 Context within the assessment framework

📋 The "20 Questions" structure

The excerpt presents a diagnostic tool organized into five categories:

  • Institutional Values (questions 1–2)
  • Institutional Knowledge (questions 3–5)
  • Institutional Support (questions 6–8)
  • Institutional Action (questions 9–18)
  • Institutional Policy (questions 19–20)

🎯 Position in "Institutional Action"

Question 17 appears in the second half of the Institutional Action section, which covers:

  • Working groups and governance (questions 9–13)
  • Staffing and resources (questions 14–15)
  • Faculty engagement levels (questions 16–18)

🔄 Three levels of faculty engagement

🔄 The progression from adoption to contribution

The excerpt distinguishes three separate questions about faculty involvement:

| Question | Focus | Level of engagement |
| --- | --- | --- |
| 16 | Adoption | Using existing OE materials |
| 17 | Adaptation/creation/contribution | Modifying or producing OE materials |
| 18 | Research | Studying OE as a scholarly topic |

✏️ What "adapted or created or contributed" means

Question 17: "Have one or more faculty at this institution adapted or created or contributed to OE?"

This question asks whether at least one faculty member has done any of the following:

  • Adapted: modified existing Open Education resources
  • Created: produced new Open Education materials from scratch
  • Contributed: added to or improved existing OE projects

Don't confuse: this is not about whether faculty use OE (that's question 16); it's about whether they make or improve OE.

🎓 Why this question matters for institutional assessment

🎓 Measuring active participation

  • Adoption (question 16) shows awareness and willingness to use OE.
  • Adaptation/creation/contribution (question 17) shows deeper investment: faculty are spending time and effort to build OE capacity.
  • Example: An institution where faculty only adopt OE has less internal capacity than one where faculty also create and share OE materials.

🌱 Building institutional capacity

  • Faculty who adapt or create OE develop expertise that can be shared with colleagues.
  • This level of engagement suggests the institution is not just consuming OE but contributing to the broader OE ecosystem.
  • The question uses "one or more" as the threshold, indicating that even a small number of active contributors signals meaningful institutional action.

🔗 Relationship to other institutional elements

🔗 Supporting infrastructure

Earlier questions in the Institutional Action section set the stage:

  • Question 14 asks about dedicated staff support (0.5 FTE or more) for OE.
  • Question 15 asks about an OE grant program.
  • These resources can enable faculty to move from adoption to creation/contribution.

🔗 Complementary measures

  • Question 17 focuses on production of OE materials.
  • Question 18 (research) focuses on studying OE as a phenomenon.
  • Together, questions 16–18 provide a comprehensive picture of faculty engagement across use, creation, and scholarship.

Don't confuse: a faculty member can do any combination of these—adopting without creating, creating without researching, or all three.

18

Have one or more faculty or staff at this institution conducted research in OE?

18. Have one or more faculty or staff at this institution conducted research in OE?

🧭 Overview

🧠 One-sentence thesis

This question assesses whether the institution has moved beyond adoption and adaptation of Open Education to actively generating new knowledge through research.

📌 Key points (3–5)

  • What the question targets: whether faculty or staff have conducted research specifically about Open Education.
  • Position in the framework: appears as question 18 under "Institutional Action (continued)", following questions about adoption and adaptation.
  • Scope of personnel: includes both faculty and staff, not just teaching faculty.
  • Common confusion: conducting research in OE is different from adopting OE materials (question 16) or adapting/creating OE resources (question 17).
  • Context within the 20 questions: part of a broader institutional assessment covering values, knowledge, support, action, and policy.

🔬 What this question measures

🔬 Research activity in Open Education

Research in OE: faculty or staff conducting scholarly inquiry into Open Education itself.

  • The question asks whether "one or more" individuals have done this work—a threshold indicator, not a count.
  • It sits in the "Institutional Action" category, suggesting research is a form of institutional engagement beyond passive use.
  • Example: A faculty member studying the impact of open textbooks on student outcomes would satisfy this criterion; a faculty member simply using an open textbook would not.

👥 Who counts

  • The question explicitly includes faculty or staff, broadening beyond instructors.
  • Staff might include instructional designers, librarians, or educational technologists who research OE practices.
  • This wider scope recognizes that OE research can come from multiple institutional roles.

🔄 How this differs from related questions

📚 Adoption vs adaptation vs research

The excerpt presents a progression of engagement levels:

| Question | Activity | Level of engagement |
| --- | --- | --- |
| 16 | Adopted OE | Using existing open resources |
| 17 | Adapted or created or contributed to OE | Modifying or producing open resources |
| 18 | Conducted research in OE | Studying Open Education as a subject |

  • Don't confuse: Creating an open textbook (question 17) is not the same as researching the effects or practices of Open Education (question 18).
  • Research in OE means OE is the object of study, not just the medium.

🏛️ Position in institutional maturity

  • This question appears late in the "Institutional Action" section, after questions about working groups, grants, and resource creation.
  • The placement suggests research represents a more advanced stage of institutional commitment.
  • An institution might have faculty adopting OE (question 16) without yet having anyone studying OE systematically (question 18).

📋 Context within the assessment tool

📋 The 20 Questions framework

The excerpt shows this is question 18 in a structured assessment covering five domains:

  1. Institutional Values (questions 1–2): strategic alignment and innovation perception
  2. Institutional Knowledge (questions 3–5): awareness and professional development
  3. Institutional Support (questions 6–8): leadership backing and champions
  4. Institutional Action (questions 9–18): working groups, staffing, grants, adoption, adaptation, and research
  5. Institutional Policy (questions 19–20): integration into processes and mandates
  • Question 18 is the final item under "Institutional Action," suggesting research is a culminating indicator of active engagement.
  • Each question expects a "Recorded Response," indicating this is a self-assessment or audit tool.

🎯 Purpose of the tool

  • The document is titled "20 Questions To Ask About Open Education" and is produced by BCcampus.ca.
  • It provides a systematic way for institutions to evaluate their Open Education maturity across multiple dimensions.
  • The questions build from awareness and values through to concrete actions and formal policies.
19

Is OE part of the instructional design / course approval process at this institution?

19. Is OE part of the instructional design / course approval process at this institution?

🧭 Overview

🧠 One-sentence thesis

This question assesses whether Open Education has been formally integrated into the institution's instructional design and course approval workflows, indicating a policy-level commitment beyond voluntary adoption.

📌 Key points (3–5)

  • What the question targets: whether OE is embedded in formal institutional processes (instructional design and course approval).
  • Where it sits in the framework: part of the "Institutional Policy" category, the final dimension after values, knowledge, support, and action.
  • Why it matters: inclusion in these processes signals that OE is not optional or ad hoc but a standard consideration in course development.
  • Common confusion: this is distinct from faculty choosing to use OE (questions 16–17); this asks whether the institution requires or prompts OE consideration during course approval.
  • Context: one of two policy-level questions in a 20-question institutional OE readiness inventory.

📋 The question in context

📋 Part of a broader assessment tool

  • The excerpt presents a 20-question inventory titled "20 Questions To Ask About Open Education."
  • Questions are grouped into five categories:
    1. Institutional Values (questions 1–2)
    2. Institutional Knowledge (questions 3–5)
    3. Institutional Support (questions 6–8)
    4. Institutional Action (questions 9–18)
    5. Institutional Policy (questions 19–20)
  • Question 19 is the first of two policy questions; the second (question 20) asks whether OE is part of the institution's mandate letter.

🎯 What "Institutional Policy" measures

  • This category examines whether OE has been formalized in institutional governance and procedures.
  • Policy-level integration is the deepest form of commitment: it moves OE from individual initiative to institutional standard.
  • Example: An institution might have champions and grants (action) but still lack formal policy requiring OE consideration in course design.

🔍 What the question asks

🔍 Two linked processes

The question targets two specific institutional workflows:

| Process | What it means |
| --- | --- |
| Instructional design | The process of planning, developing, and structuring courses and learning materials |
| Course approval | The formal review and authorization process before a course can be offered |

  • The question asks whether OE is "part of" these processes—i.e., whether OE is a required consideration, prompt, or criterion.
  • It does not specify how OE must be integrated (e.g., mandatory vs. encouraged, checklist item vs. full review).

🧩 What "part of" implies

  • Not asking: whether individual instructors use OE in their courses (that is covered in questions 16–17).
  • Asking: whether the institution's formal procedures include OE as a standard element.
  • Example: A course approval form might include a field asking "Have you considered open educational resources?" or instructional designers might be required to discuss OE options with faculty.
  • Don't confuse: faculty adoption (voluntary, bottom-up) vs. process integration (required, top-down policy).

🏛️ Why this question matters

🏛️ Signal of institutional commitment

  • Embedding OE in instructional design and course approval processes indicates that the institution treats OE as a standard practice, not an optional innovation.
  • It ensures that every new or revised course at least considers OE, even if the final decision is not to use it.
  • This is a stronger commitment than having champions, grants, or working groups (all covered in earlier questions).

🔄 Relationship to other questions

  • Builds on earlier dimensions: policy integration typically follows after values alignment (questions 1–2), knowledge building (questions 3–5), leadership support (questions 6–8), and concrete actions (questions 9–18).
  • Complements question 20: the mandate letter question asks about high-level institutional direction; question 19 asks about operational implementation.
  • Example: An institution might have OE in its mandate letter (strategic commitment) but not yet in course approval processes (operational gap).

⚠️ What a "no" answer reveals

  • If OE is not part of these processes, it suggests:
    • OE adoption depends entirely on individual faculty initiative.
    • There is no systematic prompt or support for OE consideration during course development.
    • The institution has not yet moved OE from project-level activity to standard practice.
  • This does not mean the institution opposes OE; it means OE has not reached the policy-integration stage.

📝 How to use this question

📝 Recording the response

  • The excerpt includes a "Recorded Response:" field for each question.
  • Responses are likely yes/no or a brief description of how OE is integrated (the excerpt does not specify the response format).
  • The inventory is completed by a designated person or team (field: "Inventory completed by:").

📝 Interpreting results across the inventory

  • This question is most meaningful when viewed alongside the other 19 questions.
  • A "yes" here combined with "yes" answers in earlier categories (values, knowledge, support, action) suggests a mature, institution-wide OE initiative.
  • A "yes" here but "no" answers in earlier categories would be unusual—policy integration typically requires prior groundwork.
  • A "no" here despite "yes" answers in action questions (e.g., grants, working groups) suggests the institution is active in OE but has not yet formalized it in core processes.

🎓 Institutional self-assessment

  • The inventory is a self-assessment tool for institutions to gauge their OE readiness and identify gaps.
  • Question 19 helps institutions determine whether they have moved OE from initiative to infrastructure.
  • Example: An institution might discover it has strong faculty adoption (question 16) but no formal process integration (question 19), signaling an opportunity to embed OE more deeply.

20. Is OE part of this institution’s mandate letter?

🧭 Overview

🧠 One-sentence thesis

This question assesses whether Open Education has been formally embedded in an institution's mandate letter, representing the highest level of policy commitment to OE.

📌 Key points (3–5)

  • What the question targets: whether OE appears in the institution's mandate letter, a formal policy document.
  • Where it sits in the framework: question 20 falls under "Institutional Policy," the final category after values, knowledge, support, and action.
  • Common confusion: mandate letters vs. other policy documents: this question asks specifically about the mandate letter, not the strategic plan (question 1) or the course approval process (question 19).
  • Why it matters: inclusion in a mandate letter signals formal, top-level institutional commitment to Open Education.

📋 The question in context

📋 Part of a 20-question diagnostic tool

  • The excerpt presents a structured assessment framework titled "20 Questions To Ask About Open Education."
  • Questions are organized into five categories: Institutional Values, Institutional Knowledge, Institutional Support, Institutional Action, and Institutional Policy.
  • Question 20 is the final question and the second of two questions in the Institutional Policy category.

🏛️ What a mandate letter represents

  • The question does not define "mandate letter" explicitly, but it is positioned as a distinct policy instrument.
  • It is separate from the strategic plan (question 1) and the instructional design/course approval process (question 19).
  • Example: An institution might have OE language in its strategic plan but not yet in its mandate letter, indicating different levels of formalization.

🔍 Distinguishing policy-level questions

🔍 Three policy-related questions in the framework

| Question | Document/Process | What it checks |
| --- | --- | --- |
| Question 1 | Strategic plan | Language that can be tied to OE |
| Question 19 | Instructional design / course approval process | OE as part of operational procedures |
| Question 20 | Mandate letter | OE as part of the formal mandate |

🔍 Don't confuse with earlier questions

  • Question 1 asks about the strategic plan; question 20 asks about the mandate letter—these are different documents.
  • Questions 9–18 focus on actions (working groups, staff, grants, adoption, research); question 20 focuses on formal policy inclusion.
  • The progression suggests increasing formalization: from strategic language → operational processes → mandate-level commitment.

🎯 How to use this question

🎯 Recording the response

  • The excerpt provides a "Recorded Response:" field for each question.
  • Institutions are expected to answer yes/no or provide details about whether OE is part of their mandate letter.

🎯 Completing the inventory

  • The tool includes a field for "Inventory completed by:" at the end.
  • The framework is designed for self-assessment or external review of an institution's OE maturity.
  • Example: A reviewer would check the institution's mandate letter document to see if Open Education is explicitly mentioned or required.

📌 Source and purpose

📌 Origin of the tool

  • The document is attributed to BCcampus.ca, appearing at the bottom of both pages.
  • No other context about the tool's development or validation is provided in the excerpt.

📌 Other notes section

  • The framework includes an "Other Notes/Comments" field for additional observations not captured by the 20 questions.
  • This allows for qualitative context beyond yes/no responses.