Data insights
MYTH: Collecting as much data as possible and then carefully looking through it will produce the understanding you need to manage and grow your organization.
REALITY: Data insights come from clearly identifying what the questions are, what decisions need to be made and exactly what data these require. Qualitative and quantitative methods are used to acquire and analyze data, including surveys, focus groups, structured interviews and best practices research.
VALUE: Strategic planning, market analysis, performance indicators and success benchmarks
We begin with a realistic and thorough assessment of current capabilities and resources, placing your organization in the context of your peers to create a detailed understanding of how you can be more effective. We then evaluate your organization’s structure and governance model, and implement a plan for change management that takes into account the current state of the organization and supports significant change.
Sometimes the organizations we work with on effectiveness are large. We were asked by a state higher education agency to analyze statewide teacher preparation programs and determine whether they were producing the right number and kinds of qualified teachers. We developed a survey-based analysis of teacher hiring practices, conducted research on current and future supply and demand of teachers and determined alignment of the teacher preparation programs with best practices.
At other times the focus is on one key function in an organization. We evaluated a state higher education accountability system by developing and holding focus groups with all key stakeholders. Our analysis of the qualitative data from these meetings formed the basis of our recommendations for improving the effectiveness of this accountability system.
Building capacity in support functions is another aspect of organizational effectiveness. In working with an educational institution we identified key performance metrics for all of its non-instructional departments. A central feature of this project was the development of a reporting system for the metrics that has proven to be critical for the change management needed to make these departments more effective.
Survey research
MYTH: Surveys are easy to develop, can answer almost any question and always provide reliable results.
REALITY: Surveys require knowledge and skill to develop, typically address questions concerning attitudes and opinions and, because they rely upon self-reporting, can be unreliable. Understanding results therefore requires careful analysis and interpretation.
VALUE: Client and customer satisfaction, employee engagement and market opportunity assessment
We begin our survey work by spending time coming to understand what you need to know. For one client, we interviewed representatives of all the major stakeholder groups that the survey was intended for. We’ve also spent time researching issues and talking with other organizations that have carried out similar surveys.
From here, we develop and pilot your survey. Results from the pilot, in the form of respondent feedback as well as statistical analyses, help us refine survey items. Our goal is to ensure that you will be working with a valid and reliable survey instrument.
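As an illustration of the statistical side of a pilot, here is a minimal sketch, in Python, of one common item analysis: Cronbach's alpha, a standard estimate of a scale's internal consistency. The pilot responses below are made-up numbers for the example, not client data.

import numpy as np

def cronbach_alpha(responses):
    # Estimate internal consistency for a set of survey items.
    # responses: 2-D array, one row per respondent, one column per item.
    k = responses.shape[1]                         # number of items
    item_vars = responses.var(axis=0, ddof=1)      # variance of each item
    total_var = responses.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Pilot data: six respondents rating four items on a 1-5 scale (made-up numbers).
pilot = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 4, 3, 3],
])
print(f"Cronbach's alpha: {cronbach_alpha(pilot):.2f}")

Low values flag items that do not hang together as a scale; those are the items we revisit before fielding the full survey.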
Our survey services allow you to be as involved in the process as you would like. While one client hired us just to develop a valid and reliable instrument for their ongoing use, we are more typically engaged in every stage of the project, from development through printing, initial and reminder mailings, and data collection. And, of course, our team specializes in (and thrives on) survey data analysis and reporting.
Impact evaluation
MYTH: Evaluation is expensive and complicated, produces boring data and is designed to show program failure.
REALITY: Impact evaluation requires getting crystal clear about what question you are trying to answer, why you are asking it, and ownership of the answer by key stakeholders. With these in place, evaluation can be done efficiently and yield actionable insights.
VALUE: Data-informed adjustments to programs and services based on effectiveness and outcomes
Strategic planning ranges from long-term plans for an entire organization to more short-term or focused initiatives. Whatever the scope of the planning, we begin by working with the organization’s leadership to define a detailed planning process, including timelines and responsibilities for deliverables. We then collect relevant information about the organization, its peers and the external environment and use stakeholder input to identify strategic issues.
A long-term strategic plan requires significant time and effort, and a detailed process is needed to complete the plan successfully. As part of developing a yearlong planning process for a liberal arts college, we held a one-day workshop for the leadership of the college. The workshop covered key features of the entire planning process, methods for collecting and analyzing internal and external data, approaches to engaging stakeholders and details on how to develop strategic goals and action plans.
In our strategic planning work we are occasionally asked to help out in a more focused way with a strategic planning process already underway. This was the case when we were asked to develop a business plan for an educational nonprofit to support its efforts to raise funds for its national expansion.
Strategic plans are often at risk because insufficient attention is given to implementation. We have worked with organizations that needed additional support to be able to implement plans. Our work in this area has ranged from a US division of an international organization with a large, complex strategic plan to a small consortium of non-profits with a simple strategic plan and limited revenue.
Centralizing for organizational effectiveness
This client, a consortium of nine agencies, came to us wanting to develop effective and financially sustainable supporting operations for its members. The belief was that more efficient supporting operations would make these services financially sustainable and that, since there was significant duplication across current operations, sharing operational support at the consortium level would increase efficiency and, in turn, sustainability.
Our approach was to analyze operations at each of the nine agencies with the goal of classifying their processes into three types: those that must be performed at local agencies, those that could be regionally shared and, finally, those that could be centralized. By sharing or centralizing a large portion of supporting operations, duplication could be eliminated, allowing the consortium to deliver the same services with greater efficiency.
Strategic Plan Implementation
Even well-crafted strategic plans get ignored when there is no detailed implementation plan. This is because strategic plans often call for changes that require significant effort and resources. Organizations must answer some key questions to avoid this fate: Are we willing and able to expend the resources to implement the needed changes? Are we prepared to support the change management that is an inevitable by-product of implementing them? Are we sufficiently committed to our plan to follow through with its implementation?
These were some of the questions that arose for an organization that had grown through the combination of various smaller entities. The organization had gone through a planning process with outside support, and we were engaged to help them implement one of the key recommendations. Because of the way the organization had grown, they had multiple locations in various parts of the country, and the recommendation was that they choose a location for a national headquarters that would allow them to bring a significant portion of their team together in one place.
Our approach to identifying the headquarters location began by working with leadership to fully understand what they hoped to gain by bringing their team together in one place and, just as importantly, what they were putting at risk by making this sort of change. Based on this understanding of the hoped-for benefits and the associated risks, we developed an analytic approach that allowed the leadership to clearly evaluate a set of potential locations. Most important to this analysis were proximity to the communities they serve, access to the specialized talent the organization needs and a broad range of economic factors. The outcome was the selection of a location for the new headquarters that will allow them to serve their clients more effectively.
Planning and Market Analyses
What can you do when you don’t have the resources needed to develop a full strategic plan? Stepping back from the ongoing demands of running an organization to think carefully and critically about its most important goals and how to achieve them is often difficult to do. But that is just what strategy is, and unless time is made for it, the organization risks letting “business as usual” undermine its mission. If a full strategic plan is not called for, or if the resources for carrying one out are not available, the wisest course can be to focus on the strategy for a key part or critical function of the organization.
An important example of a critical function for many organizations is development and fundraising. This was the focus when we were asked to develop a business plan that would be used by an educational nonprofit to support its efforts to raise funds for its national expansion plan. By working collaboratively with leaders at the organization, we developed an approach and document that successfully communicated to external audiences the organization’s value. We drew special attention to its greater effectiveness when compared to more established competitors, its unique financial model and its sound and scalable operational plans and governance structure.
The outcome was a business plan that was instrumental in securing the funding needed to support the key first steps in the organization’s national expansion. Since then, the organization has continued to follow its growth strategy and is on target to reach its strategic growth goals.
Accountability and Performance Metrics
Developing performance metrics for accountability doesn’t have to be an added source of work. While they are sometimes seen as an imposition on already overburdened staff, potentially raising questions about the value of programs or departments, our approach to producing performance metrics for accountability alleviates these concerns. Stakeholder involvement is the key to our success.
We were approached by a large organization to develop a performance assessment system for accountability from the ground up. They wanted the organization’s overall mission reflected in the work of each department, with measurable inputs and outcomes identified in terms of that mission. Working with each of the twelve departments within this organization, we identified their goals and established how those goals aligned with the overall mission and how budgetary constraints shaped their activities. After analyzing the business lines and job activities of each department, we drafted metrics for use in their performance assessment. Working collaboratively with department members, we further developed those metrics to align with the organization’s overall mission. We identified sources of data to support these metrics and developed new measures for metrics that lacked supporting data.
The outcome? Department members enthusiastically shared their job activities as well as how they saw their role in meeting departmental goals. By focusing on business lines and current budgets, we developed metrics that were realistic while at the same time easily altered in a climate of changing budgets and shifting job descriptions. By accessing data that departments currently keep on themselves, we minimized data collection activities.
This organization continues to use these metrics to assess the work they do and, through monthly metric meetings, to facilitate inter-departmental communication and cross-organizational development.
Beyond Surveys: Building Informative Measures
Organizational change is often an intimidating idea. Implementing that change brings with it mixed reactions from both those within the organization and the individuals they serve. How will you know, as you roll out these changes, if your major stakeholders are benefiting from the changes you are making? When a client that we worked with on an organizational effectiveness project asked us to help her develop a way to monitor stakeholder satisfaction as changes to the organization were being put in place, we jumped at the opportunity to follow through on our work.
Working with members of the leadership team, we identified four major stakeholder groups. Together we developed surveys for each group, piloting and redesigning them until we had reliable instruments. We administered the surveys before any changes were made to establish a baseline measure of attitudes and satisfaction for each group, and we have re-administered them annually to track these stakeholders and provide feedback on organizational effectiveness.
Being able to monitor an organizational effectiveness plan requires being willing to ask for feedback data. Getting the best data requires designing the right measurement instruments. Good implementation monitoring starts there.
Strategic Planning
Kicking off a strategic planning process can be overwhelming. Not only is there a complex, multi-stage process to put in place, there are often many stakeholders that need to be involved. How do you give structure to the planning process and create a course of action that is less labor-intensive while at the same time more inclusive? How do you motivate stakeholders and capture knowledge from different constituents so that the resulting plan reflects the institution as a whole?
These were some of the questions we were asked when approached by the new president of a small, financially strapped college. The college had never undergone a strategic planning process, and this president wanted help giving a framework to that process, motivating stakeholders and ensuring that the process was inclusive. We were also informed that many stakeholders were highly skeptical of the value of strategic planning.
Our approach was to present a one-day workshop to a group representing faculty, staff, administrators and the board of directors. Part of the workshop presented facts about higher education at the state and national levels. But the group also practiced the activities required to move through the planning process: methods for analyzing the internal and external environments, identifying strategic issues from those analyses, developing a strategic agenda and an operational plan, and designing an implementation process. Given these tools, along with some research-based facts about their institution and higher education, we helped design a project plan and communications plan to launch the process.
Stakeholder Research
A few well-designed focus groups can make your work much more relevant to stakeholders. This turned out to be the case for a state higher education agency that sought our help several years ago. They realized that their main publication – an accountability report summarizing the agency’s data-intensive research on trends in higher education – was not reaching its target audience. They wanted to understand why this publication was not being utilized and what they could do to improve utilization while at the same time fulfilling a legal mandate that somewhat constrained how they reported on such trends.
We helped them understand the perspectives and needs of their stakeholders through a few well-planned focus groups. Working with designated staff members, we identified relevant stakeholders and designed a series of focus groups with ten different types of stakeholders. In all ten groups, we focused on participants’ ideas about the strengths and weaknesses of the current report and solicited ideas for how the report could be made more useful, including recommendations for timing and format. We framed the reporting question by asking participants to reflect upon the purpose of an accountability report for higher education in the state.
The outcome? While each focus group generated ideas that reflected the particular interests of those stakeholders, our analysis of participants’ comments revealed key themes and specific suggestions for the agency to consider in making its report more useful. And by questioning stakeholders about the larger issue of the general purpose of accountability reporting, the agency was able to reflect upon its own mission in producing the report and its alignment with the perspectives of its stakeholders.
As a result, suggested changes to the agency’s reporting process made this process more streamlined and less labor intensive. The report itself took on a new format that is far more cost effective.
Survey Research to Provide Data for Decision-making
We were asked by a state higher education agency to analyze the effectiveness of teacher preparation programs across the entire state. They wanted to know whether these programs were producing the right number and kinds of qualified teachers. In addition, they wanted to know whether these programs followed best practices in their teacher preparation. To answer these questions, we designed a survey to better understand district-level teacher hiring practices throughout the state and administered it to all hiring officials. In addition, we conducted research on the current and future supply of and demand for teachers based on demographic and economic factors and trends. To determine the alignment of the teacher preparation programs with best practices, we conducted a literature review of best practices and compared it to an inventory of practices at all state teacher preparation programs.
The results surprised some key stakeholders. We showed that there was not, and was not expected to be, a significant shortage of teachers, even in subjects such as math and science. Moreover, teacher preparation programs were largely aligned with best practices. Our research indicated that the perceived shortage was caused by high turnover at districts that, because of sub-optimal working conditions, were failing to retain well-qualified teachers.
Evaluation to Inform Curriculum Development
Should you add an alternative method of curriculum delivery or a new program? A simple study can answer that question. For a program that had recently put an alternative curriculum delivery method in place, we designed and conducted a research study to determine whether the new method performed as well as the program’s traditional approach when it came to student outcomes. The result was a deeper and more nuanced understanding of the strengths of the new method: exactly where it benefited student learning outcomes and where it fell short.
We began by working with the program’s leadership team to clearly establish the program’s student outcome goals. By focusing on these goals and operationalizing them, we specified five quantifiable indicators that the program’s leadership agreed were measures of these goals. We worked hard to ensure that the measures in question were already being collected, and had previously been collected for the original mode of delivery. These became our outcome measures. Using available student data, we matched students across the two modes of delivery on relevant academic achievement factors, creating virtual academic “twins.” We then performed a regression analysis to examine how students exposed to the new delivery method compared on the key outcome measures to students just like them who were exposed to the traditional delivery method.
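To make the method concrete, here is a minimal sketch of the matched-comparison idea in Python. The file and column names (students.csv, outcome, new_method, prior_gpa, test_score) and the simple nearest-neighbor match are illustrative assumptions, not the actual analysis, which was tailored to the program’s data.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student records: one row per student with an outcome measure,
# a 0/1 flag for the new delivery method and prior-achievement covariates.
df = pd.read_csv("students.csv")

# Nearest-neighbor matching: pair each new-method student with the
# traditional-method student closest on prior achievement (a "twin").
new = df[df.new_method == 1]
old = df[df.new_method == 0]
rows = []
for _, student in new.iterrows():
    # Crude squared distance; test scores rescaled so both factors count.
    dist = (old.prior_gpa - student.prior_gpa) ** 2 \
         + ((old.test_score - student.test_score) / 100) ** 2
    rows.append(student)
    rows.append(old.loc[dist.idxmin()])
matched = pd.DataFrame(rows)

# On the matched sample, the new_method coefficient estimates the outcome
# difference between delivery modes, adjusting for residual differences
# in prior achievement.
model = smf.ols("outcome ~ new_method + prior_gpa + test_score", data=matched).fit()
print(model.summary().tables[1])

Matching first and then adjusting with a regression is a belt-and-suspenders design: the twins make the comparison fair, and the covariates mop up any differences the matching missed.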
Outcomes for the alternative method were favorable, and the new delivery method was added at this school. A cost analysis performed independently indicated that keeping the original curriculum was not an added expense. And so this school has increased its overall enrollment knowing that it has not compromised its educational goals.
Assessment Capacity Building
Grants require evaluation. Colleges require assessment. We build faculty and staff capacity. Whether your granting agency requires outcome evaluations or your college requires ongoing student learning assessments, we can help you understand how to develop and implement these processes.
With a large national grant already in place, a client approached us to help evaluate their program, work that involved both documenting learning and behavioral outcomes and developing the measurement instruments to do so. Working with faculty involved in the program, we designed detailed performance assessments based upon faculty expertise in the field. During this process, faculty became more aware of their own implicit expectations concerning these outcomes. The behavioral measures they developed were then coupled with content knowledge instruments based upon exams already in use to produce a complete picture of the outcomes of interest to their funder.
Similar work was done at a more local level for a department chair who was at a loss, and short on time, over how to design a valuable assessment of student learning for reporting at the college level. Starting from the department’s high-level statement of learning goals, we worked with faculty to operationalize these goals, identify different levels of success at meeting them and produce explicit examples of success at each level. The department went on to use this assessment system, modifying the success levels as well as the specific goal definitions, and was held up as exemplary for the rest of the college.
Focus Groups and Structured Interviews
What do hiring officials at school districts really look for in a teacher’s job application? What do our employees have to say about the changes we are thinking of making? What do our board members think about the direction we’ve been going and what sorts of strategies would they like to see in the future?
Very often in the course of our projects we discover that there is some fundamental question that is driving the changes under consideration, but that the question has really never been put to the people who can best answer it. Well-placed questions to the right individuals can often facilitate a project.
We have employed focus groups as well as structured interviews to gather qualitative data for many projects and for many purposes. Among other things, they have helped launch projects, gather feedback during strategic planning processes and provide the fundamental data upon which decisions were made. Sometimes face-to-face data collection is simply irreplaceable.
Program Impact Evaluation
You have questions and a stockpile of data. We combine them to find insights and answers. This is exactly what we did for one client that had kept data for years on every activity it had undertaken and wondered whether any of it could shed light on concerns recently raised about one of its strategic goals.
After creating an algorithm to clean the data, and documenting the entire life cycle of this process for future reference, we examined the data to determine their level of integrity: how complete, accurate and consistent they were. We then compared the information we had with our client’s concerns and, working together with them, determined a set of specific questions that could be addressed. These questions, answered together, would provide a clear response to their concerns.
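To make the integrity checks concrete, here is a minimal sketch of this kind of data profiling in Python. The file and column names (activity_log.csv, date, cost) are illustrative assumptions; the checks themselves mirror the completeness and consistency questions described above.

import pandas as pd

# Hypothetical activity log: one row per recorded activity.
df = pd.read_csv("activity_log.csv", parse_dates=["date"])

report = {
    "rows": len(df),
    "duplicate_rows": int(df.duplicated().sum()),              # exact duplicates
    "missing_by_column": df.isna().mean().round(3).to_dict(),  # completeness
    "date_range": (df["date"].min(), df["date"].max()),        # coverage
    "negative_costs": int((df["cost"] < 0).sum()),             # consistency
}
for check, value in report.items():
    print(f"{check}: {value}")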
A number of statistical models were then developed and run to examine these questions.
The result was detailed insight into their concerns and a better understanding of how the actions set in place to meet their strategic goal had gone wrong. The analyses were limited, however, by the kind of data that had been kept. Our review of that data suggested new outcome measures that would be more important to keep in the future.
Through all of this, we were careful to document all methods and annotate all statistical analyses so that staff could replicate them in the future. Our reporting on this study focused on contextual issues and content, allowing the formal details to play a supporting role. We presented this work in multiple public settings and each time found that the analyses and the ideas motivating this work were easily grasped.