Supervisor Josh Brown stocks a closet that was redesigned by his team to increase systemic respect.
Photo courtesy of University of Utah Health
University of Utah Health’s environmental services (EVS) service contract was due to expire in a year. A handful of U of U Health’s ambulatory health centers had outsourced their EVS needs four years prior. Writing the request for proposal (RFP) would have been business as usual, but center managers and patients were complaining about the cleanliness of these centers, and the complaints were growing.
Some of the complaints were within the current contract’s scope; some were beyond. To untangle it all, U of U Health’s ambulatory leadership gave the EVS team two tasks: First, partner with the existing vendor to move back into compliance where needed. Second, write the new RFP with expanded requirements to match the expectations of center managers and patients.
EVS leaders approached the challenge with a spirit of inquiry, fortified by two foundational pillars: continuous improvement and systemic respect.
Building value isn’t a paint-by-numbers endeavor, but there are fundamentals to be followed. An explanation of how U of U Health builds value includes the following steps:
Build on continuous improvement and systemic respect. There’s plenty of support for improvement, but what about continuous improvement? U of U Health’s service contract rolled around every five years. The growing complaints indicated that little improvement was happening between RFPs. Continuous improvement implies a second-nature watchfulness for improvement opportunities and a willingness to incrementally revisit the same processes to address those opportunities.
Systemic respect is respect built into the system, not to be mistaken for interpersonal respect. Systemic respect refers to whether the system is performing respectfully toward patients, visitors, employees and providers. Though it concerns the system, it’s not as impersonal as it sounds because operational leaders own and control the systems in their workplaces.
Systemic disrespect can be found everywhere, such as a custodian with no standard reference for common tasks, a custodian given 15 minutes for a task requiring 25, or supplies overstuffed into inadequate spaces. Systemic disrespect causes patients and staff to suffer despite everyone performing their jobs to the letter.
Follow a problem-solving methodology based on the scientific method. Process improvement experts may arm wrestle about the particulars of any given methodology, but they all agree operational leaders, including those in EVS, should follow a methodology. U of U Health’s EVS team uses a problem-solving framework that parallels the scientific method and works through the methodology with a spirit of experimental inquiry. No Ph.D. is needed; just a desire to understand the process issues. U of U Health’s six-phase methodology includes:
- Project definition. Define the project vision, scope and how success will be measured.
- Baseline analysis. Define the current state qualitatively and quantitatively, then refine those measures of success into simple metrics.
- Investigation. Use data to get to the root cause of the issue and direct the decision making.
- Improvement design. Develop and select interventions, test them and develop tools to support the front line as they execute the new process.
- Implementation. Develop then execute an implementation plan.
- Monitoring. Track adherence both to the new process and outcomes of the process.
Get to know the problem. Engaged EVS leaders stay close to the work, but that’s not to imply they carry intimate knowledge of every issue simmering in their operation. Early on, leadership gets very close to the processes in scope, which necessitates getting close to the people they support — the front liners executing those processes.
Getting to know the problem overlaps three phases in U of U Health’s methodology: project definition, baseline analysis and investigation.
As part of project definition, leaders frame the discussion in process problems, as opposed to people problems. With complaints growing, U of U Health’s EVS leaders may have felt tempted to blame the contracted staff. Instead, they investigated process.
Leaders also listen for common cheats, like defining the problem as the absence of someone’s favored solution. For example, “the problem is, we don’t have enough staff.” Another seductive error is to adopt a victim mentality by putting the problem outside the team’s control, such as bemoaning patient behavior.
EVS leaders can home in on a solid problem definition by asking, “How will we measure success?” and then building project-specific SMART (Specific, Measurable, Attainable, Relevant and Time-bound) goals. Defining how success will be measured is more important at this point than the actual numbers. U of U Health often leaves the actual numbers blank in their SMART goals, to be filled in during baseline analysis.
A baseline analysis is a comprehensive story of the current state process in quantitative and qualitative terms. It may even reveal the problem isn’t such a big deal, and the organization may choose to redirect the effort.
To get familiar with the qualitative side of the current state (i.e., its baseline state), the leader’s most powerful tools are ears, eyes and open-ended questions. Building systemic respect requires the voice of the customer (i.e., patients, family and clinical staff) and the voice of front-line custodians. These voices can be heard through digital channels, such as patient surveys, and risk reporting systems, such as RL Suite software. They can likewise be gathered in-person with gemba (Japanese for “the real place”) visits.
Whatever problem the team seeks to solve, it originates in a process that occurs in a place. That place is called the gemba. The EVS leader and team go to the gemba to learn elements of the process that are unobservable in a dataset. Quantitative analyses are vital to value improvement but aren’t a substitute for eyes directly on the work as it happens.
Quantitatively, a minimal baseline analysis could be considered complete when all the blanks in the SMART goals are filled in, but the baseline numbers in SMART goals are probably averages, which are likely misleading. Averages are convenient but tell very little of the actual situation.
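To illustrate why averages mislead, here is a minimal sketch using hypothetical room-turnover times for two centers. The numbers are invented for illustration; both centers share the same average, but only the spread reveals the difference in process stability.

```python
import statistics

# Hypothetical daily room-turnover times in minutes (illustrative data only).
# Both centers average 20 minutes; the variation tells a different story.
center_a = [20, 21, 19, 20, 20, 21, 19]   # consistent process
center_b = [10, 35, 12, 33, 11, 34, 5]    # same mean, wild swings

mean_a = statistics.mean(center_a)
mean_b = statistics.mean(center_b)
stdev_a = statistics.stdev(center_a)
stdev_b = statistics.stdev(center_b)

print(f"Center A: mean={mean_a:.1f}, stdev={stdev_a:.1f}")
print(f"Center B: mean={mean_b:.1f}, stdev={stdev_b:.1f}")
```

A SMART goal filled in only with the 20-minute average would rate both centers identical; a histogram or run chart of the raw times would not.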
In baseline analysis, the team completes the quantitative assessment with histograms, run charts and scatter diagrams. These three analytical techniques are part of Ishikawa’s “Seven Friendly Tools.” They include a process map (i.e., flow chart); a check sheet (a tally sheet, not a checklist); a cause-and-effect diagram (i.e., a fishbone diagram); a Pareto chart; a run chart (sometimes listed as a control chart); a scatter diagram (to seek associations between variables); and a histogram (a diagram depicting variation of a single group).
They’re friendly because they are simple and intuitive to most people. U of U Health adds three bonus friendly tools: box-and-whisker charts (a diagram comparing variation of multiple groups); benchmarking; and spaghetti diagrams (maps to scale, showing how much team members walk).
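As a sketch of how one friendly tool works in practice, the snippet below builds a Pareto ordering from hypothetical complaint tallies. The complaint categories and counts are invented; the point is the technique: sort categories by frequency and track the cumulative percentage to see which few categories drive most of the complaints.

```python
from collections import Counter

# Hypothetical cleanliness complaints logged over a month (illustrative only).
complaints = (["dusty surfaces"] * 12 + ["overflowing trash"] * 7 +
              ["restroom supplies"] * 4 + ["floor scuffs"] * 2)

tally = Counter(complaints)
total = sum(tally.values())

# Pareto ordering: most frequent category first, with cumulative percent.
cumulative = 0
for category, count in tally.most_common():
    cumulative += count
    print(f"{category:20s} {count:3d}  {100 * cumulative / total:5.1f}%")
```

Read top-down, the cumulative column shows where a team gets the most return for its effort, which is exactly the question a Pareto chart answers.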
Any given problem is unlikely to need all of these tools. And, of course, there are other tools. For example, in U of U Health’s EVS case, there were two gap analyses. They also conducted a thorough time study in one center and applied the results to its other centers. Knowing when the team analyzed enough is part of the art of improvement science. If a team finds itself asking, “What should we do about it?” more than “What else should we be looking for?”, it may be time to move on to improvement design.
Use solutions designed by and for front-line employees. Author Daniel Pink explains the roots of employee engagement in his book, Drive. They are, quite simply, autonomy, mastery and purpose. Neither Pink nor U of U Health is unrealistic about the variety of people who come to work in our institutions. Some are simply trading minimal effort for their wages. But to engage the majority of employees, leaders are wise to supply ample autonomy, mastery and purpose.
The purpose question is easy in health care EVS: creating safe, pleasant and healing environments. Autonomy implies that custodians, given the time and tools, can execute their defined work according to standards, unencumbered by micromanagement. Mastery asks for leaders to define expectations, provide proper training and leave room for front-line expertise to influence their own processes.
While it’s easy to agree with these aspirations, delivery is not intuitive. U of U Health’s EVS leaders returned to their methodology, specifically the improvement design phase, which focuses on process standardization. In their case, a revised process wasn’t needed, per se. Standard processes were defined, but the custodians had difficulty following them. Framed in the language of systemic respect, proper time and tools were missing from the workplaces served by the contractor.
With leadership guidance, team members are asked to develop standard work, which supports successful execution of standard processes. Standard work minimizes variability and prevents errors. It’s a visual shorthand, including the sequence of steps and target times. It’s thoughtful on word count, white space and layout. Unobtrusive but always accessible to the front line, literally in an instant, standard work is integrated into the flow of the work.
Similarly, the team developed forcing functions to support the standard processes. Forcing functions are standard work’s more intrusive cousins, for more critical aspects of the work. These are error-prevention mechanisms or process-stopping mechanisms. (Toyota calls these “poka yoke” and “jidoka.”) A checklist is a common jidoka health care example that pauses the process to check for quality. Forcing functions make it easier to get the critical work right every time.
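A software analogy can make the forcing-function idea concrete. The sketch below, with hypothetical step names, gates a process the way a jidoka-style checklist does: the room cannot be released until every critical step is confirmed, so the gap is surfaced instead of passed downstream.

```python
# Hypothetical critical steps; a real checklist would come from the standard work.
CRITICAL_STEPS = {
    "high-touch surfaces disinfected",
    "sharps container checked",
    "hand sanitizer refilled",
}

def release_room(completed_steps: set) -> str:
    """Forcing function: refuse to proceed while critical steps are missing."""
    missing = CRITICAL_STEPS - completed_steps
    if missing:
        # Jidoka-style stop: halt the process and name the gap.
        raise ValueError(f"Room not ready, missing: {sorted(missing)}")
    return "room released"
```

The mechanism makes the right outcome the only outcome: skipping a critical step stops the process rather than producing a quietly defective result.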
Leaders can bolster systemic respect in more straightforward ways during improvement design. A few examples: Standardizing custodian carts according to the type of space being cleaned, providing point-of-use tools where feasible and finding ways to minimize walking distance. It’s unreasonable to expect leaders to know of these smaller opportunities; however, it is entirely reasonable for leaders to recognize their own blind spots. And the solution is so simple: just ask.
At U of U Health, smallish process problems are called rocks in the shoes, a phrase they didn’t coin, but no one can remember where it came from originally. Rocks in the shoes are made visible via a whiteboard (because invisible problems are impossible to solve), and solutions are developed by the front line.
Implementation is sometimes so straightforward, it’s done with minimal formality. Other solutions require expert project management. In either case, EVS leaders demonstrate their commitment by being on hand for the rollout and continuing to listen to the voice of the front line during the change. Some specific improvements will be adopted as designed, others will require some on-the-spot adaptation and still others may be abandoned.
Provide monitors from which the front-line employees can learn. The last step is also the last phase of the U of U Health methodology: monitoring. Health care systems are pretty good about getting the front line the information needed to keep work flowing. Systems often stumble with metrics, here defined as a comprehensive view of recent history, including process and performance measures.
Regular affirmation of EVS’s purpose — to provide safe, pleasant and healing environments — can be competently accomplished with less than 1% of a leader’s time. Add to that leader’s duties a feedback loop of metrics that quantify EVS’s impact on patient experience, and perhaps it’s up to 2%.
Value-minded leaders go further, asking whether their metrics are helping the team learn and improve, and if they are meaningful and actionable to the front line. These are the true purpose of metrics, in keeping with the pillars of continuous improvement and systemic respect.
“Gotcha” measures hammer away at compliance while systemically respectful measures bolster engagement.
U of U Health’s EVS leadership could have approached their assignment with a typical compliance mindset: Quality standards aren’t being met, and someone must be held to account.
Of course, accountability is vital. But, equally important, they viewed the assignment as an opportunity for continuous improvement and for growing systemic respect.
Jessica Rivera is environmental services director, Dane Falkner is senior value engineer and Steve Johnson is value engineering director at University of Utah Health, Salt Lake City. They can be reached at Jess.Rivera@hsc.utah.edu, Dane.Falkner@hsc.utah.edu, and Steven.Johnson@hsc.utah.edu.