Organizational development (OD) interventions: examples & best practices

In the world of organizational development, change is a constant process of discovery, analysis and action. An effective OD intervention can be one of the best mechanisms for creating impactful change and helping improve organizational efficiency.

The right OD intervention can help ensure you're solving the right problems, achieving your desired change velocity and navigating any resistance. In this guide, we'll explore the different types of organization development interventions available to org dev teams and give you practical advice for implementing them along the way.

At the heart of any organizational development strategy is the desire for meaningful change and growth. But whether you’re working in a small startup or a large enterprise, implementing change can pose a challenge and isn’t without its risk.

The process of improving organizational effectiveness can lead to tough decisions, and it's not uncommon to encounter resistance to change, difficulties with employee engagement or friction when implementing large-scale change across an entire organization.

While change can be kickstarted in many forms, one of the most effective and common tools used by top HR and change management teams is an OD intervention.

Interventions aim to formalize key actions within a change process and provide a framework for successful change. In this guide, we’ll explore the four types of OD intervention and explain how and when you might deploy them in your organization.

We’ll also share some organizational development intervention examples and give practical advice and tips for implementing these interventions. So whether you’re new to organizational development or you already have an intervention in mind, there’s something for you in this guide.

What are Organizational Development (OD) interventions?

Organizational Development (OD) interventions refer to a systematic and planned series of actions or activities designed to improve the overall effectiveness, health, and performance of an organization. 

To simplify, an OD intervention is a process that is actioned in response to a need for change. You might radically redesign your organizational structure because of inefficiencies in how your org works together and achieves its goals.

If you identify a significant ongoing issue in how your organization operates, innovates or grows, this is often a trigger point for an OD intervention.

For example, if you struggle to find and retain the right talent, your HR and hiring teams might use an OD intervention to identify issues with job descriptions or job design, DEI initiatives, or onboarding and employee happiness.

OD interventions are typically large in scale and are designed to have a major impact on key areas of how your organization operates. They require the coordination and efforts of multiple departments and the input of senior leadership in order to take effect.

OD interventions follow a process of identifying and exploring the problem, diagnosing the issue further and then carefully developing a strategy that considers people, processes and other organizational factors.

After crafting a solution in the form of a proposed intervention, then comes the challenge of actually enacting that change and then evaluating the impact of the solution.

In the above example, your org dev team would work with affected teams to understand the situation and build an action plan that may radically change the organization.

Perhaps you discover job design is an issue, or that a communication gap between your senior management team and the rest of the company has left your employees feeling unheard and unvalued. Finding a solution that adequately addresses organizational challenges requires a thorough exploration and analysis of the problem at hand and the people affected.

While conducting an OD intervention can be a little overwhelming, with the right process you can improve organizational performance, take care of your people and create lasting change.

What are the 4 types of organizational development (OD) interventions?

Organizational development interventions can take various forms, and they are typically categorized into different types based on their focus and objectives. Some common types of organizational development interventions include:

Human process interventions:

Human process interventions focus on improving group dynamics within the organization and how teams work together. Group interventions are common here, and change managers working in this area will likely run workshops and facilitate team building interventions with a desire to improve dynamics and interpersonal relationships on the team.

Techno-structural interventions:

OD interventions in this bracket typically focus on improving team productivity and performance by leveraging new technology and by considering how an organization is structured. Typical actions can include deploying new tools to streamline team workflows, automating processes or shifting organizational structures in order to maximize efficiency and reduce overhead.

Human resource management interventions:

Human resource management interventions typically focus on developing talent, creating employee training plans and otherwise working on how your organization sources, nurtures and develops its people. Diversity interventions and wellness interventions also fall under this banner and, as such, they're typically implemented and coordinated by HR teams.

Strategic change interventions: 

Organization development interventions related to strategy can be among the most far-reaching and impactful when it comes to improving an organization's performance. This kind of change often aims to be transformational in nature and is typically actioned when the long-term survival of the organization is at risk or there is a desire to radically alter how a company operates.

While other OD interventions exist, as Cummings and Worley noted in their book Organization Development and Change (9th edition), most interventions fall under the four types of OD interventions outlined in this guide.

That said, this list is not exhaustive nor are these OD interventions mutually exclusive. The actions your organization will take to create meaningful change will likely feature elements of various intervention types. 

When considering what changes and interventions might be most effective, try not to be pigeon-holed into just one type of OD intervention or restrict yourself to set organizational strategies.

Think about the desired end state of your OD initiative and conduct a thorough root-cause analysis to select the most appropriate intervention(s). After implementation, evaluate the efficacy of your actions and be open to using other types of intervention in your ongoing quest to improve organizational efficiency. 

Human Process Interventions

The original, best-known and most regularly deployed OD interventions are those which focus on human processes. These kinds of interventions aim to improve interpersonal, group and organizational dynamics.

Facing challenges with team culture, communication or conflict resolution between teams and individuals? OD teams will often run interventions in the form of soft skills training, team building programs and initiatives to improve relationships between different departments.

These can be low effort, such as running a weekly games session for employees to deepen bonds and get to know each other better. They can also include long-term programs for conflict resolution and soft skills training, culture committees or mentoring and coaching opportunities for your team.

I recall an early career moment where there was friction between sales and support. Both teams felt misunderstood by the other and were regularly coming into conflict. Not only did this affect team morale, but it also contributed to a decline in our CSAT score AND missed sales targets.

By helping the team understand one another more deeply and creating a clear, collaborative process of handling high ticket customers and sharing information, the issue gradually improved. 

Examples of human process interventions

Individual interventions.

Interventions on the individual level can have a massive impact on not only a single person’s happiness or job satisfaction but on how the system operates as a whole. Individual interventions often take the form of one-to-one interactions designed to improve how a single employee relates to their work, their team and themselves. Common interventions at this level can include one-to-one mentoring, individual growth plans, buddy systems and work shadowing programs. 

Managers or HR teams might work with specific individuals to help resolve conflicts, build skills or better integrate them into the team. Regular one-to-ones can inform this process, but an individual intervention is often called for when a problem is discovered.

A common trigger point for such an intervention is during an employee feedback process. For example, if a manager has received a lot of feedback from their team that suggests they aren’t managing well, they may get coaching from a senior leader to help them improve their leadership skills.  

Team forming interventions

Group dynamics are an important aspect of how a team functions. Team forming interventions are focused on helping improve those dynamics, creating alignment and helping groups get to know each other more deeply in a safe environment. Interventions designed to help accelerate team cohesion and bring groups together are some of the most common you’ll run, and it’s likely you’re doing some of them already. 

Common trigger points for such an intervention can include when a new team is formed, discovering problems with how a team works together or wanting to reassert team values and shared bonds. Team building events, participatory workshops and any shared activities that create opportunities for connection and trust are all common interventions in this area.

Workshops are among the most powerful formats for team forming interventions. When first bringing a team together, you might run a team canvas workshop to help a group align on their values, explore team dynamics and decide how they want to work together.

Alternatively, you might conduct a skills workshop where your team gets to ideate and learn something new as a group. In any case, be sure to support your process with the right workshop tools in order to create engagement and get results.

Purposeful team building activities are another key human process intervention. Simply spending time playing games, having fun or sharing our stories can have a powerful effect on team dynamics. The important thing is providing an opportunity for people to get to know each other more deeply, create bonds and grow together. 

Intergroup interventions

When two different departments in your organization are finding it difficult to work together, an intergroup intervention is a great step. Mapping how the two teams would like to collaborate and deepening understanding of the challenges and specifics of each group’s work can help smooth things out and improve efficiency too.

In the sales/support example above, a combination of team building and process design was key to the intergroup intervention we undertook. Only by coming together and talking about the problems while also getting to know each other as individuals were we able to move forward. 

For companies with few opportunities for inter-departmental work, simply bringing employees together for a team building activity so they can understand one another better is a powerful step towards change. 

Tips for human process interventions

One size doesn’t fit all.

Whenever you're working with people, it's important to note that everyone is different and what works with one team or individual may not work for another. I recall an occasion where a company-wide team building activity (a casino and club night) was chosen by upper management without asking the team how they felt.

While some folks loved it, many people felt uncomfortable or were disengaged. The desired goal of improved team connections wasn’t met and instead, the group ended up feeling more fractured. Especially in the case of intergroup relations interventions, remember to include people from both groups and think about their varying needs!

Choose your intervention with the individual or group affected in mind and, where possible, directly include them in the process. For example, an individual development plan should absolutely factor in how the person in question learns best and the unique context of the situation. Processes and systems are good, but don't forget that the best outcomes arise from solutions that put the people affected at their heart.

It’s an ongoing process

One common mistake I’ve seen with organizations deploying change is to assume a single intervention will solve a problem forever. Human processes are all about relationships between people and teams. Like any relationship, these need nurturing over time.

While a single team building event can recharge the tanks and help cement bonds, without care and consistent attention, that hard work can be for nothing. 

For example, let's say you run a company values workshop to help create alignment on the future of the organization and improve company culture. Your team comes together to choose core values and everyone feels good at the end of the session. Then, six months later, you ask your team what your values are and nobody can remember. Without follow-up actions and a process for keeping those values alive and present, the desired change in culture has been ineffective.

While most organizational development interventions are ongoing in nature, human processes can prove to be especially fluid and require extra attention from people throughout the system.

People are complex! Be sure to create systems to check in on progress, continue the good work of an intervention afterwards and reinforce the change you wish to create.

Repetition is a key element of these processes, so think not about running a single company event, but about how to ensure you continuously build on your company culture.

Empower your managers

While large-scale interventions benefit from research, analysis and oversight provided by a change manager, some changes can benefit from speed.

At the human process level, line managers are often the first to see issues and spot opportunities for change. So why not give them the tools and permission to try and create positive changes for their teams?

For example, let's say that a team member comes to you feeling overwhelmed and stressed because they're having difficulty finding childcare. In the long term, a company policy around childcare would be great, but that doesn't solve the immediate issue for the team member affected. An empowered manager could offer temporary flexibility right away while a broader policy is worked out.

At a human process level, proactivity and timeliness can make all the difference. Ensure your company policies and organizational culture support managers in making timely, responsible and effective interventions on behalf of their team. 

As with any change process, be sure to log and track changes and reflect on the impact. In addition to alleviating difficulties for individuals and teams, smaller, fast-moving intervention techniques can provide important insights for company-wide initiatives. 

Feedback loops are vital 

Human systems are dynamic and ever-changing. Without feedback loops, it’s possible for issues or opportunities within those systems to go unnoticed. For some interventions such as coaching or mentoring programs, feedback is an implicit part of the process.

For others, change managers will need to create a process for gathering feedback in order to monitor, evaluate and improve OD interventions with the input of all stakeholders.  Whatever system you use, it’s also vital that feedback goes both ways. Running a train the trainer course and giving your trainees feedback on their progress is important, but you should also get feedback about the program from trainees, managers and any other stakeholders. 

When it comes to human systems, you’ll also find it most effective to have a system for gathering feedback well in advance of any intervention. Try to make giving and receiving feedback a consistent process for your teams and use tools to support the process where possible. 

Techno-Structural Interventions

Techno-structural interventions aim to better align an organization’s structure, technology and processes with its goals and objectives. OD interventions in this area can be among the most far reaching for any org dev team and they’re often deployed when change feels paramount for a company’s survival or for maintaining a competitive edge.

Low growth, a rapidly changing market or key areas of a business underperforming? These can be triggers for a techno-structural intervention. 

Tasks such as organizational restructuring, process redesign, job enrichment or even downsizing fall under this umbrella. Other common interventions for OD teams include implementing new tools and technologies to improve efficiency, streamline workflows and future-proof the company.

In techno-structural interventions, there is often an emphasis on continuous process improvement. Common moves include switching to Agile or lean methodologies, or embracing total quality management approaches like Six Sigma, famously deployed at scale by companies such as Ford Motor Company.

As large-scale processes that can include changing business direction or radically repositioning your product, these interventions can be a challenge to implement. 

Without a change management plan, change can be slow, meet resistance or simply not catch on. Be sure to leverage the skills and expertise of change managers and senior leadership when conducting these kinds of interventions.

Examples of techno-structural interventions

Organizational restructuring.

Restructuring an organization means rethinking how some or all of your workforce is structured and operates. Who reports to whom? Which departments fall under which manager, and who is responsible for making decisions that affect different areas of the business?

Common triggers for an organizational restructure include a need for greater revenue or reduced costs, a desire to refocus or change company goals or a move into a new market.

Creating innovative new products or services or simply working to resolve issues with workload, resource management or siloing are also common interventions that require a technological or structural approach.

For example, in a small startup, it’s not uncommon for all of your developers and designers to sit under the same branch in your org chart with a single founding developer as their manager.

For a while this works, but then as you grow, you start seeing bottlenecks in your dev process, and there are too many direct reports for your founding developer to handle while also trying to innovate and position your product in the market.

At this stage, an organizational restructure will be necessary in order to ensure efficiency, avoid burnout and also ensure you have the right skillset present among your dev team. 

Rapid growth or reduction in your team size is another trigger for a restructure. Sometimes, this might mean a single department may split or combine with others.

On other occasions, it's necessary for organizations to completely rethink how the hierarchy of their teams works – for example, switching from a functional org structure, where each department reports to a department head, to a matrix structure, where cross-functional teams are put together on a project-by-project basis.

Business process reengineering

Business process reengineering (BPR) is a process of radically redesigning how your organization works. It's a comprehensive model for analyzing, redesigning and optimizing your organizational processes in order to improve business performance.

This is particularly valuable for organizations that need to see significant change in order to remain competitive, or where redundancies and inefficiencies in processes are creating massive costs or an inability to meet goals.

Organizations implementing a BPR intervention typically begin by mapping all current business processes and analyzing them for opportunities, gaps and issues. After validating ideas for improvement, the organization will design an ideal future state and begin moving towards it. 

By definition, BPR is wide-ranging in nature, and teams working with this kind of intervention should not feel constrained in their suggestions. Removing redundant processes or implementing a new helpdesk to improve the efficiency of your customer support team might be enough to save costs, but what if you fixed the root cause of your largest customer issues or invested in self-serve support?

If your team is finding themselves coming up against the same problems even after a solution or quick-fix has been implemented, you may need to go further. That’s the perfect time for a more thorough and radical appraisal and solution process such as BPR. 

Work design interventions

Work design interventions are used when an organization wishes to improve the content or organization of the work and responsibilities falling upon individual employees or departments.

The way our work is designed affects how we feel about our jobs and ourselves. The tasks, working hours or contact points associated with our role can have a massive impact on our overall motivation, engagement and stress.

We all want our teams to be happy and productive, and a work design intervention can cover everything from redesigning job roles and individual tasks, to finding ways to automate or improve processes that impact job satisfaction or productivity. 

For example, let's say that individuals on a team feel stressed because they have a large workload and don't feel supported in achieving their goals. OD interventions might include redesigning job specs, allocating more resources, creating reward and recognition schemes or even improving autonomy and self-management.

Deep understanding of the problem is key when considering work design interventions. Be sure to conduct interviews, run a focus group and implement a continuous feedback system so you can see problems emerge and understand whether teams need more support, job control, enrichment, development opportunities or something else entirely.

Tips for running a techno-structural intervention

Document your current processes.

While the ideal state is that your processes are well documented in advance of problems arising, it’s not uncommon for there to be gaps in your documentation when you get around to thinking about interventions. In fact, it’s entirely possible that one of the first steps of the problem analysis and diagnosis stage will be to document any missing processes. 

Before you start implementing a new process, be sure to take time to understand how your organization operates now. Try to map your processes from end-to-end and be sure to capture all the actors involved in the system. Redesigning a sales process without thinking about how it might impact your support team is a surefire way of causing new problems. 

When documenting, be sure to involve stakeholders from across the organization so you can gain an accurate, in-depth picture of your processes. Not going deep enough or speaking to the people who actually enact or work with a process is another pitfall you can avoid by simply speaking to the right people. 

Moving forward, aim for each team to document its processes as a matter of habit and ongoing improvement. Not only will this help any changes go more smoothly should you need to make them, but it can also help surface issues and opportunities more quickly.

Use a proven framework and do your research

Changing the structure or processes of a large organization is a difficult undertaking, but you are not the first person to encounter this challenge. Lean on proven frameworks and the work of other thinkers, experts and organizations.

At SessionLab, we transitioned to an EOS framework to help us nail down our strategy, create a new org chart and organize our work. We found that the structure, advice and existing knowledge around EOS allowed us to make better decisions, transition faster and focus on implementation, rather than trying to come up with an entirely new solution.

No two organizations are the same, but there's something to learn from how others have changed for the better. Try looking at how successful organizations at a similar maturity or size to your own operate or, better yet, look at those that have solved some of the challenges you're facing. Join a masterclass or community – the ongoing support and insight of peers can also be invaluable in actioning change.

Run workshops to surface insights quickly and collaboratively 

When thinking about introducing new processes, it's imperative that you first explore and diagnose the problem correctly. When it comes to how teams and departments operate, it's not uncommon for hidden variables or unspoken actions within the system to be at the heart of your issues. So how do you bring them out into the open and encourage openness from your team?

Speaking to major stakeholders and business people across the org is vital, but it’s often not enough to just send out an email asking for input.

Workshops are some of the most powerful intervention techniques available to change managers and org dev teams. Ideating on possible solutions collaboratively is often a more effective way to truly discover the root cause of issues and create solutions that account for the people who will be most affected by the process you are changing. 

SessionLab is an effective tool for designing and delivering the workshops that you’ll use to support your OD intervention process. Invite stakeholders to co-create your agenda in real-time and involve them in the change process. Save time designing your key workshops and ensure your process is efficient with SessionLab.

Human Resource Management (HRM) Interventions

HRM interventions concentrate on developing and managing human resources within the organization. Examples include improving hiring processes, creating and reinforcing diversity initiatives, improving performance management processes and building opportunities for career development. People are one of the most important parts of how an organization functions, and HRM interventions are designed to directly impact the people working in your company.

As the name would suggest, these kinds of interventions are often deployed by or in conjunction with HR teams in response to difficulties with hiring or retaining staff, employee satisfaction or problems with performance. 

Effective change tracking, strong feedback loops and good communication are essential elements of a successful HRM intervention. Programs and initiatives that form the backbone of human resource development – such as wellness or training programs – are ongoing in nature.

You’ll often find that such an intervention takes time to achieve its chosen goal and the strength of your research is a key element of success. Purposeful interventions that incorporate the direct input of your employees are more likely to be fit for purpose and create long-lasting change. 

Examples of human resource management interventions

Employee wellness interventions.

Staff are reporting high levels of burnout and managers are noting that their direct reports are feeling overwhelmed or stressed. This is the perfect time for an employee wellness intervention.

While HR teams might also consider job design and other factors, these programs most commonly involve the creation of new opportunities and programs designed to alleviate issues and improve the health of your team. 

Some common strategies include creating new employee benefits linked to health and wellness. Cycle-to-work schemes, free gym memberships and budgets to support employees in improving their own wellbeing can all have positive impacts on team wellness.

You might also provide opportunities for staff to access company healthcare and counselling. On the lighter side, creating a budget for healthy lunches and office snacks, giving opportunities to volunteer or exercise on company time can also have an immediate impact. 

While the same is true for most interventions, employee wellness programs absolutely require the involvement of everyone on your team when choosing what to implement. A poorly designed or unfit-for-purpose intervention can quite easily have a negative impact on wellness.

Let’s say you create a scheme where everyone in the office gets a free healthy lunch. Great for your onsite team, but how about your hybrid and remote employees? If you don’t offer a similar benefit or take them into account, they could feel less valued and overall wellness could suffer.

Performance management interventions

Managing and hopefully improving the performance of your team over time is a necessity for any successful business. But what if you uncover issues with staff performance, or a lack of process for tracking and improving the performance of your team? Time for a performance management intervention.

For some organizations, such an intervention might include actually setting up a performance management system and ensuring that every member of staff is given frequent feedback and opportunities to improve. For others, this might mean enabling managers with better tools and processes or creating reward programmes to encourage higher performance.

Coaching, mentoring and the unblocking of other issues that might impact employee performance are also key tasks that can be part of a performance management intervention. 

A key part of a successful performance management intervention is truly understanding the root cause of an issue. Underperforming staff may face issues with job design, internal or external pressures or may not have even been given feedback or an opportunity to improve before.

Try not to jump into the deep end with punitive measures unless you've already taken a more holistic approach that gives staff the feedback, tools and opportunities they need to develop.

Talent development interventions 

As a company grows and roles change, it’s not uncommon to discover that your team has skill gaps that need to be filled. You might find that a changing market means that key competencies need to be updated or supported with new training. Or you might discover that people are unhappy with the pace of their career development and are leaving the company as a result.

Talent development interventions are all about managing and developing your team so they’re better positioned to do their jobs, grow in their careers and stick around.

Common talent development interventions include designing new training programs and coaching opportunities, personal growth plans and even reconsidering how you onboard, compensate and promote members of your team. These kinds of interventions also extend to rethinking how your HR team goes about attracting and hiring new team members.

Any time you are struggling with team performance, remember that the solution is only as good as the analysis of the problem. Talk to team members at different levels and who have been with the company for different lengths of time.

Only once you’ve truly identified the root cause of the issue can you implement an intervention that will serve everyone on your team and prevent issues from occurring in the future. 

Tips for running HRM interventions

Engage people throughout the organization.

Any intervention that directly affects your team should get some level of input from the people being affected. For some interventions, it’s absolutely paramount to source input and get feedback from your employees. 

For example, a wellness program for your remote employees without the input of remote team members isn’t likely to serve their needs.

Underperforming sales team? Rather than making an assumption at a management level, talk to your sales reps and see what they think the issue is. Not only are these people more likely to be able to identify the root cause of a problem, but they’re also instrumental in actioning any given intervention.

Engaging people early in the process is helpful for getting buy-in and removing barriers to change. Don't keep your change discussions entirely confined to management meetings; get input when you can, so long as it's appropriate.

Support your process with data

Human process interventions can sometimes be kickstarted by qualitative data: anecdotes about how people on the team are feeling, or gut feelings from management about burnout or stress. While these kinds of comments and discussions are vital, it's also important to back up any change with data and processes to determine the viability and success of any initiative.

A gut feeling about a problem is often onto something, but without data of some kind, it can be hard to be confident that your solution is the right one to support your wider business strategy.

For example, before making a sweeping change to working hours, survey your team to find out if it works for them. Want to roll out your marketing training program to other teams? What data about team performance and employee satisfaction do you have that supports that decision?

Feeling like your hiring process is bringing in a large number of low-quality interviewees and want to make a change? Check out industry standards, compare across job roles and back up your feelings with hard data wherever possible.

Start measuring employee KPIs before the need for an intervention arises

While some challenges are difficult to predict, HR teams are in a great position to proactively source input, monitor employee happiness and prepare for wider change.

If you’re already using a performance management system, it’s easy to start tracking how your team feels, see the efficacy of personal development plans and monitor things like onboarding efficacy and retention.

If you're not, it really pays to start measuring employee sentiment and refining your feedback loops so that you have something to point to if the need for an HRM intervention arises.

Even something as simple as a monthly employee satisfaction survey can help your HR team see issues coming, track changes over time and also create an ongoing channel for surfacing opportunities for improvement.  

Properly resource line managers 

Managers across your organization are vital parts of making any HRM intervention a success. Whether they’re directly involved as a result of overhauling performance management processes or indirectly affected because of changes to flexible working hours or giving back schemes, your managers often have extra work or overhead created by HRM interventions.

Don’t underestimate the impact and ripple effects of even the smallest interventions. Line managers are often the frontline in actioning change or hearing misgivings from employees. They can often be those people who pick up slack within the system. 

Be sure to take this additional workload into account and create extra resources and support processes for line managers. Consult them before any changes are rolled out, involve them in the process as much as needed and think about how to make it easier for them to implement and support processes when engaging with their teams.

Strategic Change Interventions

While other organizational development interventions can operate on the individual to small group level, strategic change interventions are more far-reaching in scope. These interventions are designed to analyze and radically redefine how an organization functions or what it hopes to achieve. 

An organization might reconsider its vision or goals because of changes in the market or because the team has conflicting ideas about their shared mission or core values. Other times, the changes can come about because of issues preventing a company from meeting its goals, such as how a team is structured or how a culture of innovation is nurtured.

Interventions that impact core business strategies are usually undertaken when the survival or competitive edge of a business is at risk. Strategic change can be prompted by internal or external factors, but it's very rarely taken lightly. The desire is for a massive improvement in how the company functions, and the work required is often massive in scale too.

Done right, however, these interventions can create incredible innovation, reverse falling revenue forecasts and radically improve employee happiness too.

Examples of strategic change interventions

Transformational change interventions.

Examples of interventions for transformational change include a top-to-bottom organizational redesign, perhaps in response to a changing environment, a major pivot or a desire to enact meaningful culture change.

Dangers to the long-term viability of the business, major competition or market shifts can be a common trigger for a transformational change intervention. You may also find that analyzing challenges to employee retention uncovers an issue with company culture that only a massive change and restructure can improve.

Expect transformational changes to radically alter how a company operates, shifting the status quo and transforming the organization into something that is better positioned to achieve its goals. 

Continuous change interventions

Continuous change interventions are designed to help an organization make minor improvements on an ongoing basis. Creating a culture of learning, developing an experimental, continuous growth model or creating space for innovation and cooperative structures are common actions taken here. 

Trans-organizational change interventions

Trans-organizational change refers to interventions where two or more organizations are involved. Mergers and acquisitions fall under this umbrella, though major business partnerships are also an example of a task that might require an OD intervention.

Tips for implementing strategic change interventions 

Map your systems and organizational structure.

Before you decide where to take your organization, you'll need a clear view of where you are right now. Activities such as systems mapping are a great first step for any intervention, but they are especially valuable when considering large-scale strategic change.

Not only can you more accurately assess the scope of what you’re doing, but you can also draw out where changes need to take place. 

Try creating a map of your organization through a process of systems mapping to better understand and enact your proposed change. These are workshops dedicated to drawing out (and actually drawing) all the stakeholders and other elements (suppliers, for example) that compose the wider system of which your company is a part.

Systems mapping will help your teams look at the big picture and figure out the best place to intervene.

Get outside help

Enacting or even deciding whether to undergo a major strategic change is a significant undertaking. Experience and expertise are invaluable in making a change process a success, and when conducting major organizational change, a consultant or agency can make all the difference.

Receiving advice from someone who has enabled change for dozens of companies and has seen many processes from inception to completion is invaluable. They'll help you ask the right questions, show you a proven framework for change and also help you navigate any roadblocks. Consultants are also adept at working around the unintentional biases or assumptions that long-term employees can form.

If you want to improve your change velocity, feel confident in the changes you’re making and streamline the change management process, professional assistance is absolutely worth investigating. 

Remember that big changes take time (and sometimes multiple interventions)

Strategic changes can involve upending how your company operates, thinks about its culture or how it positions itself in the market. While a single intervention might help you successfully restructure your team, it will take further work and careful management to help those teams thrive in the new environment.

Committing to large-scale organizational change means committing to a process that will take time, consistent effort and potentially further interventions along the way. Prepare your teams and managers for a long, ongoing process and be sure to check in along the way. 

Shifting your target customer base, for example, might require your sales and marketing teams to radically rethink how they source and talk to customers. While the eventual change might be great, don't expect to see a complete upswing overnight. Be sure to take this into account when setting targets and when managing your people.

Set expectations accordingly and ensure there are feedback and support systems in place for your team while such large-scale changes are in action. Run effective team meetings to keep track of what's happening and ensure stakeholders can synchronize effectively.

More tips for a successful OD intervention 

Organizational development is a complex process that can test even the most seasoned teams. The good news is that you're not the first company undergoing a process of change, and there are a heap of best practices and tips you can use to help you achieve your desired change.

We’ve included tips for each of the different types of OD intervention above, though we also wanted to share some additional OD best practices that should help, regardless of the kind of intervention you’re running. 

Carefully assess the current state of the business

Designing and deploying the right OD intervention means gaining a thorough understanding of where your business currently stands. Not only will you need to determine what needs to change, but you'll also need to understand the drivers and potential blockers of that change.

Failure to do this properly can result in slow or unsuccessful change. It can even lead to changes with unintended consequences or negative effects. 

There are various tools for assessing the state of the business. A change management framework is one such tool, though you’ll likely synthesize everything from stakeholder input, current business performance, risk assessments and other situational analysis tools.

The key here is to ensure you deeply understand the system being changed in order to propose the right change and have the resources and environment to make it happen. 

Find the root cause of your problem

Long-lasting change comes from a deep understanding of the root cause of an issue. Facing challenges with high staff turnover and low morale? Bringing in free snacks and reducing working hours over the holidays might have a short-term impact on employee happiness, but it's unlikely to truly solve the issue.

Whenever engaging in an organizational development process, be sure to go deep enough to truly understand the cause of an issue before enacting change. Don't rely on assumptions: talk to your team, often multiple times, while conducting a root cause analysis.

Review performance data and thoroughly analyze what you find. (Sometimes, it's enough to just keep asking why!) If in doubt, run a problem-solving workshop to truly surface what's going on and create a safe space for uncovering issues.

Not spending enough time analyzing an issue is a potential pitfall for any organization seeking to improve. Without finding the root cause of organizational issues, it's entirely possible to treat the symptoms rather than finding a cure. Avoid this by going deep, involving people across the organization, backing up ideas with data where possible and being ready to challenge your assumptions.

Fishbone Analysis: a process to help identify and understand the origins of problems, issues or observations.

Have a clear purpose and end-state

People are more likely to get behind change when they know exactly what it is they are working towards. The purpose of an intervention should be clear, focused and simple to explain. If you can’t easily explain why you’re making a change or it’s overly complex or unclear, chances are you’re trying to do too much or you don’t have a clear understanding of the problem you are trying to solve. 

Clarity of purpose helps ensure that you are taking the right actions and that your change will be successful. Decision making gets easier when you have a clear purpose too. Does this support our purpose and will it help us achieve our goal? Yes: let’s do it. No: either we don’t do it or it could be the focus of a separate intervention or change initiative. 

In addition to a clear purpose, it’s useful to have a desired end-state in mind when conducting any organizational development activities. What will the business look like when you’re done? How will you know your change has been a success?

Asking these questions helps you align and focus your actions while also giving you a means to evaluate the impact of your process. An exciting, aspirational end-state is also invaluable when it comes to getting support for your intervention and reducing possible resistance to change. 

Align interventions with organizational goals 

Successful change requires many moving parts across your organization to be working in tandem. Your organizational goals or mission are often the north star for anything your team does, including any development processes. Often, the simplest way to determine the right OD intervention is to ask whether it helps your organization better achieve its goals. If the answer is yes, then it's a great candidate for action.

Aligning your interventions with your greater goals can also help ensure that the team is able to get behind them and understand why the intervention is being run. For example, let's say you're an NGO whose mission is to help provide learning opportunities for disadvantaged folks.

Interventions that are aligned with that organizational goal, either helping your team reach more people with better tools or clearly improving your team's ability to do their core tasks, are much more likely to succeed than interventions that seem tangential or don't support that core mission.

Work backwards from your ideal future state

A common facilitation technique for creating change is backcasting: imagining an ideal future state and working backwards to decide how to achieve it. Often, the prospect of organizational change can leave teams overwhelmed by how to achieve it or unsure of how their actions might result in the desired change.

Working backwards can simplify the process, reduce noise and help crystallize your shared purpose. By thinking big, you can often find that the ideal steps towards change become clearer.

An aspirational future state can also be an effective tool when getting stakeholder buy-in. A shared vision gives everyone a clear target and it’s easier to align various actions around an organizational goal they believe in.  

Backcasting: a method for planning the actions necessary to reach a desired future goal, often applied in a workshop format with stakeholders participating. It's best used once a future goal, even a vague one, has been identified.

Communicate effectively

Resistance to change can often come as a result of poor communication or a lack of understanding about why a change is being implemented at all. How you talk about your OD intervention is an important part of ensuring that stakeholders and those affected get behind the initiative. 

When rolling out your OD intervention, create a communication plan and be sure to highlight the purpose of the proposed change. For example, rolling out a personal development program without context can cause confusion or anxiety. Is this a genuine desire to improve career prospects and employee fulfilment, or is there an issue with my performance and is my job at risk? 

Clearly communicate why and how you're making changes, create documentation that is easy to access and create space for questions and answers too. By providing a clear vision and purpose for such a program, you can make it easier for everyone to get involved and help your change take root.

Be wary of analysis paralysis 

Thorough analysis and careful planning are integral to leading organizational change. But is it possible to do too much?

Teams lacking confidence or expertise may delay making changes or continue to analyze and weigh up options even when the path is clear. It's a tough balance, but spending too long assessing when the case for change is clear can actually undermine the process or create barriers to change.

You can mitigate this risk by following a proven framework, having clear organizational timelines and bringing in consultants to help increase the velocity of your process. In other cases, it's a matter of applying an 80/20 principle or adopting a bias for action in order to make a decision and move forward.

If your company is just starting the process of organizational development, it’s natural to want to tread carefully. Just be certain that your process is efficient and that your team’s anxieties are aired and don’t get in the way of progress. 

Get started!

In the case of small interventions or change processes, you can often adopt a more lightweight process and get started more quickly. You may not need to mobilize your entire change management team for a localized intervention. You might also find that you can gain confidence in a proposed change without performing a top-to-bottom situational analysis.

Change can only happen once a process is set in action and, in some cases, it's worth just getting started, monitoring the results and empowering your teams to be proactive. This is different for every organization, and while it's common for small teams to be more agile, large organizations often have more red tape, and with good reason.

Recognize the specific circumstances of your organization and review every OD intervention you perform to see how you can do better. If your changes are successful but your team feels like you spent too long assessing when things were clear early in the process, that makes a good case for trying to streamline your process.

Not every change needs a large intervention 

In my experience, the thoroughness of the process directly correlates to the scale of the proposed change. Over-engineering small-scale change processes can create unnecessary friction or lead to frustrated team members. This can cause just as many problems as under-engineering a large-scale intervention and developing a poor solution. In doubt about a small, low-risk change but don't want to block an enthusiastic teammate? Call it an experiment and monitor the impact. Sometimes, you can learn more from just getting started rather than adding it to a massive organizational to-do list.

In any case, it's worth exploring how you can create continuous change by engaging your team proactively in the process. Teams are vital actors in any change process and by empowering them, you can often avoid future issues and ensure opportunities are taken where possible.

Leaders are integral 

Without leadership support, organizational change can struggle to get traction. Everyone from senior leaders to line managers is instrumental in helping change be a success. This might include modeling changes yourself by attending skills workshops, volunteering or cycling to work in line with sustainability goals.

Often, leaders and line managers are also responsible for tracking employee sentiment, keeping change processes front-of-mind and helping employees adapt to change.

Without leadership backing, change can be slow or ineffective. Get them onboard early and give them the tools they need to brief and support their teams and they can help any change process be smooth and purposeful. 

Leaders aren’t just important when helping enact change. During the early stages of an org dev process, leaders are often key stakeholders in research and analysis tasks. They’re well positioned to provide input, spot additional risks and see dependencies you might not.

Engage leaders throughout the organization as early as possible and keep them in-the-loop. Change doesn’t just come from your senior management team! Even the most well-designed OD interventions can fall down if logistics or team workloads don’t align with the process. 

Acknowledge the additional workload of change and plan accordingly 

Change is hard for most living things, humans included. Whatever the level of involvement in planning, enacting and evaluating organizational change, the process can create additional work or mental load for those affected. Acknowledge this and be proactive in order to support your team and remove potential barriers to change too.

This might look like simply reducing workload in other areas to create space for change, creating support structures or otherwise addressing the unique pain points that might come up in the process.

Sometimes, even an acknowledgement and group discussion about change can be sufficient to clear the air and give teams the opportunity to suggest ways to counterbalance any increased workload. 

Use measurable metrics for success

Measuring the effects of change is an integral part of organizational development, but how can you ensure you are measuring the right metrics and have confidence that your change has been successful?

KPIs and data-based measurements are your friend here. Seeing a clear change in revenue, customer satisfaction or staff turnover in numbers can provide verifiable proof your change has been successful. That said, think about sourcing both qualitative and quantitative data where possible.

Staff might anecdotally report lower stress in a one-on-one meeting, but how are sick days trending since you implemented the change? Revenue might be up, but did your sales team bag an enormous contract that has skewed the data?

It’s also important to decide on the metrics of success before you implement any change. You’ll want to use metrics that will be directly affected by what you’re doing and align your actions accordingly.

It's also vital that you actually have the means to measure what you want to measure and, ideally, source existing data to serve as a point of comparison. If you're conducting an employee wellness program to lower stress, see if you have previous surveys or performance metrics to serve as a baseline.

Using a combination of leading and lagging indicators can also be helpful. For example, a leading indicator for the efficacy of your wellness program might be how many people take advantage of new services on a weekly basis. If more people take advantage of the services, you'd expect to see lower stress – great, but it's only one piece of the puzzle.

A lagging indicator might be how many staff are reporting high stress levels in a monthly employee survey or even the productivity levels for a team or department. With a combination of these kinds of metrics, you can not only determine if your change has been successful, but also see where in the process you might make improvements. 


Use tools to support your process

Successfully implementing an OD intervention strategy means organizing tasks, project managing the process and evaluating its impact. It’s a lot of work that can be streamlined by using the right tools.

Use change management software to optimize the end-to-end process of an intervention and improve the velocity of change you’re enacting.

It’s also worth recognising that various barriers to change can be mitigated by using efficient processes and bespoke tools. When committing to creating impactful organizational change, invest in tools that will help you achieve your goals faster and more efficiently.

With SessionLab, you can create stakeholder workshops and brainstorming sessions in minutes. Drag and drop blocks to create your session. Invite collaborators to co-create your design in one place and make changes with ease.

Developing a new learning program? Create your ideal learning flow and invite your course managers and subject matter experts to collaborate in one place.

Innovation experts and consultancies have found SessionLab to be a vital part of creating change for their clients. Get started for free and save time and effort in your session design process.

Evaluate & adjust 

Once an intervention is complete, it’s time to evaluate. Using your carefully chosen success metrics in combination with stakeholder input, you’ll determine whether you’ve achieved your goals and, if not, how you might change or adjust your intervention to do so.

While it can be tempting to see a green KPI and call it a day, proper investigation of why you were successful can help ensure you can repeat that success in future. It can also help your org dev team improve their processes and fuel the next intervention too!

And what if you feel the need to make changes in the middle of an intervention? However well you’ve designed and run a change process, it’s possible for something unexpected to occur or for additional elements to emerge. Be sure to have a system for checking in on progress and adjusting where necessary.

Use your KPIs, talk to your team and create open channels for feedback. In some change processes, you might freely adjust throughout the process, or you may want to complete the entire intervention before properly evaluating and making changes.

For example, if you’re running a series of soft skills workshops and discover that employees are struggling to engage, the workshop facilitator might take a different approach in order to fulfil the needs of the intervention and you might adjust the program as a result.

On the other hand, if you’re rolling out a new interview process with your HR team and a few participants have given feedback that it’s too long, it might be too early to make changes when the measure of success is the quality of the final hire.

Whatever your process, thorough evaluation is necessary to first determine the success of your intervention and then enable your team to make the right adjustments. Ensure early on that you have the means to collect data and input from your team, so that you aren’t picking up the pieces later!

Conclusion 

OD interventions are a key tool for any company wanting to improve organizational performance, stay competitive and create meaningful change.

Whether it’s finding ways to improve employee development, implementing new tools or radically restructuring your team, we hope that this guide will help you take the first steps in creating your intervention strategy.

In my experience, while the distinctions between different types of group interventions are useful for understanding the role of organizational development and what tools might be available, they are not mutually exclusive.

For example, fixing a complex problem like low employee satisfaction may include a combination of human process, human resource management and other change strategies. As with any change process, the solutions used should respond to the specifics of the challenge and situation you face.

Your own OD intervention strategy will likely feature elements of various intervention types and, in truth, OD interventions are most successful when tailored to the organization at hand and the problems it is facing.

Looking for more resources? Discover how change management software can help facilitate successful OD interventions and improve organizational effectiveness.

Running workshops as part of your group interventions? Explore how to create engaging and impactful sessions in this workshop planning guide.


James Smart is Head of Content at SessionLab. He’s also a creative facilitator who has run workshops and designed courses for establishments like the National Centre for Writing, UK. He especially enjoys working with young people and empowering others in their creative practice.


20 OD Interventions Every HR Practitioner Should Know

If you’re a business that wants to improve, learning to design effective OD interventions is essential. Organizational Development interventions can make an organization more efficient, boost the happiness of your employees, and help you leverage rapidly advancing technology.


What are OD interventions?

Types of OD interventions

  • Diagnostic interventions are systematic processes used to assess an organization’s current functioning, identify areas for improvement, and provide data-driven insights to guide effective change strategies.
  • Human process interventions are organizational development interventions related to interpersonal relations, group, and organizational dynamics. These were the earliest form of interventions and are often aimed at improving communication within the workplace. 
  • Techno-structural interventions are targeted toward structural and technological issues such as organizational design , work redesign, and employee engagement. 
  • Human resource management interventions impact areas such as performance management, talent development , DEIB, and wellbeing in the workplace.
  • Strategic change interventions revolve around transformational change, restructuring, and uniting two or more organizations together during a merger.
  • Issues being addressed: OD interventions help companies solve a problem related to a root cause. An example is a high number of employees leaving a company. The present issue is a high employee turnover rate, but OD interventions look to solve the cause of high turnover. In the case of employees leaving, you can expect that a small organization will be impacted entirely on all levels by this issue. In contrast, a multi-national company will only be affected in the locations where turnover is high. 
  • Number of people involved: The more people involved in an OD intervention, the longer it takes to make a change. For example, a human process intervention with a small team will go through quicker than techno-structural interventions in a tech organization. 
  • Solution: Solutions are created to address the root cause of an issue. But they might not be immediate. A solution can also refer to change efforts intended to create an ideal future for the organization. In the latter case, upper management and decision-makers are generally impacted more than the staff until the change has happened.

Different types of OD interventions, including human process and technostructural interventions.

Why does your company need organizational development (interventions)?

What are examples of organizational development interventions?

1. Diagnostic interventions

Human process interventions

2. Individual interventions

3. Group interventions

4. Team building

5. Intergroup relations interventions

6. Third-party interventions

7. Organizational confrontation meeting

Technostructural interventions

8. Organizational (structural) design

  • Hierarchical
  • Customer-centric

9. Total quality management

10. Work design

11. Job enrichment

  • Variety of tasks: Give your employee new tasks or ones that go beyond their everyday duties. 
  • Giving autonomy: Empower employees to make decisions about their work and avoid micromanaging.
  • Employee feedback: Make sure your team receives input regarding their performance, skills, and ability to work within a group.
  • Assigning meaningful work: Help employees make sense of their work by showing them how it benefits the company and how they contribute to overall organizational goals.
  • Creating incentive programs: Create recognition for a job well done through incentive programs like bonuses or extra days off.

12. Large-group interventions

13. Business process reengineering

Human resource management interventions

14. Performance management

15. Developing talent

  • Individualized career planning
  • Internal or external coaching 
  • Task/job rotations
  • Educational budget
  • Mentorship programs
  • Internal or external workshops
  • Conferences 
  • On-the-job training
  • Leadership training.

16. Diversity interventions

17. Wellness interventions

  • Headspace for Organizations

Strategic change interventions 

18. Transformational change

  • Restructuring: Changing your business’s structural chart by adding, removing, or combining departments.
  • Retrenchment: Decreasing employee headcount by closing an office or division of the company. 
  • Turnaround: Replacing all top management within a failing business to turn things around.
  • Outsourcing: Hiring another company to complete tasks for your own company. This is common in customer service departments. 
  • Spin-off: Breaking up a company into distinct, smaller companies. Google is well-known for this when it created its umbrella organization, Alphabet Inc., and now owns many household name companies such as Nest and YouTube. 

19. Continuous change

20. Transorganizational change

  • Starbucks & Spotify : Starbucks and Spotify partnered to give all Starbucks employees a free Spotify Premium subscription, which they were encouraged to use to help generate playlists from two decades of Starbucks’ soundtracks. These playlists were also made available via Starbucks’ mobile app.
  • GoPro & Red Bull : GoPro partnered with Red Bull and became their exclusive provider of imaging technology, capturing sensational footage at hundreds of annual events across 100 countries.
  • Apple & MasterCard : Apple and MasterCard partnered to give Apple Pay users all the benefits of MasterCard, but through their Apple device. A user’s card information is replaced with a ‘token’ every time a purchase is made, which protects their personal details.  
  • Taco Bell & Doritos : In 2012, Taco Bell introduced the Doritos Locos Taco to their menu – a shell made out of Doritos chips in Nacho Cheese, filled with your usual taco fillings. It was so popular that Taco Bell had to hire 15,000 more employees and start four new production lines to meet demand. It remains one of their top selling items today.

How to design effective OD interventions

  • Start with thorough diagnostics : Conduct an in-depth assessment of your organization to understand your biggest challenges. You can do this through a diagnostic intervention, for example, by collecting employee feedback, performance data, and carrying out organizational culture evaluations. 
  • Set clear objectives and KPIs for what you want to achieve : Ensure that you define clear, quantifiable metrics that align with your goals so that you can track your progress. For instance, employee engagement scores, productivity rates, and turnover rates. 
  • Engage stakeholders : Get key stakeholders involved from all levels of the organization for the planning and implementation. This is essential to ensure buy-in and address diverse perspectives. 
  • Customize interventions : Adapt OD interventions to the unique needs and context of your organization. Be sure to consider factors like your culture, structure, and any industry-specific challenges you face.
  • Evaluate and iterate : The final step is to continuously monitor your interventions so you can assess the impact they’re making through the KPIs that you set. Use your findings to make adjustments and improvements for sustained success.

A final word


Jayla Cosentino


Understanding an organizational change and development intervention applied in a Global Software Industry: A Case Study


Organization Development Interventions


To effectively adapt and thrive in today’s business world, organizations need to implement effective organizational development (OD) interventions to improve performance and effectiveness at the individual, group, and organizational levels. OD interventions involve people, trust, support, shared power, conflict resolution, and stakeholders’ participation, just to name a few. OD interventions usually have broader scope and can affect the whole organization. OD practitioners or change agents must have a solid understanding of different OD interventions to select the most appropriate one to fulfill the client’s needs. There is limited precise information or research about how to design OD interventions or how they can be expected to interact with organizational conditions to achieve specific results.

This book offers OD practitioners and change agents a step-by-step approach to implementing OD interventions and includes example cases, practical tools, and guidelines for different OD interventions. It is noteworthy that roughly 65% of organizational change projects fail. One reason for the failure is that the changes are not effectively implemented, and this book focuses on how to successfully implement organizational changes.

Designed for use by OD practitioners, management, and human resources professionals, this book provides readers with OD basic principles, practices, and skills by featuring illustrative case studies and useful tools. This book shows how OD professionals can actually get work done and what the step-by-step OD effort should be. This book looks at how to choose and implement a range of interventions at different levels. Unlike other books currently available on the market, this book goes beyond individual, group, and organizational levels of OD interventions, and addresses broader OD intervention efforts at industry and community levels, too. Essentially, this book provides a practical guide for OD interventions. Each chapter provides practical information about general OD interventions, supplies best practice examples and case studies, summarizes the results of best practices, provides at least one case scenario, and offers at least one relevant tool for practitioners.

TABLE OF CONTENTS

Part I | 52 pages: Foundations
  • Chapter 1 | 13 pages: What Is an OD Intervention?
  • Chapter 2 | 20 pages: Understanding Different OD Intervention Models
  • Chapter 3 | 16 pages: Steps for Implementing the OD Intervention Model: From Entry to Separation

Part II | 141 pages: Individual and Small-Group Interventions
  • Chapter 4 | 21 pages: Individual Interventions: Instrument-Guided Development
  • Chapter 5 | 34 pages: Individual Intervention: Executive and Management Coaching
  • Chapter 6 | 18 pages: Individual Interventions: Mentoring and Sponsorship (Levels of Advocacy)
  • Chapter 7 | 28 pages: Small-Group Interventions: Achieving Effectiveness Through Interpersonal Training
  • Chapter 8 | 37 pages: Small-Group Intervention: Team-Building

Part III | 104 pages: Intermediate and Large Interventions
  • Chapter 9 | 27 pages: Intermediate-Sized Interventions
  • Chapter 10 | 29 pages: Large-Scale Interventions
  • Chapter 11 | 18 pages: Industry-Wide Interventions
  • Chapter 12 | 26 pages: Community-Based Interventions

Part IV | 23 pages
  • Chapter 13 | 12 pages: The Future of Organization Development Interventions
  • Chapter 14 | 9 pages: What Unique Issues Surface When Implementing OD Interventions?


Cases and Exercises in Organization Development & Change


  • Donald L. Anderson - University of Denver, USA

"Has a number of timely case studies, including ones on non-profit and educational institutions."

An excellent book with lots of applied problems/case studies.

Good cases and excellent overall structure of the book. However, I was also looking for mini-cases.

A well written book that has a number of useful cases and activities that will help to link theory to practice for change management and organisational development.

A mix of great and useful exercises and cases, though some seem to be a bit basic and perhaps out of date.

Donald L. Anderson

Donald L. Anderson , Ph.D., University of Colorado, teaches organization development at the University of Denver and organization design at the University of Colorado, Boulder. He is a practicing organization development consultant and has consulted internally and externally with a wide variety of organizations, including Fortune 500 corporations, small businesses, nonprofit organizations, and educational institutions. Dr. Anderson’s research interest is in discourse in organizational and institutional settings, and his studies of organizational discourse and change have been published in journals such as the Journal of Organizational Change Management, Gestion , and Journal of Business and Technical Communication . He is the author of the text Organization Design: Creating Strategic and Agile Organizations (SAGE, 2019) and editor of the text Cases and Exercises in Organization Development & Change (2nd ed., SAGE, 2017).


How to design, implement and evaluate organizational interventions for maximum impact: the Sigtuna Principles

Ulrica von Thiele Schwarz

a School of Health, Care and Social Welfare, Mälardalen University, Västerås, Sweden

b Medical Management Centre, LIME, Karolinska Institutet, Stockholm, Sweden

Karina Nielsen

c Institute of Work Psychology (IWP), University of Sheffield, Sheffield, UK

Kasper Edwards

d Department of Management Engineering, Technical University of Denmark, Lyngby, Denmark

Henna Hasson

e Unit for Implementation and Evaluation, Center for Epidemiology and Community Medicine, Stockholm, Sweden

Christine Ipsen

Carl Savage

Johan Simonsen Abildgaard

f National Research Center for the Working Environment, Copenhagen, Denmark

Anne Richter

Caroline Lornudd

g Institute of Environmental Medicine (IMM), Karolinska Institutet, Stockholm, Sweden

Pamela Mazzocato

Julie E. Reed

h NIHR CLAHRC for Northwest London, Chelsea and Westminster Hospital, London, UK

i School of Health and Welfare, Halmstad University, Halmstad, Sweden

Research on organizational interventions needs to meet the objectives of both researchers and participating organizations. This duality means that real-world impact has to be considered throughout the research process, simultaneously addressing both scientific rigour and practical relevance. This discussion paper aims to offer a set of principles, grounded in knowledge from various disciplines that can guide researchers in designing, implementing, and evaluating organizational interventions. Inspired by Mode 2 knowledge production, the principles were developed through a transdisciplinary, participatory and iterative process where practitioners and academics were invited to develop, refine and validate the principles. The process resulted in 10 principles: 1) Ensure active engagement and participation among key stakeholders; 2) Understand the situation (starting points and objectives); 3) Align the intervention with existing organizational objectives; 4) Explicate the program logic; 5) Prioritize intervention activities based on effort-gain balance; 6) Work with existing practices, processes, and mindsets; 7) Iteratively observe, reflect, and adapt; 8) Develop organizational learning capabilities; 9) Evaluate the interaction between intervention, process, and context; and 10) Transfer knowledge beyond the specific organization. The principles suggest how the design, implementation, and evaluation of organizational interventions can be researched in a way that maximizes both practical and scientific impact.

Introduction

Interventions in the workplace can target individuals, groups or whole organizations, and aim to improve individual, group and/or organizational outcomes by mitigating or preventing problems, or by promoting positive outcomes. Often, these types of interventions aim to achieve the intended outcomes by changing the way work is organized, designed, or managed. These are referred to as “organizational interventions” (Nielsen, 2013 ). Organizational interventions typically consist of multiple components, sometimes at multiple levels (i.e., employee, group, leader, and organizational; Nielsen et al., 2017 ), and are typically embedded in their context of application (Montano et al., 2014 ; Nielsen & Abildgaard, 2013 ). Examples include job redesign interventions (e.g., Holman & Axtell, 2016 ), Business Continuity Management aiming to support post-disaster recovery in organizations (Malinen et al., 2019 ), and participatory occupational health interventions, often including both leader and employee activities (e.g., Framke & Sørensen, 2015 ).

The fact that organizational interventions involve changing the way work is organized, designed, or managed means that organizational interventions cannot be researched without substantial collaboration between researchers and the organization and its stakeholders (e.g., managers and employees) (Kristensen, 2005 ). They need to benefit both the organization and the researcher and meet the dual objectives of both parties (Kristensen, 2005 ). These objectives may differ and follow different logics, even among organizational key stakeholders. They may also be contradictory. Traditionally, the objectives for an intervention researcher in work and organizational psychology have been to evaluate the effects of an intervention (often focusing on if something works) and to test theories. The emphasis is on internal validity and the ability to draw causal inferences. The underlying logic dictates that interventions are designed beforehand, preferably based on theory (Fishbein & Yzer, 2003 ), and then implemented as designed (freezing the interventions). Following this logic, the influence of contextual factors is considered noise that should be minimized (Nielsen, 2017 ; Nielsen & Miraglia, 2017 ). Impact on practice is often only considered after the research has been completed.

The main purpose of an organization, however, is not to serve as an arena for researchers, but to produce goods or services (Kristensen, 2005 ). This does not mean that organizational stakeholders do not see the value of research, but if and when the research process collides with organizational needs, organizational needs will take precedence. For example, an organization may not be willing to wait years to know if an intervention was successful or not, and they may not see the point of “freezing” an intervention if changing it would make it easier to use and/or increase its effectiveness (e.g., von Thiele Schwarz et al., 2016 ). Thus, even when researchers and organizational stakeholders understand and share each other’s objectives, the logics underlying these ambitions likely differ. This means that research on organizational interventions would benefit from novel approaches that reconcile these apparent contradictory objectives by simultaneously considering both scientific rigour and practical impact.

Such a reconciliation puts specific demands on how the research is conducted; it has considerable impact on the entire organizational intervention process, from design and implementation to evaluation. This discussion paper sets out to address the lack of guidance available for researchers committed to this endeavour. The purpose of this paper is to offer a set of principles, grounded in knowledge from various disciplines, that can guide researchers in designing, implementing, and evaluating organizational interventions that are both scientifically rigorous and practically relevant. In this, the intention is to advance, rather than conclude, the discussion on how to optimize the impact of research on organizational interventions.

The principles contribute to work and organizational psychology in five ways. First, the principles are specifically designed to face the dual and sometimes contradictory objectives of organizational interventions. Traditional guidance for organizational interventions primarily focuses on addressing the concerns of researchers. Less attention is paid to how interventions can directly benefit the organization, or more broadly, how the results can and will be used down-stream (Griffiths, 1999 ; Rogers, 2008 ). The principles do not suggest that researchers’ concerns for rigour should be abandoned. Instead, the principles suggest striving for rigour in the light of dual objectives, and how real-world impact of organizational research can be managed upstream, that is, as part of the knowledge generation process rather than as a separate process after the research has been conducted. Thus, the principles address the tension between trustworthiness and usefulness of research evidence.

Secondly, the principles add to the limited understanding of the sustainability of organizational interventions (Kristensen, 2005 ; Lennox et al., 2018 ). From an organizational perspective, sustainability is practically inseparable from the real-world impact of an intervention (von Thiele Schwarz et al., 2016 ). Specifically, the principles highlight how sustainability can be approached throughout the design, implementation, and evaluation of the intervention rather than only once the project is finished.

Thirdly, the principles address all stages of interventions: from design and implementation to evaluation. In this respect, the principles add to the current literature because the existing frameworks that have been developed specifically for organizational interventions have primarily focused on evaluation (e.g., Bauer & Jenny, 2012 ; Biron & Karanika-Murray, 2013 ; Nielsen & Abildgaard, 2013 ; Nielsen & Randall, 2013 ; von Thiele Schwarz et al., 2016 ).

Fourthly, the principles take into consideration that organizational interventions are complex, dynamic, and recursive, and consist of multiple components, sometimes at multiple levels (i.e., employee, group, leader, and organizational) and are typically imbedded in a system (the organization) that is also complex in that it includes multiple factors interacting in unpredictable ways (Schelvis et al., 2015 ). Thus, the principles add to the current limited understanding of how the specific conditions in which organizational interventions operate affect their design, implementation, and evaluation (Griffiths, 1999 ; Kompier & Aust, 2016 ; Kristensen, 2005 ; Nielsen & Miraglia, 2017 ; Van der Klink et al., 2001 ; von Thiele Schwarz et al., 2016 ).

Finally, the principles contribute to work and organizational psychology by synthesizing a breadth of knowledge about organizational interventions that exist in neighbouring fields, including change management, work and organizational psychology, improvement science, implementation science, operations management, occupational health, and applied ergonomics. Therefore, rather than inventing approaches specifically for work and organizational psychology, we build on established knowledge from related fields facing similar challenges, and draw upon different epistemological and ontological points of departure, from positivism to interpretivism and pragmatism.

Inspired by Mode 2 knowledge production (Gibbons et al., 1994 ), we brought together transdisciplinary practitioners and academics with experience of organizational interventions and took them through a process to identify key principles for designing, implementing, and evaluating organizational interventions. The core group consisted of 11 academic experts (the authors) from change management, work and organizational psychology, improvement science, implementation science, operations management, organizational theory, occupational health, and applied ergonomics. The researchers were recruited through purposeful snowball sampling of researchers involved in organizational intervention research (Vogt & Johnson, 2011 ).

Mode 2 knowledge production differs from traditional academic knowledge production (i.e., Mode 1) along five dimensions (MacLean et al., 2002 ). Firstly, transdisciplinarity : While many disciplines research organizational interventions, no one has the definite answer “how to”. We strived to bring together a range of perspectives rather than relying on an in-depth inquiry of knowledge from a single discipline. Secondly, context of application : We included practitioners iteratively throughout the process to ensure that the principles reflected real-life issues concerning organizational interventions and to minimize the knowledge generation-knowledge use gap by including knowledge users’ skills and understanding in the knowledge production. Thus, intended users of the knowledge produced are part of a knowledge co-production process rather than mere recipients of the finished product. Thirdly, heterogeneity and organizational diversity : Due to the complex nature of organizational interventions, we included practitioners and researchers with experience from various types of institutions and organizations with different approaches to knowledge, ensuring interaction across settings to offer different perspectives on interventions and how knowledge is generated and applied. Fourthly, reflexivity and social accountability : Mode 2 knowledge production builds on iterative and reflexive production of knowledge where the potential impact and value (external validity) is integrated in the entire process. Specifically, we used a workshop set-up where we engaged in discussions to make the different perspectives on organizational interventions apparent and transparent. The principles were rigorously questioned and evolved through discussions that allowed participants to reflect on their perspectives in contrast to other disciplines. Finally, diverse range of quality controls : We engaged in open discussions of each principle and how to apply them in different practical cases, presented by both core members of the group and invited academics and practitioners. We also included quality controls by presenting the principles at conferences to invite practitioners and academics external to the workshop process to validate the principles.

Following the Mode 2 knowledge production principles, a participatory and iterative approach was used to develop (phase 1) and validate and refine the principles (phase 2). The procedure for the development of the principles is outlined in Table 1 , detailing each activity, its purpose, the range of participants involved, as well as the outcome of each step.

Table 1. Outline of the procedure for the development of the principles

Phase 1: Development of principles

Purpose: Exploration to identify best practices (April 2016)
Activity: Workshop with Open Space Technologies to amend best practices to principles through interactive discussions that examined general applicability, interconnectivity, nomenclature, and perceived importance.
Participants: 11 researchers across fields (the authors)
Outcome: 15 preliminary principles that summarized the most essential approaches for succeeding with organizational interventions

Purpose: Substantiation and clarification (April-June 2016)
Activity: Working in pairs on a shared platform, the content of the principles was clarified. Each principle was reviewed by the rest of the group members.
Participants: The authors
Outcome: Substantiation and validation of the 15 preliminary principles from each represented research field

Purpose: Critical revisions (June 2016) and subsequent revisions of principles (July 2016-March 2017)
Activity: Work meeting to eliminate overlap and redundancy, followed by individual work on a shared platform to clarify the content of the principles.
Participants: 3 of the authors (work meeting); all authors (subsequent revisions)
Outcome: 15 principles were reduced to 10

Phase 2: External validation and refinement of principles

Purpose: External validation for both practical and scientific relevance (April 2017)
Activity: One-day workshop using fishbowl methodology to revise the 10 principles.
Participants: The authors and 9 invited practitioners and senior academics
Outcome: Deeper understanding of ambiguities related to nomenclature and epistemologies; resulted in a final articulation.

Purpose: Refinement
Activity: One-day workshop, iteratively working with the whole group, in pairs, and individually to revise the principles.
Participants: The authors

Purpose: Further validation with scientific scholars (May and June 2017)
Activity: Two symposiums (European Association of Work and Organizational Psychology and Work, Stress and Health; APA-NIOSH). Principles and exemplifying empirical cases were presented. Participants documented and discussed the presence or absence of principles and their feasibility.
Participants: 5 of the authors and 2 other researchers presented cases and invited feedback from symposium participants (n = around 100)
Outcome: Insight into alignment between participants’ perceptions of principles needed for successful organizational interventions and the principles

Purpose: Subsequent refinement of principles (June-October 2017)
Activity: Refinement based on input from practitioners and academics external to the core group.
Participants: The authors
Outcome: Succinct description of the 10 principles

Purpose: Further validation with practitioners and researchers (October 2017)
Activity: Workshop with practitioners that presented intervention tools that matched the principles.
Participants: The authors and 7 invited practitioners and senior academics
Outcome: Check of principles’ robustness. Final version of principles.

In Phase 1, a two-day workshop was held in the Swedish town of Sigtuna, a key trading and meeting point on the Baltic at the time of the Vikings, hence the name of the principles. Starting from the participants’ current understanding (e.g., Nielsen & Abildgaard, 2013; Nielsen & Randall, 2013; Reed, Howe, Doyle & Bell, 2018; von Thiele Schwarz et al., 2016), a broad range of best practices were identified and explored through reflexive conversations inspired by Open Space Technology (OST) (Owen, 2008). OST is a participant-driven, real-time approach that relies on self-organization to explore topics of interest to participants. We used this approach to allow participants to move freely in and out of smaller groups, gathering around emerging principles visualized on flipchart papers. Discussions were captured by developing each flipchart. At this stage, the number of principles was allowed to expand and retract, combining old and adding new flipcharts as needed. The flipcharts were then examined by the whole group and, through discussions of similarities and differences, condensed into a first set of 15 principles. These were further amended and condensed over the following year (see Table 1). In Phase 2, the principles were refined and validated with external experts, including both academics and practitioners, through a series of meetings and workshops (e.g., a symposium at the EAWOP 2017 Conference). Written and oral feedback revealed overall agreement on the relevance and importance of the principles, but also that some were ambiguous. We therefore refined the principles during the following five months, with an additional workshop in October 2017, to finalize the principles.

The principles

Organizational interventions often consist of three phases: 1) design, 2) implementation, and 3) evaluation (Tafvelin et al., 2019 ). The principles cut across the three phases, as illustrated in Figure 1 .

Figure 1. Ten principles for how to design, implement, and evaluate organizational interventions

Principle 1: Ensure active participation and engagement among key stakeholders

This principle recognizes that employees and organizations are not passive recipients of organizational interventions (Nielsen, 2013 ). They need to shape, manage, and own interventions. Participatory approaches are currently recommended by national and international policy bodies for managing psychosocial risk and for organizational interventions (Nielsen, 2017 ). Participation is relevant to consider across the design, implementation and evaluation of interventions, and among employees as well as managers at all levels of the organization. The latter includes ensuring senior management support and ownership over the intervention at the appropriate level of the organization (Hasson et al., 2014 ).

In the design phase, participation can increase the appropriateness of the intervention by ensuring that participants’ expertise is considered in the development of the intervention, e.g., what changes are feasible and appropriate in their workplace (Storkholm, Savage et al., 2019). During implementation, participants are more likely to be committed to the intervention if they have had a chance to influence it (Rosskam, 2009). Participation can also facilitate integration into existing work practices and procedures (Principle 6) (Tsutsumi et al., 2009). For evaluation, participation increases the likelihood that stakeholders will accept the validity of any findings the evaluation will yield, and their commitment to act on them (i.e., evaluability) (Leviton et al., 2010).

What is meant by participation varies greatly, both in terms of the degree of influence and in terms of what the participants gain influence over (i.e., the content, the process, or the goal of the intervention) (Abildgaard et al., 2018 ). Based on the substantive evidence supporting active engagement, our proposition is for active forms of participation where researchers and organizational stakeholders, including employees, work closely together throughout the design, implementation, and evaluation of the intervention, enabling influence over all aspects of the intervention, including as co-creative partners (Brydon-Miller et al., 2003 ; Storkholm, Mazzocato et al., 2019 ).

Although this principle acknowledges the value of close collaboration and power-sharing (Brydon-Miller et al., 2003 ), it also acknowledges that the appropriate level of participation varies. For example, the optimal level of participation will vary with the characteristics of the intervention (e.g., the aim), and with contextual factors. These may include cultural differences affecting expectations on degree of participation. For example, participation will be less challenging if it does not deviate from the cultural norms, such as in the European Nordic countries, where there is a long-standing tradition emphasizing collaboration and participation between employer and employees (Gustavsen, 2011 ).

Degree of participation will also vary during the course of the change process. Participation will be required from different stakeholders at different time points and for different purposes. For example, senior management involvement may be needed when the overall project is designed to ensure the fulfilment of Principles 2 and 3 (Hasson et al., 2018), whereas employee involvement may be most important when the intervention is being implemented on the local level, giving employees and line-managers space and time to integrate the intervention into their work context. Thus, the proposition here is to find the appropriate level of participation across multiple stakeholders. An appropriate level of participation entails understanding the embedded power structures in the organization (formal/stable hierarchies and informal/fluctuating structures) because they impact the level of influence that the different stakeholders have on the intervention. Not everyone will feel comfortable speaking up, not everyone’s voice will count (Wåhlin-Jacobsen, 2019), and there will be information asymmetries in what people know in the organization that affect the willingness and opportunity to participate, and thus who has influence over or benefits most from the intervention. For an outsider, such power structures may be tricky to notice, and it may therefore be better to make the default assumption that power issues are at play.

Principle 2: Understand the situation (starting points and objectives)

As outlined in the introduction, organizational interventions are embedded in the organizational context. Therefore, this principle suggests that researchers acknowledge that organizations are social systems, with their own unique dynamics and histories. We propose that the likelihood of a successful outcome is greater when organizational contexts are actively considered in the design, implementation, and evaluation (Nielsen & Randall, 2013 ; von Thiele Schwarz et al., 2016 ). Thus, building on disciplines such as engineering and quality improvement, we argue that researchers need to understand the context and take it into account before finalizing the design and starting to implement an organizational intervention (Edwards & Jensen, 2014 ). In its most basic form, this principle encourages researchers to refrain from conducting organizational interventions if they have not ensured that the organization needs it. For example, introducing a physical exercise intervention may not be the most appropriate in a context where work overload is the main issue.

Understanding the current situation includes considering the work systems, working conditions, history, challenges, and problems, as well as the implicit and explicit goals and intended outcomes of the intervention (e.g., the ideal end state). Such understanding can be achieved through recurrent conversations and negotiations between stakeholders, as well as through more formal assessments describing the situation (e.g., surveys and risk assessments). Knowledge about the organizational context matters for the design and implementation, as well as the evaluation, of organizational interventions. First, it helps identify or design the most suitable intervention by clarifying the direction of the change (from where, to where) (Aarons et al., 2011). Then, clarifying the gap between present and desired states may create an urge for change (i.e., “creative tension”), supporting engagement and participation (Principle 1) (Senge, 2006). An understanding of the context can also uncover organizational factors that can make or break the implementation (barriers and facilitators, e.g., financial constraints, staff turnover, etc.) so that these can be managed. Finally, the knowledge provides information about the starting point (“baseline”), which is essential for evaluation because it makes it possible to track changes over time (Batalden & Davidoff, 2007).

Different stakeholders may not understand the situation in the same way. We do not suggest that all stakeholders must have a fully shared understanding of the situation (i.e., what the problem, the intervention, and the desired outcome are), although it helps to have a mutual agreement on the need for and purpose of the change (Frykman et al., 2014 ; Storkholm et al., 2017 ) (i.e., shared sense-making) (Weick, 1995 ). It is, however, important to understand that there are different perspectives. Research on perceptual distance has shown that a lack of congruence, for example, between managers and employees, has an independent, negative effect on intervention outcomes (e.g., Hasson et al., 2016 ; Tafvelin et al., 2019 ). Thus, even if stakeholders do not have a shared understanding of the situation, it is essential that they know if that is the case, so that any misunderstandings can be managed upfront.

Principle 3: Align the intervention with existing organizational objectives

As described in the introduction, organizations are not neutral settings for testing research hypotheses, and therefore organizational interventions need to benefit the organization as well as researchers. This requirement for dual benefit means they need to be designed and implemented with consideration of how they contribute to organizational objectives as well as the researchers’ objectives. Alignment with the organization’s objectives serves several purposes. First, alignment helps create engagement by demonstrating how the intervention can contribute to important organizational outcomes. It can reduce the risk of contradictions that emerge when the aim of an intervention is not in line with other objectives (Ipsen et al., 2015 ; Ipsen & Jensen, 2012 ). Secondly, it reduces the risk of unintended side effects that emerge when an intervention is designed, implemented, and evaluated without consideration of how it may affect other areas (Bamberger et al., 2016 ) (e.g., when an intervention benefits one employee group at the expense of another). Finally, aligning objectives is essential to minimize the risk of the intervention becoming a time-limited ancillary project, discontinued once the researchers or a key change agent in the organization move on. Thus, this principle is central for the sustainability of organizational interventions.

Striving for alignment also involves trade-offs and challenges. First, for researchers, it may mean adjusting their research agenda to ensure it is benefitting the organization – or refraining from doing the research in a particular organization where there is no alignment. With regard to the different organizational stakeholders, aligning the intervention with organizational objectives means that the intervention is placed in the landscape of (perceived or real) contradictory and competing organizational objectives, such as those between safety and production (von Thiele Schwarz & Hasson, 2013 ). This may amplify tensions between stakeholder-groups which, in turn, may pose a barrier to the implementation of the intervention. It may also become an ethical dilemma when researchers and change agents need to favour one organizational objective over another.

Aligning the intervention with organizational objectives does not suggest that the objectives of the intervention automatically change, for example, from focusing on employee health and well-being to focusing on efficiency. Instead, it suggests that discussions about how an intervention might affect various organizational objectives should be considered during the design of the intervention and continually revisited to avoid unexpected agendas derailing the intervention at a later stage. Thus, at a minimum, this principle points to the need to disclose any competing objectives so that they can be managed or monitored to avoid derailment and unsustainable improvements.

Principle 4: Explicate the program logic

Given that organizational interventions are dependent on their context, it is essential for the design, implementation, and evaluation to explicate how they are supposed to work. This involves outlining the logical links between the intervention activities and immediate, short-, and long-term outcomes (e.g., Pawson, 2013 ; Rogers, 2008 ) including identifying multiple possible intervention activities as well as multiple pathways (Abildgaard et al., 2019 ). Drawing on the field of program evaluation, this principle suggests explicating a program logic (also known as a program theory, logic model, impact pathway, or theory of intervention) as a way to clarify the proposed theoretical mechanisms that explain why certain activities are expected to produce certain outcomes (Pawson & Tilley, 1997 ). Program logic focuses on the theory of change, i.e., the how and why intervention activities may work (Havermans et al., 2016 ; Kristensen, 2005 ; Nielsen & Miraglia, 2017 ), rather than, for example, theories of health (i.e., the relationship between exposure to work factors and employee health).

Program logic is used in the design as well as the implementation and evaluation of an intervention. First, it identifies which intervention activities are most likely to close the gap between the current and desired state. An important part of this is utilizing best available knowledge. Secondly, it guides implementation by clarifying the mechanisms, thereby explicating the implementation strategies needed to support behavioural change. Finally, it is a blueprint for the evaluation, as it describes when and what to measure.

To explicate the program logic, multiple sources of information are needed, so it may benefit from co-creation with stakeholders (von Thiele Schwarz et al., 2018 ). The development process may differ depending on the extent to which intervention activities are predefined, such as when the change involves implementation of guidelines or an evidence-based intervention. When intervention activities are predefined, they become the starting point for logically identifying appropriate proximal and distal outcomes. When intervention activities are not predefined, the program logic becomes an important part of identifying the intervention activities. This is done by starting from the outcomes and working backwards so that activities that could lead to the desired outcomes are identified (Reed et al., 2014 ; Saunders et al., 2005 ). In both cases, the program logic should be considered a hypothesis to be continuously tested and revised throughout implementation.
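As a purely illustrative aid (our addition, not part of the original paper), the sketch below shows one way a project team could record a program logic in structured form, keeping each intervention activity explicitly linked to its assumed mechanism and to proximal and distal outcomes so the hypothesis can be revisited during implementation. The example entry borrows the mailmen-and-mobile-phones scenario the authors use under Principle 5; the field names are hypothetical.

```python
# Hypothetical, illustrative structure for recording a program logic:
# each activity is linked to its assumed mechanism and expected outcomes.
from dataclasses import dataclass


@dataclass
class LogicLink:
    activity: str                 # what will be done
    mechanism: str                # why it is expected to work (theory of change)
    proximal_outcomes: list       # immediate / short-term outcomes to measure
    distal_outcomes: list         # long-term outcomes to measure
    status: str = "hypothesis"    # revised as evidence accumulates during implementation


program_logic = [
    LogicLink(
        activity="Provide mailmen with mobile phones and shared contact lists",
        mechanism="Easier contact while en route increases social support",
        proximal_outcomes=["calls between colleagues per week"],
        distal_outcomes=["perceived social support", "wellbeing scores"],
    ),
]

for link in program_logic:
    print(f"{link.activity} -> {link.mechanism} -> {link.proximal_outcomes} + {link.distal_outcomes}")
```

Whether the logic is captured in code, a spreadsheet, or a logic-model diagram matters less than the fact that the links between activities, mechanisms, and outcomes are written down and treated as testable assumptions.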

Principle 5: Prioritize intervention activities based on effort-gain balance

Once the program logic has helped to explicate the goals of the intervention and the possible intervention activities, it may be necessary to prioritize between different activities. This principle suggests that the decision of which activities to prioritize should be based on an effort-gain balance analysis. This prioritization involves considering the anticipated impact of an intervention on the one hand and the expected effort needed to realize it on the other (Batalden & Stoltz, 1993; Kotter, 1996; Wilder et al., 2009). Prioritizing activities, therefore, entails striking a balance between the investment (in effort) that an organization is ready to commit to and the potential gains that can be achieved. Understanding this “return on investment” balance for each intervention activity can inform the prioritization between potential intervention activities and is therefore a calculation integral to the design phase (Cox et al., 2000).

The potential gains of candidate activities need to be identified in congruence with the goals of the intervention. Potential gains are often evident from the goals of the intervention and the alignment process (Principle 3), or can be illuminated by previous studies. For example, a gain might be improved social support through an intervention that provides mailmen with mobile phones so they can call each other when en route. Subsequently, the efforts needed to achieve these gains need to be considered. Efforts include the resources (time, money, emotional and cognitive effort) involved in bringing about the changes and mitigating the barriers to the design and implementation; for example, it is not only the financial cost of buying mobile phones, but also ensuring that the mailmen have each other’s phone numbers and have the skills to use the phones (i.e., implementation efforts).

Prioritizing and conducting effort-gain analyses is not straightforward. Limited prior experience with implementation or the lack of an organizational learning culture will require additional effort (Kaplan et al., 2011). The advantage of effort-gain balance analyses is that they help prioritize some activities (low effort-high gain) over others (high effort-low gain). Activities that are low effort-low gain may, however, at times be a feasible starting point from a motivational perspective, because they can build momentum by showing immediate, albeit limited, results (Cox et al., 2002). High effort-high gain activities may be prioritized when they offer a solution to a central problem, as well as when there is a relative match between the level of complexity of the problem and the solution (Storkholm et al., 2019). Such activities may also be postponed and implemented later, when the organizational members have further developed their capability to manage change. Overall, using the knowledge of various stakeholders (Principle 1) is vital for ensuring a balanced understanding of efforts and gains.
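
As a minimal illustration of how an effort-gain balance analysis might be operationalized, the sketch below scores a handful of hypothetical activities on gain and effort (an assumed 1–5 scale, not taken from the cited work) and sorts them into the categories discussed above.

```python
# A minimal sketch of effort-gain prioritization; scores and activities are
# invented for illustration (1 = low, 5 = high).
candidate_activities = [
    {"name": "Provide mailmen with mobile phones", "gain": 4, "effort": 2},
    {"name": "Redesign the entire routing system", "gain": 5, "effort": 5},
    {"name": "Publish a weekly newsletter",        "gain": 1, "effort": 1},
]

def classify(activity):
    """Map an activity to one of the four effort-gain quadrants."""
    gain_high = activity["gain"] >= 3
    effort_high = activity["effort"] >= 3
    if gain_high and not effort_high:
        return "prioritize (low effort, high gain)"
    if gain_high and effort_high:
        return "consider if it addresses a central problem (high effort, high gain)"
    if not gain_high and not effort_high:
        return "possible quick win to build momentum (low effort, low gain)"
    return "deprioritize (high effort, low gain)"

# Rank by a simple gain-to-effort ratio, then print each classification.
for activity in sorted(candidate_activities,
                       key=lambda a: a["gain"] / a["effort"], reverse=True):
    print(f'{activity["name"]}: {classify(activity)}')
```

In practice, the scores would come from the stakeholders referred to in Principle 1 rather than from a single analyst, and the quadrant boundaries are a judgment call rather than a fixed rule.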

Principle 6: Work with existing practices, processes, and mindsets

During the design, implementation, and evaluation of an organizational intervention, piggybacking on what is already known, already in place, and already done, can help integrate the intervention with the organizational practices, processes, and individual mindsets (Sørensen & Holman, 2014 ; von Thiele Schwarz et al., 2017 ). Thus, this principle addresses both organizational (practices and processes) and individual factors. Following this principle involves making the intervention fit with organizational logics and daily work. This fit may reduce the risk of conflict with existing organizational procedures, practices, and mindsets (Storkholm et al., 2017 ) and facilitate stakeholder engagement (Bauer & Jenny, 2012 ; Nielsen & Abildgaard, 2013 ).

This principle is particularly important when planning the implementation because the creation of separate implementation structures is costly, hinders synergies, and prevents the intervention activities from becoming an integrated part of everyday work (von Thiele Schwarz et al., 2015 ). New structures are easily abandoned once the project is over, which hinders sustainability (Ipsen et al., 2015 ).

The principle draws on developments in work and organizational psychology (e.g., Stenfors Hayes et al., 2014; Zwetsloot, 1995), which in turn build on the integrated management system movement in quality management (Jørgensen et al., 2006; Wilkinson & Dale, 1999, 2002). As an alternative to the conventional practice of trying to minimize contextual influence, this principle is an example of how the interrelatedness between an intervention and its context should be embraced. For example, if an organization already has a process for working with quality improvement, it may be possible to extend it to include implementation of the intervention activities (von Thiele Schwarz et al., 2017). Other examples of working with existing practices include using groups, meetings, and communication pathways that are already in place, rather than creating new practices (Malchaire, 2004).

Nevertheless, it is not always feasible to follow this principle. For example, it is not applicable when the existing processes are part of the problem. That may be the case when the content of the intervention calls for changes of the system, rather than within the system. The implication is that this principle calls for the same careful consideration as when balancing quality improvement, i.e., improvement within a system, and innovation, which challenges the system by breaking new ground (March, 1991; Palm et al., 2014). Thus, it is vital to acknowledge that it may very well be the existing practices, processes, and mindsets that are the root causes of the problems, which in turn makes changing them a core intervention objective. What we are proposing is that the effort involved in breaking new ground, such as challenging existing practices, processes, and mindsets, should never be underestimated. Thus, challenging them should be done with intention, not by accident.

Principle 7. Iteratively observe, reflect, and adapt

Based on the premise that organizational interventions are complex, researchers and organizations need to iteratively observe, reflect, and (frequently) make adaptations to the planned intervention, implementation, or context as the intervention unfolds. This principle calls for ongoing monitoring and evaluation of the intervention progress, as well as the use of that information to improve the intervention content to ensure goal attainment. It also calls for increased attention to factors related to the change process, for example, the frequency of use of problem-solving tools in an intervention, in contrast to only focusing on the intervention’s outcomes.

The principle contrasts with traditional ways of designing, implementing, and evaluating organizational interventions as if they were episodic changes in a static system, with a clearly delineated beginning and end (Nielsen et al., 2010 ). It builds upon a shift from focusing solely on solving specific problems without questioning the solution (i.e., the intervention) (single-loop learning) to focusing on double-loop learning, which allows the solution, process, and goal to be questioned and modified based on continual monitoring and evaluation (Argyris & Schön, 1996 ).

The ability of interventions to achieve intended outcomes is mediated by a number of factors related to the interactions between content (intervention activities), process, and context (Pettigrew & Whipp, 1993). Thus, although the program logic (Principle 4) provides a hypothesized model of how the intervention may play out, it remains a hypothesis: how events actually unfold cannot be fully anticipated beforehand, particularly in interrelated systems where changes in one part of the system can have unintended consequences in another. Therefore, interventions can seldom be fixed and implemented exactly as planned (Chambers et al., 2013). This principle calls for careful attention to what the core components of the intervention are, so that their evolution can be continually monitored, evaluated, and adapted to achieve the intended outcomes. This achievement is, after all, what matters for organizations; they care less about whether the intervention is implemented exactly as planned, as long as it works.

Data and analysis are key to ensuring rigour in the process of observing, refining, and adapting an intervention (Storkholm et al., 2019). We suggest iterative cycles in which data concerning the intervention, implementation, context, and outcomes are monitored and used to inform potential adaptations (e.g., Shewhart’s cycle of plan-do-study-act) (Taylor et al., 2014). As a result, organizations and researchers would benefit from a systematic approach to evaluating progress using pragmatic scientific principles (Savage et al., 2018). To ensure this is done rigorously, we suggest: 1) using formal and informal methods (surveys, interviews, observations, documents, conversations) to collect data; 2) minding the time lags derived from the program logic; 3) using the information to adapt the design or implementation of the intervention to the context; 4) conducting small-scale rapid tests of activities and increasing the scale as data accumulate; 5) identifying new systemic challenges that may require the focus of the intervention activities to be revisited; and 6) considering how intervention activities may adversely affect other parts of the system. This ensures a dynamic approach to change. It also positions evaluation as an ongoing process, managed locally by the organization, rather than the domain of the researcher after design and implementation (von Thiele Schwarz et al., 2016; Woodcock et al., 2020).
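
The sketch below illustrates, in a deliberately simplified form, how monitoring data might drive adaptation across iterative plan-do-study-act cycles. The process indicator (uptake of a problem-solving tool) and the adaptation rule are hypothetical and stand in for the formal and informal data sources listed above.

```python
# A minimal sketch of iterative observe-reflect-adapt cycles (plan-do-study-act).
# The data source and adaptation rule are hypothetical placeholders.

def collect_process_data(cycle: int) -> dict:
    """Stand-in for surveys, interviews, observations, and conversations."""
    simulated_uptake = [0.20, 0.35, 0.30, 0.60]  # made-up share of teams using the tool
    return {"tool_uptake": simulated_uptake[cycle]}

def adapt(plan: dict, data: dict) -> dict:
    """Example adaptation rule: if uptake stalls, add facilitation support."""
    if data["tool_uptake"] < 0.5:
        plan["facilitation_support"] = True
    return plan

plan = {"activity": "weekly problem-solving meetings", "facilitation_support": False}
for cycle in range(4):                        # Plan
    data = collect_process_data(cycle)        # Do + Study: monitor process indicators
    plan = adapt(plan, data)                  # Act: adapt design/implementation to context
    print(f"cycle {cycle + 1}: uptake={data['tool_uptake']:.0%}, plan={plan}")
```

The point is not the specific rule but the loop: each cycle produces data, the data are interpreted against the program logic, and the plan for the next cycle is adjusted accordingly.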

Principle 8. Develop organizational learning capabilities

This principle broadens the scope of researching organizational interventions from a narrow focus on specific study objectives to a broader commitment to facilitate a learning capability within the organization. Building a learning capability ensures that lessons are harvested within the organization to support future change efforts. This includes lessons from designing, implementing, and evaluating an intervention, as well as the tools, infrastructures, and practices developed. Organizational interventions tend to become finite projects which are not sustained over time even though they are often costly (Bernerth et al., 2011 ). Therefore, researchers involved in organizational interventions need to ensure that individual and organizational benefits are optimized. This principle also highlights the potential added value for organizations collaborating with researchers by ensuring that at least some of the know-how stays in the organization when the researchers leave. For example, it may involve engaging Human Resources staff or internal consultants to deliver intervention components rather than using external experts, or adding components that facilitate transfer of intervention-specific learning to other situations.

This principle is rooted in the disciplines of organizational learning, organizational behaviour, pragmatism, and systems theory, as well as in management concepts such as lean management. Developing a learning capability is essential to an organization’s ability to address future challenges and continually learn from change processes (Nielsen & Abildgaard, 2013). This principle builds on the double-loop learning of Principle 7 and expands it to include triple-loop learning (i.e., the organization becomes aware of the processes and structures needed to improve how learning is constructed, captured, disseminated, and incorporated) (McNicholas et al., 2019; Visser, 2007).

Principle 9: Evaluate the interaction between intervention, process, and context

If organizational interventions are to be conducted as outlined in the previous principles, this has implications for evaluation, both in terms of evaluation design and analytic approaches. Conceptually, this principle calls for a move away from answering research questions concerning whether an intervention works (isolated from context) to focusing on for whom, when, and why it works, and how it can be improved, building on theories and practice in change management, evaluation science, and organizational science (Pawson, 2013; Pettigrew & Whipp, 1993). By applying this principle, the evaluation sheds light on how a change was brought about: how the intervention interacted with the context (including participants and structures), and how this enabled certain mechanisms to trigger intended outcomes (Pawson, 2013). It contributes to theory building as well as to answering the type of questions practitioners ask.

The evaluation needs to capture the iterative changes to the intervention outlined in Principle 7, as well as the reasons for those changes and the impact they have on outcomes. Yet, in order to meet the objective of contributing both to science and practice, this needs to be done in a way that allows causal inferences as well as accumulation of data across cases. Process evaluation is an important first step in this, addressing research questions such as whether employee participation, leadership support, or facilitation explains variation in the outcomes of an intervention (Biron & Karanika-Murray, 2013). It also calls for research designs beyond pre- and post-measurement, e.g., stepped-wedge designs, propensity scores, and regression discontinuity (Schelvis et al., 2015).

Realist evaluation is another example of how some of the complexities of organizational interventions can be captured (Pawson & Tilley, 1997). It allows hypothesized configurations derived from a program logic (Principle 4) to be tested. For example, using multi-group structural equation modelling, one study tested whether the impact of a new participatory and structured problem-solving approach (kaizen) on employee wellbeing was explained by whether the kaizen work also included an occupational health perspective, and showed that this was indeed the case (von Thiele Schwarz et al., 2017).

There may be a need to move beyond traditional variable-oriented methods and case studies. One example is statistical process control charts, which build on rigorous time-series analyses to detect whether an outcome changes over time, over and above the expected natural variation, in a pattern that can be attributed to “special causes” – including intervention activities (Benneyan et al., 2003; Shewhart, 1930). This analysis allows research questions to be tested, for example, whether graphical feedback can have a positive impact on hospital infection trends, or whether variation in performance can be reduced by eliminating waste in the work process (Thor et al., 2007).
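
As an illustration of the kind of analysis a statistical process control chart builds on, the sketch below computes control limits for an individuals (XmR) chart from an invented series of weekly infection counts and flags any point outside the limits as potential special-cause variation. The 2.66 multiplier on the average moving range is the standard constant for this chart type; the data and the choice of chart are assumptions for the sake of the example.

```python
# A minimal sketch of an individuals (XmR) control chart; the weekly counts
# are invented for illustration.
weekly_infection_counts = [12, 10, 11, 13, 9, 12, 11, 10, 8, 7, 6, 7, 6, 5, 6]

centre = sum(weekly_infection_counts) / len(weekly_infection_counts)
moving_ranges = [abs(b - a) for a, b in zip(weekly_infection_counts,
                                            weekly_infection_counts[1:])]
average_moving_range = sum(moving_ranges) / len(moving_ranges)

# Standard XmR limits: centre line +/- 2.66 * average moving range (~3 sigma).
upper_limit = centre + 2.66 * average_moving_range
lower_limit = centre - 2.66 * average_moving_range

for week, count in enumerate(weekly_infection_counts, start=1):
    flag = "special cause?" if count > upper_limit or count < lower_limit else ""
    print(f"week {week:2d}: {count:2d} {flag}")
print(f"centre line = {centre:.1f}, control limits = [{lower_limit:.1f}, {upper_limit:.1f}]")
```

Run rules beyond the simple three-sigma check (for example, a sustained run of points below the centre line, as in the downward drift at the end of this series) are what such analyses typically use to attribute a shift to intervention activities rather than chance.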

A third example is configurational comparative methodology (Thiem, 2017). It is a statistical approach from the person-/case-oriented family, rather than from the variable-oriented family that most evaluations of organizational interventions rely on. Coincidence analysis, one such method, allows assessment of multiple pathways to the same outcome. For example, one study showed that in order to have high intention to change, a positive attitude among staff was always needed, whereas behavioural control was only important under some circumstances (Straatmann et al., 2018). These three examples are very different, yet they all combine sensitivity to what works for whom and in which circumstances with scientific rigour (Pawson, 2013; Pawson & Tilley, 1997).

Principle 10: Transfer knowledge beyond the specific organization

A cornerstone of organizational research, and what sets it apart from consulting, is the ambition not only to induce change in a single setting but to transfer knowledge from the specific to the general by cumulating learning that can be generalized, abstracted into theory, disseminated, and scaled up. Dissemination and scaling up of organizational interventions are different from evaluations of interventions that aim to draw generalizable conclusions about the effects of an intervention and where knowledge is accumulated through replication of (the same) intervention. Accumulation through replication requires interventions to be fixed over time and isolated from the context of application. When organizational interventions are approached as outlined in these principles, accumulation through replication is not feasible because the intervention is integrated into, and interacts with, specific organizational contexts and changes over time through ongoing adaptations. This principle builds on the assumption that interventions seldom have an effect independent of the context in which they are used (Semmer, 2006).

In these cases, scalability cannot focus on statistical generalization and the accumulation of knowledge through the identification of specific interventions independent of context. Knowledge needs to be developed in other ways. This principle outlines that generalization, dissemination, and scalability should rely on analytical generalization, drawing from case study research (Flyvbjerg, 2006; Yin, 2013). This includes addressing research questions such as “What is required to reproduce the impact of the intervention in different settings?” Therefore, we encourage striving for accumulating knowledge, that is, the gradual refinement of the understanding of phenomena, by focusing on the aspects included in the principles, including how various factors interact to produce an outcome, and comparing and contrasting these across studies (Pawson & Tilley, 1997). This accumulation can, for example, be done using methodologies for literature reviews such as qualitative metasyntheses (Docherty & Emden, 1997; Nielsen & Miraglia, 2017). Thus, rather than striving for generalizability, this principle suggests striving for specificity, a gradual increase in the precision of the knowledge of what works for whom and when (Pawson, 2013).

Discussion

This article set out to propose principles for how to design, implement, and evaluate organizational interventions based on expertise from multiple disciplines, offering suggestions for how organizational interventions can be researched in a way that makes the end result both scientifically rigorous and practically applicable. We propose a way to move the field of organizational interventions forward. Using a Mode 2 knowledge production approach, we draw on our expertise and the literature from multiple disciplines to offer principles for further empirical testing and development.

In this paper, the principles are presented as discrete entities and in a linear fashion. This is a necessary simplification of a complex process for presentational purposes. The principles may overlap, their order is not self-evident, and they are interrelated. Further research is needed into the contribution of each individual principle, their timing, and the interrelatedness between principles; we hope this paper sparks an interest to advance this agenda.

Viewed one by one, the principles are not unique. They reflect the available evidence and/or best practice in one or more research disciplines that are concerned with changes in organizations. Some of them, for example, Principle 1 (Ensure active participation and engagement among key stakeholders), rest on substantial empirical evidence and are common across many disciplines. Others, like Principle 4 (Explicate the program logic), represent established methodological practices in some disciplines (e.g., evaluation science), but not in many others. Because the principles originate from various disciplinary backgrounds, they are reflected in existing discipline-specific frameworks. For example, engagement of various stakeholders (Principle 1), understanding the situation (e.g., conducting a needs assessment) (Principle 2), and developing program logic models (Principle 4) are part of the Centers for Disease Control and Prevention (CDC) Framework for Program Evaluation in Public Health (Centers for Disease Control and Prevention, 1999). However, the CDC framework does not reflect the ambition to meet both research and organizational objectives, or the dynamic characteristics of organizational interventions. Another example is Brown and Gerhardt’s integrative practice model (Brown & Gerhardt, 2002). The model focuses on the design and formative evaluation of training programs and emphasizes the need for alignment both with strategy (Principle 3) and with work procedures (Principle 6), as well as iterative development (Principle 7) of training material. Yet, it does not discuss principles such as the use of program logics, choosing activities based on effort-gain balance, or going beyond the training of a specific skill to developing learning capabilities (Brown & Gerhardt, 2002).

Thus, instead of claiming that each principle is unique on its own, we argue that the contribution lies in the convergence of principles across multiple disciplines, and in how they represent a common understanding across a group of experts from various disciplines and research fields, despite their differences in theoretical, empirical, and epistemological backgrounds. In this sense, the Sigtuna Principles represent common denominators for researching improvements in organizations that go beyond specific disciplines and may be one step towards a more general theory of organizational interventions.

Do all the principles need to be followed for organizational interventions to be successful? This is an empirical question that calls for further exploration. Our expectation is that the more principles are followed, the better. The degree to which it is feasible to do so is likely to differ between occasions and studies. For example, sometimes, the intervention is predefined, as when guidelines or new legislation is to be implemented, meaning that some principles are not applicable. The number of principles employed will also depend on the mandate that the researcher has in the organization. Sometimes the mandate is restricted to parts of the process, such as when the researcher is only involved in the design or the evaluation phases. This restriction, too, will affect which principles are applicable.

Nevertheless, when combined, these principles offer the potential for a transformative change in the way research into organizational interventions is conducted, in at least two interrelated ways. First, it changes the role of researchers and the relationship between researchers and the organization towards a partnership between equals, where both perspectives are needed to negotiate the inherent contradictions between the objectives. For researchers, adopting a dual perspective implies a change from considering the intervention in isolation, mainly judging the content based on theory or what would yield the highest effect sizes, to thinking about how the practical impact of the change can be maximized. Such an approach includes considering the intervention in relation to the constraints and possibilities of the context where the intervention is set, and the change process, and determining which activities would result in the best solution given that context (von Thiele Schwarz et al., 2019).

Second, the combination of Principles 1–9 on the one hand and Principle 10 on the other implies a change in the relationship between science and practice. This change involves moving from a one-way street from evidence to practice, where evidence is first established and then disseminated and implemented to have an impact, to a constructivist view of knowledge development, where the science base is gradually refined in interaction with practice (Greenhalgh & Wieringa, 2011). The principles encourage researchers to consider impact upstream, by asking how value for organizational stakeholders can be optimized throughout the design, implementation, and evaluation of the intervention, not just after the research is finished.

The change inherent in applying the principles is not easy, but neither is researching organizations without such considerations: there are whole books dedicated to the pitfalls involved (e.g., Karanika-Murray & Biron, 2015a). The reasons for derailment include factors related to the intervention itself (e.g., incorrect content), the context (e.g., concurrent organizational changes), and the process (e.g., conflicts and power struggles) (Karanika-Murray & Biron, 2015b), all well known to organizational researchers. These principles do not solve all of these challenges, but they encourage researchers to build relationships with organizational stakeholders so that they can be involved in troubleshooting and problem-solving around issues that might threaten to derail the change process – and the research.

The target group for this paper is researchers, yet it is not limited to this group. The principles encourage a way of working in partnership between research and practice, and therefore, the principles are relevant for practitioners, too. In fact, the principles may be of value to practitioners whether a researcher is involved or not. All but the last few principles are related to issues that are of common interest to both practitioners and researchers. The principles are also potentially applicable to other fields, given their development as part of an aspiration to find synergies across communities of practice in various research fields.

The development of the principles followed a structured process focused on harvesting learning from experts from various fields, which increases the trustworthiness of the result. However, they were developed by a relatively small group of people, and although many research fields were represented, not all fields relevant to organizational interventions were. There is still a risk that the principles do not reflect a broader understanding of the phenomena. A thorough validation process with other researchers and practitioners was employed to mitigate this risk.

Conclusions

This paper presents ten principles that could contribute to the transformation of how organizational interventions are researched, and thereby increase their potential real-world impact. We hope these principles spark interest in the entire intervention process, from design and implementation to evaluation, and foster a mutually beneficial relationship between the need for robust research and the flexibility needed to achieve change in practice.

Acknowledgments

The authors would like to acknowledge the input from practitioners and researchers participating in the workshops and validation sessions and in particular, thank Gregory A Aarons, Hanna Augustsson, Marit Christensen, Kevin Daniels, Christian Dyrlund Wåhlin-Jacobsen, Désirée Füllemann, Liv Gish, Peter Hagedorn-Rasmussen, Sara Ingvarsson, Sara Korlen, Robert Larsson, Andreas Hellström, Michael Munch-Hansen, Monika Oswaldsson, Rachael Potter, Signe Poulsen, Thim Prætorius, Raymond Randall, Henk van Rhee, Ole Henning Sørensen, Andreas Stenling, Christian Ståhl, Susanne Tafvelin, and Johan Thor.

Funding Statement

This work was supported by grants from the following agencies and grants. Funding for the meetings and the writing of this paper was generously provided by a network grant from the Joint Committee for Nordic research councils in the Humanities and Social Sciences (grant number 2016-00241/NOS-HS). In addition, UvTS was funded by the Swedish Research Council (2016-01261). JR was funded by the National Institute for Health Research (NIHR) under the Collaborations for Leadership in Applied Health Research and Care (CLAHRC) programme for North West London, and The Health Foundation. The views expressed in this publication are those of the authors and not necessarily those of the funders.

1. Owen ( 2008 ).

2. Priles ( 1993 ).

Disclosure statement

The authors declare no conflicts of interest.

  • Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors . Administration Policy in Mental Health and Mental Health Services Research , 38 ( 1 ), 4–23. 10.1007/s10488-010-0327-7 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Abildgaard, J. S., Hasson, H., von Thiele Schwarz, U., Løvseth, L. T., Ala-Laurinaho, A., & Nielsen, K. (2018). Forms of participation: The development and application of a conceptual model of participation in work environment interventions . Economic and Industrial Democracy , 41 (3), 746–769. 10.1177/0143831X17743576 [ CrossRef ] [ Google Scholar ]
  • Abildgaard, J. S., Nielsen, K., Wåhlin-Jacobsen, C. D., Maltesen, T., Christensen, K. B., & Holtermann, A. (2019). ‘Same, but different’–a mixed methods realist evaluation of a cluster-randomised controlled participatory organisational intervention . Human Relations . 10.1177/0018726719866896 [ CrossRef ] [ Google Scholar ]
  • Argyris, C., & Schön, D. A. (1996). Organizational learning II: Theory, method and practice . Addison-Wesley. [ Google Scholar ]
  • Bamberger, M., Tarsilla, M., & Hesse-Biber, S. (2016). Why so many “rigorous” evaluations fail to identify unintended consequences of development programs: How mixed methods can contribute . Evaluation and Program Planning , 55 , 155–162. 10.1016/j.evalprogplan.2016.01.001 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Batalden, P. B., & Davidoff, F. (2007). What is “quality improvement” and how can it transform healthcare? Quality and Safety in Health Care , 16 ( 1 ), 2–3. 10.1136/qshc.2006.022046 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Batalden, P. B., & Stoltz, P. K. (1993). A framework for the continual improvement of health care: Building and applying professional and improvement knowledge to test changes in daily work . The Joint Commission Journal on Quality Improvement , 19 ( 10 ), 424–447. 10.1016/S1070-3241(16)30025-6 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Bauer, G. F., & Jenny, G. J. (2012). Moving towards positive organisational health: Challenges and a proposal for a research model of organisational health development . In J. Houdmont, S. Leka & R. R. Sinclair (Eds.), Contemporary occupational health psychology: Global perspectives on research and practice (Vol. 2, pp. 126–145). Wiley-Blackwell. [ Google Scholar ]
  • Benneyan, J., Lloyd, R., & Plsek, P. (2003). Statistical process control as a tool for research and healthcare improvement . Quality and Safety in Health Care , 12 ( 6 ), 458–464. 10.1136/qhc.12.6.458 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Bernerth, J. B., Walker, H. J., & Harris, S. G. (2011). Change fatigue: Development and initial validation of a new measure . Work & Stress , 25 ( 4 ), 321–337. 10.1080/02678373.2011.634280 [ CrossRef ] [ Google Scholar ]
  • Biron, C., & Karanika-Murray, M. (2013). Process evaluation for organizational stress and well-being interventions: Implications for theory, method, and practice . International Journal of Stress Management, 21 (1), 85–111.  https://doi.org/10.1037/a0033227 . [ Google Scholar ]
  • Brown, K. G., & Gerhardt, M. W. (2002). Formative evaluation: An integrative practice model and case study . Personnel Psychology , 55 ( 4 ), 951–983. 10.1111/j.1744-6570.2002.tb00137.x [ CrossRef ] [ Google Scholar ]
  • Brydon-Miller, M., Greenwood, D., & Maguire, P. (2003). Why action research? Action Research , 1 ( 1 ), 9–28. 10.1177/14767503030011002 [ CrossRef ] [ Google Scholar ]
  • Centers for Disease Control and Prevention . (1999). Framework for program evaluation in public health . MMWR 1999; 48 ( No. RR-11 ). https://www.cdc.gov/mmwr/PDF/rr/rr4811.pdf [ PubMed ]
  • Chambers, D., Glasgow, R., & Stange, K. (2013). The dynamic sustainability framework: Addressing the paradox of sustainment amid ongoing change . Implementation Science , 8 ( 1 ), 117. 10.1186/1748-5908-8-117 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Cox, T., Griffiths, A., & Randall, R. (2002). Interventions to control stress at work in hospital staff . HSE Contract Research Report. [ Google Scholar ]
  • Cox, T., Griffiths, A., & Rial-González, E. (2000). Research on work-related stress . Office for Official Publications of the European Communities. [ Google Scholar ]
  • Docherty, S. M., & Emden, C. (1997). Qualitative metasynthesis: Issues and techniques . Research in Nursing & Health , 20 ( 4 ), 365–371. 10.1002/(SICI)1098-240X(199708)20:4<365::AID-NUR9>3.0.CO;2-E [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Edwards, K., & Jensen, P. L. (2014). Design of systems for productivity and well being . Applied Ergonomics , 45 ( 1 ), 26–32. 10.1016/j.apergo.2013.03.022 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Fishbein, M., & Yzer, M. C. (2003). Using theory to design effective health behavior interventions . Communication Theory , 13 ( 2 ), 164–183. 10.1111/j.1468-2885.2003.tb00287.x [ CrossRef ] [ Google Scholar ]
  • Flyvbjerg, B. (2006). Five misunderstandings about case-study research . Qualitative Inquiry , 12 ( 2 ), 219–245. 10.1177/1077800405284363 [ CrossRef ] [ Google Scholar ]
  • Framke, E., & Sørensen, O. H. (2015). Implementation of a participatory organisational-level occupational health intervention-focusing on the primary task . International Journal of Human Factors and Ergonomics , 3 ( 3–4 ), 254–270. 10.1504/IJHFE.2015.072998 [ CrossRef ] [ Google Scholar ]
  • Frykman, M., Hasson, H., Atlin Muntlin, Å., & von Thiele Schwarz, U. (2014). Functions of behavior change interventions when implementing teamwork at an emergency department: A comparative case study . BMC Health Services Research , 14 ( 1 ), 218. 10.1186/1472-6963-14-218 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Gibbons, M., Limoges, C., Nowotony, H., Schwartzman, S., Scott, P., & Trow, M. (1994). The new production of knowledge: The dynamics of science and research in contemporary societies . Sage Publications. [ Google Scholar ]
  • Greenhalgh, T., & Wieringa, S. (2011). Is it time to drop the ‘knowledge translation’metaphor? A critical literature review . Journal of the Royal Society of Medicine , 104 ( 12 ), 501–509. 10.1258/jrsm.2011.110285 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Griffiths, A. (1999). Organizational interventions: Facing the limits of the natural science paradigm . Scandinavian Journal of Work, Environment & Health , 25 (6), 589–596. 10.5271/sjweh.485 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Gustavsen, B. (2011). The Nordic model of work organization . Journal of the Knowledge Economy , 2 ( 4 ), 463–480. 10.1007/s13132-011-0064-5 [ CrossRef ] [ Google Scholar ]
  • Hasson, H., Lornudd, C., von Thiele Schwarz, U., & Richter, A. (2018). Supporting Interventions: Enabling senior management to enhance the effectiveness of a training program for line managers. In Nielsen K. & Noblet A. (Eds.), Organizational interventions for health and well-being: A handbook for evidence-based practice (pp. 220–236). Routledge. [ Google Scholar ]
  • Hasson, H., Villaume, K., von Thiele Schwarz, U., & Palm, K. (2014). Managing implementation: Roles of line managers, senior managers, and human resource professionals in an occupational health intervention . Journal of Occupational and Environmental Medicine , 56 ( 1 ), 58–65. 10.1097/JOM.0000000000000020 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Hasson, H., von Thiele Schwarz, U., Nielsen, K., & Tafvelin, S. (2016). Are we all in the same boat? The role of perceptual distance in organizational health interventions . Stress & Health , 32 ( 4 ), 294–303. 10.1002/smi.2703 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Havermans, B. M., Schelvis, R. M., Boot, C. R., Brouwers, E. P., Anema, J. R., & van der Beek, A. J. (2016). Process variables in organizational stress management intervention evaluation research: A systematic review . Scandinavian Journal of Work, Environment & Health , 42 ( 5 ), 371–381. 10.5271/sjweh.3570 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Holman, D., & Axtell, C. (2016). Can job redesign interventions influence a broad range of employee outcomes by changing multiple job characteristics? A quasi-experimental study . Journal of Occupational Health Psychology , 21 ( 3 ), 284 doi: 10.1037/a0039962 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Ipsen, C., Gish, L., & Poulsen, S. (2015). Organizational-level interventions in small and medium-sized enterprises: Enabling and inhibiting factors in the PoWRS program . Safety Science , 71 , 264–274. 10.1016/j.ssci.2014.07.017 [ CrossRef ] [ Google Scholar ]
  • Ipsen, C., & Jensen, P. L. (2012). Organizational options for preventing work-related stress in knowledge work . International Journal of Industrial Ergonomics , 42 ( 4 ), 325–334. 10.1016/j.ergon.2012.02.006 [ CrossRef ] [ Google Scholar ]
  • Jørgensen, T. H., Remmen, A., & Mellado, M. D. (2006). Integrated management systems–three different levels of integration . Journal of Cleaner Production , 14 ( 8 ), 713–722. 10.1016/j.jclepro.2005.04.005 [ CrossRef ] [ Google Scholar ]
  • Kaplan, H. C., Provost, L. P., Froehle, C. M., & Margolis, P. A. (2011). The model for understanding success in quality (MUSIQ): Building a theory of context in healthcare quality improvement . BMJ Quality & Safety , 21 (1), 13–20. https://qualitysafety.bmj.com/content/21/1/13 [ PubMed ] [ Google Scholar ]
  • Karanika-Murray, M., & Biron, C. (2015b). Why do some interventions derail? Deconstructing the elements of organizational interventions for stress and well-being. In Karanika-Murray M. & Biron C. (Eds.), Derailed organizational interventions for stress and well-being (pp. 1–15) . Springer. doi: 10.1007/978-94-017-9867-9_1 [ CrossRef ] [ Google Scholar ]
  • Karanika-Murray, M., & Biron, C. (2015a). Derailed organizational interventions for stress and well-being . Springer. [ Google Scholar ]
  • Kompier, M., & Aust, B. (2016). Organizational stress management interventions: Is it the singer not the song? Scandinavian Journal of Work, Environment & Health , 42 ( 5 ), 355–358. 10.5271/sjweh.3578 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Kotter, J. P. (1996). Leading Change . Harvard Business School Press. [ Google Scholar ]
  • Kristensen, T. (2005). Intervention studies in occupational epidemiology . Occupational and Environmental Medicine , 62 ( 3 ), 205–210. 10.1136/oem.2004.016097 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Lennox, L., Maher, L., & Reed, J. (2018). Navigating the sustainability landscape: A systematic review of sustainability approaches in healthcare . Implementation Science , 13 ( 1 ), 27. 10.1186/s13012-017-0707-4 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Leviton, L. C., Khan, L. K., Rog, D., Dawkins, N., & Cotton, D. (2010). Evaluability assessment to improve public health policies, programs, and practices . Annual Review of Public Health , 31 ( 1 ), 213–233. 10.1146/annurev.publhealth.012809.103625 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • MacLean, D., MacIntosh, R., & Grant, S. (2002). Mode 2 management research . British Journal of Management , 13 ( 3 ), 189–207. 10.1111/1467-8551.00237 [ CrossRef ] [ Google Scholar ]
  • Malchaire, J. (2004). The SOBANE risk management strategy and the Déparis method for the participatory screening of the risks . International Archives of Occupational and Environmental Health , 77 ( 6 ), 443–450. 10.1007/s00420-004-0524-3 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Malinen, S., Hatton, T., Naswall, K., & Kuntz, J. (2019). Strategies to enhance employee well‐being and organisational performance in a postcrisis environment: A case study . Journal of Contingencies and Crisis Management , 27 ( 1 ), 79–86. 10.1111/1468-5973.12227 [ CrossRef ] [ Google Scholar ]
  • March, J. G. (1991). Exploration and exploitation in organizational learning . Organization Science , 2 ( 1 ), 71–87. 10.1287/orsc.2.1.71 [ CrossRef ] [ Google Scholar ]
  • McNicholas, C., Lennox, L., Woodcock, T., Bell, D., & Reed, J. E. (2019). Evolving quality improvement support strategies to improve plan–do–study–act cycle fidelity: A retrospective mixed-methods study . BMJ Quality & Safety , 28 ( 5 ), 356–365. 10.1136/bmjqs-2017-007605 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Montano, D., Hoven, H., & Siegrist, J. (2014). Effects of organisational-level interventions at work on employees’ health: A systematic review . BMC Public Health , 14 ( 1 ), 135. 10.1186/1471-2458-14-135 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Nielsen, K. (2013). Review article: How can we make organizational interventions work? Employees and line managers as actively crafting interventions . Human Relations , 66 ( 8 ), 1029–1050. 10.1177/0018726713477164 [ CrossRef ] [ Google Scholar ]
  • Nielsen, K. (2017). Organizational occupational health interventions: What works for whom in which circumstances? Occupational Medicine , 67 (6), 410–412 . 10.1093/occmed/kqx058 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Nielsen, K., & Abildgaard, J. S. (2013). Organizational interventions: A research-based framework for the evaluation of both process and effects . Work & Stress , 27 ( 3 ), 278–297. 10.1080/02678373.2013.812358 [ CrossRef ] [ Google Scholar ]
  • Nielsen, K., & Miraglia, M. (2017). What works for whom in which circumstances? On the need to move beyond the ‘what works?’ question in organizational intervention research . Human Relations , 70 ( 1 ), 40–62. 10.1177/0018726716670226 [ CrossRef ] [ Google Scholar ]
  • Nielsen, K., Nielsen, M. B., Ogbonnaya, C., Känsälä, M., Saari, E., & Isaksson, K. (2017). Workplace resources to improve both employee well-being and performance: A systematic review and meta-analysis . Work & Stress , 31 ( 2 ), 101–120. 10.1080/02678373.2017.1304463 [ CrossRef ] [ Google Scholar ]
  • Nielsen, K., & Randall, R. (2013). Opening the black box: Presenting a model for evaluating organizational-level interventions . European Journal of Work and Organizational Psychology , 22 ( 5 ), 601–617. 10.1080/1359432X.2012.690556 [ CrossRef ] [ Google Scholar ]
  • Nielsen, K., Randall, R., Holten, A. L., & González, E. R. (2010). Conducting organizational-level occupational health interventions: What works?. Work & Stress , 24 ( 3 ), 234–259. doi: 10.1080/02678373.2010.515393 [ CrossRef ] [ Google Scholar ]
  • Owen, H. (2008). Open space technology: A user’s guide . Berrett-Koehler Publishers. [ Google Scholar ]
  • Palm, K., Lilja, J., & Wiklund, H. (2014). The challenge of integrating innovation and quality management practice . Total Quality Management and Business Excellence , 27 (1–2), 34–47. [ Google Scholar ]
  • Pawson, R. (2013). The science of evaluation: A realist manifesto . Sage. [ Google Scholar ]
  • Pawson, R., & Tilley, N. (1997). Realistic evaluation . Sage. [ Google Scholar ]
  • Pettigrew, A., & Whipp, R. (1993). Managing change for competitive success . Wiley-Blackwell. [ Google Scholar ]
  • Priles, M. A. (1993). The fishbowl discussion: A strategy for large honors classes . English Journal , 82 ( 6 ), 49. 10.2307/820165 [ CrossRef ] [ Google Scholar ]
  • Reed, J. E., Howe, C., Doyle, C., & Bell, D . (2018). Simple rules for evidence translation in complex systems: a qualitative study . BMC medicine, , 16 ( 1 ), 92. https://doi.org/10.1186/s12916-018-1076-9 [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Reed, J. E., McNicholas, C., Woodcock, T., Issen, L., & Bell, D. (2014). Designing quality improvement initiatives: The action effect method, a structured approach to identifying and articulating programme theory . BMJ Quality & Safety , 23 ( 12 ), 1040–1048. 10.1136/bmjqs-2014-003103 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Rogers, P. J. (2008). Using programme theory to evaluate complicated and complex aspects of interventions . Evaluation , 14 ( 1 ), 29–48. 10.1177/1356389007084674 [ CrossRef ] [ Google Scholar ]
  • Rosskam, E. (2009). Using participatory action research methodology to improve worker health. In Schnall P., Dobson M., & Rosskam E. (Eds.), Unhealthy work: Causes, consequences, cures (pp. 211–229). Baywood Publishing Company. [ Google Scholar ]
  • Saunders, R. P., Evans, M. H., & Joshi, P. (2005). Developing a process-evaluation plan for assessing health promotion program implementation: A how-to guide . Health Promotion Practice , 6 ( 2 ), 134–147. 10.1177/1524839904273387 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Savage, M., Storkholm, M. H., Mazzocato, P., & Savage, C. (2018). Effective physician leaders: An appreciative inquiry into their qualities, capabilities and learning approaches . BMJ Leader , 2 ( 3 ), 95–102. 10.1136/leader-2017-000050 [ CrossRef ] [ Google Scholar ]
  • Schelvis, R. M., Hengel, K. M. O., Burdorf, A., Blatter, B. M., Strijk, J. E., & van der Beek, A. J. (2015). Evaluation of occupational health interventions using a randomized controlled trial: Challenges and alternative research designs . Scandinavian Journal of Work, Environment & Health , 41 (5), 491–503. 10.5271/sjweh.3505 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Semmer, N. K. (2006). Job stress interventions and the organization of work . Scandinavian Journal of Work, Environment & Health , 32 ( 6 ), 515–527. 10.5271/sjweh.1056 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Senge, P. (2006). The fifth discipline: The art and practice of the learning organization . Doubleday Currency. [ Google Scholar ]
  • Shewhart, W. A. (1930). Economic quality control of manufactured product 1 . Bell System Technical Journal , 9 ( 2 ), 364–389. 10.1002/j.1538-7305.1930.tb00373.x [ CrossRef ] [ Google Scholar ]
  • Sørensen, O. H., & Holman, D. (2014). A participative intervention to improve employee well-being in knowledge work jobs: A mixed-methods evaluation study . Work & Stress , 28 ( 1 ), 67–86. 10.1080/02678373.2013.876124 [ CrossRef ] [ Google Scholar ]
  • Stenfors Hayes, T., Hasson, H., Augustsson, H., Hvitfeldt Forsberg, H., & von Thiele Schwarz, U. (2014). Merging occupational health, safety and health promotion with Lean: An integrated systems approach (the LeanHealth project). In Burke R., Cooper C., & Biron C. (Eds.), Creating healthy workplaces: Stress reduction, improved well-being, and organizational effectiveness , (pp. 281–299). Gower Publishing. [ Google Scholar ]
  • Storkholm, M. H., Mazzocato, P., & Savage, C. (2019). Make it complicated: A qualitative study utilizing a complexity framework to explain improvement in health care . BMC Health Services Research , 19 ( 1 ), 842. 10.1186/s12913-019-4705-x [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Storkholm, M. H., Mazzocato, P., Savage, M., & Savage, C. (2017). Money’s (not) on my mind: A qualitative study of how staff and managers understand health care’s triple aim . BMC Health Services Research , 17 ( 1 ), 98. 10.1186/s12913-017-2052-3 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Storkholm, M. H., Savage, C., Tessma, M. K., Salvig, J. D., & Mazzocato, P. (2019). Ready for the triple aim? Perspectives on organizational readiness for implementing change from a Danish obstetrics and gynecology department . BMC Health Services Research , 19 ( 1 ), 517. 10.1186/s12913-019-4319-3 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Straatmann, T., Rothenhöfer, L. M., Meier, A., & Mueller, K. (2018). A configurational perspective on the theory of planned behaviour to understand employees’ change‐supportive intentions . Applied Psychology , 67 ( 1 ), 91–135. 10.1111/apps.12120 [ CrossRef ] [ Google Scholar ]
  • Tafvelin, S., Nielsen, K., Abildgaard, J. S., Richter, A., von Thiele Schwarz, U., & Hasson, H. (2019). Leader-team perceptual distance affects outcomes of leadership training: Examining safety leadership and follower safety self-efficacy . Safety Science , 120 (December) , 25–31. 10.1016/j.ssci.2019.06.019 [ CrossRef ] [ Google Scholar ]
  • Tafvelin, S., von Thiele Schwarz, U., Nielsen, K., & Hasson, H. (2019). Employees’ and line managers’ active involvement in participatory organizational interventions: Examining direct, reversed, and reciprocal effects on well-being . Stress and Health, 35 (1), 69–80 . 10.1002/smi.2841 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Taylor, M. J., McNicholas, C., Nicolay, C., Darzi, A., Bell, D., & Reed, J. E. (2014). Systematic review of the application of the plan–do–study–act method to improve quality in healthcare . BMJ Quality & Safety, 23, 290–298. https://qualitysafety.bmj.com/content/23/4/290.short [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Thiem, A. (2017). Conducting configurational comparative research with qualitative comparative analysis: A hands-on tutorial for applied evaluation scholars and practitioners . American Journal of Evaluation , 38 ( 3 ), 420–433. 10.1177/1098214016673902 [ CrossRef ] [ Google Scholar ]
  • Thor, J., Lundberg, J., Ask, J., Olsson, J., Carli, C., Härenstam, K. P., & Brommels, M. (2007). Application of statistical process control in healthcare improvement: Systematic review . Quality and Safety in Health Care , 16 ( 5 ), 387–399. 10.1136/qshc.2006.022194 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Tsutsumi, A., Nagami, M., Yoshikawa, T., Nogi, K., & Kawakami, N. (2009). Participatory intervention for workplace improvements on mental health and job performance among blue-collar workers: A cluster randomized controlled trial . Journal of Occupational and Environmental Medicine , 51 ( 5 ), 554–563. 10.1097/JOM.0b013e3181a24d28 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Van der Klink, J., Blonk, R., Schene, A. H., & Van Dijk, F. (2001). The benefits of interventions for work-related stress . American Journal of Public Health , 91 ( 2 ), 270–276. doi: 10.2105/ajph.91.2.270 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Visser, M. (2007). Deutero-learning in organizations: A review and a reformulation . Academy of Management Review , 32 ( 2 ), 659–667. 10.5465/amr.2007.24351883 [ CrossRef ] [ Google Scholar ]
  • Vogt, W. P., & Johnson, B. (2011). Dictionary of statistics & methodology: A nontechnical guide for the social sciences . Sage. [ Google Scholar ]
  • von Thiele Schwarz, U., & Hasson, H. (2013). Alignment for achieving a healthy organization. In Bauer G. F. & Jenny G. J. (Eds.), Salutogenic organizations and change (pp. 107–125). Springer. [ Google Scholar ]
  • von Thiele Schwarz, U., Richter, A., & Hasson, H. (2018). Getting on the same page - Co-created program logic (COP). In Nielsen K. & Noblet A. (Eds.), Organizational interventions for health and well-being: A handbook for evidence-based practice (pp. 42–67). Routledge. [ Google Scholar ]
  • von Thiele Schwarz, U., Augustsson, H., Hasson, H., & Stenfors-Hayes, T. (2015). Promoting employee health by integrating health protection, health promotion, and continuous improvement: A longitudinal quasi-experimental intervention study . Journal of Occupational and Environmental Medicine , 57 ( 2 ), 217–225. 10.1097/JOM.0000000000000344 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • von Thiele Schwarz, U., Hasson, H., & Aarons, G. (2019). The Value Equation: Three complementary propositions for reconciling adaptation and fidelity in evidence-based practice implementation . BMC Health Service Research , 19 , 868 . 10.1186/s12913-019-4668-y [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • von Thiele Schwarz, U., Lundmark, R., & Hasson, H. (2016). The Dynamic Integrated Evaluation Model (DIEM): Achieving Sustainability in Organizational Intervention through a Participatory Evaluation Approach . Stress & Health special issue on Organizational health interventions – advances in evaluation methodology , 32 ( 4 ), 285–293. 10.1002/smi.2701 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • von Thiele Schwarz, U., Nielsen, K. M., Stenfors-Hayes, T., & Hasson, H. (2017). Using kaizen to improve employee well-being: Results from two organizational intervention studies . Human Relations , 70 ( 8 ), 966–993. 10.1177/0018726716677071 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Wåhlin-Jacobsen, C. D. (2019). The terms of “becoming empowered”: How ascriptions and negotiations of employee identities shape the outcomes of workplace voice activities . Scandinavian Journal of Management , 35 ( 3 ), 101059. 10.1016/j.scaman.2019.101059 [ CrossRef ] [ Google Scholar ]
  • Weick, K. E. (1995). Sensemaking in organizations (Foundations for organizational science) . Sage Publications Inc. [ Google Scholar ]
  • Wilder, D. A., Austin, J., & Casella, S. (2009). Applying behavior analysis in organizations: Organizational behavior management . Psychological Services , 6 ( 3 ), 202–211. 10.1037/a0015393 [ CrossRef ] [ Google Scholar ]
  • Wilkinson, G., & Dale, B. (1999). Integrated management systems: An examination of the concept and theory . The TQM Magazine , 11 ( 2 ), 95–104. 10.1108/09544789910257280 [ CrossRef ] [ Google Scholar ]
  • Wilkinson, G., & Dale, B. (2002). An examination of the ISO 9001: 2000 standard and its influence on the integration of management systems . Production Planning & Control , 13 ( 3 ), 284–297. 10.1080/09537280110086361 [ CrossRef ] [ Google Scholar ]
  • Woodcock, T., Adeleke, Y., Goeschel, C., Pronovost, P., & Dixon-Woods, M. (2020). A modified Delphi study to identify the features of high quality measurement plans for healthcare improvement projects . BMC Medical Research Methodology , 20 ( 1 ), 1–9. 10.1186/s12874-019-0886-6 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Yin, R. K. (2013). Case study research: Design and methods . Sage. [ Google Scholar ]
  • Zwetsloot, G. I. (1995). Improving cleaner production by integration into the management of quality, environment and working conditions . Journal of Cleaner Production , 3 ( 1 ), 61–66. 10.1016/0959-6526(95)00046-H [ CrossRef ] [ Google Scholar ]


Organization Development Interventions: Executing Effective Organizational Change

  • Learning & Performance Systems

Research output: Chapter in Book/Report/Conference proceeding › Chapter

To effectively adapt and thrive in today’s business world, organizations need to implement effective organizational development (OD) interventions to improve performance and effectiveness at the individual, group, and organizational levels. OD interventions involve people, trust, support, shared power, conflict resolution, and stakeholders’ participation, just to name a few. OD interventions usually have broader scope and can affect the whole organization. OD practitioners or change agents must have a solid understanding of different OD interventions to select the most appropriate one to fulfill the client’s needs. There is limited precise information or research about how to design OD interventions or how they can be expected to interact with organizational conditions to achieve specific results. This book offers OD practitioners and change agents a step-by-step approach to implementing OD interventions and includes example cases, practical tools, and guidelines for different OD interventions. It is noteworthy that roughly 65% of organizational change projects fail. One reason for the failure is that the changes are not effectively implemented, and this book focuses on how to successfully implement organizational changes. Designed for use by OD practitioners, management, and human resources professionals, this book provides readers with OD basic principles, practices, and skills by featuring illustrative case studies and useful tools. This book shows how OD professionals can actually get work done and what the step-by-step OD effort should be. This book looks at how to choose and implement a range of interventions at different levels. Unlike other books currently available on the market, this book goes beyond individual, group, and organizational levels of OD interventions, and addresses broader OD intervention efforts at industry and community levels, too. Essentially, this book provides a practical guide for OD interventions. Each chapter provides practical information about general OD interventions, supplies best practice examples and case studies, summarizes the results of best practices, provides at least one case scenario, and offers at least one relevant tool for practitioners.

Original language: English (US)
Title of host publication: Organization Development Interventions
Subtitle of host publication: Executing Effective Organizational Change
Publisher:
Pages: 1-340
Number of pages: 340
ISBN (Electronic): 9781000418347
ISBN (Print): 9781032049137
DOI: 10.4324/9781003019800
State: Published - Jan 1 2021

All Science Journal Classification (ASJC) codes

  • Economics, Econometrics and Finance (all)
  • General Business, Management and Accounting


Learning and Organizational Change: A Case Study of Using Learning Intervention in a Strategic Management Course

  • January 2023
  • Open Journal of Business and Management, 11(05): 2175-2197
  • Sydney Moyo, Collaborating Consulting, Switzerland



Published on 12.9.2024 in Vol 26 (2024)

Developing and Evaluating Digital Public Health Interventions Using the Digital Public Health Framework DigiPHrame: A Framework Development Study

Original Paper

Authors of this article:

  • Tina Jahnel 1, BA, MA, PhD;
  • Chen-Chia Pan 2, 3, 4, BA, MA;
  • Núria Pedros Barnils 2, 3, BA, MA;
  • Saskia Muellmann 2, 4, BA, MA, Dr. PH;
  • Merle Freye 2, 5, Dr. jur.;
  • Hans-Henrik Dassow 2, 6, BSc, MA;
  • Oliver Lange 2, 7, BSc, MA, Dr. rer. pol.;
  • Anke V Reinschluessel 2, 8, 9, BSc, MSc, PhD;
  • Wolf Rogowski 2, 7, Dr. oec. publ.;
  • Ansgar Gerhardus 2, 10, MA, MPH, Dr. med.

1 Department of Health Services Research, Institute for Public Health and Nursing Research, University of Bremen, Bremen, Germany

2 Leibniz ScienceCampus Digital Public Health, Bremen, Germany

3 Department of Prevention and Health Promotion, Institute for Public Health and Nursing Research, University of Bremen, Bremen, Germany

4 Leibniz Institute for Prevention Research and Epidemiology, Bremen, Germany

5 Institute for Information, Health and Medical Law, University of Bremen, Bremen, Germany

6 Institute for Philosophy, University of Bremen, Bremen, Germany

7 Department of Health Care Management, Institute for Public Health and Nursing Research, University of Bremen, Bremen, Germany

8 Digital Media Lab, University of Bremen, Bremen, Germany

9 Human-Computer Interaction Group, University of Konstanz, Konstanz, Germany

10 Department for Health Services Research, Institute for Public Health and Nursing Research, University of Bremen, Bremen, Germany

Corresponding Author:

Tina Jahnel, BA, MA, PhD

Department of Health Services Research

Institute for Public Health and Nursing Research

University of Bremen

Grazer Str 4

Bremen, 28359

Phone: 49 042121868808

Email: [email protected]

Background: Digital public health (DiPH) interventions may help us tackle substantial public health challenges and reach historically underserved populations, in addition to presenting valuable opportunities to improve and complement existing services. However, DiPH interventions are often triggered through technological advancements and opportunities rather than public health needs. To develop and evaluate interventions designed to serve public health needs, a comprehensive framework is needed that systematically covers all aspects with relevance for public health. This includes considering the complexity of the technology, the context in which the technology is supposed to operate, its implementation, and its effects on public health, including ethical, legal, and social aspects.

Objective: We aimed to develop such a DiPH framework with a comprehensive list of core principles to be considered throughout the development and evaluation process of any DiPH intervention.

Methods: The resulting digital public health framework (DigiPHrame) was based on a scoping review of existing digital health and public health frameworks. After extracting all assessment criteria from these frameworks, we clustered the criteria. During a series of multidisciplinary meetings with experts from the Leibniz ScienceCampus Digital Public Health, we restructured each domain to represent the complexity of DiPH. In this paper, we used the German COVID-19 contact-tracing app, the Corona-Warn-App (CWA), as a use case to illustrate how DigiPHrame may be applied to assess DiPH interventions.

Results: The current version of DigiPHrame consists of 182 questions nested under 12 domains. Domain 1 describes the current status of health needs and existing interventions; domains 2 and 3, the DiPH technology under assessment and aspects related to human-computer interaction, respectively; domains 4 and 5, structural and process aspects, respectively; and domains 6-12, contextual conditions and the outcomes of the DiPH intervention from broad perspectives. In the CWA use case, we highlighted a number of questions that were relevant during the app’s development and that remain important for assessors now that the CWA is available.

Conclusions: DigiPHrame is a comprehensive framework for the development and assessment of digital technologies designed for public health purposes. It is a living framework and will, therefore, be updated regularly and as new public health needs and technological advancements emerge.

Introduction

The overarching goal of public health is to promote and improve the health and well-being of people and communities. In recent years, digital interventions specifically designed for public health purposes have emerged on a large scale. Digital public health (DiPH) interventions may help us tackle substantial public health challenges, including aging populations [ 1 ], the dual burden of noncommunicable and communicable diseases [ 2 ], and the health impacts of climate change [ 3 ]. Moreover, DiPH interventions present valuable opportunities to improve and complement existing health care services and reach historically underserved populations.

With the COVID-19 pandemic, we have seen how digital technologies may accelerate responses to public health emergencies. For example, digital contact-tracing apps became a major component of efforts to monitor community transmission and curb the spread of the virus in a population [ 4 ]. Further, the development of information platforms for international real-time public health data has supported policy and decision makers in planning and executing containment strategies. Another relevant field that became more visible during the pandemic concerns public health education. Digital platforms of health authorities and national agencies played a critical role in rapidly engaging and educating the population through prompt dissemination of trusted and tailored public health information, while limiting the visibility of information from unreliable sources [ 5 ].

As with other health technologies, DiPH interventions need to be developed through an iterative process, considering a multitude of factors right from the beginning of the conceptualization process. However, these factors (eg, acceptability, usability, data security, and sustainability) are sometimes not well thought out during development, or not considered at all, often resulting in low-value interventions that are ineffective and burdensome and that reduce both quality and efficiency. Moreover, the development of DiPH interventions is often triggered by technological advancements (ie, what is possible) rather than by current public health needs [ 6 ].

Although vast numbers of new health apps are launched in app stores regularly, downloads for many of these apps remain notoriously low [ 7 ]. Individual decisions around the initial use, adoption, rejection, and continued use of an app might be influenced by concerns regarding data security and data protection, the cost of purchasing the app, or user-friendliness for different user groups [ 8 , 9 ]. Other societal aspects, such as sustainable financing and regulatory requirements, are described as challenges to fulfilling public health functions. Thus, these aspects may influence the design of a DiPH intervention and need to be considered from the beginning of the development process [ 10 ].

During the development and evaluation process, a number of different stakeholders assess the potential impact of DiPH interventions (eg, tech companies, health insurers, governments, and health organizations). As such, for each DiPH intervention, a great variety of potential users and user environments must be considered. To systematically develop and evaluate DiPH interventions, a comprehensive framework is needed that covers all aspects with relevance for public health. This includes considering the complexity of the technology, the context in which the technology is supposed to operate, its implementation, and its effects on public health (eg, ethical, legal, and social aspects). Such a comprehensive framework would cover all phases, from conceptualization to evaluation, of all types of DiPH interventions and all parties [ 11 ].

Existing Frameworks for Digital Health Interventions, Health Technologies, and Public Health Interventions

Interventions are often developed without a systematic method and without drawing on evidence and theory. This point was made by Martin Eccles, Emeritus Professor of Clinical Effectiveness in the United Kingdom, when referring to a frequently used principle of intervention design, the ISLAGIATT (It Seemed Like A Good Idea At The Time!) principle. In practice, this means jumping straight to an intervention while crucially missing an understanding of the behaviors we are trying to change, or failing to consider the contextual facilitators and barriers to successful implementation. Frameworks that integrate a wide range of domains allow us to think ahead and help us avoid potential pitfalls before they occur, so that we can design appropriate interventions based on this analysis [ 12 ].

Although frameworks for digital health interventions, health technologies, and public health interventions have been developed previously, to the best of our knowledge, no framework for the systematic development and assessment of digital interventions for public health purposes exists today. Assessment criteria for health-related technologies have also been developed, although their focus generally lies on either health technology [ 13 , 14 ] or digital health [ 15 ] aspects.

One prominent example of assessing various health technologies is health technology assessment (HTA). “HTA is a multidisciplinary process that uses explicit methods to determine the value of a health technology at different points in its lifecycle. The purpose is to inform decision-making in order to promote an equitable, efficient, and high-quality health system” [ 16 ]. Based on this methodology, various organizations have developed frameworks with different foci [ 13 , 14 , 17 ]. For instance, the European Network for Health Technology Assessment (EUnetHTA) developed the health technology core model for assessing the dimensions of value to facilitate the production and sharing of HTA information, such as evidence on efficacy, effectiveness, and patient aspects, to inform decisions. The model has a broad scope and provides common ground to various stakeholders by offering a standard structure and a transparent set of proposed HTA questions [ 13 ]. HTA frameworks are generally applied to already developed technologies rather than providing standards for evaluation aspects that should be considered throughout development. Considering these aspects during development is important, however, because a digital intervention would likely already be outdated by the time a purely retrospective assessment is finished.

Assessment frameworks specifically designed for the evaluation of digital health technology also exist. The National Institute for Health and Care Excellence (NICE) recently developed an Evidence Standards Framework (ESF) for digital health technologies [ 15 ], which aims to provide standards for clinical evidence of a (novel) health technology’s (cost-)effectiveness within the UK health and care system. Similar to other frameworks [ 18 - 21 ], the ESF lacks applicability to public health technologies due to its focus on clinical outcomes. Other frameworks focus on evaluation and assessment criteria along the life cycle of digital health interventions yet still lack a public health focus [ 22 ].

Digital interventions heavily rely on user interaction and engagement. However, public health frameworks generally do not include specific measures to assess usability, user experience, and the design aspects crucial for promoting sustained user engagement [ 23 ]. Furthermore, DiPH interventions often require integration into existing health care systems, which can be complex and fraught with interoperability, data security, and data protection challenges—issues that are often not properly addressed in public health frameworks [ 24 ]. Although these are just a few examples, they illustrate how unique aspects of DiPH interventions may fall short in existing public health frameworks.

Gaps and Objective

Together, we identified the following gaps:

  • Absence of a framework for digital interventions in public health: Although frameworks for health technologies and public health interventions exist, there is no established framework specifically tailored for the systematic development and assessment of digital interventions in public health.
  • Limited applicability of existing assessment frameworks.
  • Inadequate consideration of usability and integration challenges.

Addressing these gaps requires the development of a comprehensive framework specifically tailored for digital interventions in public health, integrating diverse domains and considering usability, user experience, and integration challenges throughout the development and assessment process so that developers and assessors need not draw on multiple frameworks. The main focus of this paper was to present the current form of the digital public health framework (DigiPHrame) and describe its development process, followed by a use case to illustrate its application. More detailed information on the scoping review that served as a starting point to develop DigiPHrame can be found in the protocol that we preregistered on the Open Science Framework (OSF) [ 25 ]. The German contact-tracing app Corona-Warn-App (CWA), as a digital public warning system with a clear public health focus, was deemed as a suitable use case to illustrate the application of DigiPHrame.

Search Strategy

We developed DigiPHrame in several steps, as shown in Figure 1. First, we conducted a scoping review to identify existing frameworks for public health and digital health interventions (see the protocol and registration on the OSF [ 26 ]). See Table 1 for the eligibility criteria. For information sources, we searched journal papers in the electronic literature databases MEDLINE (via PubMed), Scopus, IEEE, CINAHL (via EBSCO), and PsycINFO (via Ovid). Our search strategy was first built around our core concepts as primary search keywords combined with Boolean operators: (“Public Health” [Title/Abstract] OR “Digital Health” [Title/Abstract]) AND Evaluation [Title] AND Framework [Title]. The search syntax was then expanded to include the synonyms, wildcards, and relevant subject terms of the primary keywords to increase the sensitivity of our searches. Next, we modified the subject terms and search field of the search syntax to adapt to each database (see Multimedia Appendix 1). We also manually searched relevant reviews’ reference lists. The final search was completed on April 12, 2022, with no publication date limitations.
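As a rough illustration of this expansion step, the sketch below assembles a PubMed-style Boolean query from core concepts and synonym lists; the added synonyms, wildcards, and field tags are illustrative assumptions, not the exact syntax reported in Multimedia Appendix 1.

```python
# Illustrative sketch only: the synonym lists and field tags below are assumptions,
# not the published search syntax (see Multimedia Appendix 1 for the real strings).

core_concepts = {
    "setting": ['"Public Health"[Title/Abstract]', '"Digital Health"[Title/Abstract]'],
    "evaluation": ["Evaluation[Title]", "Evaluat*[Title]", "Assess*[Title]"],
    "framework": ["Framework[Title]", "Model[Title]", "Guideline*[Title]"],
}

def build_query(concepts):
    """OR the terms within each concept block, then AND the blocks together."""
    blocks = ["(" + " OR ".join(terms) + ")" for terms in concepts.values()]
    return " AND ".join(blocks)

print(build_query(core_concepts))
```

A query built this way would then be adapted per database by swapping in each database's field tags and subject terms.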

[Figure 1]

Table 1. Eligibility criteria.

  • Framework — Inclusion: development or evaluation framework for health interventions related to public health or digital health. Exclusion: no framework or guidance in the report.
  • Report — Inclusion: framework or guidance outlining the standards, principles, criteria, or properties needed to support the systematic development or evaluation of health interventions aimed at health promotion or prevention, with or without digital technologies. Exclusion: framework or guidance not focusing on developing, monitoring, validating, or evaluating health interventions; not providing specific standards, principles, criteria, or properties; only designed for one specific tool and not applicable to other health interventions; or only applicable to pharmaceutical, surgical, clinical, or rehabilitation interventions.
  • Publication type — Inclusion: journal papers; studies, policies, or programs reported in gray literature. Exclusion: comments, corrections, letters, editorials, protocols, oral presentations, posters.
  • Language — Inclusion: English. Exclusion: languages other than English.
  • Access to full text — Inclusion: access to the full text of studies selected for data coding. Exclusion: no access to full text.

Selection of Sources of Evidence

After deduplication, 4830 titles and abstracts were screened by 2 researchers independently, resulting in 433 (9%) full texts, which were then assessed by 2 independent researchers. Disagreements between researchers were resolved through dialogue, with the involvement of a third party, if necessary, although a definitive agreement score was not established. In total, 68 (15.7%, see Multimedia Appendix 4 ) papers were included for data extraction (see the Preferred Reporting Items for Systematic reviews and Meta-Analyses [PRISMA] flowchart in Multimedia Appendix 2 [ 27 ]).
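The reported screening proportions can be reproduced directly from these counts; a minimal check, assuming only the numbers quoted above, is shown below.

```python
# Quick check of the reported screening proportions (counts taken from the text above).
screened = 4830     # titles/abstracts screened after deduplication
full_texts = 433    # full texts assessed
included = 68       # papers included for data extraction

print(f"Full-text rate: {full_texts / screened:.1%}")   # ~9.0%
print(f"Inclusion rate: {included / full_texts:.1%}")   # ~15.7%
```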

Data Charting and Data Items

We extracted all pertinent assessment criteria from the frameworks identified through the scoping review. Initially, these criteria were assigned to HTA domains/subdomains [ 13 ] (akin to deductive coding), although several criteria could not be assigned due to thematic misfit. Subsequently, new categories were formed for these criteria (akin to inductive coding). One researcher performed the coding initially, followed by a collaborative examination of the coded sections by 2 researchers, leading to adjustments during the discussion process (eg, reassignment to other domains, reassignment to other subdomains within a domain, summarization of subdomains, and deletion of irrelevant domains or subdomains). Questions describing the subdomains were devised by us based on the criteria (here, too, a proposal was made by one person, followed by verification by a second person). We consulted additional literature for the categorization of ethical principles [ 28 ].
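To make this deductive-then-inductive step concrete, here is a minimal sketch in which extracted criteria are first mapped to predefined domains and leftovers seed new categories; the criterion texts, domain names, and assignments are hypothetical placeholders, not the study's actual coding.

```python
# Illustrative sketch of the deductive-then-inductive coding step.
# Criterion texts, domain names, and assignments are hypothetical placeholders.

from collections import defaultdict

hta_domains = {"Health problem", "Technology", "Safety", "Costs", "Ethics"}

extracted_criteria = [
    ("Does the app require internet connectivity?", "Technology"),
    ("Which population health needs does it address?", "Health problem"),
    ("How is sustained user engagement supported?", None),  # no thematic fit
    ("Which data protection requirements apply?", None),    # no thematic fit
]

coded = defaultdict(list)
unassigned = []

# Deductive step: assign criteria to predefined HTA domains where they fit.
for criterion, domain in extracted_criteria:
    if domain in hta_domains:
        coded[domain].append(criterion)
    else:
        unassigned.append(criterion)

# Inductive step: leftover criteria seed new categories (names invented for illustration).
coded["Usability (new)"].append(unassigned[0])
coded["Data security and protection (new)"].append(unassigned[1])

for domain, criteria in coded.items():
    print(domain, "->", criteria)
```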

A group of multidisciplinary experts from the Leibniz ScienceCampus Digital Public Health (LSC DiPH) was assigned to the domains corresponding to their expertise to provide advice. Each domain was restructured based on this expert input to represent the complexity of DiPH. Where necessary, additional literature was consulted, especially when the included frameworks fell short of offering criteria specific to DiPH.

The Proposed Framework

A first draft of the proposed framework was sent to an interdisciplinary expert panel consisting of 105 members of the LSC DiPH. Feedback was gathered as unrestricted comments on the domains we developed. We reached out to experts from diverse fields, including medicine, public health, global health, psychology, sociology, human-computer interaction, (health) economics, informatics, sports science, medical biometry, architecture, urban planning, statistics, ethics, policy analytics, and law, assigning them domains based on their respective expertise. A deadline for feedback submission was set for July 18, 2022. Additionally, the same members of the LSC DiPH were invited to partake in a consensus meeting held on July 19, 2022. Participants were grouped into domain-specific discussions according to their areas of expertise, with these discussions being moderated by the DigiPHrame team. This resulted in the first version of the framework [ 29 ].

Use Case: Corona-Warn-App

Shortly after the onset of the COVID-19 pandemic in 2020, numerous digital contact-tracing apps were developed or proposed, with official government support in some territories and jurisdictions. The rationale was that contact tracing is an important tool in infectious disease control, but as case numbers rise, time constraints make it more challenging to trace transmissions effectively. Digital contact tracing, especially if widely deployed, may be more effective than traditional methods of contact tracing [ 30 ].

COVID-19 apps include mobile apps for digital contact tracing (ie, identifying persons, or “contacts,” who may have been in contact with an infected individual) deployed during the COVID-19 pandemic. Privacy concerns have been raised, especially about systems tracking users’ geographical location. Alternatives include co-opting Bluetooth signals to log a user’s proximity to other smartphones. For example, the open source CWA funded by the German government was based on proximity tracing using Bluetooth signals. The app provides a function for users to warn other users by uploading their positive test results anonymously on a voluntary basis to the CWA server. Users are then notified about any contacts with infected persons and can get tested on a voluntary basis.
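Conceptually, decentralized proximity tracing of this kind boils down to matching the ephemeral identifiers a phone has observed locally against identifiers later published by users who voluntarily report a positive test. The sketch below is a deliberately simplified illustration of that matching idea under assumed data structures; it is not the CWA's actual protocol, which relies on rotating cryptographic identifiers and attenuation-based risk scoring.

```python
# Simplified illustration of decentralized proximity-based exposure matching.
# This is NOT the CWA's actual protocol; real systems use rotating, cryptographically
# derived identifiers and attenuation-based risk scoring.

from dataclasses import dataclass

@dataclass(frozen=True)
class Encounter:
    identifier: str   # ephemeral ID broadcast by a nearby phone
    day: int          # day the encounter was observed
    minutes: int      # rough exposure duration

# Encounters this phone observed locally (stored only on the device).
local_encounters = [
    Encounter("id-a1", day=3, minutes=20),
    Encounter("id-b7", day=5, minutes=2),
    Encounter("id-c9", day=6, minutes=35),
]

# Identifiers voluntarily published (via the backend) by users who tested positive.
published_positive_ids = {"id-a1", "id-c9"}

def at_risk(encounters, positive_ids, min_minutes=15):
    """Flag locally stored encounters that match a published identifier."""
    return [e for e in encounters
            if e.identifier in positive_ids and e.minutes >= min_minutes]

for e in at_risk(local_encounters, published_positive_ids):
    print(f"Possible exposure on day {e.day} ({e.minutes} min) - consider testing.")
```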

The same experts were invited to a workshop on February 23, 2023, where the proposed framework was applied to a case study and tested for face validity. The case study was the CWA. Following several revision meetings of the framework team between February and May 2023, the second version of the proposed framework was finalized in May 2023 [ 31 ].

Ethical Considerations

Ethical approval is not applicable for this study since it did not involve human subjects.

Characteristics of DigiPHrame

DigiPHrame comprises a set of criteria framed as open-ended questions clustered within domains that lead interested parties through a broad spectrum of crucial elements when developing and evaluating DiPH interventions. The evolution of domains and subdomains through the stepwise process, including the number of questions per subdomain in each version, can be found in Multimedia Appendix 3 . The framework in its current form was uploaded to the LSC DiPH website and the OSF [ 32 ] in May 2023 and is a revised version of the original framework that was first published in July 2022. In total, DigiPHrame consists of 182 questions, structured by 12 domains ( Figure 2 ).

[Figure 2]

Domain 1 describes the current status of health needs and existing interventions; domains 2 and 3 are aimed at the DiPH technology under assessment and aspects related to human-computer interaction, respectively; domains 4 and 5 are aimed at structural and process aspects, respectively; and the assessment criteria in domains 6-12 address contextual conditions and the outcomes of the DiPH intervention from broad perspectives.
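Because the framework is essentially a catalogue of open-ended questions nested under domains, its structure can be represented as simple data. The sketch below does so for two domains, reusing the two question texts quoted later in this section; the remaining domains and question wordings are elided and would need to be taken from the published framework on the OSF.

```python
# Minimal sketch of DigiPHrame's structure: domains mapping to open-ended questions.
# Only two domains and two question texts quoted in this section are shown; the rest is elided.

digiphrame = {
    "Domain 1: Health conditions and current public health interventions": [
        "What is the expected level of digital literacy of the target population?",  # question 1.5
    ],
    "Domain 3: Usability": [
        "Are the health technology and DiPH intervention available in relevant languages?",  # question 3.3
    ],
    # ... the full framework comprises 12 domains and 182 questions in total.
}

total_questions = sum(len(questions) for questions in digiphrame.values())
print(f"{len(digiphrame)} domains, {total_questions} questions in this sketch")
```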

Next, we defined the domains and illustrated how DigiPHrame can be applied using the CWA as a use case. The CWA is a digital public warning system that was designed and developed during the COVID-19 pandemic and has a clear public health focus. We briefly outlined the purpose and characteristics of the CWA. From each domain, we applied 1 assessment question as an example.

Domain 1: Health Conditions and Current Public Health Interventions

Domain 1 involves background information for DiPH interventions, describing the population, conditions, and observance of health inequities. Furthermore, this domain addresses current public health interventions and common alternatives.

Question 1.5 asks, “What is the expected level of digital literacy of the target population?” In the case of the CWA, the target population comprises the entire population within a geographically delimited space. Therefore, the entire spectrum of digital literacy is to be expected. Thus, different forms of representation (eg, graphics, text, and sound) of risk exposure and other information related to the COVID-19 pandemic must be available; this was not the case and might have prevented some people from using the app.

Domain 2: Technology and Usability

Domain 2 guides one through assessing general technical aspects of the health technology of interest. The questions focus on what digital tools are applied and how aspects such as interoperability, data integration, internet connectivity, and others are integrated.

Question 2.17 is, “Does the software require an internet connection (eg, all the time, once in a while, or once)?” In the case of the CWA, an internet connection is necessary as the major functionality of warning people is distributed via the internet. Only a fraction of the available functions work completely without an internet connection, such as the contact diary. Generally, the app does not need a continuous internet connection. However, the device on which the app is installed needs to be connected to the internet, at best multiple times a day but at least once a day to sync the contacts and update on test results.

Domain 3: Usability

Domain 3 focuses on how usable the health technology system is in order to ensure that its users can perform the required tasks (ie, the intended function) safely, effectively, efficiently, and with satisfaction. Therefore, accessibility, user empowerment, credibility, and trustworthiness are also considered in this domain.

Question 3.3 asks, “Are the health technology and DiPH intervention available in relevant languages?” When the CWA was first launched, it was available only in German and English. Russian, one of the most widely spoken immigrant languages in Germany, was not added until much later versions. Since version 2.20.0 for iOS and version 2.20.4 for Android, the CWA has been available in German, English, Turkish, Bulgarian, Polish, Romanian, and Ukrainian.

Domain 4: Infrastructure and Organization

Domain 4 on structural aspects considers the structure of the context in which a DiPH intervention is developed and implemented, as well as the involved stakeholders. Question 4.4 asks, “Is the DiPH intervention flexible to suit local, cultural, or social needs?” Initially, the German government promoted centralized storage of user data, which, according to the Federal Ministry of Health, would allow it to better track the spread of infections. However, this led to resistance from digital experts and data protectionists. As a consequence, the CWA was developed, with decentralized data collection across various servers. This approach ensured that the data could be decoupled, thereby hindering any potential tracing of app users.

Domain 5: Implementation

Domain 5 describes aspects to consider before and during integration of a DiPH intervention into the health care system to ensure that the intervention is delivered properly. The domain focuses on the theory used for implementing the DiPH intervention, infrastructure, process, and agents, as well as implementation outcomes and dissemination.

Question 5.9 asks, “Which implementation difficulties (eg, duration, scope, disruptivity, centrality, complexity, and the number of steps required) did the DiPH intervention encounter?” In the case of the CWA, necessary features (eg, sharing test results and embedding vaccination certificates) were not available when the app was first launched in June 2020 and had to be added continuously afterward.

Contextual Conditions and Outcome-Related Domains

Domain 6: Intended and Unintended Health-Related Effects

Domain 6 considers the positive and negative effects of a DiPH intervention on physical, mental, and social health; the quality of life and well-being; and the knowledge, beliefs, and behavior of individuals and the population in the short, intermediate, and long terms.

Question 6.2 asks, “To what extent is the DiPH intervention expected to impact the physical, mental, and social health of the individual and the population?” With its goal to prevent infections, the CWA was expected to positively affect individuals’ and, ultimately, population health. It is unclear how the large red warning sign displayed on users’ smartphones when a high-risk contact with an infected person occurs would affect their mental health. Although generally accepted, the CWA was not used by the majority of the population, and data privacy concerns were widely discussed prior to the app’s launch. With some individuals using the CWA and some not (sometimes accompanied by strong opinions for or against the benefit of the app within a social circle), this may have affected an individual’s relationships and social health.

Domain 7: Social, Cultural and Intersectional Aspects

Domain 7 examines the societal, cultural, and intersectional dimensions pertinent to communities and groups of individuals, such as ethnic or demographic groups, people residing in the same neighborhood, those sharing common interests, or individuals with specific physical or mental conditions.

Question 7.5 asks, “Which factors in the society/community are relevant for DiPH intervention implementation?” In the case of the CWA, relevant factors include the availability of compatible smartphones (older smartphones are not compatible), trust that data will be protected and not used for other purposes (eg, analog guest data from restaurants, not data from the app, were used to identify theft suspects), and willingness to enter one’s data in the case of infection.

Domain 8: Ethics

Domain 8 addresses the moral considerations that arise from the implementation of DiPH interventions. The categorization of ethical principles is based on the influential Principles of Biomedical Ethics by Beauchamp and Childress [ 28 ].

Question 8.20 asks, “Does the DiPH intervention discriminate against particular segments of the target population?” Although efforts to avoid discrimination became increasingly visible, it took too long to offer the app in the different languages frequently spoken in Germany, and people using phones with older operating systems were also excluded from using the app.

Domain 9: Legal and Regulatory Aspects

Domain 9 generates awareness about which areas of law must be considered when developing or evaluating DiPH interventions. It is not the purpose of the domain to pose every specific legal question that has to be answered in order to develop or evaluate DiPH interventions. Since laws differ from country to country, the domain helps detect fields of law and typical problems in those fields that could be relevant for developers and evaluators. The applicable law and its requirements depend on the country.

Question 9.6 asks, “Have you considered the potential reimbursement of DiPH interventions in a national health system (some countries may have specific requirements for reimbursement)?” This raises awareness of the requirements for reimbursement of DiPH interventions in a national health system or by other payers. Regarding the CWA, the provider offered the intervention for free (without a reimbursement option) because a free-of-charge offer promised broader and quicker distribution of the app.

Domain 10: Data Security and Data Protection

Domain 10 focuses on the technological protection of data and, therefore, combines the aspects of data confidentiality, data integrity, data authenticity, data availability, and data controllability. Data protection relates to whether the system is allowed to process personal data.

If personal data are transferred to third parties, question 10.25 asks, “Is there a legal basis for the transfer, and are the requirements of the legal basis fulfilled?” In the case of the CWA, T-Systems International GmbH and SAP Deutschland SE & Co. KG are acting on the Robert Koch Institute’s behalf. The legal basis is a contract that is binding on the processor with regard to the controller and that sets out the subject matter and duration of the processing, the nature and purpose of the processing, the type of personal data and categories of data subjects, and the obligations and rights of the controller (Art. 28(3) of the General Data Protection Regulation [GDPR]). Otherwise, the Robert Koch Institute only passes on data to third parties if it is legally obliged to do so or if this is necessary for legal action or criminal prosecution in the case of attacks on the app’s technical infrastructure.

Domain 11: Cost and Economics

Domain 11 assesses DiPH interventions regarding whether they can be considered a rational use of scarce resources. Question 11.1 asks, “Which relevant costs and effects can be identified?” Considering the costs and effects of a DiPH intervention from the beginning could help compare it with other interventions and show whether it is economically dominant (ie, at least as effective as, but less costly than, the alternative interventions). Further, this information might be the basis for a health economic evaluation (see questions 11.4-11.6) to determine whether what it costs per health gain is considered acceptable by the payer. In the case of the CWA, there are various relevant costs of the intervention itself, such as development and operation (2020: €52.8, or US $57.5, million; 2021: €63.5, or US $69.1, million) and promotion (2020 and 2021: €13.7, or US $14.9, million) [ 16 ]. Taking a broader (societal) perspective, there might be further costs, such as the costs of additional testing after the CWA issues a warning and the costs of unrelated survival gains, as well as benefits such as a reduction in the loss of earnings, fewer hospitalizations and rehabilitation measures, and fewer deaths [ 32 ]. However, to the best of our knowledge, the pandemic context and the decision process about the CWA led to a situation where a decision was made without formally considering cost-effectiveness in comparison with alternative decision options.
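To make the scale of these figures concrete, a minimal calculation using only the euro amounts quoted above sums the reported budget items; the US-dollar conversions and any broader societal costs and benefits are left out of this sketch.

```python
# Summing the CWA budget items quoted in the text (in million EUR).
development_and_operation = {2020: 52.8, 2021: 63.5}
promotion_2020_2021 = 13.7

dev_ops_total = sum(development_and_operation.values())   # 116.3
overall_total = dev_ops_total + promotion_2020_2021       # 130.0

print(f"Development and operation 2020-2021: EUR {dev_ops_total:.1f} million")
print(f"Including promotion: EUR {overall_total:.1f} million")
```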

Domain 12: Sustainability

Domain 12 assesses environmental, social, and economic sustainability. Given the goal to reduce carbon emissions in health care, question 12.1 asks, “Which resources are necessary to develop and maintain the DiPH intervention?” In the case of the CWA, servers need to run, which produces carbon emissions, and computers need to be obtained to ensure compatibility of health offices with the CWA. Measuring and evaluating these resource consumptions also allows decision makers to consider more climate-friendly design alternatives for DiPH interventions.

Application of DigiPHrame for the Development and Evaluation of DiPH Interventions

In the CWA use case, we highlighted a number of questions that were relevant during the app’s development and that remain important for assessors now that the CWA is available. For example, developers needed to consider how the data would be collected and shared without conflicting with data privacy and data protection laws. Similarly, assessors needed to find ways of evaluating the effectiveness of the CWA (eg, did the CWA prevent infections?) without relevant data (due to decentralized data storage, data from different individuals could not be connected, and thus, only estimates could be determined). In future scenarios, DigiPHrame can serve as a checklist for both developers and assessors to help them avoid overlooking key issues with relevance to the performance of the intervention. Although for some questions, it might be enough to use common sense (in the case of the CWA, it could be questions surrounding the usability of the app), for others, specialist expertise may be necessary (eg, questions regarding legal and regulatory issues).

DigiPHrame is agile and primarily user led ( Textbox 1 ). We deliberately included the option of feedback loops in the framework to support the agile development process. Although it is advised to consider all domains and respective questions, developers may decide which domains are assessed at what stage of their development process and which questions are relevant for the respective DiPH intervention. For an intervention under development, a first orientation might be enough to understand whether it is worth continuing along the determined path or whether adjustments might be necessary. Developers may also decide to put specific questions on hold and revise them at a later stage in case any changes or additions need to be made to the DiPH intervention. Similarly, assessors may delay answering certain questions in case no robust evidence is available at the time to answer the questions.

Users of DigiPHrame are encouraged to first answer a list of questions regarding a general description of their digital public health (DiPH) intervention. Providing general characteristics will help assessors better understand the DiPH intervention under assessment. DigiPHrame is further equipped with a standardized answer scheme to help developers in answering the questions and, if necessary, plan the next steps in the development process. For assessors, the answer scheme can serve as a checklist to tick off all relevant criteria.

DigiPHrame users can respond to each question using the provided answer scheme. The first 2 assessment indicators are “not applicable” when the question is irrelevant to the particular DiPH intervention and “assessment result” to provide the answer or additional information to the assessor. The last 3 columns of the answer scheme focus on the current status of the DiPH intervention during the assessment. These columns include “Assessment completed and sufficient” when the assessment is finished and satisfactory, “Assessment done but improvement needed” when the assessment is complete but indicates the need for improvements or changes to the DiPH intervention, and “Assessment only partially done or not possible yet” when the assessment is incomplete or not feasible at the moment.

Example answer scheme:

Criterion: population

Question: Who is the target population of the DiPH intervention?

Assessment indicator scheme:

  • Not applicable (N/A): yes
  • Assessment result: The entire population is at risk of getting infected with SARS-CoV-2.
  • Assessment completed and sufficient: yes
  • Assessment done but improvement needed: Briefly outline the necessary changes/expected date for revising the question.
  • Assessment only partially done or not possible yet: Insert specific steps to be taken/expected date of completion.
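A lightweight way to capture such records in software, sketched here under the assumption that one simply mirrors the columns of the answer scheme as fields, could look like the following; the status labels follow the text above, while the class, field, and enum names are invented for illustration.

```python
# Illustrative sketch of an answer-scheme record mirroring the columns described above.
# Class, field, and enum names are invented for illustration.

from dataclasses import dataclass
from enum import Enum
from typing import Optional

class AssessmentStatus(Enum):
    COMPLETED_AND_SUFFICIENT = "Assessment completed and sufficient"
    DONE_BUT_IMPROVEMENT_NEEDED = "Assessment done but improvement needed"
    PARTIAL_OR_NOT_POSSIBLE_YET = "Assessment only partially done or not possible yet"

@dataclass
class AnswerSchemeEntry:
    domain: str
    criterion: str
    question: str
    not_applicable: bool = False
    assessment_result: Optional[str] = None
    status: Optional[AssessmentStatus] = None
    notes: Optional[str] = None   # eg, planned changes or expected completion date

entry = AnswerSchemeEntry(
    domain="Domain 1: Health conditions and current public health interventions",
    criterion="Population",
    question="Who is the target population of the DiPH intervention?",
    assessment_result="The entire population is at risk of getting infected with SARS-CoV-2.",
    status=AssessmentStatus.COMPLETED_AND_SUFFICIENT,
)

print(entry.status.value)
```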

Principal Findings

A Unified Framework for Digital Interventions With a Public Health Focus

Although health-related digital technologies hold great potential for enhancing public health and addressing health-related inequalities at a relatively low cost, new developments are often driven by technological advancements, and assessments primarily revolve around clinical aspects of health. To the best of our knowledge, no existing frameworks consider digital interventions specifically designed for public health purposes. Additionally, previous frameworks primarily emphasize clinical aspects when addressing digital health technologies, neglecting the public health perspective. For example, although the ESF [ 15 ] emphasizes clinical outcomes, crucial for any intervention’s success, it omits essential aspects, such as sociocultural, ethical, legal, and sustainability factors, that are vital for effectively implementing DiPH interventions. DigiPHrame includes aspects regarding clinical outcomes (eg, domain 6: intended and unintended health-related effects), among others derived from the ESF, but also the above-mentioned factors. Moreover, although HTA frameworks [ 13 ] are often designed for evaluating existing technologies, our objective was to devise a comprehensive framework applicable across all stages of development and evaluation. DigiPHrame adopts a comprehensive public health perspective and can serve as a guide for developers and assessors throughout the entire development and assessment of DiPH interventions. It provides users with criteria concerning clinical effectiveness, technical functions, and usability, as well as organizational, legal, ethical, economic, and sociocultural aspects. Users have the flexibility to determine the relevant domains and assessment questions based on their specific needs and the stage of the process, without relying on multiple development and assessment frameworks. Furthermore, users of DigiPHrame are encouraged to take a broader view and may be inspired to include perspectives that were not initially within their scope (eg, sociocultural aspects, ethics, and sustainability).

A Holistic Framework Covering Relevant Domains for DiPH Interventions

Additionally, a deeper understanding of contextual factors is necessary to assess what will work in one country versus another. These factors can either enhance or hinder the adoption and diffusion of DiPH technologies. Although many frameworks tend to overemphasize technical aspects, it is essential to acknowledge that various other factors influence success or failure. These factors include disparities in health expenditure, demographic conditions, health infrastructure, information and communication technology (ICT) skill levels, digital health literacy, clinical and patient engagement, and many more. Recognizing and understanding these key differences within and across countries is crucial for policy makers and other stakeholders in public health and DiPH. Although our framework considers these factors, future work needs to apply DigiPHrame in diverse contexts and countries to validate and continuously update the current version of the framework. In addition, although our framework aims to be universally applicable to various DiPH technologies, it will require revision as new public health needs and DiPH technologies emerge. Therefore, our framework can serve as the foundation for a development and assessment toolkit that developers, decision makers, and other users alike can use.

As we illustrated with the German contact-tracing app CWA, which was launched during the first wave of the COVID-19 pandemic, DigiPHrame can be applied at all stages, including design, implementation, and evaluation. Applying it from the beginning may have helped avoid potential pitfalls that would otherwise have occurred further down the development process.

Strengths and Limitations

Our framework has several key strengths that set it apart. First, it is based on a comprehensive scoping review of digital health and public health frameworks (OSF [ 25 ]), ensuring a robust foundation. Additionally, we conducted scientific consensus meetings involving interdisciplinary experts, ensuring a breadth of perspectives in its development. Second, the assessment themes within our framework were derived from existing frameworks developed in various Western countries, including Germany, the United Kingdom, and the United States. This demonstrates the broad applicability of DigiPHrame across different geographical contexts, making it adaptable to diverse settings in high-income countries. Another strength of our framework is its universality. It is not limited to specific types of DiPH interventions and, therefore, can be applied to any digital intervention with the overarching aim of improving public health outcomes. This flexibility allows for its widespread application across a wide range of interventions. Furthermore, DigiPHrame is designed as a living framework that will evolve and adapt as technology advances. To do so, we will continue to revise the domains and questions and regularly test any changes for face validity using a variety of use cases. This will ensure that it remains relevant and up to date in the fast-paced DiPH landscape, accommodating emerging technologies and methodologies. Lastly, we incorporated input and expertise from various research fields throughout the entire development process of DigiPHrame. We fostered an interdisciplinary perspective by involving experts from different disciplines, including public health, epidemiology, psychology, philosophy, law, economics, human-computer interaction, and sociology, enriching the framework with diverse insights and knowledge.

Although our framework has several strengths, it is important to acknowledge certain limitations. First, going through the proposed framework might require significant time and expertise due to its complexity and depth. Nevertheless, it is flexible; it is up to the assessor to decide which domains and criteria are applicable to their specific case. This flexibility is advantageous, allowing the framework to be adapted to diverse contexts and DiPH interventions. However, it may also introduce subjectivity in the evaluation process, as different assessors may choose different domains and criteria, leading to varying outcomes. Ensuring transparency and consistency in domain selection could help mitigate this concern. Additionally, we intend to develop a condensed version of the framework focusing on the most critical domains and questions. Second, we engaged experts from diverse research fields to address potential inconsistencies during the development process. However, it is worth noting that the majority of our consultations did not extend to a broader geographical range, particularly in terms of incorporating specific aspects from low- and middle-income countries. It is crucial to recognize that contexts may differ significantly, including factors such as technology accessibility, digital health literacy, and legal requirements. Although DigiPHrame aims to be applicable across different geographical contexts, users of the framework are advised to consider and adhere to their local requirements and nuances. Furthermore, in our scoping review, we focused on primary prevention and health promotion but not on secondary and tertiary prevention (eg, rehabilitation). This could have limited the frameworks and criteria we found. Although as per our definition, DiPH focusses on primary prevention and health promotion, future research may also include frameworks focused on secondary and tertiary prevention. Lastly, we did not provide any evaluation methods along with the framework. As DigiPHrame evolves, however, our goal is to provide suitable existing methods and develop novel evaluation methods for DiPH interventions.

Conclusions

DigiPHrame is a comprehensive framework for the development and assessment of digital technologies designed for public health purposes. It may assist in designing and evaluating DiPH interventions that serve public health needs rather than merely showcasing technological advancements. Moreover, DigiPHrame may help avoid overlooking important aspects, such as acceptability, usability, data security, and sustainability, whose neglect would otherwise result in low-value interventions that are not user friendly, violate (data protection) laws, or are not sustainable. We aim to revise and improve DigiPHrame as new technologies emerge, and we encourage developers and assessors to use the framework and contribute to its improvement.

Acknowledgments

The authors gratefully acknowledge the Leibniz ScienceCampus Bremen Digital Public Health support, jointly funded by the Leibniz Association (W4/2018), the Federal State of Bremen, and the Leibniz Institute for Prevention Research and Epidemiology (BIPS).

The authors would also like to thank Dorothee Jürgens, Sarah Janetzki, Sarah Forberger, and Jonathan Kolschen for their contributions to conducting the scoping review and data extraction for developing the first version of the framework.

Data Availability

The data collected and analyzed during this study are available from the corresponding author upon reasonable request.

Authors' Contributions

TJ and AG conceived the concept of the manuscript. TJ drafted the first version of the manuscript. All authors contributed to the literature search and writing and editing of the manuscript. All authors have read and approved the final manuscript.

Conflicts of Interest

None declared.

Multimedia Appendix 1: Search syntax.

Multimedia Appendix 2: Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) flow diagram.

Multimedia Appendix 3: Evolution of domains and subdomains.

Multimedia Appendix 4: Included reports.

  • Chen C, Ding S, Wang J. Digital health for aging populations. Nat Med. Jul 18, 2023;29(7):1623-1630. [ CrossRef ] [ Medline ]
  • Rosen JM, Kun L, Mosher RE, Grigg E, Merrell RC, Macedonia C, et al. Cybercare 2.0: meeting the challenge of the global burden of disease in 2030. Health Technol (Berl). 2016;6(1):35-51. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Rahimi-Ardabili H, Magrabi F, Coiera E. Digital health for climate change mitigation and response: a scoping review. J Am Med Inform Assoc. Nov 14, 2022;29(12):2140-2152. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Murray CJL, Alamro NMS, Hwang H, Lee U. Digital public health and COVID-19. Lancet Public Health. Sep 2020;5(9):e469-e470. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Budd J, Miller BS, Manning EM, Lampos V, Zhuang M, Edelstein M, et al. Digital technologies in the public-health response to COVID-19. Nat Med. Aug 2020;26(8):1183-1192. [ CrossRef ] [ Medline ]
  • Zeeb H, Pigeot I, Schüz B, Leibniz-WissenschaftsCampus Digital Public Health Bremen. [Digital public health-an overview]. Bundesgesundheitsblatt Gesundheitsforschung Gesundheitsschutz. Feb 2020;63(2):137-144. [ CrossRef ] [ Medline ]
  • Biviji R, Vest JR, Dixon BE, Cullen T, Harle CA. Factors related to user ratings and user downloads of mobile apps for maternal and infant health: cross-sectional study. JMIR Mhealth Uhealth. Jan 24, 2020;8(1):e15663. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Borghouts J, Eikey E, Mark G, De Leon C, Schueller SM, Schneider M, et al. Barriers to and facilitators of user engagement with digital mental health interventions: systematic review. J Med Internet Res. Mar 24, 2021;23(3):e24387. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Whitelaw S, Pellegrini D, Mamas M, Cowie M, Van Spall HGC. Barriers and facilitators of the uptake of digital health technology in cardiovascular care: a systematic scoping review. Eur Heart J Digit Health. Mar 2021;2(1):62-74. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Greenhalgh T, Wherton J, Papoutsi C, Lynch J, Hughes G, A'Court C, et al. Beyond adoption: a new framework for theorizing and evaluating nonadoption, abandonment, and challenges to the scale-up, spread, and sustainability of health and care technologies. J Med Internet Res. Nov 01, 2017;19(11):e367. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Rodriguez-Villa E, Torous J. Regulating digital health technologies with transparency: the case for dynamic and multi-stakeholder evaluation. BMC Med. Dec 03, 2019;17(1):226. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Lorencatto F, Charani E, Sevdalis N, Tarrant C, Davey P. Driving sustainable change in antimicrobial prescribing practice: how can social and behavioural sciences help? J Antimicrob Chemother. Oct 01, 2018;73(10):2613-2624. [ CrossRef ] [ Medline ]
  • Lampe K, Mäkelä M, Garrido MV, Anttila H, Autti-Rämö I, Hicks NJ, et al. European Network for Health Technology Assessment (EUnetHTA). The HTA core model: a novel method for producing and reporting health technology assessments. Int J Technol Assess Health Care. Dec 2009;25 Suppl 2:9-20. [ CrossRef ] [ Medline ]
  • Joore M, Grimm S, Boonen A, de Wit M, Guillemin F, Fautrel B. Health technology assessment: a framework. RMD Open. Nov 03, 2020;6(3):e001289. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Unsworth H, Dillon B, Collinson L, Powell H, Salmon M, Oladapo T, et al. The NICE Evidence Standards Framework for digital health and care technologies - developing and maintaining an innovative evidence framework with global impact. Digit Health. 2021;7:20552076211018617. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • What is health technology assessment (HTA)? International Network of Agencies for Health Technology Assessment. URL: https://www.inahta.org/ [accessed 2024-07-22]
  • Kirwin E, Round J, Bond K, McCabe C. A conceptual framework for life-cycle health technology assessment. Value Health. Jul 2022;25(7):1116-1123. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Mummah SA, Robinson TN, King AC, Gardner CD, Sutton S. IDEAS (Integrate, Design, Assess, and Share): a framework and toolkit of strategies for the development of more effective digital interventions to change health behavior. J Med Internet Res. Dec 16, 2016;18(12):e317. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Bradway M, Carrion C, Vallespin B, Saadatfard O, Puigdomènech E, Espallargues M, et al. mHealth assessment: conceptualization of a global framework. JMIR Mhealth Uhealth. May 02, 2017;5(5):e60. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Khoja S, Durrani H, Scott RE, Sajwani A, Piryani U. Conceptual framework for development of comprehensive e-health evaluation tool. Telemed J E Health. Jan 2013;19(1):48-53. [ CrossRef ] [ Medline ]
  • Henson P, David G, Albright K, Torous J. Deriving a practical framework for the evaluation of health apps. Lancet Digit Health. Jun 2019;1(2):e52-e54. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Kowatsch T, Otto L, Harperink S, Cotti A, Schlieter H. A design and evaluation framework for digital health interventions. it - Inf Technol. 2019;61(5):253-263. [ CrossRef ]
  • Stratil JM, Baltussen R, Scheel I, Nacken A, Rehfuess EA. Development of the WHO-INTEGRATE evidence-to-decision framework: an overview of systematic reviews of decision criteria for health decision-making. Cost Eff Resour Alloc. Feb 11, 2020;18(1):8. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Vanderkruik R, McPherson ME. A contextual factors framework to inform implementation and evaluation of public health initiatives. Am J Eval. Oct 03, 2016;38(3):348-359. [ CrossRef ]
  • Muellmann S, Pan C, Jahnel T, Forberger S, Jürgens D, Barnils N. Frameworks for the development and evaluation of digital technologies for public health: a scoping review protocol. OSF Registries. URL: https://osf.io/n8jge [accessed 2024-07-22]
  • Frameworks for the development and evaluation of digital technologies for public health. Open Science Framework. URL: https://osf.io/ku38m/ [accessed 2024-07-22]
  • Tricco AC, Lillie E, Zarin W, O'Brien KK, Colquhoun H, Levac D, et al. PRISMA Extension for Scoping Reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med. Oct 02, 2018;169(7):467-473. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Beauchamp T, Childress J. Principles of Biomedical Ethics. Oxford, UK. Oxford University Press; 2012.
  • Pan CN, Jürgens D, Muellmann S, Janetzki S, Kolschen J, Freye M. Developing and Assessing Digital Public Health Interventions: A Comprehensive Framework. 1st Version. Bremen. Leibniz ScienceCampus Digital Public Health; 2022.
  • Ellmann S, Maryschok M, Schöffski O, Emmert M. The German COVID-19 digital contact tracing app: a socioeconomic evaluation. Int J Environ Res Public Health. Nov 02, 2022;19(21):14318. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Pozo-Martin F, Beltran Sanchez MA, Müller SA, Diaconu V, Weil K, El Bcheraoui C. Comparative effectiveness of contact tracing interventions in the context of the COVID-19 pandemic: a systematic review. Eur J Epidemiol. Mar 16, 2023;38(3):243-266. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Pan CN, Jürgens D, Muellmann S, Janetzki S, Kolschen J, Freye M. Developing and assessing digital public health interventions: a digital public health framework (DigiPHrame). OSF Registries. URL: https://osf.io/ub3w4 [accessed 2024-07-22]

Abbreviations

CWA: Corona-Warn-App
DiPH: digital public health
DigiPHrame: digital public health framework
ESF: Evidence Standards Framework
HTA: health technology assessment
LSC DiPH: Leibniz ScienceCampus Digital Public Health
OSF: Open Science Framework

Edited by A Mavragani; submitted 03.11.23; peer-reviewed by L Maaß, G Humphreys, BC Silenou, V Zander; comments to author 29.01.24; revised version received 27.03.24; accepted 27.06.24; published 12.09.24.

©Tina Jahnel, Chen-Chia Pan, Núria Pedros Barnils, Saskia Muellmann, Merle Freye, Hans-Henrik Dassow, Oliver Lange, Anke V Reinschluessel, Wolf Rogowski, Ansgar Gerhardus. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 12.09.2024.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research (ISSN 1438-8871), is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.


WHO and partners establish an access and allocation mechanism for mpox vaccines, treatments, tests

In coordination with Member States, the World Health Organization (WHO) and partners have established an access and allocation mechanism for mpox medical countermeasures including vaccines, treatments and diagnostic tests. The Access and Allocation Mechanism (AAM) will increase access to these tools for people at highest risk and ensure that the limited supplies are used effectively and equitably.   

This is part of the response to the public health emergency of international concern declared by WHO Director-General Dr Tedros Adhanom Ghebreyesus on 14 August 2024, following an upsurge of mpox in the Democratic Republic of the Congo and neighbouring countries. Fifteen countries in Africa have reported mpox this year. Recommendations issued on the advice of the International Health Regulations Emergency Committee asked States Parties to ensure “equitable access to safe, effective and quality-assured countermeasures for mpox.”

“Alongside other public health interventions, vaccines, therapeutics and diagnostics are powerful tools for bringing the mpox outbreaks in Africa under control,” said WHO Director-General Dr Tedros Adhanom Ghebreyesus. “The COVID-19 pandemic illustrated the need for international coordination to promote equitable access to these tools so they can be used most effectively where they are most needed. We urge countries with supplies of vaccines and other products to come forward with donations, to prevent infections, stop transmission and save lives.” 

The AAM was established as part of the interim Medical Countermeasures Network (i-MCM-Net). The i-MCM-Net brings together partners from around the world, including UN and other international agencies, health organizations, civil society organizations, industry, and the private sector, to build an effective ecosystem for the development, manufacturing, allocation and delivery of medical countermeasures. The network was endorsed by WHO Member States as a mechanism to operate in the interim, as negotiations continue towards a pandemic agreement.

Along with WHO, the AAM for mpox includes members of the i-MCM-Net: the Africa Centres for Disease Control and Prevention, the Coalition for Epidemic Preparedness Innovations, the EU’s Health Emergency Preparedness and Response Authority, FIND, Gavi, the PAHO Revolving Fund, UNICEF, Unitaid and others. 

Over 3.6 million doses of vaccines have been pledged for the mpox response.  This includes 620 000 doses of the MVA-BN vaccine pledged to affected countries by the European Commission, Austria, Belgium, Croatia, Cyprus, France, Germany, Luxembourg, Malta, Poland, Spain, and the United States of America, as well as vaccine manufacturer Bavarian Nordic. Japan has pledged 3 million doses of the LC16 vaccine, the largest number of doses pledged so far.   

The recent surge in mpox cases, coupled with the limited availability of vaccines and other medical countermeasures, underscores the need for a collaborative and transparent process to distribute these critical resources fairly. The AAM is working to allocate the currently scarce supplies of vaccines and diagnostics to those at highest risk of infection, including by vaccinating contacts of confirmed cases and by providing point-of-care diagnostics to countries with ongoing mpox outbreaks, so that suspected cases can be systematically tested and cared for.

The AAM will operate based on these guiding principles:  

  • Preventing illness and death: Prioritize vaccination and other tools to interrupt transmission for those at greatest risk to prevent illness and death.
  • Mitigating inequity: Ensure equitable access to medical countermeasures for all people at risk, irrespective of socio-economic or demographic background.
  • Ensuring transparency and flexibility: Establish and maintain clear and open communication about allocation decisions and be ready to adapt strategies as new data emerge or situations change.

“WHO and partners are supporting the government of the Democratic Republic of the Congo and other countries to implement an integrated approach to case detection, contact tracing, targeted vaccination, clinical and home care, infection prevention and control, community engagement and mobilization, and specialized logistical support,” said Dr Mike Ryan, Executive Director of WHO’s Health Emergencies Programme. “The AAM will provide a reliable pipeline of vaccines and other tools in order to ensure the success on the ground in interrupting transmission and reducing suffering.”  

Media Contacts

WHO Media Team

World Health Organization

More information about i-MCM-Net

Mpox global strategic preparedness and response plan

WHO's work on mpox

IMAGES

  1. (PDF) Structural and Process Interventions for Organizational

  2. 18 Organizational Development Examples From Companies

  3. 8 Steps For Organizational Development Interventions

  4. An Overview of Organizational Development Interventions

  5. Organizational Development Interventions

  6. 5 Phases of Organizational Development (Goals & Interventions)

VIDEO

  1. Issues in Organizational Development | Part 1

  2. Chapter II of Organisational Diagnosis and Development - by Dr. Shalmali Gadge

  3. Case Study: Organisational Restructuring

  4. Case Study: Organizational Behaviour: Personality

  5. Workplace Change to Reduce Turnover and Improve Wellbeing (Part 2)

  6. Evaluate

COMMENTS

  1. 18 Organizational Development Examples From Companies

    Organizational redesign at Corning: Glassware manufacturer Corning had a mold machine shop struggling with cost overages and slow delivery. Redesigning the shop's structure and workflow and training employees in communication and high-performance skills led to lower costs, increased profits, and better-skilled employees.

  2. 20 OD Interventions Every HR Practitioner Should Know

    Henkel - Case Study: Building future-proof digital HR capabilities with an in-house academy. ... Organizational development interventions are not the same as ad hoc transformation efforts, for example, when a company makes change decisions once a problem arises and on the go. Instead, an OD intervention strategy is a systematic, research-based ...

  3. Cases and Exercises in Organization Development & Change

    Part I: Cases in the Organization Development Process. Case 1: Contracting for Success: Scoping Large Organizational Change Efforts. Case 2: The Discipline Dilemma in Rainbow High School. Case 3: A Case of Wine: Assessing the Organizational Culture at Resolute Winery. Case 4: Utilizing Exploratory Qualitative Data Collection in Small ...

  4. How to design, implement and evaluate organizational interventions for

    Method. Inspired by Mode 2 knowledge production (Gibbons et al., 1994), we brought together transdisciplinary practitioners and academics with experience of organizational interventions and took them through a process to identify key principles for designing, implementing, and evaluating organizational interventions. The core group consisted of 11 academic experts (the authors) from ...

  5. Understanding an organizational change and development intervention

    Understanding an organizational change and development intervention applied in a Global Software Industry: A case ... OBJECTIVE: This study aims to carry out a case study to identify the motivations and actions that supported an episodic organizational change (EOC) in a software industry company that ended up in the adoption of a new model of team ...

  6. Organization Development Interventions

    To effectively adapt and thrive in today's business world, organizations need to implement effective organizational development (OD) interventions to improve performance and effectiveness at the individual, group, and organizational levels. OD interventions involve people, trust, support, shared power, conflict resolution, and stakeholders' participation, just to name a few.

  7. What are organizational development interventions?

    Organizational development (OD) interventions are planned activities or projects aimed at improving an organization. This could be anything from team-building workshops and leadership training to revamping how a company manages its projects. The goal is to boost performance, enhance communication, and foster a positive workplace culture.

  8. Cases and Exercises in Organization Development & Change

    Designed for courses in organization development and change, this is a comprehensive collection of case studies and exercises. Original cases are written by experts in the field and designed to focus very precisely on a specific topic in the OD process or intervention method. Each case is accompanied by learning objectives, discussion questions, references, and suggested additional readings.

  9. (PDF) Strategic Organizational Development Interventions: A Case of

    Organizational development interventions are a sequential flow of activities, actions, and events intended to help an organization improve its performance and effectiveness (Das and Bhatt, 2016) ...

  10. Using Case Studies for Organizational Learning in Development Agencies

    12.4 Lessons Learned in Aligning Case Studies with an Organizational Learning Agenda. In the previous section we noted that case studies on development practice are used in different ways and with different levels of systematization for the purpose of organizational learning. Here we can make use of our IGOIL categorization to explain how case ...

  11. Cases and Exercises in Organization Development & Change

    Mary K. Foster and Vicki F. Taylor. Case 9. Organization Culture - Diagnosis and Feedback. Bruce O. Mabee. Case 10. Engaging Broader Leaders in the Strategic Planning of Lincoln Women's Services. Maria Vakola. Case 11. Resistance to Change: Technology Implementation in the Public Sector.

  12. Cases and Exercises in Organization Development and Change

    Designed for courses in organization development and change, this is a comprehensive collection of case studies and exercises. Original cases are written by experts in the field and designed to focus very precisely on a specific topic in the OD process or intervention method. ... Cases in Organization Development Interventions ... Case 18 ...

  13. Organization Development Techniques: Their Impact on Change

    Organization development (OD), as an applied arm of the field of organizational behavior, purports to facilitate organizational change through the use of a variety of change interventions. ... A Meta-Analysis Method for OD Case Studies. ... The Comparative Impact of Organization Development Interventions on Ha...

  14. Using Case Studies for Organizational Learning in Development Agencies

    12.3 Using Case Studies for Organizational Learning in Four Development Agencies. Organizations have different ways of curating, documenting, and mobilizing knowledge. Generating and using case studies as a tool for organizational learning requires a considerable investment of an organization's time and effort.

  15. (PDF) Learning and Organizational Change: A Case Study of Using

    Abstract: The purpose of the paper is to examine how organizational development and change (ODC) consultants engage in complex processes of facilitating and implementing team interventions in ...

  16. Synthesising Practice-Based Case Study Evidence From Community

    Practice-based case studies report on the evidence generated from the implementation of an intervention in a real-life practice setting and include the learning from those involved in the development and delivery of that intervention. Such case studies typically provide a narrative explaining how the intervention developed in that context and ...

  17. Developing and Evaluating Digital Public Health Interventions Using the

    Background: Digital public health (DiPH) interventions may help us tackle substantial public health challenges and reach historically underserved populations, in addition to presenting valuable opportunities to improve and complement existing services. However, DiPH interventions are often triggered through technological advancements and opportunities rather than public health needs.

  18. WHO and partners establish an access and allocation mechanism for mpox

    In coordination with Member States, the World Health Organization (WHO) and partners have established an access and allocation mechanism for mpox medical countermeasures including vaccines, treatments and diagnostic tests. The Access and Allocation Mechanism (AAM) will increase access to these tools for people at highest risk and ensure that the limited supplies are used effectively and equitably.