Introducing the Project Execution Plan

By Kevin Mattheys


Globally, skilled human capital is in short supply, impacting the quality, cost and schedules of projects. This applies to operating and service companies alike as experienced personnel retire while projects become ever more complex. In many cases the result is an inability to meet the delivery expectations set out in the business case, which directly impacts the financial and reputational health of the business concerned. It is therefore important to have the appropriate systems in place, with the supporting tools, processes and resources, to protect against these increasing levels of risk and thereby enhance project performance.

Independent Project Analysis (IPA), a reputable project benchmarking company, states that project execution planning is the process of defining and documenting (via the Project Execution Plan) the approach to be followed in executing a capital project (Merrow, 2011). The Project Execution Plan must answer some basic questions, such as:

  • What is the business need and what are the project objectives?
  • Who will participate, when will they participate and what roles will they have?
  • How will the project be contracted, sequenced, managed and controlled?
  • When will stage transitions and specific activities take place?
  • What monitoring, control and governance criteria need to be applied?
  • Are there any extraordinary initiatives that may be required which need to be planned and budgeted for?

By answering these and other questions definitively, and committing them to paper in the project execution plan (PEP), substantial cost and schedule savings can be achieved, quality improved and scope changes reduced.

Merrow (2011) highlights the fact that one of the most important drivers of project success during the Implementation phase lies in developing a sound, well-thought-through PEP during the early stages of the project. A well-defined and communicated PEP is a key driver of cost and schedule reduction, with as much as a 10% to 15% saving in schedule slip and cost. Other noteworthy findings were that a well-defined PEP correlated with improved start-up duration, early operational performance, the amount of contingency required in the estimate and the number of design changes during execution. These findings confirm the results of a 2006 study by the Construction Industry Institute (CII, 2006).

The Project Execution Plan is used by the project team and management to assure, firstly, that the right aspects of project implementation are considered and, secondly, that the project has been described clearly and concisely enough that during each stage of front-end loading (FEL) it is evident what needs to be done.

Setting the scene

A section of the OTC Stage-Gate Model is shown in Figure 1. It depicts the Initiation phase, where the business prepares the initial idea; the Front-end Loading (FEL) phase, consisting of three stages in which the project team develops the business idea further; the Implementation phase, including Delivery and Commissioning; followed by a sustainable Operation phase and eventual Closure. This is not the only model that can be followed, but whatever model is used should be at least similar to the one below and take a gated approach to delivering projects.

Figure 1:  Section of the OTC Stage-Gate Model

The PEP goes through a cyclical process of updates during each of the FEL stages until the end of FEL 3 is reached. At the start of the project (FEL 1) there is only preliminary information available and what is known is written up in the PEP. As the project develops further, more clarity is gained and this is then captured in the PEP until the project is sufficiently defined to implement.

Development of a Project Execution Plan

A PEP development model

Development of the Project Execution Plan is initiated at the start of the FEL 1 stage. Although many variations of a PEP are available, development should be based on a model similar to that developed by OTC and shown in Figure 2. We normally find that several elements are missing or incomplete, e.g. close-out and next-stage planning, which is why this comprehensive model was developed.

Figure 2:  The PEP Development Model

The PEP is a document that is continually updated during each of the FEL stages until the end of FEL 3. Each of the major sections that make up the PEP Development Model is described in more detail below:

Background, Overview & Scope

We start off with the yellow oval in the PEP Development Model.  A vital part of any PEP is to describe the project by looking at the business objectives, the business value chain, the project scope, potential risks that could stop or delay the project, boundary conditions for the team and other critical elements of the project to align all parties.

It is important that this section is well written as it sets the scene (and the scope) for the remainder of the project and provides the basis around which further development, and eventually implementation, of the project takes place.  The business charter (what the business expects from the project team) is also included here.

Frame the Project

This section starts with a comprehensive business chain development workshop (called a Framing & Alignment meeting) and includes an overview of the scope, some high-level milestones and a first pass cost estimate. This is shown in the pink/orange coloured oval above.

Of key importance here are, inter alia, confirming the project execution outcomes, understanding the Work Breakdown Structure and capital cost estimate, the key project milestones and schedule assumptions, key project stakeholders, the high-level implementation strategy, and the requirements for integration management (normally required on larger projects).

Planning Project Implementation

Here one would describe the project team and systems required to prepare the project for executing the various FEL stages as well as for final project implementation. This is the green oval shown below.

The grouping of blue ovals describes the various plans required for the implementation of the project. This is when all the plans come together; they specifically address design/engineering, contracting/procurement, construction, commissioning and close-out of the project. It is very important to plan for project close-out, as this activity is often skipped due to time, resource or budget constraints.

Turning to the dark blue central circle of the PEP model we find four categories of plans listed, namely monitor and control plans, supporting plans, support services and project governance plans. We discuss each in turn.

Monitor and Control Plans

Without control you are flying blind, and you will end up in unexpected places with less than desirable results. The various plans required for monitoring and control activities during the project are described here. Typical plans are the project controls plans, safety, health & environmental plans, risk management plans, change management plans, quality plans and others.

Supporting Plans

Every project invariably needs support services which are traditionally available within the business and its structures. Typical support areas are project accounting, human resources, document management, lessons learnt and industrial relations. These should be listed and included in the PEP to support the project to achieve its objectives successfully.

An area usually neglected by project teams is ensuring excellent communication to stakeholders and shareholders via a communication and engagement plan. It is also important for budgeting purposes that these items are identified and included in the overall PEP.

Support Services

Described in this section is a list of generic services which may or may not be required for the project.

It is important that the various services required during the project are described as resources and budgets will need to be sought. These could be items such as project benchmarking, team effectiveness surveys, corporate social responsibility programmes or other initiatives to assist the project.

Project governance

For any good project to be implemented successfully, certain key decisions need to be made at various junctures along the project time line. In order to support or guide these decisions, certain governance activities are either mandatory or negotiable. Gate reviews are mandatory as are certain procedures or approval limits. Negotiable items could include exceptions to the corporate approved vendor database or spending approval levels depending on the unique nature of the project. They need to be documented and agreed, however.

A governance structure consisting of reporting requirements, boards, steering committees and other steering and/or approval forums is also a prerequisite for this section.

Additional requirements

The sections above describe the PEP in broad outline.  Not shown on the model in Figure 2 are two important items that should be included in a PEP, namely:

  • Next Stage Plan: Whilst the sections above are related to the generic project, this section requires that a certain amount of preparation work be done to ensure the activities and deliverables required in the next stage are addressed. As an example, the work to be done during FEL 3 must be planned during FEL 2.
  • References: Certain documents are critical inputs to a PEP but are normally too lengthy to include in it. Examples are the Business Plan or Project Information Memorandum. Critical information is gleaned from these documents, but they are very comprehensive and do not fit well within a PEP. Rather extract the required information and refer the reader back to the source documents, which can also be attached as addenda to the PEP.

The various sections described above are then translated into a typical table of contents for the PEP. This is by no means a definitive list, but is a very good starting point for most projects.  By following this model, a generic PEP can be a useful way to ensure a consistent approach is used.  It also provides a useful framework for communicating and aligning with all role players.
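One way of operationalising such a table of contents is to keep the model's sections as a simple checklist and test it for completeness before each gate review. The sketch below uses the section names from the model above; the status values are invented for illustration.

```python
# A minimal sketch: track the PEP sections from the development model as
# a checklist and flag gaps before a gate review. Status values here are
# invented for illustration.
PEP_SECTIONS = [
    "Background, Overview & Scope",
    "Frame the Project",
    "Planning Project Implementation",
    "Monitor and Control Plans",
    "Supporting Plans",
    "Support Services",
    "Project Governance",
    "Next Stage Plan",
    "References",
]

def gate_review_gaps(status: dict) -> list:
    """Return PEP sections that are missing or not yet complete."""
    return [s for s in PEP_SECTIONS if status.get(s) != "complete"]

status = {s: "complete" for s in PEP_SECTIONS}
status["Next Stage Plan"] = "draft"   # a commonly neglected section
del status["References"]              # missing entirely

print(gate_review_gaps(status))  # ['Next Stage Plan', 'References']
```

The same list can of course live in a document template or register rather than code; the point is that completeness is checked explicitly at every gate rather than assumed.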

The PEP Development Cycle

It is extremely important to remember that the development of a PEP is an iterative process from FEL 1 to the end of FEL 3 where the level of information for each section of the PEP becomes progressively more detailed as more knowledge and insight is gained about the project. At the end of FEL 3, the PEP becomes the definitive plan for project implementation. It also describes/prescribes the role and governance requirements of engineering and other contractors as part of their work in contributing to the success of the project. The development of a PEP starts during the FEL 1 or prefeasibility stage as shown in Figure 3.

Figure 3:  The PEP Development Cycle

The input required to start the development of the PEP is the sponsor mandate and a project kick-off meeting. The project charter and business objectives also provide inputs that need to be considered in developing the PEP. By following the model shown in Figure 3, one gets a good idea of the level of definition the PEP requires during each of the stages.  These range from philosophy statements to preliminary plans to definitive plans by the end of FEL 3.

PEP development is not the responsibility of the project manager alone; every team lead needs to understand the PEP and provide input into the sections they are responsible for. The items covered in the PEP remain consistent throughout the project life-cycle, but the level of detail increases through FEL 1, 2 and 3 as demonstrated above.

At the end of FEL1 and FEL 2, the PEP contains an overall view of the total project life-cycle and implementation plan, as well as the detailed plan for the next stage. At the end of FEL 3 the PEP contains the full Implementation phase plan, covering project delivery and commissioning, and will therefore form the definitive basis for the Project Control Base against which all progress, performance payments and changes will be measured and reported.

Closing remarks

Front-end planning has long been recognised as an important process that increases the likelihood of project success (Hansen, Too & Le, 2018). CII (2006) states that front-end planning and the development of a PEP comprise a process of developing enough strategic information with which owners can address project and business risk and decide to commit resources to a project.

The PEP is a vital component of the project manager’s and project team’s armoury.  It sets out the scope, mandate, plans, etc. of what the project is going to deliver.  It acts as an extremely important communication tool and should not be treated lightly.  Many project teams seem to think that once the plan is updated that is the end.  On small projects you may get away with it, but on large projects you do so at your peril.


CII (Construction Industry Institute), 2006, RS213-1 – Front End Planning: Break the Rules, Pay the Price. Austin, Texas: Construction Industry Institute, The University of Texas at Austin.

Hansen, S., Too, E. & Le, T., 2018, Retrospective look on front-end planning in the construction industry: A literature review of 30 years of research. International Journal of Construction Supply Chain Management Vol. 8, No. 1.

Merrow, E.W., 2011, Industrial megaprojects: concepts, strategies, and practices for success. John Wiley & Sons, Inc., Hoboken, New Jersey.

Disrupting Project Controls – Fast Forward 20 Years

By Kevin Mattheys


Sitting in the OTC office in Wellington, Western Cape (near Cape Town, South Africa), I was looking out of the window at the welcome rain falling gently to the ground and providing the trees and vegetation some much needed sustenance.  I was also thinking about how the world has changed over my lifetime. It was then that my weekly issue of MindBullets popped up in my email account.  MindBullets is a free weekly publication from Future World where they try and predict future events based on current knowledge and technology.

This MindBullets article discussed ‘disruption’, the rate at which it is happening and some of the technologies creating it.  Whilst reading the article, I thought it prudent to write this piece on where the project controls industry will find itself 20 years from now as a ‘conversation starter’.  Twenty years might sound like a long time, but it could also happen sooner, so I have been somewhat conservative.  I suspect not much work has currently been done on the topic, as there still appears to be a proliferation of spreadsheets and in-the-drawer databases in use.

This phase in the ongoing advancement of humanity is what is commonly referred to as the ‘4th Industrial Revolution’.  Some of the questions that will need to be asked are:

  • Can machines and/or technology help us implement projects better?
  • Will project controls even exist in future and, if so, in what shape or form?
  • How much automation will take place? and
  • What skills will be required to get the best from technology?

These are some of the questions that will need to be tackled in almost all organisations involved in projects and project controls activities.  There will no doubt be advantages and disadvantages, but will the advantages outweigh the disadvantages?

I guess the question can be asked for project management and other disciplines as well, but for now let us focus on project controls.


Innovations in technology and technology related applications continue to proliferate at an astonishing pace.  Moore’s law certainly seems to be playing out in this field.  What was considered revolutionary a few years ago has had its head overturned and is now outdated.

I have been receiving the weekly articles from Future World since 2011.  What piqued my interest in 2014 was their prediction for a concept called 3D printing (sometimes referred to as additive manufacturing), and ever since then I have been intrigued by this technology.  Of course, this is not the only technology causing disruption, but the advances made in this field in the past four to five years have been staggering.

By implication one can probably assume that similar strides are being made in other technologies.  Let us look at each of the current technologies I am aware of and dissect them a little.  They are not addressed in any particular order, but provide a sample to help the reader better understand what is happening behind the scenes so that it can be factored into the conversations.

Current Disruptive Technologies

3D Printing (additive manufacturing)

Essentially this technology uses filaments of various types, e.g. plastic, metal, nylon, wood filler, carbon fibre, etc. The filament is passed through a heated nozzle, which reduces it to a liquid-like state that is then applied layer by layer to the object being produced.

Houses have been produced in some areas of the world using 3D printing (Byttner, 2016).  In Holland, the architectural firm Dus Architects has already printed a ‘Canal House’ with 3D-printers.  Another example is that the Chinese company Yingchuang New Materials in Shanghai is already 3D-printing 10 houses per day.  Recently, a 3D-printed office building was unveiled in Dubai.

Unconfirmed reports state that NASA no longer sends spare parts with their spacecraft.  If needed, a spare part can simply be printed in space. Other applications have already been developed in the medical, manufacturing, healthcare, optics, education and food industries.


Drones

Drones are a relative newcomer to the scene.  To the American military they are UAVs (Unmanned Aerial Vehicles) or RPAS (Remotely Piloted Aircraft Systems), but they are more commonly known as drones.  Drones are used in situations where manned flight is considered too risky or difficult.

Areas where drones are currently used include agriculture, recording of live events, surveying dangerous areas, delivery of small items, tracking wildlife, law enforcement and the shooting of commercials and movies.  Agriculture in particular is adopting this technology to monitor crops, watering patterns and soil suitability.  This is an area that will continue to grow as more and more creative uses are found for these machines.

As a practical application, why not use drones to survey a project being built, with the images sent back to a central location where further analytics, project progress measurement, etc. can be done without having to go to site?  Furthermore, if drone cameras are fitted with appropriate lenses, then, for example, welds can be analysed for cracks or hot spots can be detected.  The opportunities go on and on.

Artificial Intelligence (AI)

Artificial intelligence (AI) is intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans and other animals. AI strives to replicate human thinking and analysis, but via the machine.  Apart from being adept at playing chess, other areas where AI is currently used are voice recognition, speech recognition, medical diagnosis and search engines.

If recent predictions are to be believed, then this is a huge growth area.  According to a new market research report, Artificial Intelligence (AI) in Construction Market – Global Forecast to 2023 (MarketsandMarkets™, 2018), the global market is expected to grow from US$ 407.2 Million in 2018 to US$ 1,831.0 Million by 2023, at a compound annual growth rate of 35.1% during the forecast period.
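As a quick arithmetic check, the implied growth rate can be recomputed from the two endpoints quoted above (a minimal sketch; the figures are those from the MarketsandMarkets forecast):

```python
# Sanity-check the quoted forecast: US$407.2M (2018) growing to
# US$1,831.0M (2023) over 5 years implies a compound annual growth
# rate (CAGR) of roughly 35.1%.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate: (end/start)^(1/years) - 1."""
    return (end / start) ** (1 / years) - 1

rate = cagr(407.2, 1831.0, 5)
print(f"Implied CAGR: {rate:.1%}")  # roughly 35.1%
```

The recomputed rate matches the quoted 35.1%, so the forecast's endpoints and growth rate are internally consistent.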

Data Analytics

Data analysis is a process of inspecting, cleansing, transforming and modelling data with the goal of discovering useful information, suggesting conclusions and supporting decision-making.  In the past, large companies used relational databases to extract information for decision-making, but this was somewhat limited.  With the vast amounts of data now available both inside and outside the organisation, it has become necessary to analyse this information more quickly and more reliably.

Cloud Data Storage

Cloud storage allows world-wide storage and retrieval of any amount of data at any time.  You can use cloud storage for a range of scenarios, including serving website content, storing data for archival and disaster recovery, or distributing large data objects to users via direct download.  I think this is also an area that is going to improve greatly.

Most of our work at OTC is based on working in the cloud and the benefits are not difficult to see.

Internet of Things (IoT)

The Internet of Things is the network of physical devices, vehicles, home appliances and other items embedded with electronics, software, sensors, actuators, and connectivity which enables these objects to connect and exchange data.

We already see widespread application of this technology, but what will the future hold for project controls practitioners?  Working from home, always available, data at the touch of a button.


Blockchain

Blockchain refers to a type of data structure that enables identifying and tracking transactions digitally and sharing this information across a distributed network of computers, creating in a sense a distributed trust network.  The distributed ledger technology offered by blockchain provides a transparent and secure means for tracking the ownership and transfer of assets.
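The hash-chaining idea behind that distributed trust can be sketched in a few lines: each block carries the hash of its predecessor, so altering any earlier record breaks the chain. This is an illustrative toy (the supplier records are invented, and a real blockchain adds consensus, signatures and networking):

```python
# Minimal hash-chain sketch: each block stores the SHA-256 hash of the
# previous block, so tampering with any earlier record invalidates every
# link that follows.
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, data: dict) -> None:
    """Append a block that points at the hash of the current chain tip."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "data": data})

def chain_is_valid(chain: list) -> bool:
    """Verify every block still points at the true hash of its predecessor."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain: list = []
add_block(chain, {"item": "pipe spools", "qty": 40, "supplier": "A"})
add_block(chain, {"item": "valves", "qty": 12, "supplier": "B"})
print(chain_is_valid(chain))   # True
chain[0]["data"]["qty"] = 400  # tamper with an earlier delivery record
print(chain_is_valid(chain))   # False
```

It is this tamper-evidence, extended across many independent computers, that makes the technology attractive for tracking materials and equipment in a supply chain.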

What will this technology bring to the procurement processes of companies?  It will almost certainly ensure the integrity of materials and equipment delivered to site.


Cryptocurrencies

Bitcoin, Ethereum, EOS, Ripple, Litecoin, Bitcoin Cash, Binance Coin, IOTA, TRON and NEO.  Do you recognise these terms? If you do not, then I suggest you find out what they are. A good place to start is the article, Bitcoin for beginners (Mayer, 2017).

According to Wikipedia (2018), a cryptocurrency is a digital asset designed to work as a medium of exchange that uses strong cryptography to secure financial transactions, control the creation of additional units, and verify the transfer of assets.  A cryptocurrency is a kind of digital currency, virtual currency or alternative currency. Cryptocurrencies use decentralised control as opposed to centralised electronic money and central banking systems. Who knows how many other ‘currencies’ will pop up in future to challenge the current crop of cryptocurrencies.

Now the questions get interesting.  What does this mean for projects and companies that use traditional payment, financing and procurement currencies?  Are cost controllers and estimators able to control and estimate costs with this technology?  How will the financial reports be prepared and presented? Remember that with this technology many of the traditional transaction fees, interest charges and other costs associated with lending institutions could effectively fall away.

Facial / Object Recognition

A facial recognition system is a technology capable of identifying or verifying a person from a digital image or a video frame from a video source.  There are multiple methods in which facial recognition systems work, but in general, they work by comparing selected facial features from a given image with faces within a database.

John Holland is an Australian construction company that is actively embracing technology.  In the field of safety, it uses facial recognition technology to identify workers who are not wearing the appropriate personal protective equipment (PPE) on site (McLean, 2018).

Other disruptive technologies

Other disruptive technologies which we haven’t even touched on include clean energy, self-driving vehicles and biotechnology.  It seems as if ‘disruption’ is the new way of the world: the more of it, the better, apparently.

I sometimes wonder about whether it is always a good thing…

Generic Skills for the Future

The generic skills listed below will remain essential in the future work environment:

  • Complex Problem Solving: The skill to see relationships between industries and craft creative solutions to problems that are yet to appear;
  • Critical Thinking: People who can turn data into insightful interpretations will be sought after due to the complexity and interconnectedness of fields such as computer science, engineering and biology;
  • Creativity: The ability to build something out of ideas is a skill that will pay off now and in future;
  • People Management: Robots may acquire analytical and mathematical skills, but they cannot replace humans in leadership and managerial roles that require people skills;
  • Coordinating with others: Effective communication and team collaboration will be a top demand in any company;
  • Emotional Intelligence: Qualities such as empathy and curiosity will be expected of future managers;
  • Judgement and Decision-Making: The ability to condense vast amounts of data with the help of data analytics into insightful interpretations and measured decisions;
  • Service Orientation: People who know the importance of offering value to clients in the form of services and assistance;
  • Negotiation skills: Deriving win-win situations with businesses and individuals will be extremely important; and
  • Cognitive Flexibility: The ability to switch between different personas as the situation demands.

Do your corporate training programs address these areas?  Will you need to up-skill yourself?

Figure 1:  The future of project controls

My View on Project Controls in 20 Years

I am of the firm opinion that all the above technologies, as well as others which will still be developed in the future, will in some way, shape or form impact on project controls.

Data Analytics (or what is sometimes referred to as ‘Big Data’) will play a large role in project controls enabled by cloud computing. Modelling of plants or facilities will become the norm, and everything will be planned and modelled to the finest detail before any work is done on site.  Schedules will be automatically produced by the modelling software along with cost estimates, etc. from standardised templates.

Imagine a world where statistical simulations (possibly Monte Carlo or some other disruptive application) are the norm for cost estimates and schedules, and live data is used to track and monitor progress related to engineering, procurement, construction, etc.  This data is then used to provide an almost real-time statistical forecast of the end-of-job cost and schedule, which means a narrowing of the conventional distribution curves of today.  The data is then fed back automatically into the respective data engines, leading to significant productivity and forecasting gains for future projects. What, then, are the implications of a company managing multiple projects concurrently with all this data available in real time? Astonishing, to say the least.
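A Monte Carlo cost simulation of the kind described above can be sketched very simply: give each cost element a low / most likely / high range, sample repeatedly, and read off a P50/P90 range instead of a single-point estimate. The cost elements and figures below are invented for illustration:

```python
# Toy Monte Carlo cost simulation: each cost element gets a triangular
# (low, most likely, high) distribution; repeated sampling of the total
# yields percentile estimates such as P50 and P90.
import random

# Hypothetical cost elements in $ millions: (low, most likely, high)
ELEMENTS = {
    "engineering":  (8, 10, 15),
    "procurement":  (40, 50, 70),
    "construction": (30, 35, 55),
}

def simulate_total_cost(n_runs: int = 20_000, seed: int = 1) -> list:
    """Sample the total project cost n_runs times; return sorted totals."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_runs):
        # random.triangular takes (low, high, mode)
        totals.append(sum(rng.triangular(lo, hi, mode)
                          for lo, mode, hi in ELEMENTS.values()))
    return sorted(totals)

totals = simulate_total_cost()
p50 = totals[len(totals) // 2]
p90 = totals[int(len(totals) * 0.9)]
print(f"P50 ≈ {p50:.1f}, P90 ≈ {p90:.1f}  ($ millions)")
```

The step change imagined above is that the input distributions would no longer be expert guesses but would be calibrated continuously from live project data.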

There will be fundamental change in procurement (blockchain) and construction practices (3D printing and recognition software for permitting, site access, etc.).  Collection of as-built drawings will no longer be an issue, as all drawings will be attached to their correct repositories and updated online for immediate storage back into the database.

Project controls in its current format will change and may even merge with the project management function or just become a data engine feed, but with analytical capabilities. The availability of real time data will increase the project controllers’ productivity enormously as the required data will be online and not in some obscure place where it is difficult to find. Collecting, updating and storing data will continue to ensure decisions can be made easier and faster.  Transparency of data will be enhanced.  No more in-the-drawer spreadsheets or databases.  No more hiding of data.  No more manipulating of data and reports to play to someone’s agenda.

Gavin Halse, currently a consulting partner at OTC, discusses the impact that product life-cycle management (PLM) may have in an interesting article on the subject.  He contends that in the world of future projects there will be a shift from traditional megaprojects towards smaller, more agile projects involving many more stakeholders and participants in a new networked economy (Halse, 2017).  This will have a significant impact on project controls, because project complexity will increase exponentially. The long waterfall sequence of design, build, commission, hand over to the operator and operate will change.

Multi-purpose factories will adapt product lines to toll-manufacture on demand, and many projects will be to retread plant to produce new products – akin to stay-in-business work, but on a much bigger scale. The owner’s role will also change, as will the operator’s. This will mean that project controls and business modelling/operations will converge much more rather than operating in separate silos.

I think another change that will emerge from all of this will be in the commercial and legal fields, as I do not believe we have given these much thought, e.g. how do we write and manage contracts for a digital age with so much disruption going on?

Concluding remarks

What was very interesting for me while preparing this article was the lack of published material on the web relating to the use of technology in the project controls or project management arenas.  It may mean that people are actively working on this in the background, but I doubt it.  Another interesting takeaway is a comment by Byttner (2016) that the construction industry has been somewhat reluctant to be dragged into the digital age and is now ready for disruption.

In the article above I have not delved into huge detail, but rather tried to make it light reading to stimulate the thinking and the conversations which will be necessary in preparing for disruption. It should serve as a starting point for tackling the role that will be required of project controls people (and others) in the future.

Whatever your view of the future of project controls is, be assured it will be different.  I would like to convey my thanks to John Hollman and Gavin Halse for providing further insight into this complex topic.


Develop and Analyse Project Schedules for Realism – Part 2

By Kevin Mattheys

This is the second part of a 2-part series of articles covering the development and control of robust, but realistic, project schedules.  The focus of the articles is as follows:

  • Part 1 – A composite scheduling process (published November 2016); and
  • Part 2 – Assessing schedules for realism.

This second part is an overview of schedule assessment metrics that can be used to ensure that project schedules are realistic.


A common management phrase states that “if you want to manage it, then measure it”. To this end, various metrics have been developed and used by organisations to assess the quality of schedules and improve project performance.  Some of the better-known are:

  • The Defense Contract Management Agency (DCMA), which has developed criteria for performing an objective and thorough analysis of a schedule;
  • The National Defense Industrial Association (NDIA), in its Planning and Scheduling Excellence Guide (PASEG) (2016), describes, in addition to the DCMA criteria, a set of generally accepted scheduling principles (GASP); and
  • The US Government Accountability Office (GAO) (2015) with its guide, the GAO Schedule Assessment Guide: Best Practices for Project Schedules.

While the guides mentioned above are all written with a focus on government projects, I have not found anything in them that is not applicable to projects in general.  In fact, the United States defence establishment has repeatedly been at the leading edge of developing procedures, standards and other tools that later prove useful to industry at large.  In this article, we’ll consider each of these guides in turn, and then try to identify areas of difference that can be used in putting an enhanced or improved schedule quality assessment system in place.


According to Winter (2011), the US Under Secretary of Defense for Acquisition and Technology mandated the use of an Integrated Master Schedule (IMS) for contracts greater than US$20 million in March 2005. He also directed the DCMA to establish guidelines and procedures to monitor and evaluate these schedules. The DCMA developed a programme in response to this requirement and released its 14-point assessment checks as a framework for schedule quality control.

However, according to the National Defense Industrial Association (NDIA), the generally accepted scheduling principles (GASP) were originally developed as a governance mechanism for the Program Planning and Scheduling Subcommittee (PPSS). The PPSS is a subcommittee formed by the Industrial Committee on Program Management (ICPM) working group under the auspices of NDIA. The GASP was thus developed collaboratively with inputs from both Government and Industry.

I think that in the context of this article, the important issue is that there is a GASP, irrespective of who developed it.  It is also essential to understand that the GASP is intentionally broad and sets high expectations for excellent scheduling, yet does not specify methodologies. Therefore, avoid viewing the GASP as dogma; instead, continually strive to meet or exceed the GASP with creative and practical approaches that suit the size, value, risk and complexity of the programme and the skills and capabilities of the programme team.

The GASP and the DCMA 14-Point Assessment Checks

The Defense Contract Management Agency’s schedule assessment system helps in determining schedule consistency, allows for constructive discussions based on the analysis, facilitates the setting of realistic schedule baselines, is based on proven metrics and provides two tripwires for early detection of possible deviations from the standard.

The generally accepted scheduling principles (GASP) are eight over-arching tenets for building, maintaining, and using schedules as effective management tools. The GASP is concise and easily understood, yet sets high expectations for programme or project management teams to develop and use schedules.

The first five GASP tenets describe the requisite qualities of a valid schedule; that is, one that provides complete, reasonable, and credible schedule information based on realistic logic, durations, and dates. The latter three GASP tenets reflect increased scheduling maturity that yields an effective schedule. An effective schedule provides timeous and reliable data, aligns time-phased resources, and is built and maintained using controlled and repeatable processes. Figure 1 below shows the groupings.


Figure 1:   The GASP tenet groupings

The GASP tenets serve several purposes, including that:

  • They provide high level targets for sound scheduling;
  • They also serve as a validation tool for the programme team or organisation to assess the schedule maturity or areas needing improvement;
  • They can be used as a governance tool to assess new or different scheduling approaches with objectivity and detachment; and
  • They can be used as a compliance standard in contracts for the accepted schedule.

Achieving a GASP-compliant schedule indicates it is not merely healthy, but fit. A healthy schedule is functional and meets minimum management purposes, but a fit schedule is robust and dynamic.

Let us now consider each of the 8 tenets and where the 14-point assessment checks fit in, starting with the “Valid” group of tenets:

  • Tenet 1 – Complete: Schedules must represent authorised discrete effort for the entire contract.
    • Assessment Point 1: High Duration – A high proportion of activities with this condition indicates a distinct lack of sufficiently detailed plans.
  • Tenet 2 – Traceable: Schedules reflect realistic and meaningful network logic that horizontally and vertically integrates the likely sequence for project execution. Coding plays a large part in this principle.
    • Assessment Point 2: Missing Logic – This metric is used to test the “completeness” of the schedule and see how well the activities in the schedule are linked together.
    • Assessment Point 3: Relationship Types – Finish-to-Start should be the most widely used relationship type; Start-to-Start and Finish-to-Finish should be used sparingly, and Start-to-Finish only by exception.
  • Tenet 3 – Transparent: Schedules provide full disclosure of project status and forecast and include documented ground rules, assumptions, schedule building and maintaining methods, approach to analysing critical paths, etc.
    • Assessment Point 4: Activities with Leads – The use of leads (negative lags) normally indicates that the schedule is not detailed enough or the scheduler is trying to “fix” activity dates. This makes it harder to analyse the critical path, distorts the total float in the schedule and may cause resource conflicts.
    • Assessment Point 5: Activities with Lags – The use of excessive lags complicates the schedule to the point of not being maintainable and ineffective in forecasting future dates. This makes it harder to analyse the critical path.
  • Tenet 4 – Statused: Schedule reflects consistent and regular updates of completed work, interim progress, achievable remaining durations relative to status date and accurately maintained logic relationships.
    • Assessment Point 6: Invalid Dates – Activities that have not started should not have projected start or finish dates before the status date. Activities should also not have actual start or finish dates after the status date.
    • Assessment Point 7: Missed Activities – This metric checks for the number of activities that have missed their baseline finish dates and are therefore not meeting the baseline plan.
    • Assessment Point 8: Baseline Execution Index (BEI) – Measures the number of activities completed as a percentage of the activities that should have been completed by the status date.
  • Tenet 5 – Predictive: Schedules accurately forecast the likely completion dates and impacts to the project baseline plan through valid network logic and achievable task durations from the status date to project completion.
    • Assessment Point 9: High Float – Excessive amounts of positive float may indicate incomplete schedule logic or an unstable network. Large negative float values may indicate a scheduler capturing the date incorrectly.
    • Assessment Point 10: Negative Float – Float less than 0 working days is considered negative float. This indicates that the project is overrunning its stated completion date and re-planning needs to take place to correct the situation.
    • Assessment Point 11: Critical Path Test – This test must be run manually. It is used to check the integrity of the critical path by ensuring that a delay introduced into the path has the appropriate impact on the end date.
    • Assessment Point 12: Critical Path Length Index (CPLI) – This is an indicator of the likelihood of completing the schedule on time and measures the relative efficiency required to complete the schedule on time.
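Two of the quantitative checks above, the Baseline Execution Index (Assessment Point 8) and the Critical Path Length Index (Assessment Point 12), reduce to simple formulas. The sketch below, with an assumed activity-record layout, shows the calculations as commonly given in DCMA guidance: BEI is activities completed divided by activities baselined to complete by the status date, and CPLI is (critical path length + total float) divided by critical path length.

```python
from datetime import date

def baseline_execution_index(activities, status_date):
    """BEI: activities actually completed divided by activities that
    should have been completed per the baseline (DCMA target >= 0.95)."""
    planned = [a for a in activities if a["baseline_finish"] <= status_date]
    completed = [a for a in activities if a.get("actual_finish") is not None]
    if not planned:
        return 1.0
    return len(completed) / len(planned)

def critical_path_length_index(cp_length_days, total_float_days):
    """CPLI: (critical path length + total float) / critical path length.
    1.0 means just on plan; below ~0.95 signals schedule pressure."""
    return (cp_length_days + total_float_days) / cp_length_days

# Hypothetical activity records for illustration
acts = [
    {"baseline_finish": date(2017, 3, 1), "actual_finish": date(2017, 3, 2)},
    {"baseline_finish": date(2017, 3, 10), "actual_finish": None},
    {"baseline_finish": date(2017, 4, 1), "actual_finish": None},
]
print(round(baseline_execution_index(acts, date(2017, 3, 15)), 2))  # 0.5
print(round(critical_path_length_index(100.0, -10.0), 2))           # 0.9
```

A BEI of 0.5 here means only half the activities that should have finished by the status date actually have; a CPLI below 1.0 shows the critical path is consuming more than its allotted time.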

The three tenets for the “Effective” group are:

  • Tenet 6 – Usable: Schedules provide meaningful metrics for timely and effective communication, tracking and improving performance, mitigating issues and risks as well as capturing opportunities.
    • Assessment Point 13: Hard constraints – The overuse of date constraints is considered one of the most common abuses of a schedule. Overuse leads to a “hardening” of the schedule and limits the forecasting capability of the schedule.
  • Tenet 7 – Resourced: Resources align with the schedule baseline and forecast to enable stakeholders to view and assess the time-phased labour and other costs required to achieve project baseline and forecast targets.
    • Assessment Point 14: No Assigned Resources – Without resources assigned it is extremely difficult to determine if there is a fighting chance of completing the activity / project. Resource profiles indicate resourcing needs and show if progress is being achieved / maintained at the rate planned for.
  • Tenet 8 – Controlled: Schedules are baselined and maintained using a rigorous, stable and repeatable process.  Schedule additions, deletions and updates conform to this process.
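Several of the assessment points above (missing logic, leads, lags and hard constraints) are straightforward to automate against an activity and relationship listing. The following sketch is illustrative only: the record layout and field names are assumptions, and real assessment tools exempt start and finish milestones from the missing-logic check.

```python
def schedule_health_checks(activities, links):
    """Four DCMA-style checks (field names and layout are assumed):
    missing logic, leads, lags and hard constraints."""
    ids = [a["id"] for a in activities]
    has_pred = {l["succ"] for l in links}   # activities with a predecessor
    has_succ = {l["pred"] for l in links}   # activities with a successor
    return {
        # Missing logic: no predecessor or no successor
        # (start/finish milestones are normally exempted)
        "missing_logic": [i for i in ids
                          if i not in has_pred or i not in has_succ],
        # Leads (negative lags) distort the critical path
        "leads": [l for l in links if l["lag"] < 0],
        # Positive lags complicate maintenance and forecasting
        "lags": [l for l in links if l["lag"] > 0],
        # Hard date constraints "harden" the schedule
        "hard_constraints": [a["id"] for a in activities
                             if a.get("hard_constraint")],
    }

acts = [{"id": "A"}, {"id": "B"}, {"id": "C", "hard_constraint": True}]
links = [
    {"pred": "A", "succ": "B", "lag": 0},
    {"pred": "B", "succ": "C", "lag": -2},   # a lead
]
report = schedule_health_checks(acts, links)
print(report["missing_logic"])    # ['A', 'C'] – A has no predecessor, C no successor
print(len(report["leads"]))       # 1
```

In practice each list would be expressed as a percentage of eligible activities and compared against the DCMA tripwire thresholds.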

The United States Government Accountability Office (GAO) Schedule Assessment Guide

GAO’s research tells us that the four characteristics of a high-quality, reliable schedule are that it is comprehensive, well-constructed, credible, and controlled.

A comprehensive schedule includes all activities for both the government (owner) and its contractors necessary to accomplish a program’s objectives as defined in the program’s WBS. The schedule includes the labour, materials, travel, facilities, equipment, and the like needed to do the work and depicts when those resources are needed and when they will be available. It realistically reflects how long each activity will take and allows for discrete progress measurement.

A schedule is well-constructed if all its activities are logically sequenced with the most straightforward logic possible. Unusual or complicated logic techniques are used judiciously and justified in the schedule documentation. The schedule’s critical path represents a true model of the activities that drive the program’s earliest completion date, and total float accurately depicts schedule flexibility.

A schedule is credible if it is horizontally traceable—that is, it reflects the order of events necessary to achieve aggregated products or outcomes. It is also vertically traceable: activities in varying levels of the schedule map to one another and key dates presented to management in periodic briefings are in sync with the schedule. Data about risks are used to predict a level of confidence in meeting the program’s completion date. Necessary schedule contingency and high-priority risks are identified by conducting a robust schedule risk analysis.

Finally, a schedule is controlled if trained schedulers update it regularly using actual progress and logic, based on information provided by activity owners, to realistically forecast dates for programme activities. Updates to the schedule are accompanied by a schedule narrative that describes salient changes to the network. The current schedule is compared against a designated baseline schedule to measure, monitor, and report the program’s progress. The baseline schedule is accompanied by a basis document that explains the overall approach to the program, defines ground rules and assumptions, and describes the unique features of the schedule. The baseline schedule and current schedule are subjected to configuration management control.

The GAO also indicates that there are 10 scheduling best practices required to achieve the four characteristics of a high-quality, reliable schedule.  They are self-explanatory and I will not delve into any of them here; suffice it to say they are mostly represented and described in the DCMA literature discussed earlier.  Table 1 shows how the 10 scheduling best practices map to these four characteristics.

Table 1:   The GAO Schedule Characteristics and Best Practices


The United States National Defense Industrial Association (NDIA)

The NDIA published the Planning & Scheduling Excellence Guide (PASEG) in March 2016 to provide the programme management team, including new and experienced master planners/schedulers, with practical approaches for building, using, and maintaining an Integrated Master Schedule (IMS). It also identifies knowledge, awareness and processes that enable the user to achieve reasonable consistency and a standardised approach to project planning, scheduling and analysis.

Sound schedules merge cost and technical data to influence programme management decisions and actions. Realistic schedules help stakeholders make key go-ahead decisions, track and assess past performance, and predict future performance and costs. Industry and Government agree that improving IMS integrity has a multiplier effect on improved programme management.

The three sections in the document that are of real interest to me are listed in Table 2.  The NDIA PASEG (available on the internet) can also serve as a useful guide for relatively new planners.

Table 2:   Three important sections of the NDIA PASEG


Key Observations

In both guides the GASP features prominently.  If we then adopt the GASP as the minimum schedule quality standard and do a comparative analysis with the GAO Schedule Assessment Guide and the NDIA Planning & Scheduling Excellence guide, the following can be observed:

Firstly, regarding the GAO Schedule Assessment Guide:

  • In this guide the GASP, by and large, is described with the same or similar words.
  • The guide seems to emphasise the planning process, rather than interrogating the schedule and analysing it with measurable results.
  • What I did like, and which will be included in the minimum standard, is best practice 1 “Capturing All Activities”. If one looks at the description, then it is very clear that the work breakdown structure (WBS) is central to this, as in theory “all activities need to link to a WBS element and all WBS elements need to link to an activity”.  This clearly then would need to be incorporated into tenet 1.

Secondly, regarding the NDIA Planning & Scheduling Excellence Guide:

  • This is an excellent and rather comprehensive guide to planning and scheduling. Certainly, value will be obtained by using this guide as a reference.
  • I love the fact that this guide makes the Integrated Master Schedule the central point of reference for schedules. This is critical for large or complex projects or programmes. I would suggest adding a measure to the GASP: whether all schedules linking to the Master Schedule have a minimum GASP quality measurement rating.
  • Another feature I like is the “Leadership, Buy-in and Commitment” section. This is an area that is generally very problematic: do approved roles and responsibilities exist for all parties involved in schedule development and maintenance, and how are these measured?  I also support introducing the Current Execution Index (CEI), which measures the team’s ability to forecast accurately.
  • This guide also emphasises Earned Value analysis, so one can add Schedule Performance to the mix. Cost Performance would be reported elsewhere, as the GASP is purely schedule related.
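For illustration, the Current Execution Index can be read as a hit rate on near-term forecasts. The sketch below is my own hedged interpretation, not the PASEG's formal definition: of the activities forecast to finish in a reporting period, the fraction that actually finished as forecast.

```python
def current_execution_index(forecast_ids, completed_ids):
    """CEI sketch (an interpretation, not the PASEG formula verbatim):
    of the activities forecast to finish in the reporting period,
    the fraction that actually finished."""
    forecast = set(forecast_ids)
    if not forecast:
        return 1.0  # nothing was forecast, so nothing was missed
    return len(forecast & set(completed_ids)) / len(forecast)

# Four activities were forecast to finish; A and C did, E finished unplanned
print(current_execution_index(["A", "B", "C", "D"], ["A", "C", "E"]))  # 0.5
```

A CEI well below 1.0 over successive periods would suggest the team cannot yet forecast its own near-term work reliably, which undermines confidence in the longer-range dates.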

Concluding remarks

This has been a fascinating exercise spurred on by Jurie Steyn, the editor of these articles, challenging me to expand and improve on the generic minimum standard GASP discussed at the start of this article.  By identifying additional value adding schedule measures, the quality of programme / project schedules improves, and hence the success of ventures improves.

In summary, four new measures will be added to the minimum standard GASP used within OTC.  This is by no means an all-inclusive list and we will continue searching for ways to improve on this to the benefit of all.  If you have any suggestions after reading this article, please pass them on, and let us get the mindset entrenched in the industry that quality schedules are the new norm and the preferred standard.


DCMA (Defense Contract Management Agency), Date unknown, Generally Accepted Scheduling Practices (GASP).

GAO (United States Governance Accountability Office), 2015, GAO-16-89G Schedule Assessment Guide.

NDIA (National Defense Industrial Association), 2016, Planning & Scheduling Excellence Guide (PASEG), Version 3.0.

Winter, R., 2011, DCMA 14-Point schedule assessment, Available as a PDF. Accessed on 30 November 2016.


You can download this and other Insight articles from our website at

Develop and Analyse Project Schedules for Realism


By Kevin Mattheys

This is the first of a 2-part series of articles covering the development and control of robust, but realistic, project schedules.   The focus of the articles is as follows:

Part 1 covers a project scheduling process that draws from the best available processes and consolidates them into a composite scheduling process.


When presented with a project schedule which shows the correct durations and task dates, we generally assume the schedule is acceptable.  We then establish the project baseline against which the project will be progressed.  Not long after the project kicks off, things start to drift from plan and then it becomes an ongoing exercise to continually replan and rebaseline.  This situation gradually deteriorates and eventually leads to project delays, delay claims and, in some instances, lengthy and expensive court battles.  As is so often the case in life, the devil lies in the detail.

During my career, I have regularly encountered this phenomenon and must attest to many hours of frustration because one, or more, of the following requirements were not adhered to:

  • Basic rules of schedule development and control;
  • Minimum standards for what a quality schedule looks like;
  • Documented backup on how the schedule has been compiled;
  • Proper signoff and communication of the baseline schedule; and
  • A proper change management process.

It became evident that something was amiss in the underlying structure and processes of these schedules.  There appeared to be quality and governance issues at stake which, truth be told, are some of the main drivers of a realistic schedule.  This series of articles will highlight and suggest ways in which this situation can be improved.

Composite Scheduling Process

There have been numerous books written on scheduling; suffice it to say that it is very important that a planner/scheduler follows a rigorous, formal planning process.  Examples of formal planning and scheduling processes can be found in the Project Management Body of Knowledge (PMI, 2013), literature and articles from the American Association of Cost Engineers (Crawford, 2011; Stephenson, 2015), and books such as Project management using earned value, 2nd edition, by Humphreys (2002).

There are certain nuances to each of these high-level processes, which, if selectively stripped out and used to create a composite process, will more fully describe the process for creating a project schedule. Over a period, my personal preference has swung towards using the approach of Humphreys (2002), primarily because of the introduction of the quality assessment sub-process in conjunction with one or two modifications which will address the fundamental frustrations highlighted in the introduction.

The composite schedule development and control process is highlighted in Table 1.  Note that the process described below is not a linear sequential process, but goes through a series of iterations to get to the eventual outcome.

Table 1:   The Schedule Development and Control Process


Let us now discuss each of these in a little more detail.  The focus of the balance of this article will only be on the schedule development and control process.  The second article will address the quality assessment sub-process and the schedule basis document.

The first step in the process is the preparation of the schedule so that we can establish the baseline which the project will be measured against.

Prepare the Schedule

Preparing the schedule comprises the following seven steps:

  • Develop the Network:  This sub-process is the start of schedule preparation.  As part of the inputs a set of target dates are required which could be startup related, market related or project completion related.  The list of schedule activities is derived from the scope which should be in the form of a Work Breakdown Structure (WBS), activity durations are determined and the relevant logic applied.  The various project execution strategies will also determine logic and grouping of activities so bear this in mind.   Once this has been completed, one has what is referred to as a network diagram.
  • Perform Critical Path Analysis:  To determine a critical path (the longest path through the network) for a project, the planning software performs a series of forward and backward passes through the network.  The resultant critical path is the sequence of activities such that, if any one of them were delayed, the project completion would be delayed.
  • Resource Loading:  Studies have shown that proper attention to resource loading leads to improved project outcomes and more reliable schedules.  Resource loading and critical path analysis can be done in either order, as some planners prefer to do the critical path analysis after the resource loading exercise has been undertaken.  I am personally not too fussed, so long as both actions occur iteratively until an optimum solution is achieved.
  • Crash the Network:  “Crashing” is a technique whereby various strategies related to overtime, additional manpower or sequencing of work are investigated to determine if there are opportunities to reduce the project duration.  This is a lengthy and complex process and usually leads to reduced project durations.  However, the additional costs need to be weighed against the improved outcomes.
  • Schedule Quality Assessment:  This is a sub-process that I believe is extremely underutilised, yet assists greatly in enhancing the realism of the schedule.  Until recently, there did not appear to be much research on this aspect of the schedule.  In the absence of anything substantive this set of criteria can be used to analyse a schedule and determine fundamental problems which will need to be corrected.  It is also not a panacea for all schedule issues but can be used as a minimum standard which can be added to over time. More on this in next month’s follow up article.
  • Risk Analysis:  This sub-process should not be undertaken until a proper quality assessment has been conducted.  Incorrect use of logic, relationships, constraints, float and long durations could have a detrimental effect on the outcomes of the risk analysis and render the whole exercise invalid if the schedule is not realistic. The object of a schedule risk analysis is to determine the possible impact on project durations should certain risks materialise or events occur.  After the analysis, a range of probabilistic dates is presented and the team decides on an appropriate level of risk.  This then becomes the schedule contingency (like cost contingency) and should be managed in the same way.
  • Benchmark the Schedule:  Benchmarking appears to be an activity that takes place on an ad hoc basis.  When initially establishing the project schedule it is imperative to conduct a benchmarking exercise to determine the realism of the proposed project schedule before it becomes the de facto baseline.  Once the initial baseline has been established it may not be necessary to conduct a benchmarking exercise again unless there has been a significant de-scoping or scope increase.
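The forward and backward passes described under critical path analysis can be sketched in a few lines. This toy implementation assumes tasks are listed in a valid (topological) order and uses whole-day durations; real planning software also handles calendars, constraints, lags and resource levelling.

```python
def critical_path(tasks):
    """tasks: {name: (duration, [predecessors])}, listed so that every
    predecessor appears before its successors.
    Forward pass gives early start/finish; backward pass gives late
    start/finish; total float = LS - ES. Zero-float tasks are critical."""
    es, ef = {}, {}
    for t, (d, ps) in tasks.items():                 # forward pass
        es[t] = max((ef[p] for p in ps), default=0)
        ef[t] = es[t] + d
    end = max(ef.values())
    ls, lf = {}, {}
    for t, (d, _) in reversed(list(tasks.items())):  # backward pass
        succs = [s for s, (_, sp) in tasks.items() if t in sp]
        lf[t] = min((ls[s] for s in succs), default=end)
        ls[t] = lf[t] - d
    floats = {t: ls[t] - es[t] for t in tasks}
    return [t for t in tasks if floats[t] == 0], floats

tasks = {
    "A": (3, []),
    "B": (2, ["A"]),
    "C": (4, ["A"]),
    "D": (1, ["B", "C"]),
}
cp, fl = critical_path(tasks)
print(cp)        # ['A', 'C', 'D']
print(fl["B"])   # 2 – B can slip two days without delaying the project
```

Delaying any zero-float task pushes out the end date, which is exactly the behaviour the DCMA critical path test checks for.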

Baseline the Schedule

Setting a Traceable Schedule:  On completion of the schedule preparation sub-processes, it is important to put the baseline in place.  This exercise should include all the elements needed to assess progress objectively.  Without an established and agreed baseline, the team will be flying blind once progress measurement starts.

The exact measures (i.e. the earned value criteria) need to be built into the schedule so that there is no debate about percent complete: it should simply be a question of determining whether the activity is complete or not.
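What is described here is essentially the 0/100 earned value rule: an activity earns its full budgeted weight only once it is complete, so status reviews involve no debate about partial percentages. A minimal sketch, with hypothetical activity weights:

```python
def earned_progress(activities):
    """0/100 rule: an activity earns its full weight only when complete.
    Weights would typically come from budgeted hours or cost."""
    total = sum(a["weight"] for a in activities)
    earned = sum(a["weight"] for a in activities if a["complete"])
    return 100.0 * earned / total

# Hypothetical weighted activities
acts = [
    {"weight": 40, "complete": True},
    {"weight": 35, "complete": False},
    {"weight": 25, "complete": True},
]
print(earned_progress(acts))  # 65.0
```

For long-duration activities, a 50/50 or milestone-weighted variant of the same idea keeps the measurement objective while smoothing the earned curve.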

  • Prepare and Sign Off the Schedule Basis Document:  Another area that perplexes me is the lack of preparation and/or upkeep of the schedule basis document.  This document describes the scope and explains the inner constructs of the project schedule: progressing criteria, assumptions, exclusions, etc.  Apart from planners, not many people understand the inner workings of a schedule, so it is vital that this information is captured in an easy-to-read and understandable way. It is critical that this document is used to get parties to sign off on the schedule before the baseline is set. A typical schedule basis document should contain the following:
    • The Plan – This should include a list and description of the activities with their planned dates (start and finish).  A complete scope of works should also be added;
    • Schedule control baseline – A time phased, logically linked, resource loaded, detailed interpretation of the plan based on an aggregation of a common attribute of all activities to be measured and assessed;
    • Planned schedule – A list of activities with their planned date information usually illustrated as a bar chart;
    • Schedule basis – A description of the activities and resources covered, included methodologies, standards, references and defined deliverables, assumptions, inclusions and exclusions made, key milestones and constraints, calendars and some indication of the level of risk and uncertainty used to establish schedule contingency;
    • Schedule control plan – The progress weighting and measurement approach should be included.  It should include a description of how project performance should be measured and assessed in respect of the schedule including rules for earning progress and the procedures for evaluating progress and forecasting remaining durations.  This should all form part of the schedule basis document;
    • Quality Assessment Results – The results of the quality assessment should accompany the schedule which this assessment relates to; and
    • Signoff – Ensure signoff from all parties that this is the accepted schedule and baseline reflecting the scope of work as described earlier.

Monitor and Control the Schedule

  • Update the Schedule:  The regular sub-process of updating the schedule now commences with daily, weekly or monthly updates and reports against the baseline.
  • Schedule Change Management:  An area that does not get the attention it deserves is schedule change management.  Too often the focus is on cost, without understanding that a slippage on schedule may also have an impact on cost.  In addition, this is the only mechanism through which the project baseline can be changed.  The planner must ensure that no changes are made to the schedule unless formally agreed and signed off via a formal change management process.
  • Schedule Risk assessment:  Ongoing quarterly or bi-annual risk assessments should be conducted to test for the realism of the schedule in being able to accommodate potential risks.

Concluding remarks

In this article, I’ve highlighted some of the concerns I believe are hindering the development of realistic schedules.  I then identified four key areas (there may be more) which, if implemented properly, will lead to significantly improved realism in schedules.  These are:

  • a well thought through and consistent schedule development and control process;
  • a schedule quality assessment conducted frequently (GASP);
  • a basis document used for signoff; and
  • benchmarking.

Whilst many people will say this takes a lot of time, the effort is well worth it in the long run.  The effort is only expended in the initial setup and development; thereafter it is pure maintenance.


Crawford, T.X., 2011, Professional Practice Guide #4: Planning and scheduling, 3rd edition, AACE International.

Humphreys, G.C., 2002, Project Management using Earned Value, 2nd Edition, Humphreys and Associates, Inc.

PMI (Project Management Institute), 2013, A guide to the project management body of knowledge (PMBOK® Guide), 5th edition., Project Management Institute, Inc.

Stephenson, H.L., 2015, TCM Framework: An integrated approach to portfolio, program and project management, 2nd edition, AACE International.



How important is the Project Master Schedule?


In conversation with a couple of colleagues the other day, this simple question was raised.  Initially there was disbelief that the question could even arise.  After all, who uses a Project Master Schedule these days? Responses ranged from:

  • “We analyse the detail to pick up issues and concerns and address them at the lowest level of the schedule”;
  • “We only produce one at the early stages of the project to give us a high-level idea of the project durations and then it gets put into someone’s drawer somewhere”;
  • “It’s hard work keeping it up to date”;
  • “We extract only activities from the month before and month after the status date to report on”.

I was rather taken aback at these responses as I have always used the Project Master Schedule on projects I have worked on to give me a quick overview of how the project is doing.  If I pick up an anomaly I then delve into the detail to find the problem that needs to be addressed.

So what is a Project Master Schedule?

To me it is a one-page summary of the total project, indicating the various phases and key milestones/deliverables, which ties together Front-end Loading and Implementation (engineering, procurement, construction and commissioning) in such a way that it addresses the business need-by dates.  It is a schedule that reflects planned, actual, forecast and status dates. Of course there can be variants to this, but in its simplest form that is it.

Key features of a master schedule include:

  • Simple to read, ideally one page only, used as a communication tool for senior management and the team;
  • Is a dynamic document for the duration of the project; and
  • The lowest level of the schedule and the highest level of the schedule must resonate with each other.

In summary, it is one of two critical documents I use to determine if we are in control of the project.  The other is the Earned Value graph, but more of that later. What do you use on your projects? Is it something similar? Do you use one at all?  Please share your thoughts. Remember, if you are a member you can download your own copy of our practical guide on how to implement a project master schedule from OTC Toolkits at